Opinion

Web3-AI: What’s Real, and What’s Hype

The biggest challenge for the evolution of Web3-AI might be overcoming its own reality distortion field, says Jesus Rodriguez, CEO, IntoTheBlock.

Updated Jul 16, 2024, 6:18 p.m. Published Jul 16, 2024, 6:14 p.m.
(Marian/Getty Images)

The Web3-AI space is one of the hottest in crypto, combining great promise with significant hype. It almost feels heretical to point out the number of Web3-AI projects with multi-billion-dollar market caps but no practical use cases, driven purely by proxy narratives from the traditional AI market. Meanwhile, the gap in AI capabilities between Web2 and Web3 continues to widen alarmingly. However, Web3-AI is not all hype. Recent developments in the generative AI market highlight the value proposition of more decentralized approaches.

Considering all these factors, we find ourselves in an overhyped and overfunded market that is disconnected from the state of the generative AI industry, yet capable of unlocking tremendous value for the next wave of generative AI. Feeling confused is understandable. If we step back from the hype and analyze the Web3-AI space through the lens of current requirements, clear areas emerge where Web3 can deliver substantial value. But this requires cutting through a dense reality distortion field.

Web3-AI Reality Distortion

As crypto natives, we tend to see the value of decentralization in everything. However, AI has evolved as an increasingly centralized force in terms of data and computation, so the value proposition of decentralized AI needs to start by countering that natural centralization force.

When it comes to AI, there is an increasing mismatch between the value we perceive to be creating in Web3 and the needs of the AI market. The concerning reality is that the gap between Web2 and Web3 AI is widening rather than shrinking, driven fundamentally by three key factors:

Limited AI Research Talent

The number of AI researchers working in Web3 is in the low single digits. This is hardly encouraging for those who claim that Web3 is the future of AI.

Constrained Infrastructure

We haven’t yet managed to get web apps to work properly with Web3 backends, so thinking about AI is a stretch, to say the least. Web3 infrastructure imposes computational constraints that are impractical for the lifecycle of generative AI solutions.

Limited Models, Data, and Computational Resources

Generative AI relies on three things: models, data and compute. None of the large frontier models are equipped to run on Web3 infrastructures; there is no foundation for large training datasets; and there is a massive quality gap between Web3 GPU clusters and those required for pretraining and fine-tuning foundation models.

The difficult reality is that Web3 has been building a “poor man’s” version of AI, essentially trying to match the capabilities of Web2 AI but creating inferior versions. This reality starkly contrasts with the tremendous value proposition of decentralization in several areas of AI.

Rather than leaving this analysis as an abstract thesis, let's dive into the different decentralized AI trends and evaluate them against their potential in the AI market.

Read more: Jesus Rodriguez - Funding Open-Source Generative AI With Crypto

The reality distortion in Web3-AI has led the initial wave of innovation and funding to focus on projects whose value propositions seem disconnected from the realities of the AI market. At the same time, there are other emerging areas in Web3-AI that hold tremendous potential.

Some Overhyped Web3-AI Trends

Decentralized GPU Infrastructure for Training and Fine-Tuning

In the last few years, we have seen an explosion of decentralized GPU infrastructures promising to democratize the pretraining and fine-tuning of foundation models. The idea is to offer an alternative to the GPU monopolization established by incumbent AI labs. The reality is that pretraining and fine-tuning large foundation models require large GPU clusters connected by high-bandwidth, low-latency interconnects. A pretraining cycle for a 50B-100B foundation model on a decentralized AI infrastructure could take over a year, if it works at all.
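To make the scale of the problem concrete, here is a rough back-of-envelope calculation, a sketch using the common 6·N·D approximation for training FLOPs; every number below is an illustrative assumption, not a measurement from any specific network.

```python
# Rough back-of-envelope estimate; all numbers are illustrative assumptions.
params = 70e9            # assumed model size: 70B parameters
tokens = 1.4e12          # assumed training set: 1.4T tokens
flops_needed = 6 * params * tokens   # common 6*N*D approximation for training FLOPs

# Hypothetical sustained throughput of a loosely connected, decentralized GPU
# network, heavily discounted for slow interconnects and stragglers.
effective_flops_per_sec = 5e15       # ~5 PFLOP/s sustained (assumption)

seconds = flops_needed / effective_flops_per_sec
print(f"~{seconds / (86400 * 365):.1f} years at the assumed throughput")
```

Even with fairly generous assumptions about sustained throughput, the timeline lands in years rather than weeks, which is why tightly coupled clusters remain the default for pretraining.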

ZK-AI Frameworks

The idea of combining zero-knowledge (zk) computations and AI has sparked interesting concepts for enabling privacy mechanisms in foundation models. Given the prominence of zk infrastructure in Web3, several frameworks promise to embed zk computation in foundation models. Although theoretically appealing, zk-AI models quickly become prohibitively expensive computationally when applied to large models. Additionally, zk will limit aspects such as interpretability, which is one of the most promising areas in generative AI.

Proof-Of-Inference

Crypto is about cryptographic proofs, and sometimes these are attached to things that don't need them. In the Web3-AI space, we see examples of frameworks issuing cryptographic proofs of specific model outputs. The challenges with these scenarios are not technological but market-related. In short, proof-of-inference is something of a solution looking for a problem and lacks any real use cases today.
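To illustrate why this is mostly a signing exercise, here is a minimal, hypothetical sketch of a proof-of-inference-style attestation using an Ed25519 signature; the model name and record fields are made up for the example, and note that the proof only attests to who signed an output, not that a model actually computed it.

```python
# Minimal sketch of a proof-of-inference style attestation (illustrative only).
# The signature proves who vouched for the output, not that the model produced it.
import hashlib
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

signing_key = Ed25519PrivateKey.generate()
verify_key = signing_key.public_key()

# Hypothetical inference record: model id, prompt and output.
record = {"model": "example-model-v1", "input": "What is Web3?", "output": "..."}
digest = hashlib.sha256(json.dumps(record, sort_keys=True).encode()).digest()

proof = signing_key.sign(digest)   # the "proof" attached to the inference
verify_key.verify(proof, digest)   # raises InvalidSignature if the record was tampered with
```

The verification is cheap and sound; the open question, as noted above, is who actually needs it today.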

Some High Potential Web3-AI Trends

Agents with Wallets

Agentic workflows are one of the most interesting trends in generative AI and hold significant potential for crypto. By agents, we are referring to AI programs that can not only passively answer questions based on inputs, but also execute actions against a given environment. While most autonomous agents are created for isolated use cases, we are seeing the rapid emergence of multi-agent environments and collaboration.

This is an area where crypto can unlock tremendous value. For instance, imagine a scenario where an agent needs to hire other agents to complete a task or stake some value to vouch for the quality of its outputs. Provisioning agents with financial primitives in the form of crypto rails unlocks many use cases for agentic collaboration.
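As a toy illustration of that scenario, here is a hypothetical, in-memory sketch in which one agent escrows funds to hire another and pays out only if the work is accepted; the class and function names are invented for this example, and a real system would use on-chain rails rather than Python objects.

```python
# Toy sketch of "agents with wallets": escrowed payment between two agents.
# All names here are hypothetical; balances live in memory, not on a chain.
from dataclasses import dataclass

@dataclass
class AgentWallet:
    owner: str
    balance: float

    def transfer(self, other: "AgentWallet", amount: float) -> None:
        assert self.balance >= amount, "insufficient funds"
        self.balance -= amount
        other.balance += amount

def hire(hirer: AgentWallet, worker: AgentWallet, fee: float, task: str) -> str:
    escrow = AgentWallet(owner="escrow", balance=0.0)
    hirer.transfer(escrow, fee)                          # lock the fee up front
    result = f"result of {task!r} by {worker.owner}"     # stand-in for real agent work
    accepted = bool(result)                              # stand-in for an evaluation step
    escrow.transfer(worker if accepted else hirer, fee)  # pay out or refund
    return result

research_agent = AgentWallet("research-agent", balance=10.0)
writing_agent = AgentWallet("writing-agent", balance=0.0)
print(hire(research_agent, writing_agent, fee=2.5, task="summarize a dataset"))
```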

Crypto Funding for AI

One of the worst-kept secrets in generative AI is that the open-source AI space is undergoing a tremendous funding crunch. Most open-source AI labs can no longer afford to work on large models and are instead focusing on areas that don't require massive amounts of compute and data. Crypto is extremely efficient at capital formation through mechanisms such as airdrops, incentives or even points. The concept of crypto funding rails for open-source generative AI is one of the most promising areas at the intersection of these two trends.

Small Foundation Models

Last year, Microsoft coined the term small language model (SLM) after the release of its Phi model, which, with fewer than 2B parameters, was able to outperform much larger LLMs on computer science and math tasks. Small foundation models – think 1B-5B parameters – are a key requirement for the viability of decentralized AI and unlock promising scenarios for on-device AI. Decentralizing multi-hundred-billion-parameter models is nearly impossible today and will remain so for a while. However, small foundation models should be able to run on many of today's Web3 infrastructures. Pushing the SLM agenda is essential for building real value with Web3 and AI.
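As a point of reference, running a model at this scale locally is already a one-liner; the sketch below assumes the Hugging Face transformers library and the publicly released Phi-2 checkpoint (about 2.7B parameters), which is roughly the size the SLM argument is about.

```python
# Sketch: local inference with a small open model (assumes `transformers` is
# installed and the microsoft/phi-2 checkpoint can be downloaded).
from transformers import pipeline

generator = pipeline("text-generation", model="microsoft/phi-2")
out = generator("Explain staking in one sentence:", max_new_tokens=40)
print(out[0]["generated_text"])
```

Models in this range fit on a single consumer GPU or a well-equipped laptop, which is what makes on-device and decentralized inference plausible in a way it is not for frontier-scale models.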

Synthetic Data Generation

Data scarcity is one of the biggest challenges for this latest generation of foundation models. As a result, a growing body of research focuses on synthetic data generation mechanisms that use foundation models to complement real-world datasets. The mechanics of crypto networks and token incentives are well suited to coordinating large numbers of parties to collaborate on creating new synthetic datasets.
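To make the coordination idea concrete, here is a purely hypothetical sketch in which contributors submit model-generated examples, a validator scores them, and accepted examples earn an illustrative token reward; generate_example and score are stand-ins, not real APIs.

```python
# Hypothetical sketch of token-incentivized synthetic data generation.
import random

def generate_example(topic: str) -> dict:
    # Stand-in for prompting a foundation model to produce a synthetic record.
    return {"topic": topic, "question": f"Sample question about {topic}", "answer": "..."}

def score(example: dict) -> float:
    # Stand-in for an automated or peer-review quality check.
    return random.uniform(0.0, 1.0)

REWARD_PER_EXAMPLE = 1.0          # illustrative token reward, not a real token
dataset, rewards = [], {}

for contributor in ["node-a", "node-b", "node-c"]:
    example = generate_example("decentralized finance")
    if score(example) > 0.5:      # only examples that pass the quality bar are accepted
        dataset.append(example)
        rewards[contributor] = rewards.get(contributor, 0.0) + REWARD_PER_EXAMPLE

print(f"{len(dataset)} examples accepted, rewards: {rewards}")
```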

Other Relevant Web3-AI Trends

There are several other interesting Web3-AI trends with significant potential. Proof-of-human attestation of outputs is becoming increasingly relevant given the challenges posed by AI-generated content. Evaluation and benchmarking is an AI segment in which the trust and transparency capabilities of Web3 can shine. Human-centric fine-tuning, such as reinforcement learning from human feedback (RLHF), is also an interesting scenario for Web3 networks. Other scenarios are likely to emerge as generative AI continues to evolve and Web3-AI capabilities mature.

The need for more decentralized AI capabilities is very real. While the Web3 industry might not yet be in a position to rival the value created by the AI mega models, it can unlock real value for the generative AI space. The biggest challenge for the evolution of Web3-AI might be overcoming its own reality distortion field. There is plenty of value in Web3-AI; we just need to focus on building real things.

Note: The views expressed in this column are those of the author and do not necessarily reflect those of CoinDesk, Inc. or its owners and affiliates.

Jesus Rodriguez

Jesus Rodriguez is the CEO and co-founder of IntoTheBlock, a platform focused on enabling market intelligence and institutional DeFi solutions for crypto markets. He is also the co-founder and President of Faktory, a generative AI platform for business and consumer apps. Jesus also founded The Sequence, one of the most popular AI newsletters in the world. In addition to his operational work, Jesus is a guest lecturer at Columbia University and Wharton Business School and is a very active writer and speaker.
