Opinion

The Next Wave of AI Is Mobile

AI is moving beyond tech giants as everyday smartphones take on complex computing tasks, says Mitch Liu, CEO of Theta Labs.

Updated Sep 25, 2024, 7:20 p.m. Published Sep 25, 2024, 7:16 p.m.
Mobile payment via smartphone. (Guido Mieth/Getty Images)

AI has an insatiable appetite for resources. It consumes vast amounts of power and data, with electricity use estimated at 460 terawatt-hours (TWh) in 2022 and projected to rise sharply to somewhere between 620 and 1,050 TWh by 2026. But its most voracious demand is for compute: the processing power that fuels the training of complex models, the analysis of massive datasets, and the execution of large-scale inference.

This computational hunger has reshaped many of our professional landscapes. In 2024, the global AI market surpassed $184 billion, with projections suggesting it could pass $800 billion by 2030 – a value comparable to the current GDP of Poland. ChatGPT, the industry’s most well-known product, famously reached 100 million active users within just two months of its launch in November 2022.

Yet, as AI products like ChatGPT multiply and grow, our perception of how AI operates is quickly becoming outdated. The popular image of AI – sprawling data centers, enormous energy bills, and control concentrated in the hands of tech giants – no longer tells the whole story. This view has led many to believe that meaningful AI development is the exclusive domain of well-funded corporations and major tech companies.

A new vision for AI is emerging, one that looks to the untapped potential in our pockets. This approach aims to democratize AI by harnessing the collective power of billions of smartphones worldwide. Our mobile devices spend hours idle each day, their processing capabilities dormant. By tapping into this vast reservoir of unused compute power, we could reshape the AI landscape. Instead of relying solely on centralized corporate infrastructure, AI development could be powered by a global network of everyday devices.

Untapped potential

Smartphones and tablets represent an enormous, largely untapped reservoir of global compute power. With 1.21 billion units predicted to ship in 2024 alone, the spare compute they collectively offer is hard to, well, compute.

Initiatives like Theta EdgeCloud for mobile aim to harness this distributed network of consumer-grade GPUs for AI computation. This shift from centralized computing to edge computing is a technical evolution capable of completely reinventing the way people interact with and power AI models.

By processing data locally on mobile devices, the industry stands to achieve far lower latency, enhanced privacy, and reduced bandwidth usage. This approach is particularly crucial for real-time applications like autonomous vehicles, augmented reality and personalized AI assistants. The edge is where new AI use cases will take off, especially those for personal usage. Not only will powering these programs become more affordable on the edge, but it will also become more reactive and customizable, a win-win for consumers and researchers alike.

Blockchains are well suited to this distributed AI ecosystem. Their decentralized nature aligns seamlessly with the goal of harnessing idle compute power from millions of devices worldwide. By leveraging blockchain technology, we can create a secure, transparent, and incentivized framework for sharing computational resources.

The key innovation here is the use of off-chain verification. While on-chain verification would create bottlenecks in a network of millions of parallel devices, off-chain methods allow these devices to work together seamlessly, regardless of individual connectivity issues. This approach enables the creation of a trustless system where device owners can contribute to AI development without compromising their security or privacy.
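To make this concrete, here is a simplified, hypothetical sketch (in Python) of how sampling-based off-chain verification might work: each device commits to its result with a hash, a verifier cheaply checks every commitment and recomputes a random sample off-chain, and only the accepted batch would ever need to be settled on-chain. The function names and the toy "AI task" are illustrative assumptions, not Theta's actual protocol.

```python
import hashlib
import random

def commit(task_id: str, output: bytes) -> str:
    """A device commits to its result with a hash, so it can't change it later."""
    return hashlib.sha256(task_id.encode() + output).hexdigest()

def spot_check(submissions, recompute, sample_rate=0.1):
    """Off-chain verification: cheaply check every commitment, then recompute
    a random sample of tasks to catch incorrect work. Only the accepted
    batch (or a summary of it) would ever need to be settled on-chain."""
    accepted = []
    for task_id, output, commitment in submissions:
        if commit(task_id, output) != commitment:
            continue  # output doesn't match what the device committed to
        if random.random() < sample_rate and recompute(task_id) != output:
            continue  # sampled recomputation shows the work was wrong
        accepted.append((task_id, output))
    return accepted

# Toy example: "square a number" stands in for an AI task; one device cheats on task "7".
honest = lambda tid: str(int(tid) ** 2).encode()
subs = [(tid, honest(tid), commit(tid, honest(tid))) for tid in ("3", "5")]
bad_out = b"0"
subs.append(("7", bad_out, commit("7", bad_out)))
print(spot_check(subs, recompute=honest, sample_rate=1.0))  # task "7" is rejected
```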

This model draws on the concept of "federated learning," a distributed machine learning method that can scale to vast amounts of data across mobile devices while protecting user privacy. Blockchain provides both the infrastructure for this network and the mechanism to reward participants, incentivizing widespread engagement.
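As a rough illustration of the federated-averaging idea behind federated learning, the sketch below trains a shared model across several simulated "devices" whose raw data never leaves them; only weight updates travel back to be averaged. It uses a toy linear-regression model and hypothetical function names, and is not a description of any production system.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """Each device refines the shared model on its own private data."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w -= lr * grad
    return w

def federated_average(global_weights, device_datasets):
    """One round of federated averaging: devices train locally and only
    their weight updates (never their raw data) are sent back and averaged."""
    updates = [local_update(global_weights, X, y) for X, y in device_datasets]
    return np.mean(updates, axis=0)

# Simulate three phones, each holding private data that never leaves the device.
rng = np.random.default_rng(0)
devices = []
for _ in range(3):
    X = rng.normal(size=(100, 2))
    y = X @ np.array([2.0, -1.0]) + rng.normal(scale=0.1, size=100)
    devices.append((X, y))

weights = np.zeros(2)
for _ in range(20):
    weights = federated_average(weights, devices)
print(weights)  # approaches the true coefficients [2, -1] without sharing raw data
```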

The synergy between blockchain and edge AI is fostering a new ecosystem that's more resilient, efficient, and inclusive than traditional centralized models. It's democratizing AI development, allowing individuals to participate in and benefit from the AI revolution directly from their mobile devices.

Overcoming tech challenges

AI training and inference can be done on a range of GPU types, including the consumer-grade GPUs in mobile devices. The hardware that powers our mobile devices has been steadily improving since smartphones hit the market, and shows no signs of slowing down. Industry-leading mobile chips such as Apple’s A17 Pro and Qualcomm’s Adreno 750 GPU (used in high-end Android flagships such as the Samsung Galaxy line) are redefining what AI tasks can be completed on mobile devices.

Now, new chips known as Neural Processing Units (NPUs) are being produced specifically for consumer AI computation, enabling on-device AI use cases while managing the heat and battery limitations of mobile devices. Add intelligent system design and architecture that routes each job to the optimal hardware for it, and the resulting network effect will be extremely powerful.
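As a toy illustration of that routing idea, the sketch below matches a job to a device based on whether it has an NPU, its rough GPU throughput, and its battery state. All device names, fields, and thresholds are hypothetical assumptions for illustration, not an actual EdgeCloud API.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Device:
    name: str
    has_npu: bool        # dedicated Neural Processing Unit
    gpu_tflops: float    # rough GPU throughput
    battery_pct: int     # remaining battery
    on_charger: bool

@dataclass
class Job:
    kind: str            # "inference" or "training"
    est_tflops: float    # rough compute requirement

def route(job: Job, devices: List[Device]) -> Optional[Device]:
    """Pick a device that can run the job without draining the user's phone,
    preferring NPUs for inference and stronger GPUs for training."""
    eligible = [d for d in devices if d.on_charger or d.battery_pct > 50]  # protect battery
    eligible = [d for d in eligible if d.gpu_tflops >= job.est_tflops]
    if not eligible:
        return None
    if job.kind == "inference":
        # NPUs handle quantized inference efficiently at low power.
        eligible.sort(key=lambda d: (not d.has_npu, -d.gpu_tflops))
    else:
        eligible.sort(key=lambda d: -d.gpu_tflops)
    return eligible[0]

fleet = [
    Device("phone-a17pro", has_npu=True,  gpu_tflops=2.1, battery_pct=80, on_charger=False),
    Device("phone-adreno", has_npu=True,  gpu_tflops=1.8, battery_pct=30, on_charger=True),
    Device("tablet-old",   has_npu=False, gpu_tflops=0.9, battery_pct=95, on_charger=True),
]
print(route(Job(kind="inference", est_tflops=1.0), fleet).name)
```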

While the potential of edge AI is immense, it still comes with its own set of challenges. Optimizing AI algorithms for the diverse array of mobile hardware, ensuring consistent performance across varying network conditions, addressing latency issues, and maintaining security are all critical hurdles. However, ongoing research in AI and mobile technology is steadily addressing these challenges, paving the way for this vision to become reality.

Corporations to communities

One of the biggest, and most justified, complaints about the development of AI is the incredible amount of power it consumes. Large data centers require huge swaths of land for their physical infrastructure and enormous amounts of power to stay online. The mobile model can alleviate many of these environmental impacts: using spare GPU capacity in pre-existing devices, rather than relying on GPUs in centralized data centers, is more efficient and produces fewer carbon emissions. The potential environmental benefits cannot be overstated.

The shift to edge computing in AI will also fundamentally change who can participate in supporting AI networks and who can profit from them. The corporations that own the data centers will no longer operate inside a walled garden. Instead, the gates will open, and access will extend to individual developers, small businesses, and even hobbyists, who will be empowered to run AI networks.

Empowering a much larger pool of users and supporters will also enable more rapid and open development, helping to curb the much-discussed and much-feared risk of stagnation in the industry. This increase in accessibility will also lead to more diverse applications, addressing niche problems and underserved communities that might otherwise be overlooked.

The economic impact of this shift will be profound. Allowing individuals and small- to medium-sized organizations to monetize their devices’ idle computing power creates entirely new revenue streams. It also opens up new markets for consumer-grade AI hardware and edge-optimized software.

The future of AI innovation lies not in building larger data centers, but in harnessing the power that already exists in our pockets and homes. By shifting focus to edge computing, a more inclusive, efficient, and innovative AI ecosystem can emerge. This decentralized approach not only democratizes AI but also aligns with global sustainability goals, ensuring that the benefits of AI are accessible to all, not just a privileged few.

Note: The views expressed in this column are those of the author and do not necessarily reflect those of CoinDesk, Inc. or its owners and affiliates.
