
Navigating the Convergence of Crypto and AI Technologies


The convergence of Artificial Intelligence (AI) and blockchain in the cryptocurrency sector marks a significant shift, unlocking advanced analytics and automation. This synergy enhances the efficiency and functionality of cryptocurrencies and is producing novel applications such as decentralized compute systems and identity solutions. While the integration faces challenges, including limited blockchain computational capacity and regulatory hurdles, it is poised for accelerated adoption. Today, AI applications dominate in improving developer efficiency, smart contract security, and user accessibility, while permissionless, censorship-resistant decentralized AI networks offer distinct long-term value amid ongoing development and regulatory review. The path forward involves refining on-chain AI integrations, overcoming adoption barriers, and exploring the potential of AI agents, which offer a unique advantage in crypto transactions. Broader adoption hinges on integration with non-crypto products, underscoring the need for industry participants to understand the transformative power of this intersection between crypto and AI.

Crypto AI Synergistic Landscape

The AI crypto landscape encompasses decentralized compute, ML training, AGI, zkML infrastructure, zkML DeFi, zkML gaming, and AI agents. In this dynamic space where artificial intelligence intersects with cryptocurrency, projects are actively building vital infrastructure for scalable on-chain AI interactions. The rise of decentralized compute marketplaces is noteworthy, addressing the demand for substantial physical hardware, particularly GPUs essential for AI model training and inferencing. Operating as two-sided platforms, these marketplaces connect users seeking to lease and offer compute resources, streamlining the value transfer and verification processes. Subcategories within decentralized compute include specialized ML training providers focusing on verifiable outputs and projects working towards connecting compute and model generation for achieving artificial general intelligence.

Crypto AI Market Map (Source: Galaxy)

An emerging focus in these projects is zkML (zero-knowledge machine learning), presenting a solution for on-chain, verifiable model outputs in a cost-effective manner. While zkML’s current instantiation is resource-intensive and time-consuming, it is increasingly adopted, evident in integrations with DeFi and gaming applications aiming to leverage AI models. The abundance of compute supply and on-chain verification capabilities open avenues for on-chain AI agents. These trained models, capable of executing user requests, have the potential to enhance on-chain experiences, allowing users to perform complex transactions seamlessly through interactions with chatbots. Agent projects are currently in the developmental phase, concentrating on creating necessary infrastructure and tools for swift deployment. The intersection of AI and crypto is a dynamic landscape, promising enhanced on-chain functionalities and the broader integration of cutting-edge technologies.

Developments in Decentralized Compute 🌐💻

As AI continues to evolve, so does the demand for massive compute power. Over the last decade, OpenAI's compute requirements have surged exponentially, highlighting the need for innovative solutions. Enter decentralized compute: a game-changer addressing the challenges of cost, accessibility, and censorship resistance.

Decentralized compute responds to concerns raised in the 2023 AI Index Report, which notes that industry has surpassed academia in AI model development, consolidating control within a limited number of technology leaders.

Decentralized Compute in the AI landscape is experiencing a surge in demand for computational power, driven by the escalating requirements of sophisticated AI models. Notably, the necessity for GPUs has intensified, with some crypto miners repurposing their GPUs for cloud computing services. The exponential growth in compute demands, coupled with rising costs, has prompted various projects to leverage crypto for decentralized compute solutions. These initiatives aim to provide on-demand compute resources at competitive prices, enabling affordable training and execution of AI models, albeit with potential trade-offs in performance and security.

State-of-the-art GPUs, like those from Nvidia, are in high demand, exemplified by Tether’s acquisition of a stake in Northern Data, investing $420 million for 10,000 H100 GPUs renowned for AI training. However, acquiring such top-tier hardware involves lengthy wait times and often necessitates long-term contracts, even for compute amounts that may not be fully utilized. This has led to inefficiencies where available compute is inaccessible on the market. Decentralized compute systems address these challenges by creating a secondary market, allowing compute owners to sublease excess capacity swiftly, thereby unlocking additional supply.

Beyond competitive pricing and accessibility, the central value proposition of decentralized compute lies in its censorship resistance. The domination of AI development by large tech firms has raised concerns about centralized control and influence over AI models, prompting the need for more decentralized solutions. Various decentralized compute models have emerged, each with unique focuses and trade-offs, including projects like Akash, io.net, iExec, and Cudos. Akash, as an example, operates as an open-source “supercloud” platform, utilizing Cosmos SDK in a proof-of-stake network. It facilitates a permissionless cloud compute marketplace, featuring storage, CPU leasing, and, more recently, GPU leasing for AI purposes.

In the Akash ecosystem, tenants (users) and providers (compute suppliers) engage in a reverse auction process to match compute requirements with bid prices. Akash validators ensure network integrity, and the platform’s native token (AKT) plays a vital role in securing the network and incentivizing participation. The decentralized compute space also includes secondary markets, addressing inefficiencies in the existing compute market by unlocking new supply.
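To make the matching mechanics concrete, here is a minimal Python sketch of a reverse auction of the kind Akash describes: the tenant posts a ceiling price, providers bid downward, and the lowest qualifying bid wins the lease. The names, units, and structure are illustrative only, not Akash's actual API.

```python
from dataclasses import dataclass

@dataclass
class Bid:
    provider: str
    price_per_block: float  # lease price, denominated in AKT (illustrative)

def match_order(bids: list[Bid], max_price: float) -> Bid | None:
    """Reverse auction: the tenant sets a ceiling price and the lowest
    qualifying provider bid wins the lease."""
    qualifying = [b for b in bids if b.price_per_block <= max_price]
    return min(qualifying, key=lambda b: b.price_per_block) if qualifying else None

# Tenant will pay at most 0.10 per block; provider-b undercuts the field.
bids = [Bid("provider-a", 0.12), Bid("provider-b", 0.08), Bid("provider-c", 0.15)]
print(match_order(bids, max_price=0.10))  # Bid(provider='provider-b', ...)
```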

While decentralized compute networks show promise, challenges and barriers to adoption persist. The demand for high-end GPUs remains strong, and decentralized compute platforms need to attract both demand and supply. Projects like Akash are implementing strategies, such as increasing emissions for GPU suppliers, to incentivize supply. Teams are also working on proof-of-concept models and real-time demonstrations to showcase the capabilities of their networks. Akash, for instance, has introduced chatbot and image generation offerings using its GPUs, while io.net is developing network functionalities to mimic traditional GPU data centers.

The decentralized compute landscape is evolving rapidly, and it’s exciting to witness how these initiatives shape the future of AI compute. 

Decentralized Machine Learning Training

In addition to generalized compute platforms catering to AI needs, a cohort of specialized AI GPU providers focused on machine learning model training is emerging. An exemplar in this domain is Gensyn, which aims to “coordinate electricity and hardware to build collective intelligence” on the premise that anyone with suitable hardware should be able to participate in model training.

The protocol involves four primary actors: submitters, solvers, verifiers, and whistleblowers. Submitters propose tasks to the network, specifying training requirements, model details, and training data, and pay an upfront fee at submission based on the compute solvers are estimated to need. Solvers execute the training tasks, submitting completed work to verifiers for validation, while whistleblowers ensure the honesty of verifiers. To incentivize whistleblower participation, Gensyn plans periodic rewards for catching purposefully incorrect proofs.
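A rough Python sketch of how those four roles could hand off to one another. The types and function names are hypothetical, chosen only to make the division of labor explicit; this is not Gensyn's implementation.

```python
from dataclasses import dataclass

@dataclass
class Task:
    requirements: str          # training spec posted by the submitter
    fee: float                 # upfront payment, sized to estimated compute
    proof: str | None = None   # solver's claimed proof of completed training
    verified: bool = False
    disputed: bool = False

def submit(requirements: str, estimated_compute: float, fee_rate: float) -> Task:
    """Submitter: posts the task and pays up front for estimated compute."""
    return Task(requirements, fee=estimated_compute * fee_rate)

def solve(task: Task, proof: str) -> None:
    """Solver: runs the training job and attaches a proof for validation."""
    task.proof = proof

def verify(task: Task, proof_checks_out: bool) -> None:
    """Verifier: validates the solver's proof (spot checks, not a full rerun)."""
    task.verified = proof_checks_out

def blow_whistle(task: Task, verifier_was_honest: bool) -> None:
    """Whistleblower: challenges dishonest verifications for a reward."""
    task.disputed = not verifier_was_honest
```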

Beyond providing compute for AI workloads, Gensyn’s key value lies in its verification system, still under development. The verification process is crucial to ensuring GPU providers perform external computations correctly, and Gensyn uses innovative methods such as “Probabilistic proof-of-learning, Graph-based pinpoint protocol, and Truebit-style incentive games.” This optimistic solving mode allows verifiers to confirm a solver’s correct execution without full rerunning, a more efficient approach.
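The efficiency gain of optimistic verification comes from checking samples rather than everything. The toy Python sketch below recomputes a few randomly chosen checkpoint transitions instead of replaying the whole job; the checkpoint scheme is invented for illustration and is far simpler than Gensyn's actual proof-of-learning.

```python
import random

def spot_check(checkpoints: list[bytes], recompute_segment, k: int = 2) -> bool:
    """Recompute k randomly chosen segments between saved checkpoints and
    compare against the solver's claimed results, instead of replaying the
    entire training run."""
    n = len(checkpoints) - 1
    for i in random.sample(range(n), k=min(k, n)):
        if recompute_segment(i, checkpoints[i]) != checkpoints[i + 1]:
            return False  # mismatch: grounds for a Truebit-style dispute game
    return True

# Toy usage: "training" is a counter that increments once per segment.
chk = [i.to_bytes(4, "big") for i in range(6)]
print(spot_check(chk, lambda i, c: (int.from_bytes(c, "big") + 1).to_bytes(4, "big")))
```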

Gensyn claims cost-effectiveness compared to centralized alternatives, offering ML training up to 80% cheaper than AWS and outperforming competitors like Truebit in testing.

Whether these promising results can be replicated at scale across a decentralized network remains uncertain. Gensyn aims to utilize excess compute from diverse providers, including small data centers, retail users, and potentially smaller devices like mobile phones in the future. However, relying on heterogeneous compute providers introduces new challenges, as Gensyn acknowledges. Unlike centralized providers where compute is expensive but communication is cheap, Gensyn reduces compute costs by enabling worldwide GPU contributions but increases communication costs coordinating jobs across varied hardware locations. While still in a proof-of-concept stage, Gensyn provides insight into the potential of decentralized machine learning training protocols.

Decentralized Generalized Intelligence

Decentralized compute platforms are also enabling new approaches to creating artificial intelligence, and Bittensor, a decentralized compute protocol built on Substrate, stands out as a leading player in this evolution. Launched in 2021, Bittensor seeks to transform AI into a collaborative endeavor by decentralizing and commodifying its generation, fostering continual iteration toward better AI outcomes.

Inspired by Bitcoin, Bittensor features a finite supply of its native currency TAO and employs a four-year halving cycle, embracing a “Proof of Intelligence” mechanism. Instead of performing Proof of Work, miners run models that produce outputs in response to inference requests.

Bittensor Inference Process

Bittensor’s initial model, Mixture of Experts (MoE), aimed to enhance outputs by tapping into diverse models depending on the input type. However, challenges related to proper routing, message synchronization, and incentivization prompted a temporary sidelining of this approach until further project development.

The Bittensor network comprises validators and miners. Validators submit tasks, evaluate miner outputs, and rank them based on response quality, earning TAO emissions according to their vtrust scores, which reflect alignment with other validators. Miners run machine learning models, competing to provide accurate outputs and earning more TAO emissions with higher accuracy.

The interaction between validators (using Proof of Stake) and miners (using Proof of Model, a form of Proof of Work) forms the Yuma Consensus, encouraging miners to produce superior outputs for TAO rewards and validators to accurately rank miner outputs, enhancing their vtrust scores and TAO rewards.
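A toy numerical sketch of this kind of stake-weighted ranking, in Python with NumPy. It is not Bittensor's actual Yuma Consensus algorithm, only an illustration of the incentive shape: miners earn in proportion to stake-weighted scores, and validators whose rankings deviate from consensus earn less.

```python
import numpy as np

# Rows = validators, columns = miners; entries are each validator's score
# of each miner's outputs. The third validator is an outlier.
scores = np.array([
    [0.9, 0.1, 0.5],
    [0.8, 0.2, 0.6],
    [0.2, 0.9, 0.4],
])
stake = np.array([100.0, 80.0, 20.0])  # each validator's staked TAO

# Miners earn in proportion to the stake-weighted consensus score.
consensus = stake @ scores / stake.sum()
miner_emissions = consensus / consensus.sum()

# A validator's trust here is its agreement with consensus; deviating
# validators see their share of emissions shrink.
agreement = 1.0 - np.abs(scores - consensus).mean(axis=1)
validator_emissions = agreement * stake / (agreement * stake).sum()

print(miner_emissions.round(3), agreement.round(3))
```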

Bittensor’s progression involves the introduction of subnets through its Revolution upgrade: individual networks that incentivize specific behaviors. Subnets earn TAO based on performance, and as they mature they create application integrations, paving the way for developers to build applications that query specific subnets. While Bittensor faces challenges, such as miners attempting to exploit the network for more TAO rewards and ongoing governance refinements, it is gaining recognition for its unique paradigm, aiming to evolve into an agnostic platform capable of incentivizing various behaviors, powered by TAO. Practical constraints and the demonstration of scalability and incentives remain key considerations in achieving this ambitious vision.

Constructing a Decentralized Compute Stack for AI Models

The preceding sections offer a comprehensive overview of the diverse decentralized artificial intelligence compute protocols in development. While still early in their evolutionary journey, these protocols lay the groundwork for an ecosystem that could potentially give rise to “AI building blocks,” akin to DeFi’s “Money Legos” concept. The inherent composability of permissionless blockchains allows each protocol to leverage others, fostering a more integrated and decentralized artificial intelligence landscape.

As an illustration of potential interactions, consider how Akash, Gensyn, and Bittensor might collaborate in responding to an inference request.

Decentralized AI Stack

It’s crucial to note that this serves as a speculative example of future possibilities, not a representation of the present ecosystem, existing partnerships, or probable outcomes. Current constraints on interoperability and other factors limit integration opportunities. Issues like liquidity fragmentation and the necessity for multiple tokens can also impact the user experience negatively, a concern acknowledged by founders of Akash and Bittensor.

Decentralized Offerings Beyond Compute

Beyond compute, various decentralized infrastructure services are emerging to support the burgeoning AI ecosystem in the crypto space. While providing an exhaustive list is beyond this report’s scope, a few noteworthy examples include:

  1. Ocean: A decentralized data marketplace enabling users to create data NFTs, allowing monetization and greater control over data while providing AI teams access for model development.
  2. Grass: A decentralized bandwidth marketplace where users sell excess bandwidth to AI companies, enriching data scraping capabilities and enabling individuals to monetize their bandwidth.
  3. HiveMapper: Developing a decentralized maps offering based on information collected from everyday drivers, utilizing AI to interpret the imagery and rewarding users with tokens for refining the AI model through human feedback.

Collectively, these initiatives hint at the vast opportunities for exploring decentralized marketplace models that support AI models or the surrounding infrastructure essential for their development. Presently in the proof-of-concept stage, these projects require more research and development to demonstrate scalability for comprehensive AI services.

Outlook

Decentralized compute offerings are still in early development, beginning to offer access to cutting-edge compute capable of training powerful AI models. To secure meaningful market share, they must demonstrate practical advantages over centralized alternatives. Potential catalysts for broader adoption include:

  1. GPU Supply/Demand: Escalating GPU scarcity and heightened compute demand create an opportune environment for decentralized compute providers like Akash and Gensyn, offering cost-competitive alternatives for high-performance compute.
  2. Regulation: Regulatory challenges persist for decentralized compute, necessitating clear guidelines and controls to address potential risks. The emergence of KYC-compliant platforms may provide a short-term solution, and discussions around new regulatory frameworks pose challenges for GPU access.
  3. Censorship: Actions to limit AI access could benefit decentralized compute platforms, especially amid discussions on AI regulation and the risks associated with centralized decision-making for powerful AI models.
  4. Data Privacy: Integrated with data privacy solutions, decentralized compute becomes attractive, offering autonomy over user data. Advances in secure enclaves and homomorphic encryption could further enhance privacy in decentralized compute.
  5. User Experience (UX): Improving UX remains critical for broader adoption of decentralized compute. Streamlining onboarding processes, abstracting blockchain interactions, and delivering high-quality outputs are essential for overcoming barriers and enhancing user experience.

As decentralized compute evolves, addressing these factors will be pivotal in determining its trajectory and impact on the broader AI landscape.

Smart Contracts & zkML: Transforming Blockchain with Machine Learning

Smart contracts stand as foundational elements within any blockchain ecosystem. Operating based on predefined conditions, they automate processes and eliminate the need for trust in a third party, enabling the creation of intricate decentralized applications, as evident in the realm of Decentralized Finance (DeFi). While current smart contracts exhibit utility in static environments, their functionality limitations arise when dealing with dynamic situations where risks are in constant flux. Updating these contracts to accommodate changes in risk tolerance becomes challenging, especially for decentralized autonomous organizations (DAOs) relying on decentralized governance processes, as they might struggle to adapt swiftly to evolving systemic risks.

Integrating machine learning models into smart contracts presents a potential avenue to enhance functionality, security, and efficiency while providing a superior user experience. Nonetheless, this integration introduces additional risks, particularly regarding the exploitation of the underlying models supporting these smart contracts, as well as challenges in handling longtail situations due to limited data inputs.

Zero Knowledge Machine Learning (zkML)

The resource-intensive nature of running complex machine learning models makes it impractical to directly execute them inside smart contracts due to high associated costs. DeFi protocols, for example, offering access to yield-optimizing models, face challenges running such models on-chain without incurring prohibitively high gas fees. One approach to overcome this hurdle is to increase the computational power of the underlying blockchain. However, this might compromise decentralization properties by escalating demands on the chain’s validator set. Alternatively, some projects explore the use of zkML to verify outputs in a trustless manner without requiring intensive on-chain computation.

zkML becomes particularly valuable when a user needs someone else to run data through a model and verify that their counterpart executed the correct model. For instance, a developer using a decentralized compute provider to train models may want to ensure the provider isn’t cutting costs by using a cheaper model with nearly indistinguishable output. zkML allows the compute provider to run the data through their models, generate a proof, and verify on-chain that the model’s output for the given input is correct. This flexibility extends to cases where a user wants to run a model using their data without sharing it, verifying the correctness of the model execution with a proof, thus addressing compute restrictions.
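The shape of that interaction can be sketched in a few lines of Python. Here a hash stands in for the zero-knowledge proof, so the example shows only the control flow (commit to a model, prove an output, verify without re-running inference) and none of the real cryptography.

```python
import hashlib, json

def commit(weights: list[float]) -> str:
    """Commitment binding the proof to one exact model: a hash of its weights."""
    return hashlib.sha256(json.dumps(weights).encode()).hexdigest()

def model(weights: list[float], x: list[float]) -> float:
    """Stand-in model: a dot product."""
    return sum(w * xi for w, xi in zip(weights, x))

def prove(weights: list[float], x: list[float]):
    """Provider side: run inference, then emit a 'proof'. A real zkML stack
    would produce a succinct zero-knowledge proof here; this hash has none
    of its soundness and only makes the control flow visible."""
    y = model(weights, x)
    proof = hashlib.sha256(f"{commit(weights)}:{x}:{y}".encode()).hexdigest()
    return y, commit(weights), proof

def verify(commitment: str, x: list[float], y: float, proof: str) -> bool:
    """Verifier side (on-chain in the real setting): accepts or rejects the
    claimed output without ever re-running the model."""
    return proof == hashlib.sha256(f"{commitment}:{x}:{y}".encode()).hexdigest()

y, c, p = prove([0.5, -1.0], [2.0, 3.0])
assert verify(c, [2.0, 3.0], y, p)
```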

Infrastructure and Tooling

In the early stages of zkML development, the primary focus revolves around constructing infrastructure and tools for teams to convert models and outputs into verifiable proofs on-chain. Products like EZKL and Giza contribute to this effort by providing verifiable proofs of machine learning model execution. These tools aim to abstract the zero-knowledge aspect of development, facilitating the integration of machine learning models into smart contracts without extensive knowledge of zero-knowledge proofs.

EZKL, an open-source project, utilizes zk-SNARKs, while Giza, a closed-source project, produces zk-STARKs. Both projects, currently EVM compatible, leverage the Open Neural Network Exchange (ONNX) to transform machine learning models written in popular frameworks like TensorFlow and PyTorch into a standard format. They ensure that these models produce zk-proofs when executed. EZKL, in particular, has made significant progress in improving cost efficiency, security, and proof generation speed, demonstrating advancements in reducing aggregate proof time and introducing new software solutions like Lilith for enhanced performance.
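As a concrete example of the ONNX step, the snippet below exports a small PyTorch model into the ONNX format that tools like EZKL consume; the subsequent circuit compilation and proof generation are tool-specific and omitted here.

```python
import torch
import torch.nn as nn

# A small model of the sort one might want verifiable inference for.
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 1))
model.eval()

# An example input fixes the graph's shapes during tracing.
dummy_input = torch.randn(1, 4)
torch.onnx.export(model, dummy_input, "model.onnx")
# model.onnx is the standard artifact that zkML tooling such as EZKL
# ingests to compile the network into a provable circuit.
```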

Modulus Labs takes a different approach by developing a new zk-proof technique tailored for AI models. The closed-source Remainder, introduced by Modulus, focuses on reducing costs and proving time for AI models, aiming to make it economically viable for projects to integrate models into their smart contracts at scale. While closed-source, the work from Modulus Labs has garnered attention, reflecting the ongoing exploration of diverse approaches in the zkML space.

Coprocessors

Projects labeled as “coprocessors,” including RiscZero, Axiom, and Ritual, contribute to the development of zero-knowledge virtual machines. These networks abstract the zero-knowledge proof generation process, creating virtual machines capable of executing programs off-chain and generating proofs for on-chain verification. RiscZero and Axiom serve as more general-purpose coprocessors, while Ritual specializes in supporting AI models.

Infernet, a manifestation of Ritual, offers an SDK that enables developers to submit inference requests to the network and receive outputs and proofs in return. The Infernet Node processes these requests off-chain, verifying outputs on-chain. Ritual aims to become a foundational infrastructure layer, the Ritual Superchain, where various projects can plug in as service providers, fostering interoperability. Integrations with other providers, such as EZKL, showcase the potential for enhanced collaboration within the zkML space.

Applications

While zkML is still in the experimental phase, with a predominant focus on infrastructure development and proof-of-concepts, potential applications are emerging across different sectors:

  1. Decentralized Finance (DeFi): zkML broadens the design space for DeFi by enhancing smart contract capabilities. Partnerships between Giza and Yearn Finance, Modulus Labs and Lyra Finance, NOYA leveraging EZKL, and Mozaic providing access to off-chain models illustrate the incorporation of zkML into DeFi protocols. Applications range from automated risk assessment engines to credit scoring mechanisms for predicting borrower default probabilities.
  2. Gaming: On-chain gaming with artificial intelligence becomes feasible through zkML. Proof-of-concepts include simple games like Leela vs. the World, a chess game where users compete against an AI chess model with zkML verifying the model’s moves. Cartridge, utilizing Giza, explores fully on-chain games, showcasing an AI driving game as a step toward more complex implementations.
  3. Identity, Provenance, and Privacy: zkML contributes to verifying authenticity, combating AI-generated content, and enhancing privacy. Projects like Worldcoin use zkML to create proof-of-personhood solutions, relying on biometric IDs self-custodied on personal devices. Privacy-preserving applications include medical data analysis, dating algorithms, and identity verification for insurance and loan agencies.

Outlook

The zkML landscape is evolving, with ongoing challenges including computational costs, memory limitations, model complexity, limited tooling, and the need for specialized developer expertise. While there’s substantial work ahead before zkML can be implemented at a consumer product scale, it holds immense promise as a critical component for the seamless integration of AI and crypto.

As the field matures and addresses existing limitations, zkML is poised to play a pivotal role in unlocking new possibilities for on-chain AI applications. The ability to bring off-chain compute of any size on-chain, coupled with robust security assurances, positions zkML as a key enabler for innovative applications that bridge the gap between blockchains and AI. Early adopters will continue to navigate trade-offs between zkML’s privacy and security features and the efficiency of alternative solutions until the technology matures further.

AI Agents: Transforming Crypto Interactions

The evolving synergy between AI and crypto is exemplified by the burgeoning experimentation with AI Agents. These autonomous entities, driven by AI models, have the capability to interpret, execute tasks, and manage functions seamlessly. The spectrum of applications spans from personalized virtual assistants tailored to individual preferences to financial agents adept at portfolio management and adjustment based on risk preferences.

The compatibility of Agents with crypto is rooted in the permissionless and trustless nature of crypto transactions. Once trained, Agents equipped with wallets can independently transact with smart contracts. Presently, rudimentary Agents scour the internet for information and execute trades on prediction markets based on predefined models.
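The "Agent with a wallet" idea reduces to something like the sketch below, using the real eth-account library: a policy function decides whether to act, then signs a transaction from the agent's own key with no human in the loop. The key, address, threshold, and calldata are all placeholders.

```python
from eth_account import Account

AGENT_KEY = "0x" + "11" * 32  # demo key only; a real agent guards its key

def agent_decide_and_sign(market_signal: float):
    """Hypothetical policy: act only when the signal clears a threshold,
    then sign the transaction autonomously from the agent's own wallet."""
    if market_signal < 0.5:
        return None  # policy says: do nothing
    tx = {
        "to": "0x" + "22" * 20,  # placeholder contract address
        "value": 0,
        "data": "0x",            # calldata the agent would encode for its action
        "nonce": 0,
        "gas": 21000,
        "gasPrice": 10**9,
        "chainId": 1,
    }
    return Account.from_key(AGENT_KEY).sign_transaction(tx)

signed = agent_decide_and_sign(market_signal=0.8)
print(signed.hash.hex() if signed else "no action")
```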

Leading AI Agent Providers:

  1. Morpheus: Emerging as an open-source project on Ethereum and Arbitrum in 2024, Morpheus introduces a Smart Agent Protocol, a local and wallet-managed open-source LLM (Large Language Model). The protocol employs a Smart Contract Rank to guide Agents in determining safe interactions with smart contracts, considering factors such as transaction volume.
  2. Decentralized Autonomous Infrastructure Network (DAIN): Operating on Solana, DAIN focuses on establishing an agent-to-agent economy, facilitating seamless interaction between agents across diverse businesses via a universal API. The protocol aims to extend the capabilities of AI agents to interact with both web2 and web3 products.
  3. Fetch.AI: Pioneering the deployment of AI Agent protocols, Fetch.AI builds a comprehensive ecosystem for creating, deploying, and utilizing Agents on-chain using its native FET token and Fetch.AI wallet. The platform offers a suite of tools for interacting with and ordering agents.
  4. Autonolas: Positioned as an open marketplace for decentralized AI agents, Autonolas facilitates the creation and utilization of agents across multiple blockchains, including Polygon, Ethereum, Gnosis Chain, and Solana. Their toolkit empowers developers to construct AI agents hosted off-chain for applications like prediction markets and DAO governance.
  5. SingularityNET: Envisaging a decentralized marketplace for AI agents, SingularityNET provides a platform for deploying specialized AI agents capable of executing intricate tasks. Integration initiatives like AlteredStateMachine explore the intersection of AI Agents with NFTs, creating agents with randomized attributes for various applications.

Bitcoin and AI Agents: In a landmark development, Lightning Labs introduced the LangChain Bitcoin Suite in July 2023. Functioning on the Lightning Network, this suite facilitates agent-driven interactions, addressing the challenge of expensive API keys in web applications. The LangChain framework enables agents to buy, sell, hold Bitcoin, query API keys, and conduct micro-payments with minimal fees on the Lightning Network.

Outlook: While the AI Agents space is in its infancy, initial projects showcase functional agents handling simple tasks within their infrastructure. Ongoing developments hint at the potential for AI agents to substantially improve user experience across diverse domains. As transacting evolves from point-and-click to text-based interactions, users can seamlessly engage with on-chain agents through Large Language Models (LLMs), exemplified by projects like Dawn Wallet.

Challenges persist, including the integration of agents into the web2 landscape, regulatory considerations, and the need for broader acceptance of crypto as a form of payment. As the AI Agents ecosystem matures, it is poised to become a significant consumer of decentralized compute and zero-knowledge machine learning (zkML) solutions, operating autonomously to receive and solve tasks in a non-deterministic manner.

Conclusion

The integration of AI and crypto introduces transformative innovations, enhancing infrastructure development, user experience, and accessibility. While the current focus is on off-chain integrations, projects are laying the groundwork for a future where crypto and AI are intricately interconnected. Decentralized compute, zkML, and AI Agents represent pivotal components that promise to shape this future landscape, fostering a symbiotic relationship between crypto technologies and artificial intelligence.

Also read: Bank to the Future: Data, AI, and Blockchain Revolutionizing Money



