
AI+Web3: Towers and Squares


Reprinted from panewslab

12/17/2024

Author: Coinspire


►TL;DR

1. Web3 projects built around the AI concept have become gold magnets in both the primary and secondary markets.

2. Web3's opportunities in the AI industry lie in using distributed incentives to coordinate potential long-tail supply across data, storage and computing, and in building decentralized markets for open-source models and AI Agents.

3. AI is mainly used in the Web3 industry for on-chain finance (crypto payments, trading, data analysis) and to assist development.

4. The value of AI+Web3 lies in the complementarity of the two: Web3 is expected to counter AI centralization, and AI is expected to help Web3 reach a mainstream audience.

Introduction

In the past two years, AI development has accelerated. The butterfly wings flapped by ChatGPT not only opened up a new world of generative artificial intelligence, but also stirred up currents in Web3 on the other shore.

With the support of the AI concept, financing in the crypto market has been significantly boosted despite the broader slowdown. According to media statistics, in the first half of 2024 alone, 64 Web3+AI projects completed financing, and the AI-based operating system Zyber365 achieved the period's highest raise of US$100 million in its Series A.

The secondary market is even more prosperous. According to data from the crypto aggregation site CoinGecko, in just over a year the total market value of the AI track reached US$48.5 billion, with 24-hour trading volume close to US$8.6 billion. The boost from advances in mainstream AI technology is obvious: after OpenAI released its Sora text-to-video model, the average price of the AI sector rose 151%. The AI effect also spilled over into Meme, one of crypto's biggest money-magnet sectors: GOAT, the first MemeCoin built on the AI Agent concept, quickly became popular, reached a valuation of US$1.4 billion, and successfully set off an AI Meme craze.

Research and discussion on AI+Web3 are equally hot. From AI+DePin to AI Memecoins, and now AI Agents and AI DAOs, FOMO can barely keep up with the speed at which new narratives rotate.

AI+Web3, a pairing brimming with hot money, hype and visions of the future, is easily dismissed as an arranged marriage brokered by capital. It seems hard to tell what lies beneath this gorgeous robe: is it merely speculators' home turf, or the night before a real breakout?

To answer this question, a key consideration for both sides is: would each be better off without the other? Can each benefit from the other's model? In this article, we try to stand on the shoulders of our predecessors and examine this pattern: how can Web3 play a role at each layer of the AI technology stack, and what new vitality can AI bring to Web3?

Part.1 What are the opportunities for Web3 under the AI stack?

Before starting this topic, we need to understand the technology stack of large AI models:

[Figure: the technology stack of large AI models. Source: Delphi Digital]

To put the whole process in plainer language: the "large model" is like a human brain. In its early stages, this brain belongs to a newborn baby who must observe and absorb massive amounts of information from the outside world in order to understand it; this is the "collection" stage of data. Since computers lack the multiple human senses of sight and hearing, before training, the large-scale unlabeled information from the outside world must be converted through "preprocessing" into a format the computer can understand and use.

After the data is fed in, the AI builds a model with understanding and prediction capabilities through "training", which can be seen as the process by which the baby gradually understands and learns about the outside world. The model's parameters are like the baby's language ability, constantly adjusted during learning. When the learning content is divided into subjects, or when the baby communicates with others, receives feedback and makes corrections, it enters the "fine-tuning" phase of the large model.

Once children grow up and learn to speak, they can understand meaning and express their feelings and thoughts in new conversations. This stage resembles the "inference" of a large AI model: the model can predict and analyze new language and text inputs. Babies use language to express feelings, describe objects and solve problems, which is similar to how a large AI model, once trained and deployed, is applied to specific tasks in the inference stage, such as image classification, speech recognition and so on.

The AI Agent is closer to the next form of the large model: able to execute tasks independently and pursue complex goals. It not only can think, but can also remember, plan and use tools to interact with the world.

Currently, to address AI's pain points at each layer of the stack, Web3 has begun to form a multi-layered, interconnected ecosystem covering every stage of the AI model process.

1. Basic layer: an "Airbnb" for computing power and data

▎Computing power

Currently, one of the highest costs of AI is the computing power and energy required to train models and run inference.

As one example, Meta's LLaMA 3 required 16,000 NVIDIA H100 GPUs (a top-of-the-line graphics processing unit designed for artificial intelligence and high-performance computing workloads) running for 30 days to complete training. The 80GB version carries a unit price of US$30,000 to US$40,000, which implies an investment of roughly US$400-700 million in computing hardware (GPUs plus network chips). Meanwhile, a month of training consumes 1.6 billion kilowatt-hours of electricity, with energy expenditures of nearly US$20 million per month.
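A rough back-of-the-envelope check of the figures above (a sketch only; the GPU count, unit prices and energy figure are the article's estimates, and the electricity rate assumed here is illustrative):

```python
# Back-of-the-envelope check of the LLaMA 3 training-cost figures cited above.
# All inputs are rough estimates from the text; the electricity price is an assumption.

gpu_count = 16_000                                 # NVIDIA H100 GPUs
gpu_price_low, gpu_price_high = 30_000, 40_000     # USD per 80GB H100

hardware_low = gpu_count * gpu_price_low           # ~ $480M
hardware_high = gpu_count * gpu_price_high         # ~ $640M

monthly_energy_kwh = 1.6e9                         # kWh per month of training
assumed_price_per_kwh = 0.0125                     # USD/kWh (illustrative industrial rate)
monthly_energy_cost = monthly_energy_kwh * assumed_price_per_kwh   # ~ $20M

print(f"Hardware: ${hardware_low/1e6:.0f}M - ${hardware_high/1e6:.0f}M")
print(f"Energy per month: ${monthly_energy_cost/1e6:.0f}M")
```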

Relieving the pressure on AI computing power was also the earliest area where Web3 intersected with AI: DePin (decentralized physical infrastructure networks). The DePin Ninja data site currently lists more than 1,400 projects, among which representative GPU computing-power-sharing projects include io.net, Aethir, Akash, Render Network and others.

The main logic is this: the platform allows individuals or entities with idle GPU resources to contribute their computing power in a permissionless, decentralized way, raising the utilization rate of under-used GPU resources through an online marketplace of buyers and sellers similar to Uber or Airbnb, while end users obtain lower-cost, efficient computing resources. At the same time, a staking mechanism ensures that resource providers face corresponding penalties if they violate quality-control mechanisms or interrupt the network.

Its characteristics are:

  • Aggregating idle GPU resources: the suppliers are mainly surplus computing power from third-party independent small and medium-sized data centers, operators such as crypto mining farms, and mining hardware from PoS-consensus chains, such as FileCoin and Ethereum mining machines. There are also projects dedicated to lowering the entry threshold: for example, exolab uses local devices such as MacBooks, iPhones and iPads to build a computing network for running large-model inference.

  • Targeting the long-tail market of AI computing power:

a. "From a technical perspective" the decentralized computing power market is more suitable for the reasoning step. Training relies more on the data processing capabilities brought by ultra-large cluster-scale GPUs, while inference has relatively low GPU computing performance. For example, Aethir focuses on low-latency rendering work and AI inference applications.

b. "On the demand side" small and medium-sized computing power demanders will not train their own large models separately, but only choose to optimize and fine-tune around a few large models. These scenarios are naturally suitable for distributed idle computing resources.

  • Decentralized ownership: the technical significance of blockchain is that resource owners always retain control over their resources, can flexibly adjust based on demand, and earn revenue at the same time.
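A minimal sketch of the marketplace-plus-staking logic described above (illustrative only; the projects named earlier each use their own mechanisms, and every parameter here is an assumption):

```python
from dataclasses import dataclass

@dataclass
class Provider:
    address: str
    stake: float            # tokens locked as a quality-of-service bond
    reputation: float = 1.0

class GPUMarketplace:
    """Toy model of a permissionless GPU-sharing market with slashing."""
    SLASH_RATE = 0.2        # assumed fraction of stake burned on a violation

    def __init__(self):
        self.providers: dict[str, Provider] = {}

    def register(self, address: str, stake: float):
        # Anyone with idle GPUs can join by posting a stake.
        self.providers[address] = Provider(address, stake)

    def settle_job(self, address: str, price: float, passed_quality_check: bool):
        p = self.providers[address]
        if passed_quality_check:
            return {"provider_payout": price}       # provider earns the job fee
        # Violation (bad output / downtime): part of the stake is slashed.
        penalty = p.stake * self.SLASH_RATE
        p.stake -= penalty
        p.reputation *= 0.9
        return {"provider_payout": 0.0, "slashed": penalty}

market = GPUMarketplace()
market.register("0xProviderA", stake=1_000)
print(market.settle_job("0xProviderA", price=50, passed_quality_check=False))
```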

▎Data

Data is the foundation of AI. Without data, computation is useless, and the relationship between data and models is captured by the proverb "garbage in, garbage out": the quantity and quality of the input data determine the quality of the final model's output. For the training of current AI models, data determines the model's language ability, comprehension, and even its values and human-like behavior. Currently, AI's data-demand dilemma focuses on the following four aspects:

  • Data hunger: AI model training relies on massive data input. Public information shows that the number of parameters OpenAI used to train GPT-4 reached the trillion level.

  • Data quality: as AI integrates with various industries, new requirements are placed on data quality, including timeliness, diversity, the professionalism of vertical-domain data, and the intake of emerging data sources such as social media sentiment.

  • Privacy and compliance issues: countries and companies are gradually recognizing the importance of high-quality data sets and are imposing restrictions on data-set crawling.

  • High data-processing costs: data volumes are large and processing is complex. Public information shows that AI companies spend more than 30% of their R&D costs on basic data collection and processing.

Currently, Web3 solutions are reflected in the following four aspects:

1. Data collection: the supply of real-world data that can be scraped for free is rapidly being exhausted, and AI companies' spending on data rises year by year. Yet this spending does not flow back to the data's real contributors; the platforms alone enjoy the value created by the data. For example, Reddit earned a total of US$203 million in revenue through data-licensing agreements signed with AI companies.

The vision of Web3 is to let the users who truly contribute also share in the value created by their data, and to obtain more private and more valuable data from users at low cost through distributed networks and incentive mechanisms.

  • For example, Grass is a decentralized data layer and network: users can run Grass nodes, contribute idle bandwidth and relay traffic to capture real-time data from across the entire Internet, and receive token rewards;

  • Vana introduces a unique data liquidity pool (DLP) concept: users can upload their private data (such as shopping records, browsing habits, social media activity, etc.) to a specific DLP and flexibly choose whether to authorize specific third parties to use it;

  • In PublicAI, users can post on X with #AI or #Web3 as classification tags and @PublicAI to contribute data for collection.

2. Data preprocessing: in AI's data pipeline, the collected data is usually noisy and error-ridden, so it must be cleaned and converted into a usable format before model training, which involves normalization, filtering, and the handling of duplicates and missing values. This stage is one of the few manual steps in the AI industry and has spawned the profession of data annotator. As models' data-quality requirements rise, so does the bar for data annotators, and this task is naturally suited to Web3's decentralized incentive mechanisms.

  • Currently, both Grass and OpenLayer are considering adding data annotation as a key step.

  • Synesis has proposed the concept of "Train2earn", emphasizing data quality; users can earn rewards by providing labeled data, annotations or other forms of input.

  • The data-labeling project Sapien gamifies labeling tasks and lets users stake points to earn more points.
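To make the preprocessing step above concrete, a minimal cleaning sketch covering the tasks mentioned (de-duplication, filtering, missing values, normalization); the column names and toy data are illustrative:

```python
import pandas as pd

# Toy raw dataset with the typical problems mentioned above:
# duplicates, missing values, and unnormalized numeric scales.
raw = pd.DataFrame({
    "text":  ["hello", "hello", None, "web3 + ai"],
    "score": [10.0, 10.0, 3.0, None],
})

df = raw.drop_duplicates()                              # remove duplicate rows
df = df.dropna(subset=["text"])                         # filter rows with missing text
df["score"] = df["score"].fillna(df["score"].mean())    # impute missing values

# Min-max normalization of the numeric column.
lo, hi = df["score"].min(), df["score"].max()
df["score_norm"] = (df["score"] - lo) / (hi - lo) if hi > lo else 0.0

print(df)
```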

3. Data privacy and security: it should be clarified that data privacy and data security are two different concepts. Data privacy concerns the handling of sensitive data, while data security protects data from unauthorized access, destruction and theft. The advantages and potential application scenarios of Web3 privacy technology are therefore reflected in two aspects: (1) training on sensitive data; (2) data collaboration: multiple data owners can jointly participate in AI training without sharing their raw data.

Currently, the more common privacy technologies in Web3 include:

  • Trusted Execution Environment (TEE), such as Super Protocol;

  • Fully Homomorphic Encryption (FHE), such as BasedAI, Fhenix.io or Inco Network;

  • Zero-knowledge technologies (ZK): for example, Reclaim Protocol uses zkTLS technology to generate zero-knowledge proofs of HTTPS traffic, allowing users to securely import activity, reputation and identity data from external websites without exposing sensitive information.

However, the field is still in its early stages and most projects remain exploratory. One current dilemma is that computational costs are too high. For example:

  • The zkML framework EZKL takes approximately 80 minutes to generate a proof for a nanoGPT model with 1M parameters.

  • According to Modulus Labs, the overhead of zkML is more than 1,000 times that of pure computation.

4. Data storage: once you have the data, you also need somewhere on chain to store it, as well as the LLMs produced from that data. With data availability (DA) as the core issue, before Ethereum's Danksharding upgrade its throughput was about 0.08 MB per second, while training and real-time inference of AI models typically require 50 to 100 GB of data throughput per second. This orders-of-magnitude gap leaves existing on-chain solutions unable to cope with "resource-intensive AI applications."

  • 0g.AI is the representative project in this category. It is a decentralized storage solution designed for AI's high-performance requirements. Its key features are high performance and scalability: through advanced sharding and erasure-coding technologies, it supports fast uploading and downloading of large-scale data sets, with data transfer speeds approaching 5 GB per second.
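A quick arithmetic sketch of the throughput gap cited above, assuming both figures are per-second rates (the numbers are simply those quoted in the text):

```python
# Orders-of-magnitude gap between pre-Danksharding Ethereum DA throughput
# and typical AI training / real-time inference requirements (figures from above).
eth_da_mb_per_s = 0.08
ai_need_gb_per_s_low, ai_need_gb_per_s_high = 50, 100

gap_low = ai_need_gb_per_s_low * 1024 / eth_da_mb_per_s     # ~ 640,000x
gap_high = ai_need_gb_per_s_high * 1024 / eth_da_mb_per_s   # ~ 1,280,000x
print(f"AI workloads need roughly {gap_low:,.0f}x to {gap_high:,.0f}x more throughput")
```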

2. Middleware: model training and inference

▎A decentralized market for open-source models

The debate over whether AI models should be closed or open source has never gone away. The collective innovation that open source brings is an advantage closed models cannot match; yet without a profit model, how can open-source models strengthen developer motivation? This is a direction worth pondering. Baidu founder Robin Li asserted in April this year that "open-source models will fall further and further behind."

In this regard, Web3 proposes the possibility of a decentralized open-source model market: tokenize the model itself, reserve a certain proportion of tokens for the team, and direct part of the model's future revenue to token holders.

  • For example, the Bittensor protocol establishes a P2P market of open-source models consisting of dozens of "subnets", in which resource providers (computing, data collection/storage, machine-learning talent) compete to meet the goals of specific subnet owners, and individual subnets can interact and learn from one another, achieving greater intelligence. Rewards are allocated by community voting and further distributed among subnets according to competitive performance.

  • ORA introduces the concept of an Initial Model Offering (IMO) to tokenize AI models, which can then be bought, sold and developed through a decentralized network.

  • Sentient, a decentralized AGI platform, encourages contributors to collaboratively build, replicate and extend AI models, and rewards them for their contributions.

  • Spectral Nova focuses on the creation and application of AI and ML models.
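To make the "tokenize the model, share future revenue with holders" idea above concrete, a minimal pro-rata distribution sketch (not the mechanism of any specific project; all balances and revenue figures are illustrative):

```python
# Toy pro-rata revenue split for a tokenized open-source model.
# Token balances and the revenue figure are illustrative, not from any real project.
token_balances = {
    "team":     200_000,   # tokens reserved for the team
    "holder_a": 500_000,
    "holder_b": 300_000,
}
total_supply = sum(token_balances.values())

def distribute(model_revenue: float) -> dict[str, float]:
    """Split one period's model revenue pro rata among token holders."""
    return {who: model_revenue * bal / total_supply
            for who, bal in token_balances.items()}

print(distribute(model_revenue=10_000))  # e.g. inference fees earned this period
```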

▎Verifiable inference

For the "black box" problem in the AI ​​inference process, the standard Web3 solution is to have multiple verifiers repeat the same operation and compare the results. However, due to the current shortage of high-end "Nvidia chips", this approach faces obvious challenges. AI inference is expensive.

A more promising solution is to produce ZK proofs for off-chain AI inference computations. A zero-knowledge proof is a cryptographic protocol in which one party, the prover, can prove to another party, the verifier, that a given statement is true without revealing any additional information beyond the fact that the statement is true. This allows permissionless, on-chain verification of AI model computations: it requires cryptographically proving on chain that off-chain computations were completed correctly (for example, that the data set has not been tampered with) while keeping all data confidential.

Key benefits include:

  • Scalability: zero-knowledge proofs can quickly confirm large volumes of off-chain computation. Even as the number of transactions grows, a single zero-knowledge proof can verify them all.

  • Privacy Protection: Data and AI model details remain private, while all parties can verify that the data and models have not been compromised.

  • Trustless: Computations can be confirmed without relying on centralized parties.

  • Web2 integration: by definition, Web2 runs off-chain, so verifiable inference can help bring its data sets and AI computations on-chain, which helps increase Web3 adoption.

Currently, Web3's technologies for verifiable inference are as follows:

  • zkML: combines zero-knowledge proofs with machine learning to keep data and models private and confidential, allowing verifiable computation without revealing certain underlying properties. For example, Modulus Labs has released a ZK prover built for AI on the basis of zkML, which effectively checks on chain whether an AI provider executed its algorithm correctly; its current customers are mostly on-chain DApps.

  • opML: uses the optimistic-rollup principle to improve the scalability and efficiency of ML computation by verifying only when disputes arise. In this model, only a small fraction of the results produced by "verifiers" actually needs to be checked, while the economic penalty is set high enough to raise the cost of cheating, thereby saving redundant computation.

  • TeeML: uses a trusted execution environment to securely perform ML computations, protecting data and models from tampering and unauthorized access.
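As an illustration of the optimistic (opML-style) flow described above, a heavily simplified sketch: results are accepted by default and only re-executed when someone disputes them. The model, bond size and slashing rule are made up for illustration.

```python
import hashlib, json

def run_model(inputs):
    # Stand-in for an off-chain ML inference; deterministic for this sketch.
    return sum(inputs) % 97

def commit(result) -> str:
    return hashlib.sha256(json.dumps(result).encode()).hexdigest()

class OptimisticMLVerifier:
    """Toy opML-style flow: claims are accepted optimistically and only
    re-executed when a dispute is raised; a wrong claim loses its bond."""
    def __init__(self, bond: float = 100.0):
        self.bond = bond          # stake posted by the submitter
        self.claims = {}          # claim_id -> (inputs, claimed_commitment)

    def submit(self, claim_id: str, inputs, claimed_result):
        self.claims[claim_id] = (inputs, commit(claimed_result))

    def dispute(self, claim_id: str):
        inputs, claimed = self.claims[claim_id]
        honest = commit(run_model(inputs))        # re-execute only on dispute
        if honest == claimed:
            return {"claim_valid": True}
        return {"claim_valid": False, "slashed_bond": self.bond}

v = OptimisticMLVerifier()
v.submit("job-1", inputs=[1, 2, 3], claimed_result=999)   # a wrong claim
print(v.dispute("job-1"))                                 # claim invalid, bond slashed
```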

3. Application layer: AI Agent

The current development of AI shows a shift in focus from model capabilities to AI Agents. Technology companies such as OpenAI, AI large-model unicorn Anthropic, and Microsoft have all turned to developing AI Agents in an attempt to break through the current plateau in LLM technology.

OpenAI defines an AI Agent as a system driven by an LLM as its brain, with the ability to autonomously perceive, plan, remember and use tools, and to execute complex tasks automatically. When AI shifts from being a tool that is used to being a subject that can use tools, it becomes an AI Agent. This is why the AI Agent can become the most ideal intelligent assistant for human beings.
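A minimal sketch of such an agent loop (perceive, plan, remember, call tools). The llm_plan function and the tools here are hypothetical placeholders, not any particular vendor's API:

```python
# Toy agent loop: an LLM "brain" plans, calls tools, and keeps memory.
# `llm_plan` and the tool functions are hypothetical stand-ins.

def llm_plan(goal: str, memory: list[str]) -> dict:
    # Stand-in for a real LLM call that decides the next action.
    if not memory:
        return {"tool": "search", "args": {"query": goal}}
    return {"tool": "finish", "args": {"answer": memory[-1]}}

def search(query: str) -> str:
    return f"(fake search results for: {query})"

TOOLS = {"search": search}

def run_agent(goal: str, max_steps: int = 5) -> str:
    memory: list[str] = []
    for _ in range(max_steps):
        action = llm_plan(goal, memory)                 # plan the next step
        if action["tool"] == "finish":
            return action["args"]["answer"]
        observation = TOOLS[action["tool"]](**action["args"])   # use a tool
        memory.append(observation)                      # remember the result
    return "gave up"

print(run_agent("summarize today's AI+Web3 news"))
```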

And what can Web3 bring to AI Agents?

▎Decentralization

The decentralized nature of Web3 can make Agent systems more decentralized and autonomous. Establishing incentive and penalty mechanisms for stakers and delegators through mechanisms such as PoS and DPoS can promote the democratization of Agent systems. GaiaNet, Theoriq and HajimeAI have all made attempts in this direction.

▎Cold start

The development and iteration of AI Agents often require substantial financial support, and Web3 can help promising AI Agent projects secure early financing and a cold start.

  • Virtuals Protocol has launched fun.virtuals, an AI Agent creation and token-issuance platform on which any user can deploy an AI Agent with one click and achieve a 100% fair launch of the AI Agent's token.

  • Spectral has proposed a product concept supporting the on-chain issuance of AI Agent assets: through an IAO (Initial Agent Offering), an AI Agent can raise funds directly from investors, who in turn join DAO governance and gain the opportunity to participate in the project's development and share in its future earnings.

Part.2 How does AI empower Web3?

The impact of AI on Web3 projects is obvious: it benefits blockchain technology by optimizing on-chain operations such as smart-contract execution, liquidity optimization and AI-driven governance decisions, while also providing better data-driven insights, improving on-chain security, and laying the foundation for new Web3-based applications.

1. AI and on-chain finance

▎AI and crypto economy

On August 31, Coinbase CEO Brian Armstrong announced the first AI-to-AI crypto transaction on the Base network, saying that AI Agents can now use US dollars on Base to transact with humans, merchants or other AIs, and that these transactions are instant, global and free.

Beyond payments, Virtuals Protocol's Luna demonstrated for the first time how an AI Agent can autonomously execute on-chain transactions, which attracted wide attention. As intelligent entities that can perceive their environment, make decisions and take actions, AI Agents are regarded as the future of on-chain finance. At present, the potential scenarios for AI Agents include the following:

1. Information collection and prediction: helping investors gather exchange announcements, public project information, panic sentiment, public-opinion risks and other information, analyzing and evaluating asset fundamentals and market conditions in real time, and predicting trends and risks.

2. Asset management: providing users with suitable investment targets, optimizing portfolios, and automatically executing trades.

3. Financial experience: helping investors choose the fastest on-chain transaction route, automating manual operations such as cross-chain transfers and gas-fee adjustment, and lowering the threshold and cost of on-chain financial activities.

Imagine a scenario where you give the AI Agent the following instruction: "I have 1,000 USDT; please find me the combination with the highest return and a lock-up period of no more than a week." The AI Agent might reply: "The recommended initial allocation is 50% in A, 20% in B, 20% in X and 10% in Y. I will monitor the interest rates, watch for changes in their risk levels and rebalance if necessary." In addition, hunting for potential airdrop projects, as well as Memecoin projects showing signs of community traction, are things AI Agents may be able to achieve in the future.
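A minimal sketch of what executing such an instruction could look like; the pools, yields and rebalancing rule are entirely made up for illustration:

```python
# Toy allocation/rebalance logic for the "1,000 USDT, <= 1 week lock-up" instruction.
# Pools, APYs and the risk rule are illustrative assumptions, not real products.
pools = {
    "A": {"apy": 0.12, "lockup_days": 7, "risk": "low"},
    "B": {"apy": 0.20, "lockup_days": 3, "risk": "medium"},
    "X": {"apy": 0.35, "lockup_days": 1, "risk": "high"},
    "Y": {"apy": 0.50, "lockup_days": 0, "risk": "high"},
}
target_weights = {"A": 0.50, "B": 0.20, "X": 0.20, "Y": 0.10}
budget_usdt = 1_000

# Respect the user's lock-up constraint before allocating.
eligible = {k: v for k, v in pools.items() if v["lockup_days"] <= 7}
allocation = {k: budget_usdt * w for k, w in target_weights.items() if k in eligible}
print("initial allocation:", allocation)

def rebalance(allocation, pools, max_high_risk_share=0.3):
    """Shift funds out of high-risk pools if they exceed a cap (toy rule)."""
    total = sum(allocation.values())
    high = sum(v for k, v in allocation.items() if pools[k]["risk"] == "high")
    if high / total > max_high_risk_share:
        excess = high - total * max_high_risk_share
        for k in allocation:
            if pools[k]["risk"] == "high":
                cut = excess * allocation[k] / high
                allocation[k] -= cut
                allocation["A"] += cut   # move the excess into the lowest-risk pool
    return allocation

print("after risk check:", rebalance(allocation, pools))
```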

[Figure. Source: Biconomy]

Currently, the AI Agent wallet Bitte and the AI interaction protocol Wayfinder are making such attempts. Both try to access OpenAI's model API so that users can command an Agent to complete various on-chain operations from a chat interface similar to ChatGPT. For example, the first Wayfinder prototype released in April this year demonstrated the four basic operations of swap, send, bridge and stake on the Base, Polygon and Ethereum mainnets.

The decentralized Agent platform Morpheus also supports the development of this type of Agent, and Biconomy has demonstrated an operation in which an AI Agent swaps ETH into USDC without being granted full wallet permissions.

▎AI and on-chain transaction security

In the Web3 world, on-chain transaction security is crucial. AI technology can be used to enhance the security and privacy protection of on-chain transactions. Potential scenarios include:

Transaction monitoring: real-time data technology monitors abnormal transaction activity and provides real-time alerting infrastructure for users and platforms.

Risk analysis: helping platforms analyze customer transaction-behavior data and assess their risk levels.

For example, the Web3 security platform SeQure uses AI to detect and prevent malicious attacks, fraud and data leaks, and provides real-time monitoring and alert mechanisms to ensure the security and stability of on-chain transactions. Similar security tools include the AI-powered Sentinel.
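A minimal sketch of the kind of transaction monitoring described above, using a simple z-score rule over transfer amounts; the data and threshold are illustrative, and real platforms use far richer models:

```python
import statistics

# Toy transaction monitor: flag transfers that deviate strongly from an
# address's historical behavior. Data and threshold are illustrative.
history = [120, 95, 110, 130, 105, 90, 115]      # past transfer amounts (USDT)

def is_anomalous(amount: float, history: list[float], z_threshold: float = 3.0) -> bool:
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history) or 1.0    # avoid division by zero
    z = abs(amount - mean) / stdev
    return z > z_threshold

for amount in [125, 5_000]:
    flag = "ALERT" if is_anomalous(amount, history) else "ok"
    print(f"transfer of {amount} USDT -> {flag}")
```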

2. AI and on-chain infrastructure

▎AI and on-chain data

AI technology plays an important role in on-chain data collection and analysis, such as:

  • Web3 Analytics: an AI-based analytics platform that uses machine learning and data-mining algorithms to collect, process and analyze on-chain data.

  • MinMax AI: provides AI-based on-chain data analysis tools to help users discover potential market opportunities and trends.

  • Kaito: a Web3 search platform built on an LLM search engine.

  • Followin: integrates ChatGPT to collect, consolidate and present relevant information scattered across different websites and social platforms.

  • Another application scenario is oracles, where AI can aggregate prices from multiple sources to provide accurate pricing data. For example, Upshot uses AI to price volatile NFTs, delivering NFT prices with a 3-10% error margin through more than 100 million evaluations per hour. A minimal aggregation sketch follows below.
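A minimal sketch of multi-source price aggregation for an oracle feed; the sources and quotes are made up, and real oracle designs add signing, staking and outlier handling:

```python
import statistics

# Toy oracle aggregation: take quotes from several sources and publish
# a robust (median) price. Source names and quotes are illustrative.
quotes = {
    "source_a": 2_410.5,
    "source_b": 2_405.0,
    "source_c": 2_990.0,   # an outlier / faulty feed
    "source_d": 2_412.8,
}

def aggregate_price(quotes: dict[str, float]) -> float:
    # The median is robust to a minority of bad or manipulated feeds.
    return statistics.median(quotes.values())

print(f"published price: {aggregate_price(quotes):.2f}")
```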

▎AI and development & auditing

Recently, Cursor, a Web2 AI code editor, has attracted much attention in developer circles. On its platform, users only need to describe what they want in natural language, and Cursor automatically generates the corresponding HTML, CSS and JavaScript code, greatly simplifying the software development process. The same logic applies to improving the efficiency of Web3 development.

Currently, deploying smart contracts and DApps on public chains usually requires proprietary development languages such as Solidity, Rust and Move. The vision of these new development languages is to expand the design space of decentralized blockchains and make them more suitable for DApp development; however, given the severe shortage of Web3 developers, developer education has always been a thorny problem.

At present, conceivable scenarios in which AI assists Web3 development include: automated code generation, smart-contract verification and testing, DApp deployment and maintenance, intelligent code completion, and AI dialogue that answers development questions. AI assistance not only improves development efficiency and accuracy, but also lowers the programming threshold, allowing non-programmers to turn their ideas into practical applications and bringing new vitality to the development of decentralized technology.

Currently, the most eye-catching examples are one-click token launch platforms such as Clanker, an AI-driven "token bot" designed for rapid DIY token deployment. On the SocialFi protocol Farcaster, via clients such as Warpcast or Supercast, you only need to tag Clanker and tell it your token idea, and it will launch the token for you on the public chain Base.

There are also contract development platforms such as Spectral, which offers one-click generation and deployment of smart contracts to lower the Web3 development threshold, so that even novice users can compile and deploy smart contracts.

On the auditing side, the Web3 audit platform Fuzzland uses AI to help auditors check code for vulnerabilities and provides natural-language explanations to support audit expertise. Fuzzland also uses AI to produce natural-language explanations of formal specifications and contract code, along with sample code, to help developers understand potential problems in the code.

3. AI and Web3 New Narrative

The rise of generative AI has brought new possibilities to the new narrative of Web3.

NFT: AI injects creativity into generative NFTs. AI technology can generate a wide variety of unique artworks and characters, and these generative NFTs can become characters, props or scene elements in games, virtual worlds or metaverses. For example, with Binance's Bicasso, users can upload images and enter keywords to generate NFTs with AI. Similar projects include Solvo, Nicho, IgmnAI and CharacterGPT.

GameFi: centered on AI's natural-language generation, image generation and intelligent NPC capabilities, GameFi is expected to improve the efficiency and innovation of game content production. For example, AI Hero, the first blockchain game from Binaryx, lets players randomly explore different plot options with AI; there is also the virtual companion game Sleepless AI, based on AIGC and LLMs, in which players unlock personalized gameplay through different interactions.

DAO: AI is also envisioned for DAOs, helping track community interactions, record contributions, reward the most active members, proxy voting, and so on. For example, ai16z uses an AI Agent to collect market information on and off chain, analyze community consensus, and make investment decisions combined with the recommendations of DAO members.

Part.3 The significance of the combination of AI+Web3: towers and squares

In the heart of Florence, Italy, lies the central square, the city's most important venue for political life and a gathering place for citizens and tourists, where a 95-meter town-hall tower rises. The vertical tower and the horizontal square form a visual contrast that complements each other and creates a dramatic aesthetic effect. Niall Ferguson, a history professor at Harvard University, drew inspiration from this and, in his book "The Square and the Tower", connected it to the world history of networks and hierarchies, whose fortunes have ebbed and flowed over time.

This metaphor is not out of place when applied to the relationship between AI and Web3 today. From the long-run, non-linear history of the relationship between the two, we can see that the square is more likely than the tower to produce new things and creativity, yet the tower still has its legitimacy and strong vitality.

With technology companies' ability to concentrate energy, computing power and data, AI has exploded with unprecedented imaginative power. Major technology companies have invested heavily in the field, from various chatbots to iterative versions of "underlying large models" such as GPT-4 and GPT-4o; the automated programming assistant Devin and Sora, with its initial ability to simulate the real physical world, have been released, and the imagination surrounding AI has been infinitely magnified.

At the same time, AI is in essence a scale-driven, centralized industry. This technological shift is pushing the technology companies that gradually gained structural dominance in the "Internet era" toward an even narrower summit: the enormous electricity, monopoly cash flows and huge data sets required to dominate the intelligent era create even higher barriers around them.

As the tower grows taller and taller, the circle of decision-makers behind it grows smaller and smaller, and the centralization of AI brings many hidden dangers. How can the crowds gathered in the square avoid the shadow cast by the tower? This is the problem Web3 hopes to solve.

In essence, the inherent properties of blockchain enhance artificial intelligence systems and bring new possibilities, mainly:

  • "Code is law" in the era of artificial intelligence - through smart contracts and cryptographic verification, a transparent system automatically executes rules and delivers rewards to people closer to the goal.

  • Token economics: creating and coordinating participant behavior through token mechanisms such as staking, slashing, token rewards and penalties.

  • Decentralized governance: prompts us to question sources of information and encourages a more critical and discerning approach to AI technology, preventing bias, misinformation and manipulation, and ultimately fostering a more informed and empowered society.

The development of AI has also brought new vitality to Web3. The impact of Web3 on AI may take time to prove, but the impact of AI on Web3 is immediate, visible both in the Meme carnival and in AI Agents lowering the usage threshold of on-chain applications.

When Web3 was dismissed as the self-interest of a small circle and doubted for merely copying traditional industries, the addition of AI brought it a foreseeable future: a more stable and larger Web2 user base, and more innovative business models and services.

We live in a world where "towers and squares" coexist. Although AI and Web3 have different timelines and starting points, their end points are how to make machines better serve humans. No one can define a rushing river. We Looking forward to seeing the future of AI+Web3.

*All content on the Coinspire platform is for reference only and does not constitute an offer or recommendation for any investment strategy. Any personal decisions based on the content of this article are the sole responsibility of the investor, and Coinspire is not responsible for any resulting gains or losses. Investment is risky and decisions need to be made with caution.
