The impact of DeepSeek on upstream and downstream protocols of Web3 AI


DeepSeek bursts the last bubble of the Agent track, DeFAI may give birth to new life, and the industry's financing methods are undergoing a transformation.

Written by: Kevin, Researcher at BlockBooster

TL;DR

  • The emergence of DeepSeek shatters the moat of computing power, with open-source models leading the new direction of computing optimization;

  • DeepSeek benefits the model and application layers in the industry’s upstream and downstream, negatively impacting the computing power protocols in the infrastructure;

  • DeepSeek inadvertently bursts the last bubble of the Agent track, with DeFAI most likely to give birth to new life;

  • The zero-sum game of project financing is expected to come to an end, with community launches + a small number of VCs becoming the new norm.

The shock triggered by DeepSeek will reverberate through the upstream and downstream of the AI industry this year. DeepSeek makes it possible for consumer-grade graphics cards to handle large-model training tasks that previously required high-end GPUs. The first moat around AI development, computing power, has begun to collapse. With algorithm efficiency improving at a staggering annual rate of 68% while hardware performance follows the linear ascent of Moore's Law, the deeply entrenched valuation models of the past three years no longer apply. The next chapter of AI will be opened by open-source models.

Although Web3's AI protocols differ fundamentally from Web2's, they inevitably feel the influence of DeepSeek, which will give rise to entirely new use cases across the upstream and downstream of Web3 AI: the infrastructure layer, middleware layer, model layer, and application layer.

Sorting out the collaborative relationships of upstream and downstream protocols

Through the analysis of technical architecture, functional positioning, and practical use cases, I divide the entire ecosystem into: infrastructure layer, middleware layer, model layer, and application layer, and clarify their dependencies:

Infrastructure Layer

The infrastructure layer provides decentralized underlying resources (computing power, storage, L1), with computing power protocols including: Render, Akash, io.net, etc.; storage protocols including: Arweave, Filecoin, Storj, etc.; and L1 including: NEAR, Olas, Fetch.ai, etc.

The computing power layer protocols support model training, inference, and framework operation; storage protocols preserve training data, model parameters, and on-chain interaction records; L1 optimizes data transmission efficiency and reduces latency through dedicated nodes.

Middleware Layer

The middleware layer serves as a bridge connecting the infrastructure and upper-layer applications, providing framework development tools, data services, and privacy protection, with data annotation protocols including: Grass, Masa, Vana, etc.; development framework protocols including: Eliza, ARC, Swarms, etc.; and privacy computing protocols including: Phala, etc.

The data service layer provides fuel for model training, while the development framework relies on the computing power and storage of the infrastructure layer, and the privacy computing layer protects data security during training/inference.

Model Layer

The model layer is used for model development, training, and distribution, with open-source model training platforms such as Bittensor.

The model layer relies on the computing power of the infrastructure layer and the data of the middleware layer; models are deployed on-chain through development frameworks; the model market delivers training results to the application layer.

Application Layer

The application layer consists of AI products aimed at end users, with Agents including: GOAT, AIXBT, etc.; DeFAI protocols including: Griffain, Buzz, etc.

The application layer calls pre-trained models from the model layer; relies on privacy computing from the middleware layer; and complex applications require real-time computing power from the infrastructure layer.
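The layer dependencies described above can be summarized as a simple data structure. This is my own illustrative sketch: the protocol names are the examples given in the text, while the mapping and the helper function are added for clarity, not part of any actual protocol's code.

```python
# Illustrative summary of the four-layer Web3 AI stack described above.
# Each layer lists example protocols from the text and the layers it relies on.
WEB3_AI_STACK = {
    "infrastructure": {
        "examples": ["Render", "Akash", "io.net", "Arweave", "Filecoin", "NEAR"],
        "depends_on": [],
    },
    "middleware": {
        "examples": ["Grass", "Masa", "Vana", "Eliza", "ARC", "Phala"],
        "depends_on": ["infrastructure"],
    },
    "model": {
        "examples": ["Bittensor"],
        "depends_on": ["infrastructure", "middleware"],
    },
    "application": {
        "examples": ["GOAT", "AIXBT", "Griffain", "Buzz"],
        "depends_on": ["model", "middleware", "infrastructure"],
    },
}

def full_dependencies(layer: str) -> set:
    """Transitively resolve every layer a given layer relies on."""
    deps = set()
    for d in WEB3_AI_STACK[layer]["depends_on"]:
        deps.add(d)
        deps |= full_dependencies(d)
    return deps
```

Resolving `full_dependencies("application")` yields all three lower layers, mirroring the text's point that complex applications ultimately pull on real-time computing power at the bottom of the stack.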

DeepSeek may negatively impact decentralized computing power

According to a sample survey, about 70% of Web3 AI projects actually call OpenAI or centralized cloud platforms, with only 15% of projects using decentralized GPUs (such as Bittensor subnet models), and the remaining 15% using hybrid architectures (sensitive data processed locally, general tasks on the cloud).

The actual usage rate of decentralized computing power protocols is far below expectations and does not match their actual market value. There are three reasons for the low usage rate: Web2 developers continue to use their original toolchains when migrating to Web3; decentralized GPU platforms have yet to achieve price advantages; and some projects evade data compliance checks under the guise of "decentralization," while still relying on centralized clouds for actual computing power.

AWS/GCP holds over 90% of the AI computing power market share, while Akash's equivalent computing power is only 0.2% of AWS's. The moats of centralized cloud platforms include cluster management, RDMA high-speed networking, and elastic scaling. Decentralized cloud platforms offer Web3 adaptations of these technologies, but unresolved flaws remain: latency (communication latency between distributed nodes is roughly six times that of centralized clouds) and toolchain fragmentation (PyTorch/TensorFlow do not natively support decentralized scheduling).

DeepSeek reduces computing power consumption by 50% through sparse training, enabling consumer-grade GPUs to train models with hundreds of millions of parameters. Market expectations for high-end GPUs in the short term have been significantly lowered, and the market potential for edge computing has been re-evaluated. As shown in the figure, before the emergence of DeepSeek, the vast majority of protocols and applications in the industry used platforms like AWS, with only a few use cases deployed on decentralized GPU networks. These use cases valued the price advantage of the latter in consumer-grade computing power and did not focus on the impact of latency.
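To make the sparse-training claim concrete, here is a toy illustration of the general idea: a mask keeps only a fraction of weights active, so gradient updates (and, on hardware with sparsity support, the corresponding multiply-accumulates) are skipped for the rest. This is a conceptual sketch of sparse training in general, not DeepSeek's actual method, and the 50% mask ratio is simply borrowed from the figure quoted above.

```python
import numpy as np

rng = np.random.default_rng(0)

# A dense weight matrix and a sparsity mask keeping ~50% of weights active.
W = rng.standard_normal((512, 512))
mask = rng.random(W.shape) < 0.5  # True = weight participates in training

def sparse_update(W, grad, mask, lr=0.01):
    """Apply a gradient step only to the active (unmasked) weights.

    Skipping masked weights is where the compute savings come from:
    their updates are never computed or applied.
    """
    return W - lr * (grad * mask)

grad = rng.standard_normal(W.shape)
W_new = sparse_update(W, grad, mask)
```

After the update, the masked half of the weights is untouched while the active half has moved, which is why a consumer-grade GPU can make progress on a model whose dense training budget would otherwise be out of reach.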

This situation may worsen with the arrival of DeepSeek. DeepSeek frees long-tail developers from computing power constraints, and low-cost, efficient inference models will spread at unprecedented speed. Many centralized cloud platforms and several countries have already begun deploying DeepSeek, and the sharp drop in inference costs will spawn a large number of front-end applications with huge demand for consumer-grade GPUs. Facing this impending market, centralized cloud platforms will enter a new round of competition for users, contending not only with the leading platforms but also with countless small centralized clouds, and the most direct weapon will be price cuts.

It is foreseeable that 4090 prices on centralized platforms will fall, which would be a disaster for Web3's computing power platforms. When price is no longer their only moat, and the industry's computing power platforms are in turn forced to cut prices, the result will be unbearable for io.net, Render, Akash, and the rest. A price war would destroy their last remaining valuation ceiling, and the downward spiral of declining revenue and user loss may force decentralized computing power protocols to pivot in a new direction.

The specific significance of DeepSeek to the industry’s upstream and downstream protocols

[Figure 1: The impact of DeepSeek on Web3 AI upstream and downstream protocols]

As shown in the figure, I believe DeepSeek will have different impacts on the infrastructure layer, model layer, and application layer. From a positive perspective:

  • The application layer will benefit from sharply lower inference costs, allowing Agent applications to stay online for extended periods and complete tasks in real time;

  • At the same time, model costs as low as DeepSeek's allow DeFAI protocols to form more complex swarms, with thousands of Agents devoted to a single use case, each with a narrow and clearly defined role, greatly enhancing the user experience and preventing user inputs from being incorrectly decomposed and executed by the model;

  • Developers in the application layer can fine-tune models, feeding prices, on-chain data and analysis, and governance data into DeFi-related AI applications without paying high licensing fees;

  • The significance of the open-source model layer will be proven after the emergence of DeepSeek, as high-end models are opened to long-tail developers, stimulating a widespread development boom;

  • The computing power walls built around high-end GPUs over the past three years have been completely shattered, giving developers more choices and establishing direction for open-source models. In the future, the competition among AI models will no longer be about computing power but about algorithms, and this shift in belief will become the cornerstone of confidence for open-source model developers.

Specific subnets around DeepSeek will emerge one after another, with model parameters increasing under the same computing power, and more developers will join the open-source community.

From a negative perspective:

  • The latency inherent in the infrastructure layer's computing power protocols is an objective fact that cannot be optimized away;

  • Moreover, a hybrid network composed of A100s and 4090s places higher demands on coordination algorithms, which is not a strength of decentralized platforms.

DeepSeek bursts the last bubble of the Agent track, DeFAI may give birth to new life, and the industry's financing methods are undergoing a transformation

Agents are the industry's last hope for AI. The emergence of DeepSeek lifts the computing power constraint and paints an expectation of an application explosion. What should have been a huge boon for the Agent track was instead punctured by its strong correlation with the broader industry, US equities, and Federal Reserve policy, sending the track's market value plummeting.

In the wave of integration between AI and the industry, technological breakthroughs and market games always go hand in hand. The chain reaction triggered by Nvidia's market value fluctuations serves as a mirror, reflecting the deep-seated dilemmas in the industry's AI narrative: from on-chain Agents to DeFAI engines, the seemingly complete ecological map conceals the harsh reality of weak technological infrastructure, hollowed-out value logic, and capital dominance. The superficially prosperous on-chain ecosystem conceals chronic ailments: a large number of high-FDV tokens competing for limited liquidity, outdated assets surviving on FOMO sentiment, and developers trapped in PVP involution that consumes innovative momentum. When incremental funds and user growth hit a ceiling, the entire industry falls into the "innovator's dilemma": eager for breakthrough narratives while struggling to break free from path dependence. This tension provides a historic opportunity for AI Agents: not merely an upgrade of the technical toolbox, but a reconstruction of the value-creation paradigm.

Over the past year, more and more teams in the industry have discovered that traditional financing models are failing: the old playbook of handing VCs a small share, retaining tight control, and waiting for them to pump the price is no longer sustainable. With VCs tightening their pockets, retail investors refusing to take the other side, and large exchanges setting high thresholds for listings, a new playbook better suited to bear markets is emerging: collaborating with leading KOLs plus a small number of VCs, launching with a large share allocated to the community, and cold-starting at a low market value.

Innovators represented by Soon and Pump Fun are opening new paths through "community launches"—collaborating with leading KOLs for endorsement, distributing 40%-60% of tokens directly to the community, and launching projects at valuation levels as low as $10 million FDV, achieving millions of dollars in financing. This model builds consensus FOMO through KOL influence, allowing teams to lock in profits early while exchanging high liquidity for market depth. Although it sacrifices short-term control advantages, it can repurchase tokens at low prices during bear markets through compliant market-making mechanisms. Essentially, this represents a paradigm shift in power structure: from a VC-led game of hot potato (institutional takeover - selling on exchanges - retail buying) to a transparent game of community consensus pricing, forming a new symbiotic relationship between project parties and the community within liquidity premiums. As the industry enters a period of transparency revolution, projects that cling to traditional control logic may become remnants of an era swept away by the tide of power migration.
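The numbers quoted above can be turned into a small worked example. This is standard FDV arithmetic, not data from any specific project: the 50% community share is the midpoint of the 40%-60% range, and the launch FDV is the $10 million figure mentioned in the text.

```python
# FDV (fully diluted valuation) = token price x total supply.
# Community-launch arithmetic using the article's example figures.
fdv = 10_000_000          # launch FDV in USD, per the text
community_share = 0.50    # midpoint of the 40%-60% community allocation

community_value = fdv * community_share
team_and_vc_value = fdv * (1 - community_share)

print(f"community allocation at launch: ${community_value:,.0f}")
print(f"team + VC + treasury at launch: ${team_and_vc_value:,.0f}")
```

At these figures, even selling a modest slice of the community allocation at launch yields financing in the millions of dollars, which is consistent with the text's claim, while leaving the team far less supply to defend than under the VC-led model.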

The short-term pain in the market only confirms the irreversibility of the long-term technological wave. When AI Agents reduce on-chain interaction costs by two orders of magnitude, and adaptive models continuously optimize the capital efficiency of DeFi protocols, the industry can expect the long-awaited Massive Adoption. This transformation relies not on conceptual hype or capital acceleration but on the technological penetration of real demand: just as the electricity revolution did not stall because bulb companies went bankrupt, Agents will become the true golden track once the bubble bursts. DeFAI may be the fertile ground for that new life, and as low-cost inference becomes routine, we may soon see use cases in which hundreds of Agents combine into a single swarm. Under equivalent computing power, a significant increase in model parameters will let Agents of the open-source model era be fine-tuned more thoroughly, so that even complex user instructions can be decomposed into task pipelines that individual Agents can fully execute. With each Agent optimizing on-chain operations, overall activity and liquidity in DeFi protocols may rise. More complex DeFi products led by DeFAI will emerge, and this is precisely where new opportunities arise after the last bubble burst.
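The task-pipeline decomposition described above can be sketched minimally: a request is broken into narrow subtasks, each handled in full by one specialist Agent. Every name here (the roles, the handlers, the `run_pipeline` helper) is hypothetical and purely illustrative; no real DeFAI framework's API is implied.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Agent:
    """A specialist agent with one narrowly scoped role."""
    role: str
    handle: Callable[[str], str]

# Hypothetical specialists in a small DeFi swarm.
swarm: Dict[str, Agent] = {
    "price":   Agent("price",   lambda task: f"fetched price for {task}"),
    "risk":    Agent("risk",    lambda task: f"assessed risk of {task}"),
    "execute": Agent("execute", lambda task: f"submitted tx for {task}"),
}

def run_pipeline(request: str, steps: List[str]) -> List[str]:
    """Decompose a request into subtasks, each fully executed by one agent."""
    return [swarm[step].handle(request) for step in steps]

results = run_pipeline("swap ETH->USDC", ["price", "risk", "execute"])
```

The point of the pattern is that each Agent's role stays small enough to execute reliably; cheap inference is what makes running hundreds of such narrow Agents per use case economically plausible.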

