Companies step up investment across the "Token economy" industry chain

According to the data, as of March this year, China's daily average Token usage had already exceeded 1.4 quadrillion, a more-than-1,000-fold increase over two years. The surge in Token usage not only confirms that AI application scenarios are continuously deepening, but also creates new opportunities: relevant companies are accelerating their deployments along the "Token economy" industry chain, covering high-performance computing power supply, Token operational services, and high-quality datasets.

So-called Tokens are the smallest units of information processed by large models. For example, every time you ask an AI a question or use AI to generate content in daily work and life, you are effectively consuming Tokens. In particular, since the start of this year, the breakout of agent products such as "lobsters" has driven a significant increase in Token consumption.
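The idea that every request "consumes Tokens" can be sketched with a toy tokenizer. Real large models use subword schemes such as BPE, so actual token counts differ from this word-level split, but the billing-relevant quantity is the same: the length of the token sequence a prompt breaks into.

```python
import re

def toy_tokenize(text: str) -> list[str]:
    """Split text into word and punctuation pieces.

    Production LLM tokenizers use learned subword vocabularies;
    this toy splitter only illustrates that a prompt is consumed
    (and billed) as a sequence of discrete tokens.
    """
    return re.findall(r"\w+|[^\w\s]", text)

prompt = "Every question you ask an AI consumes tokens."
tokens = toy_tokenize(prompt)
print(tokens)
print(f"{len(tokens)} tokens")
```

The token count, not the character count, is what usage meters and pricing schedules operate on.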

"Since late January this year, some model companies have set performance records in which 20 days of revenue exceeded their total revenue for all of 2025. Behind these figures, a new business logic based on Token billing is rapidly taking shape." Liu Ruilong, Director of the National Data Administration, said this recently at the 2026 annual meeting of the China Development Forum. He added that a new value system built around the calling, distribution, and settlement of Tokens is rapidly forming and becoming an important path for monetizing the AI industry.
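Token-based billing typically meters input and output tokens separately at a per-1,000-token rate. A minimal sketch, with placeholder prices that are not any vendor's actual rates:

```python
def token_bill(prompt_tokens: int, completion_tokens: int,
               price_in_per_1k: float, price_out_per_1k: float) -> float:
    """Compute a usage charge under per-1,000-token pricing.

    Prompt (input) and completion (output) tokens are usually
    priced differently; both prices here are illustrative.
    """
    return (prompt_tokens / 1000 * price_in_per_1k
            + completion_tokens / 1000 * price_out_per_1k)

# e.g. a call with 1,200 prompt tokens and 800 completion tokens
charge = token_bill(1200, 800, price_in_per_1k=0.002, price_out_per_1k=0.006)
print(round(charge, 4))  # 0.0072
```

Under this logic, revenue scales directly with aggregate token throughput, which is why daily average Token usage has become a headline industry metric.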

The surge in Token usage, in turn, drives the consumption of computing power, the operation of algorithms, and investment in electricity, placing higher demands on computing power. Recently, many companies have accelerated innovative deployments of high-performance computing power, and architecture innovations represented by super-node technology are an important path to improving computing power efficiency.

For example, Inspur (Sugon) launched its first cable-free, containerized super-node, scaleX40, earlier this month. In typical inference scenarios, with the same number of cards, inference throughput improves by more than 4 times, significantly enhancing the Token output per unit of computing power. ZTE has also rolled out super-node technology: by restructuring the compute interconnect system, it integrates dozens to hundreds of multi-vendor GPUs into a unified computing unit, achieving system-level optimization of computing power.

In the view of industry insiders, core innovation in the computing power industry will focus on reducing the effective cost per Token. Deep coordination between computing power and applications, end-to-end software-hardware optimization, and ecosystem collaboration across the entire industry chain will be key priorities. Tu Jiashun, ZTE's Chief Strategy and Ecosystem Expert, said that the deployment and promotion of super-node technology will drive AI computing infrastructure toward higher efficiency, greener operation, and greater openness.

"Token surges impose higher requirements on the compute density, memory-access bandwidth, and communication efficiency of computing systems, prompting a shift from pursuing single-card peak performance toward coordinated, system-level optimization of 'memory, bandwidth, and interconnect.'" Lu Feng, President of the Beijing Frontier Future Technology Industry Development and Research Institute, told reporters that the "Token economy" will drive underlying computing infrastructure toward higher energy efficiency.
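Why memory bandwidth, rather than raw compute, often bounds Token output can be shown with a standard back-of-envelope estimate: during autoregressive decoding, generating each token requires streaming roughly all model weights from memory once. All figures below are illustrative assumptions, not any vendor's specifications.

```python
# Bandwidth-bound upper limit on decode throughput (single request,
# no batching). Assumptions: a 70B-parameter model stored at 2 bytes
# per parameter (fp16), on an accelerator with 2,000 GB/s of HBM
# bandwidth. Both numbers are placeholders for illustration.

weights_gb = 70 * 2      # bytes of weights to read per generated token, in GB
mem_bw_gb_s = 2000       # assumed memory bandwidth per card, GB/s

tokens_per_s_upper_bound = mem_bw_gb_s / weights_gb
print(f"~{tokens_per_s_upper_bound:.1f} tokens/s per card (bandwidth-bound)")
```

Batching, quantization, and faster interconnects all attack this bound, which is why super-node architectures that pool memory and bandwidth across many cards can raise Token output per unit of computing power.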

Because they are measurable, priceable, and tradable, Tokens also serve as the "settlement unit" connecting technology supply with business demand. Recently, the three major telecom operators have all said they will further explore "Token operational services."

"In the intelligent age, to build a new form of intelligent economy, we must accelerate the transformation and upgrading from 'traffic-based operations' to 'Token-based operations.'" Liu Guoqing, General Manager of China Telecom, said that China Telecom has already carried out initial practice in Token operations. In one example, for a company's AI private-cloud deployment on China Telecom's Xirang (息壤) platform, China Telecom custom-developed 73 agents, bringing annual consumption of 1.2 trillion Tokens.
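The scale implied by those reported figures can be checked with simple arithmetic; the per-agent number below is derived here for illustration, not stated in the source.

```python
# Reported figures: 73 custom-developed agents consuming
# 1.2 trillion Tokens per year.
ANNUAL_TOKENS = 1.2e12
AGENTS = 73

daily_total = ANNUAL_TOKENS / 365          # Tokens consumed per day overall
daily_per_agent = daily_total / AGENTS     # average Tokens per agent per day
print(f"~{daily_total / 1e9:.1f} billion Tokens/day in total")
print(f"~{daily_per_agent / 1e6:.0f} million Tokens/day per agent")
```

On the order of billions of Tokens per day from a single enterprise deployment illustrates why operators see Token operations as a distinct business line.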

In addition, China Unicom will accelerate the build-out of an "agents + Tokens + AI cloud" computing power operating model. China Mobile said that by integrating high-quality AI models it is creating trustworthy inference services and linking the full service chain of "agents consume Tokens, and Tokens pull computing power," with the Token market opening up rapidly.

The large increase in daily average Token calls also points to a substantial increase in the supply of datasets, a stage in which data elements and AI development reinforce each other. The reporter learned from the National Data Administration that, as of the end of 2025, more than 100,000 high-quality datasets had been built nationwide, with a total size exceeding 890 PB (petabytes), roughly 310 times the total digital holdings of the National Library of China.
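A quick sanity check on those two reported figures; the per-library size below is derived from them here, not stated in the source.

```python
PB = 10**15   # bytes per petabyte (decimal convention)

total_pb = 890    # reported total size of national high-quality datasets, PB
multiple = 310    # reported multiple of the National Library's digital holdings

implied_library_pb = total_pb / multiple
print(f"Implied library digital holdings: ~{implied_library_pb:.2f} PB")
print(f"Dataset total in bytes: {total_pb * PB:.2e}")
```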

"Data are broken down into Tokens to be processed and used by AI, providing solid support for model iteration and application deployment." Lu Feng said that as Tokens become the standardized unit for pricing and circulating AI capabilities, value across the industry chain will be reallocated to each optimization node along the "full Token lifecycle." In the data services segment in particular, high-quality Token supply will become a premium resource, and is expected to spawn standalone commercial products such as prompt engineering, Token compression, and vertical Token libraries.
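The commercial logic behind "Token compression" is that fewer input tokens mean a lower per-call charge at the same answer quality. Commercial services use far more sophisticated techniques (summarization, learned pruning); this naive sketch, with a made-up filler-word list, only shows the cost lever.

```python
import re

def compress_prompt(text: str) -> str:
    """Naive prompt compression: collapse whitespace and drop filler words.

    The filler list is an illustrative assumption; real Token-compression
    products prune far more aggressively while preserving meaning.
    """
    fillers = {"please", "kindly", "basically", "actually"}
    words = re.sub(r"\s+", " ", text).strip().split(" ")
    return " ".join(w for w in words if w.lower().strip(".,") not in fillers)

before = "Please  kindly summarize,  basically, the   report."
after = compress_prompt(before)
print(after)
```

Every token removed from the prompt is charged zero times per call, so even crude compression compounds across billions of daily calls.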

It is reported that the National Data Administration will next work with all parties to implement a new round of action plans for building high-quality datasets. Guided by scenario needs, it will create "AI-Ready" high-quality datasets that are technically usable, convenient in practice, and quality-assured, achieving an upgrade in both the quality and quantity of high-quality dataset supply.
