American AI cannot cross the river like China does.

Over the past year, Silicon Valley's giants have been scrambling everywhere. What troubles figures like Zuckerberg and Musk is no longer just software or chips, but something more foundational: energy infrastructure.

To solve this problem, Musk bought an entire power plant overseas and shipped it back to the United States, and he has repeatedly sent teams to China to research and procure solar photovoltaic equipment. Zuckerberg's Meta has signed at least three major nuclear power deals, and Google has spent $4.8 billion to acquire a nuclear power plant.

It can be said that in the United States a new grid buildout takes seven years, yet Silicon Valley's giants cannot wait even a single day.

The explosive growth in training and inference demand for large models means that data centers need stable, low-latency, sustainable power on a scale far beyond traditional internet infrastructure. This is forcing power grids around the world into deep upgrades in transmission and distribution capacity, energy storage, renewable curtailment and utilization, and "power-compute" coordinated management.

At the same time, power resources themselves are beginning to be transformed into a new type of strategic asset. Competition between countries and regions over “compute power availability,” “the share of green energy,” and “data sovereignty” continues to intensify, causing data centers to upgrade from a purely technical facility into a key node that affects the global power structure.

The Political Economy of Infrastructure in the Age of Artificial Intelligence

Just as railways reshaped logistics speed and the structure of national land space, and the internet changed how information flows and how businesses organize, the AI production mode centered on “computability” is also reconstructing the logic of value creation—giving rise to new industrial divisions, consumption patterns, and governance systems.

In this process, infrastructure and capital investment have become the fundamental prerequisites for unlocking the economic potential of artificial intelligence.

In other words, competition is not only reflected at the algorithm level—it is also reflected in who can build the corresponding infrastructure network faster, at larger scale, and in a greener way. As a result, capital is shifting from software to a new type of infrastructure spanning “compute power—energy—network,” which has become an important signal of changes in today’s global economic landscape. And differences in infrastructure and investment capacity will determine each country’s position and influence in the global AI economy in the future.

Looking back at the past decade, global data centers’ power demand has indeed grown significantly, but this growth has not always been “explosive.” Instead, it has gone through a process from slow to accelerating.

According to a 2024 analysis by the International Energy Agency (IEA), global data center energy use rose by about 6% between 2010 and 2018, an average annual growth rate of about 0.7%. Since 2018, however, cumulative growth has been roughly 50%–80%, equivalent to an average annual rate of 8%–13%.
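As a back-of-the-envelope check, the quoted annual rates do compound into the quoted cumulative figures. The eight-year and roughly five-year windows below are assumptions based on the periods the IEA describes, not figures stated in the report:

```python
# Compound the IEA's quoted annual growth rates into cumulative growth,
# to check them against the cumulative figures quoted alongside them.
def compound_growth(annual_rate: float, years: int) -> float:
    """Cumulative growth implied by a constant annual rate over `years`."""
    return (1 + annual_rate) ** years - 1

# 2010-2018: ~0.7%/yr over 8 years -> ~5.7%, matching "about 6%"
print(f"{compound_growth(0.007, 8):.1%}")
# Since 2018 (a ~5-year window is assumed): 8-13%/yr compounds to
# roughly 47%-84%, bracketing the quoted "about 50%-80%"
print(f"{compound_growth(0.08, 5):.1%} to {compound_growth(0.13, 5):.1%}")
```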

If this trend continues, global data center energy consumption is expected to reach 600–800 terawatt-hours (TWh) by 2030; a 2025 IEA report has already revised this to 935 TWh (corresponding to roughly 108 GW of data center capacity), or 1.8%–2.4% of forecast global electricity demand that year. If AI drives even higher intensity (for example, if large model training pushes energy consumption to a 20% annual growth rate), data center energy consumption could reach 1,100–1,400 TWh by 2030, about 3%–4% of forecast global electricity demand.
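The 2025 figures can also be cross-checked against each other: dividing annual consumption by the hours in a year gives the average power draw, which should land near the quoted capacity figure (this assumes near-continuous operation):

```python
# Divide annual consumption (TWh) by hours per year to get the average
# power draw (GW) implied by the IEA's 935 TWh figure for 2030.
HOURS_PER_YEAR = 8760

def implied_average_power_gw(annual_twh: float) -> float:
    """Average continuous draw, in GW, implied by annual consumption in TWh."""
    return annual_twh * 1000 / HOURS_PER_YEAR  # TWh -> GWh, then divide by hours

print(f"{implied_average_power_gw(935):.0f} GW")  # ~107 GW, near the quoted 108 GW
```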

In China, it is expected that by 2030, data center electricity demand will be double that of 2020, reaching 400 TWh.

With demand shifting from slow to accelerating, the backgrounds of the two phases differ.

Before 2018, although data services, network traffic, and storage demand all rose sharply, improvements in server hardware efficiency, advances in cooling technology, and the replacement of traditional, inefficient small data centers by hyperscale facilities meant that overall data center energy consumption did not "explode" year by year in line with business volume.

After 2018, however, global data center energy use rose noticeably, with the growth rate jumping into double digits. The shift was driven mainly by surging AI compute demand, the expansion of hyperscale data centers, and a spike in traffic from video platforms. Data centers have become one of the fastest-growing categories of infrastructure in global electricity consumption, putting new pressure on energy systems, carbon emissions, and digital governance.

Especially after large models emerged, data center construction in many regions entered a phase of rapid expansion. There is no unified, widely accepted "official" count of data centers worldwide, because countries differ in how they define data centers, in scale thresholds, and in registration methods; any estimate of the total can only be approximate.

According to a roundup by the market statistics firm Market.biz, as of March 2024 there were approximately 11,800 data centers in operation worldwide.

In terms of geographic distribution, Statista data show that as of November 2025 the United States had the most data centers in the world, with 4,165 facilities, followed by the UK (499), Germany (487), China (381), France (321), Canada (293), Australia (274), India (271), Japan (242), and Italy (209).

It must be acknowledged that the current and future energy demand impact of data centers is distributed unevenly across the globe. For example, in the United States, data centers already account for more than one-fifth of Virginia’s total electricity consumption. In Europe, in 2022, data centers in Ireland had electricity demand of 5.3 TWh, equivalent to 17% of that country’s total electricity use. By 2026, as AI applications rapidly penetrate the market, that electricity usage will almost double, reaching 32% of the country’s total electricity demand.
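A quick check shows the Irish figures are internally consistent only if national demand stays roughly flat. Treating "almost double" as exactly 2x below is an assumption for the sake of the arithmetic:

```python
# Back out Ireland's implied national electricity demand from the quoted shares.
dc_2022 = 5.3                  # TWh used by data centers in 2022
total_2022 = dc_2022 / 0.17    # 17% share -> ~31.2 TWh national demand
dc_2026 = dc_2022 * 2          # "almost double" taken as exactly 2x (assumption)
total_2026 = dc_2026 / 0.32    # 32% share -> ~33.1 TWh implied national demand
print(f"2022: {total_2022:.1f} TWh; implied 2026: {total_2026:.1f} TWh")
```

The implied national total grows only about 6% over four years, so the two quoted percentages hang together under modest overall demand growth.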

The highly concentrated nature of data centers and their extremely high power density create major challenges at the local level, including grid interconnection and capacity constraints, water resource consumption, and community opposition.

There is another clear trend: in recent years, the electricity consumption of hyperscale data centers operated mainly by major technology companies has grown significantly. From 2017 to 2021, the combined electricity use of just four companies—Amazon, Microsoft, Google, and Meta—more than doubled, reaching about 72 TWh.

The rapid surge in the number of hyperscale data centers run by technology companies has brought enormous challenges to supply.

In many countries, power systems are highly fragmented—operated independently by multiple regional or local utilities, with a lack of unified dispatch and capacity planning—making it easy to run into problems such as voltage fluctuations, insufficient power, or dispatch delays. In addition, electricity prices, policies, and levels of power investment vary greatly across regions, further increasing the complexity of constructing and operating data centers. Overall, fragmentation in power systems not only constrains data centers’ ability to expand, but also, to a certain extent, affects the reliability and energy efficiency of digital infrastructure.

The more fundamental challenge is the source of energy supply itself.

In many countries, data centers still rely on fossil energy such as coal and natural gas. This not only creates carbon emissions pressure but also leaves them exposed to fuel supply fluctuations and price swings. Renewable energy, although growing rapidly, is unevenly distributed and intermittent; without sufficient storage and smart dispatch, it struggles to meet data centers' 24/7 demand for continuous power. Against this backdrop, nuclear power is seen as a viable long-term solution, but it involves long construction cycles, massive upfront investment, and strict safety regulation and policy support, and in practice it still faces challenges of technological maturity, social acceptance, and waste handling.

Taken together, data center energy problems are not only technical issues in power grid structure, but also a long-term test of energy strategy and policy planning.

The U.S. Model: Energy Constraints Driven by Market Forces

The development of data centers in the United States has long depended on market mechanisms and private capital. This model was extremely efficient in the early days of the internet: companies could deploy hyperscale data centers in places like Oregon, Virginia, and Texas based on differences in electricity prices and tax incentives.

According to a report by Virginia's Joint Legislative Audit and Review Commission (JLARC), the state's data center capacity accounts for about 25% of North America's total and 13% of the world's. Northern Virginia hosts more data centers than any other region, earning it the nickname "the world's data center capital."

The JLARC report states that Northern Virginia's data center capacity is more than twice that of its next biggest competitor, Beijing, and three times that of Hillsboro, Oregon, the next-largest data center cluster in the United States. Oregon's tax exemptions have made Hillsboro a popular location for data centers serving companies including Meta, LinkedIn, TikTok, and X. With the arrival of the AI era, however, this market-driven expansion path is gradually running into hard constraints at the level of infrastructure and institutions.

Although the United States leads China in many aspects of artificial intelligence, especially software and chip design, it faces a massive bottleneck in data center power supply and infrastructure approvals. AI compute is like an "electric tiger," devouring America's power resources at a frantic pace and further straining an already fragile grid.

Most of the U.S. power grid was built in the 1960s and 1970s. Although the system has been upgraded with automation and some emerging technologies, the aging infrastructure increasingly struggles to meet modern power demand.

According to assessments by the American Society of Civil Engineers, the overall health of the U.S. power grid received only a C+ rating. Seventy percent of transformers have exceeded their 25-year design service life, and the average age of transmission lines is also nearing 40 years.

When the “pulsed” power consumption demands of artificial intelligence collide head-on with the power grid’s “old body,” this crisis not only severely limits the further development of the AI industry, but also exposes a deep contradiction in the United States between long-term underinvestment in infrastructure and emerging technology needs. If institutional barriers are not broken soon and grid investment is not increased promptly, the U.S.’s compute-power advantage in artificial intelligence is likely to turn into a mirage due to power shortages.

According to The Wall Street Journal, OpenAI’s model named Orion consumed as much as about 11 billion kilowatt-hours of electricity during two separate six-month-long large-scale training runs. This figure is equivalent to the annual electricity use of 1 million U.S. households, and it is also close to the amount of electricity the U.S. steel industry uses in a year. It would be enough for a Tesla Model 3 to drive 44 billion miles—roughly the equivalent of going back and forth to Neptune three times.
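The household and driving equivalences roughly check out under typical reference values. The per-household and per-mile figures below are assumptions, not numbers from the Journal's report:

```python
# Convert Orion's ~11 billion kWh into the article's equivalences using
# assumed reference values (not from the source).
ORION_KWH = 11e9                 # total across both training runs
HOUSEHOLD_KWH_PER_YEAR = 10_800  # assumed average annual use of a US household
MODEL3_KWH_PER_MILE = 0.25       # assumed Tesla Model 3 efficiency

households = ORION_KWH / HOUSEHOLD_KWH_PER_YEAR  # ~1.0 million households
miles = ORION_KWH / MODEL3_KWH_PER_MILE          # ~44 billion miles
print(f"~{households / 1e6:.1f} million household-years, ~{miles / 1e9:.0f} billion miles")
```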

Energy intensity and power consumption in the deployment and usage phase are far lower than in training, but as more people use such AI tools, electricity demand during deployment will keep growing. Moreover, because many companies and individuals fear falling behind in applying artificial intelligence, the "latest and strongest" models often attract massive usage, further increasing pressure on electricity demand.

On September 22, 2025, OpenAI announced a partnership with Nvidia to build AI data centers with electricity consumption as high as 10 gigawatts (GW). Andrew Chien, a professor of computer science at the University of Chicago, said: “A year and a half ago, they were still discussing a 5GW-scale project. Now they’ve raised the target to 10GW, 15GW, and even 17GW, showing a continuous escalation.”

OpenAI values each data center project at about $50 billion, with planned total investment of $850 billion. Nvidia alone has committed $100 billion to support the expansion plan and will supply millions of new Vera Rubin graphics processing units.

While this example shows massive electricity consumption, it is by no means an isolated case. Other major players in the AI industry, such as Google, Meta, Microsoft, Amazon, and Anthropic, will take the same path when training next-generation AI models.

Because of the urgent demand for energy, some data centers in the United States are choosing to build their own power generation facilities rather than relying on connections to state public power grids. For example, on the wastelands of West Texas, a natural-gas-driven generation project is under construction. It is not an investment project of a traditional utility company; instead, it is an important part of the “Stargate” supercomputing center valued at up to $50 billion, which is being jointly built by OpenAI and Oracle.

At the same time, xAI is building two massive data centers called “Colossus” in Memphis, Tennessee, and is starting to adopt gas turbines for self-generation. Across the United States, more than a dozen data centers operated by Equinix—one of the world’s leading digital infrastructure and data center services companies—are also relying on fuel cells to generate electricity.

This trend is called “Bring Your Own Power.” Some describe it as an “energy Wild West” movement that is reshaping the U.S. energy landscape.

However, there is strong social resistance at the local level. Data centers may involve huge investments, but they typically create only dozens to a few hundred direct jobs, far fewer than traditional manufacturing projects. At the same time, their resource consumption is enormous: a large data center can use millions of gallons of water per day (thousands of tonnes), mainly for cooling, and its electricity use can reach 100 megawatts (MW) or more, comparable to the consumption of a small city. Under this "high consumption, low employment" structure, local dissatisfaction gradually accumulates.
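These resource figures translate roughly as follows. The five-million-gallon value is an illustrative assumption within the stated "millions of gallons" range:

```python
# Translate the water and power figures: gallons to metric tonnes, and
# a 100 MW continuous draw into annual TWh.
TONNES_PER_GALLON = 3.785e-3   # one US gallon of water weighs ~3.785 kg

daily_gallons = 5e6                                # assumed daily water use
daily_tonnes = daily_gallons * TONNES_PER_GALLON   # ~18,900 tonnes/day

mw = 100
twh_per_year = mw * 8760 / 1e6                     # MWh/year -> TWh/year
print(f"{daily_tonnes:,.0f} tonnes/day; {twh_per_year:.2f} TWh/year at {mw} MW")
```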

For example, in Loudoun County, Fairfax County, and Prince William County in Virginia, residents have protested data center expansion multiple times, arguing that it drives up housing prices, occupies land, and increases pressure on the power grid. According to reports, by 2025, at least 25 proposed data center projects were canceled due to opposition from local communities. In Oregon, some projects have been limited by local governments due to tight water resources. The explicit emergence of these “infrastructure externalities” means that data centers are no longer merely commercial investment projects, but have become local political issues in the United States.

Overall, U.S. data center development is constrained by three overlapping forces: first, physical bottlenecks in grid infrastructure that limit the speed of compute expansion; second, structural instability during the energy transition that raises the cost and risk of electricity supply; and third, local social and resource conflicts that weaken the political feasibility of projects. Together these form a new constraint mechanism, gradually revealing the institutional limits, in the AI era, of what was once a highly flexible, market-driven expansion model.

China’s Unique Approach

China’s power grid has unique advantages in the global energy system. These advantages come from scaled development, engineering capability, institutional coordination, and deep integration of technology and industrial supply chains. It not only supports domestic industrialization, urbanization, and digitalization, but also has become an important strategic variable in the global energy transition and data center industry layout.

China’s power system is the world’s largest and most complex grid—built with the world’s longest and highest-capacity UHV (ultra-high-voltage) transmission network. With UHV’s long-distance, low-loss characteristics, it enables “sending electricity from the west to the east” and “sending electricity from the north to the south.” There are no comparable cases worldwide. UHV enables the grid to connect large renewable energy bases (wind, solar, and hydropower) and deliver power stably to load centers, providing key foundations for renewable energy utilization.

China’s grid interconnection and reliability are also outstanding. The power supply reliability of some cities has reached world-advanced levels. In the Beijing–Tianjin–Hebei region, the Yangtze River Delta, and the Pearl River Delta, the annual average outage time for major cities is less than 1 hour per household. In core areas of major cities such as Beijing, Shanghai, Guangzhou, and Shenzhen, the annual outage time has entered the minute-level range, comparable to top international cities such as Tokyo and Singapore.

A large grid structure brings economies of scale and redundant supply, improving system resilience.

At the same time, China has made breakthrough progress in UHV technology, covering the entire process from equipment manufacturing to engineering design, construction, and operations. In the future, China’s ultra-high-voltage transmission projects will provide leading transmission solutions to more countries, and ultra-high-voltage will become China’s “new calling card.” In the field of UHV substation equipment, Chinese enterprises are world-leading, offering the full range of ultra-high-voltage products and taking the lead in formulating international standards. In power equipment manufacturing and infrastructure construction, China has already formed a complete industrial chain, giving it global advantages in cost, efficiency, and speed for large grid projects.

China’s power grid also has prominent advantages in digital infrastructure and intelligent dispatch. Technologies such as AI-assisted dispatch, smart substations, and unmanned inspections have already been deployed at scale, helping manage a massive and complex multi-source power structure. At the same time, China is among the global leaders in “rapidly increasing the share of new energy while keeping the power grid operating stably.”

These grid advantages are now beginning to translate into international influence.

Under the Belt and Road Initiative framework, China has assisted with or participated in the construction of large power projects in Southeast Asia, Africa, and the Middle East. Multiple Chinese grid standards have entered the IEC and ISO systems, which gives it potential influence and standard-setting power in future upgrades of global power infrastructure (such as high-voltage direct current and smart grids).

China can also play a key role in the global energy transition: for the world to raise the share of new energy, it cannot do without high-voltage transmission and China-made photovoltaic/wind turbine/energy storage equipment—this is also why Musk’s team came to China to purchase the equipment. It can be said that China’s scaled experience in power grids and the energy system has a demonstration effect for the world.

Unlike the United States, which relies on global supply chains, China depends more on domestic industries for hardware and key materials, such as domestically made servers, AI chips, optical fiber, and energy storage equipment. It also emphasizes integrating domestic resources, using green energy, and coordinating with national planning. This reflects an infrastructure-and-digital strategy with Chinese characteristics, while also guiding domestic companies to participate in global industrial chains, balancing self-reliance with international cooperation.

In the energy sector, China is strongly promoting the combination of data centers and clean energy. It is laying out “green data centers” powered by solar, wind, and nuclear energy to reduce reliance on fossil energy and improve sustainable development capabilities.

Strategically, China emphasizes combining regional hubs with national planning—building hyperscale data centers in core urban clusters such as the Guangdong–Hong Kong–Macao Greater Bay Area, the Yangtze River Delta, and the Beijing–Tianjin–Hebei region. At the same time, it connects regions nationwide through a “compute power network,” forming capabilities for compute dispatch and cross-province coordination.

But it needs to be noted that in promoting large-scale data center development, China also faces unique energy and structural risks.

First, the transition in the energy mix is a long-term undertaking, and at the current stage China's data centers still depend significantly on coal-fired power, bringing substantial carbon emissions and environmental pressure. In 2024, coal-fired power still accounted for about 45% of China's total installed capacity and remained the mainstay of generation. Data centers' demand for highly reliable power makes it realistically difficult to reduce coal's share in the short term. And because energy-intensive industries are concentrated in eastern coastal areas while energy production is concentrated in central and western China, coordinating carbon reduction with energy supply is highly difficult.

Second, China’s data center development shows a clearly defined east–west distribution pattern: compute hubs are mainly concentrated in eastern coastal cities such as Beijing, Shanghai, and Shenzhen, while power supply relies on central and western regions. Long-distance electricity transmission inevitably leads to line losses, increasing dependence on the stability of power grids in central and western China.

A highly concentrated data center layout also brings systemic risks and resilience weaknesses: natural disasters, cyberattacks, or policy changes could trigger cascading impacts on nationwide AI services, cloud computing, and internet infrastructure.

So this year's government work report raised the concept of "power-compute coordination": promoting the integration of compute and electricity, optimizing the power supply structure, and eliminating stability risks, among other goals. This is a longer-term plan, and it will take time to implement. In any case, one thing is certain: when it comes to energy supply, American AI cannot "cross the river by feeling for stones" the way China can.

Source of this article: Tencent Technology
