JEFFREY WU
By advancing open AI models and investing in renewable-energy capacity, China is creating an energy-compute flywheel: more clean power enables more compute, which in turn optimizes the grid. This puts China at a distinct advantage vis-à-vis the West, where energy constraints are creating digital bottlenecks.
HONG KONG – The next stage of the global AI race will be decided not by algorithms or chips, but by electricity – and that puts China at a distinct advantage. While Western tech giants are emphasizing closed, capital-intensive models that demand enormous computing power, China is embracing open-source AI and massively expanding its renewable- and nuclear-energy capacity, thereby positioning itself to deploy powerful AI technologies at scale without breaking the bank.
These differences reflect a more fundamental split. Whereas the United States and its allies have treated AI as a proprietary technology, China has approached it as public infrastructure, building an open AI ecosystem that reflects the same philosophy it applied to manufacturing: broad adoption, fast iteration, and relentless cost reduction. Chinese open-source models like DeepSeek, Qwen, and Kimi are not just scientific achievements; they are strategic instruments designed for participation, and they are transforming the economics of AI.
DeepSeek’s latest version reportedly matches the capabilities of frontier systems, like those being developed by US companies, at a fraction of their compute cost. Qwen’s and Kimi’s API prices have fallen by orders of magnitude. In purely economic terms, the marginal cost of “thought” is collapsing: the inference costs of some Chinese models are a tenth or less of those incurred by OpenAI’s GPT-4.
The cheaper AI becomes, however, the more of it the world consumes, with each token saved inviting a thousand more to be generated. It is the Jevons paradox at work: the same dynamic that once powered the coal age now drives the digital one. In China, this is by design: low inference costs, together with the open weights of Chinese models, are intended to invite experimentation across universities, startups, and local governments. But all that activity requires energy: the International Energy Agency expects global electricity consumption from data centers to double by 2030 (from 2024 levels), driven largely by AI workloads. Training GPT-4 alone likely consumed millions of kilowatt-hours – enough to power San Francisco for three days.
What once was a contest of algorithms is fast becoming a contest of kilowatts, and China is setting itself up to win. In 2024, the country added 356 gigawatts of renewable-energy capacity – more than the US, the European Union, and India combined – with 91% of all new generation coming from solar, wind, and hydro. Battery storage tripled from 2021 levels, and an ultra-high-voltage grid now carries clean power thousands of miles, from deserts to data hubs.
China is also investing heavily in nuclear energy. According to the Information Technology and Innovation Foundation, its nuclear research and development spending is roughly five times higher than that of the US. As its fourth-generation reactors and small modular designs advance from pilot to deployment, nuclear energy is quietly providing the baseload power that intermittent renewables cannot.