To compete with China on AI, we need a lot more power

Daniela Rus is director of the MIT Computer Science and Artificial Intelligence Laboratory. Nico Enriquez, a student at the Stanford Graduate School of Business, has been a Technology-to-Market Scholar at the Energy Department's Advanced Research Projects Agency (ARPA-E).
AI uses a lot of electricity. A simple ChatGPT-4 query consumes more than 10 times as much electricity as a Google search. The difference arises from complexity: The GPT model uses 1.76 trillion parameters to predict each word in a sentence, while Google uses just 4 million to 10 million to rank screenfuls of webpages. America's aging power grid isn't going to be able to handle AI's increasing load.
Imagine that each electron used to power a single computation (a GPS route, say, or a medical test, or a cybersecurity system) is a car trying to get home. Today, that car must make its way over a complicated network of aluminum power lines that are the equivalent of back roads, many of them crowded, with no interstate highways. That's the U.S. electricity transmission system: more than a century old, a patchwork of 3,200 local and regional utilities, and struggling to carry even the existing load.
Before we can use the most advanced microchips and run our state-of-the-art AI models, we need interstate-type corridors to transmit sufficient, reliable power to our data centers. More than 150 AI experts we have interviewed over the past year affirmed that the limited capacity of our outdated electric grid is holding us back. If America's computer servers hit their limits, the GPS in your car will suddenly take a while to load. The robotics assisting your gallbladder surgery will pause to compute. Even critical cybersecurity and infrastructure systems could be affected.
It takes the United States 10 to 20 years to get approval for and build new transmission lines. Compare that with China's autocratic, centralized efficiency: Beijing has largely consolidated its regional utilities into one state-run organization, and it can build new power lines in under five years. China now has a power system with a speed and scale that may be challenging for the United States to match; from 2014 to 2021, China built 80 times the interregional grid capacity that we did.
China is also the fastest builder of energy generation in the world. In the 11 years it took us to add two reactors to the Vogtle nuclear plant near Augusta, Ga., our only new reactors in three decades, China built almost 40. Meanwhile, between 2022 and 2023 China added more solar capacity than the rest of the world combined.
Our new “hyperscale” data center campuses consume massive amounts of electricity. Amazon’s development outside Pittsburgh uses more electricity than 750,000 homes. Microsoft and OpenAI are planning a “Stargate” campus that would use more power than 3 million homes. These sites often need to be in or near population centers, where our grid is already stretched to its limit.
We must innovate to ensure we can power such concentrated computation. First, the fastest and greenest way to address the problem is to use less power. So we must build more efficient AI software and better computer chips.
This goal is within our reach: Our research at MIT has produced smaller, physics-based AI models that use one-thousandth the energy of ChatGPT yet yield more accurate results. An improvement of that magnitude would mean that calculations that now require supercomputers could run on our laptops and phones. And our colleagues have built computer chips that use a tenth the energy of the world's best AI chips while yielding the same results. As such innovations are put into use, an AI calculation could become no more energy-intensive than a Google query.
Second, we must upgrade our power transmission system more quickly. This means encouraging utilities to be more efficient. Several safe, proven and low-cost “grid-enhancing technologies” could save taxpayers billions if widely deployed. One example is upgrading existing metal power conductors with new composite materials that can carry two or three times more electricity; a recent study estimated that these could triple grid capacity at 25 percent of the cost. And because they use existing transmission towers, they could be deployed in one to five years. Many countries in Europe and Asia already use these systems.
Regulators should encourage these more efficient investments by changing laws so utilities can profit from them, for example by letting them keep 30 cents of every dollar they save their customers. Britain has instituted this kind of incentive system with great success.
Last, we must learn from China and create national high-voltage transmission corridors. To support this, we should take some of the $150 billion our tech companies devote annually to U.S. data center construction and pool it with federal and utility funding. This year, the Energy Department designated 10 National Interest Electric Transmission Corridors with access to inexpensive federal loans and faster permit processes because they would significantly improve our grid’s reliability and reduce consumer energy costs. We should expand these corridors to cover our nation.
To accelerate state and local approvals for transmission lines, Congress should pass legislation proposed this summer by Sens. Joe Manchin III (I-W.Va.) and John Barrasso (R-Wyo.) to speed environmental reviews, limit local comment periods and properly compensate people who have lines built in their backyards.
Much of the United States' economic growth in the 2000s has come from our global leadership in tech. AI is the next wave. The United States has the talent, investor base, corporations and research institutions to build the most advanced AI models. But without a powerful electricity highway system, our great technology advances will be confined to the back roads.