This story was originally published on the author’s Substack, Field Notes with Alexander C Kaufman.
Artificial intelligence is driving up demand for electricity—the only question is how much, and what provides the power.
Over the next three years, the Lawrence Berkeley National Laboratory estimates, AI’s thirst for power will double or triple. Last month, OpenAI unveiled its Stargate Project, a plan to invest $500 billion in the infrastructure for artificial intelligence over the next four years that includes adding 25 gigawatts of new electricity capacity.
Right now, the most likely source of electricity to power those data centers is gas. GE Vernova, the country’s leading manufacturer of gas turbines, has an order book roughly a decade long and recent deals to work with ExxonMobil and Chevron on generating plants to power the AI boom. While President Donald Trump retained the Biden administration’s executive order opening up federal lands to new data centers, he rescinded his predecessor’s other action directing agencies to consider the climate impacts of AI. Combined with Trump’s full-frontal assault on the fastest-growing sources of zero-carbon electricity, the US market for natural gas looks set to grow.
But new research from Duke University shows how the grid could absorb four Stargate Projects without building a single new power plant. The study, published Tuesday morning, found that if new loads like data centers are willing to scale back their power usage by just 0.5 percent per year, the existing grid has enough spare supply to serve nearly 100 gigawatts of additional demand.
The duration of power curtailment would be less than three hours at a time, on average—and it would be planned. That’s roughly within the bounds of what the Uptime Institute, an industry group that sets standards for data centers, considers high performance for unplanned outages.
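As a rough back-of-envelope check, using only the figures quoted in this article plus the 8,760 hours in a year, the numbers hang together something like this:

```python
# Back-of-envelope check of the figures quoted above (not from the Duke study's
# own modeling; it simply combines the numbers cited in this article).

HOURS_PER_YEAR = 24 * 365            # 8,760 hours
curtailment_share = 0.005            # "just 0.5 percent per year"
avg_event_hours = 3                  # "less than three hours on average at a time"
stargate_capacity_gw = 25            # new capacity planned under Stargate
freed_headroom_gw = 100              # roughly what the study finds could be freed up

curtailed_hours = HOURS_PER_YEAR * curtailment_share
print(f"0.5% of a year is about {curtailed_hours:.0f} hours of curtailment")   # ~44 hours
print(f"At under {avg_event_hours} hours per event, that's roughly "
      f"{curtailed_hours / avg_event_hours:.0f} planned events a year")        # ~15 events
print(f"100 GW of freed headroom is about "
      f"{freed_headroom_gw / stargate_capacity_gw:.0f} Stargates' worth")      # 4 Stargates
```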
The result comes just weeks after China’s DeepSeek AI program went public with a major breakthrough in its computational approach, upending the debate over how much new power capacity is needed in the years to come.
The “clearest takeaway” from the Duke study is that we might not need as many new gas power plants in the immediate future—or at all, according to the report’s lead author, who said the findings should help bring new data centers online even faster.
“If resources were no object and we had an infinite supply of electrical engineers, lineworkers, construction equipment, and transformers, that’d be one thing. But we don’t,” Tyler Norris, a fellow at Duke’s Nicholas Institute for Energy, Environment and Sustainability, told me. “There’s a limited set of upgrades we can get done in a certain timeframe. This can buy us some time to figure that out.”
The findings show that, simply by dialing down power usage at certain data centers at specific times, US regulators can buy time to determine which parts of the grid need the most urgent attention, and even give cleaner generating technologies, such as new nuclear reactors that take far longer to build than gas plants, the time they need to get started.
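To make the idea of “dialing down” concrete, here is a purely illustrative sketch of how a data center scheduler might respond to a planned curtailment window by pausing deferrable jobs, such as AI training batches, while leaving user-facing services running. The CurtailmentWindow signal, the job fields, and the numbers are invented for illustration; they are not drawn from the Duke study or from any real utility program.

```python
# Purely illustrative: a hypothetical curtailment responder for a data center.
# Real demand-response programs use utility- or ISO-specific protocols.

from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class CurtailmentWindow:
    start: datetime
    duration: timedelta          # the study assumes windows under ~3 hours on average
    target_reduction_mw: float   # how much load the site agrees to shed


def respond_to_curtailment(window: CurtailmentWindow,
                           deferrable_jobs: list[dict]) -> list[dict]:
    """Pause enough deferrable workloads to meet the requested reduction,
    rescheduling them to resume once the window ends."""
    shed_mw = 0.0
    rescheduled = []
    for job in sorted(deferrable_jobs, key=lambda j: j["priority"]):
        if shed_mw >= window.target_reduction_mw:
            break
        job["resume_at"] = window.start + window.duration
        shed_mw += job["power_draw_mw"]
        rescheduled.append(job)
    return rescheduled


# Example: a 2.5-hour planned window asking the site to shed 40 MW.
window = CurtailmentWindow(datetime(2025, 7, 1, 16, 0), timedelta(hours=2.5), 40.0)
jobs = [
    {"name": "training-batch-a", "priority": 1, "power_draw_mw": 25.0},
    {"name": "training-batch-b", "priority": 2, "power_draw_mw": 20.0},
    {"name": "inference-serving", "priority": 9, "power_draw_mw": 30.0},
]
print([j["name"] for j in respond_to_curtailment(window, jobs)])
# -> ['training-batch-a', 'training-batch-b']  (inference keeps running)
```

The point of the sketch is only that curtailed work is deferred, not lost: the paused jobs resume once the planned window closes.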
“We’re not trying to claim we’re not going to need more capacity. We absolutely are. But it’s going to be different from jurisdiction to jurisdiction,” Norris said. “But, at minimum, this should be a tool in the toolbox to get new loads online more quickly, so we can win this economic race.”
For months now, data center companies have been seeking to buy up power from nuclear plants. In September, Microsoft agreed to spend $16 billion bringing the Three Mile Island station back online to power its AI efforts. Amazon and Google, meanwhile, invested in new small modular reactor companies. But there are only so many defunct nuclear plants to try to revive—and new, as-yet untested SMR technologies are still years away.
The amount of existing electricity capacity that can be directed to new data centers that are prepared to curtail power usage, on the other hand, is greater than what “the entire US nuclear fleet of 94 [reactors] can supply,” Varun Sivaram, a senior fellow for energy and climate at the Council on Foreign Relations, told me.
“The guy just proved you could have 100 gigawatts more data centers in the United States by doing absolutely nothing,” Sivaram, who is launching a new startup aimed at promoting demand-response technology that can reduce power usage at data centers at certain times, said of Norris. “That’s shocking.”
But the implications are much bigger. While data centers are attracting much of the hype, they are hardly the only thing spurring growth in electricity demand in the US for the first time in decades.
Electrification of automobiles, factories, and buildings—not to mention the rapidly surging need for air conditioning to survive increasingly brutal heat—is set to add more demand to the grid.
When public utility commissioners consider new power plants to supply that new demand, advocates “should be able to wave this study and say, ‘Have you done everything you can do to ensure flexibility in this region before you build more fossil fuel power plants?’
“This is not about data centers,” Costa Samaras, the director of the Wilton E. Scott Institute for Energy Innovation at Carnegie Mellon University, told me. “This is about the electrification project and getting serious about flexibility…as a way to free up space for all the things we need to do for electrification.”
With all these new demands for electricity, the only way to ensure that shifting from internal combustion engine cars to battery-powered ones, or from coal-fired factories to electric ones, actually cuts emissions is to make sure new power plants are well planned and clean, said Samaras, who previously served as a senior energy adviser to the White House Office of Science and Technology Policy.
Data centers, he said, “are going to set the table for how we respond” to new demand for electricity. “If the response is to do nothing,” he added, “we have to build a bunch of new fossil plants. There goes clean electrification.”