Why You - and the Planet - Are Paying for the AI Gold Rush

Sierra Club recently released Demanding Better, a framework of best practices for large-load energy users seeking to green the grid.

The State of California has its roots in a frenzy of economic extraction that left ecological (and human) catastrophe in its wake. In the race to access gold first and fast, companies used hydraulic mining that carved chasms and ravines into rock, collapsed hillsides into rivers, and permanently altered the landscape. To remove the gold from the stone in which it was embedded, mining companies used mercury processing, releasing between three and eight million pounds of mercury into California’s rivers (according to one estimate cited by the USGS). Much of this mercury was then consumed by bacteria and converted into its organic form, methylmercury, a neurotoxin that bioaccumulates in fish and can cause acute disease in humans. The rush began with the discovery of gold in 1848, peaked in 1852, and tailed off quickly, ending as early as 1855 by some accounts and certainly by the end of the decade.

The California Gold Rush is not the only example of a rapidly accelerating industry burning itself out in a matter of years, only to leave a centuries- or even millennia-long ecological legacy. But its location makes it a particularly apropos cautionary tale for artificial intelligence (AI) cheerleaders, many of whom are located in Silicon Valley offices only a few hours’ drive from where gold was discovered in 1848. Amazon, Facebook, Google, and Microsoft (known as “hyperscalers” in the industry) are spending billions of dollars on infrastructure for AI facilities–the data centers that perform the energy-intensive computations necessary to produce ChatGPT responses and Google’s “AI Overview” in search results, among other applications. 

Because the large-language models used by AI require more computing power, they also require more electricity. According to Goldman Sachs, a ChatGPT inquiry requires almost 10 times as much electricity as a Google search. New AI data centers will consume enormous amounts of electricity. Analysts predict that increased data center load (including but not limited to AI machines) will require approximately 50 gigawatts of new power plants and will consume an additional 400 terawatt-hours (a terawatt-hour is a trillion watt-hours) per year by 2030. A large gas plant has a nameplate capacity of around 1 gigawatt; adding this much demand to the grid threatens to prompt the construction of fifty new large gas plants. This growth is not just staggering in absolute terms; it is driving growth of electricity demand on the grid as a whole. According to an analysis by Goldman Sachs, U.S. electricity demand has grown by 0.5% per year on average over the past twenty years. That rate of growth is expected to quintuple between now and 2030, due in large part to data center demand. As of 2023, data centers made up 3% of U.S. power demand; by the end of this decade, that share is predicted to grow to 8-9% (according to Goldman Sachs in April) or as high as 11-12% (according to a September report by McKinsey). This means that new fossil fuel generation is being built, in significant part, to power these data centers.
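For readers who want to check the arithmetic, the cited figures can be tied together in a few lines. The sketch below is purely illustrative: it uses only the round numbers quoted above (50 gigawatts, 400 terawatt-hours, a roughly 1 gigawatt gas plant), and the implied capacity factor is our own back-of-the-envelope derivation, not a figure from any cited report.

```python
# Back-of-the-envelope check of the demand figures cited above.
# All inputs are the article's round numbers; derived values are ours.

new_load_twh_per_year = 400     # projected additional annual consumption by 2030
new_capacity_gw = 50            # projected new generating capacity needed
large_gas_plant_gw = 1.0        # rough nameplate capacity of one large gas plant

# Number of large gas plants implied by the new capacity figure
plants_needed = new_capacity_gw / large_gas_plant_gw
print(f"Equivalent large gas plants: {plants_needed:.0f}")        # -> 50

# Implied average utilization of that new capacity (our sanity check,
# not a reported figure): energy / (capacity * hours in a year)
hours_per_year = 8760
capacity_factor = (new_load_twh_per_year * 1000) / (new_capacity_gw * hours_per_year)
print(f"Implied capacity factor: {capacity_factor:.0%}")          # -> 91%

# Growth-rate claim: 0.5% per year historically, expected to quintuple
historical_growth = 0.005
projected_growth = historical_growth * 5
print(f"Projected annual demand growth: {projected_growth:.1%}")  # -> 2.5%
```

The very high implied capacity factor is consistent with the nature of the load: unlike homes and factories, data centers draw power around the clock.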

Like the hydraulic mining of the 1850s, this investment risks long-term ecological damage–and near-term costs for everyone who uses electricity in the United States, whether or not you ever ask ChatGPT a question or even live near one of the new data centers. Sierra Club has released a framework for meeting some of this demand with 24/7 clean energy. But even in the weeks since the report was released, AI’s biggest investors have taken steps that suggest that they are more interested in extracting what they can from the current boom than preserving our planet or protecting ratepayers.


Build Fast and Break the Grid

The builders of AI data centers are beginning to reveal both a preference for speed over all else and a desire to socialize the costs and risks of their unprecedented increase in electrical load. Many hyperscalers have sustainability commitments that would (in theory) require them to power these new data centers with renewable energy: Amazon has announced a net-zero goal by 2040, Google by 2030, and Microsoft aims to be carbon negative by 2030. But building renewable resources takes time–time which tech companies in a perceived race for AI market share don’t feel they have. The result: new gas plants and the continued (or even restarted) operation of coal and nuclear plants. A coal plant in an environmental justice community in North Omaha will burn coal for three more years than originally planned–and, some worry, even further into the future–to serve Google and Meta facilities. A power deal with Microsoft is prompting the reopening of the notorious Three Mile Island nuclear plant. And a 1.2 gigawatt gas facility is being built outside Milwaukee, Wisconsin, for a massive new AI data center. In each of these cases, it is the speed with which these data centers can obtain electricity–not the long-term ecological consequences for the surrounding community and a world on the brink of catastrophic climate change–that is driving the choice of energy generation resources.

Serving this new demand is also expensive, both for the data center providers and, potentially, for the rest of us. First, there are direct costs: the cost of building the generation resource, or of the power purchase agreement itself. These costs (as in the case of Three Mile Island) may be absorbed directly by the hyperscaler through a “behind-the-meter” power purchase agreement. But they may also be incurred, at least initially, by the utility responsible for serving load in the area where the data center is being built. If a hyperscaler decides to come online as an industrial customer of a utility–just like a home or factory, only with much, much bigger electrical demand–the utility will either bill the hyperscaler at its usual industrial rates or negotiate a specific contract to serve the giant new load. That contract (known as a tariff) may include provisions designed to ensure that the additional costs of generating electricity and moving it to the new data centers are not paid by all of the utility’s customers through their rates. But there is no guarantee that the utility will not spread the cost of the new generation it has to build across all its customers. For example: if the utility has to build a new gas plant to meet the increased demand from the data center, those costs may be incorporated into all of its customers’ rates. In Ohio, big tech companies are pushing back hard against a proposed rate that would require upfront financial assurances and other provisions designed to protect existing ratepayers from increased costs due to a staggering 4.4 gigawatts of interconnection requests for data centers in central Ohio.
Amazon, Microsoft, and Google are similarly challenging a “large load tariff” proposed by Indiana Michigan Power that would impose a minimum contract term, an exit fee, a minimum monthly billing demand, and increased collateral on companies seeking to add 150 megawatts or more of new electric demand to the utility’s system.

But there are also indirect costs, and these can be massive. They include the transmission lines needed to serve new data centers and ensure reliability on a grid tasked with serving an unprecedented amount of new load. They also include the cost of providing capacity. “Capacity” can be thought of as the guarantee that a utility will be able to provide electricity at the right time. Utilities buy (and produce) electricity to serve their customers, but they are also required to have a certain amount of capacity–the ability to produce electricity on demand. In some states, utilities largely own their own generation resources and thus “self-supply” capacity. But in others, especially in the mid-Atlantic region, utilities do not own their own generation resources and instead must buy capacity from a regional market to meet requirements set by the grid operator. Projected increases in electricity demand due to data centers are driving the prices for this capacity sky high. For example, in the Dominion zone of Virginia, where data centers made up 24 percent of load in 2023, capacity prices were 65 percent higher than in the rest of the grid operator’s territory.

In most places in the U.S., grid operators dispatch generators beginning with the least expensive to operate. Wind, solar, and battery resources, which generate electricity at essentially zero marginal cost once wired to the grid, go first. The most expensive resources on a marginal basis are usually coal or (depending on fuel prices) gas. The price paid for electricity is set by the cost of the most expensive generation resource operating on the system. When AI data centers are added to existing grids, power plants that typically sat in reserve because they were too expensive to operate day-to-day come online, driving up both pollution and the cost of electricity for everyone on the system. In restructured electricity market regions that depend on price signals to ensure there is enough electricity to meet demand, this phenomenon affects not just the price of electricity but the wholesale cost of capacity as well. When energy supply can’t keep up, the cost of energy for everyone can shoot through the roof, as it did in Texas during Winter Storm Uri. Energy experts describe an effect called “DRIPE,” or Demand Reduction Induced Price Effect, in which even a small demand reduction in a big system can provide large energy cost benefits for everyone. The demand increase coming around the corner will do exactly the opposite, and could result in massive price increases for everyone.
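The dispatch logic described above can be made concrete with a toy model. The sketch below is purely illustrative–the fleet, capacities, and marginal costs are hypothetical round numbers, not real market data–but it shows the key mechanism: the last, most expensive generator dispatched sets the price for every megawatt-hour on the system.

```python
# Illustrative merit-order dispatch: generators are dispatched cheapest-first,
# and the marginal (last-dispatched) unit sets the price everyone pays.
# The fleet and costs below are hypothetical, for demonstration only.

def clearing_price(generators, demand_mw):
    """Dispatch cheapest-first; return (price, list of (name, MW dispatched)).

    generators: list of (name, capacity_mw, marginal_cost_per_mwh)
    """
    price = 0.0
    dispatched = []
    remaining = demand_mw
    for name, capacity, cost in sorted(generators, key=lambda g: g[2]):
        if remaining <= 0:
            break
        used = min(capacity, remaining)
        dispatched.append((name, used))
        price = cost              # the marginal unit sets the system price
        remaining -= used
    if remaining > 0:
        raise ValueError("demand exceeds available capacity")
    return price, dispatched

fleet = [
    ("wind+solar",      400, 0),    # near-zero marginal cost, dispatched first
    ("nuclear",         300, 10),
    ("gas (efficient)", 300, 40),
    ("gas (peaker)",    200, 120),  # normally sits in reserve
]

# Before new data centers: cheaper resources cover demand
print(clearing_price(fleet, 900)[0])     # -> 40 ($/MWh)

# A large new data center load pushes the expensive peaker online,
# raising the price for *every* megawatt-hour on the system
print(clearing_price(fleet, 1050)[0])    # -> 120 ($/MWh)
```

Note that a modest increase in demand (here, about 17 percent) triples the system price, because the price jump reflects the cost of the marginal unit, not the average cost of the fleet. This is the mirror image of the DRIPE effect: small demand changes at the margin move prices for everyone.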


Preventing Climate Catastrophe is a “Timed Test”

But the damage wrought by climate change threatens to dwarf these immediate-term costs. These costs are monetary–spending to rebuild after major climate disasters made more frequent and more intense by warming seas, or to attempt to protect cities from rising sea levels–but also ecological and human: the loss of biodiversity and of thousands or millions of human lives to storms, heat, and famine. On a national scale, the emissions associated with the use of fossil fuels to power large-language-model machines are a relatively small percentage of overall emissions. But that is projected to change, rapidly–Goldman Sachs predicts data centers’ portion of U.S. electrical demand will double by 2030; McKinsey, that it will triple or quadruple. And in some places (notably, Virginia), data center demand already makes up close to half of the overall demand for purposes of utility planning over the next decade–and thus drives the decision as to whether a new gas plant will be built or a coal plant retired.

Once the carbon dioxide and methane associated with these coal and gas plants are in the atmosphere, the warming associated with these emissions is locked in. Carbon dioxide remains in the atmosphere, absorbing heat, for thousands of years. Methane breaks apart much more quickly–in approximately 12 years–but has 86 times the global warming impact of carbon dioxide, pound for pound, over a 20-year period. And once warming accelerates, it triggers a series of feedback loops: the melting of Arctic ice reduces reflection, leading to more absorption of heat from the sun; drought in the Amazon may destroy its capacity to act as a carbon sink; and the carbon emissions associated with more intense and frequent forest fires rival those of whole countries. As Bill McKibben puts it, “climate change is a timed test.” Getting to zero carbon dioxide and methane emissions in 2040 will not prevent the worst impacts if we don’t cut emissions as much as possible, as soon as possible. That goal is incompatible with extending the lives of coal plants by three or more years. Given this hard reality, hyperscalers simply cannot square their purported commitments to green energy on a scientifically defensible timeline with their business priority of bringing as many AI machines online as fast as possible.

This need for rapid emissions reductions is also why AI advocates cannot simply point to large-scale electrification (and thus massive load growth) to evade responsibility. It matters whether increased electricity demand comes online before or after the generators used to meet it are converted to renewable resources. The current timescale for electrification is too slow, but so is the transition to renewable resources on the grid, and surging load now exacerbates the problem. Whatever the U.S. grid will look like in 2035, adding data centers faster than the grid can add solar, wind, and batteries until then will harm the climate. And while electrification shifts emissions from industrial processes to the power sector, adding AI machines represents an absolute increase in emissions.

The current and near-term results of the AI buildout are ironic, because the hyperscalers have previously articulated admirable commitments to greening the grid and, until very recently, could be considered potential allies in promoting the growth of wind, solar, and batteries. But recent developments suggest these companies are prepared to jettison their sustainability goals in hopes of striking AI gold. There is reason to believe that the rush to market of some tools is driven less by actual consumer demand than by a desire to boost stock prices by catering to investors’ fear of being left behind. As some analysts have pointed out, AI is now largely a technology without an application. It is unclear in which industries current large-language-model AI technology will prove economically efficient, much less socially useful. Like cryptocurrency miners before them, AI hyperscalers appear poised to set back U.S. climate goals by increasing fossil fuel generation and forestalling coal retirements–driving up the cost of electricity in service of short-term profits on a technology with questionable long-term societal benefits or viability. Silicon Valley should take a step back from this precipice, consider the rivers of its home state still bearing the contamination of 150 years ago, and prioritize the long-term sustainability not just of the AI industry but of the planet in finding electricity to power its latest invention. We all must Demand Better of tech companies.
