July 11, 2024
A few weeks ago, I joined a small group of reporters for a wide-ranging conversation with Bill Gates about climate change, its causes and potential solutions. When the topic turned to the issue of just how much energy artificial intelligence was using, Gates was surprisingly sanguine.
“Let’s not go overboard on this,” he said during a media briefing on the sidelines of an event he was hosting in London.
A.I. data centers represent a relatively small additional load on the grid, Gates said. What’s more, he predicted that insights gleaned from A.I. would deliver gains in efficiency that would more than make up for that additional demand.
In short, Gates said, the stunning rise of A.I. will not stand in the way of combating climate change. “It’s not like, ‘Oh no, we can’t do it because we’re addicted to doing chat sessions,’” he said.
That’s an upbeat assessment from a billionaire with a vested interest in the matter. Gates is a big-time climate investor, the former head of Microsoft and still a major stockholder in the company, which is at the center of the A.I. revolution.
And while it’s too early to draw a definitive conclusion on the issue, a few things are already clear: A.I. is having a profound impact on energy demand around the world, it’s often leading to an uptick in planet-warming emissions, and there’s no end in sight.
A.I. data centers have a big appetite for electricity. The so-called graphics processing units, or G.P.U.s, used to train large language models and respond to ChatGPT queries require more energy than your average microchip and give off more heat.
With more data centers coming online almost every week, projections about how much energy will be required to power the A.I. boom are soaring.
One peer-reviewed study suggested A.I. could make up 0.5 percent of worldwide electricity use by 2027, or roughly what Argentina uses in a year. Analysts at Wells Fargo suggested that U.S. electricity demand could jump 20 percent by 2030, driven in part by A.I.
And Goldman Sachs predicted that data centers would account for 8 percent of U.S. energy usage in 2030, up from just 3 percent today.
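For a rough sense of scale, here is a minimal back-of-envelope sketch of what those percentages imply, assuming global electricity consumption of roughly 27,000 terawatt-hours a year and U.S. consumption of roughly 4,000 terawatt-hours. Those baselines are illustrative assumptions, not figures from the studies cited above.

```python
# Back-of-envelope scale check for the projections above.
# The consumption baselines are rough illustrative assumptions,
# not figures taken from the cited studies.

GLOBAL_TWH_PER_YEAR = 27_000   # assumed approximate global electricity use
US_TWH_PER_YEAR = 4_000        # assumed approximate U.S. electricity use

ai_share_2027 = 0.005          # 0.5 percent of worldwide use (study estimate)
dc_share_today = 0.03          # 3 percent of U.S. usage (Goldman Sachs)
dc_share_2030 = 0.08           # 8 percent of U.S. usage (Goldman Sachs)

print(f"A.I. in 2027: ~{GLOBAL_TWH_PER_YEAR * ai_share_2027:.0f} TWh/year")
print(f"U.S. data centers today: ~{US_TWH_PER_YEAR * dc_share_today:.0f} TWh/year")
print(f"U.S. data centers in 2030: ~{US_TWH_PER_YEAR * dc_share_2030:.0f} TWh/year")
```

Under those assumptions, the 0.5 percent figure works out to roughly 135 terawatt-hours a year, which is broadly in line with the Argentina comparison.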
“It’s truly astronomical potential load growth,” said Ben Inskeep, the program director at Citizens Action Coalition, a consumer watchdog group based in Indiana that is tracking the energy impact of data centers.
Microsoft, Google, Amazon and Meta have all recently announced plans to build new data centers in Indiana, developments that Inskeep said would strain the grid.
“We don’t have enough power to meet the projected needs of data centers over the next five to 10 years,” he said. “We would need a massive build-out of additional resources.”
Tech giants are scrambling to get a grip on their energy usage. For a decade now, those same four companies have been at the forefront of corporate efforts to embrace sustainability.
But in a matter of months, the energy demands from A.I. have complicated that narrative. Google’s emissions last year were 50 percent higher than in 2019, largely because of data centers and the rise of A.I. Microsoft’s emissions also jumped for the same reasons, up 29 percent last year from 2020. And Meta’s emissions jumped 66 percent from 2021 to 2023.
In statements, Google and Microsoft both said that A.I. would ultimately prove crucial to addressing the climate crisis, and that they were working to reduce their carbon footprints and bring more clean energy online. Amazon pointed to a statement detailing its sustainability efforts.
There are two ways for tech companies to meet the demand: tap the existing grid, or build new power plants. Each poses its own challenges.
In West Virginia, coal-fired power plants that had been scheduled to retire are being kept online to meet the energy needs of new data centers across the border in Virginia.
And across the country, utilities are building new natural-gas infrastructure to support data centers. Goldman Sachs anticipates that “incremental data center power consumption in the U.S. will drive around 3.3 billion cubic feet per day of new natural gas demand by 2030, which will require new pipeline capacity to be built.”
At the same time, the tech giants are working to secure a lot more power to fuel the growth of A.I.
Microsoft is working on a $10 billion plan to develop renewable energy to power data centers. Amazon has said it used 100 percent clean energy last year, though experts have questioned whether the company’s accounting was too lenient.
All that new low-carbon power is welcome. But because the tech companies themselves are consuming that electricity to power new A.I. data centers, pushing up overall demand, it isn’t making the grid as a whole any cleaner.
The energy demands from A.I. are only getting more intense. Microsoft and OpenAI are reportedly planning to build a $100 billion data center, which initial reports suggest may require five gigawatts of power, roughly the output of five nuclear reactors.
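For scale, here is a quick sketch of what that reported figure implies, assuming a typical large nuclear reactor supplies about one gigawatt and the facility runs near full load year-round. Both assumptions are for illustration and do not come from the reporting.

```python
# Rough scale of a hypothetical five-gigawatt data center campus.
# Reactor output and round-the-clock utilization are illustrative assumptions.

CAMPUS_POWER_GW = 5.0          # reported power requirement
REACTOR_OUTPUT_GW = 1.0        # assumed output of one large nuclear reactor
HOURS_PER_YEAR = 8_760

reactors_needed = CAMPUS_POWER_GW / REACTOR_OUTPUT_GW
annual_twh = CAMPUS_POWER_GW * HOURS_PER_YEAR / 1_000  # GW-hours -> TWh

print(f"Equivalent reactors: ~{reactors_needed:.0f}")
print(f"Annual energy if run continuously: ~{annual_twh:.0f} TWh")
```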
And even as companies build more data centers, many of the chips at the heart of the A.I. revolution are getting more power-hungry. Nvidia, the leader in A.I. chips, recently unveiled new products that draw far more power from the grid than their predecessors.
The A.I. boom is generating big profits for some companies. And it may yet deliver breakthroughs that help reduce emissions. But, at least for now, data centers are doing more harm than good for the climate.
“It’s definitely very concerning as we’re trying to transition our current grid to renewable energy,” Inskeep said. “Adding a massive amount of new load on top of that poses a grave threat to that transition.”
[Top photo: A data center in San Jose, Calif. A.I. is having a profound impact on energy demand around the world. Credit: Jim Wilson/The New York Times]