
AI to consume 50% of data center power by year-end
What's the story
By the end of 2025, artificial intelligence (AI) systems could account for nearly half of the total energy consumption in data centers, a new analysis has found.
The study was conducted by Alex de Vries-Gao, founder of the Digiconomist tech sustainability website. His findings were published in Joule, a sustainable energy journal.
The International Energy Agency (IEA) also predicts AI will require almost as much energy by 2030 as Japan uses today.
Energy analysis
AI's energy consumption in data centers
De Vries-Gao's analysis is based on the power consumed by chips from NVIDIA and Advanced Micro Devices (AMD), which are used to train and operate AI models.
The study also includes the energy consumption of chips from other companies like Broadcom.
According to the IEA, all data centers (excluding cryptocurrency mining) consumed 415 terawatt-hours (TWh) of electricity last year.
De Vries-Gao estimates AI could already account for 20% of that total.
Consumption factors
Factors influencing data center energy consumption
De Vries-Gao took several factors into account in his calculations, including a data center's energy efficiency and the electricity consumed by the cooling systems for servers handling AI workloads.
He estimates that by the end of 2025, AI systems could account for up to 49% of total data center power consumption (excluding crypto mining).
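The figures above can be sanity-checked with simple arithmetic. The sketch below (an illustrative calculation, not part of the study) applies the cited 20% and 49% shares to the IEA's 415 TWh data-center total; note the 49% share would apply to end-of-2025 totals, which will likely exceed 415 TWh, so the result is only a rough lower bound.

```python
# Back-of-the-envelope check of the AI shares cited above,
# using the IEA's 415 TWh data-center total (excluding crypto mining).
TOTAL_DC_TWH = 415  # IEA estimate for all data centers last year


def ai_share_twh(share: float, total_twh: float = TOTAL_DC_TWH) -> float:
    """Return AI's implied electricity consumption in TWh for a given share."""
    return total_twh * share


current_estimate = ai_share_twh(0.20)   # ~20% share today
year_end_estimate = ai_share_twh(0.49)  # up to ~49% by end of 2025

print(f"~20% share: {current_estimate:.0f} TWh")  # ~83 TWh
print(f"~49% share: {year_end_estimate:.0f} TWh")  # ~203 TWh
```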
Demand factors
Potential slowdown in AI hardware demand
De Vries-Gao also highlighted potential factors that could slow down hardware demand, such as waning demand for applications like ChatGPT and geopolitical tensions affecting AI hardware production.
He cited restrictions on China's access to advanced chips as an example of this issue, noting that such constraints have pushed developers toward more efficient models. "These innovations can reduce the computational and energy costs of AI," he said. However, any efficiency gains could encourage even more AI use, potentially increasing hardware demand further.
Forecast
AI consumption could reach 23GW
De Vries-Gao predicts AI's power demand could hit 23 gigawatts (GW), double the total power consumption of the Netherlands.
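As a rough cross-check (assuming, for illustration, that the 23 GW figure represents continuous draw), that level of power corresponds to roughly 200 TWh of electricity per year, consistent in scale with the share-based estimates cited earlier:

```python
# Convert 23 GW of continuous power draw into annual energy consumption.
HOURS_PER_YEAR = 8760  # 365 days * 24 hours


def annual_twh(power_gw: float, hours: int = HOURS_PER_YEAR) -> float:
    """Return annual energy in TWh for a constant power draw in GW."""
    return power_gw * hours / 1000  # GW * h -> GWh, then / 1000 -> TWh


print(f"{annual_twh(23):.0f} TWh per year")  # ~201 TWh
```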
He also noted that multiple countries trying to build their own AI systems—a trend dubbed "sovereign AI"—could drive up hardware demand.
De Vries-Gao cited US data center start-up Crusoe Energy, which has secured 4.5GW of gas-fired power capacity for its infrastructure; OpenAI, through its Stargate joint venture, is among its potential customers.
Environmental impact
AI drives threaten environmental targets
Microsoft and Google have acknowledged that their AI drives are jeopardizing their chances of meeting internal environmental targets.
De Vries-Gao observed that information on AI's power demands has become increasingly scarce, calling it an "opaque industry."
The EU AI Act requires AI companies to disclose the energy consumed in training a model, but not in its day-to-day use.
Transparency needed
Call for more transparency on AI's energy consumption
Professor Adam Sobey, mission director for sustainability at the UK's Alan Turing Institute, called for more transparency on how much energy is consumed by AI systems.
He also highlighted how much they could save by helping carbon-emitting industries like transport and energy become more efficient.
"I suspect that we don't need many very good use cases [of AI] to offset the energy being used on the front end," he said.