
GPT-5's energy consumption could match that of 1.5M US homes
What's the story
OpenAI's latest AI model, GPT-5, is said to consume significantly more energy than its predecessors. Experts who have studied the energy and resource consumption of AI models for years say this follows a clear trend. The enhanced capabilities of GPT-5, such as website creation and solving complex problems, come at a steep cost in power consumption.
Information gap
Company has not disclosed model-specific energy use
Despite the increased energy use, OpenAI has not shared any official data on the power consumption of its models since GPT-3 was launched in 2020. In June, CEO Sam Altman shared some figures related to ChatGPT's resource consumption on his blog. However, these numbers were not specific to any model and lacked supporting documentation.
Expert insights
Models like GPT-5 consume more power
Rakesh Kumar, a professor at the University of Illinois, said that more complex models like GPT-5 consume more power during both training and inference. He also noted that GPT-5 is targeted at long thinking tasks, which would further increase its power consumption compared to GPT-4. Researchers from the University of Rhode Island's AI lab found that GPT-5 can use up to 40Wh of electricity for a medium-length response of about 1,000 tokens.
Energy comparison
Average energy consumption for a medium-length response
The average energy consumption for a medium-length response from GPT-5 is just over 18Wh, according to the researchers' dashboard. This figure is higher than that of all other models they have benchmarked except OpenAI's o3 reasoning model and R1 from the Chinese AI company DeepSeek. The 18Wh consumed by a GPT-5 response is equivalent to keeping an incandescent bulb lit for 18 minutes.
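The bulb equivalence is easy to sanity-check. A quick sketch, assuming a typical 60W incandescent bulb (the wattage is not stated in the article):

```python
# Sanity-check: how long would a bulb run on one GPT-5 response's energy?
response_energy_wh = 18   # average energy per medium-length GPT-5 response
bulb_power_w = 60         # assumed typical incandescent bulb rating

hours = response_energy_wh / bulb_power_w
minutes = hours * 60
print(minutes)  # 18.0
```

At 60W, 18Wh works out to exactly 18 minutes of bulb time, which matches the comparison above.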
Impact assessment
Total energy consumption of GPT-5
With ChatGPT reportedly handling 2.5 billion requests a day, the total energy consumption of GPT-5 could match the daily electricity needs of 1.5 million US homes. These numbers are in line with expectations for GPT-5's energy consumption, given its likely larger size compared to previous models. But note that OpenAI has not disclosed the parameter counts for any of its models since GPT-3, which had 175 billion parameters.
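The 1.5-million-homes figure can be reproduced with back-of-envelope arithmetic. A rough sketch, assuming an average US household uses about 30 kWh of electricity per day (a commonly cited figure; the article does not state the assumption it used):

```python
# Back-of-envelope: total daily energy if every request cost a GPT-5 response.
requests_per_day = 2.5e9        # reported daily ChatGPT requests
energy_per_request_wh = 18      # researchers' average for a GPT-5 response
home_daily_wh = 30_000          # assumed avg US home usage: ~30 kWh/day

total_wh = requests_per_day * energy_per_request_wh
total_gwh = total_wh / 1e9
homes_equivalent = total_wh / home_daily_wh

print(total_gwh)          # 45.0  (GWh per day)
print(homes_equivalent)   # 1500000.0
```

That is roughly 45 GWh per day, or the daily electricity use of about 1.5 million homes, consistent with the estimate above.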
Size impact
Resource footprint of AI models
A disclosure from French AI company Mistral found a "strong correlation" between a model's size and its energy consumption. Shaolei Ren, a professor at the University of California, Riverside, who studies the resource footprint of AI, said that based on GPT-5's model size, its resource consumption should be orders of magnitude higher than GPT-3's.