AI is costing data centers a fortune in energy bills
Every time you ask ChatGPT something, it uses an estimated 2.9 watt-hours of energy, nearly 10 times what a Google search needs.
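That comparison is simple arithmetic. Here's a back-of-the-envelope sketch; the 0.3 Wh figure for a Google search is an assumption (a commonly cited estimate, not stated above):

```python
# Back-of-the-envelope comparison of per-query energy use.
CHATGPT_WH_PER_QUERY = 2.9   # estimate cited in the text
GOOGLE_WH_PER_SEARCH = 0.3   # assumed: a commonly cited estimate

ratio = CHATGPT_WH_PER_QUERY / GOOGLE_WH_PER_SEARCH
print(f"A ChatGPT query uses ~{ratio:.1f}x the energy of a Google search")

# Scale it up: energy for one million queries, in kilowatt-hours.
queries = 1_000_000
kwh = CHATGPT_WH_PER_QUERY * queries / 1000
print(f"{queries:,} queries is roughly {kwh:,.0f} kWh")
```

Small per-query numbers add up fast: at this rate, a million queries lands in the thousands of kilowatt-hours.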
With more people using AI everywhere, data centers are working overtime: their electricity use is already climbing, and some projections suggest it could nearly triple within a few years.
AI servers alone are set to eat up even more power in the next few years.
Task-specific models can be way more efficient
AI's growing appetite means AI workloads now account for roughly 5% to 15% of data centers' total power use, driving up electricity bills and carbon emissions, especially since some projections indicate that by 2030 a substantial share of the extra demand could be met by gas and coal.
But there's some good news: newer tech is getting smarter about saving energy. Task-specific models can be up to 30 times more efficient than general-purpose ones, and GPUs and other hardware keep improving.
Tricks like better cooling and power limits also help keep things greener, though some of these measures trade a bit of performance for the savings.
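The power-limit trade-off is worth seeing in numbers. A minimal sketch with entirely illustrative figures (the wattages and throughput drop below are assumptions, not measurements from the text): capping power often cuts draw more than it cuts speed, so energy *per task* falls even though each task takes longer.

```python
# Illustrative (assumed) numbers for a GPU with and without a power cap.
default_watts, capped_watts = 300.0, 225.0   # assumed power draw
default_speed, capped_speed = 1.00, 0.90     # assumed relative throughput

# Energy per task scales with power divided by throughput.
energy_default = default_watts / default_speed
energy_capped = capped_watts / capped_speed
savings = 1 - energy_capped / energy_default

print(f"Energy per task drops ~{savings:.0%} for a ~10% slowdown")
```

Under these assumed numbers, a 25% power cap costs 10% of throughput but saves about 17% of the energy each task consumes, which is the kind of trade-off the "greener, but slower" caveat is pointing at.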