
This IEEE Spectrum article delves into an often-overlooked dimension of generative AI: its energy consumption and infrastructure demands. It opens by noting that the simple act of prompting ChatGPT, repeated billions of times each day, conceals a massive electrical and computational burden.
The author examines estimates of energy per query, centering on a figure of 0.34 watt-hours (Wh) cited by OpenAI’s Sam Altman. Multiplied by the platform’s daily load of 2.5 billion queries, that works out to about 850 megawatt-hours per day, enough to charge thousands of electric vehicles. Over a full year, the consumption rivals the annual electricity use of roughly 29,000 U.S. homes.
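As a rough check on that arithmetic, the sketch below recomputes the figures from the article's own inputs (0.34 Wh per query, 2.5 billion queries per day). The household baseline of roughly 10,500 kWh per year is an assumption drawn from typical EIA estimates, not a number from the article.

```python
# Back-of-envelope check of the article's per-query energy math.
WH_PER_QUERY = 0.34       # Sam Altman's cited per-query figure, in watt-hours
QUERIES_PER_DAY = 2.5e9   # ChatGPT's reported daily query load

daily_mwh = WH_PER_QUERY * QUERIES_PER_DAY / 1e6   # Wh -> MWh
annual_mwh = daily_mwh * 365

# Assumed average U.S. household electricity use (~10,500 kWh/year, per
# typical EIA estimates; this baseline is not from the article).
HOME_KWH_PER_YEAR = 10_500
homes_equivalent = annual_mwh * 1_000 / HOME_KWH_PER_YEAR

print(f"Daily:  {daily_mwh:,.0f} MWh")                    # ~850 MWh
print(f"Annual: {annual_mwh:,.0f} MWh")                   # ~310,000 MWh
print(f"U.S. homes equivalent: {homes_equivalent:,.0f}")  # ~29,500
```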
But the article warns that 0.34 Wh per query is likely a conservative estimate. Some researchers argue that more sophisticated or resource-intensive prompts could demand well over 20 Wh each. Beyond ChatGPT itself, the broader generative AI ecosystem, including competing models and API usage, could push total consumption into the range of 15 terawatt-hours in 2025.
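That sensitivity is worth making concrete: the roughly sixty-fold spread between the two per-query figures carries straight through to the annual total, as this short sketch (using only the article's two cited figures) shows.

```python
# Sensitivity of the annual total to the per-query energy estimate.
QUERIES_PER_DAY = 2.5e9

for wh_per_query in (0.34, 20.0):   # Altman's figure vs. the high-end estimate
    annual_twh = wh_per_query * QUERIES_PER_DAY * 365 / 1e12   # Wh -> TWh
    print(f"{wh_per_query:>5} Wh/query -> {annual_twh:5.1f} TWh/year")
```

At 20 Wh per query, ChatGPT's load alone would land in the double-digit terawatt-hour range, which helps explain how the ecosystem-wide estimate reaches 15 TWh.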
Projecting forward to 2030, the piece anticipates that generative AI’s energy needs could swell to hundreds of terawatt-hours, driven mostly by inference (the real-time work of answering queries) rather than training. To meet that demand, the article says, the industry may need dozens of “Stargate-class” data centers: massive facilities each capable of consuming more than 8 TWh annually.
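A quick division makes the "dozens of campuses" claim concrete. The 300 TWh demand figure below is a hypothetical placeholder for "hundreds of terawatt-hours," not a number from the article; only the 8 TWh per-campus figure is cited there.

```python
import math

# Hypothetical 2030 demand in the "hundreds of TWh" range; 300 TWh is an
# assumed placeholder, not a figure from the article.
PROJECTED_DEMAND_TWH = 300
STARGATE_TWH_PER_CAMPUS = 8   # article's figure: >8 TWh consumed annually

campuses = math.ceil(PROJECTED_DEMAND_TWH / STARGATE_TWH_PER_CAMPUS)
print(f"~{campuses} Stargate-class campuses needed")   # ~38, i.e. "dozens"
```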
The article’s core message is stark: the environmental cost of generative AI is rarely discussed, yet it is immense. As usage scales, energy consumption and infrastructure footprints will become front-and-center concerns. The technologies that power creative automation cannot be separated from the physical, electrical systems that sustain them.