How Much Energy Does AI Use? The People Who Know Aren’t Saying


“People are often curious about how much energy a ChatGPT query uses,” Sam Altman, the CEO of OpenAI, wrote in an aside in a long blog post last week. The average query, Altman wrote, uses 0.34 watt-hours of energy: “About what an oven would use in a little over one second, or a high-efficiency lightbulb would use in a couple of minutes.”

For a company with 800 million weekly active users (and growing), the question of how much energy all these queries consume is increasingly pressing. But experts say Altman's figure means little without far more public context from OpenAI about how it arrived at this calculation, including what counts as an "average" query, whether it includes image generation, and whether it covers additional energy use, such as training AI models and cooling OpenAI's servers.
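For a sense of scale, Altman's per-query figure can be turned into a rough aggregate with back-of-envelope arithmetic. The queries-per-user rate below is a made-up assumption for illustration, not an OpenAI disclosure:

```python
# Back-of-envelope scale check for Altman's 0.34 Wh-per-query figure.
WH_PER_QUERY = 0.34             # Altman's stated average, in watt-hours
WEEKLY_USERS = 800_000_000      # OpenAI's reported weekly active users
QUERIES_PER_USER_PER_WEEK = 10  # hypothetical; OpenAI does not disclose this

weekly_queries = WEEKLY_USERS * QUERIES_PER_USER_PER_WEEK
weekly_wh = weekly_queries * WH_PER_QUERY
weekly_gwh = weekly_wh / 1e9    # 1 gigawatt-hour = 1e9 watt-hours

print(f"{weekly_queries:,} queries/week -> {weekly_gwh:.2f} GWh/week")

# Altman's oven comparison is internally consistent: delivering 0.34 Wh
# in about one second implies a power draw of roughly 1.2 kW.
oven_watts = WH_PER_QUERY * 3600  # watt-hours per second -> watts
```

At the hypothetical rate of ten queries per user per week, that works out to a few gigawatt-hours weekly, and the total is dominated by whatever the real query volume is; none of this includes training, cooling, or image generation, which Altman's figure is not confirmed to cover.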

As a result, Sasha Luccioni, the climate lead at AI company Hugging Face, doesn’t put too much stock in Altman’s number. “He could have pulled that out of his ass,” she says. (OpenAI did not respond to a request for more information about how it arrived at this number.)

As AI takes over our lives, it is also transforming our energy systems, supercharging carbon emissions just as we're trying to fight climate change. Now, a new and growing body of research is attempting to put hard numbers on just how much carbon all of our AI use actually emits.

This effort is complicated by the fact that major players like OpenAI disclose little environmental information. In an analysis submitted for peer review this week, Luccioni and three coauthors make the case for more environmental transparency in AI models. Using data from OpenRouter, a leaderboard of large language model (LLM) traffic, they find that 84 percent of LLM use in May 2025 went to models with zero environmental disclosure. Consumers, in other words, are overwhelmingly choosing models with completely unknown environmental impacts.

“It blows my mind that you can buy a car and know how many miles per gallon it consumes, yet we use all these AI tools every day and we have absolutely no efficiency metrics, emissions factors, nothing,” Luccioni says. “It’s not mandated, it’s not regulatory. Given where we are with the climate crisis, it should be top of the agenda for regulators everywhere.”

As a result of this lack of transparency, Luccioni says, the public is being exposed to estimates that make no sense but are taken as gospel. You may have heard, for instance, that the average ChatGPT request takes 10 times as much energy as the average Google search. Luccioni and her colleagues trace this claim to a public remark that John Hennessy, the chairman of Google's parent company Alphabet, made in 2023.

A claim made by a board member from one company (Google) about the product of another company to which he has no relation (OpenAI) is tenuous at best—yet, Luccioni’s analysis finds, this figure has been repeated again and again in press and policy reports. (As I was writing this piece, I got a pitch with this exact statistic.)

“People have taken an off-the-cuff remark and turned it into an actual statistic that’s informing policy and the way people look at these things,” Luccioni says. “The real core issue is that we have no numbers. So even the back-of-the-napkin calculations that people can find, they tend to take them as the gold standard, but that’s not the case.”

One way to peek behind the curtain for more accurate information is to work with open source models. Some tech giants, including OpenAI and Anthropic, keep their models proprietary, meaning outside researchers can't independently verify their energy use. But other companies make parts of their models publicly available, allowing researchers to gauge their emissions more accurately.

A study published Thursday in the journal Frontiers in Communication evaluated 14 open source large language models, including two Meta Llama models and three DeepSeek models, and found that some used as much as 50 percent more energy than others in the dataset when responding to the researchers' prompts. The 1,000 benchmark prompts submitted to the LLMs included questions on topics such as high school history and philosophy; half were formatted as multiple choice, allowing only one-word answers, while half were open prompts, allowing a freer format and longer answers. Reasoning models, the researchers found, generated far more thinking tokens (the internal reasoning a model produces on the way to its answer, and a hallmark of higher energy use) than more concise models. These models, perhaps unsurprisingly, were also more accurate on complex topics. (They also had trouble with brevity: During the multiple-choice phase, for instance, the more complex models would often return answers of multiple tokens despite explicit instructions to answer only from the range of options provided.)

Maximilian Dauner, a PhD student at the Munich University of Applied Sciences and the study's lead author, says he hopes AI use will evolve toward matching each query with the least energy-intensive model that can handle it. He envisions a process in which smaller, simpler questions are automatically routed to smaller models that still provide accurate answers. "Even smaller models can achieve really good results on simpler tasks, and don't have that huge amount of CO2 emitted during the process," he says.

Some tech companies already do this. Google and Microsoft have previously told WIRED that their search features use smaller models when possible, which can also mean faster responses for users. But generally, model providers have done little to nudge users toward using less energy. How quickly a model answers a question, for instance, has a big impact on its energy use—but that’s not explained when AI products are presented to users, says Noman Bashir, the Computing & Climate Impact Fellow at MIT’s Climate and Sustainability Consortium.

“The goal is to provide all of this inference the quickest way possible so that you don’t leave their platform,” he says. “If ChatGPT suddenly starts giving you a response after five minutes, you will go to some other tool that is giving you an immediate response.”

However, there are myriad other considerations to take into account when calculating the energy use of complex AI queries, because it's not just theoretical: the conditions under which queries actually run in the real world matter. Bashir points out that physical hardware makes a difference when calculating emissions. Dauner ran his experiments on an Nvidia A100 GPU, but Nvidia's H100, a GPU designed specifically for AI workloads that the company says is becoming increasingly popular, is much more energy-intensive.

Physical infrastructure also makes a difference when talking about emissions. Large data centers need cooling systems, lighting, and networking equipment, all of which add to energy use; they often run in diurnal cycles, slowing down at night when query volumes dip. They are also hooked up to different types of grids, some overwhelmingly powered by fossil fuels and others by renewables, depending on their locations.

Bashir compares studies that look at emissions from AI queries without factoring in data center needs to lifting up a car, hitting the gas, and counting revolutions of a wheel as a way of doing a fuel-efficiency test. “You’re not taking into account the fact that this wheel has to carry the car and the passenger,” he says.

Perhaps most crucially for our understanding of AI's emissions, open source models like the ones Dauner used in his study represent only a fraction of the AI models consumers use today. Training a model and updating deployed models take a massive amount of energy, figures that many big companies keep secret. It's unclear, for example, whether the light bulb statistic about ChatGPT from OpenAI's Altman takes into account all the energy used to train the models powering the chatbot. Without more disclosure, the public is simply missing much of the information needed to start understanding just how much this technology is impacting the planet.

“If I had a magic wand, I would make it mandatory for any company putting an AI system into production, anywhere, around the world, in any application, to disclose carbon numbers,” Luccioni says.

Paresh Dave contributed reporting.
