How Much Energy Does ChatGPT Use? The Real Numbers

A single ChatGPT query uses approximately ten times more electricity than a traditional Google search. That claim has circulated widely since 2023, appearing in academic papers, media coverage, and policy debates about AI's environmental impact. But in 2025, the real picture is considerably more nuanced, and the actual numbers have changed substantially as models have become more efficient. Understanding the true energy consumption of AI is essential for any sustainability professional evaluating the environmental trade-offs of AI adoption.

The commonly cited figure of 2.9 watt-hours per ChatGPT query comes from the Electric Power Research Institute (EPRI). However, more recent analysis tells a different story. In February 2025, Epoch AI estimated that a typical ChatGPT query using GPT-4o likely consumes roughly 0.3 watt-hours, one-tenth of the older estimate. OpenAI CEO Sam Altman subsequently stated that the average ChatGPT query uses approximately 0.34 watt-hours, "about what an oven would use in a little over one second." However, this figure applies to standard text queries; complex tasks, reasoning models, and long-document analysis consume significantly more.


The Energy Per Query: What the Research Shows

[Infographic: energy per AI query, from a Google search at 0.3 Wh and GPT-4o at 0.3-0.34 Wh, through GPT-4.1 nano at 0.45 Wh, a short document at 2.5 Wh, the original 2023 estimate at 2.9 Wh, and the o3 reasoning model at 3.9 Wh, to GPT-5 at 18.9 Wh and a long document at 40 Wh: a roughly 130x range. The "10x Google" claim was based on 2023 hardware.]

The energy consumption per AI query varies enormously depending on the model, the complexity of the task, and the hardware. The original 2023 estimate of approximately 3 watt-hours per query was based on GPT-3.5 running on older Nvidia A100 hardware, with assumptions of 4,000 input tokens and 2,000 output tokens (approximately 1,500 words) per query. As Epoch AI's analysis showed, this significantly overestimated typical query length and used less efficient hardware than what is deployed today.

Current estimates span a wide range. For a standard text query to GPT-4o: approximately 0.3 to 0.34 Wh (Epoch AI, OpenAI). For a Google search: approximately 0.3 Wh (Google's 2009 disclosure, the most recent available). For a complex reasoning query using models like o3: approximately 3.9 Wh (Jegham et al., 2025). For attaching a long document (100,000 tokens) to ChatGPT: approximately 40 Wh. For the most advanced model (GPT-5) with a medium-length prompt: approximately 18.9 Wh on average (University of Rhode Island AI lab). The range from 0.3 Wh for a simple query to 40 Wh for a complex document analysis represents a 130-fold difference, determined entirely by the task and model selected.
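The spread of these published figures can be collected into a quick back-of-the-envelope sketch. This Python illustration uses only the estimates quoted above; the ratio calculation is ours:

```python
# Published per-query energy estimates (Wh), as cited in this section.
estimates_wh = {
    "Google search (2009 disclosure)": 0.3,
    "GPT-4o standard text query": 0.34,
    "o3 reasoning query": 3.9,
    "GPT-5 medium prompt (average)": 18.9,
    "Long document (~100k tokens)": 40.0,
}

lowest = min(estimates_wh.values())
highest = max(estimates_wh.values())
print(f"Spread: {lowest} Wh to {highest} Wh, a {highest / lowest:.0f}x range")
```

Note that the exact ratio (about 133x) depends on which low-end figure is chosen; the article's "130-fold" rounds this down.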

Daily Global Usage: The Scale Effect

[Infographic: daily global AI energy scale. ChatGPT handles 2.5 billion queries per day, about 850 MWh daily, equivalent to 29,000 US homes; all generative AI consumes roughly 15 TWh in 2025, rising to 347 TWh by 2030; data centres could reach 945 TWh by 2030, more than Japan's consumption, with AI at 35-50% of data centre load.]

OpenAI reports that ChatGPT now serves approximately 2.5 billion queries per day, with over 700 million weekly active users as of late 2025. At 0.34 Wh per query, that translates to approximately 850 MWh of electricity per day; sustained over a full year, that is enough to power roughly 29,000 US homes. A Schneider Electric report estimates that all generative AI queries combined consume approximately 15 TWh in 2025, projected to reach 347 TWh by 2030.

The IEA projects total data centre electricity consumption could reach 945 TWh by 2030, with AI accounting for 35 to 50% of this total. To contextualise: 945 TWh exceeds Japan's total annual electricity consumption. However, this growth in data centre demand (approximately 530 TWh by 2030) remains smaller than projected demand growth from electric vehicles (838 TWh), air conditioning (651 TWh), or industrial electrification (1,936 TWh) over the same period.

Comparisons to Everyday Activities

[Infographic: eight comparison cards for ChatGPT energy. One query equals 0.34 Wh, or an LED bulb for 2.5 minutes; a Google search is 0.3 Wh; a smartphone charge equals 44 queries; a laptop hour equals 147 queries; a US household's daily use equals 82,000 queries; printing a book equals 14,700 queries; a heavy user at 100 queries per day uses 0.1% of household electricity.]

To make AI energy consumption tangible, it helps to compare it against familiar activities. A standard ChatGPT query (0.34 Wh) uses about as much electricity as running an 8-watt LED light bulb for 2.5 minutes, or what an electric oven consumes in just over one second. A heavy ChatGPT user sending 100 queries per day would consume approximately 34 Wh, which is roughly 0.1% of the average US household's daily electricity consumption of 28,000 Wh.
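These equivalences can be reproduced directly. In the sketch below, the smartphone (15 Wh per full charge) and laptop (50 Wh per hour) figures are our illustrative assumptions, chosen to match the ratios quoted above; the household figure is the one cited in the text:

```python
wh_per_query = 0.34            # OpenAI's stated average

# Illustrative device assumptions (ours, chosen to match the ratios above):
led_bulb_watts = 8             # standard LED bulb
phone_charge_wh = 15           # one full smartphone charge
laptop_hour_wh = 50            # one hour of typical laptop use
household_daily_wh = 28_000    # US household daily use, as cited in the text

led_minutes = wh_per_query / led_bulb_watts * 60
queries_per_charge = phone_charge_wh / wh_per_query
queries_per_laptop_hour = laptop_hour_wh / wh_per_query
heavy_user_share = 100 * wh_per_query / household_daily_wh

print(f"One query ~ LED bulb for {led_minutes:.1f} minutes")
print(f"One phone charge ~ {queries_per_charge:.0f} queries")
print(f"One laptop hour ~ {queries_per_laptop_hour:.0f} queries")
print(f"100 queries/day ~ {heavy_user_share:.2%} of household use")
```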

However, scale changes the picture dramatically. The 2.5 billion daily queries collectively consume approximately 850 MWh per day. That is equivalent to the daily electricity consumption of a small city. And ChatGPT is just one AI application. When you add Google's AI-powered search, Microsoft Copilot, image generation tools, and the thousands of AI applications running across enterprise environments, the aggregate energy demand becomes substantial.


How Newer Models Are Getting More Efficient

[Infographic: timeline of AI efficiency improvements, from GPT-3.5 on A100 hardware at 2.9 Wh in 2023, to GPT-4o on H100 hardware at 0.3 Wh in 2025, to the future Blackwell architecture: a 10x efficiency gain in two years, though demand is growing even faster.]

The most encouraging trend in AI energy consumption is the rapid improvement in efficiency. The original GPT-3.5 estimate of approximately 3 Wh per query on A100 hardware has fallen to approximately 0.3 Wh on H100 hardware with GPT-4o, a ten-fold improvement in approximately two years. Epoch AI attributes this to three factors: more efficient hardware (H100 GPUs versus A100), smaller active parameter counts in newer architectures (GPT-4o's mixture-of-experts design activates only a fraction of total parameters per query), and more realistic query lengths (typical queries are much shorter than the 1,500 words assumed in the original estimate).

Looking ahead, the trajectory suggests continued efficiency gains. Nvidia's next-generation Blackwell architecture promises further improvements in performance per watt. Model distillation techniques allow smaller models to replicate the performance of larger ones at a fraction of the energy cost. And inference optimisation techniques such as speculative decoding and quantisation continue to reduce per-query energy consumption. However, these efficiency gains are being partially offset by growing model complexity, longer context windows, and the expansion of AI into more computationally intensive tasks such as video generation, multi-step reasoning, and autonomous agents.


The Transparency Gap

[Infographic: the transparency gap across AI providers. OpenAI offers partial disclosure at 0.34 Wh; Google's figure dates from 2009; Anthropic, Meta, and Microsoft make no per-query disclosure. Four recommended disclosures: energy per query by model, carbon per query by location, water per query, and training energy totals, alongside an OECD quote.]

Perhaps the most significant challenge in understanding AI's energy footprint is the lack of standardised disclosure. OpenAI's 0.34 Wh figure is one of the first public disclosures of per-query energy consumption by a major AI provider, but it comes without detailed supporting methodology. Google has not updated its per-search energy figure since 2009. Anthropic, Meta, and other major providers have not disclosed per-query energy consumption at all.

Without standardised, independently verified energy disclosures, it is impossible for organisations to accurately assess the energy footprint of their AI usage. The OECD has called for AI model cards to include energy and water consumption data alongside carbon footprint information. Until this transparency exists, sustainability professionals should request energy disclosure from their AI providers and use the available estimates as a reasonable range rather than relying on any single figure.
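Until disclosures improve, one pragmatic approach is to bracket organisational usage with low and high per-query figures rather than a single number. A minimal sketch, where the task categories and Wh ranges are our illustrative assumptions drawn from the estimates cited earlier, not provider disclosures:

```python
# Illustrative low/high per-query energy (Wh) by task category, drawn from
# the published estimates cited in this article (assumed ranges, not
# provider-disclosed figures).
QUERY_WH_RANGE = {
    "standard_text": (0.3, 0.45),
    "reasoning": (2.5, 3.9),
    "long_document": (18.9, 40.0),
}

def monthly_energy_kwh(query_counts: dict) -> tuple[float, float]:
    """Bracket monthly energy (kWh) for a given mix of query types."""
    low = sum(n * QUERY_WH_RANGE[kind][0] for kind, n in query_counts.items())
    high = sum(n * QUERY_WH_RANGE[kind][1] for kind, n in query_counts.items())
    return low / 1000, high / 1000

# Example: a team sending 50,000 text queries and 2,000 reasoning queries a month.
low, high = monthly_energy_kwh({"standard_text": 50_000, "reasoning": 2_000})
print(f"Estimated monthly energy: {low:.0f}-{high:.0f} kWh")
```

Reporting a range like this makes the uncertainty explicit, which is more defensible in a carbon inventory than a false-precision point estimate.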


Conclusion

The energy consumption of a single ChatGPT query is modest: roughly equivalent to running an LED light bulb for a few minutes. But at the scale of 2.5 billion daily queries, and with AI applications multiplying across every sector of the economy, the aggregate energy demand is significant and growing. The good news is that efficiency is improving rapidly: the energy per query has fallen ten-fold in two years. The bad news is that demand is growing even faster, driven by larger models, more complex tasks, and broader adoption. For sustainability professionals, the practical implications are straightforward: understand the range of energy consumption across different AI models and tasks, factor AI energy into your organisational carbon footprint, choose efficient models where appropriate, and advocate for the transparency standards that will make informed decision-making possible.

