
AI Energy Consumption: Statistics from Key Sources [2026]

Sıla Ermut
updated on Dec 31, 2025

A recent forecast from Lawrence Berkeley National Laboratory predicts AI will use over half of data center electricity by 2028.1 That projection reflects a broader shift: AI energy consumption is no longer a marginal byproduct of computing, but a material driver of electricity demand, grid stress, and emissions.

As AI inference scales across everyday products and services, understanding where energy is used and how it can be measured, managed, and reduced has become a core infrastructure and policy challenge rather than a purely technical one.

Explore the key statistics on AI energy consumption and best practices derived from leading AI researchers and agencies.

Recommendations for managing AI energy consumption

The rapid growth of artificial intelligence has made AI energy consumption a planning issue rather than a technical side effect. Data centers that run AI models already account for a growing share of electricity consumption, and their impact on the power grid is becoming visible at both national and regional levels. The recommendations below focus on measurement, efficiency, power supply, and governance to support AI sustainability efforts.

Treat AI as a trackable energy sector.

AI is still not consistently measured as a distinct category in energy statistics, making planning difficult for utilities and policymakers.

  • Define AI workloads explicitly within data center energy reporting, rather than grouping them under general IT.
  • Develop standardized metrics for energy use, electricity consumed, and carbon intensity linked specifically to AI inference and model training (a minimal sketch follows this list).
  • Enable grid planners, the federal government, and agencies such as the US Energy Information Administration to track AI data center energy consumption over time.
  • Use consistent reporting to distinguish between traditional computing and large language models, which often have much higher energy demands.
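
As a rough illustration of the standardized-metrics bullet above, the sketch below ties measured electricity use and grid carbon intensity to a specific AI workload category. The field names and values are illustrative assumptions, not an established reporting standard.

```python
# Illustrative per-workload reporting sketch; field names and values are
# assumptions for demonstration, not an established standard.

def carbon_footprint_g(energy_kwh: float, grid_g_per_kwh: float) -> float:
    """Emissions (gCO2e) = electricity consumed (kWh) x grid carbon intensity (gCO2e/kWh)."""
    return energy_kwh * grid_g_per_kwh

workload = {
    "category": "inference",            # "training" or "inference", not generic "IT"
    "model_class": "large_language_model",
    "energy_kwh": 120.0,                # assumed metered value
    "grid_g_per_kwh": 400.0,            # assumed regional grid average
}

emissions = carbon_footprint_g(workload["energy_kwh"], workload["grid_g_per_kwh"])
print(f"{emissions:,.0f} gCO2e")  # -> 48,000 gCO2e
```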

Improve measurement, disclosure, and transparency.

Reliable data is a prerequisite for managing AI’s environmental footprint.

  • Require tech companies to disclose the energy usage and carbon footprint of AI-based products and services.
  • Publish reproducible measurement methods so energy impacts can be compared across different AI models and platforms.
  • Provide clearer information to users and organizations to help them understand how much energy is required to run AI models and generate outputs.
  • Support benchmarking initiatives and data-sharing collaborations with utility companies and energy regulators.

Increase energy efficiency across hardware and software.

Efficiency gains remain the most direct way to limit growth in electricity consumption as AI workloads scale.

  • Improve the energy efficiency of algorithms, including model architectures and inference techniques, to reduce unnecessary computation.
  • Favor efficient models where possible, especially for high-volume tasks that affect daily life.
  • Increase utilization of graphics processing units and other computer systems to reduce idle compute.
  • Continue efficiency strategies developed over more than a decade, while adding new approaches tailored to accelerated servers used in AI data centers.

Reduce waste at the system level.

Much AI energy usage is driven by how systems are deployed rather than by model design alone.

  • Use demand-aware model placement to keep AI inference closer to users or on lower-carbon parts of the power grid (see the sketch after this list).
  • Optimize serving systems to reduce idle capacity in hyperscale data centers.
  • Apply load management techniques to smooth demand and avoid sudden spikes that stress the power supply.
  • Reduce redundant model calls and unnecessary output generation, particularly in consumer-facing applications used by hundreds of millions of people.
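
A minimal sketch of the demand-aware placement idea from the first bullet above, assuming a hypothetical table of regional grid intensities and latencies; a real serving system would also weigh capacity, cost, and data residency.

```python
# Carbon-aware placement sketch; regions, intensities, and latencies are
# hypothetical values for illustration only.

REGIONS = {
    "region-a": {"g_co2_per_kwh": 520, "latency_ms": 12},
    "region-b": {"g_co2_per_kwh": 210, "latency_ms": 35},
    "region-c": {"g_co2_per_kwh": 90, "latency_ms": 80},
}

def pick_region(max_latency_ms: int) -> str:
    """Choose the lowest-carbon region that still meets the latency budget."""
    eligible = {n: r for n, r in REGIONS.items() if r["latency_ms"] <= max_latency_ms}
    return min(eligible, key=lambda n: eligible[n]["g_co2_per_kwh"])

print(pick_region(max_latency_ms=50))  # -> region-b
```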

Plan power supply and grid infrastructure ahead of demand.

New data centers can be built in a few years, but the electricity infrastructure they depend on takes longer to develop.

  • Coordinate new data centers with long-term grid planning to avoid relying on natural gas or other fossil fuels in emergencies.
  • Invest early in transmission and generation capacity to support AI workloads without increasing carbon emissions.
  • Use scenario planning, including International Energy Agency sensitivity cases, to manage uncertainty and avoid overbuilding fossil fuel capacity.
  • Address local impacts directly, since AI data centers can dominate regional electricity consumption even if their share of global electricity use remains modest.

Improve the quality of the electricity mix.

The carbon impact of AI depends heavily on how electricity is generated.

  • Track the actual energy mix on the grid, not just clean-energy contracts, to understand the grid’s real-world carbon intensity.
  • Accelerate procurement of carbon-free energy, including renewable energy sources and, where appropriate, nuclear power.
  • Avoid short-term fossil fuel lock-in when expanding capacity to support AI development.
  • Align data center siting decisions with regions where the grid is cleaner or improving.

Manage water use and cooling systems.

Cooling systems are a major contributor to the environmental footprint of AI data centers.

  • Invest in energy-efficient, location-aware cooling systems to reduce electricity and water use.
  • Implement water stewardship programs, including recycling and replenishment commitments, especially in water-stressed regions.
  • Encourage reuse of components and responsible management of hardware to reduce indirect environmental impacts.

Integrate AI energy policies into broader regulation.

AI energy use intersects with climate policy, infrastructure planning, and digital governance.

  • Integrate AI-specific requirements into existing environmental and energy regulations rather than treating AI as a separate issue.
  • Encourage disclosure and accountability as part of broader efforts to manage carbon emissions and electricity rates.
  • Support coordination between the AI industry, utility companies, and regulators to balance innovation with environmental responsibility.

MIT Technology Review, 2025

MIT Technology Review breaks AI energy consumption into two main phases: model training and AI inference. It argues that inference is now the dominant driver of energy usage because AI features are being embedded into daily life across products and services. It also highlights a transparency gap: most major “closed” AI model providers do not disclose sufficient information to estimate their total energy use or carbon footprint reliably.2

Electricity and total demand:

  • US data centers: 4.4% of total US electricity consumption.
  • AI within US data centers: AI-specific servers used an estimated 53-76 terawatt-hours (TWh) in 2024, and projections suggest 165-326 TWh by 2028.

Training vs inference:

  • Inference share: 80%-90% of AI computing is estimated to be used for inference.
  • Example training energy: GPT-4 training is described as about 50 GWh.

Per-task energy (electricity consumption):

  • Text (Llama 3.1 8B): ~114 joules per response when accounting for non-GPU overhead.
  • Text (Llama 3.1 405B): ~6,706 joules per response with overhead.
  • Images (Stable Diffusion 3 Medium, 1024×1024): ~2,282 joules total; higher steps can raise this to ~4,402 joules.
  • Video (CogVideoX examples): ~109,000 joules for a low-quality short output; ~3.4 million joules for a higher-quality 5-second video.
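
Putting the training and per-task figures together shows why inference dominates at scale. Using the numbers cited above (about 50 GWh for GPT-4 training and roughly 6,706 joules per large-model text response), the back-of-the-envelope arithmetic below yields a response count; this is derived, not a reported statistic.

```python
# Back-of-the-envelope comparison using the figures cited above.
training_energy_j = 50e9 * 3600     # 50 GWh in joules (1 Wh = 3,600 J)
joules_per_response = 6706          # Llama 3.1 405B text response, with overhead

responses = training_energy_j / joules_per_response
print(f"{responses:.1e} responses")  # -> ~2.7e10, i.e. ~27 billion responses
```

For a product serving hundreds of millions of users, cumulative inference energy can pass that threshold quickly, which is consistent with the 80%-90% inference share cited above.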

Infrastructure and grid emissions:

  • The carbon intensity of the electricity used by data centers is 48% higher than the US average.
  • Cooling systems in data centers can use large amounts of water, sometimes potable water.

International Energy Agency, 2025

The IEA frames AI energy demands through the lens of data centers and their components. It provides a practical breakdown of where electricity is consumed within a data center and offers a global outlook on the growth of data center electricity consumption.3

Global electricity consumption from data centers:

  • Estimated ~415 TWh in 2024, about 1.5% of global electricity consumption.
  • Projected to reach ~945 TWh by 2030, just under 3% of global electricity consumption in the IEA base case.
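
Those two base-case figures imply an average annual growth rate of roughly 15%; the calculation below is plain arithmetic on the cited totals.

```python
# Compound annual growth rate implied by the IEA base-case figures.
twh_2024, twh_2030, years = 415, 945, 6
cagr = (twh_2030 / twh_2024) ** (1 / years) - 1
print(f"{cagr:.1%} per year")  # -> ~14.7% per year
```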

Data center electricity consumption by equipment type:

  • Servers: around 60% of electricity demand in modern data centers (varies by type).
  • Storage systems: around 5%.
  • Networking equipment: up to 5%.
  • Cooling systems and environmental control: about 7% in efficient hyperscale data centers, and over 30% in less-efficient enterprise data centers.

Figure 1: Share of data center electricity consumption by equipment type, 2024.

Google Cloud, 2025

Google published a methodology to measure the environmental impact of AI inference for Gemini prompts, including electricity, carbon emissions, and water consumption. It presents per-prompt medians and claims significant efficiency improvements over a recent 12-month period.4

Per-prompt median impacts (Gemini Apps text prompt):

  • 0.24 Wh of energy
  • 0.03 gCO₂e emissions
  • 0.26 milliliters of water
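
Dividing the median emissions by the median energy gives the effective carbon intensity implied by these figures; this is derived arithmetic rather than a number Google reports directly, and it reflects Google's own emissions accounting for its electricity supply.

```python
# Effective carbon intensity implied by the per-prompt medians above.
energy_wh, emissions_g = 0.24, 0.03
intensity = emissions_g / (energy_wh / 1000)  # convert Wh to kWh
print(f"{intensity:.0f} gCO2e/kWh")  # -> 125 gCO2e/kWh, well below typical grid averages
```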

Efficiency improvement claims:

  • Over the past 12 months, Google claimed that the energy per median prompt fell by 33×, and total carbon footprint fell by 44×.

Data center efficiency and infrastructure:

  • Fleet-wide average power usage effectiveness (PUE) of 1.09 across Google data centers.
  • Google’s latest-generation TPU, Ironwood, is claimed to be 30× more energy-efficient than its first publicly available TPU.
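
Power usage effectiveness is a standard metric: total facility energy divided by the energy delivered to IT equipment. The short sketch below shows what a PUE of 1.09 implies about overhead such as cooling and power conversion.

```python
# PUE = total facility energy / IT equipment energy.
pue = 1.09
overhead_vs_it = pue - 1             # 9% extra energy on top of the IT load
overhead_vs_total = (pue - 1) / pue  # ~8.3% of all facility energy is non-IT
print(f"{overhead_vs_it:.0%} on top of IT load; {overhead_vs_total:.1%} of total")
```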

Carbon Brief, 2025

Carbon Brief synthesizes the International Energy Agency (IEA)5 and other sources into a set of charts that show baseline impacts, growth projections, and regional concentration risks. It highlights that the sector is small globally today but growing quickly and locally significant in some grids.6

Current global shares:

  • Data centers are responsible for just over 1% of global electricity demand and 0.5% of CO₂ emissions (see Figure 2).

Growth:

  • IEA central scenario: Data center electricity consumption rises to 945 TWh by 2030.
  • AI share of data center power use: Roughly 5% to 15% recently, potentially 35% to 50% by 2030.

Regional concentration examples:

  • Ireland: Around 21% of national electricity is used for data centers, potentially 32% by 2026.
  • Virginia (US): 26% of electricity consumed by data centers (as cited).

Power supply mix for data centers (global):

  • Fossil fuels: Nearly 60%
  • Renewables: 27%
  • Nuclear: 15%

Figure 2: Based on the IEA’s Global Energy Review 2025 and its Energy and AI report, this figure compares electricity consumption (TWh) and CO₂ emissions (MtCO₂) from global data centers in 2024 with those of other sectors.

United Nations Regional Information Centre for Western Europe (UNRIC), 2025

UNRIC frames AI’s environmental footprint across the full lifecycle: software (training, deployment, inference, maintenance) and hardware (materials, manufacturing, construction, e-waste). It emphasizes that electricity consumption and water use in data centers are direct impacts and advocates policy measures to improve disclosure and accountability.7

Statistics and category breakdown (electricity, water, lifecycle categories): The UNRIC article is more categorical than numeric, explicitly grouping AI impacts into direct, indirect, and higher-order effects. Key findings:

  • Categories of environmental impact
    • Direct: Electricity and water consumption, greenhouse gas emissions, mineral extraction, pollution, and electronic waste.
    • Indirect: Emissions from AI-enabled applications and services.
    • Higher-order: Amplification of inequalities and issues related to biased or poor-quality training data.
  • Data centers and resource use
    • Data centers consume large amounts of electricity, much of it still supplied by fossil fuels.
    • Significant amounts of water are required for cooling systems and construction.
    • Global AI-related water demand is expected to reach 4.2–6.6 billion cubic meters by 2027, exceeding Denmark’s annual water use.
    • Producing a 2-kilogram computer can require around 800 kilograms of raw materials, including rare minerals.
  • Electricity use and growth
    • A ChatGPT query uses about 10× more electricity than a Google search, according to an IEA estimate.
    • AI and machine learning accounted for <0.2% of global electricity use and <0.1% of global emissions in 2021, but demand is rising quickly.
    • Some tech companies report annual growth of over 100% in computing demand for AI training and inference.
  • Expansion of data centers
    • Data centers represented about 1% of global electricity demand in 2022.
    • In Ireland, data centers accounted for 17% of national electricity use in 2022.
    • The number of data centers worldwide has grown from 500,000 in 2012 to around 8 million today.

MIT News on Generative AI’s environmental impact, 2025

MIT News explains why generative AI is resource-intensive and distinguishes between training and inference. It stresses power density, grid reliability issues, and the lack of incentives for users to reduce usage when impacts are invisible.8

Power density:

  • A generative AI training cluster might consume 7 to 8× more energy than a typical computing workload.

Data center electricity consumption:

  • Global data center electricity consumption cited as 460 TWh in 2022 and projected ~1,050 TWh by 2026.

Model training example:

  • Training GPT-3 estimated at 1,287 MWh and about 552 tons of CO₂.
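
Together, those estimates imply an average carbon intensity of roughly 429 gCO₂ per kWh for the training run; the division below is arithmetic on the cited numbers (assuming metric tons), not an independently reported figure.

```python
# Carbon intensity implied by the cited GPT-3 training estimates.
energy_mwh, emissions_t = 1287, 552
intensity = (emissions_t * 1e6) / (energy_mwh * 1e3)  # grams CO2 per kWh
print(f"{intensity:.0f} gCO2/kWh")  # -> ~429 gCO2/kWh
```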

United States Data Center Energy Usage Report, 2024

This report estimates historical US data center electricity consumption and provides scenario ranges through 2028. It explicitly links the post-2017 growth inflection to accelerated servers, including graphics processing units used to run AI models.9

US total data center electricity consumption:

  • ~60 TWh (2014–2016), relatively stable.
  • 76 TWh by 2018, about 1.9% of US electricity consumption.
  • 176 TWh by 2023, about 4.4% of US electricity consumption.

2028 scenario range:

  • 325 to 580 TWh by 2028.
  • Equivalent to 6.7% to 12.0% of forecast US electricity consumption in 2028.
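
The report's TWh figures and percentage shares can be cross-checked against each other; the arithmetic below recovers the implied totals for US electricity consumption, which are not numbers stated in this summary.

```python
# Implied total US electricity consumption behind the report's shares.
print(f"2023 implied total: {176 / 0.044:,.0f} TWh")   # -> ~4,000 TWh
print(f"2028 low scenario:  {325 / 0.067:,.0f} TWh")   # -> ~4,851 TWh
print(f"2028 high scenario: {580 / 0.120:,.0f} TWh")   # -> ~4,833 TWh
```

Both 2028 endpoints imply nearly the same forecast total (roughly 4,800-4,850 TWh), so the scenario range is internally consistent.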

Drivers and categories:

  • Growth is driven by GPU-accelerated servers for artificial intelligence, which now account for a significant share of the installed base.
  • It describes efficiency strategies that previously held demand flat, including improved cooling systems, power management, higher utilization rates, and reduced idle power.

MIT Technology Review, 2023

MIT Technology Review reports on one of the first attempts to quantify the energy use and carbon emissions of AI during everyday use (inference), rather than focusing only on training. The article is based on a preprint study by researchers from Hugging Face and Carnegie Mellon University.

The study shows that while training large AI models is highly energy-intensive, most of an AI model’s lifetime carbon footprint comes from its use. Because popular models are deployed millions or billions of times, day-to-day inference emissions can quickly surpass training emissions.10

Per-task energy and carbon intensity:

The researchers measured energy use across 10 common AI tasks on the Hugging Face platform, testing 88 different models and running 1,000 prompts per task with the CodeCarbon measurement tool (a minimal usage sketch follows the comparisons below). Key comparisons are:

  • Image generation: Generating a single image with a powerful model consumes roughly the same energy as fully charging a smartphone.
    • Generating 1,000 images with a model like Stable Diffusion XL produces CO₂ emissions comparable to driving about 4.1 miles in a gasoline car.
    • Image generation is by far the most energy- and carbon-intensive AI task measured.
  • Text generation: Generating text is significantly less energy-intensive.
    • Producing 1,000 text outputs uses only about 16% of a smartphone’s charge.
    • The least carbon-intensive text model studied emitted as much CO₂ as driving just 0.0006 miles.
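
CodeCarbon, the measurement tool named above, is an open-source Python library. Below is a minimal sketch of how a per-task measurement along the study's lines can be set up; the inference function is a placeholder stub, and the researchers' exact configuration is not described in the article.

```python
# Minimal CodeCarbon sketch; run_inference is a placeholder stub standing in
# for a real model call, since the study's exact setup is not described here.
from codecarbon import EmissionsTracker

def run_inference(prompt: str) -> str:
    return prompt[::-1]  # stand-in for the model under test

prompts = [f"prompt {i}" for i in range(1000)]  # 1,000 prompts per task, as in the study

tracker = EmissionsTracker(project_name="inference-benchmark")
tracker.start()
for p in prompts:
    run_inference(p)
emissions_kg = tracker.stop()  # estimated kg CO2e for the tracked block
print(f"{emissions_kg:.6f} kg CO2e")
```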

Model size and task specialization:

The study highlights a major efficiency gap between general-purpose generative models and task-specific models:

  • Large generative models consume far more energy because they are designed to perform many tasks (generate, classify, summarize). For example, using a generative model to classify movie reviews requires ~30× more energy than using a smaller model fine-tuned specifically for sentiment classification.
  • Smaller, specialized models are consistently less carbon-intensive for narrow applications.

Usage emissions vs training emissions:

The researchers compared training emissions with cumulative usage emissions:

  • The training emissions of Hugging Face’s largest BLOOM model were exceeded by cumulative usage emissions after about 590 million uses.
  • For extremely popular models like ChatGPT, usage emissions could exceed training emissions within weeks, due to massive daily user volumes.
  • This happens because training occurs once, while inference happens continuously at scale.

Broader implications and expert views:

  • Experts note that the emissions per task were higher than expected, raising concerns as generative AI becomes embedded in everyday software (email, search, word processing).
  • Researchers emphasize that newer, larger models are substantially more carbon-intensive than AI systems from just a few years ago.


Sıla Ermut
Industry Analyst
Sıla Ermut is an industry analyst at AIMultiple focused on email marketing and sales videos. She previously worked as a recruiter in project management and consulting firms. Sıla holds a Master of Science degree in Social Psychology and a Bachelor of Arts degree in International Relations.
