
AI Energy Consumption Statistics in 2026

Sıla Ermut
updated on Jan 19, 2026

A recent forecast predicts AI will use over half of data center electricity by 2028.1 As compute-intensive workloads such as generative AI expand, total electricity demand is also expected to rise.

Explore the key statistics on AI energy consumption and best practices derived from leading AI researchers and agencies.

AI data center energy consumption

[Chart: AI data center energy consumption, compiled from the sources described below]

We gathered data from multiple research organizations and industry analyses that focus on energy use in AI and data centers for the graph above. These sources include global energy agencies, academic studies, and initiatives by technology providers.

Here’s how the data was gathered:

  • We used figures and future projections from recent energy consumption analyses that estimate global electricity use in data centers and their growth.
  • Where available, we reference publicly available measurement methodologies from large cloud providers (e.g., Google Cloud’s environmental impact data) that disclose specific metrics, such as energy per inference.

Many reports use different measurement units or terminology, such as AI & ML's share of global electricity, training vs. inference energy use, or data centers' share of global electricity, which makes direct comparison difficult. To present all values together in a single chart, we included only the studies that used comparable definitions and measurement units:

  • Global electricity consumption by data centers.
  • AI’s share of electricity consumption within data centers.
  • Electricity consumption by data centers in the United States, to highlight differences between global and U.S. energy use.

Recommendations for managing AI energy consumption

According to recent research, AI energy consumption is now dominated by inference and driven less by individual model runs than by scale, deployment patterns, and system inefficiencies. Here are our recommendations to effectively manage AI energy consumption:

Prioritize inference efficiency over training efficiency

Research shows that over 80% of AI compute is now used for inference.

  • Treat energy per inference (or per token / per output) as a primary optimization target (see the sketch after this list).
  • Optimize inference paths before investing in marginal gains in training efficiency.
  • Focus optimization efforts on high-frequency endpoints, not rare or long-tail use cases.
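
Energy per output can be approximated from quantities most serving stacks already log: average device power draw and request latency. A minimal sketch of the metric, assuming a constant power draw; all numbers here are illustrative, not measured:

```python
# Minimal sketch: energy-per-token from average power draw and latency.
# The 300 W draw, 2.0 s latency, and 150 tokens are illustrative
# assumptions, not measurements.

def energy_per_token_joules(avg_power_watts: float,
                            latency_seconds: float,
                            tokens_generated: int) -> float:
    """Energy (J) = average power (W) x time (s), divided by output tokens."""
    total_joules = avg_power_watts * latency_seconds
    return total_joules / max(tokens_generated, 1)

print(energy_per_token_joules(300.0, 2.0, 150))  # -> 4.0 J/token
```

Tracking this number per endpoint makes it easy to see which high-frequency paths dominate energy use.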

Measure and publish per-task energy metrics, not just model-level claims

MIT's research shows that per-task electricity consumption varies widely across text, image, and video tasks.

  • Instrument pipelines to measure energy per task, including non-GPU overhead (memory, networking, orchestration); a measurement sketch follows.
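
One way to approximate this on NVIDIA hardware is to sample GPU power via NVML (exposed in Python by the pynvml package) while a task runs, then apply an assumed multiplier for the non-GPU overhead. The 1.2 factor below is a placeholder to calibrate against your own stack, not a measured value:

```python
# Sketch: per-task GPU energy via NVML power sampling (pip install nvidia-ml-py).
# overhead_factor approximates non-GPU costs (CPU, memory, networking);
# the 1.2 default is an assumption to be calibrated, not a measurement.
import threading
import time

import pynvml


def measure_task_energy_joules(task_fn, overhead_factor: float = 1.2,
                               sample_interval_s: float = 0.1) -> float:
    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    samples, done = [], threading.Event()

    def sampler():
        while not done.is_set():
            # nvmlDeviceGetPowerUsage reports milliwatts.
            samples.append(pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0)
            time.sleep(sample_interval_s)

    thread = threading.Thread(target=sampler, daemon=True)
    start = time.monotonic()
    thread.start()
    task_fn()  # the inference call being measured
    done.set()
    thread.join()
    elapsed = time.monotonic() - start
    pynvml.nvmlShutdown()
    avg_watts = sum(samples) / max(len(samples), 1)
    return avg_watts * elapsed * overhead_factor
```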

Avoid using general-purpose generative models for narrow tasks

Higher energy use is closely linked to using general-purpose generative models for narrow tasks such as classification instead of specialized models. MIT Technology Review shows that task-specific models are less carbon- and energy-intensive.

  • Use task-specialized or distilled models for classification, ranking, extraction, and routing.
  • Reserve large generative models for tasks that require open-ended generation.
  • Introduce model cascades (from small model to large model, if needed), as sketched below.
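
A minimal sketch of a two-stage cascade, assuming the small model can report a confidence score; the stub functions and the 0.85 threshold are hypothetical placeholders for your own models and a calibrated cutoff:

```python
# Sketch: two-stage model cascade. Stubs stand in for real models; the
# confidence threshold is a hypothetical value to be calibrated.

def small_model(prompt: str):
    # Stub: a task-specific classifier or distilled model would go here.
    return "positive", 0.92


def large_model(prompt: str):
    # Stub: a large generative model would go here.
    return "positive", 0.99


def cascade(prompt: str, threshold: float = 0.85) -> str:
    answer, confidence = small_model(prompt)  # cheap first pass
    if confidence >= threshold:
        return answer                         # large model never runs
    return large_model(prompt)[0]             # escalate hard cases only


print(cascade("A thoroughly enjoyable film."))
```

Because most traffic never reaches the large model, average energy per request falls while hard cases still get the larger model's quality.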

Reduce system-level waste in inference serving

Infrastructure studies show that servers account for about 60% of data center electricity consumption.

  • Increase accelerator utilization via:
    • Batching
    • Caching
    • Smarter scheduling
  • Eliminate redundant calls across pipelines and microservices (see the caching sketch after this list).
  • Implement demand-aware autoscaling rather than peak provisioning.
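
As one example of removing redundant calls, identical requests can be served from a cache so the accelerator runs only on misses. A minimal sketch with a stub serving call; a production system would normalize prompts and add TTL and eviction policies:

```python
# Sketch: response caching to eliminate redundant inference calls.
# run_inference is a stub standing in for the real serving call.
from functools import lru_cache

CALLS = 0


def run_inference(prompt: str) -> str:
    global CALLS
    CALLS += 1  # count how often the accelerator actually runs
    return f"response to: {prompt}"


@lru_cache(maxsize=4096)
def cached_inference(prompt: str) -> str:
    return run_inference(prompt)  # executed only on a cache miss


for _ in range(100):
    cached_inference("What is our refund policy?")
print(CALLS)  # -> 1: 99 redundant accelerator calls avoided
```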

Treat hardware efficiency and Power Usage Effectiveness (PUE) as software concerns

  • Design models that fit efficiently within memory and bandwidth constraints.
  • Maximize utilization of existing hardware before scaling capacity.
  • Align model architecture choices with the most energy-efficient available accelerators.
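
For reference, PUE itself is a simple ratio: total facility energy divided by IT equipment energy. A worked example using the 1.09 fleet-wide figure Google reports later in this article (the kWh values are illustrative):

```python
# Worked definition of Power Usage Effectiveness (PUE).
# PUE = total facility energy / IT equipment energy.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    return total_facility_kwh / it_equipment_kwh

# A PUE of 1.09 means each kWh of IT load carries only ~0.09 kWh of
# cooling and power-distribution overhead (illustrative kWh values).
print(pue(1090.0, 1000.0))  # -> 1.09
```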

Account for water use and hardware lifecycle in system design

Research from UNRIC shows that global AI-related water demand is projected to increase exponentially.

  • Favor deployments that reduce cooling intensity and water usage.
  • Extend hardware lifetimes through model efficiency and reuse.
  • Avoid unnecessary retraining or redeployment that accelerates hardware turnover.

MIT Technology Review, 2025

MIT Technology Review breaks AI energy consumption into two main phases: model training and AI inference. It argues that inference is now the dominant driver of energy usage because AI features are being embedded into daily life across products and services.

It also highlights a transparency gap. Most major “closed” AI model providers do not disclose sufficient information to estimate their total energy use or carbon footprint reliably.2

Electricity and total demand:

  • US data centers: Data centers consume 4.4% of total US electricity.
  • AI within US data centers: AI-specific servers used an estimated 53–76 terawatt-hours (TWh) in 2024, and projections suggest 165–326 TWh by 2028.

Training vs inference:

  • Inference share: 80%–90% of AI computing is estimated to be used for inference.
  • Example training energy: GPT-4 training is described as about 50 GWh.

Per-task energy (electricity consumption):

  • Text (Llama 3.1 8B): ~114 joules per response when accounting for non-GPU overhead.
  • Text (Llama 3.1 405B): ~6,706 joules per response with overhead.
  • Images (Stable Diffusion 3 Medium, 1024×1024): ~2,282 joules total; higher steps can raise this to ~4,402 joules.
  • Video (CogVideoX examples): ~109,000 joules for a low-quality short output; ~3.4 million joules for a higher-quality 5-second video.
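
To put these joule figures on the same scale as the per-prompt watt-hour numbers cited later in this article, divide by 3,600 (1 Wh = 3,600 J). A short conversion using the values above:

```python
# Converting the per-task joule figures above into watt-hours (1 Wh = 3,600 J).
TASKS_J = {
    "text (Llama 3.1 8B)": 114,
    "text (Llama 3.1 405B)": 6_706,
    "image (SD3 Medium, 1024x1024)": 2_282,
    "video (5 s, higher quality)": 3_400_000,
}
for task, joules in TASKS_J.items():
    print(f"{task}: {joules / 3600:.3f} Wh")
# e.g. the 405B text response is ~1.86 Wh and the 5-second video ~944 Wh.
```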

Infrastructure and grid emissions:

  • The carbon intensity of the electricity used by data centers is 48% higher than the US average.
  • Cooling systems in data centers can use large amounts of water, sometimes potable water.

International Energy Agency, 2025

The IEA frames AI energy demands through the lens of data centers and their components. It provides a breakdown of where electricity is consumed within a data center and offers a global outlook on the growth of data center electricity consumption.3

Global electricity consumption from data centers:

  • Estimated ~415 TWh in 2024, about 1.5% of global electricity consumption.
  • Projected to reach ~945 TWh by 2030, just under 3% of global electricity consumption in the IEA base case.

Data center electricity consumption by equipment type:

  • Servers: around 60% of electricity demand in modern data centers (varies by type).
  • Storage systems: around 5%.
  • Networking equipment: up to 5%.
  • Cooling systems and environmental control: about 7% in efficient hyperscale data centers, and over 30% in less-efficient enterprise data centers.

Figure 1: Share of data center electricity consumption by equipment type, 2024.
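
Applying these shares to the IEA's ~415 TWh estimate for 2024 gives a rough absolute breakdown; since the shares vary by facility type, these are illustrative midpoints only:

```python
# Rough absolute breakdown: IEA 2024 equipment shares x ~415 TWh total.
# Shares vary by facility type; these midpoints are illustrative only.
TOTAL_TWH = 415
SHARES = {
    "servers": 0.60,
    "storage": 0.05,
    "networking": 0.05,
    "cooling (efficient hyperscale)": 0.07,
}
for equipment, share in SHARES.items():
    print(f"{equipment}: ~{TOTAL_TWH * share:.0f} TWh")
# Servers alone account for roughly 250 TWh of the 2024 total.
```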

Google Cloud, 2025

Google published a methodology to measure the environmental impact of AI inference for Gemini prompts, including electricity, carbon emissions, and water consumption. It presents per-prompt medians and claims significant efficiency improvements over a recent 12-month period.4

Per-prompt median impacts (Gemini Apps text prompt):

  • 0.24 Wh of energy
  • 0.03 gCO₂e emissions
  • 0.26 milliliters of water
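
These medians are small individually; what matters is scale. A back-of-the-envelope multiplication with an assumed daily prompt volume (Google does not disclose this figure, so one billion prompts per day is purely illustrative):

```python
# Scaling Google's reported per-prompt medians to fleet level.
# The daily prompt volume is an illustrative assumption, not a disclosure.
ENERGY_WH, CO2_G, WATER_ML = 0.24, 0.03, 0.26  # reported medians
prompts_per_day = 1_000_000_000                # assumed volume

print(f"{ENERGY_WH * prompts_per_day / 1e6:.0f} MWh/day")  # -> 240 MWh/day
print(f"{CO2_G * prompts_per_day / 1e6:.0f} tCO2e/day")    # -> 30 t/day
print(f"{WATER_ML * prompts_per_day / 1e6:.0f} m3/day")    # -> 260 m3/day
```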

Efficiency improvement claims:

  • Google claims that over the past 12 months, the energy consumption of the median prompt fell by 33× and its total carbon footprint by 44×.

Data center efficiency and infrastructure:

  • Fleet-wide average Power Usage Effectiveness (PUE) is 1.09 for Google data centers.
  • Google’s latest-generation TPU, Ironwood, is claimed to be 30× more energy-efficient than its first publicly available TPU.

Carbon Brief, 2025

Carbon Brief synthesizes data from the International Energy Agency (IEA)5 and other sources into a set of charts that show baseline impacts, growth projections, and regional concentration risks. It highlights that the sector is small globally today but growing quickly, and already locally significant on some grids.6

Current global shares

  • Data centers are responsible for just over 1% of global electricity demand and 0.5% of CO₂ emissions (see Figure 2).

Growth

  • IEA central scenario: Data center electricity consumption rises to 945 TWh by 2030.
  • AI share of data center power use: Roughly 5% to 15% recently, potentially 35% to 50% by 2030.

Regional concentration examples:

  • Ireland: Around 21% of national electricity is used by data centers, potentially rising to 32% by 2026.
  • Virginia (US): 26% of electricity consumed by data centers (as cited).

Power supply mix for data centers (global)

  • Fossil fuels: Nearly 60%
  • Renewables: 27%
  • Nuclear: 15%

Figure 2: Based on the IEA’s Global Energy Review 2025 and its Energy and AI report, this figure compares electricity consumption (TWh) and CO₂ emissions (MtCO₂) from global data centers in 2024 with those of other sectors.

United Nations Regional Information Centre for Western Europe (UNRIC), 2025

UNRIC frames AI’s environmental footprint across the full lifecycle: software (training, deployment, inference, maintenance) and hardware (materials, manufacturing, construction, e-waste). It emphasizes that electricity consumption and water use in data centers are direct impacts and advocates policy measures to improve disclosure and accountability.7

Statistics and category breakdown (electricity, water, lifecycle categories): The article is more categorical than numeric; it explicitly groups AI impacts into direct, indirect, and higher-order effects. Here are some of the key findings:

  • Categories of environmental impact
    • Direct: Electricity and water consumption, greenhouse gas emissions, mineral extraction, pollution, and electronic waste.
    • Indirect: Emissions from AI-enabled applications and services.
    • Higher-order: Amplification of inequalities and issues related to biased or poor-quality training data.
  • Data centers and resource use
    • Data centers consume large amounts of electricity, much of it still supplied by fossil fuels.
    • Significant amounts of water are required for cooling systems and construction.
    • Global AI-related water demand is expected to reach 4.2–6.6 billion cubic meters by 2027, exceeding Denmark’s annual water use.
    • Producing a 2-kilogram computer can require around 800 kilograms of raw materials, including rare minerals.
  • Electricity use and growth
    • A ChatGPT query uses about 10× as much electricity as a Google search, according to an IEA estimate.
    • AI and machine learning accounted for <0.2% of global electricity use and <0.1% of global emissions in 2021, but demand is rising quickly.
    • Some tech companies report annual growth of over 100% in computing demand for AI training and inference.
  • Expansion of data centers
    • Data centers represented about 1% of global electricity demand in 2022.
    • In Ireland, data centers accounted for 17% of national electricity use in 2022.
    • The number of data centers worldwide has grown from 500,000 in 2012 to around 8 million today.

MIT News on Generative AI’s environmental impact, 2025

MIT News explains why generative AI can be resource-intensive and distinguishes between training and inference. It stresses power density, grid reliability issues, and the lack of incentives for users to reduce usage when impacts are invisible.8

Power density

  • A generative AI training cluster might consume 7 to 8× more energy than a typical computing workload.

Data center electricity consumption

  • Global data center electricity consumption is cited as 460 TWh in 2022 and projected to reach ~1,050 TWh by 2026.

Model training example

  • Training GPT-3 estimated at 1,287 MWh and about 552 tons of CO₂.

United States Data Center Energy Usage Report, 2024

This report estimates historical US data center electricity consumption and provides scenario ranges through 2028. It explicitly links the post-2017 growth inflection to accelerated servers, including the graphics processing units (GPUs) used to run AI models.9

US total data center electricity consumption:

  • ~60 TWh (2014–2016), relatively stable.
  • 76 TWh by 2018, about 1.9% of US electricity consumption.
  • 176 TWh by 2023, about 4.4% of US electricity consumption.

2028 scenario range:

  • 325 to 580 TWh by 2028.
  • Equivalent to 6.7% to 12.0% of forecast US electricity consumption in 2028.
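
Dividing the TWh figures by their stated shares backs out the US totals the report implies, which serves as a quick consistency check on the percentages:

```python
# Cross-check: implied total US electricity consumption behind the shares.
print(f"{176 / 0.044:.0f} TWh")  # -> 4000 TWh implied 2023 US total
print(f"{325 / 0.067:.0f} TWh")  # -> ~4851 TWh implied 2028 total (low case)
print(f"{580 / 0.120:.0f} TWh")  # -> ~4833 TWh implied 2028 total (high case)
```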

Drivers and categories:

  • Growth is driven by GPU-accelerated servers for artificial intelligence, which now account for a significant share of the installed base.
  • It describes efficiency strategies that previously held demand flat, including improved cooling systems, power management, higher utilization rates, and reduced idle power.

MIT Technology Review, 2023

MIT Technology Review reports on one of the first attempts to quantify the energy use and carbon emissions of AI during everyday use (inference), rather than focusing only on training. The article is based on a preprint study by researchers from Hugging Face and Carnegie Mellon University.

The study shows that while training large AI models is highly energy-intensive, most of an AI model’s lifetime carbon footprint comes from its use. Because popular models are deployed millions or billions of times, day-to-day inference emissions can quickly surpass training emissions.10

Per-task energy and carbon intensity:

The researchers measured energy use across 10 common AI tasks on the Hugging Face platform, testing 88 different models and running 1,000 prompts per task with the CodeCarbon measurement tool. Key comparisons are:

  • Image generation: Generating a single image with a powerful model consumes roughly the same energy as fully charging a smartphone.
    • Generating 1,000 images with a model like Stable Diffusion XL produces CO₂ emissions comparable to driving about 4.1 miles in a gasoline car.
    • Image generation is by far the most energy- and carbon-intensive AI task measured.
  • Text generation: Generating text is significantly less energy-intensive.
    • Producing 1,000 text outputs uses only about 16% of a smartphone’s charge.
    • The least carbon-intensive text model studied emitted CO₂ equivalent to driving just 0.0006 miles.

Model size and task specialization:

The study highlights a major efficiency gap between general-purpose generative models and task-specific models:

  • Large generative models consume far more energy because they are designed to perform many tasks (generate, classify, summarize). For example, using a generative model to classify movie reviews requires ~30× more energy than using a smaller model fine-tuned specifically for sentiment classification.
  • Smaller, specialized models are consistently less carbon-intensive for narrow applications.

Usage emissions vs training emissions:

The researchers compared training emissions with cumulative usage emissions:

  • The emissions from training Hugging Face's largest BLOOM model were surpassed after about 590 million uses of the model.
  • For extremely popular models like ChatGPT, usage emissions could exceed training emissions within weeks, due to massive daily user volumes.
  • This happens because training occurs once, while inference happens continuously at scale.
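
The break-even logic is a one-line formula: usage overtakes training once cumulative per-use emissions exceed the one-time training emissions. The numbers below are hypothetical placeholders, not the study's measurements, chosen only to show the scale involved:

```python
# Break-even point where cumulative usage emissions pass training emissions.
# Both inputs are hypothetical placeholders, not the study's measurements.

def breakeven_uses(training_emissions_g: float, per_use_g: float) -> float:
    return training_emissions_g / per_use_g

# e.g. an assumed 30 t CO2e of training and 0.05 g CO2e per inference:
print(f"{breakeven_uses(30e6, 0.05):,.0f} uses")  # -> 600,000,000 uses
```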

Broader implications and expert views:

  • Experts note that the emissions per task were higher than expected, raising concerns as generative AI becomes embedded in everyday software (email, search, word processing).
  • Researchers emphasize that newer, larger models are substantially more carbon-intensive than AI systems from just a few years ago.


Sıla Ermut
Industry Analyst

Sıla Ermut is an industry analyst at AIMultiple focused on email marketing and sales videos. She previously worked as a recruiter in project management and consulting firms. Sıla holds a Master of Science degree in Social Psychology and a Bachelor of Arts degree in International Relations.
