The Carbon Footprint of AI

Written by Climate Impact Partners. Published 28 October 2025.

Artificial Intelligence (AI) is everywhere now. If you’ve ever asked Siri for the weather, let Netflix recommend your next binge, or typed a sentence into Gmail only to have it finish your thought, you’ve been using AI. 

AI powers your phone, curates your social feeds, manages logistics behind the products you buy, and is increasingly running quietly in the background of almost every industry.

Why is AI such a big deal? AI is reshaping all aspects of our lives. It’s improving healthcare by helping doctors detect disease earlier. It’s transforming finance by spotting fraud faster than humans could. It’s giving businesses sharper insights from oceans of data. AI is not a passing trend. It’s embedded in the way we live, work, and do business.

As useful as AI is, it comes with a very real environmental cost. AI’s computer systems run on electricity, a lot of it. Training the massive models behind today’s AI platforms (such as ChatGPT, Gemini, Claude and Perplexity) requires server farms humming away for weeks or even months. That energy use translates into a large carbon footprint, which contributes to climate change.

What is a carbon footprint?

A carbon footprint is the total amount of greenhouse gases - mainly carbon dioxide - that are released into the atmosphere as a result of everyday activities. These gases trap heat, driving climate change over time. Driving a car, flying on a plane, and heating and cooling your home all contribute to your carbon footprint. Even the products we buy and the food we eat have a carbon footprint.

Electricity powers almost all of it. Much of the world still relies on fossil fuels like coal and natural gas to generate power. So, when a process (whether it’s running your fridge or training AI models) consumes a lot of electricity, it indirectly produces carbon emissions and increases the carbon footprint.

It’s not just heavy industry or transportation we have to think about anymore. Our digital world has its own footprint. Every photo, email, video stream, or cloud backup requires data centers full of servers, which in turn require massive energy inputs. AI is one of the latest, and fastest-growing, players in this digital space.

How does AI contribute to a carbon footprint?

AI is energy intensive. Training large-scale AI models, particularly large language models (LLMs) and advanced generative AI systems, is among the most energy-intensive computing tasks. Training involves feeding enormous datasets into algorithms and letting them “learn” patterns over many cycles of computation. This is where the bulk of the energy gets used.

Why so much energy? Because these models don’t run on a single computer. They require massive data centers packed with specialized chips like GPUs (graphics processing units) and TPUs (tensor processing units). These machines work around the clock, often for weeks or months, consuming electricity that may or may not come from renewable sources.

Once a model is finished training and ready to be deployed, it still needs power every time it generates a response, makes a recommendation, or recognizes an image. This is called the inference phase, and at scale (think millions of queries per day) it adds up quickly.

Every AI query you make is a tiny tap on that energy system. Multiply it by billions of users, and suddenly the carbon cost is substantial.
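To make that "adds up quickly" point concrete, here is a minimal back-of-envelope sketch. The per-query figure is an assumption, not a measured value; published estimates for a single chatbot query range from fractions of a watt-hour to several watt-hours.

```python
# Back-of-envelope sketch of inference energy at scale.
# ENERGY_PER_QUERY_WH is an assumption, not a measured value:
# published per-query estimates vary widely.
ENERGY_PER_QUERY_WH = 0.3          # assumed watt-hours per query
QUERIES_PER_DAY = 1_000_000_000    # one billion queries per day

daily_wh = ENERGY_PER_QUERY_WH * QUERIES_PER_DAY
daily_mwh = daily_wh / 1_000_000   # watt-hours -> megawatt-hours
print(f"~{daily_mwh:.0f} MWh per day just to answer queries")
```

At this assumed rate, serving queries for only a few days would rival the reported one-time energy cost of training GPT-3 (discussed below), which is why the inference phase matters so much at scale.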

Besides LLMs and generative AI, AI includes a wide range of systems and techniques, such as machine learning (ML) models; computer vision systems; reinforcement learning; natural language processing (NLP) tools; expert systems; and recommendation engines. These types of AI often require less energy than massive LLMs but still contribute to the overall footprint of AI.

How much electricity is AI using?

Training a single large model like GPT-3 can use over 1,200 MWh - enough electricity to power around 120 U.S. homes for a year. And that is just for training the model. More energy is consumed every time someone submits a query.
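The homes comparison above is easy to sanity-check. The household figure is an assumption: EIA data puts the average U.S. home at roughly 10,500–10,800 kWh of electricity per year.

```python
# Checking the "around 120 U.S. homes" comparison above.
# HOME_KWH_PER_YEAR is an assumption based on typical EIA figures.
TRAINING_ENERGY_MWH = 1_200    # reported GPT-3 training energy
HOME_KWH_PER_YEAR = 10_500     # assumed average U.S. household use

homes = TRAINING_ENERGY_MWH * 1_000 / HOME_KWH_PER_YEAR
print(f"~{homes:.0f} homes powered for a year")
```

The result lands near 114 homes with these assumptions, consistent with the article's "around 120" once you allow for different household averages.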

While smaller or optimized models consume far less, the scale of AI adoption means total energy use is becoming a meaningful contributor to global electricity demand. For perspective, the International Energy Agency (IEA) estimates that global data center electricity demand will nearly double by 2030, driven largely by AI workloads.

How much CO2 does AI produce?

As just one illustrative example of the CO2 emissions of AI, researchers estimated that training GPT-3 emitted roughly 500 metric tons of carbon dioxide (CO₂), the equivalent of driving a car from New York to San Francisco about 438 times.

And that’s just for the training phase of one model. After training, the model still consumes energy every time it’s used, which adds to its overall footprint. (It should also be noted that GPT-3 is becoming outdated, with a more advanced GPT-5 model now surpassing it.)
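The driving comparison above can be roughly reconstructed. Both inputs here are assumptions, not from the original study: a New York to San Francisco drive of about 2,900 miles, and an average passenger car emitting roughly 400 g of CO2 per mile (close to EPA's commonly cited figure).

```python
# Rough reconstruction of the coast-to-coast driving comparison.
# TRIP_MILES and CAR_G_CO2_PER_MILE are assumptions, not study values.
TRAINING_EMISSIONS_T = 500    # reported GPT-3 training emissions, metric tons
TRIP_MILES = 2_900            # assumed NY -> SF driving distance
CAR_G_CO2_PER_MILE = 400      # assumed average car emissions

grams_per_trip = TRIP_MILES * CAR_G_CO2_PER_MILE
trips = TRAINING_EMISSIONS_T * 1_000_000 / grams_per_trip
print(f"~{trips:.0f} coast-to-coast drives")
```

With these round-number assumptions the result comes out near 430 trips; the article's 438 follows from slightly different distance and per-mile figures.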

That said, it is hard to pin down an exact figure for AI’s total emissions. Many tech companies do not publicly disclose the carbon costs of their models, and reporting is currently voluntary, not mandated. This lack of transparency means we have only snapshots from research studies or occasional disclosures—not a full picture of AI’s environmental impact.

How bad is AI for climate change?

It’s worth putting this into perspective. Compared to aviation or cement production, AI’s total carbon footprint today is relatively small. However, AI is growing at speed.

The models are getting larger, the applications broader, and the adoption more widespread. Data centers currently consume about 1–2% of global electricity. Of that, AI is responsible for about 15%.
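Multiplying the two figures above gives AI's rough share of global electricity today, a calculation worth seeing spelled out:

```python
# Combining the article's figures: data centers use 1-2% of global
# electricity, and AI accounts for about 15% of data center demand.
dc_share_low, dc_share_high = 0.01, 0.02   # data centers' share of global electricity
ai_share_of_dc = 0.15                       # AI's share of data center demand

low = dc_share_low * ai_share_of_dc
high = dc_share_high * ai_share_of_dc
print(f"AI: roughly {low:.2%} to {high:.2%} of global electricity")
```

That works out to roughly 0.15% to 0.30% of global electricity: small next to heavy industry, but growing fast.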

The International Energy Agency projects that global data center electricity demand will nearly double by 2030, driven largely by AI. Already, in just a five-year period, Google’s carbon emissions surged nearly 50%, due in large part to AI energy demand.

What factors affect AI’s carbon emissions?

In terms of how much carbon emissions an AI system produces, a few factors are at play:

  • Algorithm complexity: Some algorithms, especially deep learning algorithms used to power generative AI, require heavier computation than others. A more complex algorithm can mean more cycles, more servers, more energy.
  • Model size: The size of the dataset and the number of model parameters directly affect training time and energy use. GPT-5, for example, is orders of magnitude larger than models from just a few years ago.
  • Data center efficiency: Not all server farms are created equal. A data center powered by Icelandic geothermal energy has a much smaller footprint than one powered by coal in West Virginia. Cooling systems and energy efficiency make a difference too.
  • Hardware and infrastructure: Specialized chips like TPUs are designed to accelerate AI training more efficiently than general-purpose hardware. As hardware improves, the energy cost per calculation drops—but overall demand keeps rising.

What are the sustainability initiatives in AI?

The good news is that awareness of AI’s environmental impact is growing—and with it, efforts to clean things up.

AI companies going green: Google has committed to running entirely on carbon-free energy by 2030. Microsoft has pledged to become carbon negative by the same year. Amazon is investing billions in renewable projects. These commitments reflect the operational reality that data centers are massive energy sinks.

Efficient algorithms: Researchers are exploring ways to design algorithms that achieve the same performance with less computation. “Green AI” is a growing movement focused on balancing accuracy with efficiency.

AI for good: Ironically, AI can also help solve the very climate problem it contributes to. For example, AI is being used to optimize power grids for renewable energy, predict wildfires, monitor deforestation, and model future climate change scenarios. It’s even helping design new materials for batteries and carbon capture. In this sense, AI is both part of the problem and part of the solution.

Carbon credits: Many AI companies are now exploring carbon credits as a way to offset the emissions generated by their models. By purchasing verified credits, they can support projects that reduce or remove CO₂ from the atmosphere, helping to balance out the carbon footprint of AI training and operation while continuing to innovate.

Can an individual’s choices reduce AI’s carbon footprint?

The answer is yes—at least indirectly.

Here are a few ways:

  • Choose tech providers that are transparent about their sustainability goals. If a company has strong renewable energy commitments, your usage of their services carries a lighter footprint.
  • Support policies and initiatives that push the tech industry toward greener practices, such as government regulations on energy efficiency, corporate net-zero and renewable energy commitments, research programs on sustainable AI, and global reporting frameworks that promote transparency in emissions and energy use.
  • Choose the right AI model. Not every AI task needs the biggest, most complex system. Running GPT-5, a massive large language model, to summarize a few emails is like using a jet engine to power a bicycle. In many cases, smaller, lighter models can deliver similar results for simple tasks with a fraction of the energy use.

Challenges and trade-offs

AI’s carbon footprint involves more than just tons, watts, and flops (a measure of a computer’s performance). It also involves the trade-offs we’re willing to make as a society.

On one hand, AI innovation is racing forward, unlocking breakthroughs in medicine, climate science, education, and beyond. On the other hand, that innovation comes with a hefty energy bill. The question becomes: how do we balance progress with sustainability?

The need for balance

Pushing the boundaries of AI almost always requires more computing power. Training the next generation of models demands larger datasets, more complex architectures, and longer training cycles.

From a purely technological perspective, the bigger and more powerful the model, the more impressive the results. But from an environmental perspective, “bigger is better” quickly becomes “bigger is costlier.”

If we pull back on computing resources to save energy, we risk slowing innovation. But if we keep scaling AI without regard for its carbon footprint, we risk undermining the very future those innovations are supposed to protect. Businesses, researchers, and policymakers are now grappling with this balancing act.

Ethical considerations

There is also an ethical layer to consider. Should society accept that cutting-edge AI comes with high emissions as the price of progress? Or should we insist that sustainability be incorporated into research and development from the start?

What about using massive amounts of energy to generate silly images as a joke for friends, compared to using similar resources to predict climate patterns and help prevent natural disasters? Both create carbon emissions, but the value of one may be much easier to justify than the other.

Ultimately, AI’s future will depend not just on what’s possible, but also on what’s responsible. It’s up to innovators, businesses, and regulators to decide how much weight to give to sustainability alongside performance.

AI and environmental regulation and policy

When it comes to AI’s carbon footprint, regulation is still catching up. Most existing policies do not target AI directly but focus instead on the broader digital infrastructure—like energy efficiency standards for data centers or national targets for renewable energy adoption.

The European Union, for example, has introduced guidelines around sustainable data center practices, while in the U.S., efficiency standards for servers and cooling systems are beginning to take shape. But explicit rules around AI’s environmental impact remain rare.

Existing policies

Right now, the closest we get are energy-efficiency frameworks for cloud providers and reporting requirements that some large tech companies follow voluntarily. While these help, they do not fully address the unique energy intensity of training and running AI models.

Proposed solutions

Governments could step in with targeted incentives—such as tax breaks or grants for companies that use renewable-powered data centers for AI training. Another option is mandatory carbon reporting for AI companies, making emissions data transparent so businesses and consumers can make informed choices.

Does AI affect the environment beyond just its carbon footprint?

Yes, AI can affect the environment beyond its carbon footprint. For example:

  • Water use: Data centers need substantial amounts of fresh water for cooling to prevent servers from overheating. In regions with limited water availability, this can put additional stress on local water resources and affect surrounding ecosystems.

For example, about 17.5 billion gallons of fresh water were consumed directly by U.S. data centers in 2023. This is approximately equivalent to 26,500 Olympic-size swimming pools, or the annual water use of a mid-sized American city such as South Bend, Indiana, or Fort Collins, Colorado.

  • Electronic waste: AI relies on specialized hardware like GPUs, TPUs, and high-performance servers, which have relatively short lifespans. Frequent upgrades and disposal of these devices create large amounts of e-waste, which can be difficult to recycle and may release harmful chemicals into the environment.
  • Resource extraction: Manufacturing AI hardware requires mining metals such as lithium, cobalt, and rare earth elements. These extraction processes can cause habitat destruction, water pollution, and soil degradation, contributing to long-term ecological harm.
  • Land use: Building large-scale AI data centers often requires significant land, potentially disrupting natural habitats and local biodiversity. The construction and maintenance of these facilities can also alter local landscapes and contribute to urbanization pressures.
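The Olympic-pool comparison in the water-use bullet above checks out with one assumption: a regulation Olympic pool holds 2,500 cubic meters of water, about 660,000 U.S. gallons.

```python
# Checking the Olympic-pool comparison for U.S. data center water use.
# POOL_GALLONS is an assumption: a 2,500 m^3 Olympic pool is about
# 660,000 U.S. gallons.
DATA_CENTER_GALLONS = 17_500_000_000   # reported 2023 U.S. data center use
POOL_GALLONS = 660_000                 # assumed Olympic pool volume

pools = DATA_CENTER_GALLONS / POOL_GALLONS
print(f"~{pools:,.0f} Olympic pools")
```

The result lands just over 26,500 pools, matching the figure quoted above.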

The future of AI’s environmental impact

AI doesn’t have to be an environmental enemy. With the right innovations and incentives, it could evolve into a far greener technology than it is today.

The future of AI and sustainability has a few promising developments:

  • Better hardware: Chips designed specifically for AI are getting more energy efficient every year.
  • Greener data centers: The shift toward renewable energy is accelerating, and innovations in cooling and energy storage are making data centers more sustainable.
  • Smarter algorithms: Researchers are exploring ways to design more efficient generative AI models, including streamlined large language models that use less training data to reduce energy consumption without losing performance.
  • Quantum computing: Quantum computing is still emerging, but it promises a potential leap in efficiency for certain kinds of problems compared with today’s classical computers.

The bottom line

AI is transforming the way we live and work, but it is not without cost. Training and running these systems consumes significant amounts of energy, which translates into carbon emissions. The size of the models, the complexity of algorithms, and the efficiency of data centers all play a role.

But there is reason for optimism. Tech giants are investing heavily in renewable energy. Researchers are working on more efficient models. And consumers are starting to ask the right questions.

If we are thoughtful, AI can be both powerful and sustainable. The challenge for businesses, policymakers, and individuals alike is to keep the environmental costs in mind as we charge into an AI-driven future.

Related AI FAQs

  • How much e-waste does AI produce?

    AI itself doesn’t directly create e-waste, but the hardware it depends on—GPUs, TPUs, and servers in data centers—eventually becomes outdated. As models grow larger, demand for more powerful chips increases, adding to the global pile of discarded electronics.

    While hard numbers specific to AI aren’t widely available, in 2022, 62 million tonnes of electronic waste (e-waste) were generated globally, and only about 22% was properly collected and recycled.

  • How much water does AI consume?

    Training and running large AI models requires cooling massive data centers, many of which rely on water-based cooling systems. A 2023 study estimated that training GPT-3 consumed about 700,000 liters of clean freshwater, enough to produce hundreds of cars or thousands of smartphones.

    Water use varies depending on where the data center is located and how it’s cooled, but it’s becoming an emerging environmental concern alongside carbon emissions.

  • What happens to the data once a model is trained?

    The data itself doesn’t generate emissions once stored, but maintaining and accessing massive datasets requires storage infrastructure—which still consumes energy. Companies are experimenting with “data pruning” and more efficient storage methods to reduce this footprint.

  • Are all AI applications equally bad for the environment?

    Not at all. The footprint varies enormously. Training a model like GPT-5 requires orders of magnitude more energy than training a small model designed for medical diagnostics or traffic prediction. Choosing the right model for the right job is one of the simplest ways to cut down on unnecessary emissions.