
The Carbon Footprint of Large Language Models: What Sustainability Consultants Should Know


Artificial intelligence is rapidly becoming part of everyday digital infrastructure. Large language models (LLMs) now support tasks ranging from customer support and software development to internal knowledge management and data analysis.

As adoption grows, organisations increasingly ask about the carbon footprint of AI and large language models. Are LLMs a significant source of emissions? Should their use be considered in corporate climate strategies?

Public discussions often focus on the energy required to generate responses to prompts. However, research shows that this is not where most emissions occur. The majority of the environmental impact of LLMs happens earlier in the model life cycle, particularly during training.

Understanding where emissions actually occur is important for sustainability consultants advising organisations on carbon accounting, ESG reporting, and digital infrastructure impacts.

Where Do LLM Emissions Come From?

Across the full life cycle of a large language model, the vast majority of emissions occur before the model is used by end users.

Training an LLM requires enormous computational resources. During this stage, the model learns from large datasets through repeated optimisation processes that require extensive computing power.

A study conducted by Carbone4 on the Mistral LLM provides a useful breakdown of emissions across the model life cycle:

  • ~85% of emissions come from electricity consumption during the training phase
  • ~10% come from embodied emissions of the hardware used for training
  • Only ~3–4% occur during model usage

This means that roughly 95% of the total carbon footprint of an LLM is determined during development, not during everyday use.
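As a rough illustration, the split above can be written out numerically. The shares are the approximate Carbone4 figures quoted above; everything else in this sketch is illustrative:

```python
# Approximate life-cycle shares from the Carbone4 study cited above.
# Values are rounded in the source, so they need not sum to exactly 100%.
lifecycle_shares = {
    "training_electricity": 0.85,  # electricity consumed during training
    "training_hardware": 0.10,     # embodied emissions of training hardware
    "usage_inference": 0.04,       # inference while the model is in use
}

# Emissions that are locked in before any end user submits a prompt.
pre_deployment = (
    lifecycle_shares["training_electricity"]
    + lifecycle_shares["training_hardware"]
)

print(f"Emissions fixed before deployment: ~{pre_deployment:.0%}")
```

The point of the sketch is simply that roughly 95% of the footprint is fixed once the model exists, regardless of how intensively it is later used.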

Two factors explain this distribution.

1. High computational demand during training

Training involves extremely large computing workloads, often including:

  • billions or trillions of model parameters
  • repeated processing of large training datasets
  • thousands of GPUs running simultaneously for extended periods

These training runs can last weeks or months depending on the size of the model.

2. Embodied emissions of specialised hardware

The production of high-performance AI hardware also contributes to the overall footprint. Significant embodied emissions arise from manufacturing:

  • GPUs and AI chips
  • servers
  • networking infrastructure
  • cooling systems

Together, electricity consumption during training and hardware manufacturing dominate the life-cycle footprint of large language models.

The Carbon Impact of Using Large Language Models

In contrast to training, the emissions associated with using LLMs are comparatively small.

When users submit prompts, the model performs an operation called inference, generating an answer using its trained parameters. This process requires computing power but is far less energy-intensive than training.

Estimates suggest that generating answers to 1,000 prompts results in emissions between roughly 5 and 500 gCO₂e. These figures assume an electricity carbon intensity of approximately 480 gCO₂e/kWh.

The range depends on several factors:

  • model size
  • data centre efficiency
  • infrastructure configuration
  • number of tokens processed per query

Even at the higher end of the estimate, the emissions remain small relative to most components of a typical personal or corporate carbon footprint.
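Those factors can be folded into a simple back-of-the-envelope calculation. The grid intensity below is the 480 gCO₂e/kWh assumption stated above; the per-prompt energy figures are hypothetical bounds chosen only to reproduce the cited 5–500 gCO₂e range:

```python
# Back-of-the-envelope usage emissions: energy per prompt x grid intensity.
GRID_INTENSITY_G_PER_KWH = 480.0  # assumption stated in the text

def usage_emissions_g(energy_kwh_per_prompt: float, n_prompts: int = 1_000) -> float:
    """Emissions in gCO2e for n_prompts at the assumed grid intensity."""
    return energy_kwh_per_prompt * n_prompts * GRID_INTENSITY_G_PER_KWH

# Hypothetical per-prompt energy bounds (not measured values):
low = usage_emissions_g(1.0e-5)   # small model, short answers
high = usage_emissions_g(1.0e-3)  # large model, long answers

print(f"~{low:.0f} to {high:.0f} gCO2e per 1,000 prompts")
```

Even the upper bound of roughly 480 g is on the order of driving a car a few kilometres, which is why usage emissions rarely move a corporate footprint.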

For example, each of the following typically exceeds the emissions associated with thousands of AI queries:

  • a short car journey
  • everyday electricity use
  • common consumer goods

For this reason, LLM usage emissions are typically negligible within most corporate carbon footprints.

Why Model Size Drives the Carbon Footprint of AI

Although usage emissions are small, the overall carbon footprint of AI models is strongly linked to model size.

Larger models require:

  • more computing power
  • longer training times
  • larger datasets
  • more specialised hardware

As a result, emissions increase significantly with the number of parameters in the model.

This creates an important sustainability lever. In many real-world applications, extremely large general-purpose models are not required.

Instead, organisations can often use:

  • smaller language models
  • task-specific models
  • fine-tuned models optimised for a particular task

These approaches can significantly reduce computational requirements while still delivering adequate performance.

For sustainability consultants advising organisations on AI adoption, model selection becomes one of the most important factors influencing environmental impact.

What AI Emissions Mean for Corporate Carbon Accounting

The rise of AI raises several practical questions for carbon accounting and sustainability reporting.

Three aspects are particularly relevant.

1. Contextualising AI emissions

In most organisations, emissions linked to AI usage will represent a very small share of total emissions.

Other emission sources typically dominate corporate footprints, including:

  • purchased goods and services
  • supply chain emissions
  • transport and logistics
  • product use

This means that AI usage rarely becomes a major focus of corporate decarbonisation strategies.

2. Understanding life-cycle boundaries

Most emissions occur during model development and hardware production, which typically take place within the infrastructure of technology providers.

This creates challenges for corporate accounting frameworks because:

  • operational electricity consumption may fall under Scope 2 (self-hosted infrastructure) or Scope 3 (purchased cloud services)
  • hardware manufacturing emissions often remain outside organisational boundaries

Understanding these life-cycle boundaries is essential when assessing the environmental impact of digital technologies.

3. Identifying practical reduction levers

For organisations deploying AI systems, the most realistic emission reduction options include:

  • selecting smaller or more efficient models
  • avoiding unnecessary computation
  • optimising infrastructure usage
  • choosing data centre providers with lower-carbon electricity

These choices can reduce energy consumption while preserving the benefits of AI technologies.

The Future Carbon Footprint of Artificial Intelligence

The environmental impact of AI will largely depend on how models are designed, trained, and deployed.

Several trends will influence the future carbon footprint of large language models:

  • improvements in hardware efficiency
  • optimisation of model architectures
  • increased use of smaller specialised models
  • lower-carbon electricity in data centres

At the same time, the rapid growth of AI usage means that total energy demand may continue to increase.

For sustainability professionals, this reinforces the importance of understanding the digital infrastructure behind AI systems, including:

  • cloud computing platforms
  • data centre energy systems
  • hardware supply chains
  • software architecture decisions

Key Takeaways: The Carbon Impact of Large Language Models

For sustainability consultants advising organisations on AI adoption, several conclusions stand out:

  • Most LLM emissions occur during model training, not during everyday usage.
  • Usage emissions are typically negligible within corporate carbon footprints.
  • Model size strongly influences environmental impact.
  • Smaller or task-specific models can significantly reduce emissions.

In practice, the most effective strategy for reducing the carbon footprint of AI is not limiting individual usage, but improving model efficiency and deployment choices.

As artificial intelligence continues to expand across digital systems, understanding these dynamics will become increasingly important for sustainability professionals working at the intersection of AI, carbon accounting, and climate strategy.

About Carbon+Alt+Delete

We provide carbon accounting software for sustainability consultants and consultancies that guide companies towards net zero.

Curious to discover how our software can improve your carbon accounting services?

Feel free to reach out to [email protected] or book a meeting to talk to one of our experts here.