Training One AI Model Emits as Much CO₂ as Five Cars in Their Lifetimes

In June 2019, Emma Strubell, Ananya Ganesh, and Andrew McCallum at the University of Massachusetts Amherst published a paper that would fundamentally change how the AI community thinks about environmental costs. Their study, "Energy and Policy Considerations for Deep Learning in NLP," was the first rigorous quantification of the carbon emissions from training large AI models — and the numbers were shocking (Strubell et al., 2019).

The researchers estimated that training a single large neural network model (a Transformer tuned with neural architecture search) produced approximately 284 tonnes of CO₂ equivalent, roughly five times the lifetime emissions of an average American car, including its manufacturing. Even training BERT, a far more widely used model, emitted roughly as much as a round-trip trans-American flight for one passenger. These figures covered only the final training run; they excluded energy spent on experimentation, failed runs, and deployment.
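To make the arithmetic behind such estimates concrete, the sketch below follows the general shape of the paper's method: measure the average power draw of the training hardware, scale by training time and a data-center overhead factor (PUE), then convert energy into emissions with a grid carbon-intensity factor. The PUE of 1.58 and the factor of 0.954 lbs CO₂ per kWh are the U.S. averages the paper reports; the hardware figures in the example are illustrative assumptions, not numbers from the study.

```python
# Sketch of the estimation approach in Strubell et al. (2019): convert the
# measured power draw of training hardware into energy, then into CO2-
# equivalent emissions via an average grid-intensity factor.

PUE = 1.58                # data-center power usage effectiveness (2018 industry average)
LBS_CO2_PER_KWH = 0.954   # EPA average U.S. grid emissions factor
LBS_PER_TONNE = 2204.62   # pounds per metric tonne

def co2e_tonnes(cpu_watts: float, dram_watts: float, gpu_watts: float,
                num_gpus: int, hours: float) -> float:
    """Estimated training emissions in tonnes of CO2-equivalent."""
    # Energy in kWh: combined average draw, scaled by PUE to account for
    # cooling and other data-center overhead.
    kwh = PUE * hours * (cpu_watts + dram_watts + num_gpus * gpu_watts) / 1000
    return kwh * LBS_CO2_PER_KWH / LBS_PER_TONNE

# Illustrative (assumed) run: 8 GPUs at 300 W each, trained for 240 hours.
print(f"{co2e_tonnes(100, 25, 300, 8, 240):.2f} tonnes CO2e")
```

The per-run estimate scales linearly with GPU count and training time, which is why hyperparameter sweeps and failed runs, which the paper's headline figure excludes, can multiply real-world emissions many times over.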

The paper arrived at a critical moment. The AI field was in the early stages of what would become an exponential scaling trend. GPT-2 had just been released (February 2019), and labs were racing to train ever-larger models. Strubell's team warned that "the financial cost of training has been the main consideration, while the environmental cost has been largely ignored." They called for mandatory reporting of energy consumption in AI research papers and the development of more efficient training methods.

The paper's impact was significant but incomplete. It contributed to growing awareness of AI's environmental footprint and helped spark the "Green AI" movement, a push for energy-efficient machine learning. Google and other companies began reporting some energy metrics for their AI systems. However, the trend toward larger models only accelerated: GPT-3 (2020) had more than 100 times the parameters of GPT-2 (175 billion versus 1.5 billion), and GPT-4 (2023) was reportedly larger still. By 2025, the cumulative emissions of AI systems dwarfed what Strubell's team had measured in 2019.

Strubell's work established a crucial principle: AI capabilities come with environmental costs that must be measured, reported, and minimized. That principle remains as relevant today as it was in 2019 — arguably more so, as the scale of AI deployment has grown beyond what anyone imagined at the time the paper was written.

Key Sources

  • Strubell E., Ganesh A., McCallum A. (2019). Energy and Policy Considerations for Deep Learning in NLP. ACL 2019.
  • MIT Technology Review (2019). Training a single AI model can emit as much carbon as five cars in their lifetimes.
