Power Hungry Processing: Measuring the Real Energy Cost of AI Model Deployment

Luccioni et al. from Hugging Face provided one of the most comprehensive lifecycle carbon assessments of a large language model (BLOOM, 176B parameters):

  • Training: 24.7 tonnes CO2eq, relatively low because BLOOM was trained in France on a largely nuclear-powered, low-carbon grid
  • Hardware manufacturing: an additional 7.6 tonnes CO2eq embodied in the equipment
  • Inference: once deployed for real-time use, the model consumed approximately 914 kWh over its roughly 18-day monitoring period, about 50 kWh per day
  • Over the model's expected lifetime, deployment energy exceeds training energy, a crucial and often overlooked finding (see the back-of-the-envelope sketch after this list)

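To make these numbers concrete, here is a minimal back-of-the-envelope sketch in Python. The grid carbon intensity, the 18-day monitoring window used to derive a per-day rate, and the 100x traffic scenario are illustrative assumptions, not figures reported in the paper.

    TRAINING_TCO2EQ = 24.7              # BLOOM training emissions (tonnes CO2eq)
    EMBODIED_TCO2EQ = 7.6               # hardware manufacturing emissions (tonnes CO2eq)
    INFERENCE_KWH_PER_DAY = 914 / 18    # assumed ~18-day window -> roughly 51 kWh/day
    GRID_G_PER_KWH = 400                # assumed grid carbon intensity (gCO2eq/kWh)


    def days_to_overtake(kwh_per_day: float,
                         upfront_tco2eq: float = TRAINING_TCO2EQ,
                         grid_g_per_kwh: float = GRID_G_PER_KWH) -> float:
        """Days of deployment until cumulative inference CO2eq exceeds an upfront budget."""
        daily_tco2eq = kwh_per_day * grid_g_per_kwh / 1e6   # grams -> tonnes
        return upfront_tco2eq / daily_tco2eq


    print(f"One-off footprint (training + hardware): {TRAINING_TCO2EQ + EMBODIED_TCO2EQ:.1f} t CO2eq")
    print(f"Observed traffic: inference overtakes training after {days_to_overtake(INFERENCE_KWH_PER_DAY):,.0f} days")
    print(f"Hypothetical 100x traffic: after {days_to_overtake(INFERENCE_KWH_PER_DAY * 100):,.0f} days")

The crossover point is highly sensitive to query volume and grid carbon intensity, which is why the lifetime framing above matters more than any single number.
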
Key Insight

The research community has focused on training costs, but this study shows that inference (actual usage) quickly dominates the total carbon footprint. With billions of daily AI queries worldwide, the inference carbon budget is enormous and growing.
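
A similar rough calculation shows why the aggregate budget is so large. The per-query energy and the global query volume below are explicit assumptions chosen only for illustration, not values from the paper.

    WH_PER_QUERY = 4.0        # assumed energy per LLM query in Wh (order of magnitude only)
    QUERIES_PER_DAY = 1e9     # assumed global query volume: one billion per day
    GRID_G_PER_KWH = 400      # assumed average grid carbon intensity (gCO2eq/kWh)

    daily_kwh = WH_PER_QUERY * QUERIES_PER_DAY / 1000     # Wh -> kWh
    daily_tco2eq = daily_kwh * GRID_G_PER_KWH / 1e6       # grams -> tonnes

    print(f"~{daily_kwh / 1e6:.1f} GWh and ~{daily_tco2eq:,.0f} tonnes CO2eq per day")
    # Under these assumptions, worldwide inference emits BLOOM's entire
    # training footprint (24.7 t CO2eq) roughly every 20 minutes.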

Source

Luccioni, A. S., Viguier, S., & Ligozat, A.-L. (2023). Estimating the Carbon Footprint of BLOOM, a 176B Parameter Language Model. Journal of Machine Learning Research, 24(253), 1-15.
