Cost of Compute
Energy. Water. Carbon. Labor. Every query, every model, every company carries costs that nobody puts on the label.
This index uses the best available public data. Where companies don't disclose, we estimate using published methodologies and mark confidence levels. Companies can submit corrections via our corrections email or contact page.
10 queries/day × 365 days × ~3 gCO2e each = 0.07% of the average American's annual footprint
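The headline arithmetic above can be checked in a few lines. This is a sketch, not the index's methodology: the ~3 gCO2e-per-query figure and the ~16-tonne average US per-capita footprint are assumptions used here for illustration.

```python
# Sketch of the per-user footprint arithmetic (assumed inputs).
QUERIES_PER_DAY = 10
DAYS = 365
G_PER_QUERY = 3                      # ~3 gCO2e per query (assumption)
US_ANNUAL_FOOTPRINT_G = 16_000_000   # ~16 tCO2e in grams (assumption)

annual_ai_g = QUERIES_PER_DAY * DAYS * G_PER_QUERY   # 10,950 g ≈ 11 kg
share = annual_ai_g / US_ANNUAL_FOOTPRINT_G
print(f"{annual_ai_g / 1000:.1f} kgCO2e/year = {share:.2%} of the average footprint")
# → 10.9 kgCO2e/year = 0.07% of the average footprint
```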
Training energy spans two orders of magnitude: from ~1,287 MWh for GPT-3 (the figure reported in Patterson et al. 2021) to an estimated ~175,000+ MWh for Grok 3.
45,000x your annual AI use.
BP invented the personal carbon footprint to shift blame onto individuals. The AI industry is running the same playbook.
Sources: Hannah Ritchie, Sustainability by Numbers; Patterson et al. 2021
Who's Doing It Right
20x less carbon than GPT-3 at similar scale, trained on the French nuclear grid. First full lifecycle analysis of a large language model.
Researcher: Sasha Luccioni, Hugging Face
33x energy reduction per query in 12 months. Only major company to publish per-query methodology with energy, carbon, and water metrics.
10-50x more efficient than frontier models. 3.8B parameters outperforming much larger models on key benchmarks.
First ISO-compliant lifecycle assessment for an AI model. 1.14 gCO2e per 400-token response. Partnered with Carbone 4 and ADEME.
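Mistral's published figure of 1.14 gCO2e per 400-token response can be scaled to other usage levels. The heavy-usage scenario below (50 such responses a day) is an illustrative assumption, not a figure from the assessment.

```python
# Sketch: scaling Mistral's published per-response figure.
G_PER_RESPONSE = 1.14        # gCO2e per 400-token response (published)
TOKENS_PER_RESPONSE = 400

g_per_token = G_PER_RESPONSE / TOKENS_PER_RESPONSE   # ≈ 0.00285 gCO2e/token
responses_per_day = 50                               # assumed heavy usage
annual_kg = responses_per_day * G_PER_RESPONSE * 365 / 1000
print(f"{g_per_token * 1000:.2f} mgCO2e/token; heavy use ≈ {annual_kg:.1f} kgCO2e/year")
# → 2.85 mgCO2e/token; heavy use ≈ 20.8 kgCO2e/year
```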
166+ models rated for energy efficiency. Referenced by EU AI Act Code of Practice.
Azure Clean Deployment Guide
Reality check: “100% renewable” = annual accounting, not real-time clean power.
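The gap between annual accounting and real-time clean power is easy to see with toy numbers: a flat data-center load and a solar supply that only exists at midday can balance on paper over the year while leaving two-thirds of hours running on grid power. The numbers below are illustrative, not from any company's disclosure.

```python
# Toy model: annual renewable accounting vs. hourly matching.
load = [100] * 24                       # MWh demanded each hour (flat)
solar = [0] * 8 + [300] * 8 + [0] * 8   # MWh generated, daytime only

annual_match = sum(solar) / sum(load)   # 2400 / 2400 → "100% renewable"
hourly_clean = sum(min(l, s) for l, s in zip(load, solar))
hourly_match = hourly_clean / sum(load) # only daytime hours are covered
print(f"annual accounting: {annual_match:.0%}, hourly matching: {hourly_match:.0%}")
# → annual accounting: 100%, hourly matching: 33%
```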
Disclaimer
Assessments reflect publicly available information and the published methodology of the Behind the AI Research Team. Grades represent analytical assessments derived from the published scoring framework, not statements of fact about internal company operations. If you believe any claim is inaccurate, contact corrections@behindtheai.org with the specific claim and your evidence.