Let me guess what happened in your last meeting about AI.
Someone raised their hand: “But what about the environmental impact?”
Valid concern. Responsible question.
But here’s the problem: most people are working with last year’s assumptions.
The myth we all believed
For years, the narrative was clear:
- Every ChatGPT query chews through server farms.
- Every image generated melts polar ice caps.
- Headlines screamed about AI’s “insatiable” energy appetite.
Much of that was based on training costs or early inference estimates – not real production data.
What actually happens today
Google recently published the first large-scale, measured analysis of AI inference: billions of Gemini text prompts, served in live production – not models in a lab.
Here’s what they found for a median text prompt:
- Energy: 0.24 Wh
- Carbon: 0.03 g CO₂e
- Water: 0.26 mL (≈ five drops)
In CEO terms: one prompt = the energy of watching TV for 9 seconds.
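For the math-minded, that TV comparison is easy to sanity-check. A minimal sketch, assuming a typical TV draws roughly 100 W (my estimate; the article doesn't state the wattage it used):

```python
# Back-of-envelope check of the "TV for 9 seconds" comparison.
ENERGY_PER_PROMPT_WH = 0.24   # median Gemini text prompt (Google's figure)
TV_POWER_W = 100              # assumed typical TV power draw (not from the article)

# Energy / power = time in hours; convert to seconds.
tv_seconds = ENERGY_PER_PROMPT_WH / TV_POWER_W * 3600
print(f"One prompt ≈ {tv_seconds:.1f} s of TV")  # ≈ 8.6 s, close to the quoted 9 s
```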

And here’s the kicker: in the past 12 months alone —
- Energy per query fell 33×
- Emissions per query fell 44×
- Water per query fell 19×
That’s not incremental. That’s exponential.
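To see why "exponential" is the right word: a 33× drop in twelve months implies each month's queries use roughly a quarter less energy than the month before. A sketch, assuming a constant monthly improvement rate (the article only reports the 12-month totals):

```python
# What a 33x annual efficiency gain implies month to month,
# assuming the improvement compounds evenly (my simplification).
annual_factor = 33
monthly_factor = annual_factor ** (1 / 12)          # ≈ 1.34x per month
monthly_drop_pct = (1 - 1 / monthly_factor) * 100   # ≈ 25% less energy each month
print(f"Energy per query falls ~{monthly_drop_pct:.0f}% every month")
```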
Objections your team will raise, and why they don’t stick
- “But that’s just text, not images or video.”
  True – image/video prompts cost more. But the same efficiency curve applies, and most enterprise AI use cases today are text-based.
- “Billions of queries still add up.”
  Also true. But per-query efficiency is falling faster than usage is growing. Net environmental impact is lower today than a year ago.
- “Google’s using market-based carbon accounting.”
  They are – but even with location-based grids, per-query numbers are still orders of magnitude below the old estimates.
- “What about embodied emissions (chips, data centers)?”
  Important point. But those are fixed costs. The more queries per chip, the lower the per-query share – and efficiency gains mean fewer chips are needed overall.
- “Isn’t this just Google PR?”
  Skepticism is healthy. But this is the first dataset based on real production inference at global scale. Previous studies were projections. Given how hot this topic is, expect more measured studies from other players soon.
Reality check: Compared to what?
The danger isn’t that AI consumes 0.24 Wh per query.
It’s that we ignore the inefficiencies AI can replace. Without AI, getting the same information might have meant visiting a doctor or expert in person, absorbing hours of documentaries, or slogging through 47 browser tabs of manual research.
- A single in-person meeting trip = more carbon than millions of queries.
- One hour of TV in the average U.S. household = energy of 400 AI queries.
- Keeping 47 browser tabs open = more energy than 100 AI queries per hour.
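The TV comparison above is rough, but the arithmetic is easy to verify. A quick sketch, again assuming a ~100 W household TV draw (my estimate; the article doesn't specify one):

```python
# Rough check of "one hour of TV = energy of 400 AI queries".
ENERGY_PER_QUERY_WH = 0.24   # median Gemini text prompt (Google's figure)
TV_HOUR_WH = 100 * 1         # assumed 100 W running for one hour = 100 Wh

queries_per_tv_hour = TV_HOUR_WH / ENERGY_PER_QUERY_WH
print(round(queries_per_tv_hour))  # ≈ 417 – the same ballpark as "400 queries"
```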
But isn’t total demand still exploding?
Yes, 0.24 Wh is small, but at Google scale billions of queries could still add up to gigawatt-hours. Isn’t that still environmentally significant?
Yes, but again – compared to what previous activity? The manual, legacy processes often waste more energy than the AI inference that could replace them.
If it gets too cheap, people will just use more of it. That rebound effect could wipe out the sustainability gains – it’s called the Jevons Paradox. But three things matter here:
(1) Historical precedent: Every major technology (lighting, transport, computing) faced rebound effects. Yet efficiency gains were still the main driver of long-term decarbonization.
(2) Counterfactuals: Even with higher volumes, AI often replaces far less efficient processes (travel, legacy IT, human time). Eliminating those wastes creates a net environmental gain.
(3) Governance: Policy frameworks and corporate ESG targets act as guardrails, ensuring that efficiency improvements don’t spiral into unchecked consumption.
The question that matters
It’s not “Does AI use energy?” (Everything does.)
It’s not even “Is AI getting more efficient?” (It is – by orders of magnitude.)
The real question:
What inefficiencies can AI eliminate that waste 100×, 1,000×, even 100,000× more energy than AI ever consumes?
Your team’s environmental concerns are admirable. But channel them toward the right enemy: trillions of watt-hours lost to outdated systems AI can replace.
Reducing inefficiencies is exactly what we’ll dive into in a few complimentary AI Masterclasses I’m considering hosting, based on my book, Do More With Less: The AI Playbook for Amplifying Talent & Output. These sessions are designed to help teams spot their first high-impact use cases and deliver quick wins with the tools they already have (e.g., Copilot and Excel).
If you’d like to join, or know someone who should, here’s the form to let us know: https://docs.google.com/forms/d/e/1FAIpQLSf8ZbGGY1ZdaiUxxCOTTS3CIFmkO9f-oSzU0cfYLjO_cQn7dw/viewform
Next time someone brings up AI’s footprint in a meeting, ask: “Compared to what?”