Will Mendoza Torrico
Co-founder i90.io
If you work in a sustainability-related field, you’ve probably heard contrasting claims about the true influence of artificial intelligence (AI): some label it an environmental menace, while others chalk its impact up to a rounding error. The truth sits in between, closer to the way we already think about big loads like buildings or industrial sites. A single AI query demands only a small amount of energy, but the sum of billions of queries running across thousands of facilities can be substantial. In other words, AI’s footprint is mostly a scale problem. How we measure that scale, and how quickly we clean up the electricity supplying it, will determine whether AI supports or slows climate goals.
Note on sources: At the time of writing, Google is one of the few large AI providers that has published detailed, per-prompt numbers for energy, emissions, and water use in a production system. I use their data here as a worked example, not as an endorsement or a final “gold standard.” As other providers and independent teams publish similarly detailed work, they should be part of the same comparison set.
The good news: when the computational work is measured carefully inside a real, large-scale system, the per-use impact can be much smaller than many headlines suggest1. Google’s new production study2 finds that the median text prompt in its Gemini apps uses about 0.24 watt-hours (Wh) of electricity, produces ~0.03 grams of CO₂, and consumes ~0.26 milliliters of water (roughly five drops). The authors even translate the electricity into something intuitive: about nine seconds of watching TV. Third-party tests land in the same ballpark. Ethan Mollick, Professor and Co-Director of Generative AI Labs at the University of Pennsylvania’s Wharton School, notes3 that a typical GPT-4o prompt uses around 0.34 Wh, which he compares to roughly 16 seconds of HD Netflix (still very small per use compared with the TIME article’s1 estimate, which likened a prompt’s energy to an LED bulb running for an hour), and a reminder that numbers vary by model, setup, and conditions.
The bad news: total data-center electricity use is rising fast as AI spreads. US data-center electricity use grew from 58 terawatt-hours (TWh) in 2014 to 176 TWh in 2023, equivalent to 4.4 percent of total US electricity, and analysts project 325–580 TWh by 2028 (up to 12 percent of total US electricity)4, depending on buildout and efficiency; these numbers are large enough to move national demand. The International Energy Agency likewise warns that AI is set to drive a surge in global data-center electricity needs this decade5; US electric power demand overall is now expected to hit record highs in 2025–26 in part because of data centers6.
Put simply: the electrical consumption of one AI prompt is tiny, but the combined impact of billions of them is not. Treat AI like any growing load: measure it, and operate the data centers that host it as cleanly as possible, with efficient buildings, conservative water habits, and low-carbon electricity on the local grid.
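To make the scale problem concrete, here is a back-of-envelope calculation. The 0.24 Wh figure is Google’s published median for a Gemini text prompt2; the daily query volume is a purely illustrative assumption, not a reported number.

```python
# Back-of-envelope scaling: tiny per-prompt energy, large fleet totals.
# WH_PER_PROMPT is Google's published median for a Gemini text prompt;
# PROMPTS_PER_DAY is an illustrative assumption, not a disclosed figure.

WH_PER_PROMPT = 0.24                # median energy per text prompt (Wh)
PROMPTS_PER_DAY = 2_000_000_000     # assumed volume, for illustration only

daily_wh = WH_PER_PROMPT * PROMPTS_PER_DAY
annual_twh = daily_wh * 365 / 1e12  # Wh -> TWh

print(f"Per prompt:  {WH_PER_PROMPT} Wh (about nine seconds of TV)")
print(f"Fleet total: {annual_twh:.2f} TWh per year")  # about 0.18 TWh
```

Even at an assumed two billion prompts a day, inference alone lands well below the 2023 US data-center total of 176 TWh; the point is that per-use and fleet-level numbers answer different questions, and both are needed.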
Make the numbers useful (and comparable)
Much of the confusion comes from how AI’s impact is measured. Different studies use different boundaries: some count only the Large Language Model (LLM) itself, while others include cooling, buildings, or upstream power. Google’s self-published paper on the environmental impact of AI2 is one of the first detailed attempts to measure AI’s energy, emissions, and water use in a live, large-scale system. It defines a clear (if company-designed) set of boundaries, publishes concrete per-prompt numbers for Gemini, and encourages others to disclose in similar terms so we can move toward more apples-to-apples comparisons. It has also drawn criticism from independent experts for what it leaves out, and thus should be read as a useful starting point, not a gold standard.
Two layers of numbers matter:
- Per-use intensity: how much electricity a single prompt uses, the emissions associated with that electricity on the grid where it runs, and the water needed to keep the hardware cool. Packaging those into a simple per-query score makes different models and vendors easier to compare and pushes them to keep improving efficiency2.
- Fleet/site performance: how a whole data-center site performs. Hourly carbon-free energy score, location-based emissions, power usage effectiveness, water use, and whether waste heat is reused all factor into this layer. The EU is already requiring data centers to report standardized Key Performance Indicators (KPIs) to a public database on a set schedule. In the US, by contrast, there is no single, mandatory national database yet; federal efforts remain mostly voluntary (DOE studies, EPA efficiency labels, utility reporting) rather than a unified reporting scheme, so buyers often have to ask vendors directly for the same metrics.
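As a sketch of how a buyer might track both layers, the snippet below tabulates vendor disclosures and flags missing metrics. Every vendor name, figure, and field name here is a hypothetical placeholder, not real disclosure data.

```python
# Hypothetical vendor-disclosure checklist covering both layers of metrics.
# All names, field names, and numbers are illustrative assumptions.

REQUIRED_PER_USE = {"wh_per_prompt", "gco2_per_prompt", "ml_water_per_prompt"}
REQUIRED_SITE = {"hourly_cfe_pct", "location_gco2_per_kwh", "wue_l_per_kwh"}

vendors = {
    "Vendor A": {"wh_per_prompt": 0.24, "gco2_per_prompt": 0.03,
                 "ml_water_per_prompt": 0.26, "hourly_cfe_pct": 66.0,
                 "location_gco2_per_kwh": 80.0, "wue_l_per_kwh": 1.1},
    "Vendor B": {"wh_per_prompt": 0.40, "hourly_cfe_pct": 45.0},  # incomplete
}

def red_flags(disclosure):
    """Return the required metrics a vendor has not disclosed."""
    missing_per_use = REQUIRED_PER_USE - disclosure.keys()
    missing_site = REQUIRED_SITE - disclosure.keys()
    return missing_per_use | missing_site

for name, disclosure in vendors.items():
    gaps = red_flags(disclosure)
    status = "complete" if not gaps else f"missing {sorted(gaps)}"
    print(f"{name}: {status}")
```

The design choice mirrors the procurement rule later in this piece: a vendor that cannot fill in both layers shows up immediately as a gap, not as a plausible-looking partial answer.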
This is a practical lesson from sustainability writ large: metrics shape markets. When you set the buying rule, as with EPEAT (Electronic Product Environmental Assessment Tool) ratings for laptops, LEED (Leadership in Energy and Environmental Design) points for buildings, EPD (Environmental Product Declaration) caps for concrete, or hour-by-hour CFE (Carbon-Free Energy) for data centers, you don’t just measure; you steer supply. The metric you buy against is the market you create.
Why “where and when” the power comes from matters
Many organizations (including tech companies) have matched annual electricity use with renewable purchases. That’s helpful, but it can miss the hourly reality of the grid: you might still rely on fossil power at night or during lulls in wind and sun. A growing number of buyers are shifting to 24/7 carbon-free energy, aiming to match each hour of consumption with a local, carbon-free supply. Think of it as moving from “annual offsets” to “real-time clean operations.” Google 9 and others have published roadmaps for this shift.
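The gap between annual matching and hourly matching is easy to see in a toy calculation. One common way to express an hourly carbon-free energy (CFE) score is to credit, in each hour, only as much clean supply as was actually consumed; the hourly numbers below are made up for illustration.

```python
# Annual renewable matching vs. an hourly CFE score, with made-up data.

load  = [10, 10, 10, 10]   # MWh consumed in each hour
clean = [ 0, 25,  5, 10]   # MWh of local carbon-free supply in each hour

# Annual-style matching compares totals only: 40 MWh clean vs 40 MWh load.
annual_match = min(sum(clean), sum(load)) / sum(load)

# Hourly CFE: a surplus in one hour cannot cover a deficit in another hour.
hourly_cfe = sum(min(c, l) for c, l in zip(clean, load)) / sum(load)

print(f"Annual matching: {annual_match:.1%}")  # 100.0%
print(f"Hourly CFE:      {hourly_cfe:.1%}")    # 62.5%
```

Same megawatt-hours on paper, very different physical reality: the midday surplus does nothing for the fossil-powered first hour, which is exactly why hourly scores are the metric worth asking for.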
For sustainability leaders, the takeaway is straightforward: ask your AI and cloud vendors for their hourly carbon-free energy score, not just annual renewable claims. If they’re on a 24/7 path, they’ll have an answer.
The US/EU contrast (and what to borrow)
The US is in a sprint: data-center demand is accelerating while transmission and interconnection queues slow the addition of new clean capacity. Federal analysts now explicitly flag data centers as a driver of near-term demand growth6. Europe, meanwhile, is leaning into disclosure and benchmarking. Under the recast Energy Efficiency Directive, operators must report energy, water, renewable consumption, and heat-reuse metrics to an EU database, creating a public yardstick that policymakers and buyers can use 8.
We don’t need to copy Europe wholesale, but US buyers can act as if these rules apply: ask for the same KPIs, disclosed the same way. Markets often move faster than policy.
What actually works
From a sustainability angle, the mitigation toolbox is familiar:
Use less energy for the same service. Software tuning and smarter operations have cut the energy used for a typical prompt dramatically in the past year in at least one large system. In simpler terms, better settings and better scheduling mean fewer kilowatt-hours per task.
Run efficient buildings. Data centers can be designed and operated to use electricity and water more carefully by tracking Power Usage Effectiveness (PUE) and Water Usage Effectiveness (WUE) and reusing waste heat where feasible. The EU’s reporting rules will make leaders and laggards visible on these basics.
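For readers new to these metrics: PUE is total facility energy divided by IT-equipment energy (1.0 would mean zero overhead), and WUE is liters of water consumed per IT kilowatt-hour. A quick illustration with made-up facility numbers:

```python
# Illustrative PUE/WUE arithmetic; all facility figures are made up.

total_facility_kwh = 1_090_000   # everything: IT, cooling, lighting, losses
it_equipment_kwh   = 1_000_000   # servers, storage, and network only
site_water_liters  = 1_100_000   # water consumed, mostly for cooling

pue = total_facility_kwh / it_equipment_kwh   # 1.0 is the theoretical floor
wue = site_water_liters / it_equipment_kwh    # liters per IT kWh

print(f"PUE: {pue:.2f}")         # 1.09: 9% overhead beyond the IT load
print(f"WUE: {wue:.2f} L/kWh")
```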
Shift flexible work to cleaner hours. Many AI jobs don’t need to run instantly. If you give them a window (say, a few hours), software can shift the job to hours when the grid is cleaner, cutting emissions without new hardware. This “carbon-aware” approach is already in use.
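A minimal sketch of the idea, assuming a made-up hourly carbon-intensity forecast (real systems would pull forecasts from a grid-data service): a deferrable job simply starts at the cleanest hour inside its flex window.

```python
# Carbon-aware scheduling sketch: pick the forecast hour with the lowest
# grid carbon intensity inside a job's flex window.
# The forecast values below are invented for illustration.

forecast_gco2_per_kwh = {9: 420, 10: 390, 11: 310, 12: 220, 13: 250, 14: 380}

def pick_start_hour(window_start, window_end, forecast):
    """Choose the cleanest start hour within [window_start, window_end]."""
    candidates = {h: g for h, g in forecast.items()
                  if window_start <= h <= window_end}
    return min(candidates, key=candidates.get)

# A batch job that can run any time between 09:00 and 14:00:
print(pick_start_hour(9, 14, forecast_gco2_per_kwh))  # 12 (220 g/kWh)
```

In this toy forecast, moving the job from 09:00 to 12:00 roughly halves its grid carbon intensity with zero new hardware, which is the whole appeal of the approach.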
Buy truly clean power where you operate. Moving toward 24/7 carbon-free energy aligns consumption with real clean supply on the same grids and at the same time, reduces the gap between paper claims and physical emissions.
None of these requires deep expertise in-house to implement, but they do require vendors to show their work.
Practical recommendations for sustainability teams
1) Ask for two views of impact 12. Request per-use numbers (e.g., electricity, emissions, water per-prompt) and site-level KPIs (e.g., hourly carbon-free score, location-based emissions, water use, and heat reuse). If a vendor can’t provide both, treat that as a red flag.
2) Prefer providers on a 24/7 path 9. Annual renewable matching is yesterday’s bar. Look for hourly, local, carbon-free targets and published progress. If you buy AI indirectly through a software vendor, ask their cloud provider for this information.
3) Build “flex windows” into your workflows 13. For non-urgent tasks (e.g., analytics, batch jobs, large document runs), allow a few hours of flexibility so the system can catch cleaner power. You’ll cut emissions at effectively zero cost.
4) Mind water and siting. Encourage transparent reporting of water use and local engagement on cooling choices. The EU’s database will make water and heat-reuse metrics easier to compare; US buyers can demand the same disclosures.
5) Keep the big picture in view 6. Even with efficiency gains, overall load is growing quickly. National demand is rising to record highs, partly due to data centers, so planning for more transmission and firm, clean capacity is essential. Track these grid-level realities when setting your own targets and timelines.
Bottom line
We don’t have to choose between claiming “AI is wasteful” or “AI doesn’t matter.” Both statements can be true in different contexts. The environmental impact of one well-served prompt can be tiny. The impact of billions of prompts, added to a grid that isn’t yet clean enough, can be huge. An appropriate response should be familiar to anyone in sustainability: measure clearly, purchase wisely, and align operations with clean energy in real time.
The next two to three years, when fast AI growth meets slow grid build-outs, are pivotal. If buyers insist on transparent metrics, if operators keep improving efficiency, and if more of the load runs on hour-by-hour carbon-free power, AI can scale without locking in emissions. The tools exist. Now, it’s about using them at the scale of the problem.
Works Cited
1. Shah, S. (2025, July 2). Some AI prompts can cause 50 times more CO2 emissions than others. TIME. https://time.com/7295844/climate-emissions-impact-ai-prompts/
2. Google Cloud. (2025). Measuring the environmental impact of AI inference. Google Cloud. https://cloud.google.com/blog/products/infrastructure/measuring-the-environmental-impact-of-ai-inference
3. Mollick, E. (2025). We now have audited data on water consumption for AI... [Bluesky post]. Bluesky. https://bsky.app/profile/emollick.bsky.social/post/3luljwvstrs2d
4. U.S. Department of Energy. (2025). DOE releases new report evaluating increase in electricity demand from data centers. U.S. Department of Energy. https://www.energy.gov/articles/doe-releases-new-report-evaluating-increase-electricity-demand-data-centers
5. International Energy Agency. (2025). Energy and AI. International Energy Agency. https://www.iea.org/reports/energy-and-ai
6. DiSavino, S. (2025). Data center demand to push US power use to record highs in 2025, '26, EIA says. Reuters. https://www.reuters.com/business/energy/data-center-demand-push-us-power-use-record-highs-2025-26-eia-says-2025-06-10/
7. European Commission. (2024). Energy efficiency directive. European Commission. https://energy.ec.europa.eu/topics/energy-efficiency/energy-efficiency-targets-directive-and-rules/energy-efficiency-directive_en
8. European Commission. (2024). Commission adopts EU-wide scheme for rating sustainability of data centres. European Commission. https://energy.ec.europa.eu/news/commission-adopts-eu-wide-scheme-rating-sustainability-data-centres-2024-03-15_en
9. Google. (2020). 24/7 by 2030: Realizing a carbon-free future. Google Sustainability. https://sustainability.google/reports/247-carbon-free-energy/
10. Google. (2020). Using location to reduce our computing carbon footprint. Google. https://blog.google/outreach-initiatives/sustainability/carbon-aware-computing-location/
11. Google Cloud. (2025). Google’s approach to carbon-aware data center. Google Cloud. https://cloud.google.com/blog/topics/sustainability/googles-approach-to-carbon-aware-data-center
12. Elsworth, C., Huang, K., Patterson, D., Schneider, I., Sedivy, R., Goodman, S., Townsend, B., Ranganathan, P., Dean, J., Vahdat, A., Gomes, B., & Manyika, J. (2025). Measuring the environmental impact of delivering AI at Google scale [White paper]. Google. https://services.google.com/fh/files/misc/measuring_the_environmental_impact_of_delivering_ai_at_google_scale.pdf
13. Golin, C., & Swezey, D. (2022, April 14). A policy roadmap for 24/7 carbon-free energy. Google Cloud. https://cloud.google.com/blog/topics/sustainability/a-policy-roadmap-for-achieving-247-carbon-free-energy