Google Lifts the Lid on Gemini AI’s Energy Footprint

Google has unveiled a detailed technical report outlining the energy consumption of its Gemini AI models per query, marking a major stride towards transparency in the AI sector. The report reveals that a median Gemini prompt consumes approximately 0.24 watt-hours of electricity, comparable to a single second of microwave operation. Beyond electricity, the report also offers estimates for water usage and carbon emissions associated with Gemini’s operation.
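For readers who want to check the microwave comparison, here is a quick back-of-envelope calculation in Python. Only the 0.24 watt-hour figure comes from Google's report; the 1,000-watt microwave rating is an assumption for illustration.

```python
# Back-of-envelope check: how long would a microwave need to run to match
# the energy of one median Gemini prompt?
PROMPT_ENERGY_WH = 0.24       # median energy per prompt, from Google's report
MICROWAVE_POWER_W = 1000      # assumed microwave power draw (illustrative)

prompt_energy_joules = PROMPT_ENERGY_WH * 3600            # 1 Wh = 3,600 J
microwave_seconds = prompt_energy_joules / MICROWAVE_POWER_W

print(f"One median prompt ≈ {prompt_energy_joules:.0f} J "
      f"≈ {microwave_seconds:.2f} s of a {MICROWAVE_POWER_W} W microwave")
# Prints: One median prompt ≈ 864 J ≈ 0.86 s of a 1000 W microwave
```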

The comprehensive analysis, championed by Google’s Chief Scientist Jeff Dean, takes into account the energy demands of AI chips, supporting infrastructure, and data center operations. AI chips are the biggest consumer, drawing 58% of the total electricity used, while supporting hardware accounts for 25%, backup equipment for 10%, and data center overhead for 8% (the published shares sum to 101% due to rounding).
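Applying those percentages to the 0.24 watt-hour median gives a rough per-component split. The shares are Google's; spreading them over the median energy figure is illustrative arithmetic, not a breakdown published in the report.

```python
# Rough split of the 0.24 Wh median prompt across the report's categories.
PROMPT_ENERGY_WH = 0.24

breakdown = {
    "AI chips": 0.58,
    "Supporting hardware": 0.25,
    "Backup equipment": 0.10,
    "Data center overhead": 0.08,
}

for component, share in breakdown.items():
    print(f"{component:<22} {share:>4.0%}  ≈ {PROMPT_ENERGY_WH * share:.3f} Wh")
# Note: the published shares sum to 101% because of rounding.
```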

Notably, Google highlights a significant reduction in energy consumption over time, driven by model advances and software optimizations: the median Gemini prompt in May 2025 used 33 times less energy than it did in May 2024. Associated greenhouse gas emissions are estimated at just 0.03 grams of carbon dioxide per prompt, a figure based on Google’s purchases of electricity from clean energy projects, and water consumption is estimated at 0.26 milliliters per prompt.
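Taken together, the published figures imply a couple of derived numbers worth noting. The sketch below computes the carbon intensity implied by the energy and emissions estimates, and the rough May 2024 median that a 33-fold reduction would suggest, assuming the 0.24 watt-hour figure refers to the May 2025 median; both are our arithmetic, not values stated in the report.

```python
# Derived figures implied by the report's numbers (illustrative arithmetic):
# effective carbon intensity per unit of electricity, and the approximate
# May 2024 median energy per prompt implied by a 33x reduction.
ENERGY_WH = 0.24         # median energy per prompt (May 2025, assumed)
CO2_GRAMS = 0.03         # market-based emissions per prompt
REDUCTION_FACTOR = 33    # May 2024 -> May 2025 energy reduction

implied_intensity_g_per_kwh = CO2_GRAMS / (ENERGY_WH / 1000)
may_2024_estimate_wh = ENERGY_WH * REDUCTION_FACTOR

print(f"Implied carbon intensity: {implied_intensity_g_per_kwh:.0f} g CO2e/kWh")
print(f"Implied May 2024 median:  {may_2024_estimate_wh:.1f} Wh per prompt")
# Prints roughly 125 g CO2e/kWh and 7.9 Wh per prompt.
```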

The report has garnered praise from experts such as Mosharaf Chowdhury of the University of Michigan and Sasha Luccioni of Hugging Face, who see it as a valuable contribution to understanding AI’s energy impact. However, some key figures, such as the total number of daily Gemini queries, are not disclosed. Luccioni is calling for a standardized AI energy score, akin to the Energy Star rating, to make comparisons easier and encourage further improvements.