The burgeoning field of artificial intelligence, fueled by models like ChatGPT and Gemini, is facing increased scrutiny over its energy consumption and environmental impact. For a long time, AI developers disclosed little about the energy cost of running their models; OpenAI and Google are now shedding some light on power usage per query.
OpenAI’s Sam Altman has said that a typical ChatGPT query consumes 0.34 watt-hours, while Google estimates 0.24 watt-hours for a Gemini interaction. These figures, however, cover only a narrow slice of AI activity, excluding energy-intensive workloads such as image and video generation. A full accounting of AI’s energy footprint requires application-specific data before its true environmental implications can be assessed.
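To put those per-query figures in perspective, a quick back-of-envelope scaling helps. The per-query energies below come from the article; the daily query volume is a purely hypothetical assumption for illustration, not a reported number.

```python
# Per-query energy figures cited in the article (watt-hours per query).
WH_PER_QUERY = {"ChatGPT": 0.34, "Gemini": 0.24}

# Hypothetical query volume for illustration only -- NOT a reported figure.
HYPOTHETICAL_QUERIES_PER_DAY = 1_000_000_000

for model, wh in WH_PER_QUERY.items():
    # Wh -> kWh -> MWh
    mwh_per_day = wh * HYPOTHETICAL_QUERIES_PER_DAY / 1000 / 1000
    print(f"{model}: ~{mwh_per_day:,.0f} MWh/day at 1B queries/day")
```

At a billion queries a day, ChatGPT's figure works out to roughly 340 MWh daily and Gemini's to roughly 240 MWh, which illustrates why aggregate demand, not the per-query number, is what drives the data-center buildout.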
While some companies pursue sustainability goals by using AI to design more efficient systems or to aid mineral discovery, transparency about any efficiency gains AI actually delivers is lacking. The industry acknowledges that AI's impact on the energy system is directly tied to its future growth: the level of AI demand will determine whether the shift in the energy system is lasting. Scrutiny of AI will only intensify as more data centers are built to house these models.