Google boasts that a single Gemini prompt uses roughly the same energy as a basic search - but that doesn't paint the full picture
Briefly

"One of the challenges of generative AI is how much energy is required to power the data centers needed to run a model to answer a question, generate some text, and so on. Unpicking that isn't easy, but early reports suggested that using OpenAI's ChatGPT for search was ten-times more energy intensive than using plain-old Google, though CEO Sam Altman has since confirmed it's about 0.034 watt-hours (Wh) of energy. That said, it's an easier question for Google researchers to answer than their academic peers, what with the access to the company's own systems."
"All told, the average text prompt in Gemini Apps uses 0.24 watt-hours of energy - a figure the company is keen to point out is "substantially lower than many public estimates". Elsewhere, a single prompt emits 0.03 grams of carbon dioxide equivalent and consumes 0.26 milliliters of water, about five drops. This, researchers said in a separate blog post, is equivalent to "watching TV for less than nine seconds"."
Google's measurement covers the full serving stack for Gemini Apps: active AI accelerator power, host system energy, idle machines held in reserve for reliability, and data center overhead. On that basis, the average text prompt uses 0.24 watt-hours of energy, emits 0.03 grams of carbon dioxide equivalent, and consumes 0.26 milliliters of water, roughly five drops, an amount Google compares to watching TV for less than nine seconds. The breakdown attributes about 58% of that energy to the AI chips, 25% to the server CPU and memory, 10% to idle backup machines, and 8% to cooling and power conversion. Measured this way, the per-prompt energy is broadly in line with other public estimates and with the energy of a basic Google search.
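For a sense of scale, the minimal Python sketch below multiplies the 0.24 Wh per-prompt figure by the published component shares and by a hypothetical prompt count. The component-level watt-hour values are illustrative arithmetic only, the million-prompt total is an assumed example, and the published shares sum to slightly over 100% because of rounding.

```python
# Illustrative arithmetic using the per-prompt figures quoted above.
# The 0.24 Wh total and the percentage shares come from Google's report;
# the per-component watt-hours are just that total multiplied out, and
# the published shares sum to ~101% because of rounding.

TOTAL_WH_PER_PROMPT = 0.24

shares = {
    "AI accelerator chips": 0.58,
    "server CPU and memory": 0.25,
    "idle backup machines": 0.10,
    "cooling and power conversion": 0.08,
}

for component, share in shares.items():
    print(f"{component}: ~{TOTAL_WH_PER_PROMPT * share:.3f} Wh per prompt")

# Scaling up to a hypothetical one million prompts, expressed in kWh.
prompts = 1_000_000
total_kwh = TOTAL_WH_PER_PROMPT * prompts / 1000
print(f"{prompts:,} prompts ≈ {total_kwh:.0f} kWh")
```

Run as written, this prints roughly 0.139 Wh for the accelerators alone and about 240 kWh for a million prompts; the point is simply that the headline 0.24 Wh is an average across the whole serving stack, not the chips by themselves.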
Read at IT Pro