ChatGPT may not be as power-hungry as once assumed | TechCrunch
Briefly

A new analysis by Epoch AI suggests that the energy consumption of ChatGPT queries is much lower than the commonly cited figure of 3 watt-hours per query. Assessing OpenAI's GPT-4o model, Epoch estimated the average consumption at about 0.3 watt-hours, comparable to many household devices. The conversation around AI's energy use is pivotal as AI companies expand their infrastructure amid concerns over environmental impact and resource depletion. Epoch's findings challenge outdated assumptions, highlight the efficiency of current AI models, and prompt a reevaluation of their perceived energy demands.
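A rough back-of-envelope sketch of what the two per-query figures imply at scale. The 0.3 Wh and 3 Wh values are from the Epoch AI estimate and the older commonly cited figure described above; the daily query volume is a purely hypothetical number chosen for illustration, not a reported statistic.

```python
# Per-query energy figures from the analysis above.
EPOCH_WH_PER_QUERY = 0.3   # Epoch AI's estimate for an average GPT-4o query
OLD_WH_PER_QUERY = 3.0     # the commonly cited older figure

# Hypothetical daily query volume, for illustration only.
queries_per_day = 1_000_000_000

daily_kwh = EPOCH_WH_PER_QUERY * queries_per_day / 1000
old_daily_kwh = OLD_WH_PER_QUERY * queries_per_day / 1000

print(f"Daily use under Epoch's estimate: {daily_kwh:,.0f} kWh")
print(f"Daily use under the older figure: {old_daily_kwh:,.0f} kWh")
print(f"The revised estimate is {old_daily_kwh / daily_kwh:.0f}x lower")
```

Whatever the real query volume, the ratio between the two scenarios stays fixed at 10x, which is the heart of Epoch's correction.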
"The energy use is really not a big deal compared to using normal appliances or heating or cooling your home, or driving a car," said the author of the Epoch AI analysis. "I've seen a lot of public discourse that correctly recognized that AI was going to consume a lot of energy in the coming years, but didn't really accurately describe the energy that was going to AI today."
Read at TechCrunch