Altman: You think AI is inefficient? Try raising a human
Briefly

"He claimed such complaints ignore the total amount of energy it takes to create and train an actual human. He said it was unreasonable to focus on "how much energy it takes to train an AI model, relative to how much it costs a human to do one inference query." "It takes like 20 years of life and all of the food you eat during that time before you get smart," he said."
"AI's biggest energy burn comes in the training phase, Altman said. "If you ask ChatGPT a question, how much energy does it take once its model is trained to answer that question versus a human? And probably AI has already caught up on an energy efficiency basis, measured that way.""
Concerns about AI resource consumption are presented as reasonable but sometimes overstated. Data centers often use closed-loop liquid cooling, which reduces evaporative water use, and claims equating a single AI query to multiple phone charges are disputed as inaccurate. Such comparisons also neglect the cumulative energy required to raise and educate a human, including decades of life and the collective evolutionary and societal resources of billions of people. The largest AI energy cost occurs during model training; the inference energy per query may already be comparable to, or lower than, the energy a human expends to answer the same question. Estimating the total historical energy invested in humans is complex but yields very large figures.
Read at The Register