AI could gobble up a quarter of all electricity in the U.S. by 2030 if it doesn't break its energy addiction, says Arm Holdings exec
Briefly

"We won't be able to continue the advancements of AI without addressing power," Badani told the Fortune Brainstorm AI conference... "ChatGPT requires 15 times more energy than a traditional web search."
"It takes 100,000 AI chips working at full compute capacity and full power consumption in order to train Sora," Badani said. "That's a huge amount."
Data centers, where most AI models are trained, currently account for 2% of global electricity consumption, according to Badani. But... AI could end up devouring a quarter of all power in the United States by 2030.
Read at Fortune