"We won't be able to continue the advancements of AI without addressing power," Badani told the Fortune Brainstorm AI conference... "ChatGPT requires 15 times more energy than a traditional web search."
"It takes 100,000 AI chips working at full compute capacity and full power consumption in order to train Sora," Badani said. "That's a huge amount."
Data centers, where most AI models are trained, currently account for 2% of global electricity consumption, according to Badani. But... they could end up devouring a quarter of all power in the United States by 2030.