AI datacenters might consume 25% of US electricity by 2030
Briefly

Arm's CEO warns that AI advancements could drive substantial datacenter energy consumption, forecasting that AI datacenters may use 20-25% of the US grid by 2030, with models like ChatGPT being particularly power-hungry.
The IEA anticipates that global AI datacenter power use could rise tenfold by 2026, driven largely by energy-intensive models like ChatGPT, suggesting government regulation of power management may become necessary.
Enhancing efficiency is key to addressing the escalating power needs of AI datacenters; efforts must focus on optimizing performance to manage growing electricity demand.
Simply improving AI hardware efficiency may not reduce energy consumption: if the power saved is redirected into expanded computing capacity, overall usage stays the same.
Read at The Register