'Watt's' The Problem - The Energy Demands Behind AI
Briefly

Generative AI is transforming technology, yet its expansion faces challenges, primarily in computing power and energy consumption. Nvidia, a leader in the GPUs essential for processing data in AI systems, has seen its stock rise significantly on the strength of that demand. The CHIPS and Science Act was introduced to alleviate semiconductor shortages and reduce dependency on other nations. Meanwhile, the energy requirements of AI applications such as ChatGPT are substantial, raising concerns about the sustainability and environmental impact of the technology as data centers consume ever more electricity.
AI models are energy-hungry and will drive increased cloud computing consumption. Data centers worldwide already consume roughly 1.5% of global electricity, and that share is expected to grow.
One of the limiting factors in scaling AI is computing power. Nvidia manufactures the leading GPUs needed to run large language models, which has driven strong growth in its stock.
The CHIPS and Science Act of 2022 was enacted to boost semiconductor research and manufacturing in the US, in response to foreign supply dependencies and computing shortages.
Google's energy usage in 2023 was nearly 26 terawatt-hours (TWh), more than the consumption of all but 75 countries, highlighting the growing energy demands of tech giants.
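To put the 26 TWh figure in perspective, here is a minimal back-of-the-envelope sketch (the average-household figure of roughly 10,700 kWh/year is an assumed US value, not from the article) that converts annual consumption into continuous power draw and household equivalents.

```python
# Rough scale check for the ~26 TWh/year figure cited above.
# Assumptions (not from the article): 8,760 hours per year and an average
# US household consumption of roughly 10,700 kWh/year.

ANNUAL_CONSUMPTION_TWH = 26          # reported 2023 usage (from the article)
HOURS_PER_YEAR = 8_760               # 365 days * 24 hours
HOUSEHOLD_KWH_PER_YEAR = 10_700      # assumed average US household usage

annual_kwh = ANNUAL_CONSUMPTION_TWH * 1e9                          # 1 TWh = 1 billion kWh
average_power_gw = ANNUAL_CONSUMPTION_TWH * 1e3 / HOURS_PER_YEAR   # GWh per hour = GW
household_equivalents = annual_kwh / HOUSEHOLD_KWH_PER_YEAR

print(f"Average continuous draw: {average_power_gw:.1f} GW")
print(f"Roughly {household_equivalents / 1e6:.1f} million US households")
```

Under these assumptions, 26 TWh/year works out to an average continuous draw of about 3 GW, or the annual usage of roughly 2.4 million US households.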
Read at Above the Law