"Nvidia announced Monday that it plans to invest $100 billion in OpenAI in a deal that would give the ChatGPT maker a major leg up in the AI race - access to 10 gigawatts worth of the high-powered GPUs it needs to satisfy its mushrooming growth strategy. What the deal can't guarantee - and what neither company mentioned - is how OpenAI will access the enormous amount of electricity needed to fire up those chips."
"The deal, which the companies committed to in a letter of intent, highlights a growing constraint of the AI race: Access to electricity. The US power grid is already strained by an explosion of data center construction. Adding another 10 gigawatts of demand would amount to adding a power load nearly equivalent to New York City at its summer peak."
OpenAI and Nvidia have committed, via a letter of intent, to a $100 billion deal providing roughly 10 gigawatts of high-powered GPUs for AI workloads. But securing GPU capacity does not resolve the parallel problem of sourcing the enormous amount of electricity needed to run those chips. The US power grid is already strained by rapid data center construction, and utilities project that roughly 60 gigawatts of new power will be needed to serve new data centers by the end of the decade. Adding another 10 gigawatts of demand would create a load nearly equivalent to New York City at its summer peak. Difficulty accessing adequate power is already slowing data center development and has become a major infrastructure bottleneck for AI expansion.
Read at Business Insider