
"The International Energy Agency has estimated that some AI-focused data centers consume as much electricity as 100,000 homes, and some of the largest facilities under construction could even use 20 times that amount. At the same time, worldwide data center capacity will increase by 46% over the next two years, equivalent to a jump of almost 21,000 megawatts, according to real estate consultancy Knight Frank."
"First, the company seeks to be as diversified as possible in the kinds of energy that power AI computation. While many people say any form of energy can be used, that's actually not true, he said. "If you're running a cluster for training and you bring it up and you start running a training job, the spike that you have with that computation draws so much energy that you can't handle that from some forms of energy production," Kurian explained."
Google Cloud has planned its energy sourcing and usage around the immense electricity demands of AI computing. Energy and data centers have emerged as bottlenecks alongside chips, prompting the company to design its machines with efficiency in mind. Some AI-focused data centers consume as much electricity as 100,000 homes, and the largest facilities under construction could use many times that amount. Worldwide data center capacity is set to increase by roughly 46% over the next two years, adding nearly 21,000 megawatts. Google Cloud is pursuing a three-pronged approach: diversifying the energy sources that power AI computation, maximizing efficiency and reusing energy inside its data centers, and applying AI in control systems to monitor and optimize operations.
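Taking the article's figures at face value, a quick back-of-the-envelope calculation shows what they imply for total capacity. This is a minimal sketch using only the 46% growth rate and the ~21,000 MW addition cited above; the average household draw of about 1.2 kW is an assumption for illustration, not a figure from the article.

```python
# Back-of-the-envelope check of the capacity figures cited in the article.
# The 1.2 kW average household draw is an assumed value, not from the article.

added_mw = 21_000        # capacity added over the next two years (article)
growth_rate = 0.46       # 46% increase over the same period (article)

current_mw = added_mw / growth_rate
future_mw = current_mw + added_mw
print(f"Implied current capacity: ~{current_mw:,.0f} MW")      # ~45,652 MW
print(f"Implied capacity in two years: ~{future_mw:,.0f} MW")  # ~66,652 MW

avg_home_kw = 1.2        # assumed average continuous household demand
homes_equivalent_mw = 100_000 * avg_home_kw / 1_000
print(f"100,000 homes ≈ {homes_equivalent_mw:,.0f} MW of continuous demand")  # ~120 MW
print(f"20x that ≈ {20 * homes_equivalent_mw:,.0f} MW")                       # ~2,400 MW
```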
Read at Fortune