
"According to Nvidia CEO Jensen Huang, the amount of computation necessary to run artificial intelligence (AI) is 1,000 times higher than the computing power needed to run non-AI software."
"With the GB200s and NVL72s we're looking at 120 kilowatts per rack. More powerful graphic processing units [GPUs] equates to more demand for power distribution units [PDUs], and that has a knock-on effect on the electrical infrastructure."
"Air cooling cannot keep up, we're seeing liquid cooling take over. Rear door heat exchanger [RDHx], direct-to-chip and immersion cooling are all helping to solve the power in/heat out problem."
"Sustainability is still treated like a compliance checkbox, something reviewed after the fact in a quarterly report. Treating sustainability as a separate reporting line only makes the problem worse."
Nvidia CEO Jensen Huang states that AI requires 1,000 times more computing power than non-AI software. Traditional datacentre racks draw 20 to 40 kilowatts, but Nvidia's latest hardware is pushing this to 120 kilowatts per rack. This surge in power demand necessitates enhanced electrical infrastructure and backup solutions, including small modular reactors. The rise in GPU usage is also increasing cooling demands, with liquid cooling becoming essential. Meanwhile, many organisations fail to architect their compute usage effectively, treating sustainability as a compliance checkbox rather than an integrated design concern.
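As a rough illustration of the scale of the jump, the figures above (20 to 40 kilowatts for a traditional rack versus 120 kilowatts for a GB200 NVL72-class rack) imply a three- to six-fold increase in per-rack power draw. A minimal sketch of that arithmetic, using the article's figures (the function name is illustrative, not from the article):

```python
# Back-of-the-envelope comparison of rack power draw,
# using the kilowatt figures quoted in the article.
TRADITIONAL_KW = (20, 40)  # typical legacy rack draw range
AI_RACK_KW = 120           # GB200 NVL72-class rack, per the article

def power_multiple(ai_kw, legacy_range):
    """Return the (min, max) multiple of an AI rack over a legacy rack."""
    low, high = legacy_range
    return ai_kw / high, ai_kw / low

lo, hi = power_multiple(AI_RACK_KW, TRADITIONAL_KW)
print(f"An AI rack draws {lo:.0f}x to {hi:.0f}x a traditional rack")
```

Every watt drawn must also be removed as heat, which is why the same multiple drives the shift from air cooling to liquid cooling described above.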
Read at ComputerWeekly.com