How to stop AI from straining networks | Computer Weekly
"Every ChatGPT query, every AI-powered trade and AI digital assistant, among other things, put massive unseen pressure on these critical networks. Datacentres, the cloud and graphics processing units (GPUs) dominate much of the tech sustainability conversation currently due to their vast energy needs. However, it's the network infrastructure, including routing, interconnects and protocols, which is becoming the real bottleneck as AI workloads increase, because of heat output, cost and energy usage."
"Some AI systems, used in applications such as high-frequency financial trading, autonomous driving or certain conversational AI models also require latency-sensitive inference. Hyperscale interconnects between datacentres, GPUs and clouds are needed for both AI inference and training. These need massive power loads both for operation and for cooling essential components such as switches, fibre hubs and undersea cable landing stations, which can generate significant heat. In this way, AI is applying more pressure on both network quality and capacity."
AI model sizes are doubling every few months, and rising energy requirements are straining European networking infrastructure. Every ChatGPT query, AI-powered trade and AI digital assistant generates substantial unseen pressure on critical networks. Datacentres, cloud services and GPUs consume vast energy, but routing, interconnects and protocols are emerging as the primary bottleneck due to heat output, cost and energy usage. AI workloads differ from streaming or web browsing in that they require high-bandwidth, persistent east-west traffic and, for some applications, latency-sensitive inference. Hyperscale interconnects between datacentres, GPUs and clouds demand massive power for both operation and cooling. European datacentre electricity consumption could rise from 62 TWh to more than 150 TWh by 2030 because of AI.
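To put the projection in perspective, a quick back-of-the-envelope calculation shows the compound annual growth rate implied by the article's figures. Note this is illustrative only: the article gives 62 TWh as the current figure without stating a baseline year, so the 2024 start year below is an assumption.

```python
# Illustrative sketch: implied compound annual growth rate (CAGR) for
# European datacentre electricity use, based on the article's figures
# (62 TWh now, over 150 TWh by 2030). The 2024 baseline year is an
# assumption; the article does not state one.
start_twh = 62.0
end_twh = 150.0
years = 2030 - 2024  # assumed six-year horizon

cagr = (end_twh / start_twh) ** (1 / years) - 1
print(f"Implied growth: roughly {cagr:.0%} per year")
```

Under these assumptions, consumption would need to grow at roughly 16% per year, every year, to reach the projected figure.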
Read at ComputerWeekly.com