From The Register, 4 days ago
AI giants call for energy grid kumbaya
The paper, "Power Stabilization for AI Training Datacenters," argues that oscillating energy demand between the power-intensive GPU compute phase and the less-taxing communication phase, in which parallelized GPU calculations are synchronized, represents a barrier to the development of AI models. The authors note that the difference in power consumption between the two phases is extreme: compute approaches the thermal limits of the GPU, while communication draws close to idle-level power.
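To get a feel for why this oscillation worries grid operators, here is a minimal sketch (with entirely hypothetical numbers, not figures from the paper) of the aggregate swing a cluster presents when every GPU alternates between a compute phase near its thermal limit and a near-idle communication phase:

```python
# Hypothetical per-GPU draws: near the thermal limit during compute,
# near idle during the synchronization (communication) phase.
COMPUTE_W = 700.0   # hypothetical compute-phase draw per GPU, in watts
COMM_W = 100.0      # hypothetical communication-phase draw per GPU, in watts
NUM_GPUS = 10_000   # hypothetical cluster size

def cluster_power(step: int) -> float:
    """Aggregate draw in watts, assuming every GPU is phase-aligned each step."""
    per_gpu = COMPUTE_W if step % 2 == 0 else COMM_W
    return per_gpu * NUM_GPUS

# The swing the grid sees each time the cluster flips phases:
swing_mw = (cluster_power(0) - cluster_power(1)) / 1e6
print(f"Aggregate swing per phase change: {swing_mw:.1f} MW")
```

With these illustrative numbers the cluster's demand jumps by 6 MW every time training flips phases, which happens many times per second in real workloads; the paper's point is that swings of this character are hard for power infrastructure to absorb.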