AI's thirst for power underscores need for efficient silicon
Briefly

"If you look at it at the macro level for those huge deployments, we're talking about not even finding enough power sources and being concerned about the grids and the distribution," he observed.
"As it turns out ... if you throw [in] more compute, you grow the size of the model, you get better performance, accuracy, levels of intelligence, however you want to think about this," he argued.
"Whatever power budget you think you're limited at, if you get higher performance, you could either train larger models and get to intelligence quicker, or you can serve it more cost effectively," he explained.
The reason folks in the industry claim that Moore's Law is alive and well, Pend opined, is that many of these challenges can be overcome by moving to chiplet architectures and advanced packaging.
Read at The Register