AMD guns for Nvidia's H200 with MI325X AI chips
Briefly

"We actually said at Computex up to 288 GB, and that was what we were thinking at the time. There are architectural decisions we made a long time ago with the chip design on the GPU side that we were going to do something with software we didn't think was a good cost-performance trade off, and we've gone and implemented at 256 GB."
"AMD has sought to differentiate itself from Nvidia by cramming more HBM onto its chips, which is making them an attractive option for cloud providers like Microsoft looking to deploy trillion parameter-scale models, like OpenAI's GPT4o, on fewer nodes."
Read at The Register