AMD buys HBM4 from Samsung for AI data centers

"Samsung and AMD share a commitment to advancing AI computing, and this agreement reflects the growing scope of our collaboration. From industry-leading HBM4 and next-generation memory architectures to cutting-edge foundry and advanced packaging, Samsung is uniquely positioned to deliver unrivaled turnkey capabilities that support AMD's evolving AI roadmap."
"Powering the next generation of AI infrastructure requires deep collaboration across the industry. We are thrilled to expand our work with Samsung, bringing together their leadership in advanced memory with our Instinct GPUs, Epyc CPUs and rack-scale platforms. Integration across the full computing stack, from silicon to system to rack, is essential to accelerating AI innovation."
Samsung's HBM4 is built on a 10nm-class process with a 4nm logic base die, and delivers processing speeds of up to 13Gbps and a maximum bandwidth of 3.3TBps. Samsung and AMD will also work together on high-performance DDR5 memory optimized for the 6th-generation Epyc CPUs.
AMD and Samsung have announced a memorandum of understanding to collaborate on advanced memory and computing solutions for AI infrastructure. AMD will use Samsung's HBM4 high-bandwidth memory, manufactured on a 10nm-class process with a 4nm logic base die, in its Instinct MI455X AI accelerator GPU and 6th-generation "Venice" Epyc CPUs. Samsung's HBM4 delivers processing speeds of up to 13Gbps and a maximum bandwidth of 3.3TBps. The companies will also develop DDR5 memory optimized for the Epyc CPUs and explore foundry partnerships. AMD's integrated approach combines Instinct GPUs, Epyc CPUs, and rack-scale Helios platforms to support a comprehensive AI computing stack from silicon through system architecture.
Read at GSMArena.com