Samsung and Micron have started shipping HBM4 memory, offering higher speed, greater density, and improved thermal and energy efficiency to power next-generation AI accelerators.
Shipments to customers reportedly ramped 'successfully' in the first quarter, a quarter ahead of the company's initial schedule.
Samsung expects high demand for memory chips this year and next, reveals new designs
Demand for Samsung's memory chips, including HBM4, is expected to remain high through 2027 on unprecedented orders from AI hyperscalers, driving prices higher and prompting a ramp to mass production.
The $100B memory war: Inside the battle for AI's future
HBM4's doubled bandwidth and taller stacks are expected to relieve memory bandwidth bottlenecks, accelerating AI training and improving energy efficiency in large GPU datacenters.
SK Hynix says its HBM4 is ready for mass production
SK Hynix has completed HBM4 development and is preparing for high-volume production to supply next-generation GPUs, enabling significantly higher memory capacity and bandwidth.