Samsung enters AI infrastructure market with first HBM4 deliveries
Briefly

"Samsung Electronics has begun delivering its latest-generation high-bandwidth memory, HBM4. With this step, the South Korean chip manufacturer aims to regain ground in the rapidly growing market for AI infrastructure, where it has so far lagged behind its competitors. This was reported by Reuters. The global wave of investment in AI data centers has significantly increased demand for advanced memory. HBM plays a key role in this, as it is designed to process enormous amounts of data at lightning speed. This is essential for applications such as generative AI and high-performance computing, where traditional memory technologies fall short."
"According to Samsung, the new HBM4 offers a significant speed improvement over the previous generation. The chip achieves a constant data rate of 11.7 gigabits per second, representing a clear performance leap over HBM3E. In peak situations, the transfer rate can even reach 13 gigabits per second. With this, the company aims to reduce bottlenecks in data processing, which are becoming increasingly common with complex AI workloads."
Samsung Electronics has started delivering HBM4 high-bandwidth memory to strengthen its position in the AI infrastructure market. HBM4 delivers a sustained data rate of 11.7 gigabits per second with peak transfers up to 13 gigabits per second, a performance leap over HBM3E. The increased bandwidth targets data-processing bottlenecks in generative AI and high-performance computing workloads. Samsung had lagged in the advanced AI memory segment and lost ground to rivals in supplying Nvidia. The company plans HBM4E trial samples in the second half of the year. SK Hynix and Micron are also producing HBM4 at scale, increasing competition.
Read at Techzine Global