#high-bandwidth-memory

Artificial intelligence
from TechCrunch
1 day ago

OpenAI ropes in Samsung, SK Hynix to source memory chips for Stargate | TechCrunch

OpenAI secured agreements with Samsung and SK Hynix to produce up to 900,000 HBM DRAM chips monthly and build AI data centers in South Korea.
Gadgets
from The Register
1 day ago

Raspberry Pi prices hiked as AI gobbles all the memory

Raspberry Pi raised prices on higher-memory devices because HBM costs surged about 120% year-over-year, affecting 4GB/8GB Compute Modules, Pi 500, Development Kit, and Pi 3B+.
from Techzine Global
2 weeks ago

Huawei challenges Nvidia with new AI chip technology

HBM, or High-Bandwidth Memory, plays a crucial role in modern AI chips. Stacking DRAM dies vertically shortens signal paths and greatly increases the chip's memory bandwidth. This not only delivers higher performance but also reduces energy consumption for data-intensive tasks such as training and serving large language models. Because the memory sits directly next to the processor, unnecessary data movement is minimized.
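The bandwidth advantage described above comes from HBM's very wide interface. As a back-of-envelope illustration (the figures below are representative published HBM3 and GDDR6 numbers, not taken from the article):

```python
# Illustrative peak-bandwidth arithmetic for a wide-interface memory stack.
# Representative figures: an HBM3 stack exposes a 1024-bit bus at roughly
# 6.4 Gb/s per pin, while a single GDDR6 chip has a 32-bit bus at ~16 Gb/s.

def stack_bandwidth_gbs(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth of one memory device in GB/s (width x pin rate / 8)."""
    return bus_width_bits * pin_rate_gbps / 8  # bits -> bytes

hbm3 = stack_bandwidth_gbs(1024, 6.4)   # one HBM3 stack
gddr6 = stack_bandwidth_gbs(32, 16.0)   # one GDDR6 chip, for contrast

print(f"HBM3 stack: {hbm3:.1f} GB/s")   # 819.2 GB/s
print(f"GDDR6 chip: {gddr6:.1f} GB/s")  # 64.0 GB/s
```

The same per-pin speeds on a bus 32 times wider is what lets a handful of HBM stacks feed an AI accelerator, which is why the memory is co-packaged with the processor rather than placed on the board.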
Artificial intelligence
from The Register
2 weeks ago

Huawei lays out multi-year AI accelerator roadmap

First off the rank, in the first quarter of 2026, will be the Ascend 950PR which, according to slideware shown at the conference, will deliver one petaflop of performance using the 8-bit floating-point (FP8) computation units common in AI inference workloads. The chip will also include 2 TB/s of interconnect bandwidth and 128GB of memory at 1.6 TB/s. In the final quarter of 2026, Huawei plans to deliver the 950DT, which will be capable of two petaflops of FP4 performance and include 144GB of memory at 4 TB/s.
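A quick sanity check on how the quoted memory and compute figures relate (vendor slideware numbers taken at face value; the ratio computed here is a common rough gauge of how memory-bound an accelerator is):

```python
# Bytes of memory bandwidth available per FLOP of peak compute,
# using the roadmap figures quoted above.

def bytes_per_flop(mem_tb_s: float, compute_pflops: float) -> float:
    """Memory bandwidth (TB/s) divided by peak compute (PFLOPS), in B/FLOP."""
    return (mem_tb_s * 1e12) / (compute_pflops * 1e15)

ascend_950pr = bytes_per_flop(1.6, 1.0)  # 1 PFLOP FP8, 1.6 TB/s memory
ascend_950dt = bytes_per_flop(4.0, 2.0)  # 2 PFLOPs FP4, 4 TB/s memory

print(f"950PR: {ascend_950pr:.4f} B/FLOP")  # 0.0016
print(f"950DT: {ascend_950dt:.4f} B/FLOP")  # 0.0020
```

On these numbers the 950DT roughly doubles compute while more than doubling memory bandwidth, so its bytes-per-flop ratio actually improves slightly, consistent with the article's framing of memory as the key lever in Huawei's roadmap.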
Artificial intelligence
Tech industry
from Techzine Global
2 months ago

Samsung's profits plummet, Nvidia-related woes continue

Samsung's profits have declined significantly due to delays in HBM chip certification and strong competition in the memory market.