
"Artificial intelligence has a funny way of turning obscure hardware components into economic kingmakers. A year ago, most investors barely thought about DRAM memory prices. Today, they may matter as much to AI growth as Nvidia ( NASDAQ:NVDA | NVDA Price Prediction) GPUs. Why? Because AI models do not just need processing power - they need massive amounts of memory to move, store, and retrieve data in real time."
"According to export pricing data, DRAM export prices from Korea have surged 497% over the past year. Flash memory and high-bandwidth memory (HBM) prices have doubled or tripled, but DRAM stands in a class by itself. The question now is, does this choke off AI growth, or does it simply make the memory makers richer?"
"South Korea matters because it sits at the center of the global memory market. Samsung Electronics and SK Hynix together dominate DRAM and HBM production, while Micron Technology ( ) controls much of the remaining supply. In fact, Samsung, SK Hynix, and Micron collectively provide roughly 95% of the world's DRAM supply. That concentration matters because DRAM is becoming AI infrastructure's hidden tollbooth."
"AI servers use dramatically more memory than traditional cloud servers. Training large language models requires constant high-speed data movement between GPUs and memory chips. Essentially, AI systems cannot think quickly if they cannot access information quickly. Memory suppliers are struggling to keep up with demand from hyperscalers building AI data centers. The Big Four are collectively spending as much as $725 billion on AI infrastructure. GPUs grab the headlines, but memory has quietly become the supply chain constraint."
AI growth depends on large, fast memory for moving, storing, and retrieving data in real time. South Korea is central to the global memory market, with Samsung Electronics and SK Hynix dominating DRAM and HBM production and Micron supplying much of the remaining DRAM. Together these companies provide about 95% of worldwide DRAM supply. DRAM is becoming a hidden bottleneck because AI servers use far more memory than traditional cloud servers and require constant high-speed data movement between GPUs and memory chips. Memory suppliers are struggling to meet demand from hyperscalers investing heavily in AI data centers, making memory a supply chain constraint.
#ai-infrastructure #dram-memory #semiconductors #south-korea-memory-market #data-center-supply-chain
Read at 24/7 Wall St.