Microsoft Strengthens Partnership with SK Hynix on Its Own AI Chips
Briefly

"Microsoft is strengthening its partnership with South Korean memory chip company SK Hynix as part of a broader strategy to reduce its reliance on NVIDIA's AI hardware. This is reported by South Korean media based on sources close to Microsoft's private CEO Summit 2026 in Redmond. Neowin cites those media outlets in its own article."
"During the multi-day event, approximately one hundred international executives and policymakers are gathering to discuss generative AI, cloud infrastructure, and the growing demand for AI data centers. According to reports, SK Hynix CEO Kwak Noh-Jung is also participating in discussions with Microsoft CEO Satya Nadella and co-founder Bill Gates."
"The Maia 200 plays a key role in this effort. This is Microsoft's proprietary inference accelerator for AI workloads, which went into operation earlier this year at a data center in Des Moines, Iowa. According to those involved, the hardware delivers a better price-performance ratio than previous generations of AI systems within Microsoft's infrastructure."
"According to reports, SK Hynix is the exclusive supplier of the high-bandwidth memory for the Maia 200. The chip features six memory stacks of 36 GB each, providing a total capacity of 216 GB and a memory bandwidth of 7 TB per second. Such configurations are designed to run large AI models faster and more consistently without delays caused by memory bottlenecks."
Microsoft is strengthening its partnership with SK Hynix as part of a strategy to reduce reliance on NVIDIA AI hardware. Executives and policymakers are gathering in Redmond to discuss generative AI, cloud infrastructure, and demand for AI data centers. SK Hynix CEO Kwak Noh-Jung is reported to participate in discussions with Microsoft CEO Satya Nadella and co-founder Bill Gates. SK Hynix is positioned as important to Microsoft’s AI chip strategy, which aims to gain more control over AI infrastructure by deploying Microsoft chips alongside NVIDIA GPUs. The Maia 200 inference accelerator is operating in a data center in Des Moines, Iowa, delivering improved price-performance. SK Hynix is reported as the exclusive supplier of high-bandwidth memory for Maia 200, with 216 GB total capacity and 7 TB per second bandwidth to reduce memory bottlenecks.
Read at Techzine Global