
"The chip, named the Maia 200, is designed to be what Microsoft calls an "AI inference powerhouse" as Microsoft describes, meaning it's optimized for the compute-intensive work of running AI models in production. The company released some impressive processing-speed specs for Maia, saying it outperforms Amazon's latest Trainium chips and Google's latest Tensor Processing Units (TPU). All of the cloud giants are turning to their own AI chip designs in part because of the difficulty,"
"But even with its own state-of-the-art, high-performance chip in hand, Microsoft CEO Satya Nadella said the company will still be buying chips made by others. "We have a great partnership with Nvidia, with AMD. They are innovating. We are innovating," he explained. "I think a lot of folks just talk about who's ahead. Just remember, you have to be ahead for all time to come.""
Microsoft deployed the Maia 200 AI chip in a data center and plans further rollouts. The Maia 200 is positioned as an AI inference powerhouse optimized for compute-intensive production model runs and is claimed to outperform Amazon's Trainium and Google's latest TPUs. Cloud providers are designing custom AI chips partly because of a persistent Nvidia supply crunch. Microsoft says it will continue purchasing chips from other vendors and maintains partnerships with Nvidia and AMD. Maia 200 will be used by Microsoft's Superintelligence team to train frontier models and will also support OpenAI models running on Azure.
Read at TechCrunch