Microsoft's latest AI chip goes head-to-head with Amazon and Google
"Built on TSMC's 3nm process, Microsoft says its Maia 200 AI accelerator "delivers 3 times the FP4 performance of the third generation Amazon Trainium, and FP8 performance above Google's seventh generation TPU." Each Maia 200 chip has more than 100 billion transistors, which are all designed to handle large-scale AI workloads. "Maia 200 can effortlessly run today's largest models, with plenty of headroom for even bigger models in the future," says Scott Guthrie, executive vice president of Microsoft's Cloud and AI division."
""Maia 200 is also the most efficient inference system Microsoft has ever deployed, with 30 percent better performance per dollar than the latest generation hardware in our fleet today," says Guthrie. Microsoft's performance flex over its close Big Tech competitors is different to when it first launched the Maia 100 in 2023 and didn't want to be drawn into direct comparisons with Amazon's and Google's AI cloud capabilities. Both Google and Amazon are working on next-generation AI chips, though."
Maia 200 is built on TSMC's 3nm process and contains more than 100 billion transistors optimized for large-scale AI workloads. The accelerator delivers three times the FP4 performance of third-generation Amazon Trainium and FP8 performance above Google's seventh-generation TPU. Maia 200 can run today's largest models with headroom for larger future models and offers approximately 30% better performance per dollar compared with the current fleet hardware. Microsoft will deploy Maia 200 in Azure starting in the US Central region, use it to host GPT-5.2 and other services, and provide early SDK previews to researchers and developers.
Read at The Verge