Microsoft’s latest AI chip goes head-to-head with Amazon and Google
Summary
Microsoft has announced the Maia 200, the successor to its first in-house AI chip, the Maia 100, built on TSMC's 3nm process. Microsoft claims the Maia 200 delivers three times the FP4 performance of Amazon's third-generation Trainium and better FP8 performance than Google's seventh-generation TPU. Containing over 100 billion transistors, the chip is designed for large-scale AI workloads and will be used internally to host models like OpenAI's GPT-5.2 and to power Microsoft 365 Copilot. According to Scott Guthrie, EVP of Cloud and AI, the Maia 200 is also Microsoft's most efficient inference system, offering 30 percent better performance per dollar than the company's current fleet hardware. Unlike at the Maia 100's initial launch, Microsoft is now openly comparing its chip against Big Tech competitors. Deployment of the new chips begins today in the Azure US Central data center region, and Microsoft is also inviting external developers and academics to preview the software development kit.
(Source: The Verge)