Microsoft debuts Maia 200, its most powerful and efficient AI chip

Built on TSMC’s 3nm process, the chip delivers 30 per cent more performance per dollar than previous models

By Alice Chambers

Microsoft has unveiled its latest AI chip, Maia 200, which delivers up to three times the performance of competing cloud AI chips from Amazon and Google. Scott Guthrie, executive vice president of cloud and AI at Microsoft, highlighted its speed and efficiency for large-scale AI workloads.

Built on TSMC’s 3nm process and containing over 140 billion transistors, Maia 200 is designed to handle today’s largest AI models, with extra capacity for even bigger models in the future. It is also Microsoft’s most cost-efficient AI chip, delivering 30 per cent more performance per dollar than previous models.

Watch: Microsoft’s Jessica Hawk introduces Maia 200

The chip will power multiple AI models, including OpenAI’s GPT 5.2, which underpins Microsoft 365 Copilot and Foundry. Deployment begins in Azure’s US Central data centre, with other regions to follow. Developers, academics, AI labs and open-source projects can access an early preview of the Maia 200 software development kit.

“Maia 200 was designed for seamless deployment in Azure data centres,” said Guthrie, in a Microsoft blog titled ‘Maia 200: The AI accelerator built for inference’. “It integrates directly with Azure’s management systems to ensure security, reliability and maximum uptime for critical AI workloads.”

Maia 200 is Microsoft’s second in-house AI chip, following the Maia 100, launched in 2023 to support large language model training and inferencing. It joins Majorana 1, the company’s first quantum processor introduced in 2025, as part of Microsoft’s growing in-house silicon portfolio.
