Microsoft installed the first of its AI chips in one of its data centers this week, and the company says it plans to deploy more chips in the coming months.
The chip, named Maia 200, is designed to be what Microsoft calls an “AI inference powerhouse,” meaning it’s optimized for the compute-intensive task of running AI models in production. The company announced impressive processing speed specs for Maia, saying it outperforms Amazon’s latest Trainium chips and Google’s latest Tensor Processing Units (TPUs).
Cloud giants are turning to their own AI chip designs, in part because getting the latest and greatest from Nvidia is difficult and expensive, and the supply shortage shows no signs of abating.
But Microsoft CEO Satya Nadella said that even with its own cutting-edge, high-performance chips, the company will continue to buy silicon from other vendors.
“We have great partnerships with Nvidia and AMD. They continue to innovate. We continue to innovate,” he explained. “I think a lot of people just talk about who’s ahead. Remember, we’ve always got to stay ahead.”
He added: “Just because you can integrate vertically doesn’t mean you just integrate vertically.” Vertical integration here means building the entire system in-house, from top to bottom, without relying on other vendors’ products.
However, the Maia 200 will be used by Microsoft’s own so-called superintelligence team, the group of AI experts building the software giant’s own frontier models. Mustafa Suleyman, the DeepMind co-founder who now leads that team, has said Microsoft is developing its own models so that, perhaps someday, it can reduce its dependence on OpenAI, Anthropic, and other model makers.
The Maia 200 chip will also run OpenAI models on Microsoft’s Azure cloud platform, the company said. Either way, securing access to cutting-edge AI hardware remains a challenge for paying customers and internal teams alike.
So in a post on X announcing the chip, Suleyman was clearly happy to share that his team had earned first dibs. “Today is a big day,” he wrote. “Our Superintelligence team will be the first to use Maia 200 to develop frontier AI models.”
