Microsoft introduced two new chips at its Ignite conference in Seattle. The Maia 100 artificial intelligence chip is positioned as a competitor to Nvidia’s sought-after AI graphics processing units, while the Cobalt 100 Arm chip targets general computing tasks, potentially competing with Intel processors. These additions align with a trend among tech companies to provide clients with more options for cloud infrastructure. Microsoft, with substantial cash reserves, holds a 21.5% share of the cloud market, ranking second to Amazon.
The Cobalt chips are expected to be available on Microsoft’s Azure cloud in 2024, with no timeline provided for the Maia 100. Unlike Nvidia or AMD, Microsoft and other cloud providers don’t plan to sell servers containing their chips.
The Maia chip was designed with customer feedback and has been tested on applications such as the Bing search engine’s AI chatbot, the GitHub Copilot coding assistant, and OpenAI’s GPT-3.5-Turbo language model. Microsoft has also created liquid-cooled hardware called Sidekicks to sit alongside Maia servers, an approach intended to ease the data center space and cooling challenges that often accompany GPU installations.
Based on initial performance tests, Microsoft expects Cobalt processors may be adopted faster than the Maia AI chips. The push toward more efficient cloud spending has already led AWS customers to embrace Graviton chips, and Microsoft is testing its Teams app and Azure SQL Database service on Cobalt, reporting a 40% performance improvement.
Transitioning from GPUs to AWS Trainium AI chips can be complex, yet AWS’s top 100 customers have already adopted Arm-based chips, drawn by a 40% price-performance improvement. Microsoft says it has shared specifications with its ecosystem and partners for the benefit of Azure customers. The company has not disclosed how Maia’s performance compares with alternatives such as Nvidia’s H100; Nvidia, for its part, recently announced that its H200 will ship in 2024.