
Microsoft's breakthrough: Unveiling Azure Maia AI chip & Cobalt CPU to redefine Azure data centre capabilities
As Microsoft pioneers custom silicon, the Azure Maia AI Chip and Cobalt CPU promise to propel Azure data centres into a new era of AI-driven innovation, reshaping the digital landscape.
Highlights
- Microsoft unveils Azure Maia AI Chip and Cobalt CPU, reshaping AI infrastructure
- The 128-core CPU promises a 40% performance boost, optimising Azure cloud services
- With 105 billion transistors, Maia is built to accelerate Microsoft's largest AI workloads, signalling a transformative leap
In a landmark announcement, Microsoft is making waves in the ever-evolving landscape of artificial intelligence by unveiling its bespoke AI chip, Azure Maia, and an Arm-based CPU, Azure Cobalt.
This strategic move is poised to redefine the possibilities within Microsoft's Azure data centres, signalling a transformative era in the AI ecosystem.
These chips, scheduled for release in 2024, aim to bolster Microsoft's AI endeavours and reduce the company's reliance on Nvidia's GPUs, which have seen surging demand and escalating prices.
We are announcing new innovations across our datacenter fleet, including the latest AI optimized silicon from our industry partners and two new Microsoft-designed chips. #MSIgnite https://t.co/5488N3pkKG pic.twitter.com/XeO0PDHAiR
— Microsoft (@Microsoft) November 15, 2023
A legacy of silicon development
Rani Borkar, head of Azure hardware systems at Microsoft, emphasises the company's extensive experience in silicon development, dating back to collaborations on the Xbox over two decades ago.
The new chips are the result of a meticulous overhaul of Microsoft's cloud server stack, optimised for performance, power efficiency, and cost-effectiveness.
Azure Cobalt CPU: A powerhouse for general cloud services
The Azure Cobalt CPU, boasting 128 cores and built on an Arm Neoverse CSS design, is tailored for general cloud services on Azure. Through deliberate design choices around performance control and power management, Microsoft is targeting a performance boost of up to 40% compared with its existing Arm-based servers.

Azure Maia AI chip: Fueling cloud AI workloads
The Maia 100 AI accelerator is designed for cloud AI workloads such as large language model training and inference, and is set to power some of Microsoft's largest AI workloads on Azure.
Manufactured on a 5-nanometer TSMC process, Maia boasts 105 billion transistors and supports sub-8-bit data types for faster model training and inference times.
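Microsoft has not publicly detailed Maia's numeric formats, but the appeal of sub-8-bit data types is straightforward to illustrate: values are mapped onto a small integer grid plus a scale factor, cutting the memory and bandwidth needed per parameter. The NumPy sketch below is a generic illustration of 4-bit quantisation under that assumption, not Maia's actual scheme.

```python
import numpy as np

# Illustrative only: Maia's real formats are not public. This shows the basic
# trade-off behind sub-8-bit types: a coarse integer grid plus one FP32 scale.

def quantize_int4(weights: np.ndarray):
    """Symmetric per-tensor quantisation of FP32 weights to a signed 4-bit grid."""
    scale = np.abs(weights).max() / 7.0              # signed 4-bit range is [-8, 7]
    q = np.clip(np.round(weights / scale), -8, 7).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.standard_normal((1024, 1024)).astype(np.float32)

q, scale = quantize_int4(w)
w_hat = dequantize(q, scale)

print("mean abs error :", np.abs(w - w_hat).mean())
print("FP32 size (MB) :", w.nbytes / 1e6)
print("4-bit size (MB):", q.size * 0.5 / 1e6)        # two 4-bit values pack per byte
```

The approximation error is the price paid for an eightfold reduction in weight storage versus FP32, which is why accelerators pursue ever-narrower data types, particularly for inference.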

Collaboration with OpenAI and industry standardisation
Microsoft's collaboration with OpenAI on Maia's design underscores its commitment to advancing AI capabilities. The company is actively standardising the next generation of data formats for AI models, collaborating with industry giants like AMD, Arm, Intel, Meta, Nvidia, and Qualcomm.
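Standardisation efforts of this kind typically revolve around narrow, block-scaled data types, in which a small group of values shares one scale factor so that very low bit widths retain acceptable accuracy. The sketch below illustrates the block-scaling idea; the 32-element block size, 4-bit width, and function names are assumptions chosen for clarity rather than details of any published specification.

```python
import numpy as np

BLOCK = 32  # assumed number of elements sharing one scale factor

def quantize_blockwise(x: np.ndarray, bits: int = 4):
    """Quantise a 1-D tensor in fixed-size blocks, each with its own scale."""
    qmax = 2 ** (bits - 1) - 1
    pad = (-x.size) % BLOCK
    blocks = np.pad(x, (0, pad)).reshape(-1, BLOCK)
    scales = np.abs(blocks).max(axis=1, keepdims=True) / qmax
    scales[scales == 0] = 1.0                          # avoid dividing by zero
    q = np.clip(np.round(blocks / scales), -qmax - 1, qmax).astype(np.int8)
    return q, scales.squeeze(1), x.size

def dequantize_blockwise(q: np.ndarray, scales: np.ndarray, n: int) -> np.ndarray:
    return (q * scales[:, None]).reshape(-1)[:n].astype(np.float32)

x = np.random.default_rng(1).standard_normal(10_000).astype(np.float32)
q, s, n = quantize_blockwise(x)
x_hat = dequantize_blockwise(q, s, n)
print("max abs error:", np.abs(x - x_hat).max())
```

Per-block scales keep outliers in one block from degrading precision everywhere else, which is a key reason shared low-precision formats are attractive across different vendors' hardware.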
The names Maia 100 and Cobalt 100 hint at a series, indicating ongoing development beyond these initial releases. While the company remains tight-lipped about future roadmaps, the pace of AI advancement suggests frequent releases may be on the horizon.
In the quest for AI dominance, Microsoft's strategic move into custom silicon marks a pivotal moment, promising enhanced performance, efficiency, and cost-effectiveness for its Azure cloud services.