Microsoft unveils new AI chip ‘Maia’ in collaboration with OpenAI
Microsoft has unveiled a new chip intended for artificial intelligence (AI) applications, according to a news post published on Nov. 15.
The company unveiled the chip, named the Azure Maia 100 AI Accelerator, at its Microsoft Ignite conference the same day, saying it is optimized for AI workloads, including generative AI.
Microsoft’s blog post also quoted OpenAI CEO Sam Altman, who said his company, best known for ChatGPT, contributed to the chip’s design:
“Since first partnering with Microsoft, we’ve collaborated to co-design Azure’s AI infrastructure at every layer … We were excited when Microsoft first shared their designs for the Maia chip, and we’ve worked together to refine and test it with our models.”
Altman added that Azure’s architecture is “now optimized down to the silicon with Maia.” He said the improvement will lead to more capable AI models and more affordable access for users. Separately, OpenAI recently paused new sign-ups for its paid subscription service due to high demand.
In a separate interview with CNBC, Microsoft Corporate Vice President for Azure Hardware Systems Rani Borkar said that Microsoft is currently testing how the Maia 100 works with its Bing AI chatbot and its GitHub Copilot coding assistant. It is also testing the chip’s ability to power OpenAI’s GPT-3.5 Turbo large language model (LLM).
Microsoft simultaneously announced the Azure Cobalt 100 CPU, an Arm-based chip intended for general-purpose computing in the cloud. The company said both chips will begin rolling out to its own data centers in early 2024.
AI race
Apart from Microsoft, several other industry heavyweights are making strides in AI chip development. Nvidia currently leads the way with its H100 Tensor Core GPUs, which are widely used across the AI industry.
In fact, Microsoft said in the same blog post that it is expanding its industry partnerships to give customers access to Nvidia’s H100 chips, as well as its upcoming H200 chips. CNBC, meanwhile, suggested that the Maia 100 will compete with the H100.
A shortage of Nvidia’s chips in the market presents a significant opportunity for other tech firms to bolster their own chip production. Reports in October said that OpenAI could begin producing chips in-house. Meta announced details of its next generation of AI chips in May, and Google announced an AI-focused chip called Cloud TPU v5e in August.