Microsoft rolls out next generation of its AI chips, takes aim at Nvidia's software


Microsoft logo is seen near computer motherboard in this illustration taken January 8, 2024. REUTERS/Dado Ruvic/Illustration

SAN FRANCISCO, Jan 26 (Reuters) - Microsoft on Monday unveiled the second generation of its in-house artificial intelligence chip, along with software tools that take aim at one of Nvidia's biggest competitive advantages with developers.

The new "Maia 200" chip comes online this week in a data center in Iowa, with plans for a second location in Arizona, Microsoft said. It is the second generation of an AI chip called Maia that Microsoft introduced in 2023.

The Maia 200 comes as major cloud computing firms such as Microsoft, Alphabet's Google and Amazon.com's Amazon Web Services - some of Nvidia's biggest customers - are producing their own chips that increasingly compete with Nvidia.

Google, in particular, has garnered interest from major Nvidia customers such as Meta Platforms, which is working closely with Google to close one of the biggest software gaps between Google and Nvidia's AI chip offerings.

For its part, Microsoft said that along with the new Maia chip, it will be offering a package of software tools to program it. That includes Triton, an open-source software tool with major contributions from ChatGPT creator OpenAI that takes on the same tasks as CUDA, the Nvidia software that many Wall Street analysts say is Nvidia's biggest competitive advantage.

Like Nvidia's forthcoming flagship "Vera Rubin" chips introduced earlier this month, Microsoft's Maia 200 is made by Taiwan Semiconductor Manufacturing Co using 3-nanometer chipmaking technology and will use high-bandwidth memory chips, albeit an older and slower generation than Nvidia's forthcoming chips.

But Microsoft has also taken a page from the playbook of some of Nvidia's rising competitors by packing the Maia 200 chip with a significant amount of what is known as SRAM, a type of memory that can provide speed advantages for chatbots and other AI systems when they field requests from a large number of users.

Cerebras Systems, which recently inked a $10 billion deal with OpenAI to supply computing power, leans heavily on that technology, as does Groq, the startup that Nvidia licensed technology from in a reported $20 billion deal.

(Reporting by Stephen Nellis in San Francisco; Editing by Jamie Freed)
