Cisco rolls out chip designed to connect AI data centers over vast distances


The logo of networking gear maker Cisco Systems Inc is seen during GSMA's 2022 Mobile World Congress (MWC) in Barcelona, Spain February 28, 2022. REUTERS/Nacho Doce

SAN FRANCISCO (Reuters) - Cisco Systems launched a new networking chip on Wednesday designed to connect artificial intelligence data centers, with the cloud computing units of Microsoft and Alibaba signing on as customers for the chip.

The P200 chip, as Cisco calls it, will compete against rival offerings from Broadcom. It will sit at the heart of a new routing device, also rolled out on Wednesday, that is designed to connect the sprawling data centers located vast distances apart that train AI systems.

Inside those data centers, companies such as Nvidia are connecting tens of thousands, and eventually hundreds of thousands, of powerful computing chips together to act as one brain to handle AI tasks.

The purpose of the new Cisco chip and router is to connect multiple data centers together to act as one massive computer.

"Now we're saying, 'the training job is so large, I need multiple data centers to connect together,'" Martin Lund, executive vice president of Cisco's common hardware group, told Reuters in an interview. "And they can be 1,000 miles apart."

The reason for those big distances is that data centers consume huge amounts of electricity, which has driven firms such as Oracle and OpenAI to Texas and Meta Platforms to Louisiana in search of gigawatts. AI firms are putting data centers "wherever you can get power," Lund said.

He did not disclose Cisco's investment in building the chip and router or sales expectations from them.

Cisco said the P200 chip replaces what used to take 92 separate chips with just one, and the resulting router uses 65% less power than comparable ones.

One of the key challenges is keeping data in sync across multiple data centers without losing any, which requires a technology called buffering that Cisco has worked on for decades.

"The increasing scale of the cloud and AI requires faster networks with more buffering to absorb bursts" of data, Dave Maltz, corporate vice president of Azure Networking at Microsoft, said in a statement. "We're pleased to see the P200 providing innovation and more options in this space."

(Reporting by Stephen Nellis in San Francisco; Editing by Muralikumar Anantharaman)
