Cisco rolls out chip designed to connect AI data centers over vast distances


The logo of networking gear maker Cisco Systems Inc is seen during GSMA's 2022 Mobile World Congress (MWC) in Barcelona, Spain February 28, 2022. REUTERS/Nacho Doce

SAN FRANCISCO (Reuters) - Cisco Systems on Wednesday launched a new networking chip designed to connect artificial intelligence data centers, with the cloud computing units of Microsoft and Alibaba signing on as customers.

The P200 chip, as Cisco calls it, will compete against rival offerings from Broadcom. It will sit at the heart of a new routing device, also rolled out on Wednesday, that is designed to connect the sprawling data centers, located vast distances apart, that train AI systems.

Inside those data centers, companies such as Nvidia are connecting tens of thousands and eventually hundreds of thousands of powerful computing chips together to act as one brain to handle AI tasks.

The purpose of the new Cisco chip and router is to connect multiple data centers together to act as one massive computer.

"Now we're saying, 'the training job is so large, I need multiple data centers to connect together,'" Martin Lund, executive vice president of Cisco's common hardware group, told Reuters in an interview. "And they can be 1,000 miles apart."

The reason for those big distances is that data centers consume huge amounts of electricity, which has driven firms such as Oracle and OpenAI to Texas and Meta Platforms to Louisiana in search of gigawatts. AI firms are putting data centers "wherever you can get power," Lund said.

He did not disclose Cisco's investment in building the chip and router or sales expectations from them.

Cisco said the P200 chip replaces what used to take 92 separate chips with just one, and the resulting router uses 65% less power than comparable ones.

One of the key challenges is keeping data in sync across multiple data centers without losing any, which requires a technology called buffering that Cisco has worked on for decades.

"The increasing scale of the cloud and AI requires faster networks with more buffering to absorb bursts" of data, Dave Maltz, corporate vice president of Azure Networking at Microsoft, said in a statement. "We're pleased to see the P200 providing innovation and more options in this space."

(Reporting by Stephen Nellis in San Francisco; Editing by Muralikumar Anantharaman)
