Lightmatter shows new type of computer chip that could reduce AI energy use


Silicon Valley startup Lightmatter's Envise chip, which uses light rather than electrons to carry out computations, is seen in an undated handout photo provided on April 8, 2025. Courtesy Lightmatter/Handout via REUTERS

SAN FRANCISCO (Reuters) - Silicon Valley startup Lightmatter revealed on Wednesday that it had developed a new type of computer chip that could both speed up artificial intelligence work and use less electricity in the process.

Valued at $4.4 billion after raising $850 million in venture capital, Lightmatter is one of a number of companies seeking to use beams of light, rather than electronic signals, to move data around more quickly between computers. Those connection speeds are critical for artificial intelligence because the software is so complex that it must be spread over many computers.

But Lightmatter also believes that it can use beams of light to carry out the computation itself, which was the focus of a paper it published in the scientific journal Nature on Wednesday. Conventional computers use transistors, which are akin to tiny on-off electrical switches, and gain more computing power by making transistors smaller and cramming more onto a chip.

In recent years, the chip industry has struggled with shrinking those transistors. Lightmatter's chip skips those problems by steering carefully calibrated beams of light into one another and measuring the results with an integrated package of chips made at its manufacturing partner GlobalFoundries.

Previous photonic computers struggled to compute with precision, meaning that if the outcome of a computation was a very small number, the chip might report the answer as a zero. Lightmatter gets around that by breaking up very big and very small numbers into groups before sending them through the photonic circuits so that very small numbers do not get lost.
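The grouping idea can be loosely illustrated in software. The sketch below shows per-block scaling, a technique akin to block floating point: each group of numbers is divided by its own largest magnitude before processing, so small values in a group are not flushed to zero, then multiplied back afterward. This is only an illustration of the general principle, not a description of Lightmatter's actual circuitry; the function names and block size are invented for the example.

```python
import numpy as np

def blockwise_scale(values, block_size=4):
    """Split values into blocks and normalize each block by its own
    largest magnitude, so that small numbers within a block survive
    low-precision processing instead of being reported as zero."""
    blocks = [values[i:i + block_size] for i in range(0, len(values), block_size)]
    scaled = []
    for block in blocks:
        scale = float(np.max(np.abs(block)))
        if scale == 0.0:
            scale = 1.0  # avoid dividing an all-zero block by zero
        scaled.append((block / scale, scale))
    return scaled

def blockwise_restore(scaled):
    """Reassemble the original values by multiplying each block back
    by the scale that was divided out."""
    return np.concatenate([block * scale for block, scale in scaled])
```

A round trip through these two functions reproduces the input, while inside each block every value is brought into a comparable range before the (simulated) low-precision step.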

Nick Harris, Lightmatter's CEO, told Reuters on April 8 that the result is a chip that can work on some current AI problems with the same precision as conventional chips, though he said it will likely be a decade before the technology goes mainstream.

"What we're doing is looking at the future of where processors can go. We fundamentally care about computers, and this is one of the alternative paths. There's trillions of dollars of economic value that's behind the idea that computers will keep getting better," Harris said.

(Reporting by Stephen Nellis in San Francisco; Editing by Stephen Coates)

