Tesla driver testifies Autopilot failed to prevent fatal crash


A photo shows the accident scene after a Tesla Model S crashed into a parked vehicle in Key Largo, Florida, in 2019. The driver of a Tesla car that killed a woman in 2019 testified in federal court on Monday, July 21, 2025, that the company’s Autopilot driver-assistance system failed to warn him of an impending accident or engage the brakes. — Monroe County Sheriff's Department via The New York Times

MIAMI: The driver of a Tesla car that killed a woman in 2019 testified July 21 in federal court that the company’s Autopilot driver-assistance system failed to warn him of an impending accident or engage the brakes.

The driver, George Brian McGee, was driving his new Tesla Model S on a dark, two-lane road in South Florida when his phone fell to the floor and he bent to find it. That’s when he failed to see that the road was ending in a T-intersection and that an SUV was parked on the other side, with two people standing next to that car.

Neither he nor Autopilot hit the brakes, and the Tesla crashed into the SUV at 62 mph (99 km/h), killing a 22-year-old woman and gravely injuring her boyfriend.

In a civil case in federal court in Miami, McGee said on the witness stand that he was responsible for keeping his eyes on the road even with Autopilot engaged. But he also said he had been relying on Tesla’s semi-automated driving system to serve as his co-pilot, and thought it had the ability to avoid such a crash.

“I thought it would assist me if I made a mistake,” said McGee, 48, a partner in a Florida private equity firm. “It didn’t warn me of the car and the individuals and hit the brakes.”

The case, in the US District Court for the Southern District of Florida, was filed by the family of the woman killed in the crash, Naibel Benavides, and her companion, Dillon Angulo. The plaintiffs are seeking unspecified damages from Tesla and aim to convince the jury that Tesla was partly responsible for the crash.

The case claims Autopilot has defects that kept the car from braking or warning McGee of the collision. The plaintiffs also contend that the system’s design is flawed because it allows drivers to become distracted.

The judge in the case, Beth Bloom, previously ruled that the plaintiffs could seek punitive damages against Tesla, saying in a recent order that “a reasonable jury could find that Tesla acted in reckless disregard of human life for the sake of developing their product and maximising profit.”

The case represents a considerable risk to Tesla. The automaker and its CEO, Elon Musk, have built Tesla’s brand on the idea that its cars are nearly capable of driving themselves. Tesla offers an advanced version of Autopilot that it calls Full Self-Driving and last month started trials of a limited autonomous taxi service in Austin, Texas.

A loss in this case could dent Tesla’s reputation and hurt its sales and stock price, at least in the short term, said Sam Fiorani, an analyst at AutoForecast Solutions, a market research firm.

“All of the stock value in the company is based on the future and the future is autonomous,” Fiorani said.

The automaker’s car sales have been falling in recent months partly because of a backlash against Musk, who has become a leading supporter of conservative political parties around the world. He was also one of President Donald Trump’s closest advisers and donors until the two men fell out recently.

In court, Tesla lawyers have argued that McGee was solely responsible for the crash. “He’s rummaging around for his phone and he runs through the intersection,” Joel H. Smith, of Bowman and Brooke, said in his opening statement. “This can happen in any car, at any time. This is not about Autopilot.”

Court documents and other testimony have revealed that McGee had his foot on the accelerator pedal just before the accident. That pushed his car’s speed to 62 mph (99 km/h), above the 45 mph (72 km/h) limit that Autopilot would normally enforce on the road where the crash took place, Card Sound Road near Key Largo. Pressing the accelerator also overrode the part of Autopilot that is able to brake when it detects obstacles or other vehicles.

The plaintiffs have presented videos from the car that showed that the Autopilot system identified the parked vehicle, the end of the road and Angulo but did not activate the brakes. Expert witnesses have also told the jury that the car was equipped with two other systems that were capable of slowing or stopping the car.

One, called automatic emergency braking, is standard on most vehicles sold in the United States and is supposed to brake even if the accelerator is depressed. McGee’s vehicle had a third system that is supposed to stop the car if it determines that the vehicle is about to leave the roadway. The SUV was parked on a gravelly area that Autopilot had marked as outside of “driveable space,” video from his car showed.

Autopilot can be activated on Card Sound Road, although Tesla owners’ manuals say the system should not be used on such undivided roads. General Motors and Ford Motor Co. offer similar systems that work only on divided highways, so they cannot be engaged on Card Sound Road.

“My professional opinion is that Tesla Autopilot is defective because it allows it to be operated in domains it wasn’t designed for,” Mary Cummings, a George Mason University professor who is an expert in autonomous driving systems and for a time worked at the National Highway Traffic Safety Administration, testified last week as a witness for the plaintiffs.

Part of the plaintiffs’ case also focuses on what they said was Tesla’s ineffective way of getting drivers to pay attention to the road while they were using Autopilot. McGee’s 2019 Model S monitored whether he was paying attention simply by requiring him to touch the steering wheel, sometimes only very briefly. But it could not tell if he was looking at the road, and at times his Autopilot system continued operating even when his hands were off the steering wheel for several minutes.

Tesla’s means of monitoring driver behavior “cannot prevent misuse” and represent a “crucial safety gap” in Autopilot, Cummings said in her testimony.

In 2023, Tesla issued a recall of all Autopilot-equipped vehicles to make its driver monitoring system safer.

An earlier civil suit filed by the plaintiffs against McGee was settled. The parties have not disclosed the terms of that deal. – ©2025 The New York Times Company

This article originally appeared in The New York Times.
