Tesla Held Partly Responsible in Fatal Autopilot Crash, Ordered to Pay $242.5 Million

A federal jury in Miami has found Tesla partially liable for a 2019 crash involving its Autopilot driver-assistance system, marking a rare legal defeat for the company over its controversial self-driving tech.

The incident resulted in the death of 20-year-old Naibel Benavides Leon and serious injuries to her boyfriend, Dillon Angulo. Following the three-week trial, jurors concluded that both the driver and Tesla shared responsibility. The driver, who was sued separately, was deemed two-thirds at fault, while Tesla was assigned one-third of the blame.

The total damages awarded in the case—including both punitive and compensatory damages—amount to approximately $242.5 million, according to CNBC.

What Happened

According to court testimony, neither the human driver nor Tesla’s Autopilot system applied the brakes before the car entered an intersection and collided with an SUV, killing Leon. The trial raised key questions about how Tesla’s technology is marketed and used in real-world settings.

Plaintiffs’ Argument: A Misleading System

Brett Schreiber, lead attorney for the plaintiffs, argued that Tesla knowingly allowed Autopilot to operate on roads it wasn’t designed for, despite public claims—especially from CEO Elon Musk—that the system performed better than a human driver.

“Tesla’s lies turned our roads into test tracks for their fundamentally flawed technology,” said Schreiber. “This verdict is a step toward holding Tesla and Musk accountable for putting lives at risk to maintain a sky-high valuation built on self-driving hype.”

Tesla’s Response: Plans to Appeal

Tesla has rejected the jury’s findings and announced plans to appeal the decision. In a statement to TechCrunch, the company said:

“Today’s verdict is wrong and sets a dangerous precedent. No vehicle in 2019, or today, could have prevented this crash. This wasn’t about Autopilot—it was a story constructed by the plaintiffs’ attorneys to shift blame away from the driver.”

Tesla argues that the driver involved accepted responsibility from the outset, and that the case misrepresents what Autopilot was designed to do.

Autopilot Under Scrutiny

This case is one of the first major jury decisions to go against Tesla over its driver-assistance features. While the company has settled similar lawsuits in the past, this public trial highlighted a long-standing issue: driver overreliance on semi-automated systems.

Back in 2020, the National Transportation Safety Board (NTSB) faulted Tesla for not addressing safety concerns following a separate fatal crash in 2018. In that case, the driver was using Autopilot while playing a mobile game before hitting a barrier. The NTSB accused Tesla of ignoring its safety recommendations.

Even Elon Musk has acknowledged the risks of driver complacency. On a 2018 conference call, he admitted:

“They just get too used to it… It’s not that they don’t understand Autopilot’s limits—it’s that they think they know more than they do.”

What’s Next for Tesla

The ruling comes just as Tesla begins testing its long-promised Robotaxi program in Austin, Texas, powered by a more advanced version of its Full Self-Driving software. This technology, still in development, promises to eliminate the need for a human driver entirely—though it remains a lightning rod for regulatory scrutiny and public concern.

As Tesla continues pushing the boundaries of automated driving, this case serves as a reminder that legal accountability is evolving right alongside the technology.
