NHTSA Opens Investigation Into Nearly 3 Million Teslas Over Full Self-Driving Safety Concerns

By Aayush

The U.S. National Highway Traffic Safety Administration (NHTSA) has launched a formal investigation into 2.88 million Tesla vehicles equipped with the company’s Full Self-Driving (FSD) system, following dozens of reports of traffic violations and several crashes, according to Reuters.

FSD is meant to assist drivers rather than replace them, but recent incidents suggest the system may not always behave safely. Reports describe Teslas running red lights, steering into oncoming lanes, or ignoring road signs and signals.

A separate report from The Washington Post revealed that regulators are reviewing 58 complaints, which include 14 crashes and 23 injuries. Of particular concern are six collisions where Teslas allegedly drove through red lights and hit other vehicles. The NHTSA is also examining how FSD performs near railroad crossings, following multiple near-miss reports brought to lawmakers’ attention.

Tesla has recently rolled out a software update for FSD, though details remain limited. The agency’s current probe is classified as a preliminary evaluation, the first step in determining whether the system poses a safety risk. Depending on the findings, the review could eventually lead to a recall or mandatory software changes.

Following the news, Tesla’s stock dropped 2.1%.

Why It Matters


This investigation highlights growing scrutiny of advanced driver-assistance technologies — and the confusion between “assisted” and “autonomous” driving.

Despite its branding, Tesla’s Full Self-Driving feature does not make the car fully autonomous. The company explicitly instructs owners to keep their hands on the wheel and stay alert at all times.

Still, the number of reported safety violations is fueling debate over whether these features were deployed prematurely and whether current regulations are strong enough to keep roads safe.

What It Means for Drivers

If you own a Tesla, this investigation could lead to recalls, software restrictions, or changes to how FSD behaves. The company may be required to dial back some features or make the system more conservative in traffic situations.

For everyone else, it’s a reminder that driver-assist does not mean driver-free. Until full autonomy is proven safe, the human behind the wheel remains responsible for every decision.

What Happens Next

NHTSA’s review could escalate into a broader recall campaign or tighter federal oversight of self-driving technology. Tesla may face pressure to modify FSD’s behavior — particularly its approach to traffic signals and intersections.

In the meantime, drivers should expect ongoing software tweaks, feature pauses, or new safety prompts. The message from regulators is clear: self-driving cars aren’t ready to drive themselves just yet.

So, buckle up — the future of automated driving may be arriving faster than expected, but the road to true autonomy still has a few sharp turns ahead.
