
Tesla FSD & Autopilot Liability: Crash Claims, NHTSA Investigations & Lawsuits


Tesla’s Self-Driving Promise vs. Reality

Tesla has sold millions of vehicles with “Autopilot” and “Full Self-Driving” capabilities, charging up to $15,000 for the FSD package while CEO Elon Musk has repeatedly promised fully autonomous robotaxis “next year” since 2016. The reality has been starkly different: 65 verified Autopilot-related fatalities, $329 million in jury verdicts, multiple NHTSA investigations, a 2.4 million vehicle recall, and California DMV proceedings threatening Tesla’s license to sell vehicles in the state.

As of 2025, Tesla’s driver assistance systems remain SAE Level 2—requiring constant human supervision—despite names suggesting otherwise. The gap between marketing and reality has spawned an unprecedented wave of litigation, regulatory enforcement, and consumer protection actions.

| Figure | Metric | Context |
| --- | --- | --- |
| 65 | Verified fatalities | Autopilot-related deaths |
| $329M | Jury verdict | 2025 Florida Autopilot case |
| 2.4M | Vehicles recalled | NHTSA Autopilot investigation |
| 1,399 | Reported crashes | With driver assistance engaged |

Understanding Autopilot vs. Full Self-Driving

What Tesla Actually Offers

Despite the names, neither Autopilot nor FSD provides autonomous driving:

| Feature | SAE Level | What It Does | What It Doesn’t Do |
| --- | --- | --- | --- |
| Autopilot | Level 2 | Lane centering, adaptive cruise control | Handle intersections, stop signs, traffic lights |
| Enhanced Autopilot | Level 2 | Navigate on Autopilot, auto lane change | Drive autonomously on surface streets |
| Full Self-Driving | Level 2 | Traffic light/sign control, autosteer on city streets | Operate without constant driver supervision |

Key Point: All Tesla driver assistance features require the driver to remain attentive with hands on the wheel at all times. SAE Level 2 means the human driver is fully responsible—the vehicle provides assistance, not automation.
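
To make the level distinction concrete, here is a minimal sketch of the SAE J3016 automation levels as they are commonly summarized. The descriptions are paraphrased for illustration, not quoted from the standard:

```python
# Simplified, paraphrased summary of the SAE J3016 driving automation levels.
# Illustrative only -- not the standard's exact definitions.
SAE_LEVELS = {
    0: "No automation: the driver performs all driving tasks",
    1: "Driver assistance: steering OR speed support; the human drives",
    2: "Partial automation: steering AND speed support; the human must supervise constantly",
    3: "Conditional automation: system drives in limited conditions; human must take over on request",
    4: "High automation: system drives itself within its design domain",
    5: "Full automation: system drives anywhere a human could",
}

# Every Tesla feature in the table above is Level 2.
print(SAE_LEVELS[2])
```

Note where the responsibility line falls: at Levels 0-2 the human is always the driver; only at Level 3 and above does the system itself take on the driving task.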

The Naming Problem

The California DMV has charged that Tesla’s use of terms like “Autopilot” and “Full Self-Driving Capability” creates false impressions about vehicle capabilities:

  • “Autopilot” — Suggests aircraft-style automation where the pilot can disengage
  • “Full Self-Driving” — Explicitly promises complete autonomous operation
  • Reality — Level 2 assistance requiring constant human supervision

As UC Berkeley Professor Scott Moura explained: “Tesla has a technology product branded as FSD, which is ‘fully self-driving.’ There are levels of automated driving: 1-2-3-4-5. But their FSD tech corresponds to Level 2, not Level 5. Thus, one can argue that it is misleading.”


Fatal Crashes and NHTSA Investigations

The Fatality Record

As of late 2025, there have been 65 verified fatalities involving Tesla Autopilot or FSD, including 54 confirmed by NHTSA investigations or expert testimony and 2 specifically attributed to FSD engagement after 2022.

Timeline of Notable Fatal Crashes:

| Date | Location | System | Details |
| --- | --- | --- | --- |
| May 2016 | Williston, FL | Autopilot | First known Autopilot fatality; Model S struck semi-trailer |
| March 2018 | Mountain View, CA | Autopilot | Apple engineer Walter Huang killed; vehicle struck barrier |
| March 2019 | Delray Beach, FL | Autopilot | Model 3 drove under semi-trailer; driver killed |
| April 2019 | Key Largo, FL | Autopilot | Pedestrian killed; led to $329M verdict |
| November 2023 | Rimrock, AZ | FSD | Pedestrian fatally struck by Model Y with FSD engaged |

NHTSA Investigation: 2.4 Million Vehicles (October 2024)

In October 2024, NHTSA opened a preliminary evaluation (PE24031) covering approximately 2.4 million Tesla vehicles after identifying four FSD-related crashes in low-visibility conditions.

Vehicles Covered:

  • 2016-2024 Model S
  • 2016-2024 Model X
  • 2017-2024 Model 3
  • 2020-2024 Model Y
  • 2023-2024 Cybertruck

NHTSA’s Focus: The agency is investigating whether FSD’s engineering controls “react appropriately to reduced roadway visibility conditions” such as:

  • Sun glare
  • Fog
  • Airborne dust

One crash resulted in a pedestrian fatality (Rimrock, Arizona, November 2023); another caused serious injury.

The Autopilot Crash Pattern

NHTSA’s comprehensive investigation documented concerning patterns:

From the Standing General Order Database (July 2021 - October 2024):

  • 1,399 crashes with Tesla driver assistance engaged within 30 seconds of collision
  • 31 fatalities in those crashes
  • 956 crashes documented between January 2018 and August 2023
  • 29 deaths in that subset
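
These figures come from NHTSA’s public Standing General Order (SGO) incident-report files, which anyone can download and tally. Below is a minimal sketch of such a tally in Python, assuming pandas is installed; the file name and the “Make” column are assumptions, so verify them against the data dictionary NHTSA posts alongside the download:

```python
import pandas as pd

# NHTSA publishes Standing General Order ADAS incident reports as CSV.
# The file name and column names below are assumptions for illustration;
# check the data dictionary that accompanies the download.
df = pd.read_csv("SGO-2021-01_Incident_Reports_ADAS.csv")

# Keep Tesla reports only.
tesla = df[df["Make"].str.strip().str.upper() == "TESLA"]
print(f"Tesla ADAS incident reports: {len(tesla)}")
```

Two caveats: the SGO database counts reports submitted to NHTSA, not independently verified crashes, and a single crash can generate more than one report.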

Emergency Vehicle Collision Pattern: NHTSA specifically investigated Autopilot crashes into stationary emergency vehicles:

  • At least 16 incidents since 2018
  • First responders on scene with lights active
  • Vehicles drove directly into police cars, fire trucks, ambulances

Major Recalls and OTA Software Liability

The 2 Million Vehicle Recall (December 2023)

Tesla’s largest-ever recall affected over 2 million vehicles after NHTSA determined Autopilot’s driver monitoring was insufficient.

Recall Details (NHTSA Recall No. 23V-838):

| Element | Details |
| --- | --- |
| Vehicles affected | 2012-2023 Model S, 2016-2023 Model X, 2017-2023 Model 3, 2020-2023 Model Y |
| Total units | 2,031,220 |
| Issue | Inadequate driver attention monitoring; system allows misuse |
| Remedy | Over-the-air software update |

What the Update Changed:

  • Increased alert text size
  • Added setting for single-tap Autopilot activation
  • Implemented five-strike penalty disabling Autopilot for ignored warnings
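
The “five-strike” mechanism is essentially a demerit counter: each ignored attention warning adds a strike, and accumulating five suspends the feature. Here is a minimal sketch of that kind of policy, purely illustrative; the class and trigger below model the publicly described behavior, not Tesla’s actual code:

```python
# Illustrative strike policy for an ADAS attention-monitoring system.
# Models only the publicly described behavior; not Tesla's implementation.
MAX_STRIKES = 5

class StrikePolicy:
    def __init__(self) -> None:
        self.strikes = 0

    def record_ignored_warning(self) -> None:
        """Register one ignored hands-on-wheel warning."""
        self.strikes += 1
        if self.strikes >= MAX_STRIKES:
            self.suspend_feature()

    def suspend_feature(self) -> None:
        print("Autopilot suspended after repeated ignored warnings")

policy = StrikePolicy()
for _ in range(MAX_STRIKES):
    policy.record_ignored_warning()  # the fifth call triggers suspension
```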

The OTA Update Problem

Tesla’s remedy was an over-the-air software update—but this created new issues:

Reported Problems After Update:

  • Autopilot hardware failures during update attempts
  • Vehicles stuck in update cycles for 72+ hours
  • Complete disabling of ADAS features
  • Battery drain from failed update loops

Legal Implications of OTA Updates: Tesla’s use of OTA updates raises novel liability questions:

  1. Is an OTA update a sufficient “fix”? — NHTSA has continued investigating whether Tesla’s software remedies actually resolve safety defects
  2. Who is liable when updates fail? — Vehicle owners have experienced new failures caused by recall updates
  3. Can software changes eliminate physical defects? — Critics argue sensor limitations require hardware fixes

Subsequent FSD Recalls

February 2023 (23V-085): 362,758 vehicles recalled for FSD Beta issues including:

  • Rolling through stop signs when the “rolling stop” setting was enabled
  • Speed limit violations in school zones
  • Improper lane changes

These patterns suggest systemic software design issues rather than isolated bugs.


Landmark Verdicts and Settlements

$329 Million Florida Verdict (August 2025)

A Miami jury delivered a historic verdict against Tesla—the first to find Autopilot defective.

Product Liability / Wrongful Death

Benavides v. Tesla (Florida Autopilot)

$329M
Jury Verdict

22-year-old pedestrian Naibel Benavides Leon killed when driver George McGee’s Model S struck her on Card Sound Road with Autopilot engaged in April 2019. Jury found Tesla 33% liable, awarded $129M compensatory damages plus $200M punitive. Tesla ordered to pay $243M total. First jury to find Autopilot system defective.

Key Largo, FL 2025
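
The gap between the $329M verdict and the $243M Tesla was ordered to pay follows from comparative fault: Tesla owes its 33% share of the compensatory award plus the entire punitive award. A rough reconstruction, with the reported figures rounded:

$$0.33 \times \$129\text{M} \approx \$42.6\text{M}, \qquad \$42.6\text{M} + \$200\text{M} \approx \$243\text{M}$$
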
Wrongful Death / Fraudulent Misrepresentation

Mendoza-Martinez v. Tesla

Pending
Litigation Ongoing

Family of 31-year-old Genesis Giovanni Mendoza-Martinez suing Tesla for “fraudulent misrepresentation” after a fatal 2023 collision in his Model S. Brother Caleb seriously injured. Plaintiffs allege Tesla misled consumers about Autopilot/FSD capabilities.

Walnut Creek, CA 2024

Key Findings from the Florida Verdict

The jury specifically found:

  1. Tesla allowed Autopilot on unsafe roads — The system could be activated anywhere, not just controlled-access highways
  2. Inadequate driver monitoring — The system didn’t sufficiently ensure driver attention
  3. Musk overhyped capabilities — Marketing encouraged excessive trust in the system

Plaintiffs’ attorney argument: “Tesla designed Autopilot only for controlled access highways yet deliberately chose not to restrict drivers from using it elsewhere.”

Other Significant Cases

About a dozen active lawsuits focus on similar claims involving Autopilot or FSD engagement before fatal or injurious crashes. Tesla has won some cases by successfully arguing driver responsibility, but the Florida verdict signals a potential shift in how juries view Tesla’s liability.


Consumer Class Actions

FSD Class Action Certified (August 2025)

U.S. District Judge Rita Lin certified a class action allowing Tesla customers to sue collectively for FSD misrepresentation.

Case Details:

  • Title: In re Tesla Advanced Driver Assistance Systems Litigation
  • Court: N.D. California
  • Allegation: Tesla and Musk misled customers for nearly eight years about FSD capabilities

Two Certified Classes:

  1. May 2017 - July 2024: Customers who purchased FSD but opted out of arbitration
  2. October 2016 - May 2017: Earlier purchasers not bound by arbitration

Key Ruling: Judge Lin found Tesla’s claim that all post-2016 vehicles contained hardware necessary for full autonomy was widespread enough to affect California purchasers and could form the basis for class-wide fraud claims. The court rejected Tesla’s argument that its disclaimers neutralized its promotional statements.

What Plaintiffs Seek:

  • Refunds for FSD purchases and subscriptions
  • Injunction stopping Tesla from making similar claims

Shareholder Securities Fraud Suit

Separately, Tesla shareholders have filed a securities fraud class action in the Western District of Texas, alleging Tesla and Musk concealed testing data showing FSD capabilities were overstated and potentially dangerous.


California DMV Deceptive Marketing Proceedings

The Stakes

In July 2025, a five-day administrative trial began in Oakland where the California DMV seeks to suspend Tesla’s license to sell vehicles in the state for at least 30 days.

DMV’s Allegations

The California DMV filed formal administrative charges in July 2022 under California Vehicle Code provisions banning misleading marketing of partially automated features:

  • “Autopilot” implies aircraft-style automation
  • “Full Self-Driving Capability” explicitly claims autonomous operation
  • Both names violate state consumer protection laws

From DMV’s July 2025 trial brief: “Tesla’s use of the terms ‘Autopilot’ and ‘Full Self-Driving Capability’ creates a false impression about the level of automation in its vehicles. While Tesla drivers must remain fully engaged and attentive at all times, the company’s advertising implies otherwise.”

Tesla’s Defenses

Tesla has argued:

  1. First Amendment protection — Marketing claims are protected speech
  2. Disclaimers suffice — Tesla’s warnings already tell drivers that active supervision is required
  3. Aspirational language — “Self-driving” describes the goal, not current capability
  4. Implicit state approval — California allowed these names for years

Potential Consequences

If the DMV prevails:

  • 30-day sales suspension minimum
  • Restitution to affected consumers
  • Potential license revocation for continued violations

Market Impact: California accounts for roughly one-third of U.S. EV sales and is Tesla’s largest U.S. market.


Legal Theories for FSD/Autopilot Claims

Product Liability

Design Defect:

  • Autopilot allows engagement on roads it wasn’t designed for
  • FSD fails to handle foreseeable conditions (sun glare, fog, dust)
  • Driver monitoring insufficient to prevent misuse
  • Camera-only approach lacks redundancy other AVs use (lidar, radar)

Manufacturing Defect:

  • Individual vehicles with sensor malfunctions
  • Hardware quality issues affecting ADAS performance

Failure to Warn:

  • Marketing overstates capabilities
  • “Autopilot” and “FSD” names mislead consumers
  • Warnings inadequate to counter aggressive marketing
  • In-car alerts insufficient for the level of attention required

Negligence

Negligent Design:

  • Deploying systems known to fail in certain conditions
  • Allowing use outside intended operational design domain
  • Inadequate testing before public release

Negligent Marketing:

  • CEO statements creating unreasonable consumer expectations
  • Social media posts suggesting imminent full autonomy
  • Failure to correct public misconceptions

Consumer Fraud

Misrepresentation:

  • Promising capabilities the system doesn’t deliver
  • Charging $15,000 for features that may never materialize
  • “Hardware 3.0” promises never fulfilled

Unfair Business Practices:

  • Violation of state consumer protection laws
  • California’s Consumers Legal Remedies Act
  • State unfair competition statutes

OTA Update Liability

Novel Questions:

  • When a software update causes new problems, who is liable?
  • Can an OTA update satisfy recall obligations for hardware-related defects?
  • Does removing features via OTA create product liability exposure?
  • What duty of care applies to software patches affecting safety systems?

Evidence Preservation After a Tesla FSD/Autopilot Crash

Critical First Steps

  1. Do NOT allow Tesla remote access to modify the vehicle
  2. Do NOT accept OTA updates that might overwrite data
  3. Photograph the vehicle display showing Autopilot/FSD status
  4. Note the software version displayed in the vehicle
  5. Preserve your Tesla account including purchase records

Essential Evidence

| Evidence Type | Why It Matters |
| --- | --- |
| Vehicle EDR data | Event Data Recorder captures pre-crash system status |
| Autopilot engagement logs | Proves system was active at time of crash |
| Camera footage | Tesla’s cameras may have captured the incident |
| Software version | Identifies specific bugs and recall status |
| OTA update history | Shows what patches were/weren’t installed |
| Marketing materials | Documents what Tesla promised when you purchased |
| Purchase agreement | FSD price paid, features promised |

Spoliation Concerns

Tesla vehicles constantly upload data to Tesla’s servers and can receive remote software updates. Send preservation letters immediately through an attorney to:

  • Tesla, Inc. legal department
  • Tesla Service Center that maintains the vehicle
  • Any towing or storage companies holding the vehicle

Critical: Request that Tesla disable remote access and OTA updates pending litigation.






Injured in a Tesla Autopilot or FSD Crash?

If you or a loved one was injured in a crash involving Tesla’s Autopilot or Full Self-Driving system, you may have claims for product liability, negligence, and consumer fraud. The $329 million Florida verdict shows juries are holding Tesla accountable. Connect with attorneys experienced in Tesla ADAS litigation.

Get Free Consultation
