When Self-Driving Cars Crash#
Autonomous vehicles were supposed to eliminate human error from driving. Instead, they’ve created entirely new categories of accidents—and complex legal questions about who bears responsibility when software, sensors, and algorithms fail.
From Waymo robotaxis blocking emergency vehicles to Cruise vehicles dragging pedestrians to Tesla Autopilot crashes killing drivers, the promise of autonomous safety has collided with a troubling reality. Since NHTSA began requiring crash reports in 2021, more than 1,100 incidents involving automated driving systems have been documented. Behind each statistic is a victim navigating unprecedented legal terrain.
The AV Accident Landscape#
NHTSA Crash Reporting Data#
The National Highway Traffic Safety Administration’s Standing General Order requires manufacturers and operators to report crashes involving automated driving systems (ADS) and advanced driver assistance systems (ADAS).
Reporting Requirements:
- Crashes must be reported if the ADS was engaged at any time within 30 seconds before the crash
- Required when crashes result in property damage, injury, or fatality
- Violations carry civil penalties of up to $27,874 per violation per day
- Maximum penalty of approximately $139 million for a related series of violations
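For a sense of scale, here is a minimal arithmetic sketch of how these penalties accumulate. The per-day rate and the cap are the figures cited above; the violation count and duration are hypothetical.

```python
# Sketch of SGO penalty accumulation. Rate and cap are the figures
# cited above; the scenario (3 violations, 90 days) is hypothetical.
DAILY_RATE = 27_874        # dollars per violation per day
SERIES_CAP = 139_000_000   # approximate cap for a related series

def penalty(violations: int, days: int) -> int:
    """Accumulated penalty, capped at the statutory maximum."""
    return min(violations * days * DAILY_RATE, SERIES_CAP)

print(penalty(3, 90))  # 7525980 -- about $7.5 million
```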
Key Statistics (July 2021 - March 2025):
| Metric | Figure |
|---|---|
| Total ADS crashes reported | 1,100+ |
| Crashes June 2024 - March 2025 | 570 |
| Peak monthly incidents (Dec 2024) | 81 |
| California’s share | 60%+ (761 incidents) |
| Arizona incidents | 269 |
| Texas incidents | 103 |
Major Players and Incident Patterns#
Waymo:
- Operates robotaxis in San Francisco, Phoenix, Los Angeles, Austin
- Robotaxis blocked emergency vehicles 66 times in San Francisco (2023)
- Generally lower injury rates but significant near-miss incidents
- Expanding to new markets including Atlanta and Miami
Cruise (GM):
- Operations suspended after October 2023 pedestrian dragging incident
- $8-12 million settlement with dragging victim
- $500,000 federal fine for false reporting to investigators
- $1.5 million NHTSA penalty; $112,500 California fine
- GM cut $1 billion from Cruise budget; leadership replaced
Tesla Autopilot/FSD:
- 1,033 crashes involving Full Self-Driving (2020-2024)
- 54 verified fatalities involving Autopilot
- 2.4 million vehicles under FSD investigation (opened October 2024)
- Multiple recalls, including December 2023 Autopilot recall
- Ongoing litigation over “fraudulent misrepresentation” of capabilities
Landmark Cases and Settlements#
The Cruise Pedestrian Dragging Incident#
What Happened (October 2, 2023):
A pedestrian crossing the street in San Francisco was first struck by a hit-and-run driver, then thrown into the path of a Cruise robotaxi. The Cruise vehicle stopped after running over the pedestrian—but its detection system failed to recognize a person was underneath. The vehicle then attempted to pull over to the side of the road, dragging the woman over 20 feet.
The victim was hospitalized in critical condition with severe injuries.
What Went Wrong:
- Cruise’s detection system failed to identify the pedestrian under the vehicle
- The “pullover” maneuver was inappropriate given the situation
- Cruise initially provided incomplete information to regulators
- Internal communications revealed executives knew of the dragging sooner than disclosed
Consequences:
- Settlement of $8-12 million with the victim
- $500,000 federal fine for filing a false report to influence a federal investigation
- $1.5 million NHTSA penalty
- $112,500 California PUC fine
- California DMV suspended Cruise’s driverless permit
- CEO and multiple executives resigned or were fired
- Operations suspended; GM cut $1 billion from budget
Tesla Autopilot Fatality Lawsuit
The family of Genesis Giovanni Mendoza-Martinez is suing Tesla for “fraudulent misrepresentation” of Autopilot after a fatal 2023 collision in a Model S; his brother Caleb was seriously injured. The plaintiffs allege Tesla misled consumers about the system’s capabilities. Tesla moved the case from state to federal court.
Tesla Autopilot and FSD Crashes#
NHTSA Investigation (October 2024):
NHTSA opened a preliminary evaluation covering 2.4 million Tesla vehicles after four FSD-related crashes, including one pedestrian fatality in Arizona. The agency found FSD’s engineering controls failed to “react appropriately to reduced roadway visibility conditions” including sun glare, fog, and airborne dust.
Pattern of Incidents:
- 467 crashes involving Autopilot identified in an earlier NHTSA investigation
- 54 injuries and 14 deaths in those crashes
- 2 verified fatalities during FSD engagement (post-2022)
- Crashes often involve distracted drivers relying excessively on automation
Tesla FSD Pedestrian Fatality
A pedestrian was killed after being struck by a 2021 Tesla Model Y operating with Full Self-Driving engaged. The November 2023 incident prompted the October 2024 NHTSA investigation covering 2.4 million Tesla vehicles, which found that FSD failed to react appropriately to reduced-visibility conditions.
Liability Framework for AV Accidents#
Who Can Be Held Liable?#
AV accidents often involve multiple potentially liable parties:
Vehicle Manufacturer:
- Traditional product liability for vehicle defects
- Liability for integrating AV systems into vehicles
- Failure to warn of automation limitations
AV Software Developer:
- Design defects in autonomous driving algorithms
- Sensor fusion and perception system failures
- Inadequate testing before deployment
- Failure to address known software bugs
Fleet Operator (Robotaxi Companies):
- Negligent deployment in unsafe conditions
- Inadequate vehicle maintenance
- Failure to respond to incident patterns
- Negligent supervision of operations
Safety Driver (If Present):
- Failure to monitor and intervene appropriately
- Distraction while responsible for vehicle
- Violation of training protocols
Third-Party Service Providers:
- Negligent maintenance of AV systems
- Improper sensor calibration
- Software update errors
Product Liability Theories#
Design Defect:
- AV system architecture creates unreasonable risk
- Sensor coverage inadequate for operating conditions
- Software fails to handle foreseeable scenarios
- Human-machine interface causes confusion
Manufacturing Defect:
- Specific sensors or components don’t meet specifications
- Assembly errors affecting AV systems
- Quality control failures
Failure to Warn:
- Misleading marketing about AV capabilities
- Inadequate warnings about system limitations
- Failure to communicate when human intervention required
- Insufficient training on proper use
Critical Evidence Preservation
AV systems generate massive amounts of data that can be overwritten quickly. Act immediately after an AV accident:
- Document the vehicle’s status displays and any error messages
- Photograph the scene, vehicle position, and any damage
- Note weather and road conditions
- Get names of witnesses and any safety driver present
- Do not allow the vehicle to be moved, repaired, or updated before your attorney sends preservation letters
- Request police to note the AV system’s engagement status in the report
Negligence Theories#
Negligent Testing and Deployment:
- Deploying AV systems before adequate testing
- Operating in conditions known to cause failures
- Ignoring data showing safety problems
- Rushing to market despite unresolved issues
Negligent Maintenance:
- Failure to maintain sensors and systems
- Ignoring software update requirements
- Inadequate inspection protocols
Negligent Supervision:
- Inadequate safety driver training
- Allowing distracted safety drivers
- Insufficient monitoring of vehicle performance
- Failure to respond to incident patterns
Safety Driver vs. Fully Driverless#
The presence or absence of a safety driver affects liability analysis:
| Factor | Safety Driver Present | Fully Driverless |
|---|---|---|
| Human oversight | Driver should monitor and intervene | No human backup |
| Liability focus | Driver negligence + AV defects | AV system defects primary |
| Defense arguments | Driver should have intervened | System should have handled scenario |
| Evidence | Driver attention/distraction key | Pure system performance |
Defensive Arguments to Expect#
Comparative Negligence:
- Victim’s own conduct contributed to crash
- Pedestrian jaywalking, cyclist without lights, etc.
- Other driver’s fault in multi-vehicle crashes
Assumption of Risk:
- Passenger knew AV was experimental
- User agreed to terms accepting limitations
- Generally weak against third-party victims
Federal Preemption:
- Argument that federal standards preempt state claims
- Courts have generally rejected broad preemption
- NHTSA hasn’t set standards that preclude liability claims
State AV Regulatory Landscape#
The Patchwork of State Laws#
As of 2025, 36 states plus the District of Columbia have enacted laws or executive orders addressing autonomous vehicle testing or operation. Approaches vary dramatically:
| Approach | States | Key Features |
|---|---|---|
| Permissive | AZ, TX, FL | Minimal restrictions, easy deployment |
| Regulated | CA, NV | Permit requirements, reporting mandates |
| Cautious | NY, NJ | Limited testing, more restrictions |
| No specific law | 14 states | General vehicle laws apply |
How State Law Affects Claims#
Insurance Requirements:
- States set minimum liability coverage for AV operators
- Some require special AV endorsements
- Coverage requirements affect available compensation
Reporting Mandates:
- California requires detailed incident reporting
- Reports can provide evidence for claims
- Public records requests may access this data
Liability Allocation:
- Some states specify manufacturer vs. operator liability
- May affect which parties you can sue
- Insurance subrogation rules vary
Statute of Limitations:
- Personal injury deadlines vary (typically 2-4 years)
- Product liability may have different deadlines
- Wrongful death claims often shorter
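The deadline math itself is simple; the inputs are not. The sketch below computes a filing deadline under an assumed two-year statute, but the statute length, accrual date, and any tolling rules depend on the state and the claim type, so treat it as illustration only.

```python
# Illustrative filing-deadline calculation under an assumed 2-year
# statute. Actual deadlines vary by state, claim type, and tolling.
from datetime import date

def filing_deadline(crash_date: date, years: int = 2) -> date:
    try:
        return crash_date.replace(year=crash_date.year + years)
    except ValueError:  # crash occurred on Feb 29
        return crash_date.replace(year=crash_date.year + years, month=3, day=1)

print(filing_deadline(date(2023, 10, 2)))  # 2025-10-02
```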
Evidence in AV Accident Cases#
Critical Data Sources#
Vehicle Data:
- Onboard sensor logs (camera, lidar, radar)
- GPS and mapping data
- Decision algorithm logs
- System status and error codes
- Software version information
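To make the preservation target concrete, here is a hypothetical sketch of the kind of per-moment record an AV stack might log. Every field name is invented for illustration; actual manufacturer formats are proprietary and differ.

```python
# Hypothetical shape of one AV log record, for illustration only.
# Field names are invented; real manufacturer logs differ.
from dataclasses import dataclass, field

@dataclass
class AVLogRecord:
    timestamp_ms: int            # onboard clock, milliseconds
    software_version: str        # build identifier at time of crash
    ads_engaged: bool            # was the automated system active?
    gps: tuple[float, float]     # (latitude, longitude)
    detected_objects: list[dict] = field(default_factory=list)  # fused camera/lidar/radar tracks
    planned_maneuver: str = ""   # e.g., "brake", "pullover"
    fault_codes: list[str] = field(default_factory=list)        # active error codes
```

A preservation letter typically demands records like these for the window around the crash, before routine log rotation or an over-the-air update overwrites them.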
Operator Records:
- Safety driver training records
- Driver attention monitoring data
- Shift logs and driving hours
- Prior incidents involving same driver
Company Records:
- Testing and validation data
- Known issues and bug reports
- Internal communications about safety
- Prior incidents with similar scenarios
Regulatory Filings:
- NHTSA crash reports
- State DMV filings
- Permit applications and conditions
- Recall notices and technical bulletins
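NHTSA publishes the Standing General Order reports as downloadable CSV files, so the state-level patterns cited earlier can be checked directly. The sketch below filters the data with pandas; the filename and column names are assumptions based on the public release and may differ in the version you download from nhtsa.gov.

```python
# Sketch: exploring NHTSA Standing General Order crash reports.
# Filename and column names are assumptions and may not match the
# current release on nhtsa.gov -- check the accompanying data dictionary.
import pandas as pd

reports = pd.read_csv("SGO-2021-01_Incident_Reports_ADS.csv")

# State-by-state incident counts (compare with the table above)
print(reports["State"].value_counts().head(3))

# Reports filed by a single operator (entity name is a placeholder)
waymo = reports[reports["Reporting Entity"] == "Waymo LLC"]
print(len(waymo), "reports filed by Waymo")
```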
How Prior Incidents Affect Your Case#
Pattern Evidence:
- Similar crashes suggest systemic defects
- Company knowledge of problems before your crash
- Failure to implement fixes
- Strengthens design defect claims
Regulatory Investigations:
- NHTSA investigations create public record
- Findings can support your claims
- Recall acknowledgments show defect awareness
Settlement History:
- Prior settlements suggest liability acknowledgment
- May indicate available insurance coverage
- Demonstrates case value range
Practical Guidance for Victims#
Immediate Steps After an AV Accident#
- Seek medical attention — Even if injuries seem minor
- Call police — Request report noting AV system status
- Document everything — Photos, videos, witness information
- Note vehicle displays — Any error messages or status indicators
- Don’t sign anything — No quick releases or statements
- Preserve your vehicle — If you were in your own car
- Contact an attorney — Before speaking with AV company representatives
What Makes AV Cases Different#
- Complex liability: Multiple potential defendants with different roles
- Technical evidence: Requires expert analysis of software and sensors
- Evolving law: Legal frameworks still developing
- Sophisticated defendants: Well-funded companies with experienced counsel
- Data battles: Critical evidence controlled by defendants
Typical Case Timeline#
| Phase | Duration | Activities |
|---|---|---|
| Investigation | 3-12 months | Evidence gathering, expert analysis |
| Demand/Negotiation | 2-6 months | Settlement discussions |
| Litigation | 1-3 years | If settlement not reached |
| Trial | 1-4 weeks | If case goes to verdict |
| Appeals | 1-2 years | If verdict appealed |
Related Resources#
- Tesla FSD & Autopilot Liability — Dedicated Tesla Autopilot/FSD guide with recalls, verdicts, and class actions
- Autonomous Vehicles Industry Overview — AV technology and liability landscape
- Tesla Factory Robotics Injuries — Tesla manufacturing safety issues
- Understanding Liability — Product liability frameworks
- Evidence Checklist — What to preserve after any injury
- Filing a Claim — Step-by-step claims process
Injured in an Autonomous Vehicle Accident?
Whether you were hit by a robotaxi, injured as a passenger in a self-driving vehicle, or crashed while using Tesla Autopilot or FSD, you may have claims against vehicle manufacturers, software developers, and fleet operators. Connect with attorneys experienced in autonomous vehicle liability.