Autonomous Vehicle Accident Statistics: What the Data Shows

Autonomous vehicles were supposed to make roads safer. The data in 2025 tells a more complicated story.

As of November 2025, NHTSA’s mandatory crash reporting database records 5,202 autonomous and semi-autonomous vehicle accidents in the United States since reporting began in June 2021. Self-driving car accidents nearly doubled in 2024 alone, rising from 288 reported crashes in 2023 to 544. Semi-autonomous vehicle accidents increased 35 percent year over year during the same period.
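For readers who want to check the year-over-year math, the short sketch below reproduces the percentage change from the two NHTSA-reported counts quoted above; the counts are the only inputs, everything else is simple arithmetic.

```python
# Year-over-year change in reported self-driving car crashes (counts quoted above)
crashes_2023 = 288
crashes_2024 = 544

pct_increase = (crashes_2024 - crashes_2023) / crashes_2023 * 100
print(f"2023 to 2024 increase: {pct_increase:.0f}%")  # roughly 89%, i.e. the count nearly doubled
```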

Sutliff & Stout, a car accident attorney firm based in Houston, tracks autonomous vehicle crash trends closely because these cases raise liability questions that standard car accident law does not resolve on its own. When a vehicle’s software, not the driver, makes the decision that causes a crash, determining who bears legal responsibility requires a different analysis than a traditional rear-end collision.

Texas recorded 345 autonomous and semi-autonomous vehicle crashes, placing it third nationally behind California, with over 1,600 incidents, and Arizona, with 341. For Texas drivers and their families, those numbers are not abstract.

How Autonomous Vehicles Are Classified

Not every self-driving car carries the same level of automation. NHTSA uses a six-level scale developed by SAE International.

Level 0 means no automation. The driver controls everything. 

Level 1 covers basic driver assistance like adaptive cruise control. 

Level 2 includes partial automation, where the vehicle handles steering and acceleration, but the driver must remain engaged. Tesla Autopilot and Full Self-Driving (FSD) operate at Level 2. 

Level 3 allows the vehicle to manage the full driving task in certain conditions, but requires the driver to resume control on demand. 

Level 4 operates without human input in defined environments. 

Level 5 represents full automation with no human driver required.

Most vehicles on American roads with advanced features operate at Level 1 or Level 2. Waymo’s robotaxi fleet in select cities operates at Level 4. No Level 5 vehicle exists in commercial deployment as of 2025.

This distinction matters legally. A Level 2 system requires constant human supervision. When a crash occurs while Level 2 is engaged, the driver’s failure to intervene becomes a central question in any liability analysis.

Tesla: The Manufacturer With the Most Reported AV Crashes

Tesla reports more autonomous vehicle crashes than any other manufacturer in the NHTSA database. Through November 2025, Tesla accounts for 2,093 incidents involving its advanced driver assistance systems (ADAS). Honda follows with 112 and Subaru with 47.

Tesla’s high number reflects two factors. First, Tesla fields a larger fleet than most competitors. Second, Tesla built a telemetry system that actively reports crash data to NHTSA, including crashes where Autopilot deactivated within five seconds before impact. Most manufacturers rely on customer-reported incidents. Tesla’s system captures crashes that would otherwise go unreported, which inflates its raw count relative to companies that underreport.

Tesla’s own safety report for Q3 2025 records one crash for every 7.63 million miles driven with Autopilot engaged, compared to one crash per 1.25 million miles driven without it. Tesla presents this as evidence that Autopilot improves safety. NHTSA presents a different view.
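One way to compare those two figures is to convert both to crashes per million miles, a simple rearrangement of the numbers Tesla publishes. The sketch below uses only the two miles-per-crash rates quoted above as inputs.

```python
# Convert Tesla's Q3 2025 figures (quoted above) to crashes per million miles
miles_per_crash_autopilot = 7.63e6  # one crash per 7.63 million miles with Autopilot engaged
miles_per_crash_manual = 1.25e6     # one crash per 1.25 million miles without Autopilot

rate_autopilot = 1e6 / miles_per_crash_autopilot  # ~0.13 crashes per million miles
rate_manual = 1e6 / miles_per_crash_manual        # 0.80 crashes per million miles

print(f"With Autopilot:    {rate_autopilot:.2f} crashes per million miles")
print(f"Without Autopilot: {rate_manual:.2f} crashes per million miles")
print(f"Ratio:             {rate_manual / rate_autopilot:.1f}x")  # about a 6x gap in Tesla's framing
```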

What NHTSA’s Investigation Found

NHTSA’s Office of Defects Investigation reviewed 956 crashes where Tesla Autopilot was alleged to be engaged. The agency’s 2024 report identified driver disengagement as the central problem.

NHTSA found that 109 out of 143 crashes in the detailed analysis involved drivers who fulfilled Tesla’s pre-recall engagement monitoring criteria. Those drivers were technically complying with the system’s requirements. The crashes happened anyway.

The agency concluded that Autopilot’s design did not adequately maintain driver engagement. In one crash category, nearly every incident occurred within five seconds of Autopilot operating while the driver was not paying sufficient attention. The investigation prompted a recall of all Tesla vehicles equipped with Autopilot in December 2023. Tesla resolved the recall through an over-the-air software update, which NHTSA began reviewing for effectiveness in April 2024.

As of October 2025, 65 fatalities connect to Tesla Autopilot and FSD incidents in verified reports. NHTSA investigations confirmed 54 of those deaths.

Three Tesla Cases That Defined the Legal Debate

May 7, 2016 (Florida): Joshua Brown died when his Tesla Model S, operating on Autopilot, failed to brake for an 18-wheel tractor-trailer crossing the highway. The truck’s white trailer blended with the bright sky. The system did not detect it. Brown’s vehicle passed beneath the trailer at 74 miles per hour. NHTSA opened a formal investigation. This case marked the first publicly confirmed Autopilot fatality.

2019 (Key Largo, Florida): A Tesla Model S with Autopilot engaged failed to stop at a T-intersection, ran a stop sign, and struck a parked SUV. The impact killed Naibel Benavides Leon and seriously injured a second pedestrian. The driver admitted he was distracted and did not intervene. In 2025, a federal jury found Tesla 33 percent at fault, determining that Autopilot’s design and inadequate warnings contributed to the collision alongside the driver’s failure to supervise.

April 19, 2024 (Snohomish County, Washington): A driver in a 2022 Tesla Model S rear-ended a motorcyclist who had slowed for traffic on State Route 522. The driver reported he had Autopilot engaged and was looking at his phone. The motorcyclist was ejected and pinned under the vehicle. The driver was arrested for vehicular homicide.

These three cases span nearly a decade and illustrate how legal outcomes shifted as courts began distinguishing between driver negligence and manufacturer system design as contributing causes of the same crash.

Liability in Autonomous Vehicle Crashes: A Changing Legal Landscape

Traditional car accident law places liability on the negligent driver. Autonomous vehicle crashes introduce a second potential defendant: the manufacturer whose software made a driving decision that contributed to the crash.

Courts apply a products liability framework when evaluating whether a vehicle’s automation system was defectively designed or sold with inadequate warnings. The 2025 Key Largo verdict, which found Tesla 33 percent at fault, demonstrates that juries will assign partial manufacturer liability even in cases where a human driver also failed to exercise proper supervision.

Texas follows a modified comparative fault rule. Under Chapter 33 of the Texas Civil Practice and Remedies Code, fault can be apportioned among multiple parties. In an autonomous vehicle crash, that can include the driver, the manufacturer, and, in commercial fleet cases, the company that deployed the autonomous technology.
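As a rough illustration of how that apportionment works, the sketch below walks through a hypothetical split. The percentages and damages figure are invented for illustration only; it applies the two basic Chapter 33 rules that a claimant more than 50 percent responsible recovers nothing, and a claimant at or below that bar has damages reduced by their own percentage.

```python
# Hypothetical illustration of Texas proportionate responsibility (Chapter 33).
# All figures are invented for illustration; real apportionment is decided case by case.
total_damages = 100_000
responsibility = {
    "injured claimant": 0.10,  # claimant's own share of fault
    "other driver": 0.57,
    "manufacturer": 0.33,
}

claimant_share = responsibility["injured claimant"]
if claimant_share > 0.50:
    recovery = 0  # barred: claimant is more than 50 percent responsible
else:
    recovery = total_damages * (1 - claimant_share)

print(f"Recoverable damages: ${recovery:,.0f}")  # $90,000 in this hypothetical
for party, share in responsibility.items():
    if party != "injured claimant":
        # each defendant's proportionate share of the total damages
        print(f"{party} pays: ${total_damages * share:,.0f}")
```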

Front-end damage accounts for 62 percent of semi-autonomous vehicle crashes, while rear damage accounts for 54 percent of fully autonomous vehicle crashes. These patterns point investigators toward the areas of a vehicle’s sensor coverage where system failures most commonly occur.

Liability depends on the specific facts of each crash. Drivers carry a legal duty to supervise Level 2 systems at all times, but when a manufacturer’s system design contributed to the crash, juries have assigned the manufacturer its own share of fault, treating driver negligence and system defect as two independent contributing causes of the same collision.

Texas enacted House Bill 1791 in 2017, which permits autonomous vehicle operation on public roads and places operational responsibility on the manufacturer when no human driver is present. For Level 2 systems like Tesla Autopilot, the driver retains legal responsibility for supervising the system at all times under Texas law.

State-by-State Crash Concentration

Autonomous vehicle crashes do not distribute evenly across the country. California leads with over 1,600 reported incidents, driven by the high density of Tesla vehicles and the active deployment of Waymo’s robotaxi service in San Francisco and Los Angeles.

Texas sits third nationally with 345 crashes, ahead of Florida’s 200 total incidents. Arizona places second with 341, driven by the heavy Waymo testing presence in the Phoenix metro area.

For Texas drivers, the 345 reported crashes represent a conservative floor. The figure covers only incidents that manufacturers reported under NHTSA’s requirements; crashes where Autopilot or another ADAS was not formally logged as engaged at the time of the collision do not appear in these totals.

Reported AV crashes do not rise simply because autonomous vehicles are becoming more dangerous. NHTSA’s mandatory reporting requirement, introduced in June 2021, captures more incidents each year as the fleet of ADAS-equipped vehicles grows and manufacturer compliance with the Standing General Order improves. The near-doubling of the crash count from 2023 to 2024 reflects both expanded deployment and more complete reporting, not simply an increase in system failure rates.

What Injured Drivers Need to Know

An autonomous vehicle crash produces evidence that a standard car accident does not. Vehicle telemetry, ADAS engagement logs, software version records, and over-the-air update histories all become relevant to proving what the system did in the seconds before impact. This data can confirm whether the automation was engaged, what it detected, and what action it took before the collision. Preserving these records requires a formal legal hold sent to the manufacturer immediately after a crash. Manufacturers control their own telemetry. Without that hold, records may not survive into litigation.

Product Liability

Courts have allowed product liability claims against Tesla based on defective ADAS design and inadequate driver warnings. NHTSA’s own investigation found Autopilot’s pre-recall design insufficient to maintain driver engagement. Federal juries have found Tesla partially at fault in crash deaths connected to Autopilot use. Injured drivers in autonomous vehicle crashes face a two-defendant problem: establishing what the human driver did wrong and establishing what the manufacturer’s system did wrong. Each defendant carries separate legal exposure. Insurance limits for manufacturers and commercial fleets typically far exceed the limits of a private driver’s personal auto policy.

Vehicle damage documentation is part of that same claim. Tesla uses aluminum body panels on most models, which require paintless dent repair techniques calibrated for aluminum rather than steel. A shop trained only in steel panel work can cause micro-fractures in aluminum during a standard dent removal process, which affects both structural integrity and sensor alignment. Post-crash inspections on Tesla vehicles must confirm that camera housings and sensor mounts remain properly aligned after any body repair, because a misaligned sensor changes how Autopilot reads the road going forward.

Repair estimates from non-certified shops undervalue Tesla aluminum panel work by 20 to 40 percent compared to Tesla-certified centers. That gap belongs in any property damage claim filed against the at-fault driver or manufacturer.
