Tesla’s Autopilot and Full Self-Driving Linked to Hundreds of Crashes and Dozens of Deaths

In March 2023, a North Carolina student was getting off a school bus when he was struck by a Tesla Model Y traveling at “freeway speeds,” according to a federal investigation released today. The Tesla driver was using Autopilot, the automaker’s advanced driver assistance feature that Elon Musk says will eventually lead to fully autonomous cars.

The 17-year-old student who was hit was airlifted to a hospital with life-threatening injuries. After examining hundreds of similar crashes, the investigation found a pattern of driver inattention, combined with the shortcomings of Tesla’s technology, that resulted in hundreds of injuries and dozens of deaths.

Drivers using Autopilot or the system’s more advanced counterpart, Full Self-Driving, “were not sufficiently engaged in the driving task,” and Tesla’s technology “did not adequately ensure that drivers maintained their attention on the driving task,” NHTSA concluded.


In total, NHTSA investigated 956 crashes that occurred between January 2018 and August 2023. In those crashes, some of which involved other vehicles striking the Tesla, 29 people died. There were also 211 crashes in which “the frontal plane of the Tesla struck a vehicle or obstacle in its path.” These crashes were often the most severe, resulting in 14 deaths and 49 injuries.

NHTSA launched its investigation after several incidents in which Tesla drivers crashed into stationary emergency vehicles parked on the side of the road. Most of these incidents took place after dark, with the software ignoring scene control measures, including warning lights, flares, cones, and an illuminated arrow board.

In its report, the agency found that Autopilot, and in some cases FSD, was not designed to keep the driver engaged in the task of driving. Tesla says it warns its customers that they need to pay attention while using Autopilot and FSD, which includes keeping their hands on the wheel and their eyes on the road. But in many cases, according to NHTSA, drivers became overly complacent and lost focus. And when the time came to react, it was often too late.

In 59 of the crashes NHTSA examined, the agency found that Tesla drivers had enough time, “five or more seconds,” to react before colliding with another object. In 19 of those crashes, the hazard was visible for 10 or more seconds before the collision. Reviewing crash logs and data supplied by Tesla, NHTSA found that in most of the crashes it analyzed, drivers failed to brake or steer to avoid the hazard.

“Crashes in which the driver attempted no or late evasive action were identified across all Tesla hardware versions and crash circumstances,” NHTSA said.

NHTSA also compared Tesla’s Level 2 (L2) automation features with products available in other companies’ vehicles. Unlike other systems, Autopilot disengages rather than allowing drivers to adjust their steering. This “discourages” drivers from staying involved in the task of driving, NHTSA said.


A comparison of Tesla’s design choices with those of its L2 peers found Tesla to be an industry outlier in its approach to L2 technology, pairing a weak driver engagement system with Autopilot’s permissive operating capabilities.

Even the brand name “Autopilot” is misleading, NHTSA said, creating the impression that the driver is not in control. While other companies use some version of “assist,” “sense,” or “team,” Tesla’s products lure drivers into thinking the system is more capable than it actually is. Both the California attorney general and the state’s Department of Motor Vehicles are investigating Tesla for misleading branding and marketing.

NHTSA admits its investigation may be incomplete due to “gaps” in Tesla’s telemetry data. This could mean that there are many more crashes involving Autopilot and FSD than NHTSA has been able to detect.


Tesla issued a voluntary recall late last year in response to the investigation and released an over-the-air software update that added more warnings to Autopilot. NHTSA said today that it is opening a new investigation into the recall after several safety experts said the update was inadequate and still allowed for misuse.

The findings contradict Musk’s insistence that Tesla is an artificial intelligence company on the verge of releasing a fully autonomous vehicle for personal use. The company plans to unveil a robotaxi later this year that is supposed to usher in this new era for Tesla. During this week’s first-quarter earnings call, Musk reiterated his claim that his vehicles are safer than human-driven cars.

“When you have a statistically significant amount of data at scale that shows conclusively that the autonomous car has, say, half as many accidents as a human-driven car, I think that’s hard to ignore,” Musk said. “Because at this point, ending autonomy means killing people.”
