NHTSA Closes Investigation Into Tesla Autopilot After Linking System to 14 Deaths

The National Highway Traffic Safety Administration (NHTSA) has closed an investigation into Tesla’s Autopilot driver assistance system after examining hundreds of crashes, including 13 fatal incidents that resulted in 14 deaths. The agency concluded that these accidents were largely due to drivers misusing the system.

However, the NHTSA also found that “Tesla’s weak driver engagement system was not suited to Autopilot’s permissive operating capabilities.” In other words, the software did not prioritize driver attention. Drivers using Autopilot or the company’s Full Self-Driving technology “were not sufficiently engaged” because Tesla “did not adequately ensure that drivers focused their attention on the driving task.”

The agency investigated nearly 1,000 crashes that occurred between January 2018 and August 2023, in which a total of 29 people died. For roughly half of them (489), NHTSA found “there was insufficient data to evaluate.” In other cases, the other party was at fault or the Tesla driver was not using Autopilot at all.

The most serious were 211 crashes in which “the Tesla’s frontal plane struck a vehicle or obstacle in its path,” and these were often associated with Autopilot or FSD. They resulted in 14 deaths and 49 serious injuries. In 78 of those incidents, the agency found, drivers had enough time to react but did not, failing to brake or steer away from the hazard despite having at least five seconds to act.

This is where the complaints against the software come into play. NHTSA says drivers simply became too complacent, assuming the system would handle any hazard, and by the time they needed to react it was too late. “Accidents with no or late driver evasive attempts were detected across all Tesla hardware versions and accident circumstances,” the agency wrote. The mismatch between drivers’ expectations and Autopilot’s actual operational capabilities created a “critical safety vulnerability” that led to “predictable misuse and preventable accidents.”

The NHTSA also took issue with Autopilot’s branding, calling it misleading and suggesting it could lead drivers to assume the software was in complete control. Competing companies typically use names that include terms like “driver assistance,” whereas “Autopilot” implies an autonomous pilot. The California Attorney General and the state’s Department of Motor Vehicles are also investigating Tesla over misleading branding and marketing practices.

Tesla, for its part, says it warns customers that they need to pay attention while using Autopilot and FSD, The Verge notes. The company says the software issues regular alerts reminding drivers to keep their hands on the wheel and their eyes on the road. NHTSA and other safety groups have countered that these warnings do not go far enough and are “insufficient to prevent abuse.” Despite those assessments, CEO Elon Musk recently promised that the company will continue to fight “with all its might for autonomy.”

The findings may represent only a small fraction of the actual number of Autopilot- and FSD-related accidents and incidents. The NHTSA noted that “gaps in Tesla’s telematics data lead to uncertainty regarding the actual frequency with which vehicles driving with Autopilot engaged are involved in accidents.” In other words, Tesla only receives data from certain types of crashes, and NHTSA claims the company collects data on about 18 percent of crashes reported to police.

Given this, the agency has opened another investigation into Tesla, this time concerning an over-the-air (OTA) software fix released in December following the recall of two million vehicles. The NHTSA will examine whether the Autopilot recall remedy Tesla implemented is actually effective.
