The problem plaguing Tesla’s Autopilot
The carmaker’s Autopilot system continues to face scrutiny for its “phantom” braking issues.
All new Teslas have featured Autopilot since 2014. This advanced “self-driving” system assists drivers with braking, parking, steering, and lane changes. Despite its advanced safety features, the system has been plagued with problems that have been a factor in several accidents within the last year. It has also prompted previous federal investigations into the safety of the company’s vehicles.
Most recently, Tesla has come under fire for what’s being called a “phantom braking” issue. Drivers have reported their cars braking unexpectedly while in motion; sometimes, a vehicle brakes repeatedly without warning during a single driving cycle. The problem appears most often in Tesla Model 3 and Model Y vehicles, but it seems to affect the system broadly.
Over the course of the last nine months, more than 350 complaints about phantom braking have been filed with the National Highway Traffic Safety Administration (NHTSA). Tesla is currently being investigated by the NHTSA over the matter. The NHTSA will look into the cause of the braking issue, which appears to be linked to a sensor malfunction. Tesla has already taken steps to address the problem, announcing that all Teslas would rely on additional onboard cameras, rather than sensors, to “sense” their surroundings.
This isn’t the first time the NHTSA has investigated the auto manufacturer. Last August, the federal agency opened a formal investigation after several accidents involving Teslas and parked emergency vehicles were reported. The investigation covered over 760,000 vehicles — almost every Tesla manufactured since 2014.
Since 2018, there have been 12 crashes involving Teslas and parked emergency vehicles. Eleven people have been injured, and one person has tragically died, in these accidents. According to reports, when Autopilot or Traffic-Aware Cruise Control is activated, the vehicle’s driver-assistance program has failed to respond appropriately to emergency vehicles’ flashing lights, flares, cones, and illuminated arrow boards.
As a result of these crashes, the NHTSA is asking Tesla to outline how Autopilot operates in low-light conditions, and how it responds to crash scenes and detects emergencies. The federal agency also wants to know how much field testing is done on the program’s software updates before they’re rolled out to consumers.
Although Autopilot and other driver-assistance programs are not designed to be fully autonomous and still require engaged drivers, it’s an auto manufacturer’s duty to make sure a program is safe to use before introducing it to consumers. If a faulty program or glitch contributes to a car accident, the manufacturer of the vehicle or system could be held liable for accident-related damages.
If you were involved in an accident with a self-driving vehicle, please schedule a consultation with a knowledgeable defective product attorney at Adamson Ahdoot LLP as soon as possible.