Tesla Recall Won’t Fix Autopilot Problems, Critics Say

Less than a week ago, Tesla announced it was recalling nearly 2 million cars sold in the US to address concerns about its Autopilot software.


Critics Say Autopilot Update Is Not Enough

A Washington Post investigation revealed that Autopilot can be activated in situations where Tesla says it shouldn't be used. Tesla's over-the-air (OTA) software update aims to add more warnings and alerts to discourage inappropriate use of Autopilot, but critics argue that this does not effectively address the underlying problem of misuse.

Matthew Wansley, a professor at the Cardozo School of Law, expressed disappointment with Tesla's approach, stating that there is no argument for allowing Autopilot to be used on roads with cross traffic, which is where many crashes occur. Senator Richard Blumenthal of Connecticut also criticized the update, calling it insufficient and arguing for more robust changes to the software.

The concerns raised by critics indicate a lack of faith in Tesla's ability to enforce responsible use of Autopilot and suggest that voluntary compliance is not enough to ensure safety.

Did NHTSA Go Easy On Tesla?

There has been speculation that the National Highway Traffic Safety Administration (NHTSA) may have declined to take stricter action against Tesla because of the automaker's prominence in the electric vehicle industry, a priority for the Biden administration. However, NHTSA has stated that its investigation into Autopilot remains open, leaving room for further action.

Some Tesla critics and lawmakers expressed concern over NHTSA's handling of the situation and called for stronger measures to address the safety issues. The recall requires Tesla to issue a software update with additional controls and alerts to curb misuse of Autopilot, but it does not restrict the technology to the situations for which it is designed.

The recall has raised doubts about the effectiveness of self-enforcement and voluntary compliance, with critics calling for stricter regulations and more comprehensive fixes to address the underlying safety defects.

The Operational Design Domain

One key issue with the recall is that Tesla has not explicitly stated that it will restrict Autopilot to its designated Operational Design Domain (ODD), the set of conditions, such as road type and driving environment, under which the system is designed to operate. This means that consumers will still be able to activate Autopilot outside of the intended circumstances, albeit with more alerts and precautions.

Critics argue that the recall overlooks the need for Tesla to fix the underlying safety defects in its self-driving software. They believe that banning the software altogether would be more effective than relying on increased driver monitoring.

The debate surrounding the recall highlights the complexity and challenges of enforcing responsible use of advanced driver-assistance systems. It also raises questions about the future development and regulation of autonomous driving technology.