Elon Musk’s Big Lie About Tesla Is Finally Exposed

More than 2 million of the cars are being recalled — because Tesla’s “self-driving” systems have always been anything but


Tesla's False Claims about Self-Driving Technology

In 2016, Elon Musk claimed that Tesla's cars could already drive themselves more safely than a human. The claim was false, but it helped send Tesla's stock price soaring and swelled Musk's fortune. The recent recall of more than 2 million Teslas exposes the truth: Tesla's "self-driving" technology has always required an alert human behind the wheel.

Despite the hype surrounding Tesla's driving automation, close observers have long known that the company's systems are nowhere near fully autonomous. That open secret has now reached the broader public. What remains underappreciated is that regulators have failed to reckon with a distinct risk: requiring humans to continuously babysit incomplete self-driving systems on public roads.

The Problem Lies with Human Behavior

The official notice for Tesla's recall reveals an interesting twist: the defect is not in the Autopilot technology itself. Instead, the problem arises when humans engage the driver-assistance system and then pay less attention to the road. That would not matter if Tesla's cars could actually drive themselves safely, or if the company accepted legal responsibility for the system's actions. Since neither is true, drivers must remain vigilant, ready to intervene whenever Autopilot makes a mistake.

Tesla's legal agreements stipulate that the vehicle owner is liable for everything the system does. By marketing its cars as nearly self-driving while omitting adequate safeguards, Tesla invites drivers to tune out, then blames them when crashes occur. Tesla did not respond to requests for comment on this matter.

The Flaws in Tesla's Autopilot Design

If Tesla's designers had treated the human driver as part of the Autopilot system, they would have accounted for our well-documented tendency to become inattentive when bored. Human-factors research has shown that when automation takes over too much of a task, people are more likely to miss critical developments, especially those demanding immediate action. Tesla's Autopilot ignores this flaw, even as the company classifies it as a Level 2 driver-assistance system, a designation under which the human driver retains legal responsibility.

Weak driver monitoring and the ability to activate Autopilot on roads it was never designed for have contributed to fatal crashes. Despite years of in-depth investigations and a mounting death toll, neither Tesla nor regulators took adequate action. Only now, with the recall and continued federal scrutiny of Autopilot-involved crashes, are the flaws in Tesla's design finally being acknowledged.