Cheating on Tesla’s Autopilot would be surprisingly easy
After yet another accident involving a Tesla car, a consumer association conducted a small experiment. Depending on the results, the Autopilot can be tricked by means of a scheme that is not
After yet another accident involving a Tesla car, a consumer association conducted a small experiment. Depending on the results, the Autopilot can be tricked by means of a scheme that is not very complicated. The association described the ease of circumventing security as frightening.
Autopilot questioned after an accident
As reported in a New York Times article published on April 18, 2021, a Tesla Model S was involved in a terrible accident in Texas (United States). The vehicle hit a tree before catching fire, killing its two occupants, two men aged 59 and 69. According to local police, no one was driving at the time of the impact. One of the two men was in the passenger seat and the other in the back seat.
Also according to the police, the victims' wives had heard them talking about Autopilot before they set off. Elon Musk reacted quickly on Twitter. He claimed that Autopilot was not active, as the car's owner had not purchased the option. He thus called into question the police's account and attacked the Wall Street Journal (WSJ), one of the many media outlets that had relayed the information.
A test involving a simple chain attached to the steering wheel
Elon Musk also endorsed the comment of another Twitter user, saying that Autopilot cannot operate unless the driver's hands are resting on the steering wheel. However, the consumer association Consumer Reports published an article and a video on April 22 chronicling an experiment it conducted on Autopilot, proving that this system is very easy to fool.
A member of Consumer Reports conducted the test on a closed circuit. He started by activating Autopilot and setting the speed to 0 km/h. Then he changed seats and increased the car's speed using a simple dial on the steering wheel. The vehicle therefore set off without a driver. Fooling the system simply required hanging a lightly weighted chain on the steering wheel, which simulates the weight of the driver's hands. The tester also kept the driver's seat belt fastened and made sure not to open the door.
In light of this experiment, it is possible to imagine a driver sitting in another seat and letting the car drive itself while maintaining slight pressure on the steering wheel. Described as "frightening", this ease of bypassing security shows that the safeguards in Tesla cars are insufficient. It should also be remembered that the company released a beta version of its Full Self-Driving system a few weeks ago. A version involving fully automated driving could be a real danger in town!