This week, another fatal crash involving a Tesla is fueling conversations about the safety of the Autopilot driver-assistance system and whether the EV maker could take extra steps to ensure it isn’t misused. According to Consumer Reports, Tesla could and should go the extra mile in this direction.
First, some context: the Texas crash killed both occupants, and police statements say that neither of them was in the driver’s seat when it happened. The implication is that one of them engaged Autopilot on the Model S and then slipped out of the driver’s seat. Tesla CEO Elon Musk says that data logs show Autopilot wasn’t engaged on the car. Moreover, he claims the car couldn’t have been driving itself, since Autopilot has safeguards in place that prevent this from happening.
Specifically, Musk was referring to the fact that Autopilot disengages if hand pressure on the wheel is not detected every 10 seconds, if the seat is unbuckled, or if there’s no body weight detected in the seat. He also said that Autopilot wouldn’t work in a residential area like where the crash occurred because streets did not have lane markings.
Consumer Reports put some of Musk’s claims to the test and was able to prove that, indeed, you can “trick” a Tesla on Autopilot into driving itself while you casually slip out of the driver’s seat and chill on the passenger side. The video at the bottom of the page shows how.
Jake Fisher, CR’s senior director of auto testing, took a Model Y to a test track in Connecticut, started it, and then engaged Autopilot. He then draped a weighted chain over the steering wheel, dialed the speed down to zero, and slipped out of the seat, leaving the seat belt buckled. From the passenger side, he dialed the speed back up and got the car to drive itself with no one at the wheel.
“The car drove up and down the half-mile lane of our track, repeatedly, never noting that no one was in the driver’s seat, never noting that there was no one touching the steering wheel, never noting there was no weight on the seat,” Fisher says. “It was a bit frightening when we realized how easy it was to defeat the safeguards, which we proved were clearly insufficient.”
Fisher’s conclusion is that Tesla needs to do better with these safeguards. Like GM and Ford, it should use technology to actively monitor the driver and ensure that they are paying attention to the road.
On social media, Tesla supporters are comparing Fisher’s experiment to when you place a brick on the gas pedal and send a car off a cliff—meaning, you can’t blame a carmaker for the operator’s stupidity or recklessness. Musk seems to echo the sentiment, replying with laughing emojis to one such comparison.
[laughing emojis]
— Elon Musk (@elonmusk) April 22, 2021
Your research as a private individual is better than professionals @WSJ!
— Elon Musk (@elonmusk) April 19, 2021
Data logs recovered so far show Autopilot was not enabled & this car did not purchase FSD.
Moreover, standard Autopilot would require lane lines to turn on, which this street did not have.