Elon Musk has long been promising the incredible convenience and safety of self-driving Teslas. The company releases quarterly “safety” reports detailing crash data for Teslas in an effort to boost confidence in its “Autopilot” mode. But how safe are you really when you are using Tesla’s Autopilot?
Data suggests that the Tesla Autopilot is not actually as safe as the quarterly reports make it out to be. Experts claim the reports are misleading and make unfair comparisons. While Musk recently claimed that a Tesla on autopilot is 10 times less likely to crash than an average car, experts say that the claim lacks important context.
The future of autonomous cars is exciting and something that many auto manufacturers thought we would have reached by now. But it remains a tough puzzle to solve. Let’s take a closer look at Tesla’s autopilot function and get a clearer idea of how safe you really are when you let the car “drive itself.”
What Does It Mean to Hack Tesla Autopilot?
Folks have been tinkering with Tesla’s computer systems for as long as they’ve been around, tapping into developer view modes and seeing just what they can pull off with these driving computers. Other folks have ventured into slightly more mechanical hacks to try and trick the vehicle’s robust array of sensors.
Tricking the Tesla Autopilot System
After a deadly crash in April 2021, a few engineers decided to put the “Full Self-Driving” mode to the test and see if it could be tricked into thinking there was a driver in the car when there actually wasn’t.
In the experiment, they found that not only did the car fail to recognize if the driver was engaged, but it also failed to observe if there was even a driver in the car at all.
Jake Fisher, the senior director of auto testing at Consumer Reports, led the experiment. Essentially, he engaged the Full Self-Driving mode of a Tesla Model Y and then brought the vehicle to a stop. He then attached a weight to the steering wheel to simulate the weight of his hands.
Fisher then exited the driver’s seat without opening any doors, as that would’ve disengaged the system, and proceeded to “drive” the Tesla using a dial on the steering wheel. “The car drove up and down the half-mile lane of our track,” said Fisher, pointing out that it never noticed there was nobody in the driver’s seat.
Consumer Reports completed this experiment on a closed track and was firm in asserting that nobody should attempt to recreate this effect. But the results are a bit unnerving. Tesla claims that the vehicle should keep tabs on the driver’s attention, but in this experiment, it didn’t even know that there wasn’t a driver to be keeping tabs on.
What Are Some Computer Hacks on Tesla?
Computer enthusiasts have always tried to open up and get inside anything with a computer. A Tesla is no exception. Popular Twitter hacker “Green,” @greentheonly, has gained a lot of attention for hacking into the augmented reality view of the Tesla and demonstrating different elements of the Autopilot feature in real-time.
Recently, a video surfaced from China of a Tesla on Autopilot approaching a sharp turn, failing to make it, and falling into a small ditch on the side of the road. There is a moment just before the crash where Autopilot seems to simply give up and hand control back to the driver, but by then it is already too late.
“Green” posted a thread on Twitter regarding this particular incident, demonstrating how it happens through his hacked view of the Tesla augmented reality interface. After several tests on an empty road with a sharp curve, he determined that the car attempts to slow down but fails to make the turn; when it doesn’t crash outright, it ends up driving in the wrong lane.
The telemetry demonstration view in his thread is a helpful look at how the Autopilot registers the road and why it isn’t safe for you, the driver, to take your hands off the wheel or eyes off the road.
There are simply too many variables to leave to the current system without active driver support and input.
Can Tesla’s Autopilot Be Tricked?
Hackers have gone further than just taking a look at restricted views. Keen Labs, a widely recognized top cybersecurity group based in China, has tested out some attacks that target Tesla’s Autopilot functions. In their tests, they were able to trick the car into driving in the wrong lane, and they were able to take over the steering wheel using a gamepad.
In their first experiment, they wanted to test the accuracy of Tesla’s lane recognition technology. A variety of cameras and sensors keep Teslas in the appropriate lane and are supposed to prevent the car from crossing into oncoming traffic. But a few carefully placed stickers on the road seemed to trick the car.
The hackers placed three small triangle stickers on the road near an intersection. They believed that these stickers would trick the autopilot into following a “fake” lane. They were proved correct as the autopilot registered these stickers and followed them into what was actually the lane of oncoming traffic on the test track.
In another experiment, they sought to take over the autopilot steering wheel with a gamepad. The hackers performed a series of complex steps and broke through several security barriers, allowing them to assume control with a gamepad.
It is worth noting that they seized control from inside the vehicle, and that the attack didn’t work when the car had just been shifted from reverse to drive at any speed above 8 km/h. The attack did appear to work without limitations when the car was seized in cruise control mode. Tesla has dismissed these threats as unrealistic, saying they couldn’t occur in real-world conditions.
The remote-control hack may not be particularly realistic, but Keen Labs claims that lane detection is a serious issue. The sticker attack is cheap and simple to pull off and presents a real threat to road safety. They’ve used the experiment to emphasize how important robust, tamper-resistant lane recognition is for autonomous vehicles.
How Does Tesla’s Autopilot Function Work?
Let’s get one thing clear right from the start.
Tesla’s autopilot function still requires active attention and input from the driver. It is not a fully autonomous mode, and the car will not completely drive itself.
Autopilot in a Tesla is more like a highly advanced cruise control function that maintains distance from other vehicles and keeps you centered in your lane.
Tesla is not the only major automaker to feature this advanced driver-assistance system, or ADAS for short. Recent models from many major players feature some version of ADAS. Still, thanks in part to Musk’s high profile, Tesla continues to be widely recognized as the leader in self-driving vehicles.
The Gadgets and Gears Behind Tesla Autopilot
Teslas use an array of cameras and sensors, depending on the vehicle model, to see and sense the surrounding environment. A powerful onboard computer processes the camera input in a matter of milliseconds to make safe driving decisions, like adjusting speed to maintain distance and staying centered in the lane.
Other self-driving vehicles, like those made by Waymo, use a tremendous array of cameras, radar, and lidar sensors to safely drive through the streets. These complex systems are deliberately built with redundancies to ensure optimal driving in the event of any one system failure. But it would seem that all these redundancies and sensors aren’t the Tesla way.
Is Tesla Autopilot Ready for Prime Time?
Musk has been preaching the capabilities of Tesla’s ADAS for years. The company is also extremely willing to push beta testing on users for data collection, which has broadened the reach of autopilot functions and knowledge. But it may also lead to some users not being fully prepared for the system that is delivered. Twitter users line up in Musk’s feed to try and get added to the ever-expanding whitelist of beta testers. But is the autopilot mode all it is cracked up to be?
Is Autopilot Mode on Tesla Safe?
In April 2021, Elon Musk tweeted that “Tesla with Autopilot engaged now approaching 10 times lower chance of accident than the average vehicle.” But electric vehicles and automated driving expert Sam Abuelsamid has a different opinion. Musk’s assertion is based on the company’s safety report from data collected during the first three months of 2021.
Abuelsamid points out that the data on Tesla Autopilot safety is missing important context.
National Averages Provide Apples-to-Oranges Comparisons
The data compares Teslas driving on Autopilot to national driving averages. But Tesla’s Autopilot function is primarily used on highways, where crashes are far less frequent per mile. When the national statistics include city driving, where most accidents occur, the comparison is skewed in Tesla’s favor.
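To make the skew concrete, here is a toy calculation in Python. Every number in it is hypothetical, invented purely for illustration; only the shape of the argument comes from the article: if highway miles are safer per mile than city miles, a mostly-highway fleet will look safer overall even when its per-road safety is identical to everyone else’s.

```python
# Made-up, purely illustrative numbers showing how mixing road types
# can skew a crash-rate comparison in a mostly-highway fleet's favor.

# Assumed per-road crash rates (identical for both fleets).
highway_rate = 0.1   # crashes per million highway miles (hypothetical)
city_rate = 1.0      # crashes per million city miles (hypothetical)

# "National average" fleet blends both kinds of driving (millions of miles).
national_miles = {"highway": 400, "city": 600}
national_crashes = (national_miles["highway"] * highway_rate
                    + national_miles["city"] * city_rate)
national_rate = national_crashes / sum(national_miles.values())

# The Autopilot-style fleet drives mostly highway miles.
autopilot_miles = {"highway": 950, "city": 50}
autopilot_crashes = (autopilot_miles["highway"] * highway_rate
                     + autopilot_miles["city"] * city_rate)
autopilot_rate = autopilot_crashes / sum(autopilot_miles.values())

# Despite identical per-road safety, the blended rates differ sharply.
print(f"national rate:  {national_rate:.3f} crashes per million miles")
print(f"autopilot rate: {autopilot_rate:.3f} crashes per million miles")
```

With these invented numbers the mostly-highway fleet appears roughly four times safer per mile, even though nothing about its driving is actually safer on any given road type. That is the context Abuelsamid says the quarterly reports leave out.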
Vehicle Age Skews the Data
Another important point made by Abuelsamid relates to the age of the vehicles. Teslas are relatively new; the oldest Autopilot-equipped models date from 2014. In contrast, the average age of a car on the road is around 12 years. This age gap makes the comparison further flawed: older cars are generally less safe due to aging brakes, worn tires, and the complete lack of ADAS.
So, it would seem that Tesla’s numbers and comparisons aren’t exactly built on the sturdiest of findings.
Data on Tesla Autopilot Involvement in Crashes
Additionally, Tesla has been notoriously tight-lipped about when Autopilot is involved in a crash. It has been noted that many drivers aren’t even aware when Autopilot disengages. Tesla claims to include any incidents that occur within five seconds of Autopilot disengaging.
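That five-second rule is simple enough to express in code. The sketch below is a hypothetical illustration of the stated counting rule only, not Tesla’s actual implementation; the function name and timestamps are invented for the example.

```python
# Hypothetical sketch of the stated counting rule: a crash is attributed to
# Autopilot if it happens while the system is engaged, or within five
# seconds of it disengaging. Names and times are illustrative, not Tesla's.

ATTRIBUTION_WINDOW_S = 5.0

def attributed_to_autopilot(disengage_time_s, crash_time_s,
                            window_s=ATTRIBUTION_WINDOW_S):
    """Return True if the crash falls inside the attribution window."""
    return crash_time_s - disengage_time_s <= window_s

# Autopilot disengages at t=100s in this made-up timeline.
print(attributed_to_autopilot(100.0, 103.0))  # crash 3s later -> True
print(attributed_to_autopilot(100.0, 110.0))  # crash 10s later -> False
```

The rule’s weakness is visible in the second case: a driver startled by a sudden disengagement who crashes six or more seconds later would fall outside the window entirely.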
Misrepresenting the data is deceptive, and according to Missy Cummings, an engineering professor at Duke University, it can be dangerous. She is of the opinion that both the names “Autopilot” and “Full Self-Driving” are misleading and that these statistics fill consumers with false confidence. Tesla firmly states that drivers must remain engaged, but how many drivers actually will?
Still, Tesla Autopilot Is Relatively Safe
Overall, it would seem that Tesla’s Autopilot feature is safe when used correctly. That means the driver is still engaged with the road, has their hands on the wheel, and is ready to take over at any time. Tesla has included sensors to remind the driver to stay engaged, but many people have started to hack this system so they can tune out further.
What Is the Future of Fully Autonomous Vehicles?
Fully autonomous vehicles have been “coming” for a long time now. Musk may be the most outspoken evangelist of this future, but he isn’t alone in this pursuit. Plenty of automakers are lining up to try and usher in an era full of fully autonomous “robotaxis.” But just how long will it be until the roads are filled with these truly self-driving cars?
Musk himself has recently admitted that self-driving cars are tougher than he thought. Plenty of investors bet heavily on autonomous vehicles, and many automakers promised we would already be flush with these cars by now. The industry is left confronting the difficulties of achieving these goals. Not only is it tougher than they thought; it is also much more expensive.
Changes in Players
Recently, many players in the autonomous vehicle game have begun to consolidate resources. Lyft sold its self-driving car division to Toyota, Cruise bought Voyage, and Aurora acquired Uber’s self-driving car division. These makers are all realizing that they simply can’t solve this conundrum on their own.
Valuations of companies like Waymo, a leader in the autonomous game, have dropped starkly from the robust billions of a few years ago. Investors have sunk a lot of cash into these companies and, as of yet, haven’t really seen any payoff.
How Close Are We?
Missy Cummings of Duke University has been saying for a long time that we are still a long way off from fully autonomous vehicles. It is finally becoming clear just how right she is. She has suggested that several years ago, many of these startups dreamed a bit too big: they aimed to be everything at once, covering taxis, trucking, agriculture, and so on.
The recent consolidation of resources has coincided with a consolidation of targets. Companies are coming to grips with how difficult it may be to fully change every kind of vehicle into an autonomous one, so they are setting their sights more narrowly. Perhaps a more targeted approach will yield stronger results, but it will still be a while before the road is full of robotaxis.
How Safe Are You Really in Tesla’s Autopilot?
As we’ve mentioned before, “Autopilot” and “Full Self-Driving” are misnomers, really. These ADAS tools are great innovations, but they aren’t what the names imply. A driver is still required to be focused and ready to take over at a moment’s notice, because as we’ve outlined, these systems are far from flawless.
Tesla’s data does indicate that when properly driving with the Autopilot feature, drivers are safer than other motorists.
This makes perfect sense, really, when you think about it. The driver is assisted by the computer. When the two work together, as intended, then, of course, you will be safer. It becomes unsafe when the driver tunes out and trusts the system to maintain progress.
So, it may be better to think of it less like Autopilot and more like advanced cruise control. It takes some of the work off you, the driver, and hands it over to the computer, but at the end of the day, you are still the driver, and you are still responsible for your safety on the road. Drivers are asked to agree to these terms when engaging the Autopilot mode in their Tesla.
While it isn’t likely you will be hacked while driving your Tesla, there are some possibilities to be aware of. Groups like Keen Labs and Consumer Reports will continue to probe for and expose vulnerabilities to help keep consumers protected.
We may be quite a way off from the fully autonomous future we’ve been promised, but for now, the Tesla Autopilot is a pretty great tool for making a long road trip a little easier. Just be sure to keep your hands on the wheel and eyes on the road, because as of now, a computer still can’t outdrive a person.