Tesla seems to be the leading innovator in self-driving vehicles, but its autopilot system is far from perfect. As you may already know, the Tesla autopilot feature has seen its share of failures, and those failures have led many drivers to question its reliability. But why is the autopilot failing in the first place?
Because the autopilot system has not yet reached level 5 autonomy, a Tesla is not capable of piloting itself without human supervision and therefore falls short of what the name "autopilot" implies. However, many of the autopilot features do, at least, promote safer driving practices.
Artificial intelligence has come a long way, but we are not yet at the point of electric vehicles with full autonomy. In this article, we will further discuss why the Tesla autopilot feature has seen so much failure and ways the manufacturer could potentially improve it in the future.
Why Does the Tesla Autopilot Fail?
Tesla’s autopilot system is a valuable tool for assisting drivers in navigating busy highways and daily travel. However, the Tesla autopilot system is not currently made as a fail-safe driving option. The following are a few reasons why Tesla’s autopilot falls short:
No Full Vehicle Autonomy
The primary issue with Tesla’s autopilot is this: it’s not 100% autonomous, at least not yet. Though some Tesla vehicles have reached a level 2 driver-assist system, these cars cannot fully function without some level of human navigation. For a car to be fully autonomous, it must be at least level 5.
To give you a better idea of what this means for Tesla vehicles, let’s break down what these levels actually mean:
| Level | Name | Description |
| --- | --- | --- |
| Level 0 | No Automation | The vehicle has no autonomy and cannot perform any driving tasks on its own. |
| Level 1 | Driver Assistance | A human operator controls most vehicle operations, but some features may be installed to assist the driver. |
| Level 2 | Partial Automation | The vehicle can control certain functions of the driving process, but the driver must remain engaged at all times during travel. |
| Level 3 | Conditional Automation | The vehicle can handle most driving controls, but the driver must remain ready to take over when notified. |
| Level 4 | High Automation | The vehicle can maintain most driving operations, but the driver can take over if necessary or desired. |
| Level 5 | Full Automation | The vehicle is fully automated and can handle all operations related to piloting the vehicle. |
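As an illustration only (a hypothetical helper, not Tesla software), the taxonomy above can be expressed as a simple lookup that answers the question this article keeps returning to: does this level still require an attentive human?

```python
# Hypothetical sketch of the SAE automation levels described above.
# Not Tesla code -- just a lookup to illustrate the taxonomy.
SAE_LEVELS = {
    0: ("No Automation", True),
    1: ("Driver Assistance", True),
    2: ("Partial Automation", True),       # where Tesla autopilot sits today
    3: ("Conditional Automation", True),   # driver must take over when notified
    4: ("High Automation", False),
    5: ("Full Automation", False),
}

def requires_active_driver(level: int) -> bool:
    """Return True if a human must supervise the drive at this level."""
    _name, supervised = SAE_LEVELS[level]
    return supervised

print(requires_active_driver(2))  # level 2 (autopilot today): True
print(requires_active_driver(5))  # level 5 (full autonomy): False
```

The split between levels 3 and 4 is the practical dividing line: below it, the human is part of the control loop; above it, the vehicle is expected to handle its own fallbacks.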
As you can see, Tesla’s autopilot is currently at the level of partial automation with aspirations of reaching level 5. But although Tesla has big plans to achieve this level of autonomy, we likely will not see fully functioning automated driving for many years.
However, the most significant issue with Tesla’s current level of autonomy is this: most problems with the autopilot system arise when drivers assume it can function at level 5.
User Dependence on the Autopilot System
Tesla’s autopilot is a wonderful tool to help users maintain safety while in motion. However, as mentioned above, the system is not designed for independent use.
Most Tesla crashes that have occurred in connection with autopilot have been the result of users relying on the system to handle the vehicle’s controls. As the vehicle has not yet reached full autonomy, the system is not designed to operate itself. Additionally, because the system is not completely autonomous, it cannot account for human error the way a person can, and human error is the number one cause of Tesla-related fatalities.
Lasting Effects on Driver Safety
As Elon Musk and Tesla push forward to create a fully self-driving vehicle, a new concern has emerged: many question whether Tesla autopilot makes it harder to keep drivers involved in the driving process. Drivers have grown reliant on the autopilot feature, and that reliance has encouraged reckless driving practices.
For example, in 2019, a Florida man was found passed out behind the wheel of a moving Tesla while intoxicated, and police officers had to use their own cars to bring the unmanned vehicle to a stop. A car that lets a sleeping driver keep rolling down the highway arguably poses a safety hazard.
Autopilot Safety Features Still Require Human Supervision
The mass-marketed autopilot system is actually designed as a safety feature to help out the motorist, and newer models of Tesla-designed vehicles have advanced tremendously in this type of technology. These electric vehicles have become equipped with features that can control multiple safety functions at once, including:
- Traffic-Aware Cruise Control – monitors the traffic surrounding your vehicle to maintain safe controls
- Autosteer – helps the driver steer while corresponding with the traffic-aware cruise control feature
- Automatic Emergency Braking – avoids impacts when a foreign object is detected
- Forward Collision Warning – warns of a possible crash detected by the forward-facing monitor
- Blind Spot Collision Warning – detects and warns of cars or foreign objects that have entered your blind spot
Most of these features work with or independently of the autopilot system to warn you of or completely avoid collisions. Though these features are not entirely foolproof, they do help reduce the average number of accidents reported each year.
However, again, these systems are not designed to operate without human supervision; although Tesla’s autopilot can accelerate and brake to avoid collision, it still needs a human operator to account for obstacles not detected by the system. If the system misses an obstacle, a pedestrian, or another motorist while in motion, this could still result in serious injury or a fatality.
In other words, although these safety functions have been designed to assist motorists in their day-to-day travels, they still require an operator to be effective.
Today, driving a Tesla must be treated as driving any other vehicle with automatic safety features. For example, rear-facing cameras are great for knowing if an object is directly behind you while backing up, but they may not account for a car speeding down the road that’s about to rear-end you. It’s essential to work with these safety features to avoid the risk of injury.
Flaws in the Navigation System
The autopilot navigation has been known to:
- Fail to brake for oncoming traffic
- Fail to recognize objects like tractor-trailers
- Miss hazardous objects like concrete barriers
- Stop at green lights, forcing some drivers to override the system to keep moving
Flaws in Braking
Technically, the braking system itself is not flawed, but certain factors can cause the autopilot system to not recognize an oncoming motorist. If you are turning left over multiple lanes at a green light, the Tesla may not notice the vehicle barreling down the highway several lanes over.
The system is made to detect motion within a specified range, and sometimes this range is not large enough to detect an oncoming vehicle. The best practice is to constantly monitor and support the system. If you notice a vehicle about to run a red light, you can step in and brake when the autopilot cannot.
Object Recognition Software
Because the software can only detect objects within a limited range, you may be at risk of hitting objects your Tesla cannot see. In some instances, the software has failed to interpret certain objects as potential hazards.
If a concrete barrier sits low enough to the ground or blends in just right, the system may interpret it as a refraction of light. Additionally, Tesla vehicles are “blind in the back,” so there are certain areas autopilot may not be able to detect.
Stopping at Green Lights
A feature of Tesla’s Navigate on Autopilot helps the driver stop consistently at stoplights. However, the autopilot system has been known to brake at every stoplight, regardless of the color. This can be a handy safety measure in some cases, such as when someone runs a red light in your path.
However, stopping at every light can lead to accidents. If you’re stopping at every green light, this can cause road congestion or crashes if oncoming vehicles aren’t aware of your braking.
Human Intervention is Needed to Turn
The Tesla autopilot system can detect and make turns only with human assistance. For the vehicle to operate on autopilot, the driver must consistently keep their hands on the wheel; if the system does not detect a human monitoring the drive, autopilot will disengage.
However, some people have figured out ways to defeat the system. Steering-wheel weights sold across the internet can trick the system into thinking a driver is manning the controls. Unfortunately, hanging weights on or applying artificial pressure to the steering wheel has resulted in accidents.
Though the car is not yet fully equipped for automatic driving, the system can detect turns as long as there are:
- Clearly painted roads
- Traffic cones
- Stop signs
- Traffic signs
(Source: Car and Driver)
When these directions are in clear view of the Tesla’s autopilot cameras and detectors, the system can direct turns. When the road paint has faded, or these signs are not present, the system may have difficulties navigating the road.
Currently, stops and turns are best managed by human operation. The Tesla autopilot system is great for engaging with and assisting the driver, but it cannot be relied on entirely. The autopilot feature is an impressive boost in technology, but, again, we’re quite a ways away from the system gaining full autonomy.
Should Tesla’s Autopilot Be Used?
Considering that Tesla autopilot still has a few flaws that need perfecting, you may be wondering whether it’s safe to use the feature at all in these electric vehicles. Despite its failures, the Tesla autopilot does come with its share of benefits:
It Makes Driving Safer—When Supervised
Many components of the Tesla autopilot system work together to assist the driver in avoiding accidents. Some of these include:
- Rear and front-facing cameras to alert of possible collisions
- Automatic stopping in the case of a detected collision
- Assisting the driver in maintaining their lane
- Alerting the driver of potentially hazardous obstacles
- Steering wheel controls to avoid sharp turns
Each feature related to Tesla autopilot works together to help an operator navigate their drive safely. Of course, the system is far from perfect and should not be used without a licensed driver manning the wheel, but Tesla autopilot is great for helping the driver navigate tricky highways or hazardous roads.
It Does Help Avoid Accidents
New technology always generates fear and false narratives. Right now, Tesla faces accusations that defective vehicles and autopilot have caused civilian deaths. In reality, when autopilot is enabled, Teslas appear no more likely to cause harm than any other vehicle.
Tesla has tracked billions of miles traveled with their autopilot feature and routinely releases the data related to the number of incidents. The following chart represents the number of vehicle-related incidents in correspondence to miles traveled within the last nine quarters:
| Year: Quarter | Accidents with Autopilot Engaged | Accidents with Tesla Safety Features Only | Accidents without Autopilot or Safety Features |
| --- | --- | --- | --- |
| 2019: Q1 | One per 2.87 million miles | One per 1.76 million miles | One per 1.26 million miles |
| 2019: Q2 | One per 3.27 million miles | One per 2.19 million miles | One per 1.41 million miles |
| 2019: Q3 | One per 4.34 million miles | One per 2.70 million miles | One per 1.82 million miles |
| 2019: Q4 | One per 3.07 million miles | One per 2.10 million miles | One per 1.64 million miles |
| 2020: Q1 | One per 4.68 million miles | One per 1.99 million miles | One per 1.42 million miles |
| 2020: Q2 | One per 4.53 million miles | One per 2.27 million miles | One per 1.56 million miles |
| 2020: Q3 | One per 4.59 million miles | One per 2.42 million miles | One per 1.79 million miles |
| 2020: Q4 | One per 3.45 million miles | One per 2.05 million miles | One per 1.27 million miles |
| 2021: Q1 | One per 4.19 million miles | One per 2.05 million miles | One per 978,000 miles |
As you can see, the rate of crashes drops drastically in Tesla vehicles with autopilot and safety features engaged. Even so, the vehicle-related fatalities associated with Tesla autopilot have left people questioning the safety of the design.
The national average of crashes per miles driven is currently reported as one crash per every 484,000 miles. Even without safety features installed, Tesla cars seem to have a better driving history. And, when paired with autopilot features, the crash numbers are significantly reduced.
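To put those figures in perspective, here is a quick back-of-the-envelope comparison using the 2021 Q1 numbers quoted above and the national average of one crash per 484,000 miles (a sketch of the arithmetic, not an official Tesla calculation):

```python
# Back-of-the-envelope comparison of crash rates, using the 2021 Q1
# figures quoted above. A higher miles-per-crash number means fewer
# crashes per mile driven.
NATIONAL_AVG_MILES_PER_CRASH = 484_000
AUTOPILOT_MILES_PER_CRASH = 4_190_000    # 2021 Q1, autopilot engaged
NO_FEATURES_MILES_PER_CRASH = 978_000    # 2021 Q1, no autopilot or safety features

autopilot_vs_national = AUTOPILOT_MILES_PER_CRASH / NATIONAL_AVG_MILES_PER_CRASH
no_features_vs_national = NO_FEATURES_MILES_PER_CRASH / NATIONAL_AVG_MILES_PER_CRASH

print(f"Autopilot engaged: ~{autopilot_vs_national:.1f}x more miles per crash")   # ~8.7x
print(f"No Tesla features: ~{no_features_vs_national:.1f}x more miles per crash") # ~2.0x
```

In other words, by these figures a Tesla with autopilot engaged logs roughly 8.7 times as many miles between crashes as the national average, and even a Tesla without any features engaged logs about twice as many.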
It’s Useful for Highway Driving
In most cases, highways are the safest place to navigate a Tesla with autopilot enabled. Newer models of Tesla have launched a system with an embedded knowledge of popular roads and highways across the United States.
The system runs much like a GPS to detect the fastest routes and decide which lanes are best for optimal travel time. With autopilot, drivers will have the option to confirm suggested lane changes and alternate routes as the drive is underway. This can provide for a wonderfully efficient drive.
Even without Navigate enabled with your autopilot, Teslas are great for highway travel. Just like any other vehicle, they will require a little extra attention due to high volumes of traffic. However, the built-in safety features will help you with turns, stops, acceleration, and suggested travel routes.
Highways are often the most maintained roads in the United States, so the road paint and traffic signs should be clearly apparent. This helps the autopilot guide the driver down multiple lanes of traffic without the stress of constantly worrying about unseen obstacles. Overall, Tesla’s obstacle detection systems enable their vehicles to have a streamlined highway experience.
The Tesla autopilot system has improved the outcome of certain driver-related collisions, but the system cannot be completely independent while in operation. More often than not, the system fails when it is unattended while the vehicle is in motion.
To avoid injuries or fatalities related to Tesla’s autopilot, the system should be supervised and directed by a human operator. Until it reaches full autonomy, a Tesla still needs to be operated just like any other vehicle.
However, this doesn’t mean you shouldn’t use the autopilot feature at all; just make sure you monitor the system the next time you go on a drive!