Tesla has always been at the forefront of the push for self-driving vehicles, but like any other piece of advanced tech, it has its share of issues. If you drive a Tesla, you might have noticed common glitches in its autopilot technology.
The autopilot in your Tesla is glitchy because cars are not yet fully autonomous; today’s Teslas are not meant to operate without human oversight. Glitches are the result of limitations in current sensor technology, a lack of accurate map data in dense cities and rural areas, poor weather conditions, and difficulty distinguishing certain road obstructions from others.
Tesla CEO Elon Musk is vocal about which technology will lead to the best fully autonomous vehicles, though other top car manufacturers disagree. Read on to find out more about Tesla’s current autopilot features, and why drivers are seeing those annoying glitches.
What Causes Glitches in Tesla’s Autopilot?
Glitches in your Tesla autopilot are quite common—in fact, there are whole sections of the Tesla forum dedicated to the discussion of this problem.
Most often, glitches in your Tesla autopilot are related to one of three common problems:
- Poor weather conditions
- A flaw in the car’s ability to accurately read visual cues
- Common problems with new technology
To better understand the reason behind inevitable glitches in Tesla autopilot, you need to understand the six levels of car autonomy and how self-driving cars interact with the world around them.
Levels of Driverless Car Autonomy
There are six levels of driverless car autonomy:
| Level of Automation | Features |
| --- | --- |
| Level 0: No Automation | No automated features. |
| Level 1: Driver Assistance | Either radar-guided cruise control or lane-centering assistance, but not both. |
| Level 2: Partial Automation | Both radar-guided cruise control and lane centering. Meant for highways only, not city driving. As of 2020, most new vehicles with driver-assistance features are at this level, and Tesla’s Autopilot is officially classified here as well. |
| Level 3: Conditional Automation | The car can drive itself in certain conditions, for example on a highway with a median or in stop-and-go traffic, but the driver must be ready to take over. |
| Level 4: High Automation | The car can drive itself in most situations, including highway and city or town driving. In some inclement weather, or when the vehicle doesn’t understand a situation, a human must take control, possibly a remote operator. |
| Level 5: Full Autonomy | The car can drive itself without an operator; it only needs to know where to go. A passenger in a Level 5 car could sleep without worrying about taking over control. |
(Source: Gear Brain)
The Differences Between LiDAR, RADAR, and Computer Vision
Self-driving cars can interact with the world around them using several technologies, including LiDAR, RADAR, and Computer Vision. There are perks and downsides to each system.
RADAR is the oldest of these technologies—you probably associate it with military or aviation use. RADAR sends out radio waves that bounce off objects to get a sense of the distance, size, and presence of foreign bodies within a certain radius. RADAR can operate in virtually any environment—in the dark, in inclement weather—but it doesn’t give you a very clear image of the object in view.
LiDAR, on the other hand, is a newer technology. It fires laser pulses that bounce off objects and return to a sensor, which then builds a 3D map of the surroundings. LiDAR is remarkably accurate, though interference between returning pulses can make the system think an object is there that doesn’t exist. Because it supplies its own light, LiDAR works fine in the dark, but rain, snow, or fog can scatter its laser pulses and degrade its readings.
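Both RADAR and LiDAR rely on the same round-trip principle: time how long a pulse takes to bounce off an object and return, then halve the round trip to get the one-way distance. The toy sketch below is illustrative only—the function name and example timing are invented, not code from any real vehicle:

```python
# Toy time-of-flight calculation, the ranging principle behind both
# RADAR and LiDAR: a pulse travels out to an object and back, so the
# one-way distance is half the round-trip time times the speed of light.

SPEED_OF_LIGHT = 299_792_458  # metres per second

def distance_from_echo(round_trip_seconds: float) -> float:
    """Return the one-way distance (in metres) to the reflecting object."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse that echoes back after 1 microsecond came from roughly 150 m away.
print(round(distance_from_echo(1e-6), 1))  # → 149.9
```

A real sensor fires thousands of such pulses per second and fuses the individual distances into the 3D map (for LiDAR) or coarse range readings (for RADAR) described above.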
Computer Vision is Elon Musk’s autopilot technology of choice. It is essentially artificial intelligence that reverse-engineers human vision; basically, it teaches computers to see like we do.
Computer Vision in one form or another has been around since the 1960s, but it has recently matured into an advanced system for object identification and classification. Computer Vision has a wide variety of uses, and Elon Musk is investing in this approach to advance to Level 5 driverless car autonomy.
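To make “teaching computers to see” a little more concrete, here is a toy sketch of one of computer vision’s oldest building blocks: an edge-detecting convolution. The tiny image, kernel, and helper function are invented for illustration and have nothing to do with Tesla’s actual software:

```python
# Detect vertical edges in a tiny grayscale "image" by sliding a small
# kernel across it. A high response marks a dark-to-bright transition.

image = [
    [0, 0, 9, 9],
    [0, 0, 9, 9],
    [0, 0, 9, 9],
]

# A simple vertical-edge kernel: dark-to-bright transitions score high.
kernel = [[-1, 1]]

def convolve(img, k):
    """Slide the kernel over the image, recording the response at each spot."""
    kh, kw = len(k), len(k[0])
    out = []
    for r in range(len(img) - kh + 1):
        row = []
        for c in range(len(img[0]) - kw + 1):
            row.append(sum(
                img[r + i][c + j] * k[i][j]
                for i in range(kh) for j in range(kw)
            ))
        out.append(row)
    return out

# The strong responses mark the column where dark pixels meet bright ones.
print(convolve(image, kernel))  # → [[0, 9, 0], [0, 9, 0], [0, 9, 0]]
```

Modern systems chain millions of *learned* filters like this one inside neural networks, which is what lets them recognize lanes, cars, and pedestrians rather than just edges.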
Some combination of these technologies will likely be used in the future when cars are fully autonomous.
Why Do the Autopilot Glitches Happen?
Understanding how self-driving cars interact with the world is great, but why do these technologies lead to glitches in the system? Let’s dive into the most common glitches and why they happen.
Poor Weather Conditions
Self-driving cars struggle in poor weather conditions and in areas where map data is complicated or sparse.
Poor weather conditions often lead to problems for self-driving cars. Camera-based systems need good light and a clear view of the road, and LiDAR suffers when rain, snow, or fog scatters its laser pulses. This has been a big problem for car manufacturers because driving in inclement weather is the norm in many parts of the country, especially in the winter months.
Inaccurate Visual Readings
There can also be problems with the car’s ability to read specific visual cues—users have reported that cars using LiDAR, for example, see a plastic bag in the road as a speed bump and slow down unnecessarily on the highway. These glitches might seem harmless, but they can lead to accidents if the vehicle operator isn’t attentive.
However, problems reading visual cues are not only common for autopilot systems—they happen for us too. How many times have you been driving and thought you saw an animal scurry into the road, but it was actually a leaf or a plastic bag?
The problem is that when autopilot systems make these mistakes, they don’t have the same capacity to correct them that we do—at least not yet; this can lead to the car thinking that something is a major obstruction in the road when it’s perfectly safe to drive over it.
Autopilot Technology is Still Relatively New
Common glitches with new technology are a part of life in the 21st century. With new technology comes many small problems. We often don’t know these problems exist until we test new systems in the field for many weeks or months—and this means that new Tesla drivers can feel like guinea pigs for self-driving technology.
Many of the small glitches that Tesla drivers report are fixed with each update, in the same way that glitches are often fixed with app updates on our phones.
The Future of Autopilot
Your glitchy Tesla might be frustrating sometimes, but is it safe? And what are the ramifications of autopilot—good and bad—when cars gain Level 5 autonomy in the future?
The reality is that even with glitches in the system, Tesla’s autopilot and similar systems are improving our safety on the road. In fact, according to Tesla’s own safety reports, cars with Autopilot engaged are involved in roughly one-ninth as many accidents per mile driven as the average US vehicle.
Computers may make errors that we wouldn’t, but we also make mistakes every single day while driving—statistics on road accidents prove that we aren’t the safest drivers in the world, either.
The safety features in new vehicles aren’t just convenient; they are lengthening our lives and reducing the risk of accidents on the road. So, while you might sigh at your Tesla when it doesn’t measure the distance between your car and the next down to the inch, remember that autopilot is keeping you safe.
Final Thoughts: Glitches are Part of Life—For Now
Despite some common glitches, there are plenty of bonuses when it comes to autopilot. Cars that can drive themselves can deliver groceries or medicine in rural areas that don’t have the infrastructure for regular courier services, serve as taxis, and park themselves. They can reduce transportation costs and traffic congestion, provide last-mile services to commuters in major cities, and reduce fuel consumption.
The benefits of fully autonomous cars outweigh the risks and annoyance, especially as autopilot technology improves.
So why is your Tesla autopilot glitchy? There are many technical reasons, but in short, it’s because the technology is new. We are at the forefront of the autonomous car era, and glitches are a part of that era—at least for now.