This is How Many Tesla Autopilot Deaths There REALLY Are


Tesla Autopilot is a revolutionary technology meant to make driving dramatically safer. Even so, you might be wondering how many deaths have been attributed to it. Of course, when used properly, a driver will always be at the wheel. But what about those occasions when Autopilot causes an accident independent of driver interaction?

While statistics on this topic vary, it appears that Autopilot has resulted in 20 fatalities. Eleven of these deaths occurred in the United States, and the other nine occurred internationally. The deaths came from Autopilot-related incidents in a total of 16 crashes.

Tesla Autopilot is built on the premise that it is much safer than traditional driving when it comes to avoiding accidents. The data so far supports that premise, but keep in mind that Autopilot is not meant to function unless a driver is directly involved in the process. Continue reading to learn how fatal crashes factor into Autopilot's safety statistics.

Have Any Crashes Been Attributed Directly to Autopilot?

It is often difficult to determine just how many crashes have been attributed to Autopilot. The issue is finding out who or what is at fault in any given accident. When it comes to Autopilot, we have to remember that the system has to be enabled. Just because a car is equipped with Autopilot does not mean that it was in use when a crash occurred.

In the United States, the National Transportation Safety Board (NTSB) is typically responsible for investigating crashes involving new technology such as Tesla’s Autopilot. Since Autopilot’s release in 2015, the NTSB has directly implicated the system in several crashes in which people have died.

The NTSB has also directed the blame towards Autopilot in numerous other crashes where nobody has died. It is important to understand that being involved in an accident when Autopilot was engaged does not necessarily mean that the technology was the root cause. Much of it will come down to driver intent. 

Many drivers feel that Tesla has promoted Autopilot as far more capable of avoiding accidents than it actually is. Some drivers, for example, rely on it too heavily once cruise control is set. Others depend on it to correct their steering on the highway.

If either of these factors is evident in a crash, an argument can be made either way as to whether the driver or Autopilot is at fault. We have to remember that Autopilot has always been designed as a driver’s assistant: the driver must remain in complete control of the vehicle at all times. If a driver allows Autopilot to take over, even for a second, then the driver would be at fault in an accident. However, it is not always that cut and dried.

Examples of Fatal Crashes That Have Been Attributed to Autopilot

Sadly, there are cases where Autopilot has been found to be a significant factor in a crash that ended in a fatality. Each of these findings came after an exhaustive investigation by the NTSB, and they do not come without controversy. Even in these examples, you will notice that there is a case to be made that the driver was at fault.

2016 Crash in Florida Involving a Tesla Model S

This crash involved a Tesla Model S at a time when Autopilot had been on the market for just over a year. The driver ended up driving the car underneath a tractor-trailer that crossed in front of the Model S. Autopilot was not designed to identify such a situation, so it could not help the driver avoid the collision.

In this case, Autopilot was found to be a possible cause of the fatal accident. If the driver assumed that the system would alert him to any vehicle he was likely to collide with, then a case can be made that he was not at fault.

2018 Crash in California Involving a Tesla Model X

This was a single-vehicle accident that occurred in Mountain View, California. A Tesla Model X was being driven on a highway when it ran into a divider. The driver of the Tesla was killed instantly.

After a lengthy investigation, it was determined that the Autopilot system on the Model X automatically steered the vehicle into the highway divider. This is believed to be the result of certain limitations in the Autopilot feature designed by Tesla. 

In this case, the driver was also possibly at fault due to his lack of response. It was determined that he was using a game application on his cell phone and that he likely over-relied on Autopilot to do the steering for him.

2019 Crash in Florida Involving a Tesla Model 3

This crash took place in Delray Beach and also involved a tractor-trailer. This time it was a Tesla Model 3 that drove under the trailer, killing the driver. In this case, it was discovered that the Autopilot system was engaged but did not provide any warning that a collision was imminent.

The driver did not have his hands on the steering wheel at the time of the crash, and Autopilot did not warn him that he needed to. None of the collision-avoidance features that Autopilot promotes were activated, meaning the automatic emergency braking system did not react.

After this crash, a recommendation was issued that Tesla limit Autopilot to the specific driving conditions for which it was designed, so that the system could not be engaged outside of them.

2021 Crash in Texas Involving a Tesla Model S

This was the most recent fatal crash involving Autopilot and a Tesla vehicle. It occurred in April of 2021 and involved a Model S that ran off the road near Houston, Texas. Nobody was in the driver’s seat at the time of the accident, leading investigators to conclude that the driver may have been showing off by letting Autopilot drive the car without him.

How Do You Determine Who or What is at Fault?

The findings from this most recent fatal crash have not been released yet. However, it certainly appears to be a case of a driver not using Autopilot as it was intended. Nowhere in the world is Autopilot approved for driverless operation.

Because of this, some would argue that Autopilot did not cause the death and that this is a case of driver negligence. At the same time, others would argue that Tesla should never allow Autopilot to function unless a driver is in the driver’s seat with his or her hands on the steering wheel.

Is Tesla’s Autopilot Really Safer Than Human Drivers?

The technology behind Autopilot is sound. It is meant to assist drivers in their quest to be safer on the road. When used correctly, there is little denying that Autopilot is safer than human drivers.

However, it is important to know the limitations of the technology. The fact that more than a few fatal crashes have been attributed to Autopilot is reason for concern. For now, it must remain a driver-assistance system rather than an option for a fully self-driving car.

Wrap-Up 

When talking about deaths related to Tesla Autopilot, it is important to know what that means. Humans can still cause accidents even when using Autopilot. To determine that a fatal crash resulted from the technology itself, the driver would need to be completely uninvolved in the decisions that were made. To date, that is only reported to have happened one time.
