NHTSA Investigation: Tesla Autopilot and Full Self-Driving Linked to Hundreds of Crashes
The US National Highway Traffic Safety Administration (NHTSA) conducted an extensive investigation into crashes involving Tesla electric vehicles equipped with Autopilot and Full Self-Driving (FSD) features that occurred between January 2018 and August 2023.
Tesla Accidents
NHTSA Investigation
NHTSA examined 956 crashes involving Tesla vehicles in which Autopilot or Full Self-Driving (FSD) was active. The investigation covered incidents that occurred between January 2018 and August 2023.
Causes of incidents
The agency opened its investigation after a series of Tesla collisions with stationary ambulances parked on the roadside. Most of these crashes occurred at night, when the vehicles' software ignored warning measures such as flashing beacons, cones, and illuminated arrow boards.
Casualties and injuries
These crashes, some of which also involved other vehicles, killed 29 people. In addition, 211 accidents were recorded in which the Tesla struck another vehicle or an obstacle in its path head-on; these most serious crashes killed 14 people and injured 49.
System Deficiencies
Based on its investigation, NHTSA found that the Autopilot and FSD systems were not designed to keep drivers properly involved in the driving task. Despite Tesla's warnings to remain alert when using these features, in many cases drivers became overconfident and let their guard down; when a quick reaction was required, it often came too late.
Accident statistics
According to the data, in 59 accidents Tesla drivers had at least 5 seconds to react before the collision, and in 19 cases the hazard was visible 10 or more seconds before the incident. After reviewing crash logs and data provided by Tesla, NHTSA found that in most of the incidents analyzed, drivers made no attempt to brake or steer to avoid the hazard.
Comparison with Other Manufacturers
NHTSA compared Tesla's Level 2 (L2) automation features with similar systems from other automakers. Unlike its competitors' systems, Autopilot takes the driver out of the loop rather than assisting them, discouraging engagement in the driving task. The name "Autopilot" also misleads consumers about the system's real capabilities, creating overly optimistic expectations.
Investigation Findings
NHTSA concluded that drivers using Autopilot or Full Self-Driving "were not sufficiently engaged in the driving task" and that Tesla's technology "did not adequately ensure drivers were focused on the driving task."
Possible Data Gaps
However, NHTSA acknowledges that its study may be incomplete because of "gaps" in Tesla's telemetry data, which could mean the number of accidents involving Autopilot and FSD is far higher than the agency was able to identify.
Glossary
- NHTSA (National Highway Traffic Safety Administration) is the US federal agency responsible for setting and enforcing vehicle safety standards.
- Tesla is an American company, a manufacturer of electric vehicles and a developer of autonomous driving technologies.
- Autopilot is a driver assistance system from Tesla that provides partial autonomous driving at level 2 (L2).
- Full Self-Driving (FSD) is a more advanced autonomous driving system from Tesla, marketed as a fully autonomous driving system, although in practice it is also classified as Level 2 (L2).
Discussion of the topic – NHTSA Investigation: Tesla Autopilot and Full Self-Driving Linked to Hundreds of Crashes
Latest comments
FrankB
Wow, these statistics are simply shocking! 🤯 Tesla appears to be seriously underestimating the importance of driver involvement. Technology is great, but it should help, not completely replace humans.
JuanS
Yes, exactly. Tesla's Autopilot definitely creates a false sense of security. If the driver relaxes and stops paying attention to the road, it can lead to tragic consequences. 💀 The company should make the system more interactive and require constant driver attention.
LucyF
How about banning such systems altogether until they are truly safe? 😬 Too many lives have already been lost to Autopilot errors. Safety should be priority #1.
GrzegorzW
I completely agree with LucyF. The technology is still crude, and its premature implementation has led to numerous accidents. 😤 Tesla should suspend the use of Autopilot until 100% safety for drivers and pedestrians is guaranteed.
OldGrouchKarl
Bah, all this trendy stuff is one big scam! 😠 In my day, we relied on common sense and attention on the road. No computers behind the wheel, thank you! These newfangled toys only distract drivers and endanger people's lives.
AnnaK
Could it be that the problem isn't just with Autopilot, but with Tesla's overall approach to safety? 🤔 They seem to be too focused on promoting new technologies and not paying enough attention to the risks.
MarioR
I agree, AnnaK. It appears that Tesla is pushing its driver assistance systems too aggressively without considering their limitations. 😕 They should be more careful and transparent in their statements so as not to mislead people.
ElenaP
Let's also not forget the role of the drivers themselves. Many people simply ignore safety rules and blindly rely on the technology. 🚗 We need to raise awareness and train people in the correct use of such systems.