{"id":4907,"date":"2024-04-28T04:08:37","date_gmt":"2024-04-28T04:08:37","guid":{"rendered":"https:\/\/aitesonics.com\/nhtsa-concludes-tesla-autopilot-investigation-after-linking-the-system-to-14-deaths-161941746\/"},"modified":"2024-04-28T04:08:37","modified_gmt":"2024-04-28T04:08:37","slug":"nhtsa-concludes-tesla-autopilot-investigation-after-linking-the-system-to-14-deaths-161941746","status":"publish","type":"post","link":"https:\/\/aitesonics.com\/nhtsa-concludes-tesla-autopilot-investigation-after-linking-the-system-to-14-deaths-161941746\/","title":{"rendered":"NHTSA concludes Tesla Autopilot investigation after linking the system to 14 deaths"},"content":{"rendered":"
The National Highway Traffic Safety Administration (NHTSA) has concluded an investigation<\/a> into Tesla\u2019s Autopilot driver assistance system after reviewing hundreds of crashes, including 13 fatal incidents that led to 14 deaths. The organization has ruled that these accidents were due to driver misuse of the system.<\/p>\n However, the NHTSA also found that \u201cTesla\u2019s weak driver engagement system was not appropriate for Autopilot\u2019s permissive operating capabilities.\u201d In other words, the software didn\u2019t prioritize driver attentiveness. Drivers using Autopilot or the company\u2019s Full Self-Driving technology \u201cwere not sufficiently engaged,\u201d because Tesla \u201cdid not adequately ensure that drivers maintained their attention on the driving task.\u201d<\/p>\n The organization investigated nearly 1,000 crashes from January 2018 through August 2023, accounting for 29 total deaths. The NHTSA found that there was \u201cinsufficient data to make an assessment\u201d for around half (489) of these crashes. In some incidents, the other party was at fault or the Tesla drivers weren\u2019t using the Autopilot system.<\/p>\n The most serious were 211 crashes in which \u201cthe frontal plane of the Tesla struck a vehicle or obstacle in its path\u201d; these were often linked to Autopilot or FSD. These incidents led to 14 deaths and 49 serious injuries. In 78 of these incidents, the agency found that drivers had enough time to react but didn\u2019t, failing to brake or steer to avoid the hazard despite having at least five seconds to do so.<\/p>\n That\u2019s where complaints against the software come into play. The NHTSA says that drivers would simply become too complacent, assuming that the system would handle any hazards. By the time they needed to react, it was too late. 
\u201cCrashes with no or late evasive action attempted by the driver were found across all Tesla hardware versions and crash circumstances,\u201d the organization wrote. The imbalance between driver expectation and the operating capabilities of Autopilot resulted in a \u201ccritical safety gap\u201d that led to \u201cforeseeable misuse and avoidable crashes.\u201d<\/p>\n The NHTSA also took issue with the branding of Autopilot, calling it misleading and suggesting that it leads drivers to assume the software has total control. By contrast, rival companies tend to use branding with words like \u201cdriver assist.\u201d Autopilot implies, well, an autonomous pilot. California\u2019s attorney general and the state\u2019s Department of Motor Vehicles are also investigating Tesla<\/a> for misleading branding and marketing.<\/p>\n Tesla, for its part, says that it warns customers that they need to pay attention while using Autopilot and FSD, according to The Verge<\/em><\/a>. The company says the software features regular indicators that remind drivers to keep their hands on the wheel and eyes on the road. The NHTSA and other safety groups have said that these warnings do not go far enough and were \u201cinsufficient to prevent misuse.\u201d Despite this criticism, CEO Elon Musk recently promised that the company will continue to go<\/a> \u201cballs to the wall for autonomy.\u201d<\/p>\n The findings may represent only a small fraction of the actual number of crashes and accidents related to Autopilot and FSD. 
The NHTSA indicated that \u201cgaps in Tesla\u2019s telematic data create uncertainty regarding the actual rate at which vehicles operating with Autopilot engaged are involved in crashes.\u201d This means that Tesla only receives data from certain types of crashes, with the NHTSA claiming the company collects data on around 18 percent of crashes reported to police.<\/p>\n With all of this in mind, the organization has opened another probe<\/a> into Tesla. This one examines the OTA software fix issued in December after two million vehicles were recalled<\/a>. The NHTSA will evaluate whether that Autopilot recall fix is effective enough.<\/p>\n","protected":false},"excerpt":{"rendered":" The National Highway Traffic Safety Administration (NHTSA) has concluded an investigation into Tesla\u2019s Autopilot driver assistance system after reviewing hundreds of crashes, including 13 fatal incidents that led to 14 deaths. The organization has ruled that these accidents were due to driver misuse of the system. However, the NHTSA also found that \u201cTesla\u2019s weak driver […]<\/p>\n","protected":false},"author":6,"featured_media":4907,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[4025,362,5388,48,95,484],"tags":[4026,365,5390,59,101,486],"yoast_head":"\n