
Tesla is once again under scrutiny in the United States. An investigation by the National Highway Traffic Safety Administration (NHTSA) links Autopilot and the more advanced Full Self-Driving system to a large number of accidents, many of them fatal. Both driver-assistance systems allegedly fail to adequately keep Tesla drivers engaged in the driving task.

The NHTSA report is scathing for Tesla. The car company has claimed for some time that it is on the verge of releasing a fully autonomous vehicle for personal use. However, data from the past few years show that Autopilot still has serious problems, some of them fatal.

Autopilot

In the news: The federal agency NHTSA opened an investigation some time ago after a series of incidents involving Tesla cars. The results of that investigation were published last week.

The details:

  • The NHTSA investigated 956 accidents involving Tesla cars between January 2018 and August 2023. In each case, the driver was using Autopilot at the time of the crash or until shortly before it. Many of the accidents happened in the dark, when Autopilot misjudged the situation.
  • In 29 of those accidents, there were fatalities. In 101 others, there were injuries.

Keeping drivers involved

The report finds that Tesla's Autopilot is not sufficiently designed to keep drivers engaged while driving. Tesla denies this and says it does warn drivers: for example, they are instructed to keep their hands on the wheel and their eyes on the road. According to the NHTSA, drivers were insufficiently focused during the crashes, causing them to react too late when Autopilot made a mistake.

  • In 59 of those crashes, Tesla drivers had five or more seconds to react. In 19 cases, they had ten seconds or more.
  • The analyses show that in the majority of cases, drivers did not brake or swerve to avoid the hazard.

Comparison with other systems

The NHTSA points out in its report that Tesla's Autopilot works differently from competing systems. Because Autopilot disengages completely as soon as the driver intervenes, rather than adjusting and remaining active, drivers have less incentive to stay attentive.

  • The name Autopilot is also misleading. Other brands usually use terms such as "assist," which make clear that driver attention is still required.

Musk's response

Elon Musk has not responded directly to the NHTSA report. Recently, however, he did reiterate that his cars are safer than cars driven entirely by humans. "Stopping autonomous driving, at this point, means killing people," he said just last week.
