
Autopilot and Full Self-Driving linked to hundreds of crashes and dozens of deaths

Tesla remains in the eye of the storm as scrutiny grows over accidents and deaths allegedly caused by drivers' overconfidence in their vehicles. The US regulator NHTSA says drivers treat these electric cars as more capable than they actually are. Elon Musk's Autopilot and Full Self-Driving have now been linked to hundreds of crashes and dozens of deaths.

Autopilot and Full Self-Driving are in trouble

In March 2023, a North Carolina student was stepping off a school bus when he was struck by a Tesla Model Y traveling at "freeway speeds," according to a federal investigation published on April 25. The Tesla driver was using Autopilot, the company's advanced driver-assistance feature that Elon Musk insists will eventually lead to fully autonomous cars.

As reported by The Verge, the 17-year-old student who was struck was airlifted to a hospital with serious injuries. What the investigation found after examining hundreds of similar incidents was a pattern of driver inattention, combined with shortcomings in Tesla's technology, that led to hundreds of injuries and dozens of deaths.

Drivers using Autopilot or the system's more advanced sibling, Full Self-Driving, "were not sufficiently engaged in the driving task," and Tesla's technology "did not adequately ensure that drivers maintained their attention on the driving task."

NHTSA concluded.

Image: A crashed Tesla Model 3 whose Autopilot system is under investigation

In total, NHTSA investigated 956 crashes, spanning January 2018 through August 2023. In these crashes, some of which involved other vehicles striking the Tesla, 29 people died.

There were also 211 incidents "in which the frontal plane of the Tesla struck a vehicle or obstacle in its path." These crashes, often the most severe, killed 14 people and injured 49 others.


Tesla cars investigated for colliding with parked emergency vehicles

NHTSA launched its investigation after several crashes in which Tesla drivers plowed into emergency vehicles parked on the side of the road. Most of these incidents occurred after dark, with the software ignoring scene-control measures, including warning lights, flares, cones, and an illuminated arrow board.

The agency concluded in its report that Autopilot, and in some cases FSD, was not designed to keep the driver engaged in the driving task. Tesla says it warns its customers to pay attention while using Autopilot and FSD, which means keeping their hands on the wheel and eyes on the road.

However, NHTSA says that in many cases drivers became complacent and lost focus, and by the time they had to react, it was often too late.

In 59 crashes examined by NHTSA, the agency concluded that Tesla drivers had enough time, “five seconds or more,” to react before colliding with another object. In 19 of these crashes, the hazard was visible for 10 seconds or more before impact.

By analyzing crash records and data provided by Tesla, NHTSA found that drivers failed to brake or steer to avoid the hazard in most of the incidents analyzed.

Crashes with no evasive action attempted, or a late evasive attempt by the driver, were found across all Tesla hardware versions and crash circumstances.

NHTSA reported.

An exaggerated sense of Level 2 "autonomy"

NHTSA also compared Tesla's Level 2 (L2) automation features with products available in other companies' vehicles. Unlike those systems, Autopilot disengages when a driver applies manual steering input rather than letting the driver adjust their steering while the system stays active. According to the regulator, this behavior "discourages" drivers from staying involved in the driving task.

A comparison of Tesla's design choices with those of its L2 peers found that Tesla was an industry outlier in its approach to L2 technology, pairing a weak driver-engagement system with Autopilot's permissive operating capabilities.

The agency reported.


Even the brand name "Autopilot" is misleading, NHTSA said, conjuring the idea that drivers are not in control of the vehicle. While other companies use some variation of "assist" in their branding, Tesla's products lead drivers to believe they are more capable than they are. The California Attorney General and the state's Department of Motor Vehicles are both investigating Tesla for misleading marketing and branding.

NHTSA concedes its investigation may be incomplete due to "gaps" in Tesla's telemetry data, which could mean there are far more crashes involving Autopilot and FSD than the agency was able to find.

Tesla issued a voluntary recall late last year in response to the investigation, pushing an over-the-air software update that added more warnings to Autopilot. NHTSA also said it has opened a new investigation into that recall after a number of safety experts said the update was inadequate and still allows misuse.

The findings contradict Musk's insistence that Tesla is an artificial intelligence company on the verge of launching a fully autonomous vehicle for personal use. The company plans to unveil a robotaxi later this year, which is meant to usher in this new era for Tesla.

During this week's first-quarter earnings call, Musk doubled down on the notion that Tesla's vehicles are safer than human-driven cars.

If you have, at scale, a statistically significant amount of data that shows conclusively that the autonomous car has, say, half the accident rate of a human-driven car, I think that's hard to ignore, because at that point stopping autonomy means killing people.

Elon Musk said.
