Tests show Tesla's self-driving technology can't detect children on the road

A safety tech advocacy group claimed Tuesday that Tesla’s Full Self-Driving software poses a potentially lethal threat to child pedestrians.

According to safety tests conducted by the Dawn Project, the latest version of Tesla’s Full Self-Driving (FSD) Beta software repeatedly crashed into a stationary child-sized mannequin in its path. The claim that the technology has a clear problem recognizing children forms part of an advertising campaign calling on the public to pressure Congress to ban Tesla’s self-driving technology.

In several tests, professional test drivers found that the software, released in June, failed to detect the child-sized figure, and the car struck the mannequin at an average speed of 25 mph. Dawn Project founder Dan O’Dowd called the results “extremely disturbing.”

“Elon Musk says Tesla’s Full Self-Driving software is ‘amazing,’” O’Dowd added. “It is not. It is a deadly threat to all Americans.

“More than 100,000 Tesla drivers are already using their cars’ fully self-driving modes on public roads, putting children in communities across the country at great risk.”

O’Dowd argued that the test results show the self-driving software should be banned until Tesla proves its cars “do not knock children down at pedestrian crossings.”

Tesla has repeatedly rejected claims that its self-driving technology is too primitive to ensure the safety of car occupants and other road users.

After a fiery crash in Texas that killed two people in 2021, Musk tweeted that Autopilot, a less sophisticated version of FSD, was not switched on at the moment of the collision.

At the company’s shareholder meeting earlier this month, Musk said the Full Self-Driving software had greatly improved, and that he planned to make it available to all owners who request it by the end of the year. Questions about its safety, however, continue to grow.

In June, the National Highway Traffic Safety Administration (NHTSA) announced it was expanding its investigation to 830,000 Tesla vehicles across all four current model lines. This expansion came after analysis of a large number of accidents revealed patterns in car performance and driver behavior.

NHTSA said its expanded investigation would examine the degree to which Tesla’s Autopilot technology and associated systems “may exacerbate human factors or behavioral safety risks by undermining the effectiveness of the driver’s supervision.”

A second NHTSA investigation is also underway to determine whether the removal of forward-looking radar sensors from some newer Tesla vehicles is causing the cars to brake for no reason. This phenomenon, known as “phantom braking,” can lead to wrecks.

Since 2016, the agency has investigated 30 crashes involving Tesla vehicles equipped with self-driving systems, 19 of which were fatal. NHTSA’s Office of Defects Investigation is also examining the company’s Autopilot technology in at least 11 crashes in which Teslas struck emergency vehicles.

Many such wrecks are not investigated by NHTSA. And of the roughly 400 crashes involving cars with driver-assistance systems that automakers reported between July 2021 and May of this year, Teslas accounted for more than all other manufacturers combined.