Tesla recalls ‘Full Self-Driving’ software that runs stop signs

Tesla’s most recent update to its so-called Full Self-Driving (Beta) software included an “Assertive” mode that allowed vehicles to roll through stop signs at speeds of up to 5.6 miles per hour without coming to a complete stop. It turns out, unsurprisingly, that the feature ran afoul of National Highway Traffic Safety Administration regulations. As documents posted by NHTSA put it, “Failing to stop at a stop sign can increase the risk of a crash.”

The resulting recall covers 53,822 vehicles, including Model S sedans and Model X SUVs from 2016 through 2022, as well as 2017 through 2022 Model 3 sedans and 2020 through 2022 Model Y SUVs. Tesla isn’t aware of any crashes or injuries caused by the feature. A firmware update delivered over the air to disable the rolling stops is expected to be sent out in early February, and owners will receive the required notification letters on March 28.

As we always point out when reporting on Tesla’s Full Self-Driving and Autopilot technology, neither system is an actual autopilot or fully self-driving. These are not legitimate SAE Level 4 autonomy programs, and drivers should not expect their Tesla vehicles to drive themselves without human supervision.

Tesla reportedly agreed to disable the rolling stops with the software update on January 20 after meeting with NHTSA officials on January 10 and 19.

The “rolling stop” feature allowed Tesla vehicles to roll through all-way stop signs if the owner had enabled it. According to the documents posted by NHTSA, the vehicle had to be traveling below 5.6 mph while approaching the intersection, with no “relevant” moving cars, pedestrians, or bicyclists detected nearby, and all roads leading to the intersection had to have speed limits of 30 mph or less. If those conditions were met, the car was allowed to proceed through the intersection at 0.1 mph to 5.6 mph without coming to a complete stop.
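To make the reported logic concrete, here is a minimal sketch in Python of those gating conditions. Everything in it, names, structure, and function, is hypothetical and ours, not Tesla’s implementation; it simply restates the conditions described in the NHTSA documents as a single predicate.

```python
# Illustrative sketch of the rolling-stop conditions described in the
# NHTSA recall documents. All names here are hypothetical; this is not
# Tesla's code, just the reported gating conditions as a predicate.

MAX_APPROACH_MPH = 5.6     # vehicle must be below this while approaching
MAX_ROAD_LIMIT_MPH = 30.0  # every road into the intersection must be at or below this

def rolling_stop_permitted(feature_enabled: bool,
                           approach_speed_mph: float,
                           road_speed_limits_mph: list[float],
                           relevant_movers_detected: bool) -> bool:
    """True only when every condition reported by NHTSA holds.

    If this returns True, the car was reportedly allowed to proceed
    through the intersection at 0.1 to 5.6 mph without a full stop.
    """
    return (feature_enabled
            and approach_speed_mph < MAX_APPROACH_MPH
            and not relevant_movers_detected
            and all(limit <= MAX_ROAD_LIMIT_MPH
                    for limit in road_speed_limits_mph))

# Example: feature on, 4 mph approach, all 25 mph roads, nothing detected nearby
print(rolling_stop_permitted(True, 4.0, [25.0, 25.0, 25.0, 25.0], False))  # True
# Example: a 45 mph road feeds the intersection, so a full stop is required
print(rolling_stop_permitted(True, 4.0, [25.0, 45.0], False))  # False
```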

Safety advocates complain that Tesla should not be allowed to test the vehicles in traffic with untrained drivers, and that the Tesla software can malfunction, exposing other motorists and pedestrians to danger. Most of the other auto companies with similar software test with trained human safety drivers.

Alain Kornhauser, faculty chair of autonomous vehicle engineering at Princeton University, said the recall is an example of NHTSA doing its job as the nation’s road safety watchdog. The recall “shows that they can be effective even if Tesla should have been more responsible in the first place,” he said.

In November, NHTSA said it was looking into a complaint from a Tesla driver that the “Full Self-Driving” software caused a crash. The driver complained to the agency that the Model Y went into the wrong lane and was hit by another vehicle. The SUV gave the driver an alert halfway through the turn, and the driver tried to turn the wheel to avoid other traffic, according to the complaint. But the car took control and “forced itself into the incorrect lane,” the driver reported. No one was hurt in the Nov. 3 crash in Brea, California, according to the complaint.

In December, Tesla agreed to update its less sophisticated “Autopilot” driver-assist system after NHTSA opened an investigation. The company agreed to stop allowing video games to be played on center touch screens while its vehicles are moving.

The agency also is investigating why Teslas on Autopilot have repeatedly crashed into emergency vehicles parked on roadways.

Material from the Associated Press was used in this report.
