California is reevaluating Tesla’s Full Self-Driving (FSD) test program to determine if the electric-car maker’s software should fall under its DMV’s autonomous vehicle regulations, the Los Angeles Times reported on Tuesday.
FSD is an advanced driver assistance system that handles some driving tasks, but Tesla says it does not make vehicles completely autonomous. Its features “require a fully attentive driver,” according to the company.
If California deems Tesla’s cars autonomous, state law would require the company to report all crashes on public roads, even those that occur while the vehicle is under manual control. Those reports are made public, as is data on disengagements of self-driving systems.
California’s Department of Motor Vehicles (DMV) informed Tesla about the regulator’s review last week, the Los Angeles Times said.
“Recent software updates, videos showing dangerous use of that technology, open investigations by the National Highway Traffic Safety Administration, and the opinions of other experts in this space prompted the reevaluation,” the DMV said, according to the report.
The DMV and Tesla did not immediately respond to Reuters’ requests for comment.
In October last year, Tesla vehicles running the then-latest FSD software, version 10.3, repeatedly issued forward collision warnings when there was no immediate danger, according to videos posted by beta users. Tesla fixed the issue within a day.