Feds: Companies must report crashes involving self-driving technology

Riley Beggin
The Detroit News

Washington — Federal regulators will now require companies to report crashes involving self-driving and driver-assist systems, the National Highway Traffic Safety Administration announced Tuesday. 

The new policy comes as the agency has opened 30 investigations into Tesla crashes resulting in 10 deaths, and after competitors and other federal agencies have called for increased accountability and oversight for the emerging technology.

[Photo: A Jan. 22, 2018, file still frame from video provided by KCBS-TV shows a Tesla Model S electric car that crashed into a fire engine on Interstate 405 in Culver City, Calif., while the vehicle's Autopilot system was activated.]

"Advanced driver assistance systems can help promote safety by helping drivers avoid crashes and by reducing the severity of crashes," said Steven Cliff, NHTSA's acting administrator. "However, these technologies can be potentially dangerous if drivers don't understand how to use them, or worse, choose to misuse or abuse them."

The order applies to Level 2 driver assistance systems, in which the car can take over steering and accelerating or braking while the driver must remain alert behind the wheel, and to Level 3 through Level 5 automated driving systems, which include features more commonly associated with self-driving cars. No fully self-driving cars are yet available to consumers.

It requires companies to report crashes in which someone is taken to a hospital, someone dies, the vehicle is towed away, airbags deploy, or pedestrians or bicyclists are involved. Reports are due the day after the company learns of the crash and must be updated within 10 days.

Companies will also be required to report all crashes involving injury or property damage every month and must update reports monthly with any new information. 

Compliance with the order is mandatory and "any company that fails to comply could be subject to serious enforcement consequences, including substantial civil penalties," said chief agency counsel Ann Carlson.

The order marks an increase in regulation of driver assistance and self-driving systems, which to date have been subject only to a voluntary reporting program for companies testing automated vehicles. NHTSA also formally indicated late last year that it intended to propose safety rules for the technology.

Safety advocates called the decision a welcome but overdue step to improve safety and move toward developing stronger rules for semi-autonomous vehicles.

"The agency has apparently finally heard the Center for Auto Safety's long-standing call for the federal government to engage in oversight of the unregulated technology currently being used on America's roads with scant oversight due to minimal data collection," the safety group said in a statement Tuesday. 

Leading automakers, safety advocates and other federal agencies have raised alarms about the unregulated technology amid a series of high-profile crashes mostly involving Tesla vehicles, including two earlier this year near Lansing and Detroit. 

The Internet is rife with examples of people asleep behind the wheel of a Tesla on the highway or riding in the back seat of their car. Consumer Reports found earlier this year that a Tesla can easily be tricked into driving with no one in the driver's seat. Critics argue that Tesla's Autopilot and "Full Self-Driving" features, both driver-assist systems that do not make the vehicle fully autonomous, have misleading names that give customers the false impression they don't need to pay attention on the road.

But Tesla Inc. CEO Elon Musk told Automotive News last year that it would be "ridiculous" to rename the Autopilot system. 

When people crash it "is because somebody is misusing it and using it directly contrary to how we've said it should be used," Musk said. "They've ignored the car beeping at them, flashing warnings, doing everything it can possibly do. It's not like some newbie who just got the car, and based on the name, thought they would instantly trust this car to drive itself. That's the idiotic premise of being upset with the Autopilot name."

In addition to the 30 Tesla crashes, NHTSA is also investigating six other crashes involving driver assistance systems in a Lexus, two Volvos, two Cadillacs and a Navya shuttle bus.

In April, the leading advocacy group for automakers selling vehicles in the U.S., the Alliance for Automotive Innovation, called upon automakers to ensure that any semi-autonomous vehicles come with driver monitoring technology to keep drivers engaged and to clarify messaging so as not to confuse consumers. The group does not represent Tesla. 

The next month, Ford Motor Co. CEO Jim Farley called upon the federal government to set standards for semi- and fully-autonomous vehicles. The National Transportation Safety Board has also criticized NHTSA for failing to regulate the technology and urged it in a letter earlier this year to increase federal oversight and reporting requirements.

Tesla did not respond to a request for comment on the order from The Detroit News. 

While the move appears motivated in part by the Tesla crashes, it may provide a more vivid picture of how semi- and fully-automated driving systems from many different companies are performing on the road, said Bryant Walker Smith, an associate professor at the University of South Carolina law school who specializes in autonomous vehicles. 

"This is an important piece of NHTSA's information-gathering and its public education. I was really gratified to see the emphasis on transparency," he said, noting that most of the reports will not be shielded from the public for proprietary reasons.

The order will set NHTSA up to move forward on potential safety standards and improve recalls and investigations, he said. But he warned that the advancement of autonomous systems shouldn't distract from the dangers of old-fashioned human driving. Traffic deaths rose 7% last year, the biggest jump in 13 years. 

"We should be concerned about automated driving. We should be terrified about human driving," he said. "We want to get these technologies deployed, if they represent safety improvements, as quickly as possible ... Having this kind of long-term, public conversation is really important."

rbeggin@detroitnews.com

Twitter: @rbeggin