U.S. ends probe of fatal Tesla crash without recall
Washington — The U.S. government will not order recalls or fines as it concludes its investigation of a fatal crash involving a 2015 Tesla Model S that was operating with its automated driving system activated. But federal regulators warned automakers they should make it clear to drivers that semi-automated cars are not capable of fully driving themselves.
The National Highway Traffic Safety Administration said Thursday it did not find safety defects in Tesla’s semi-autonomous “Autopilot” system in the car that was involved in the May 7, 2016, crash in Williston, Florida. It is believed to be the first U.S. death in a vehicle being driven in semi-autonomous mode.
Critics have accused Tesla of misleading drivers into believing its Autopilot-equipped cars are fully capable of driving themselves, although the company has said it warns drivers that they must be ready to take over in an emergency.
U.S. Transportation Secretary Anthony Foxx said Thursday that such warnings are more important than the branding of features like Tesla’s Autopilot.
“As you start to see more variations in the marketplace, what one company calls something might be a little different to a consumer if it’s called the same thing by another company,” Foxx told reporters during a final briefing in Washington. “So I think the most important thing is that the manufacturers be clear about what their technology can do and what it can’t do and what consequences there may be in the event that someone crosses that line.”
He noted that Tesla’s vehicles are only at level two on the Transportation Department’s five-point scale that rates the capability of a car to drive itself.
NHTSA said its examination of the deadly crash did not identify any defects in the design or performance of the automatic emergency braking or Autopilot systems.
It said automatic emergency braking systems used in the automotive industry through model year 2016 are “rear-end collision avoidance technologies that are not designed to reliably perform in all crash modes, including crossing-path collisions.”
And it said, “The Autopilot system is an advanced driver-assistance system that requires the continual and full attention of the driver to monitor the traffic environment and be prepared to take action to avoid crashes.”
Tesla CEO Elon Musk rushed to highlight NHTSA’s conclusion that there was no safety defect. “Report highlight: ‘The data show that the Tesla vehicles crash rate dropped by almost 40 percent after Autosteer installation,’ ” he tweeted.
NHTSA’s decision to spare Tesla the harsh penalties that have been doled out to other automakers for safety violations in recent months brings to a close a months-long investigation that roiled the debate about the future of self-driving cars.
The company faced questions after 40-year-old Joshua Brown was killed in the Florida crash. His 2015 Model S was operating with the driver-assist system engaged.
Brown’s Tesla collided with a semi-trailer that was undetected by the car’s Autopilot feature when the truck turned left in front of it. The car’s sensors could not distinguish the white side of the truck from the brightly lit sky. Florida police said the roof of the car struck the underside of the trailer and the car passed beneath. Brown was declared dead at the scene.
“Neither Autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied,” Tesla said in a blog posting on June 30 after the NHTSA investigation was launched.
Karl Brauer, executive analyst for Autotrader and Kelley Blue Book, said the road to the autonomous car is going to be messy, with computers gradually increasing their ability to replace humans over the next 10 years.
“The Tesla fatality is an example of a human driver assigning too much capability to the car’s sensors and computing power,” Brauer said. “It almost certainly won’t be the last incident on this journey, but people need to remember one thing — there are currently no fully autonomous cars available for public purchase or use. The advanced driver-assist systems found on many of today’s vehicles still require the full attention of the driver at all times.”
Consumer groups had seized upon the deadly crash to argue that Tesla is rushing self-driving cars to market.
John Simpson, privacy project director at the Santa Monica, California-based Consumer Watchdog group, said Thursday that regulators are “blaming the human” in their finding of no safety defect.
“NHTSA has wrongly accepted Tesla’s line and blamed the human, rather than the technology and Tesla’s aggressive marketing,” Simpson said in an email. “The very name ‘Autopilot’ creates the impression that a Tesla can drive itself. It can’t. Some people who apparently believed Tesla’s hype got killed. Tesla CEO Elon Musk should be held accountable.”
Tesla has defended the system and has maintained that demand for its vehicles is still strong.
NHTSA spokesman Bryan Thomas said Thursday that the federal probe of the Tesla crash was “thorough.”
The agency’s Office of Defects Investigation said the fatal Florida crash appears to have involved “a period of extended distraction” that lasted at least 7 seconds.
Thomas noted Thursday that Tesla updated its Autopilot software in September to give drivers more frequent warnings and institute a “three-strikes” rule that disables the Autopilot feature after a driver fails three times to respond.
He said Tesla’s upgrades were not related to the federal probe, although he said regulators were happy with the outcome. “There can be continuing safety improvements, and we always encourage that,” he said. “But that doesn’t mean we found a defect on May 7.”
Thomas added that federal regulators hope auto companies will continue to give drivers adequate warnings about the limits of semi-autonomous features.
“Generally speaking, our view is that the manual should be clear about the limitations of the vehicle,” he said. “But we strongly believe it’s not enough to put it in the manual and hope drivers are going to read it.”