Tesla subpoenas may presage formal probe, ex-official says

Ryan Beene, Gabrielle Coppola and Alan Levin
Bloomberg News

Freshly disclosed records suggest the National Highway Traffic Safety Administration may be preparing a formal investigation into Tesla Inc.’s driver-assistance system Autopilot, a former agency official said.

The agency has issued at least five subpoenas since April 2018 for information about Tesla vehicle crashes, according to NHTSA correspondence with the electric-car manufacturer released Tuesday by PlainSite. The legal transparency group obtained the documents through a public records request for communications regarding Autopilot.


NHTSA also asked Tesla to provide results of internal tests on a sub-component of the Model 3 sedan’s automatic emergency braking system, and sales figures of vehicles sold with and without Autopilot since mid-2016, among other requests, according to the records.

“I think what this shows is that NHTSA has concerns about Autopilot performance,” Frank Borris, a former director of the Office of Defects Investigation at NHTSA, said after reviewing the documents. He said the subpoenas could mean the agency “is gathering information that would be supportive of a formal investigation.”

NHTSA doesn’t have an active defect probe into Tesla, and the agency may not open one. The regulator declined to comment directly on whether it will, saying in an emailed statement that it’s “committed to rigorous and appropriate safety oversight of the industry and encourages any potential safety issue be reported to NHTSA.”

“Any regulator like NHTSA would be interested in new vehicle technologies and how they make our highways safer,” Tesla said in an emailed statement. “We routinely share information with the agency while also balancing the need to protect customer privacy. Tesla has required subpoenas when customer information is requested in order to protect the privacy of our customers.”

Tesla described the documents as “business as usual,” but Borris said use of subpoenas is atypical and suggests a heightened interest in Autopilot.

Tesla Chief Executive Officer Elon Musk has staunchly defended Autopilot, saying the system improves safety and monitors more of the road than a human driver can alone. The company releases quarterly data that it says demonstrates the technology improves safety. Its latest report says Tesla registered one accident every 3.27 million miles driven with Autopilot engaged, compared with one every 1.41 million miles driven without use of the system or the company’s active-safety features.

“No one knows about the accidents that didn’t happen, only the ones that did,” Tesla said in a March 2018 blog post about a fatal crash involving a Model X in California. “The consequences of the public not using Autopilot, because of an inaccurate belief that it is less safe, would be extremely severe.”

NHTSA sent the subpoenas and other requests amid a series of highly publicized Tesla crashes, dating to early 2018, that attracted scrutiny from federal agencies and safety advocates alike.

Two subpoenas were issued March 11, according to the records. One sought information and data about a crash 10 days earlier in Delray Beach, Florida, in which a Model 3 driver was killed when the car slammed into the side of a semi-truck with Autopilot engaged.

NHTSA and the National Transportation Safety Board have investigated several Tesla crashes in recent years, and at times NTSB has clashed with company officials.

The safety board removed Tesla from its regular participation in its probe of the Model X crash in March 2018, saying the company disclosed information about the case in spite of an agreement not to do so while the probe was underway.

Tesla said in a statement at the time it was withdrawing from its status as a party in the investigation because NTSB rules prohibited it from being transparent. The company vowed to continue providing technical data to the safety board.

The NTSB in 2017 found that Tesla’s Autopilot design, which allowed drivers to engage it on roads for which it wasn’t designed, contributed to the cause of a fatal crash involving a Model S in Florida in 2016.

The records released this week show that NHTSA has continued to closely monitor Tesla’s Autopilot technology after the agency in early 2017 closed an earlier probe into the system that found no defect.

In May, Consumer Reports called for the agency to open another inquiry. The magazine published a study of automated driving systems months earlier that found Tesla’s Autopilot performed better than rival systems, but it knocked the company for allowing the system to be used on roads it isn’t yet able to handle. Autopilot also lagged peers in keeping drivers engaged: General Motors Co.’s Super Cruise feature took four seconds to warn a driver to pay attention, while Autopilot waited 24 seconds.

Data on driver engagement included in Tesla’s communications with NHTSA point to a similar issue, said David Friedman, a deputy administrator at NHTSA during the Obama administration who is now vice president of advocacy at Consumer Reports.

“Data like this show the system does not appear to be able to keep the driver engaged, and it’s one company, not the others in the space,” Friedman said. “To me, that raises real red flags about a possible defect.”