Experts: Tesla could be liable in fatal autopilot crash

Michael Martinez and Michael Wayland

A fatal accident involving a Tesla Model S in “Autopilot” mode raises questions about who’s at fault when a semi-autonomous vehicle is involved in a crash, and it could change the course of how automakers test and implement the technology.


Lawyers and autonomous vehicle experts believe Tesla Motors Inc. could ultimately be liable in the May 7 crash in Florida that killed Ohioan Joshua Brown.

Details of the accident — including that the driver may have been watching a “Harry Potter” movie on a portable DVD player when the crash occurred — were revealed over the past two days after the National Highway Traffic Safety Administration said it was opening an investigation into about 25,000 Teslas to “examine the design and performance of any automated driving systems in use at the time of the crash.”

Some argue the warnings and notifications that come with Tesla’s beta-stage Autopilot will be enough to deflect any blame, while others say the California electric vehicle manufacturer could be at fault depending on the results of NHTSA’s investigation. Most agree this could shape future autonomous vehicle regulations, set legal precedent and impact public opinion of the technology.

“I don’t think anyone is surprised there’s been a fatality,” said Wayne Cohen, a Maryland trial lawyer and law professor at George Washington University. “The question is whether a fatality will impact the regulatory landscape. It probably will. You’re putting a real person and a terrible tragedy on the face of automated driving.”

Paul Grieco, one of the attorneys representing Brown’s family, told The Detroit News he could not say whether the family will take legal action against the car company until officials determine the cause of the crash.

Cohen and others believe Tesla “absolutely” will face a lawsuit as a result of the crash. That’s in addition to potential class-action lawsuits involving other owners.

“The issue ultimately will be whether Tesla’s automated driving product is defective and whether that defect caused the fatality,” he said. “As we learn more about the case and what caused the fatality … that could expose Tesla for damages well beyond this individual case.”

One factor that could determine liability is whether Brown followed Tesla’s instructions about remaining attentive and keeping his hands on the wheel.

Frank Baressi, the driver of the truck that Brown hit, said Brown was “playing Harry Potter on the TV screen” when the collision occurred, according to the Associated Press. The movie “was still playing when he died,” Baressi said. He acknowledged he didn’t see the movie, only heard it. A Florida highway patrolman said a portable DVD player was found inside the vehicle but did not say whether it was on.

Gail L. Gottehrer, partner at law firm Axinn, Veltrop & Harkrider LLP, said no one can prevent any class-action lawsuits from happening, but “it’s going to be hard to make a compelling argument that Tesla was trying to deceive anybody” like in recent scandals involving Volkswagen Group for its diesel emissions or General Motors Co.’s faulty ignition switches.

“I think it’s a strong argument on their part that they didn’t hide anything, and the people who chose to get into those vehicles and operate them knew what was expected of them, which was hands on the wheel, you may have to be alert, you may have to take over,” she said.

Tesla said Autopilot is turned off every time the car is shut down and “requires explicit acknowledgment that the system is new technology and still in a public beta phase before it can be enabled.” When drivers activate Autopilot, it reminds them it is an “assist feature” and to keep their hands on the steering wheel.

Bryant Walker Smith, an assistant professor with the school of law at the University of South Carolina who specializes in autonomous vehicle regulations, said it could be argued the notifications with Autopilot aren’t sufficient.

“The fact that Tesla recognized and warned of misuse does not give it a get-out-of-liability-free card,” he said. “Lots of claims can be made about design of the system, level of supervision of the user and interaction.”

If there’s any fear about negative backlash for Tesla, investors aren’t showing it. Tesla’s stock fell slightly after the news broke Thursday, but rebounded Friday and closed up 1.9 percent to $216.50 a share.

Safety during test phases

The fatal accident, Gottehrer believes, will raise two questions: whether the public, regulators and state governments are comfortable allowing technology like this on the road while still in a test phase, and how much blame falls on an automaker when drivers knowingly ignore its instructions.

“Any kind of product that you have, if you use it in a way that you’re told not to, it’s dangerous,” said Gottehrer, an expert speaker on liability at the recent auto-tech conference TU-Automotive in Novi. “Something bad could happen, and we have to get people into the realm of thinking it’s not a toy, it’s still a vehicle. It’s not a game. It’s not something you should be putting on YouTube.”

Brown was one of a number of Tesla owners who routinely posted videos of the car and its capabilities in Autopilot. Brown had a popular YouTube page showing the feature both struggling and performing well in various scenarios. In one video, Autopilot struggled through a sharp curve that crossed a train track. A month before he died, Brown posted a video of the feature preventing a crash after a truck veered sharply into his lane.

Other automakers have taken more cautious approaches to rolling out semi-autonomous features.

General Motors Co. confirmed in January it was delaying the rollout of its semi-autonomous, hands-free highway driving system called Super Cruise. It was slated to be available on the 2017 Cadillac CT6.

A GM spokesman on Friday said the Tesla Autopilot crash will not change the Detroit automaker’s plans to introduce the technology sometime in 2017.

In its January statement, GM stressed it would not introduce the technology until it was deemed safe: “Super Cruise breaks new ground with true hands-free capability for the highway and will be introduced in 2017. Getting the technology right and doing it safely is most important, so the exact month of introduction cannot be announced at this time.”

A Cadillac spokesman in January also told Wired that it would not release Super Cruise to “hit a date, nor will we ‘beta test’ with customers.”

Infiniti, BMW, Mercedes, Volvo and Audi all offer semi-autonomous features similar to Tesla’s Autopilot.

BMW launched its “Active Lane Keeping Assistant” on its 7 Series in October. The feature’s steering and lane control technology can keep the vehicle in the center of the lane, but a “Hands on Detection” feature ensures that the driver continues to control the vehicle.

“The safety of our customers is always top priority at the BMW Group. That is why we are always cautious about releasing new driver-assistance functions for the customer — even if we already master the technology to a large extent,” the company said in a Friday statement.

Smith said nobody knows how fast new self-driving features should be introduced to the driving public.

“Tesla could be really helpful in identifying a lot of weird situations,” he said. “Ultimately, that knowledge could speed innovation … and save a lot of lives.

“On the other hand, if there are a lot of these instances, that could slow progress — a la the Hindenburg — in a way that ultimately could prevent safety systems from being implemented.”

Slowing down change

The move toward self-driving cars — from advanced driver-assistance systems such as Tesla’s Autopilot to fully driverless cars — is happening at a blistering pace.

BMW on Friday said it plans to introduce a fully autonomous car by 2021, and a number of automaker executives said they expect the technology by the end of the decade. NHTSA is expected to release national guidance later in July for the deployment of such vehicles, and a number of states, including Michigan, are introducing legislation to speed the testing and development of autonomous cars on public roads.

Experts say this fatal crash could pump the brakes on that development.

“Automated driving has been hyped now for years,” Smith said. “It’s a pet peeve of mine when people say it’s ready. Demonstrably, that’s not true.”

Ultimately, he thinks Brown’s death won’t deter most of the public from adopting future semi-autonomous or fully autonomous technology.

“Many will put this in the front of conversation for a few days, then when a different kind of system comes out that offers them tangible benefits, they’ll choose convenience over concern,” Smith said.

(313) 222-2401

Staff Writer Melissa Burden contributed.