Tesla, Uber crashes prompt questions about robot cars

Keith Laing
The Detroit News

Washington — Tesla CEO Elon Musk complains that crashes of cars driven in semi-autonomous mode are given outsized attention after a string of accidents involving his company's vehicles being operated on Autopilot.

But publicity over similar incidents, including a self-driving Uber vehicle that struck and killed a pedestrian, appears to be eroding consumer trust in the emerging technology, recent polling shows. It is leading to calls for Tesla to abandon the Autopilot name – and for all carmakers to adopt universal and clear names for various levels of autonomous vehicle technology.

A Tesla sedan collided with a parked police car in Southern California on Tuesday. The driver, who walked away with minor injuries, told police the car was on Autopilot.

Another Model S in Autopilot mode crashed into a fire truck in Utah on May 11. The car accelerated in the 3.5 seconds leading up to the crash, onboard data shows. The driver, who did not touch the steering wheel in the 80 seconds before impact while she checked for the best route to her destination, told police she thought automatic braking would stop the car before it hit another vehicle. 

A Tesla Model X with Autopilot engaged collided with a highway median in California in March, killing the driver, whose hands were not on the wheel for six seconds before the crash.

“It’s super messed up that a Tesla crash resulting in a broken ankle is front page news and the ~40,000 people who died in US auto accidents alone in past year get almost no coverage,” Musk tweeted after the May 11 crash. “What’s actually amazing about this accident is that a Model S hit a fire truck at 60mph and the driver only broke an ankle.”

Last week, Uber shut down its self-driving program in Arizona after a March incident in which a pedestrian was struck and killed by one of Uber's robotic Volvos. Federal investigators found the self-drive system detected the woman about six seconds before the SUV hit her, but didn’t stop because emergency braking was disabled.

The public appears to be taking notice. A study released by AAA last week shows consumer confidence in self-driving cars is waning. It found 73 percent of American drivers are afraid to ride in a fully self-driving vehicle, up significantly from 63 percent in late 2017.

At the same time, newer cars equipped with features like lane-keeping ability, blind-spot detection and automatic braking are helping drivers travel more safely every day. While not fully self-driving systems, they are steps on the way to full autonomy. The Insurance Institute for Highway Safety says lane-departure warnings lower the rate of single-vehicle, sideswipe and head-on crashes by 11 percent and lower the rates of injury in similar crashes by 21 percent. 

U.S. Sen. Gary Peters, D-Bloomfield Township, who has worked on legislation that would craft rules for fully self-driving cars, said many of the problems that have cropped up with semi-autonomous vehicles will be addressed when cars are completely capable of driving themselves.

"It’s important to move quickly to modernize federal safety rules and provide regulators with the needed tools and resources to ensure advanced vehicles are deployed safely, but the companies deploying driver assist technologies today must do more to ensure people understand both the capabilities and limitations of the vehicles they are using,” Peters said in a statement. 

Understanding those limitations is behind a AAA push for a "common sense, common nomenclature and classification system" for semi-autonomous vehicles to ease confusion over the capabilities of different vehicles. Tesla in particular has come under fire for calling its limited semi-autonomous system Autopilot.

The Center for Auto Safety and Consumer Watchdog last week urged the Federal Trade Commission to launch an investigation of Tesla's "deceptive" Autopilot marketing.

"Two Americans are dead and one is injured as a result of Tesla deceiving and misleading consumers into believing that the Autopilot feature of its vehicles is safer and more capable than it actually is," the groups wrote in a letter to FTC Chairman Joseph Simons. "The marketing and advertising practices of Tesla, combined with Elon Musk’s public statements, have made it reasonable for Tesla owners to believe, and act on that belief, that a Tesla with Autopilot is an autonomous vehicle capable of 'self-driving.'" 

Tesla has maintained that it makes clear to drivers that Autopilot is not a system that is capable of full autonomy. The company argues the message has gotten through to drivers more than critics admit. "The feedback that we get from our customers shows that they have a very clear understanding of what Autopilot is, how to properly use it, and what features it consists of,” Tesla said in a statement.

Despite that, Tesla agreed last week to pay $5 million to settle a class-action lawsuit alleging that its Autopilot 2.0 software, which was supposed to add safety features such as automated emergency braking and side-collision warning, does not work as promised.

David Friedman, director of cars and product policy and analysis for Consumers Union, said it is not surprising that widely publicized crashes are eroding trust in self-driving cars, even though some of the vehicles involved are not autonomous.

"But when you have companies advertising that they are, what do you expect?" he asked. "All you have to do is Google people in these cars and it's totally predictable the ways people will overestimate the capability of these cars." 

Friedman said it is important for carmakers to do more than put warnings in owners' manuals about the limits of their semi-autonomous systems. As an example of an automaker doing things right, he pointed to General Motors Co.'s new semi-autonomous Super Cruise system, which only allows drivers to go hands-free once they are centered in highway lanes and prompts drivers to keep their eyes on the road with warning lights and seat vibrations. When the cars exit highways, Super Cruise turns itself off and leaves all the driving to the human pilot.

Michelle Krebs, executive analyst for Autotrader, said the truth about the safety of semi-autonomous cars that are currently on the road is likely somewhere in the middle of arguments that have been made by Musk and arguments from safety advocates. 

"Self-driving vehicles are new, in their pioneering stage and controversial," she said. "Obviously, when they get into an accident, they will get particular scrutiny by investigators and the media. Tesla, because it is high-profile and made so by Elon Musk, is of great public and media interest, so accidents that involve Teslas also will get undue attention." 
(202) 662-8735

Twitter: @Keith_Laing