Rider interaction with automated cars under study
Chandler, Arizona –
Automakers, suppliers and researchers are looking for cues of nervousness and comfort with the idea of self-driving cars. They want to know just how easy or difficult it is to use a smartphone app to hail and direct an autonomous vehicle. And don’t forget consumers’ reactions to unexpected roadway encounters.
The growing research into how humans interact with self-driving cars and their trust level in the machines will help automakers, suppliers and others in the tech-auto space better develop autonomous vehicles that some say could be on the road within the next five years.
There are challenges: a number of recent studies have shown the public’s reluctance to ride in self-driving cars and give up control of the steering wheel and brakes to a robot. Yet automakers and suppliers are pouring billions into the development of robocars they believe will make roadways safer and dramatically reduce traffic fatalities and serious injuries.
The chipmaker Intel Corp. in late June conducted a small study of 10 consumers — ages 25-65 — to see how back-seat passengers engaged and interacted with an autonomous car. The company, which recently acquired Israel-based tech company Mobileye, is looking to learn what is needed with a self-driving car to establish trust between people and machines — and how people may use a self-driving car to hail a ride.
While Intel has been studying and working in self-driving vehicle development for years, the study was the first in which the company recorded consumers’ feedback from inside a self-driving vehicle on the road. Intel plans to release its findings later this month.
“We’re looking for things like ... are they nervous when they first get in, and show anxiety, maybe how long does it take for someone to get comfortable and kind of stop paying attention and just go on their phone and surf Facebook,” Marcie Miller of Intel’s Automated Driving Marketing group said.
Intel has developed its own smartphone apps to hail self-driving cars, much as people do with an Uber or Lyft. A screen on the rear window of Intel’s self-driving test cars welcomes passengers by name. Cameras inside the car watch for nonverbal feedback.
In the backseat, a screen allows a rider to control where the car takes passengers.
“In these cars you will basically have something that’s very similar to a Google maps interface, where you can just type in where you want to go,” said Al-Yaman Awad, head of the advanced vehicle lab for Intel in Chandler. “It opens up a map, shows you where the car is, shows you where the destination is, and you can actually follow the map as it’s driving in real time.”
Awad said the study also would look at how riders react when a self-driving car has to re-route, or when another car backs out in front of it.
Several studies and surveys indicate automakers and suppliers have an uphill climb to gain consumer acceptance and trust.
AAA, in a survey earlier this year, found 78 percent of U.S. drivers said they would be afraid to ride in an autonomous car, and just 10 percent reported they would actually feel safer with self-driving vehicles on the road. A recent J.D. Power study found a growing percentage of people say they don’t trust self-driving car technology.
Doug Parks, General Motors Co. vice president of autonomous technology and vehicle execution, said in-car screens that show riders everything an autonomous car senses and sees, and how the technology works, could be a powerful way to communicate with passengers. But he would not confirm whether GM plans to implement anything like that in its future autonomous vehicles.
GM Chairman and CEO Mary Barra said customers’ trust in self-driving vehicles will grow as people experience them.
“The best way is to actually get in the vehicle,” she told reporters in June. “You can talk about it, but until you can experience it .... (And) that’s why we think putting it in ride-sharing fleets is going to be so very, very important to get that experience.”
Building trust in self-driving vehicles also comes from outside the vehicle, Awad said. Pedestrians or bicyclists crossing the street in front of an autonomous vehicle want to know the car sees them, so visual indicators such as light signals could be used, he said.
“It’s got to be easily understood and it’s got to be able to work in dense environments as well where you have a lot of people crossing the roads,” he said.
Nissan Motor Corp.’s research center in Silicon Valley is studying how autonomous vehicles will make their intentions known to pedestrians and other motorists. Nissan, in a 2016 research abstract, said it has developed and is testing technology that communicates to others on the road that a self-driving car’s sensors have identified them. Nissan said the system can communicate what the vehicle plans to do such as yielding or going through an intersection, though it did not provide specifics.
A number of companies and auto suppliers are looking at cameras and other ways to ensure the drivers are paying attention so they can take control if necessary.
Auto supplier Robert Bosch GmbH is working to launch a self-driving car system for freeway use. The German multinational engineering and electronics company says its system has an interior camera that recognizes drivers. Once they select a destination, drivers are told which parts of the route they will have to control and where they can relax.
Bosch says that to activate the self-driving mode, a user has to press two buttons on the steering wheel at the same time for several seconds. The roadway on the vehicle’s map display turns blue to show when autonomous mode is activated. The display also tells drivers what the car’s sensors see and how much time they have before they must take control. A camera monitors drivers’ eye movements; if their eyes stay closed for a long stretch, warnings are activated.
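The two safeguards Bosch describes, a deliberate two-button activation and a camera watching for prolonged eye closure, amount to simple interlocks. A minimal sketch of that logic follows; the function names and thresholds are illustrative assumptions, not Bosch’s actual software.

```python
# Illustrative sketch of two driver-facing interlocks like those Bosch
# describes. All names and timing thresholds here are assumptions.

HOLD_REQUIRED_S = 2.0       # assumed: both buttons held this long to activate
EYES_CLOSED_LIMIT_S = 2.0   # assumed: eye closure beyond this triggers warnings

def can_activate_autonomous(left_button_held_s, right_button_held_s):
    """Activation requires both steering-wheel buttons held simultaneously
    for several seconds; the shorter hold time is the limiting one."""
    return min(left_button_held_s, right_button_held_s) >= HOLD_REQUIRED_S

def check_driver_alertness(eye_closed_duration_s):
    """Map a measured stretch of closed eyes to a system response."""
    if eye_closed_duration_s < EYES_CLOSED_LIMIT_S:
        return "monitor"    # normal blinking, no action needed
    return "warn"           # prolonged closure: activate warnings

print(can_activate_autonomous(2.5, 3.0))   # both held long enough
print(check_driver_alertness(3.5))         # eyes closed too long
```

The key design point the deliberate two-button press illustrates is that autonomous mode should never engage by accident, so a single brief touch is not enough.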
Another German automotive manufacturing company, Continental AG, plans to launch its Cruising Chauffeur self-driving function for highway driving in 2020. An interior camera and algorithms analyze a driver’s gaze pattern. Through “artificial empathy,” the autonomous car draws conclusions about the driver’s attention level and whether the driver is ready to regain control. It escalates its attempts to get the driver’s attention through visual and audio cues, plus seat vibrations; if the driver doesn’t take over when prompted, the self-driving car will safely move to the shoulder and stop.
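The escalation Continental describes, visual cues, then audio, then seat vibration, and finally a safe stop if the driver never responds, can be sketched as a short sequence with a fail-safe default. The stage names and ordering details below are illustrative assumptions, not Continental’s actual design.

```python
# Illustrative sketch of an escalating takeover-request sequence like the one
# Continental describes for Cruising Chauffeur. Stage names are assumptions.

ESCALATION_STAGES = ["visual_cue", "audio_cue", "seat_vibration"]

def takeover_sequence(driver_responds_at=None):
    """Run the cue escalation in order. If the driver responds at stage index
    `driver_responds_at`, stop there; otherwise fall through to a safe stop."""
    actions = []
    for i, stage in enumerate(ESCALATION_STAGES):
        actions.append(stage)
        if driver_responds_at == i:
            actions.append("driver_takes_over")
            return actions
    # No response after all cues: the fail-safe is to pull over and stop.
    actions.append("pull_to_shoulder_and_stop")
    return actions

print(takeover_sequence(driver_responds_at=1))
print(takeover_sequence())
```

The notable property is that inattention never leaves the car without a plan: the sequence always terminates either in a human takeover or in a controlled stop on the shoulder.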
“The sooner the operator gets used to these cues and gets trust developed into the vehicles our fatality numbers are going to go down drastically,” said Ibro Muharemovic, who leads Continental’s project. “That’s what our goal here is. The studies that we’re doing are really to find the optimal line of how much information is needed vs. how much is required, so that an individual can feel safe and secure behind a vehicle such as a vehicle that drives itself.”