Ford wants the self-driving cars of the future to be better listeners and observers: It’s aiming to build vehicles that pick up on the smallest changes in drivers’ vocal inflections and facial expressions, and adapt the drive accordingly.
The company is partnering with a German technology university to create advanced in-car microphone and camera systems. Those systems could help vehicles learn what songs a driver likes, adjust the cockpit's ambient lighting to fit the mood or activity, and even know when the driver is in need of a good joke.
The team at RWTH Aachen University also will work on systems that track gestures or eye movements, allowing drivers to answer calls or adjust the air conditioning without speaking or touching a surface, Ford announced Tuesday.
“We’re looking at how to use those pieces of technology to understand the emotions of the person, and primarily to give the driver a more rich interaction with the voice commands of the vehicle,” said Colin Smith, a Ford spokesman.
At the Detroit auto show this year, multiple suppliers displayed new tactile surfaces, biological sensors in seats, augmented-reality screens and advanced alert systems aimed at protecting occupants and making them more comfortable.
It’s part of an industry-wide push to develop enhanced — and ultimately safer — driving experiences in which a vehicle reacts to the driver in the same way a driver might react to the road.
Ford anticipates roughly 90 percent of its new vehicles will offer voice recognition by 2022. Voice-control capabilities will be offered in 75 percent of its vehicles within the same time frame. Ford has said it will have a fully driverless car without a steering wheel or pedals for braking and acceleration in 2021.
“We think that the voice is one of the primary ways to access (the vehicle),” Smith said. “We’re always looking to enhance that experience.”
Ford’s researchers are working toward a time when systems “evolve into personal assistants” that might schedule appointments or order takeout.
Through its Sync 3 system, Ford vehicles already recognize commands like "I'm hungry" and "I need coffee." The company announced in January that it is partnering with Amazon, Samsung and Sygic to bring features that integrate those companies' services, such as Alexa and Samsung Gear smartwatches, to 2018 model-year vehicles.
“We’re well on the road to developing the empathetic car which might tell you a joke to cheer you up, offer advice when you need it, remind you of birthdays and keep you alert on a long drive,” said Fatima Vital, senior director of automotive marketing at Nuance Communications, which helped Ford develop voice recognition in its Sync system.
Nuance says that within the next two years, voice-control systems could advance to prompt drivers with suggestions such as “Would you like to order flowers for your mom for Mother’s Day?”; “Shall I choose a less congested but slower route home?”; or “You’re running low on your favorite chocolate and your favorite store has some in stock. Want to stop by and pick some up?”
Systems like those Ford is researching will become far more vital as auto companies develop autonomous cars, said Bryant Walker Smith, an engineering professor at the University of South Carolina who focuses on the legal aspects of such advanced technology.
“If you’re going to step it up, an almost necessary part of this requires monitoring the human to make sure they’re monitoring the vehicle,” he said.
The new systems could be designed to fill the free time that drivers will have in an autonomous vehicle.
“The company wants to be the entity that fills that time,” he said. “Companies are terrified of becoming the boring providers.”