
Obtaining a driver’s license is a rite of passage for nearly all teenagers. Now, two researchers from the University of Michigan think self-driving cars should have to pass tests and get a license, too.

Michael Sivak and Brandon Schoettle of the U-M Transportation Research Institute said Thursday in a research paper that autonomous cars should be subject to tests that measure their vision, knowledge of traffic laws and other skills.

A number of automakers, along with technology companies like Google, Apple and Uber, are working toward building fully autonomous vehicles. Many in the industry predict the first versions will be on roads by the end of the decade.

“The graduated driver’s license approach would be applicable should a manufacturer explicitly decide to limit the operation of its vehicles to certain conditions, until improved hardware or software become available,” Sivak said in a statement.

For instance, automakers may feel confident that self-driving cars can handle all situations except nighttime driving and snow. In that case, Sivak said, the vehicle could pass a licensing test covering those limited conditions and receive a provisional license that excludes driving at night and in snow.

The duo said self-driving cars should easily pass any vision-based tests so long as the necessary sensors are kept clean and in good working order, but those tests could run into problems in rough weather. The researchers noted Google has said it will keep its autonomous cars out of snowy areas, and added that even heavy rain is a problem for some prototypes.

The study also said self-driving computer systems have problems recognizing certain hazardous situations like downed power lines or flooded roadways.

Teens often have trouble remembering certain traffic laws, something that shouldn’t be a problem for the cars.

“Programming all driving and traffic laws and regulations into an onboard computer should be relatively easy,” the researchers said. “In principle, all that needs to be done is to program the complete set of laws and regulations that are contained in any state’s booklet for prospective drivers.”

Some problems remain, though. What happens if the self-driving car must decide between crashing into another car and swerving onto the sidewalk and hitting pedestrians? What about laws that most drivers tend to break, like driving five to 10 miles per hour over the speed limit?

“It would be desirable if the resolutions of such ethical dilemmas were consistent with societal norms, as is hopefully the case with human drivers,” the researchers said.

mmartinez@detroitnews.com

(313) 222-2401

Twitter.com/MikeMartinez_DN
