DETROIT - Most cars are equipped with a compass to help them get from one place to the next, but with self-driving cars on the horizon, will that compass still be necessary? Experts said it will be, but rather than a magnetic one, they're talking about a moral compass.
Science fiction writer Isaac Asimov conceived of the Three Laws of Robotics. The first is that a robot may not injure a human being or, through inaction, allow a human being to come to harm.
When it comes to programming self-driving cars, it's a sensible rule, but as all drivers know, in the real world there is always the possibility of an unavoidable crash.
What should a self-driving car do if a person suddenly runs into its path? Should it brake abruptly but still hit the person? Should it swerve to the side, potentially hitting someone else? Or go off the road and put the vehicle's occupants at risk?
The question becomes even more complex if you imagine not just one person off to the side of the road but a group of people.
Professor Huei Peng is the director of M-City, the University of Michigan's autonomous vehicle research center.
"Hopefully, we never enter the situation that I only have three choices and kill someone," Peng said.
In a survey of millions of people across hundreds of countries, the most agreed-upon solutions were to spare the most lives, to spare humans over animals, and to spare the young over the elderly.
If any of the solutions involve sacrificing the occupants of the vehicle, how keen would consumers be to buy cars that are programmed to sacrifice them?
"Basically, people when they are making decisions to buy a car to protect their self or their family members, they will still be somewhat greedy," Peng said.
According to Peng, current sensor technology isn't robust enough to give a vehicle the awareness needed to make those kinds of highly specific decisions in a split second.
Copyright 2019 by WDIV ClickOnDetroit - All rights reserved.