How To Decide When Your Self-Driving Car Should Kill You

Self-driving cars have a lot of learning to do before they can replace the roughly 250 million vehicles on U.S. roads today. They need to know how to navigate when their pre-programmed maps are out of date. They need to know how to visualize the lane dividers on a street that's covered with snow.

And, if the situation arises, they'll need to know whether it's better to mow down a group of pedestrians or spare their lives by steering off the road, killing all passengers onboard.

This isn't a purely hypothetical question. Once self-driving cars are logging serious miles, they're sure to find themselves in situations where an accident is unavoidable. At that point, they'll have to know how to pick the lesser of two evils.

The answer could determine whether self-driving cars become a novelty item for the adventurous few or gain widespread acceptance among the general public.

In other words, the stakes are huge.

Nearly 34,000 people die in car crashes in the U.S. each year, and another 3.9 million are injured badly enough to go to a hospital emergency room, according to the Centers for Disease Control and Prevention. The National Highway Traffic Safety Administration says 93% of traffic accidents can be blamed on human error, and the consulting firm McKinsey & Co. estimates that if humans were taken out of the equation, the savings from averted crashes would add up to about $190 billion a year.

"Us having to drive our own cars is responsible for a tremendous amount of misery in the world," said University of Oregon psychologist Azim Shariff, who studies the factors that prompt people to make moral decisions.

Shariff teamed up with psychological scientist Jean-François Bonnefon of the Toulouse School of Economics in France and Iyad Rahwan, who studies social aspects of artificial intelligence at the MIT Media Lab,...
