We’re on the brink of a new era of transportation, with self-driving cars making decisions as we take the back seat. One analyst recently made a “conservative” estimate that 10 million autonomous vehicles would be on the road by 2020. But this future also forces us to confront an ugly question: If an autonomous vehicle suddenly has to choose which of two people to kill, how should it decide?
What would a human do?
A recent study from the University of Osnabrück in Germany examined how human drivers react when forced into such a choice. Scientists put study participants in virtual reality driving scenarios and made them choose which of two virtual objects to hit. While “driving,” the study participants would suddenly be confronted with a split-second moral decision — if they swerved left, they might hit and kill a virtual man walking his dog, for example. If they swerved right, they might kill a virtual woman walking alone.
“The truth is, humans put a price tag on each and every thing. There’s a price tag on the left lane and a price tag on the right lane,” Peter König, a professor at the University of Osnabrück’s Institute of Cognitive Science, said in a phone call. “Autonomous [vehicles] will decide as a consequence of their construction.”
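To make König’s “price tag” framing concrete, here is a minimal, purely illustrative Python sketch of a cost-based choice between maneuvers. The option names and cost values are hypothetical assumptions, not drawn from any real vehicle’s software; the point is only that whichever costs the designers build in, the car’s choice follows from them.

```python
# Purely illustrative sketch of König's "price tag" idea: each possible
# maneuver is assigned a cost, and the vehicle picks the cheapest one.
# The option names and numbers below are hypothetical, not from any
# actual autonomous-driving system.

def choose_maneuver(options):
    """Return the maneuver whose assigned cost is lowest."""
    return min(options, key=lambda name: options[name])

# Hypothetical costs attached to each outcome; a real system would derive
# such values from sensor estimates and the manufacturer's design choices.
options = {
    "swerve_left": 1.0,    # e.g. estimated harm to a pedestrian on the left
    "swerve_right": 0.7,   # e.g. lower estimated harm on the right
    "brake_straight": 0.4, # e.g. controlled stop, lowest estimated harm
}

print(choose_maneuver(options))  # -> "brake_straight"
```

However the numbers are set, the decision is determined by them — which is what König means when he says autonomous vehicles “will decide as a consequence of their construction.”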