Does Your Car Have a Moral Dilemma?

Would you buy a self-driving car knowing that it may sacrifice your life?

(Photo by Silver Blue via Flickr)

When I was a kid, all I wanted was a self-driving car named “K.I.T.T.” (Knight Industries Three Thousand). Now it’s 2016 and self-driving cars are just about here, but there is a moral dilemma that comes along with them.

In the event of an unavoidable crash, should the vehicle save its passengers or the pedestrians?

(Photo by Becky Stern via Flickr)

Self-driving cars could save a million lives a year by eliminating the 90 per cent of car crashes caused by human error.

Should the vehicle’s algorithms be programmed to swerve and sacrifice its passengers if it means saving the lives of many pedestrians? Or should the car protect its occupants at all costs?
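To make that trade-off concrete, here is a deliberately simplified sketch in Python. Everything in it is hypothetical: the scenario, the casualty numbers, and the names (CrashOption, utilitarian_choice, self_protective_choice) are invented for illustration, and no real vehicle reduces the decision to a few lines like this. It only shows how the two philosophies in the question above would pick differently in the same situation.

```python
from dataclasses import dataclass

@dataclass
class CrashOption:
    """One possible maneuver and the lives it puts at risk (toy numbers)."""
    name: str
    passenger_deaths: int
    pedestrian_deaths: int

def utilitarian_choice(options):
    """Minimize total deaths, even if that sacrifices the passengers."""
    return min(options, key=lambda o: o.passenger_deaths + o.pedestrian_deaths)

def self_protective_choice(options):
    """Protect the occupants first; only then minimize pedestrian deaths."""
    return min(options, key=lambda o: (o.passenger_deaths, o.pedestrian_deaths))

# A hypothetical unavoidable-crash scenario:
stay = CrashOption("stay in lane", passenger_deaths=0, pedestrian_deaths=5)
swerve = CrashOption("swerve into barrier", passenger_deaths=1, pedestrian_deaths=0)

print(utilitarian_choice([stay, swerve]).name)      # -> swerve into barrier
print(self_protective_choice([stay, swerve]).name)  # -> stay in lane
```

Same inputs, opposite answers. That gap is the moral dilemma: whichever rule a manufacturer writes into the software, it has taken a side before the car ever leaves the lot.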

It’s a question with no straightforward answer, and while many companies are busy hunting down solutions to the technical side of self-driving vehicles, the ethical side remains unresolved.

“We’re at the edge of an age where we are programming behaviors into machines,” said Wendy Ju, executive director of interaction design at Stanford University’s Center for Design Research.

(Photo by Mark Doliner via Flickr)

According to Iyad Rahwan, a study co-author and a professor at MIT’s Media Lab, “Even if you started off as one of the noble people who are willing to buy a self-sacrificing car, once you realize most people are buying self-protective ones, then you are really going to reconsider why you are putting yourself at risk to shoulder the burdens of the collective when no one else will.”

So, would you buy a self-driving car knowing that it may sacrifice your life?

MIT’s Media Lab lets you take the test yourself with its Moral Machine… Who would you save?