Driverless Cars Pose a Moral Dilemma


Driverless cars pose a moral dilemma for their makers

Humanity is looking forward to the day it can hop into a driverless car and let a robot do the driving. That luxury may come at a cost, researchers warn: driverless cars pose a moral dilemma for their makers. Should the cars put the safety of pedestrians above all else, or should they protect their passengers first?

New research revealed that most people would favor their own safety over that of pedestrians when riding in an automated car. If something goes wrong on the road and the robot has to choose between hitting a wall and hitting a passer-by, buyers expect the passenger to be the one who survives.

Science magazine published findings from high-profile psychologists and computer scientists who studied what people expect from a robot-driven car. At first glance, everyone agreed that cars should make decisions serving the greater good. When it came to putting that into practice, though, things got more complicated.

People chose survival over sacrifice when asked to weigh saving their own lives against saving pedestrians’ lives.

Science fiction has long debated the dilemma of robot morality. Yet even with dozens of books and movies covering the subject, scientists have found it hard to carry the lessons and debates from sci-fi into the real world.

As driverless cars get closer to reality, the philosophical question is starting to look more like a business problem and less like science fiction. Should manufacturers defer to consumers and sell cars programmed with different moral views, or should governments require that the greater good be valued above the passengers’ lives?

At the core of the discussion is a famous thought experiment introduced in 1967 by the British philosopher Philippa Foot. Imagine a runaway trolley rolling towards five workers on the tracks. Their lives could be saved by switching the trolley onto another track, but, alas, there is one worker on that track too. What is the correct action to take?
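As a purely illustrative sketch, the "greater good" answer to that question can be reduced to simple arithmetic: choose whichever option costs fewer lives. The short Python example below is hypothetical (the function name and numbers are not taken from the research) and only shows how blunt that utilitarian rule looks once it is written down:

    # Purely illustrative: a utilitarian decision rule for the trolley example.
    # The function name and the numbers below are hypothetical.
    def choose_track(lives_on_current_track, lives_on_other_track):
        """Pick the option that loses fewer lives."""
        if lives_on_other_track < lives_on_current_track:
            return "switch"  # diverting the trolley costs fewer lives
        return "stay"        # switching would not reduce the harm

    # The classic setup: five workers ahead, one worker on the side track.
    print(choose_track(lives_on_current_track=5, lives_on_other_track=1))  # prints "switch"

The controversy the researchers describe is that many people endorse this arithmetic in the abstract, yet reject it the moment they are the ones inside the car.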

The US Army is also joining the discussion, as war drone technology makes it possible for drones to make their own life-and-death decisions.

One philosopher thinks he has the answer: we shouldn’t let robots make these decisions by themselves. Instead, we should work in partnership with them, so that humans keep the last word.

What is your opinion: do driverless cars pose a moral dilemma?

Image Source – Wikipedia
