Choosing Who the Autonomous Car Should Kill
Politics / US Autos | Dec 08, 2018 - 04:59 AM GMT | By: Rodney Johnson
I don’t want to be President of the United States.
I wouldn’t want to subject my family, or myself for that matter, to the intense scrutiny required of elections.
I wouldn’t want to deal with what looks like political egomania on Capitol Hill.
But, most of all, I wouldn’t want the awful responsibility of sending our sons and daughters to their deaths in war.
Such a sacrifice is often required to keep our political balance, but is it worth it?
But what if that responsibility didn’t rest with the president?
What if choosing life and death with the same certainty – you know the loss of life will happen, just not when and to whom – fell to ordinary people?
That’s what’s coming.
I’m thinking about autonomous cars when I say that.
As we move closer to seeing them on the street, we’ll have choices to make, and some of them will be about who lives and who dies.
The New Trolley Problem
For decades, at least, philosophers have struggled with the “trolley problem.”
You’re a trolley conductor and there’s a split in the tracks ahead. A bad guy has tied three people to the main track and five people to the bypass.
The brakes on the trolley fail.
Which direction do you take?
You must choose one, since making no choice still results in death.
We wrestled with this question in one of my college philosophy classes. The argument starts with the premise that life is priceless. That sounds good, but it’s not true.
A life has a value, which is another life.
Once we deal in multiples, we make choices that seem obvious. Kill fewer people, right? That makes sense. But the choices don’t stop there.
Software engineers working on the commands that control autonomous cars must work through similar questions.
If an obstacle blocks the path of the vehicle, should the vehicle slam into the obstacle and kill the occupants?
Should it hit oncoming traffic, potentially killing occupants in more than one vehicle, or launch onto the sidewalk and hit pedestrians?
Does the composition of car passengers or the types of pedestrians make a difference?
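One way to picture the kind of logic at stake, purely as an illustration and not how any real vehicle planner works, is a naive “minimize expected harm” rule. The maneuver names and death estimates below are hypothetical:

```python
# Illustrative sketch only: a naive "kill fewer people" chooser.
# Real autonomous-vehicle software does not work this way; the
# maneuver names and harm estimates here are hypothetical.

def choose_maneuver(options):
    """Pick the maneuver with the lowest estimated number of deaths.

    options: dict mapping maneuver name -> estimated deaths.
    """
    return min(options, key=options.get)

scenario = {
    "brake_into_barrier": 2,    # kills the two occupants
    "swerve_into_traffic": 3,   # may kill occupants of several cars
    "swerve_onto_sidewalk": 1,  # kills one pedestrian
}

print(choose_maneuver(scenario))  # -> swerve_onto_sidewalk
```

Even this toy version shows the trap: someone had to decide, in advance, that the numbers alone settle the question.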
And Who Gets To Decide?
Do we go with the recommendations of  philosophy professors in ivory towers?
  A group at MIT had a different idea.
Instead of working through the moral implications and arriving at a conclusion, they created a survey and posted it online. It’s called the Moral Machine. You can take the survey at www.moralmachine.mit.edu.
Participants flip through 13 or 14 scenarios that show different people in the path and alternate path of a vehicle, as well as different occupants in the vehicle. You must choose whom the car hits or, in some instances, whether the car rams into a stationary barrier and kills the occupants.
I have no idea how many responses the survey creators expected, but the thing went viral. Participants around the world have contributed more than 40 million decisions.
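To see how millions of binary choices can become a ranking, here is a rough sketch that tallies a “spare rate” for each character type. This is only an illustration with made-up votes; the actual Moral Machine study uses a far more sophisticated statistical model:

```python
# Hypothetical sketch: turning binary "spare A, sacrifice B" survey
# choices into a ranking by spare rate. The votes below are invented;
# the real study's analysis is much more elaborate.
from collections import Counter

def rank_by_spare_rate(choices):
    """choices: list of (spared, sacrificed) character pairs."""
    spared = Counter(s for s, _ in choices)
    faced = Counter()  # times each character appeared in any scenario
    for s, k in choices:
        faced[s] += 1
        faced[k] += 1
    # Highest spare rate first
    return sorted(faced, key=lambda c: spared[c] / faced[c], reverse=True)

votes = [
    ("stroller", "man"), ("dog", "thief"),
    ("stroller", "cat"), ("thief", "cat"),
]
print(rank_by_spare_rate(votes))  # stroller first, cat last
```

With enough votes, a tally like this produces exactly the kind of ordering the survey reported: some characters reliably spared, others reliably sacrificed.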
  The results are… interesting.
Who Would We Kill?
Some of the outcomes make sense.
Compared with killing the average male, participants widely favored sparing a woman with a stroller. Not that we could tell in a split second, but participants also chose to kill a thief more often than any other person, and chose to spare a male doctor slightly more often than a female doctor. That’s odd, because we chose to spare girls more than boys, and spared pregnant women more than any other individual adult.
Confounding philosophy professors, the survey showed that people favor saving dogs over thieves. But in a blow to animal lovers in general, cats were at the bottom of the list, sacrificed to save, well, everybody and everything else.
Some of the survey is entertaining, in a morbid sort of way. But that’s because we’re looking at little icons on a screen, and we’ve been conditioned by modern life to treat figures on a screen as nothing more than pixels.
What happens when, not if, this becomes real? When an autonomous car “chooses” to hit a pedestrian instead of harming the occupants of the vehicle or entering oncoming traffic?
It might be the correct thing to do in light of the choices, but will the family of the pedestrian take comfort in the logic? More importantly for the future of autonomous driving, will they agree not to sue the software and car manufacturers for making a pre-determined choice?
Accidents happen every day, with people causing mortal harm to others. But those drivers didn’t make the choice ahead of time, which is key.
As we get closer to driverless cars becoming a reality, the trolley problem will become a bigger issue. As with being president, I don’t want the job of writing that software, or running those companies.
Which raises the question: Who would you kill?
  Rodney 
Follow me on Twitter @RJHSDent
By Rodney Johnson, Senior Editor of Economy & Markets
Copyright © 2018 Rodney Johnson - All Rights Reserved
Disclaimer: The above is a matter of opinion provided for general information purposes only and is not intended as investment advice. Information and analysis above are derived from sources and utilising methods believed to be reliable, but we cannot accept responsibility for any losses you may incur as a result of this analysis. Individuals should consult with their personal financial advisors.
© 2005-2022 http://www.MarketOracle.co.uk - The Market Oracle is a FREE Daily Financial Markets Analysis & Forecasting online publication.