Happy Atheist Forum

General => Philosophy => Topic started by: Arturo on November 30, 2016, 04:01:15 AM

Title: Moral Machine
Post by: Arturo on November 30, 2016, 04:01:15 AM
Ever wanted to face horrible moral choices about whose lives you get to save? Try it today and post your results!

http://moralmachine.mit.edu/

My Results
http://moralmachine.mit.edu/hl/en/results/361110971
Title: Re: Moral Machine
Post by: Sandra Craft on November 30, 2016, 05:10:28 AM
I got mighty sick of that self-driving car.  I couldn't see how to share my results; the only link I saw was the one to MIT.  However, my preferences in who to save went animal, young, female.  I didn't even notice factors such as obeying the law or intervention.
Title: Re: Moral Machine
Post by: xSilverPhinx on November 30, 2016, 11:29:40 AM
:chin: Interesting.

Here are my results.

http://moralmachine.mit.edu/results/474886023
Title: Re: Moral Machine
Post by: joeactor on November 30, 2016, 02:21:19 PM
They left out the most important question: Am I in the car?

Here are my results:
http://moralmachine.mit.edu/results/-1410200376
Title: Re: Moral Machine
Post by: Asmodean on November 30, 2016, 03:08:08 PM
This one was easy for me. The car should always be on the side of protecting its occupants. Upholding the law comes in as a secondary consideration. Human lives over other animals as tertiary.

Basically, you hold your lane unless doing so would kill the occupants of the car. You change lanes, if it is safe, in order to collide with animals rather than humans. As long as a collision with humans is unavoidable, hold the lane.
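
Spelled out as a rule, it would look something like this (a toy Python sketch with made-up names, not any real car's logic):

def choose_lane(hold, swerve):
    # hold / swerve describe the predicted outcome of each choice, e.g.
    # {"kills_occupants": False, "hits_humans": True, "safe_for_occupants": True}
    if hold["kills_occupants"] and not swerve["kills_occupants"]:
        return "swerve"   # occupants come first
    if (swerve["safe_for_occupants"]
            and hold["hits_humans"] and not swerve["hits_humans"]):
        return "swerve"   # trade humans for animals or an empty lane
    return "hold"         # humans unavoidable either way: hold the lane

# e.g. brake failure, pedestrians ahead, clear lane to the left -> "swerve"
# choose_lane({"kills_occupants": False, "hits_humans": True,  "safe_for_occupants": True},
#             {"kills_occupants": False, "hits_humans": False, "safe_for_occupants": True})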

Unless the robotic car is always on the side of its occupants, I, for one, don't want one. My life is all well and good, but as a driver, my primary responsibility is to my passengers, not to those outside the boundaries of my vehicle. This is not something I am willing to budge on easily.

EDIT: Also, I see the results vary depending on the circumstances one is presented with... At least for me, they do. The first two or three stay put, but the rest are pretty much down to the composition of the crowd I plough into by following the above rules.

http://moralmachine.mit.edu/results/1217361777
Title: Re: Moral Machine
Post by: Davin on November 30, 2016, 03:34:58 PM
I can never wrap my head all the way around these kinds of false dichotomies.

In the case where the machine can choose between stopping itself or continuing on, the obvious choice should be to stop, even at the risk of the passengers. Even if, say, in the first case there are two passengers and one pedestrian, what happens if it keeps going after running over the pedestrian? How many more of these choices will it face if it fails to stop? Also, with all the new safety features in cars, passengers in a car that hits a wall are always more likely to come out alive than a pedestrian hit by a car.

Another problem with these nice, clean scenarios is that they rarely occur in the real world. Deaths are never so certain either, and as previously stated, passengers are more likely to survive an accident than a pedestrian. There are also more options for decreasing speed much more safely.

If you can get the machine to decide, it should choose the scenario where no one dies, even if it's the least likely outcome. I think that car damage is preferable to lives lost, even non-human animal lives.
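
Roughly what I mean, as a sketch (made-up names and probabilities, nothing more):

def pick_option(options):
    # Each option is a dict like {"p_no_deaths": 0.1, "expected_human_deaths": 0.9,
    #                             "expected_animal_deaths": 0.0, "car_damage": 1.0}
    return min(options, key=lambda o: (
        o["p_no_deaths"] <= 0.0,        # any chance nobody dies beats no chance at all
        o["expected_human_deaths"],     # then the fewest expected deaths, humans first
        o["expected_animal_deaths"],    # then other animals
        o["car_damage"],                # wrecking the car is the cheapest loss of all
    ))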

I just can't commit to either limited answer when I can see alternative options, especially if those options exist in the real world.
Title: Re: Moral Machine
Post by: Asmodean on November 30, 2016, 04:13:17 PM
That was sort of my position as well, but then... today's machines can only predict and adapt so far. You can't give a computer a set of fuzzy rules and expect it to comply as intended.

A car with a catastrophic brake failure, for example, at a speed that puts its occupants at risk of serious injury or death in the event of violent deceleration, should disengage its engine, then stay loyal to its occupants. Why make it more difficult?
Title: Re: Moral Machine
Post by: Davin on November 30, 2016, 04:49:48 PM
Also, do they expect a machine to be able to tell gender, occupation, age, fitness, and other things in time? What if the machine's connection to the FBI and NSA privacy-invading databases is slow or borked?

But my thought, to keep it simple, is for the machine to try to keep the passengers safe while avoiding living things.
Title: Re: Moral Machine
Post by: Asmodean on November 30, 2016, 05:24:39 PM
Pretty much. Soft non-living targets before animals before humans before solid barriers before other cars is my thinking. Doing this check like 20 times per second should produce the best outcome for the occupants. Speed is a factor, of course. Doing 30km/h with one healthy, secured occupant inside? The wall it is then. Otherwise, I don't see why *who* you hit should even matter, you being a robotic car in this case.
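
If you wanted to write that ordering down, it might look like this (hypothetical names, and the 30 km/h cutoff is just the number from above):

PREFERENCE = ["soft_object", "animal", "human", "barrier", "other_car"]

def pick_target(candidates, speed_kmh, occupants_secured):
    order = list(PREFERENCE)
    # At low speed with secured occupants, the wall becomes acceptable --
    # move it ahead of anything living.
    if speed_kmh <= 30 and occupants_secured:
        order = ["soft_object", "barrier", "animal", "human", "other_car"]
    # Hit the least-bad thing actually available; re-run this ~20 times a second.
    for kind in order:
        if kind in candidates:
            return kind
    return None  # clear road, nothing to hit

# e.g. pick_target({"animal", "human"}, 80, True) -> "animal"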
Title: Re: Moral Machine
Post by: Sandra Craft on December 01, 2016, 01:30:20 AM
Quote from: Asmodean on November 30, 2016, 03:08:08 PM
This one was easy for me. The car should always be on the side of protecting its occupants.

Interesting.  I went in the opposite direction -- I sacrificed the occupants of the car first.  I figured they chose to get into a car they couldn't control and if it went nuts that was on them.
Title: Re: Moral Machine
Post by: Pasta Chick on December 01, 2016, 03:36:27 AM
The framing of it being a self-driving car makes it rather odd. I think it's supposed to be more like the trolley problem, where you, the one taking the quiz, are in control; they're just looking for an excuse for why the car can't stop.

There will never at any point be a self-driving car that can assess things like who is homeless or who just robbed a bank - for that matter, we really can't assess those things on sight either. So I didn't factor them in at all, yet somehow saved more homeless dudes than anyone else.
Title: Re: Moral Machine
Post by: Arturo on December 01, 2016, 04:03:20 AM
Quote from: Pasta Chick on December 01, 2016, 03:36:27 AM
yet somehow saved more homeless dudes than anyone else.

:rofl:
Title: Re: Moral Machine
Post by: Pasta Chick on December 01, 2016, 04:15:53 AM
I think it's a weird thing to factor in anyway. Who is actually out there like "yeah fuck the homeless I'll just run that fucker over"?

The bank robbers I sort of get.
Title: Re: Moral Machine
Post by: Asmodean on December 01, 2016, 04:17:03 AM
Quote from: BooksCatsEtc on December 01, 2016, 01:30:20 AM
Quote from: Asmodean on November 30, 2016, 03:08:08 PM
This one was easy for me. The car should always be on the side of protecting its occupants.

Interesting.  I went in the opposite direction -- I sacrificed the occupants of the car first.  I figured they chose to get into a car they couldn't control and if it went nuts that was on them.
Well, I tried to think like I'd want MY robotic car to think. Simple thoughts, those. A tad sycophantic and maybe with a dash of sociopathy, but... That's what you do if you are MY car, even if in some situations it means that you have to run over me.
Title: Re: Moral Machine
Post by: Arturo on December 01, 2016, 04:20:16 AM
Quote from: Pasta Chick on December 01, 2016, 04:15:53 AM
I think it's a weird thing to factor in anyway. Who is actually out there like "yeah fuck the homeless I'll just run that fucker over"?

The bank robbers I sort of get.

I think it adapts to what it perceives as your "rules" to make the decisions harder.