Moral Machine

Started by Arturo, November 30, 2016, 04:01:15 AM

Arturo

Ever wanted to make horrible moral choices about whose lives you get to save? Try it today and post your results!

http://moralmachine.mit.edu/

My Results
http://moralmachine.mit.edu/hl/en/results/361110971
It's Okay To Say You're Welcome
     Just let people be themselves.
     Arturo The1  リ壱

Sandra Craft

I got mighty sick of that self-driving car.  I couldn't see how to share my results; the only link I saw was the one to MIT.  However, my preferences in who to save went animal, young, female.  I didn't even notice factors such as obeying the law or intervention.
Sandy

  

"Life is short, and it is up to you to make it sweet."  Sarah Louise Delany

xSilverPhinx

I am what survives if it's slain - Zack Hemsey


joeactor

They left out the most important question: Am I in the car?

Here are my results:
http://moralmachine.mit.edu/results/-1410200376

Asmodean

This one was easy for me. The car should always be on the side of protecting its occupants. Upholding the law comes in as a secondary consideration. Human lives over other animals as tertiary.

Basically, you hold your lane unless doing so kills the occupants of the car. You change lanes, if it is safe to, in order to collide with animals rather than humans. As long as a collision with humans is unavoidable, hold the lane.
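In rough Python pseudocode, that rule might look something like this (a hypothetical sketch; the Outcome type and its flags are made up, not anything a real car runs):

from dataclasses import dataclass

@dataclass
class Outcome:
    kills_occupants: bool  # would this option likely kill the people in the car?
    hits_humans: bool      # does it end in a collision with humans?

def choose(hold: Outcome, swerve: Outcome) -> str:
    # 1. Occupants first: leave the lane only if staying would kill them.
    if hold.kills_occupants and not swerve.kills_occupants:
        return "swerve"
    # 2. Trade a human collision for a non-human one, but only if the
    #    swerve itself is safe for the occupants.
    if hold.hits_humans and not swerve.hits_humans and not swerve.kills_occupants:
        return "swerve"
    # 3. If hitting humans is unavoidable either way, hold the lane.
    return "hold"

# Pedestrians ahead, a couple of stray dogs in the other lane:
print(choose(Outcome(False, True), Outcome(False, False)))  # -> "swerve"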

Unless the robotic car is always on the side of its occupants, I, for one, don't want one. My life is all well and good, but as a driver, my primary responsibility is to my passengers, not to those outside the boundaries of my vehicle. This is not something I am willing to easily budge on.

EDIT: Also, I see the results vary depending on the circumstances one is presented with... At least for me, they do. The first two or three stay put, but the rest are pretty much down to the composition of the crowd I plough into by following the above rules.

http://moralmachine.mit.edu/results/1217361777
Quote from: Ecurb Noselrub on July 25, 2013, 08:18:52 PM
In Asmo's grey lump,
wrath and dark clouds gather force.
Luxembourg trembles.

Davin

I can never wrap my head all the way around these kinds of false dichotomies.

In the case where the machine can choose between stopping itself or continuing on, the obvious choice is to stop itself, even at the risk of the passengers. Even if, say, in the first case there are two passengers and one pedestrian, what happens if it keeps going after running over the pedestrian? How many more of these choices will it face if it can't stop? Also, with all the new safety features in cars, passengers in a car that hits a wall are always more likely to come out alive than a pedestrian hit by a car.

Another problem with these nice and clean scenarios is that they rarely, if ever, occur in the real world. Deaths are never so certain, and as previously stated, passengers are more likely to survive an accident than a pedestrian. There are also more options for a much safer decrease in speed.

If you can get the machine to decide, it should choose the scenario where no one dies even if it's the least likely outcome.  I think that car damage is preferable to lives lost, even non-human animal life.

I just can't commit to either limited answer where I can see alternate options, especially if those options exist in the real world.
Always question all authorities because the authority you don't question is the most dangerous... except me, never question me.

Asmodean

That was sort of my position as well, but then... Today's machines can only predict and adapt so far. You can't give a computer a set of fuzzy rules and expect it to comply as intended.

A car with a catastrophic brake failure, for example, at a speed that puts its occupants at risk of serious injury or death in the event of violent deceleration, should disengage its engine, then stay loyal to its occupants. Why make it more difficult?
Quote from: Ecurb Noselrub on July 25, 2013, 08:18:52 PM
In Asmo's grey lump,
wrath and dark clouds gather force.
Luxembourg trembles.

Davin

Also, do they expect a machine to be able to tell gender, occupation, age, fitness, and other such things in time? What if the machine's feed to the FBI and NSA privacy-invading databases is slow or borked?

But my thought, to keep it simple, is for the machine to try to keep the passengers safe while avoiding living things.
Always question all authorities because the authority you don't question is the most dangerous... except me, never question me.

Asmodean

Pretty much. Soft non-living targets before animals before humans before solid barriers before other cars is my thinking. Doing this check like 20 times per second should produce the best outcome for the occupants. Speed is a factor, of course. Doing 30km/h with one healthy, secured occupant inside? The wall it is then. Otherwise, I don't see why *who* you hit should even matter, you being a robotic car in this case.
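Something like this, re-run every control cycle (a hypothetical sketch; the category labels and the 30 km/h cutoff for the secured-occupant case are made up for illustration):

# Preference order of what to hit, best option first.
TARGET_ORDER = ["soft_nonliving", "animal", "human", "solid_barrier", "other_car"]

def pick_target(reachable, speed_kmh, occupants_secured):
    """reachable: the categories the car could steer into this cycle (~20 Hz)."""
    order = list(TARGET_ORDER)
    # At low speed with secured occupants, the wall is survivable for them,
    # so it jumps ahead of living targets ("the wall it is then").
    if speed_kmh <= 30 and occupants_secured:
        order.remove("solid_barrier")
        order.insert(1, "solid_barrier")
    return min(reachable, key=order.index)

# Doing 30 km/h with one healthy, secured occupant: the wall beats the crowd.
print(pick_target(["human", "solid_barrier"], speed_kmh=30, occupants_secured=True))
# -> "solid_barrier"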
Quote from: Ecurb Noselrub on July 25, 2013, 08:18:52 PM
In Asmo's grey lump,
wrath and dark clouds gather force.
Luxembourg trembles.

Sandra Craft

Quote from: Asmodean on November 30, 2016, 03:08:08 PM
This one was easy for me. The car should always be on the side of protecting its occupants.

Interesting.  I went in the opposite direction -- I sacrificed the occupants of the car first.  I figured they chose to get into a car they couldn't control, and if it went nuts, that was on them.
Sandy

  

"Life is short, and it is up to you to make it sweet."  Sarah Louise Delany

Pasta Chick

The framing of it being a self-driving car makes it rather odd. I think it's supposed to be more like the trolley problem, where you, the one taking the quiz, are in control; they're just looking for an excuse for why the car can't stop.

There will never at any point be a self-driving car that can assess things like who is homeless or who just robbed a bank - for that matter, we really can't assess those things on sight either. So I didn't factor them in at all, yet somehow saved more homeless dudes than anyone else.

Arturo

It's Okay To Say You're Welcome
     Just let people be themselves.
     Arturo The1  リ壱

Pasta Chick

I think it's a weird thing to factor in anyway. Who is actually out there like "yeah fuck the homeless I'll just run that fucker over"?

The bank robbers I sort of get.

Asmodean

Quote from: BooksCatsEtc on December 01, 2016, 01:30:20 AM
Quote from: Asmodean on November 30, 2016, 03:08:08 PM
This one was easy for me. The car should always be on the side of protecting its occupants.

Interesting.  I went in the opposite direction -- I sacrificed the occupants of the car first.  I figured they chose to get into a car they couldn't control and if it went nuts that was on them.
Well, I tried to think like I'd want MY robotic car to think. Simple thoughts, those. A tad sycophantic and maybe with a dash of sociopathy, but... That's what you do if you are MY car, even if in some situations it means that you have to run over me.
Quote from: Ecurb Noselrub on July 25, 2013, 08:18:52 PM
In Asmo's grey lump,
wrath and dark clouds gather force.
Luxembourg trembles.

Arturo

Quote from: Pasta Chick on December 01, 2016, 04:15:53 AM
I think it's a weird thing to factor in anyway. Who is actually out there like "yeah fuck the homeless I'll just run that fucker over"?

The bank robbers I sort of get.

I think it adapts to what it perceives as your "rules" to make the decisions harder.
It's Okay To Say You're Welcome
     Just let people be themselves.
     Arturo The1  リ壱