utilitarianism

Started by billy rubin, April 23, 2020, 08:48:08 PM


billy rubin

Quote from: xSilverPhinx on April 28, 2020, 06:04:56 PM
These moral questions result in moral judgments, and it makes more sense to think that moral judgments come from the brain and not some objective framework created by a divine external force. Therefore, you can't divorce these moral questions from the psychological and/or social contexts in which they find themselves. And people are more emotionally than rationally driven in their decisions.

For instance, you could resort to a more or less utilitarian solution to a moral problem depending on how emotionally invested you are in the outcome. To make my point clear, if you found yourself faced with the Trolley Problem and you had to decide which person or people have to die, your answer could vary if a loved one was in either group. Because emotions are not rational, if a loved one was on one track by him or herself, the odds are way greater that you would sacrifice 5 strangers on the other track in order to save him or her.

uh oh

trolley problem.

that asks the utilitarianism question in its most basic terms.

did you ever hear the version with the drawbridge keeper who had to decide whether to crush his little boy who had fallen into the gear train in order to lower the bridge for the runaway passenger train?

but the trolley problem is usually phrased in order to catch you on the horns of the dilemma: kill this set of people, or that set of people . . . decide.

i used to believe that there was another solution to the trolley problem, which was to refuse to participate, and not to touch the switch lever. i was a theist back then and my reasoning was that there was going to be death either way, and the most important decision was whether or not i was willing to be part of a dilemma that required me to kill. the justification for this line of thinking is that allowing death by inaction does not carry the same responsibility as causing death by action, even when the number who die is greater.

an extreme version is the dilemma as to whether to torture a terrorist in order to determine the deactivation code for his bomb that will kill many people. is torture acceptable in this instance? if it is, when is it ever wrong?

i don't think that way anymore, but neither do i have a general solution to the problem. i don't think there is a general solution.


"I cannot understand the popularity of that kind of music, which is based on repetition. In a civilized society, things don't need to be said more than three times."

xSilverPhinx

Quote from: billy rubin on April 28, 2020, 07:02:06 PM
Quote from: xSilverPhinx on April 28, 2020, 06:04:56 PM
These moral questions result in moral judgments, and it makes more sense to think that moral judgments come from the brain and not some objective framework created by a divine external force. Therefore, you can't divorce these moral questions from the psychological and/or social contexts in which they find themselves. And people are more emotionally than rationally driven in their decisions.

For instance, you could resort to a more or less utilitarian solution to a moral problem depending on how emotionally invested you are in the outcome. To make my point clear, if you found yourself faced with the Trolley Problem and you had to decide which person or people have to die, your answer could vary if a loved one was in either group. Because emotions are not rational, if a loved one was on one track by him or herself, the odds are way greater that you would sacrifice 5 strangers on the other track in order to save him or her.

uh oh

trolley problem.

that asks the utilitarianism question in its most basic terms.

did you ever hear the version with the drawbridge keeper who had to decide whether to crush his little boy who had fallen into the gear train in order to lower the bridge for the runaway passenger train?

but the trolley problem is usually phrased in order to catch you on the horns of the dilemma: kill this set of people, or that set of people . . . decide.

i used to believe that there was another solution to the trolley problem, which was to refuse to participate, and not to touch the switch lever. i was a theist back then and my reasoning was that there was going to be death either way, and the most important decision was whether or not i was willing to be part of a dilemma that required me to kill. the justification for this line of thinking is that allowing death by inaction does not carry the same responsibility as causing death by action, even when the number who die is greater.

an extreme version is the dilemma as to whether to torture a terrorist in order to determine the deactivation code for his bomb that will kill many people. is torture acceptable in this instance? if it is, when is it ever wrong?

i don't think that way anymore, but neither do i have a general solution to the problem. i don't think there is a general solution.

I think the really interesting thing we learn from the Trolley Problem is that there is no universal answer, though most people when asked will sway toward one outcome or another depending on how you ask the question and frame the problem. And that's where psychological differences or contexts come in.

For instance, most people will say they would pull the lever that switches tracks and results in the train killing 1 instead of 5 strangers. That is possibly the most rational utilitarian approach. But if you ask them if they would do the same if that 1 person was a loved one, most say they would opt to save their loved one and murder the 5 strangers. No longer rational.

See how I used the word 'murder' there? Some might think it's a bit of a strong choice of word. The way the above Trolley Problem is framed, pulling a lever gives the person making the choice some psychological distance. They're just making a choice and pulling a lever, not actively killing the 1 person or 5 people. If you add the second part of the Trolley Problem, this becomes clear.

For those who are not familiar, the second part goes like this:

Say you are on a bridge standing behind a really heavy person and you see a train approaching. On the tracks below just a little further ahead there are 5 people tied up who will die if the train is not stopped. The extremely heavyset man is just above the tracks, and if you push him, he will fall and stop the train, saving the other 5. All people involved are of the same 'emotional value' (strangers) to you.     

Supposing you only had these two choices, do you push him or not?   

Most people will say they would not push him. But this is odd, you may think. The outcome is exactly the same as the first part of the Trolley problem! Save 5 strangers by murdering 1 stranger. So, what's going on here?

Turns out the psychological distancing from the action of killing a person is no longer there. From the moment you push someone, it becomes murder. You are at the center of that action and are forced to take ownership of the choice and act on that choice, besides being keenly aware of it.
I am what survives if it's slain - Zack Hemsey


xSilverPhinx

Sorry for rambling there without really answering your questions. :blahblah:

:grin:

I just love catching tiny glimpses of the gears working inside the complex mind.  :P The Trolley Problem was a pretty well designed thought experiment, IMO.
I am what survives if it's slain - Zack Hemsey


Davin

Quote from: xSilverPhinx on April 28, 2020, 08:57:38 PM
Sorry for rambling there without really answering your questions. :blahblah:

:grin:

I just love catching tiny glimpses of the gears working inside the complex mind.  :P The Trolley Problem was a pretty well designed thought experiment, IMO.
I have issues with the trolley problem. I guess more about how it's used. I feel like people use such an extreme (and highly unlikely) event and then try to pull the reasons behind the judgment and use them in less extreme situations that are more likely to happen. It's like the opposite of everything looking like a hammer. As if someone asks you how you would solve:
2x² + 5x + 3 = 0
and once you do, asks you to use that same method on:
2 + 5 = ?

It's a bit crazy to me.
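
Worked through, just to show how lopsided the two problems are (a quick Python scribble of my own, nothing to do with the trolley literature):

import math

# the first one needs the whole quadratic-formula machinery (toy illustration only)
a, b, c = 2, 5, 3
disc = b * b - 4 * a * c
roots = ((-b + math.sqrt(disc)) / (2 * a),
         (-b - math.sqrt(disc)) / (2 * a))
print(roots)   # (-1.0, -1.5)

# the "less extreme" one needs none of it
print(2 + 5)   # 7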
Always question all authorities because the authority you don't question is the most dangerous... except me, never question me.

billy rubin

Quote from: xSilverPhinx on April 28, 2020, 08:44:39 PM
For instance, you could resort to a more or less utilitarian solution to a moral problem depending on how emotionally invested you are in the outcome. To make my point clear, if you found yourself faced with the Trolley Problem and you had to decide which person or people have to die, your answer could vary if a loved one was in either group. Because emotions are not rational, if a loved one was on one track by him or herself, the odds are way greater that you would sacrifice 5 strangers on the other track in order to save him or her.

i suggest that the choice is entirely rational, because loved ones are likely kin, and by saving known kin you are saving a portion of your genotype. this is orthodox sociobiology, using maynard smith's model of inclusive fitness.

assuming a nearby stranger might share perhaps 1/128 of your chromosomes, that's the same as you share with a pretty distant cousin some seven divisions removed. but your child shares 1/2 of your genes. so your child contains 64 times as much of you as the stranger does. you would have to save 64 strangers to do your genotype as much good as saving your child.

if the unit of selection is the gene, and if there is a gene for altruism, it will impose a selection force 64 times as strong regarding your child as it does regarding a stranger.

so you save the loved one, no emotions required. cold hard natural selection will cause you to evolve a tendency to favor kin over strangers. obviously this has implications for motherly love, parental investment and care for offspring, and so on. it's why the stepchild is neglected compared to the natural siblings.

this has been documented pretty clearly in a number of social animals. i'm thinking of alarm calls in belding's ground squirrels, for example. they warn siblings of danger, but let cousins get eaten, when sounding alarm calls might endanger themselves.

https://science.sciencemag.org/content/197/4310/1246
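
here's that arithmetic as a quick toy python sketch, nothing more than the guesses above (the 1/128 for a nearby stranger is an assumption, not a measured number):

# coefficient of relatedness by descent: each meiotic link halves the shared fraction
# (toy figures only -- the 'stranger at seven links' number is a guess, not data)
def relatedness(links):
    return 0.5 ** links

child = relatedness(1)      # 1/2
stranger = relatedness(7)   # 1/128, the distant-cousin guess
print(child / stranger)     # 64.0 -- one child 'counts' as 64 such strangers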

Quote

I think the really interesting thing we learn from the Trolley Problem is that there is no universal answer, though most people when asked will sway toward one outcome or another depending on how you ask the question and frame the problem. And that's where psychological differences or contexts come in.

For instance, most people will say they would pull the lever that switches tracks and results in the train killing 1 instead of 5 strangers. That is possibly the most rational utilitarian approach. But if you ask them if they would do the same if that 1 person was a loved one, most say they would opt to save their loved one and murder the 5 strangers. No longer rational.

See how I used the word 'murder' there? Some might think it's a bit of a strong choice of word. The way the above Trolley Problem is framed, pulling a lever gives the person making the choice some psychological distance. They're just making a choice and pulling a lever, not actively killing the 1 person or 5 people. If you add the second part of the Trolley Problem, this becomes clear.

For those who are not familiar, the second part goes like this:

Say you are on a bridge standing behind a really heavy person and you see a train approaching. On the tracks below just a little further ahead there are 5 people tied up who will die if the train is not stopped. The extremely heavyset man is just above the tracks, and if you push him, he will fall and stop the train, saving the other 5. All people involved are of the same 'emotional value' (strangers) to you.     

Supposing you only had these two choices, do you push him or not?   

Most people will say they would not push him. But this is odd, you may think. The outcome is exactly the same as the first part of the Trolley problem! Save 5 strangers by murdering 1 stranger. So, what's going on here?

Turns out the psychological distancing from the action of killing a person is no longer there. From the moment you push someone, it becomes murder. You are at the center of that action and are forced to take ownership of the choice and act on that choice, besides being keenly aware of it.


^^^this was my dilemma regarding being the agent in a trolley scenario, versus standing by and watching.


"I cannot understand the popularity of that kind of music, which is based on repetition. In a civilized society, things don't need to be said more than three times."

xSilverPhinx

Quote from: Davin on April 28, 2020, 09:16:01 PM
Quote from: xSilverPhinx on April 28, 2020, 08:57:38 PM
Sorry for rambling there without really answering your questions. :blahblah:

:grin:

I just love catching tiny glimpses of the gears working inside the complex mind.  :P The Trolley Problem was a pretty well designed thought experiment, IMO.
I have issues with the trolley problem. I guess more about how it's used. I feel like people use such an extreme (and highly unlikely) event and then try to pull the reasons behind the judgment and use them in less extreme situations that are more likely to happen. It's like the opposite of everything looking like a hammer. As if someone asks you how you would solve:
2x² + 5x + 3 = 0
and once you do, asks you to use that same method on:
2 + 5 = ?

It's a bit crazy to me.

I think those people are misusing the Trolley Problem when they generalise it in that way. I don't think the point of it all is to predict or model what people will do in such and such circumstances, but rather it's a psychological tool to shed light on moral decision-making and judgements.

I think the reason it has to be so extreme in its examples is that there are way more variables in the moral grey areas, so it makes sense not to want to tread in those areas. But of course, most moral decisions are made taking into account a huge number of factors, and people will decide things differently based on their biology, experiences, beliefs, etc. 
I am what survives if it's slain - Zack Hemsey


xSilverPhinx

Quote from: billy rubin on April 28, 2020, 09:22:40 PM
Quote from: xSilverPhinx on April 28, 2020, 08:44:39 PM
For instance, you could resort to a more or less utilitarian solution to a moral problem depending on how emotionally invested you are in the outcome. To make my point clear, if you found yourself faced with the Trolley Problem and you had to decide which person or people have to die, your answer could vary if a loved one was in either group. Because emotions are not rational, if a loved one was on one track by him or herself, the odds are way greater that you would sacrifice 5 strangers on the other track in order to save him or her.

i suggest that the choice is entirely rational, because loved ones are likely kin, and by saving known kin you are saving a portion of your genotype. this is orthodox sociobiology, using maynard smith's model of inclusive fitness.

assuming a nearby stranger might share perhaps 1/128 of your chromosomes, that's the same as you share with a pretty distant cousin some seven divisions removed. but your child shares 1/2 of your genes. so your child contains 64 times as much of you as the stranger does. you would have to save 64 strangers to do your genotype as much good as saving your child.

if the unit of selection is the gene, and if there is a gene for altruism, it will have a selection force 64 times as strong in your child as it does in a stranger.

so you save the loved one, no emotions required. cold hard natural selection will cause you to evolve a tendency to favor kin over strangers.

Emotions have a genetic basis and evolved as well. ;) And if you research a bit on the neuroscience of decision-making you will see just how important emotions are in everyday decisions. I'd bet they take part in ALL decisions. Marketing has had this figured out since the time of Edward Bernays (Freud's nephew). There's even a neurological condition (I'm racking my brain trying to remember the name of the disorder, but can't...maybe it'll come to me soon) in which sufferers have a really hard time deciding basic stuff because they have impaired emotion, even though their ability to rationalise and their IQ are normal.

I ask you, if you had to make a choice to save your child or a stranger, would you stop and think, "I need to save my kid because they share 50% of my genes," or would you not think at all and save them because you love them? Act first, think later? 

I suggest you read this short essay on Jonathan Haidt on the rational-emotional mind and moral reasoning, just for fun ;) :

https://cct.biola.edu/riding-moral-elephant-review-jonathan-haidts-righteous-mind/

I'll just add a little snippet here, to whet your appetite.  ;D

Quote
He [Haidt] shows us that morality is neither the result of rational reflection (a learned exercise in determining values like fairness, justice, or prevention of harm) nor merely of innate, inherited assumptions. Haidt gives us the "first rule of moral psychology": "Intuitions come first, strategic reasoning second" (367). Human morality is largely the result of internal predispositions, which Haidt calls "intuitions." These intuitions predict which way we lean on various issues, questions, or decisions. The rational mind—which the Greek philosophers so valorized and even idolized—has far less control over our moral frameworks than we might think. Intuition is much more basic and determinative than reasoning.

If you're really interested in these two systems, the fast, intuitive thinking versus the slower, rational thinking, then I suggest the book Thinking, Fast and Slow by Nobel laureate Daniel Kahneman.

I am what survives if it's slain - Zack Hemsey


Davin

Quote from: xSilverPhinx on April 28, 2020, 08:44:39 PM
Most people will say they would not push him. But this is odd, you may think. The outcome is exactly the same as the first part of the Trolley problem! Save 5 strangers by murdering 1 stranger. So, what's going on here?
That's because the result is only the exact same if you only consider the number of people killed vs. saved. We instinctively understand that things are more complicated than that, even in an oversimplified thought experiment. Not many are able to explain it though. So most will say not much more than something like, "because it's different."
Always question all authorities because the authority you don't question is the most dangerous... except me, never question me.

xSilverPhinx

Quote from: Davin on April 28, 2020, 10:10:44 PM
Quote from: xSilverPhinx on April 28, 2020, 08:44:39 PM
Most people will say they would not push him. But this is odd, you may think. The outcome is exactly the same as the first part of the Trolley problem! Save 5 strangers by murdering 1 stranger. So, what's going on here?
That's because the result is only the exact same if you only consider the number of people killed vs. saved. We instinctively understand that things are more complicated than that, even in an oversimplified thought experiment. Not many are able to explain it though. So most will say not much more than something like, "because it's different."

Yes, the number of people killed vs. saved is the same, but the mental paths people take to reach a course of action are different. And that's the point I think this problem is trying to show.

It takes a reductionist approach in that it removes a lot of the complexity, the extra variables that would make for a "dirty" experiment. This approach has its pros and cons, of course, and I think in the cognitive sciences results such as these can rarely be generalised.   
I am what survives if it's slain - Zack Hemsey


Davin

Quote from: xSilverPhinx on April 28, 2020, 10:49:04 PM
Quote from: Davin on April 28, 2020, 10:10:44 PM
Quote from: xSilverPhinx on April 28, 2020, 08:44:39 PM
Most people will say they would not push him. But this is odd, you may think. The outcome is exactly the same as the first part of the Trolley problem! Save 5 strangers by murdering 1 stranger. So, what's going on here?
That's because the result is only the exact same if you only consider the number of people killed vs. saved. We instinctively understand that things are more complicated than that, even in an oversimplified thought experiment. Not many are able to explain it though. So most will say not much more than something like, "because it's different."

Yes, the number of people killed vs. saved is the same, but the mental paths people take to reach a course of action are different. And that's the point I think this problem is trying to show.

It takes a reductionist approach in that it removes a lot of the complexity, the extra variables that would make for a "dirty" experiment. This approach has its pros and cons, of course, and I think in the cognitive sciences results such as these can rarely be generalised.
I think it's more than the mental paths it takes to get to the conclusion. For instance, one might consider taking someone entirely out of danger and putting them in a position to be harmed (to death) as different (and in most cases worse) from changing the direction of a train from five people already in a position of danger to one person already in a position of danger.
Always question all authorities because the authority you don't question is the most dangerous... except me, never question me.

xSilverPhinx

Quote from: Davin on April 28, 2020, 10:56:32 PM
Quote from: xSilverPhinx on April 28, 2020, 10:49:04 PM
Quote from: Davin on April 28, 2020, 10:10:44 PM
Quote from: xSilverPhinx on April 28, 2020, 08:44:39 PM
Most people will say they would not push him. But this is odd, you may think. The outcome is exactly the same as the first part of the Trolley problem! Save 5 strangers by murdering 1 stranger. So, what's going on here?
That's because the result is only the exact same if you only consider the number of people killed vs. saved. We instinctively understand that things are more complicated than that, even in an oversimplified thought experiment. Not many are able to explain it though. So most will say not much more than something like, "because it's different."

Yes, the number of people killed vs. saved is the same, but the mental paths people take to reach a course of action are different. And that's the point I think this problem is trying to show.

It takes a reductionist approach in that it removes a lot of the complexity, the extra variables that would make for a "dirty" experiment. This approach has its pros and cons, of course, and I think in the cognitive sciences results such as these can rarely be generalised.
I think it's more than the mental paths it takes to get to the conclusion. For instance, one might consider taking someone entirely out of danger and putting them in a position to be harmed (to death) as different (and in most cases worse) from changing the direction of a train from five people already in a position of danger to one person already in a position of danger.

Ah ok, I think I understand the point you're making now. Yes, I think you're right. There are other decisions involved. I think it has to do in part with the psychological distancing I mentioned earlier. Even empathy is involved.

It's interesting that you put it that way. Just as an addendum to my point about emotions driving decisions in a reply to billy rubin, in brain scans evaluating moral decision-making there is higher activation in prefrontal regions (such as the orbitofrontal cortex and ventromedial prefrontal cortex, both more or less just behind the forehead and above the nasal cavity) which are typically not very activated in some psychopaths, for example. These two regions are both very important in these kinds of decisions and are linked to the emotional centers in the brain.
I am what survives if it's slain - Zack Hemsey


Asmodean

There is a thing other than the trolley problem that I think merits more of a... well, a think, in this day and age: self-driving cars. There's talk about programming them with some sort of ethical protocols or what have you. Personally, I oppose the idea entirely - a vehicle's priority, if it can be thusly called, should in my opinion always be the safety of its own occupants.

Come to think of it, that's no more than I demand of myself when driving with passengers, so if for some reason I AM barreling down the motorway at 110 km/h, carrying another person, surrounded by sheer cliffs, and suddenly there IS a kindergarten full of babies, each with a puppy and a kitty besides, well within my maneuvering distance... Requiescat in pace.

I actually did a survey about this, and they analyzed my data wrongly. In the result, they claimed that so-and-so many people would save a pregnant lady before an old guy or an animal before a human... No. From the point of view of a car, placed in a bad situation through no fault of its own, I always sided with my occupants. Simple as that.

This deserves its own thread though, I think, so I digress before we go into more realistic examples and this shit gets well and truly dark.
Quote from: Ecurb Noselrub on July 25, 2013, 08:18:52 PM
In Asmo's grey lump,
wrath and dark clouds gather force.
Luxembourg trembles.

billy rubin

Quote from: xSilverPhinx on April 28, 2020, 09:52:09 PM
Emotions have a genetic basis and evolved as well. ;) And if you research a bit on the neuroscience of decision-making you will see just how important emotions are in everyday decisions. I'd bet they take part in ALL decisions. Marketing has had this figured out since the time of Edward Bernays (Freud's nephew). There's even a neurological condition (I'm racking my brain trying to remember the name of the disorder, but can't...maybe it'll come to me soon) in which sufferers have a really hard time deciding basic stuff because they have impaired emotion, even though their ability to rationalise and their IQ are normal.

I ask you, if you had to make a choice to save your child or a stranger, would you stop and think, "I need to save my kid because they share 50% of my genes," or would you not think at all and save them because you love them? Act first, think later? 

no, no . . . no reasoning required in this model. it's mechanical, like any example of natural selection. animals recognize relatedness for close kin by lifelong association, and then make intuitive decisions based on similarity. think of phenotypic associative mating. if you can distinguish between your child and a stranger--to any degree--then natural selection over time will mindlessly select for behaviors that favor survival of genes that recognize kin and encourage nepotism. i'm asserting that love is a mechanical motivator like fear, pain, cold, and hunger.

but i'm not at all saying that emotions don't enter into the equation. i personally think that the experience of emotion is the heritable physical trait that is subject to selection and is an important part of the model i'm suggesting. emotion is the brain's immediate motivation for action, and it's the lizard brain that sees the offspring in danger and screams Save It! or sees the stranger, and coolly concludes, Not Worth The Risk.

Quote
I suggest you read this short essay on Jonathan Haidt on the rational-emotional mind and moral reasoning, just for fun ;) :

https://cct.biola.edu/riding-moral-elephant-review-jonathan-haidts-righteous-mind/

I'll just add a little snippet here, to whet your appetite.  ;D

Quote
He [Haidt] shows us that morality is neither the result of rational reflection (a learned exercise in determining values like fairness, justice, or prevention of harm) nor merely of innate, inherited assumptions. Haidt gives us the "first rule of moral psychology": "Intuitions come first, strategic reasoning second" (367). Human morality is largely the result of internal predispositions, which Haidt calls "intuitions." These intuitions predict which way we lean on various issues, questions, or decisions. The rational mind—which the Greek philosophers so valorized and even idolized—has far less control over our moral frameworks than we might think. Intuition is much more basic and determinative than reasoning.

i will read the rest of it after i dig my stinking culvert. just finished teaching the number three son that you can loosen corrosion-frozen hose connections enough to remove them by hand by banging the joint on the cellar steps, if you don't want to walk 1000 feet to the warehouse to get a wrench.



"I cannot understand the popularity of that kind of music, which is based on repetition. In a civilized society, things don't need to be said more than three times."

billy rubin

Quote from: Asmodean on April 29, 2020, 12:32:25 AM
There is a thing other than the trolley problem, that I think merits more of a... Well, think, in this day and age; self-driving cars. There's talk about programming them with some sort of ethical protocols or what have you. Personally, I oppose the idea entirely - a vehicle's priority, if it can be thusly called, should in my opinion always be the safety of its own occupants.

well that's interesting, because it's a real-world trolley problem. i think there's going to have to be some soul-searching on the part of the programmers in this field.

take a self-driving vehicle--no occupants--that is programmed to recognize pedestrians and slow down and stop when they are in its path and to steer around them when it can't. it pulls off a high-speed motorway and the brakes fail. ahead is an unavoidable crowd of people, a coarse-grained obstacle. clumps of people, large and small, with narrow gaps.

the vehicle cannot stop. where should it steer?


^^this question is going to have to be addressed, in the real world, pretty soon. i'd never thought of it before.
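
just to make it concrete, here's the kind of toy choice rule the programmer ends up writing. clump sizes and hit probabilities are completely invented here -- this is nobody's real control code, just the bare utilitarian arithmetic:

# brakes gone: pick the clump that minimizes expected casualties
# (invented numbers, illustration only)
clumps = [
    {"label": "big clump",   "people": 6, "p_hit": 0.90},
    {"label": "small clump", "people": 2, "p_hit": 0.70},
    {"label": "lone walker", "people": 1, "p_hit": 0.95},
]

def expected_casualties(c):
    return c["people"] * c["p_hit"]

target = min(clumps, key=expected_casualties)
print(target["label"], expected_casualties(target))   # lone walker 0.95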


"I cannot understand the popularity of that kind of music, which is based on repetition. In a civilized society, things don't need to be said more than three times."

Asmodean

Quote from: billy rubin on April 29, 2020, 12:48:53 AM
take a self-driving vehicle--no occupants--that is programmed to recognize pedestrians and slow down and stop when they are in its path and to steer around them when it can't. it pulls off a high-speed motorway and the brakes fail. ahead is an unavoidable crowd of people, a coarse-grained obstacle. clumps of people, large and small, with narrow gaps.
See? It's already getting dark in a hurry.

I think it should emergency-maneuver like a sensible "AI." Avoid frontal impact, avoid oncoming lanes, use side barriers to slow down. (If there be a ditch in lieu of side barriers - go in it at an angle conducive to dumping maximum speed for a minimum of damage. If there be a field - even better.)

A thing one also ought to factor in is the "AI's" ability to get on that horn faster than any human would while maneuvering for their lives. Then, one does have to take into account the pedestrians' ability to look towards the sound, do a quick risk assessment and dive for the nearest "not here."
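
In crude toy Python it might look something like this - priorities, names and costs entirely made up by me, nothing resembling an actual vehicle stack:

# pick the least-bad emergency maneuver that is actually available, occupants first
# (hypothetical options and costs -- illustration only)
MANEUVERS = [
    ("brake_in_lane_and_sound_horn", 1),  # shed speed, get on the horn early
    ("scrub_speed_on_side_barrier", 2),   # use the barrier or ditch to dump energy
    ("angle_into_ditch_or_field", 3),     # controlled off-road exit
    ("swerve_into_oncoming_lane", 9),     # last resort: trades one head-on risk for another
]

def choose_maneuver(available):
    options = [(cost, name) for name, cost in MANEUVERS if name in available]
    return min(options)[1] if options else "brake_in_lane_and_sound_horn"

print(choose_maneuver({"scrub_speed_on_side_barrier", "swerve_into_oncoming_lane"}))
# -> scrub_speed_on_side_barrier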
Quote from: Ecurb Noselrub on July 25, 2013, 08:18:52 PM
In Asmo's grey lump,
wrath and dark clouds gather force.
Luxembourg trembles.