Brain
March 27, 2007 8:17 AM

Impaired emotional processing affects moral judgements. People with damage to a key emotion-processing region of the brain also make moral decisions based on the greater good of the community, unclouded by concerns over harming an individual.
posted by semmi (48 comments total) 3 users marked this as a favorite
 
You know who else thought they were making decisions based on the greater good of the community, unclouded by concerns over harming an individual?
posted by kisch mokusch at 8:21 AM on March 27, 2007 [1 favorite]


Of course it does. There is no "soul", only the brain. You change the brain, you change the person. You change the process that makes the brain, you change the brain, you change the person. You change the instructions that control the process that makes the brain, you change the process that makes the brain, you change the brain, you change the person. And this is the house that Jack built.
posted by DU at 8:31 AM on March 27, 2007 [1 favorite]


That was an interesting article. Thanks for posting. I wish they had listed a few more examples of the scenarios used in the experiments, though. In every one of the three moral scenarios listed at the bottom of the article, my gut reaction was to sacrifice the individual for the greater good of the community.

Now I'm sitting here wondering what the hell happened to the emotional-processing region of my brain. Poor little brain. I wonder if a band-aid would help.
posted by diamondsky at 8:43 AM on March 27, 2007


Personal Moral Scenario: Psychological Test

You are given a ridiculously unlikely if not impossible scenario with only two choices, and you are not provided with any contextual information that would allow you to assess the situation in an even slightly realistic manner. Do you go ahead and answer the question?
posted by L. Fitzgerald Sjoberg at 8:47 AM on March 27, 2007 [6 favorites]


This seems like an appropriate time to revisit this scenario.
posted by nanojath at 8:48 AM on March 27, 2007 [3 favorites]


Would this explain all who send others to war? Or let them drown and starve? If so, can we operate on them or forbid them from having power to save or destroy life?
posted by amberglow at 8:51 AM on March 27, 2007


diamondsky, I also have brain damage according to this test. Stupid test.
posted by Green Eyed Monster at 9:13 AM on March 27, 2007


[Not serious] Tut, tut, amberglow, you're simply being emotional. You can't see the greater good. Try staying awake for a long period: your morality may change. (Original article)
posted by alasdair at 9:15 AM on March 27, 2007


You know who else thought they were making decisions based on the greater good of the community, unclouded by concerns over harming an individual?

Spock, at the end of Wrath of Khan?
posted by dreamsign at 9:20 AM on March 27, 2007 [5 favorites]


Here are some more of those moral dilemma scenarios philosophers and psychologists worry about.

A tough one, if you thought switching the track was acceptable, is:

You are a doctor. You have five patients, each of whom is about to die due to a failing organ of some kind. You have another patient who is healthy.

The only way that you can save the lives of the first five patients is to transplant five of this young man's organs (against his will) into the bodies of the other five patients. If you do this, the young man will die, but the other five patients will live.

Is it appropriate for you to perform this transplant in order to save five of your patients?


I think most people would say no to this scenario, based on emotional reasons. The book Moral Minds talks about this, and why it happens.
posted by demiurge at 9:21 AM on March 27, 2007


I wonder if any of the questions involved saving five people by killing yourself. I'm guessing we'd see a lot less interest in the greater good at that point.
posted by L. Fitzgerald Sjoberg at 9:30 AM on March 27, 2007


Oh, much better example, demiurge. I feel slightly less damaged now. Also, I just googled for more examples and a few of them were really mean!

Anyway, I think I'm going to get the book.
posted by diamondsky at 9:32 AM on March 27, 2007


Is it appropriate for you to perform this transplant in order to save five of your patients?

Are you allowed to do some assessment as to the potential worth to society of the five people vs. the one young man? Or do you just have to say ok, it's five unknown but fatally diseased people vs. one healthy young man? Cuz I don't think there is a way to prove reason vs. emotion without context about who these six people are.

For example, five 70 year old people who have families and grandchildren and have thus led to a degree 'lived' their lives vs. one young man who might be cure AIDS some day...
posted by spicynuts at 9:35 AM on March 27, 2007


I hate those stupid active Cartesian demons.
posted by avoision at 9:35 AM on March 27, 2007 [1 favorite]


good god i've had too much coffee...let me try that again:

..five 70 year old people who have families and grandchildren and have thus to a degree 'lived' their lives vs. one young man who might cure AIDS some day.
posted by spicynuts at 9:37 AM on March 27, 2007


dreamsign: Spock, at the end of Wrath of Khan?

Shaka, when the walls fell.
posted by LordSludge at 9:57 AM on March 27, 2007 [4 favorites]


demiurge:
Is it appropriate for you to perform this transplant in order to save five of your patients?

Since the motivation of the doctor is "utilitarian" (to improve the "medical worth" of a certain group of people) he should not remove the organs from a healthy person if there is any chance in hell that other people in the group could find out that he did that.
Why? Because what motivation does a person have to maintain his own "organ property" if a) he effectively does not own his organs, since the doctor will take them on a whim and b) he knows that the doctor will simply steal good organs from someone else when his own go bad.

Removing the organs may increase the "medical worth" of this group of people temporarily, but in the long term it destroys value. The best way of maintaining a group of people is if each also cares about maintaining him/herself and does not have to worry about being dispossessed of healthy organs.

If instead the doctor were the chief mechanic of a fleet of 1,000 cars, there would be no question about what should be done. Here it works the other way around simply because the cars' performance is influenced solely by parts and maintenance, not by any perception of the decisions the mechanic makes. The cars do not take care of themselves.
posted by umop-apisdn at 10:22 AM on March 27, 2007 [5 favorites]


This is sort of a "don't trust anyone over 30" thing too, innit?
posted by Twang at 10:33 AM on March 27, 2007


umop-apisdn, I would submit that a large majority of doctors never think about harvesting organs from live people in such terms. They would have an immediate emotional reaction against doing it. The mechanic would have a greatly reduced reaction, depending on how emotionally attached he is to his fleet of cars.
posted by demiurge at 10:50 AM on March 27, 2007 [1 favorite]


The findings could cause a rethink in how society determines a "moral good", and challenge the 18th-century philosophies of Immanuel Kant and David Hume.

Okay, Kant, maybe. I'll leave that one aside until I have enough time to make a more reasoned argument. But Hume? It seems, if anything, that this supports Hume's moral philosophy. When the article says

results suggest that emotions play a crucial role in moral decisions involving personal contact

They might as well be paraphrasing sections of the Treatise. Hume, however, was no Utilitarian, and I wonder if the authors of this piece are abusing the word. What they've shown is that the brain affects moral reasoning, and that sections of the brain that process emotion are involved. In some sense they bring utilitarianism into view by offering an individual v. group dichotomy. (Which, to be fair, is a meaningful distinction, but I wonder if their results really imply what they say they imply.) It is a touch too far to relate these findings to a philosophical thesis.
posted by elwoodwiles at 10:58 AM on March 27, 2007


Would you go back in time to sterilize Hitler's mom, thus saving the world from Hitler, if you knew that in his stead, Mecha-Hitler would wreak its fiery havoc on jew and gentile alike, or would you just eat the banana?
posted by Mister_A at 11:01 AM on March 27, 2007 [1 favorite]


Related thread.
posted by homunculus at 11:11 AM on March 27, 2007


After a few moments of sitting very still, with closed eyes, I seem to remember that Hume based his moral theory on the ability of individuals to abstract from the knowledge of their own pleasure/pain - in other words, moral sense is an abstraction of one's sense of their own individuality. Perhaps this is what the authors are trying to refer to.

However, I doubt the subjects of the study have lost their sense of their own individuality. In any case, the jump to utilitarianism seems unfounded.

Mister_A: Trick question, brains in jars cannot eat bananas.
posted by elwoodwiles at 11:18 AM on March 27, 2007 [1 favorite]


Plus, doctors take an oath not to do that sort of thing. That doesn't apply to brains driving trolleys, or soldiers, or leaders of trapped mountaineers. I tend to choose "for the group" when taking tests like this, but I said that the doctor couldn't, precisely because he's a doctor. If he were a fake doctor, I might have gone for the operation... but like spicynuts said, with just six people it's hard to decide without context as to who the people are. But make it six hundred people, or six thousand, and suddenly it's not so difficult.

elwoodwiles: In some sense they bring utilitarianism into view by offering an individual v. group dichotomy.

What's important here isn't just "individual v. group" -- just about everybody will choose "for the group" in certain circumstances, especially when doing so means an impersonal action or a lack of action. The big difference with these brain-damaged people is personal v. impersonal. They are willing to take personal action to insure the good of the group; most people will not (though, like diamondsky and I, there are always some who will).
posted by vorfeed at 11:24 AM on March 27, 2007


The AIDS example is very easy for me. The subject is planning multiple murders; if it's right to kill in self-defence, it's right to kill here.

I guess the difference between the track-switching scenario and demiurge's scenario is that in the latter, letting five people die doesn't violate the principle of self-ownership, whereas in the former, the principle will be violated either way, but to a greater or lesser degree.

The submarine example is trickier.
posted by hoverboards don't work on water at 11:30 AM on March 27, 2007


I don't understand why people are using the Wrath of Khan scenario as an example here. The key difference is that Spock did not sacrifice another person, he sacrificed himself. Taking the life of another person against his or her will is an entirely different moral issue than taking one's own.
posted by George_Spiggott at 11:31 AM on March 27, 2007


The big difference with these brain-damaged people is personal v. impersonal.

Excellent point. You're right, this is the finer distinction at work.

They are willing to take personal action to insure the good of the group

This is the interesting part, isn't it. By taking a more personal role in insuring the good of the group, they are de-personalizing some other individual. They seem to be thinking about the greater good in somewhat quantifiable terms: More people alive = good. In a sense, they are more willing to simplify moral decisions and they are more willing to act on their conclusions.
posted by elwoodwiles at 11:46 AM on March 27, 2007


"Would you go back in time to sterilize Hitler's mom, thus saving the world from Hitler, if you knew that in his stead, Mecha-Hitler would wreak its fiery havoc on jew and gentile alike, or would you just eat the banana?"

This is easy. Since about 1945 it's been clear that technological coolness trumps conventional morality every time. So, all we need to know is does Mecha-Hitler have lasers coming from his eyes? If so, get in the time machine right now.
posted by thatwhichfalls at 12:32 PM on March 27, 2007


The Evolution of Goodness
posted by homunculus at 1:24 PM on March 27, 2007


By taking a more personal role in insuring the good of the group, they are de-personalizing some other individual.

I'm not so sure about that. To me, there's very little difference between pressing a switch to drop the big guy in front of the train in order to save five other lives, and pushing him yourself. Both actions depersonalize him -- the very question, "should one person die to save five", is inherently depersonalizing. And for good reason, because a completely personal society would be paralyzed by these kinds of questions.

To me, the difference is one of responsibility. It's not that most people refuse to act for the good of everyone -- they're happy to do so, so long as they can rationalize it through some sort of middleman, one that allows them to avoid responsibility for what they've done. But when the question becomes personal, when it's their actions on the line, suddenly people find that it's more moral to stand by and watch rather than act, even when the question is one which hardly anyone would refuse if it were worded impersonally. Maybe I'm just overly rational or something, but I don't really get that. Emotional response is one thing, if it's your mother, your brother, your beloved pet goldfish on the line... but to put the personal survival of one total stranger over five seems utterly foreign to me.
posted by vorfeed at 1:25 PM on March 27, 2007


Interesting comments. I see what you mean regarding responsibility, but I'm not sure that either this study or utilitarianism itself requires the agents in question to take further responsibility for the decisions being made. That, like you say, is a personal matter. When someone pulls the switch, allowing the one man to die in order to save five, aren't they going to rationalize their actions through some kind of "middleman"? In this case the middleman would be the concept that 'the needs of the many outweigh the needs of the few' or some such idea.
posted by elwoodwiles at 2:15 PM on March 27, 2007


BTW: I'm just reacting to things here; I don't really have much of a dog in this fight. Really I think the article is interesting, though it shows some misunderstanding of Hume, Kant and Utilitarianism. I also tend to think that one should 'pull the switch' and save as many people as possible in most situations. What makes me pause is the notion that doing so is more 'responsible' or 'rational.' At the end of the day saying "To save five people made this one death justifiable" is still just rationalization - not some inherently true or logical principle. And what is interesting is the way we, as humans, react to these sorts of issues.

Okay, I've done my bit of talking for the day.
posted by elwoodwiles at 2:30 PM on March 27, 2007


What makes me pause is the notion that doing so is more 'responsible' or 'rational.' At the end of the day saying "To save five people made this one death justifiable" is still just rationalization - not some inherently true or logical principle.

Frankly, I think every single aspect of morality is just rationalization. Morality itself is not inherently true or logical, it's a social construct created by human beings. Otherwise, we wouldn't see such variation in morals across different cultures (and even within the same culture). That said, if one accepts that the needs of the many generally outweigh the needs of the few, it seems to me that this should be equally true whether or not a personal interaction is required. Yet we see the exact opposite -- people who readily accept that the many should outweigh the few when the protective action is indirect will often refuse to take an otherwise-identical direct action to protect the many. This seems irrational to me. I suspect there's a second, unstated moral value at work here, one that may be missing from the set of my personal values...
posted by vorfeed at 3:19 PM on March 27, 2007


I suspect there's a second, unstated moral value at work here

I completely agree. There is something to the notion of morality that has not quite been articulated as of yet. Which is, of course, why talk of morality seems so philosophical.
posted by elwoodwiles at 4:06 PM on March 27, 2007


You don't have to kill the person to prevent bad. That's kinda key, i think. What options do you have, and why are some people so willing to simply destroy/kill without considering or taking other options, rather than simply not playing God?

Morality is socially constructed, but (almost) all humans share certain traits that meld with morality and values, like needing others, and needing physical contact and some form of connection/belonging to a larger group, and probably affection/love, safety, communication, etc--a ton of things that require contact with others and relations with others. We're almost wholly social and tribal animals, and cooperation is a moral value.
posted by amberglow at 4:12 PM on March 27, 2007


Yet we see the exact opposite -- people who readily accept that the many should outweigh the few when the protective action is indirect will often refuse to take an otherwise-identical direct action to protect the many. This seems irrational to me. I suspect there's a second, unstated moral value at work here, one that may be missing from the set of my personal values...
I don't think it's unstated, but connected to tribal things--we value strangers less than those of our own group, but that doesn't mean we should go around killing even one. If killing one stranger will save 5 other strangers, why should we?

Many if not most people all over the world would kill to save their own (children, family, etc), and do so. They wouldn't however, kill to save people who they aren't tied up with or invested in without those people having to be some kind of real threat first, i don't think.
posted by amberglow at 4:17 PM on March 27, 2007


I was just about to make a comment about how excited the military must be about this finding: a way to change the morality center of the brain, a la The Manchurian Candidate. Then I realised that right now we have people doing completely immoral things with no brain altering (see Abu Ghraib), unless the stress of war can make that change. We as a country, and individuals for that matter, are making that choice. It is fine to torture people for our own good. How far is that from "let's take the guy's organs"? China seems to be fine with harvesting prisoners' organs.

I feel really icky thinking about this.
posted by Belle O'Cosity at 4:26 PM on March 27, 2007


Those things, Belle, are now the official policy of this country, and while they should deeply affect us, they are abstract, sadly. We are responsible for them, and by allowing them all to continue at Gitmo and those secret prisons and in Iraq, etc, we might as well have our own hands on the dogs and waterboards and guns and whatever, but thankfully don't. It's too easy for us to shove it off by saying "it's the govt" or "it's wrong" instead of "we do this to those people" and "we are doing wrong" (and again, it's being done out of our sight, and to strangers). Rationalization and distancing are vital emotional tools sometimes, but they don't count as morality.
posted by amberglow at 4:36 PM on March 27, 2007


There's something about "the greater good" and "the ends justify the means" and "the needs of the many outweigh the needs of the few", etc -- is it that they're all distancing and simply rationalizations for things we intrinsically know are not in themselves good?
posted by amberglow at 4:40 PM on March 27, 2007


The very concept "many" is dehumanizing, yet we voluntarily sort ourselves into groups of perceived identical interests, only to gain power for ourselves.
posted by semmi at 5:14 PM on March 27, 2007


you think it's just for ourselves personally, semmi? or "ourselves" in the group sense?
posted by amberglow at 8:05 PM on March 27, 2007


What the hell? If you have a straight choice between killing one person or killing five, of course you should only kill one. I see nothing even slightly odd about that judgment.
posted by reklaw at 7:20 AM on March 28, 2007


reklaw: If you have a straight choice between killing one person or killing five, of course you should only kill one.

Of course, but when we're talking about actively killing one person vs. letting five die through inaction, it gets a little tricky. The flip-side to your position is: Would you rather be a murderer or just a bystander?

That said, most of the ethical "dilemmas" in TFA are bogus, in that they assume perfect knowledge -- known initial conditions and known outcomes. Real life is not that simple. How do you *know* that none of the five will see the train and get out of the way? By making the switch, what's the risk of derailing the train, killing everybody on-board? Can you simply yell, "Hey, look out!"?

The problem is not so much "playing God"; it's having that God-like knowledge to begin with.
posted by LordSludge at 10:27 AM on March 28, 2007


amberglow: Insofar as one is uniquely singular, all action is for personal advantage, however it is rationalized.
posted by semmi at 10:46 AM on March 28, 2007


i'd disagree, semmi--i think there is personal advantage in everything we do on that level, but that there are many many truly selfless acts and acts of sacrifice for others, and acts where you yourself and your own benefit are not even close to being the main or secondary or any part of the reasoning or drive or impulse to do something.

What the hell? If you have a straight choice between killing one person or killing five, of course you should only kill one. I see nothing even slightly odd about that judgment.
Nope. Refuse those choices entirely. Don't obey. Don't play the game. Don't make either choice. Change the rules. Do something other than killing. Look for other options not presented to you. Killing, whether it's one or many, is rarely a proper choice.

This is not one of those "we'll kill you unless you pick one of these options" things so you have to choose or die yourself. This simply lays out 2 decisions and tells you to pick one of two.
posted by amberglow at 11:45 AM on March 28, 2007


(this kind of thing always shows how strongly we're conditioned to obey, and simply accept what's given--it's troubling.)
posted by amberglow at 11:48 AM on March 28, 2007


Yeah, when asked "Which of these two choices is preferable", only total mooks actually answer with one of the two.
posted by kafziel at 2:05 PM on April 5, 2007

