Courage to Refuse
May 7, 2008 8:52 AM

Of forty participants in Milgram's first experiment on obedience to authority, fifteen refused to continue at some point. An insight into the thoughts of one man who refused to obey Milgram's immoral orders.
posted by iffley (45 comments total) 17 users marked this as a favorite
 
So how long before Godwin rears his head? :-)
posted by Mike D at 8:58 AM on May 7, 2008


Interesting. Thanks.
posted by small_ruminant at 8:59 AM on May 7, 2008


fascinating.
posted by milestogo at 8:59 AM on May 7, 2008


So how long before Godwin rears his head? :-)

Did you R the FA? It's pre-Godwinned for our convenience.

Interesting, though. Good post.
posted by gurple at 9:06 AM on May 7, 2008


He just figured out the trick, though.
posted by unsupervised at 9:08 AM on May 7, 2008


He just figured out the trick, though.

Exactly! How many of the fifteen stopped despite falling for it? I find that to be far more interesting/significant.
posted by prefpara at 9:13 AM on May 7, 2008 [1 favorite]


I'm curious about the reactions of those who "followed orders". Is there anything on the web about their reaction and thoughts after the experiment?
posted by Blazecock Pileon at 9:13 AM on May 7, 2008


I'm curious about the reactions of those who "followed orders". Is there anything on the web about their reaction and thoughts after the experiment?

Milgram's book on the experiments includes some fascinating interviews with many subjects. I'm not sure if it's available online.
posted by grobstein at 9:17 AM on May 7, 2008


Obedience to Authority should be required reading in all high schools (don't hold your breath).

Milgram talks about some fascinating variations on the experiment in the book. These are the two that stick in my mind:

The experiment was conducted with the learner in another room, in the same room, sitting next to the teacher, and in one version the teacher was required to physically touch the learner and push his arm down onto a metal plate. The closer the learner and teacher were, the higher the rate of disobedience.

I think this finding is evidence against NRA arguments that if someone didn't have a gun, he or she would simply kill someone with a knife or a bat or a rock instead. This finding shows that the more physical and emotional distance there is between victim and perpetrator, the more willing the perpetrator is to commit violence. Don't think too hard about what this means in the case of functionaries sitting in underground bunkers with red buttons in front of them.

The single most significant variable to affect rates of disobedience was being in the presence of others who refused. In one version, the teachers worked in teams, only one of whom was not aware of the true nature of the experiment. The others were actors who at some point would refuse to go on. In the presence of others who refused, rates of disobedience were highest.
posted by crazylegs at 9:22 AM on May 7, 2008 [15 favorites]


You must watch it to truly appreciate it.
posted by NoMich at 9:23 AM on May 7, 2008 [1 favorite]


82 copies available here
posted by crazylegs at 9:24 AM on May 7, 2008


I'm not sure that a discussion of the good-German theory CAN be Godwinned.
posted by Artw at 9:26 AM on May 7, 2008


So what we can take from this article is that good communists will defeat good Germans every time.
posted by Roger Dodger at 9:32 AM on May 7, 2008


Crazylegs, it turns out that there is a Killology lab (I am not making this up) that has examined the relationship between physical distance and resistance to killing. The graph at the end of this Killology web page shows this.
posted by jasonhong at 9:37 AM on May 7, 2008 [2 favorites]


Diana Baumrind, in the American Psychologist, 1964, complained that there was no informed consent and that even if valuable information were gleaned, it would not justify the risk that real [emotional] harm [would be] done to the subject.

Heh, I think the 25 who didn't refuse were indeed in need of an emotional spanking.
posted by TheOnlyCoolTim at 9:38 AM on May 7, 2008


So what we can take from this article is that good communists will defeat good Germans every time.

Hmm, I suspect it was more being a Communist in America. He has some thoughts along those lines:

In the early 1950s, I was harassed and tailed by the FBI, and in 1954, along with other leaders of the Communist Party in Connecticut, I was arrested and tried under the Smith Act on charges of "conspiracy to teach and advocate the overthrow of the government by force and violence." We were convicted, as expected, and I was about to go to jail when the conviction was overturned on appeal. I believe these experiences also enabled me to stand up to an authoritative "professor."

This is not to say that membership in the Communist Party made me or anyone else totally independent. Many of us, in fact, had become accustomed to carrying out assignments from people with higher positions in the Party, even when we had doubts. Would I have refused to follow orders had the experimental authority figure been a "Party leader" instead of a "professor"? I like to think so, as I was never a stereotypical "true believer" in Party doctrine. This was one of the reasons, among others, that I left the Party in the late 1950s. In any event, I believe that my political experience was an important factor in determining my skeptical behavior in the Milgram experiment.

posted by Artw at 9:48 AM on May 7, 2008


I think we should study people who volunteer for studies. There's something wrong with those people. Also, I'm not sure what the applicability of the study is, although I'm a long-time fan.

The people that do the most damage aren't the ones that follow orders. The problem is the silent masses who aren't involved and won't get involved. Or so they think.
posted by ewkpates at 9:55 AM on May 7, 2008


Not only did some refuse to participate, but, if the William Shatner film The Tenth Level is to be believed, one actually took a swing at Milgram, causing him to leap away into a Kirk-like shoulder roll.
posted by Astro Zombie at 10:02 AM on May 7, 2008 [1 favorite]


I think we should study people who volunteer for studies. There's something wrong with those people

At least at my institution, those people are responding to incentives. Either they're paid for their time (doesn't have to be very much) or they have to do it for a grade (intro psych students here are required to take part in other people's studies, I think).

I stay away from the psych department. Those people are nuts.
posted by dismas at 10:07 AM on May 7, 2008


I think this finding is evidence against NRA arguments that if someone didn't have a gun, he/she would merely kill someone with a knife or a bat or a rock.

The problem with this is that most shootings take place at 3 to 15 yards or less -- these are same-room distances, which means that the shooter is close enough to clearly see the face of his or her target, and close enough to touch after a second or two of movement. Other than drive-bys, there isn't a whole lot of Milgram-style murder at a removed distance with handguns.

Also, the major problem pro-gun-control England and Wales are having with stabbings doesn't exactly make your point. "A child is stabbed to death on the streets of Britain every week and knife homicides outnumber gun homicides by three to one" -- note that they still have shootings ten years after they enacted one of the strictest gun bans in the world! This link (from the Gun Control Network, no less) has some very interesting stats on how poorly gun control has served to stop gun crime in Britain, particularly over the last ten years. Here it is in graph form, from this page. Note that handguns were banned there in 1997.
posted by vorfeed at 10:26 AM on May 7, 2008 [2 favorites]


This is not to say that membership in the Communist Party made me or anyone else totally independent. Many of us, in fact, had become accustomed to carrying out assignments from people with higher positions in the Party, even when we had doubts. Would I have refused to follow orders had the experimental authority figure been a "Party leader" instead of a "professor"? I like to think so...

I was glad to see him acknowledge this, but I think he's more optimistic about himself than is warranted (as is only human). I'd bet money that if his Party boss told him to push the button, he'd have done it. That's what the CP is based on, after all; as with the Army, if you don't want to follow orders, you don't join in the first place.
posted by languagehat at 10:35 AM on May 7, 2008


Oh, and great post!
posted by languagehat at 10:36 AM on May 7, 2008


Derren Brown re-enacts the experiment – for entertainment, but still fascinating.
posted by liquidindian at 10:56 AM on May 7, 2008 [2 favorites]


FTA:
I think the experiment had only limited relevance to our understanding of the actions of the German people under Nazi rule. In the experiment, the professor had no power to enforce his orders. In Nazi Germany, the enforcement powers went from simple reprimand all the way to imprisonment and death.

His description is very interesting, but here he seems to miss the point. If the authority figure in the experiment "had no power to enforce his orders," one would expect a much smaller ratio of people acquiescing to them. The fact that the majority of the subjects followed their orders all the way to the end indicates that fear of reprisal from the authority figure isn't the driving factor it had previously been thought to be. At the very least, it makes it clear that you need not have guns in order to get people to obey you.

The point, then, is that you don't need to have a Nazi regime in place to get people to obey Nazi-like orders.
posted by voltairemodern at 11:00 AM on May 7, 2008 [1 favorite]


The professor brought in the learner and I was flabbergasted. His face was covered in tears and he looked haggard. He offered his hand and thanked me for stopping the experiment, saying that the shocks hadn't really hurt but anticipating them had been dreadful. I was confused as to whether he was in earnest or acting. I left unsure, and waited outside for the learner so I could discuss it with him.

This is pretty screwed up. When I studied this experiment in introductory psych, I was under the impression that as soon as the experiment was over, they explained the entire procedure and brought in the learner to confirm that the shocks were faked. If they really let people walk away thinking they might have just tortured a guy to death, then the whole thing becomes a lot more ethically questionable, to say the least.

crazylegs: Another interesting variation I recall was that instead of conducting the experiment in a research lab at Yale, they set up shop in a run-down office complex. If I remember correctly, even though the explanation and setup were the same, people were a lot more reluctant to cooperate when they didn't have the visible connection to a prestigious university to fall back on.
posted by teraflop at 11:00 AM on May 7, 2008


Vorfeed, I love the fact that the website you linked claims to use the brute power of statistics to "prove" America shouldn't have gun laws, and then commits one of the classic tricks from the How to Lie with Statistics handbook by comparing the total number of homicides, in 100,000s, with the fall in knife crimes, in millions. But as you can see in the graphs, the fatal stabbings fell more than the rate of fatal shootings.


I'm not commenting on his argument, which I know nothing about at all. Just that no one can really see anything from graphs measured on different scales, IMO.
posted by munchbunch at 11:09 AM on May 7, 2008


I enjoyed this.

In retrospect, I believe that my upbringing in a socialist-oriented family steeped in a class struggle view of society taught me that authorities would often have a different view of right and wrong than mine. That attitude stayed with me during my three and one half years of service in the army, in Europe, during World War II. Like all soldiers, I was taught to obey orders, but whenever we heard lectures on army regulations, what stayed with me was that we were also told that soldiers had a right to refuse illegal orders (though what constituted illegal was left vague).
posted by jessamyn at 11:11 AM on May 7, 2008 [1 favorite]


It's worth noting that Milgram was working in the early days of social psych; as with Philip Zimbardo's Stanford Prison Study, there was minimal/zero thought given to ethical oversight, and indeed this trial can't easily be replicated today because of the informed consent issues arising.

(Hell, I got a whole novel out of the latter study ...)
posted by cstross at 11:15 AM on May 7, 2008 [3 favorites]


And a very enjoyable one, by the way, even though the inspiration was fairly clear about a quarter of the way in. :)

Milgram's original paper doesn't mention anything about a debriefing procedure, but I found a couple of indirect sources (1) (2) claiming that participants were fully informed about the procedure afterward. Apparently there was a retrospective paper published in 1972 that goes into a bit more detail about the ethical concerns, but I haven't been able to track down a copy.
posted by teraflop at 11:54 AM on May 7, 2008


That Derren Brown video is sort of heartbreaking. You don't usually get the sense from reading about the Milgram experiment just how painful it was for some of the participants to think they were hurting someone else. The subjects in the Brown video were just so desperately unhappy to think that they were causing someone else pain.

Actually, it makes me a little more optimistic about humanity than just reading the results of the Milgram experiment did.
posted by Astro Zombie at 12:08 PM on May 7, 2008


Chip Kidd's book, The Learners, gives an interesting fictionalized account of this--if you can, read the first book, The Cheese Monkeys, then read The Learners.
posted by agatha_magatha at 12:19 PM on May 7, 2008


I'm not commenting on his argument, which I know nothing about at all. Just that no one can really see anything from graphs measured on different scales, IMO.

The particular graph I linked to is the one I was trying to show -- I only linked to the rest of the page out of fairness, because I don't like linking to other people's work without giving proper credit for it. And the graph I linked to shows pretty clearly, in a single scale, that gun crimes have increased in Britain since the gun ban. You'll note that this is the only claim I made with regards to it.

Also, that page never even makes the comparison you're claiming it does! Nowhere does it directly compare the two graphs you linked to (they are in two different sections of the article!), nor does it claim anything close to "the fatal stabbings fell more than the rate of fatal shootings". And besides, the guy who runs that page was entirely up-front about the change in scale: "Please note, the rest of the graphs will use per-million rates to make it easier to see the shifts." Sorry, but that's not exactly an attempt to lie with statistics.

On top of that, you're just plain wrong, because "the fatal stabbings fell more than the rate of fatal shootings" can certainly be shown using two different graphs with two different scales. Rates of change aren't dependent on scale! Visibility is the reason why that page chose different scales for the two graphs, not some attempt at subterfuge.
posted by vorfeed at 12:35 PM on May 7, 2008


er, actually, I just spotted where the guy is comparing them, my mistake. At any rate, the rest of my comment stands -- it is perfectly valid to compare the rates of change over those two graphs, because rates of change are not dependent on scale. Also, the owner of the page did not make any attempt to hide the change in scale, or to use it to mislead.
posted by vorfeed at 1:07 PM on May 7, 2008
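[For readers following the scale argument above: vorfeed's claim that relative rates of change are independent of a graph's scale can be checked in a few lines of Python. This is a minimal sketch using made-up numbers, not the actual British crime data.]

```python
# Percentage change between consecutive values of a series.
def pct_change(series):
    return [(b - a) / a * 100 for a, b in zip(series, series[1:])]

# Hypothetical homicide rates: the same data expressed per 100,000
# people and per million people (i.e., multiplied by 10).
per_100k = [1.2, 1.5, 1.8]
per_million = [x * 10 for x in per_100k]

# Rescaling multiplies numerator and denominator alike, so the
# percentage changes come out the same on either scale.
print(pct_change(per_100k))
print(pct_change(per_million))
```

The two printed lists agree (up to floating-point noise), which is the point: comparing rates of change across two graphs is valid even when the graphs use different units.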


Actually teraflop, the video NoMich linked says that changing the location from Yale to some random office building reduced obedience rates only slightly.
posted by anthill at 1:42 PM on May 7, 2008


Actually, it makes me a little more optimistic about humanity than just reading the results of the Milgram experiment did.

Yes. Right up until the bloke who asks for more switches.
posted by liquidindian at 3:06 PM on May 7, 2008


I had just written about this, and was alerted about this discussion, and "unsupervised" is dead on. This guy (from the experiments) claims that his previous life-- as a communist, as a questioning soldier-- made him naturally able to resist authority.

But really, he refused to participate because he thought that he was the one being experimented on. In other words, he didn't stop because he cared about the other person-- he just didn't want to be manipulated.

This doesn't mean he is a bad person, but it does mean that the reasons he cites for withstanding the awesome powers of authority are really reasons he is able to detect manipulation. In other words, this is about narcissism, not morality.

And to echo crazylegs: the problem isn't so much that people can be controlled by authority; the problem is that the mere presence of other people makes us more or less susceptible to it. What makes us noble creatures is the ability to rise above the swarm.
posted by TheLastPsychiatrist at 3:44 PM on May 7, 2008 [2 favorites]


The closer the learner and teacher were, the higher the rate of disobedience.

And the closer the teacher and the authority figure were, the higher the rate of obedience.
posted by Durn Bronzefist at 4:19 PM on May 7, 2008


In retrospect, I believe that my upbringing in a socialist-oriented family steeped in a class struggle view of society taught me that authorities would often have a different view of right and wrong than mine.

There is little reason to put stock in this guy's explanation.

Firstly, if there's one thing social psych has taught us, it's that people are horrible at describing their motivations and explaining why they act as they do. Confabulation is everywhere. It's pretty clear that this man takes pride in his political background, and has latched onto a cultural script that lets him explain his actions in terms of it. Maybe it really is the case that socialists are more likely to opt out of the Milgram experiment than participants of other political affiliations (there is probably data on this, but I don't know it). But evidence for this claim should not come from testimony about motivation. This is especially true when the author is describing a decision he made over forty years ago.

Secondly, as others have said, he didn't resist authority; he saw through the experiment. Maybe everyone, regardless of upbringing, would bow out of the experiment if they suspected they were being lied to. All of this makes me wonder if his results were stricken from the data because he saw through the experiment, or if they included him anyway. If Milgram didn't discount people who suspected something was up, then that means the number of people who are obedient to people they take to be honest authority figures is even higher than typically reported.

There's a prominent psychologist who teaches at a university in the Midwest who was asked why he didn't accept offers from more prestigious universities. His answer: research is easier in Kansas. Subjects believe anything you tell them. Try to do experiments in New York City and subjects will think everything you say is a lie.
posted by painquale at 4:22 PM on May 7, 2008 [2 favorites]


Also, I learned on NPR a few months ago that Milgram and Zimbardo were high school classmates. They must have had one hell of a domineering homeroom teacher.
posted by painquale at 4:28 PM on May 7, 2008


In other words, this is about narcissism, not morality.

This view goes some way towards explaining why nurses were the group that was most compliant when it came to torturing. Obviously, it's because they've been previously subjected to years of instruction in obedience from a group, many of whom appear to have no sense of morality whatsoever.

Also, if narcissism is the reason for the author's detection of manipulation, why wouldn't narcissism also be the reason for you to declare your own inability to detect his self-delusion?
posted by PeterMcDermott at 4:33 PM on May 7, 2008


ick. That should be ability to detect the author's self-delusion -- obviously.
posted by PeterMcDermott at 4:34 PM on May 7, 2008


[PeterMcDermott]... I don't follow you. Why would you think narcissism would help one [i.e. me] detect the author's self-delusion? My point is that narcissism helps him see himself outside of a hierarchy, and thus he doesn't attribute actual authority to authorities. But narcissism wouldn't give one any special insight into self-delusion of this kind.

Unless this is simply an ad hominem attack, in which case wow, you got me. Nicely played.
posted by TheLastPsychiatrist at 5:58 PM on May 7, 2008


I agree in general with TheLastPsychiatrist's idea, but it's not so much "narcissism" as a sense of one's own importance, and an urge to self-determination, of which a normal person has a normal amount, and a narcissist has far too much (and presumably there exists a condition of having far too little).

In other words, this is like saying "the man's paranoia helped him realize that the men loitering beside his car were muggers". That's not paranoia, that's ordinary caution. A paranoid might assume that a mother packing away shopping into a car beside his was a mugger.

Similarly, a narcissist might assume that, say, a criminal trial is being conducted primarily for the purpose of finding a way to help him evade punishment for whatever unimportant thing it was that he did. Or that not only the shock victim and experimenter, but everyone else in the room is actually an actor, and he and only he is being tested. (Which may shade into paranoia or megalomania, depending on the associated fear or delight at the realization.)

I too am very interested in the group who didn't realize it was an experiment and still resisted.
posted by aeschenkarnos at 7:03 PM on May 7, 2008


cstross (Hell, I got a whole novel out of the latter study ...)
Yours is an interesting and disturbing "answer" to the issue of informed consent, too.

It's not even vaguely a spoiler, so for the benefit of those who haven't read "Glasshouse": the story takes place in a "zero-point tech" universe, where copying anything, including people, is trivial, as is making any desired modifications to the copy. The main protagonist agrees to split off a copy to participate in a psychological study. In return for the copy's participation, the "main identity" (i.e., all copies) is promised a benefit.

This raises all kinds of social and ethical issues, e.g.: your right to bind copies of yourself to contracts. Your right to remove the knowledge of the contract from your copy. Courts could subpoena a copy of you to be destructively interrogated. You could torture copies of yourself to indulge sadomasochism (which itself could be edited out of you, or edited into you), have sex with them to indulge narcissism, or modify them slightly to create your own species of you. Copies of yourself could be killed or tortured to punish or frighten you, although it's equally trivial to modify "you" to be cooperative. Then there's the enormous value of the last remaining copy. And so on. There's a whole bunch of stories in the copy/modify concept alone.

(David Brin's "Kiln People" works on a similar idea of distributed identity, but lower-tech; if you liked the identity issues raised in "Glasshouse", I'd recommend that book too.)
posted by aeschenkarnos at 7:20 PM on May 7, 2008


We went over this in my sociology class and I found this study, along with the Stanford Prison Experiment, incredibly fascinating. While the SPE was more about abuse of power, Milgram's experiment on people's willingness to go along with orders, even when they felt it was wrong, was interesting and disturbing at the same time.

BTW, Milgram's experiment was done before modern research ethics safeguards were put into place, so this is something else you should consider about the experiment.
posted by Chocomog at 4:40 AM on May 8, 2008




This thread has been archived and is closed to new comments