Experiments in Philosophy
November 24, 2010 10:18 AM   Subscribe

 
Previously
posted by Perplexity at 10:26 AM on November 24, 2010


In my link, you get to test yourself, which I found rather interesting.
posted by St. Alia of the Bunnies at 10:28 AM on November 24, 2010


Previously (including links to these specific quizzes/tests/games). However, I think they've added a few more over the years, so it may be worth a revisit.
posted by jedicus at 10:30 AM on November 24, 2010


so this is web reflexive?
posted by clavdivs at 10:32 AM on November 24, 2010


The article in the previous FPP links to your site, and some comments discuss it a little and also link to your site.

I don't have a particular view on whether the FPP should be deleted as a "double" -- I just knew the site looked familiar and found the previous connection to it from the blue.
posted by Perplexity at 10:32 AM on November 24, 2010


Would You Eat Your Cat?

I do not have a cat, but if i were starving i might consider it.
posted by clavdivs at 10:35 AM on November 24, 2010


Moral relativism, as always, wins.
posted by Joey Michaels at 10:37 AM on November 24, 2010


Oh yay, a website to justify torture:

"Is torture wrong?" Yes.

"Always always wrong?" Yes. It doesn't actually get results and is against what we stand for as a society.

"Well what about this scenario I just made up in which torture actually could work, and a million people could die, and they were dumb enough that torturing this guy would save these people?" I'm gonna vote "no" anyway, because your scenario is stupid.

"Ha! You're stupid!"
posted by explosion at 10:40 AM on November 24, 2010 [47 favorites]


The 'you're about to be tortured' one is very poorly designed. It uses 'you' to mean 'your body' in the set-up, then switches to meaning 'your mind in another person's body' in the second section, then accuses you of being confused. The whole design of the experiment begs the question of where consciousness resides.
posted by unSane at 10:43 AM on November 24, 2010 [8 favorites]


Exactly what I thought: torture-justifying experiments...

You are Jack Bauer...
posted by CitoyenK at 10:43 AM on November 24, 2010


No, in your torture the fat man scenario, the torture is not justified. Because Superman will find the bomb and throw it into space anyway, and Superman disapproves of torture.

You want to make up fictional worlds with unreal solutions to false dilemmas, we can go tit for tat on that one.
posted by kafziel at 10:44 AM on November 24, 2010 [3 favorites]


Is child murder and rape always wrong? Y N

Ok, you have a time machine and you can go back in time to kill Hitler before he starts WWII, but ONLY if you rape him first since the time machine is retroactively fueled by the despair of children. Do you rape and kill child Hitler? Y N
posted by furiousxgeorge at 10:44 AM on November 24, 2010 [38 favorites]


>>Ok, you have a time machine and you can go back in time to kill Hitler before he starts WWII, but ONLY if you rape him first since the time machine is retroactively fueled by the despair of children. Do you rape and kill child Hitler? Y N

Is this a trick question? Y N
posted by mooselini at 10:46 AM on November 24, 2010 [2 favorites]


A survey is not an experiment. In some cases, a survey may be part of an experiment, but I do not see that this is the case here. What hypothesis is being tested?

(Did they mean "thought experiments?" I still object; thought experiments are not experiments any more than a hot dog is a dog.)
posted by DevilsAdvocate at 10:46 AM on November 24, 2010


I love stuff like this. I just managed to get through the "Fat man" one without contradictions. That's because my morality is as finely-honed and consistent as a really sharp knife that cuts with amazing consistency. Probably.
posted by Decani at 10:49 AM on November 24, 2010


I enjoyed my experience of the experiments as an insight into how a truly diabolical genius might twist what is essentially Judeo-Christian morality toward evil ends.

Is Dick Cheney behind this thing?
posted by philip-random at 10:50 AM on November 24, 2010


Yeah, the torture one is stupid. I object to the torture arguments of "what if you save a million lives?!?" because in the real world, you can never know what's actually going to happen as a result of the torture.

Then I get presented with some completely impossible scenario where you have a priori knowledge that there is no other option and that torture has a great chance of working, and you can't evacuate residents while looking for the bomb or something, then I get told I'm morally inconsistent.

Bah!
posted by zug at 10:53 AM on November 24, 2010


I'm gonna vote "no" anyway, because your scenario is stupid.

Yeah, pretty much. The perfect is the enemy of the good here. Movie plot scenarios lead to really bad practical security choices. Police acting as if they were Jack Bauer, 100% logically consistent utilitarians, is not very desirable. On the menu: secret black prisons, extrajudicial renditions with a side of the inevitable corruption and abuses to follow.

There are good societal and psychological reasons why we're on average 67% consistent. The people who can make the torture and kill decisions often don't work well in everyday life. Would you rather live in a society with people well adjusted for "kill or be killed" decisions, or one in which we aren't all perfectly logical in our moral choices?
posted by bonehead at 10:53 AM on November 24, 2010 [1 favorite]


Perhaps then you should revisit your blanket opposition to torture.

No, thank you. That was a long enough visit for me.
posted by fartknocker at 10:54 AM on November 24, 2010


The entire site needs to be lifted into the Maybe monad; most of the questions they ask do not have an answer.
posted by Dr Dracator at 10:54 AM on November 24, 2010 [2 favorites]


Although I agree with the objections above, I am thinking this was meant more as an exercise in philosophy problems rather than answers to moral problems. (My son is the philosophy geek of our family and I have long since learned that to argue with him is to fail before you start.) The reason you have to set up the scenarios in such a way is so you can argue the philosophy, not that any of these scenarios have much to do with real life.
posted by St. Alia of the Bunnies at 10:58 AM on November 24, 2010 [1 favorite]


Yes, I would eat your cat.
posted by nomadicink at 11:04 AM on November 24, 2010 [4 favorites]


My problem with these experiments is that I'm pretty sure a bunch of people spend a lot of time trying to recreate these impossible situations so that they can justify these immoral and cowardly decisions.

The number one reason why you can never recreate these experiments in real life is that there is no way to ever have the level of certainty that they require you to have. It just does not exist.
posted by jabberjaw at 11:05 AM on November 24, 2010 [4 favorites]


Oh, and that makes these experiments pretty useless.
posted by jabberjaw at 11:06 AM on November 24, 2010


Okay, the Euthyphro one annoyed me because it said that as an atheist I could ascribe my own characteristics to my "god". I did so, and they were internally consistent. Then it identified "tensions" between my definition of "god" and "what most people think about god". I defined my god precisely so it would not have internal contradictions - of course such a god is going to grate with the commonly-held theistic versions!
posted by Decani at 11:08 AM on November 24, 2010 [6 favorites]


It didn't let me eat my cat. I feel cheated.
posted by Curious Artificer at 11:09 AM on November 24, 2010 [2 favorites]


I would not torture one fat ugly man to save a million small, adorable children, but I would eat my neighbour's cat. Hell, if I was hungry enough and he was dead or dying anyway, I'd probably eat my neighbour (parts of him anyway).
posted by philip-random at 11:10 AM on November 24, 2010


And I would definitely eat your God. And drink his blood. We're supposed to, aren't we?
posted by philip-random at 11:12 AM on November 24, 2010 [9 favorites]


"You might want to spend some time examining how you think about morality!"

Okay, fine. Maybe I'd push you in front of a train.
posted by gordie at 11:13 AM on November 24, 2010 [9 favorites]


Jack Bauer is being tortured by Evil Foreigners. He has a bomb set up in the basement of EvilCo which will go off in 5 minutes. Evil Sweaty Swarthy Guy is standing over him with a hacksaw and pliers. Upon torture, Jack Bauer says the bomb is on the roof, which gives the bomb time to detonate as EvilCo henchmen go on a wild goose chase.

Do you, as an audience member:
1) Realize that torture is stupid even in the unrealistic ticking time bomb scenario, then experience uncomfortable cognitive dissonance between your bloodlust and rationality
or
2) Cheer wildly since Jack Bauer is a Good Guy and looks like you

-- Continue --
posted by benzenedream at 11:15 AM on November 24, 2010 [6 favorites]


Yes, I would eat your cat.

Well that's very kind. How are you at catching mice?

wait i did that wrong
posted by shakespeherian at 11:16 AM on November 24, 2010 [3 favorites]


I answered the fat man questions so as to maximize death and suffering, and it said:
Your moral consistency score is 100% (higher is better) Well done. This score suggests that you are admirably consistent in the way you view morality. In fact, none of the people who have completed this activity demonstrate greater moral consistency in their responses than you manage. But don't feel too pleased with yourself. Most people don't think about morality very clearly!
Hooray sociopathy!
posted by fleetmouse at 11:17 AM on November 24, 2010 [17 favorites]


My cat would eat me.
posted by ovvl at 11:19 AM on November 24, 2010 [2 favorites]


Hooray sociopathy!

As we all know, hypocrisy is the worst of all possible sins.
posted by empath at 11:21 AM on November 24, 2010 [2 favorites]


"In the Face of Death" was not very good! It says that sailors killing a semi-conscious almost dead person for food is similar to a doctor killing a very conscious almost dead person to save her daughter. That is so wrong for a number of reasons, not least because the doctor takes the Hippocratic oath and sailors do nothing of the sort.
posted by 200burritos at 11:22 AM on November 24, 2010


If somebody created an airtight straw man argument, would it change your point of view on something like torture? Y N

If somebody created an airtight straw man, would you pour bees on Nick Cage's head? Y N
posted by Joey Michaels at 11:24 AM on November 24, 2010 [14 favorites]


If I wish to maximize the sum total of human happiness, and if my ethical system allows for circumstances that justify torture, and I conclude that every person that takes this test will find their personal happiness significantly diminished as a result of frustration at having been taken for a (philosophically speaking) ride, and the ONLY WAY to disable the website is by the use of a password, which only the creator of this test knows...
I'm pretty sure the only ethically acceptable path is the torture of the site's webmaster.
posted by verb at 11:25 AM on November 24, 2010 [1 favorite]


Yeah, as others have pointed out, contrived examples are contrived. Part of the reason that I don't believe in torture is that I don't believe the assumption (thankfully at least made explicit here) that there is a 75% chance of torture producing a "success."

Incidentally, I also don't believe that a person could ever really be "certain" that pushing a fat man over a bridge onto a set of train tracks could really be used to guarantee the train stopping before it hit five more people. With that in mind, I think balking at that action is entirely morally appropriate. What if you pushed him and missed the tracks? What if the train would have been able to stop anyway? This is entirely different from switching tracks, about which you have a reasonable expectation of the consequences.

This doesn't even get into whether ethical systems should be consistent, or whether it's possible to have a totally consistent ethical system that didn't seem completely alien to a person. Maybe ethical rules are actually heuristics, not statements of logical fact, and so we shouldn't be surprised when applying different reasonable ethical rules would lead to different decisions.
posted by en forme de poire at 11:29 AM on November 24, 2010


From the website:
[...] on at least one occasion you have responded that it would be right to end the life of one person to save the lives of some other greater number of people. It is strange then that you do not think that torture is ever justified

I question the qualifications of this philosopher.
posted by auto-correct at 11:29 AM on November 24, 2010


Not defending this particular site, but the blanket dismissal of contrived thought experiments misses the point. They definitely have their problems, especially if you try to map them onto non-contrived reality. But the point of a good contrived thought experiment is not to recreate a realistic situation, but rather precisely to create a situation so extreme that it brings your basic ethical intuitions into the spotlight and highlights contradictions between them. The whole point of the exercise is that real situations are usually not extreme enough to make these contradictions so visible. Of course, you can argue that there is nothing wrong with one person holding contradictory ethical intuitions; that's a separate point.
posted by game warden to the events rhino at 11:40 AM on November 24, 2010 [13 favorites]


After the painful 'talking with God' one, I'm pretty sure this philosopher took his classes from Andrew Schlafly...
posted by CitoyenK at 11:40 AM on November 24, 2010


Also, WTF:

This argument is highly suspect. In particular, it seems to commit us to the view that Body-Person A and Body-Person B in Scenario 2 are both, in some sense at least, simultaneously two separate people. Thus, for example, if you somehow remain partly Body-Person A, even though your thoughts and memories have been transferred into Body-Person B, it would seem to follow that the original occupant of Body-Person B (Person B) must somehow remain at least partly Body-Person B, even though their thoughts and memories now only exist in Body-Person A. If this is true, then it follows you are sharing Body-Person B with its original occupant, which of course leads to a whole series of highly implausible possibilities to do with how this would be experienced, executive control, etc, etc.

"Implausible possibilities"? When your thoughts and memories are now inside a person with different genetics, neurological structure, gender, etc., it doesn't seem that implausible to me that in some sense you are "sharing" the body with the original occupant. This isn't inconsistent with the view that the brain is the seat of consciousness, it just acknowledges that the brain's activity is also modulated by other systems of the body.

The whole site is just mind-blowingly condescending.
posted by en forme de poire at 11:40 AM on November 24, 2010 [2 favorites]


The whole point of the exercise is that real situations are usually not extreme enough to make these contradictions so visible.

My problem with these contrived examples is that by making the situations so extreme and alien, they introduct experimental artifacts that don't actually tell you about the consistency of the moral positions people hold, as much as they tell you about the hidden assumptions that you've covertly and perhaps unwittingly introduced into the scenario.
posted by en forme de poire at 11:43 AM on November 24, 2010 [2 favorites]


P.S., "introduct" is totally a word, meaning "introduce"
posted by en forme de poire at 11:44 AM on November 24, 2010 [1 favorite]


P.S., "introduct" is totally a word, meaning "introduce"

Then why not just use "introduce"? Or are you a conceptual artist?
posted by philip-random at 11:51 AM on November 24, 2010


I balked at "maximize the sum total of human happiness". Suppose I could kill a guy (his resulting happiness, 0%), and make 5 other people deliriously happy (100%). [Let's say due to an oddball will they get everything they ever wanted if the guy dies in the next five minutes.] Whereas ordinarily they're all basically grumpy people whose happiness averages to about 40%. If I do nothing, the group total is 240% of the maximum possible happy-units. If I kill the guy that goes to 500%. Obviously if my goal is to maximize the sum total of human happiness this guy gotta get whacked.
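A toy sketch of that arithmetic, for the spreadsheet-minded (all numbers are the hypothetical ones above, happiness measured in percent per person):

```python
# Toy utilitarian calculus for the scenario above (all numbers hypothetical).

def total_happiness(levels):
    """Sum of per-person happiness percentages."""
    return sum(levels)

do_nothing = [40] * 6            # six grumpy people at 40% each
whack_the_guy = [0] + [100] * 5  # one dead (0%), five deliriously happy

print(total_happiness(do_nothing))     # 240
print(total_happiness(whack_the_guy))  # 500
```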
posted by George_Spiggott at 11:52 AM on November 24, 2010


Come to think of it, the moral issue there has been explored before in stories like Shirley Jackson's "The Lottery" and Ursula le Guin's "The Ones Who Walk Away from Omelas".
posted by George_Spiggott at 11:55 AM on November 24, 2010 [1 favorite]


I answered that torture is always morally wrong but then later on said I'd happily torture the fat dude (just a thought experiment - don't have a shit fit about it, kids). It said I was inconsistent. I don't believe that's inconsistent; I just understand that if I did torture someone I'd be taking part in actions that were morally wrong.

I certainly think that if you're going to do something horrible to someone then you need to accept that what you're doing is an awful terrible thing and deal with the consequences.
posted by longbaugh at 11:58 AM on November 24, 2010 [4 favorites]


any more than a hot dog is a dog

"is a dog" vs. "made from dog": semantic quibbling.

And I've been saying "will kill rush limbaugh for food", for years now.
posted by nomisxid at 12:01 PM on November 24, 2010


The 'you're about to be tortured' one is very poorly designed. It uses 'you' to mean 'your body' in the set-up, then switches to meaning 'your mind in another person's body' in the second section, then accuses you of being confused.

"Only Who Can Prevent Forest Fires? You Pressed YOU, Meaning Me. That Is Incorrect. The Correct Answer Is ME, Meaning You."
posted by Fuzzy Monster at 12:02 PM on November 24, 2010 [2 favorites]


Longbaugh's answer is the correct one for the Jack Bauer scenario that our leaders substitute for thought. If it's conceivable that torturing a guy could save millions of lives, does that mean we give generals the authority to torture? No, in that case, if the general's so sure, he can break the law and face the consequences. If it turns out his choice was deemed necessary, he can be pardoned or have his sentence commuted or whatever. The ability to imagine a Jack Bauer scenario is not a basis for changing the law.
posted by George_Spiggott at 12:02 PM on November 24, 2010 [8 favorites]


Then why not just use "introduce"? Or are you a conceptual artist?
That was supposed to be a joke. "Introduct" was a typo.
posted by en forme de poire at 12:04 PM on November 24, 2010 [1 favorite]


I can't find a way to give a series of answers that include a blanket opposition to torture that also avoid the site suggesting that I rethink my opposition to torture. This is very frustrating. It makes me unhappy.

I wouldn't worry about it. I actually stopped at the preliminary "is torture always always always wrong" question and thought about it for a good bit. Eventually I decided that, although I could think of no plausible real-world situation in which I would find torture to be moral, I could imagine a bizarre perfect storm (more like, perfectly implausible storm) of conditions in which I would find torture to be acceptable, so I had to check the "no, torture is not always always always wrong" answer. That doesn't mean I find the torture of real-life Gitmo detainees to be moral, though.

Big surprise [HAMBURGER] when I was subsequently presented with just such a perfectly-implausible-storm situation in one of the questions, but at least I was consistent. On that one.

Not on the first one, where I'd pull the switch and kill one person instead of five even though I had said it's wrong to kill someone where it could be avoided. Which the site said was inconsistent. However, I reject the premise that allowing someone to die through inaction is any less "causing" a death than killing someone through action is. (I'm aware that many people, including some noted philosophers, have argued that it is, but I do not agree with them. I'm also aware that my belief leads to other uncomfortable questions, such as how I can justify buying a shiny new HDTV when I could instead have donated the money to build wells to provide clean drinking water to third-world villages which would not otherwise have such access, which I don't have a good answer for.)
posted by DevilsAdvocate at 12:06 PM on November 24, 2010 [4 favorites]


One issue here is that most people use "X is always wrong" and the like to mean "I am convinced that X will be wrong in every situation I ever end up in."

Philosophers sometimes like to think about situations that will never come up. That's fine, if you're into that sort of thing. So when a philosopher hears "X is always wrong" they hear a stronger claim: roughly, "X would be wrong in any logically consistent situation — even situations that I am convinced are totally impossible in the real world."

A lot of these experiments involve situations like that. When most people say "Torture is always wrong," they're not saying it would be wrong to torture some hypothetical freak of psychology who was certain to give true, relevant, helpful information under torture. What they're saying is, "I am convinced that no human being could be relied on to behave that way under torture." A philosopher would phrase that differently ("Torture isn't always wrong; but because of these facts about human psychology, I don't ever expect to see a situation where it's justified in the actual world") and that creates confusion and frustration.

That said, this sort of thought experiment can be useful when you're all speaking the same language. It's a pity that most people's exposure to them is in setups like these where there's a fundamental gap in communication.
posted by nebulawindphone at 12:09 PM on November 24, 2010 [2 favorites]


I passed the Whose Body Is It one with no inconsistencies. But that's because I don't feel emotions. Strangely enough I don't feel bad about that.
posted by Splunge at 12:13 PM on November 24, 2010


I think I've finally outgrown utilitarianism. Nobody can have perfect information about any scenario, especially not about the real value of a human life. Everybody is guided more by emotion and social norms than hard logic, anyway. I would push a thousand people onto the tracks to save my own life. Whether I just pushed a button that sent their train into a wall or had to physically drag each one to their death, I would do it. I am an emotional, irrational human being with a bunch of inconsistent and illogical views about what is right and wrong, and I have an unbelievably high subjective value of my own life.
posted by tehloki at 12:14 PM on November 24, 2010 [1 favorite]


Who cares about this philosophy shit, I won a Ferrari!
posted by desjardins at 12:15 PM on November 24, 2010 [1 favorite]


So the point of this game is to give answers that won't cause contradictions in the purposefully ridiculous scenarios the game designers made up. Right? Yay, I winned!
posted by Huck500 at 12:17 PM on November 24, 2010 [1 favorite]


Just as you should never trust a skinny chef, you should never trust a fat saboteur.

I mean, really: in all the action thriller/spy adventure movies I've ever seen, I can't think of a single bomb-planting, train-derailing terrorist who wasn't either pale and svelte with a posh English accent, or dun-colored and wiry with a beard like Cat Stevens.

I elected not to push tubby off the overpass because he's an obvious patsy (fatsy?).

The real baddie is watching events unfold via closed circuit TV monitors from an as yet undisclosed location miles away. We won't actually meet face-to-face for at least another two reels. Then, it's go time!
posted by Atom Eyes at 12:20 PM on November 24, 2010 [2 favorites]


"You know what I did to Jehoram of Judah, don't you? Yeah, I made his intestines fall out."

Yeah, I heard. You're not right in the head, you know?
posted by Splunge at 12:23 PM on November 24, 2010 [1 favorite]


My problem with these contrived examples is that by making the situations so extreme and alien, they introduct experimental artifacts that don't actually tell you about the consistency of the moral positions people hold, as much as they tell you about the hidden assumptions that you've covertly and perhaps unwittingly introduced into the scenario.

FWIW I definitely agree this often happens, just wanted to push back against the apparent point being made that "unrealisticness" by definition renders these kinds of thought experiments worthless.

(And, of course, even if all they do is prompt you to formulate explicit, well-articulated reasons for why you think they're crap that can still be philosophically useful.)
posted by game warden to the events rhino at 12:26 PM on November 24, 2010 [1 favorite]


Fat saboteur.
posted by furiousxgeorge at 12:27 PM on November 24, 2010


I think the main problem for the "You're Being Tortured In The Morning" scenario is that if you hold the opinion that your identity is based on your memories and thoughts, there is no way you could be "reassured" by learning that a procedure before the torture will result in your identity being wholly eliminated. At the end, they give the argument that you should feel reassured because you're not getting tortured anymore, but it's hard to feel reassured when you're told you're going to die.
posted by demiurge at 12:33 PM on November 24, 2010 [1 favorite]


hey lookit me although there is some tension in my responses my position on abortion seems generally coherent and well thought out (which, of course, is not the same as saying it is right)

plus i hate people-plants, all getting in my house and growing into human children that don't carry my genes, upsetting the dogs and eating me out of house and home n shit

who wants some tacos
posted by infinitewindow at 12:41 PM on November 24, 2010


Why does the man have to be fat? Why can't he just be incredibly dense?
posted by [citation needed] at 12:44 PM on November 24, 2010 [1 favorite]


Bart and Lisa in 36 seconds. Also not a philosophy exercise, but a good exercise in data analysis. Plus, I learned about Simpson's paradox.
posted by kuujjuarapik at 12:48 PM on November 24, 2010


any more than a hot dog is a dog

FWIW, I saw a sign today that said "HOT DOGS -- THE BEST IN TWON" [sic]. True story.
posted by goodnewsfortheinsane at 12:50 PM on November 24, 2010


Question 1: Torture, as a matter of principle, is always morally wrong.
I can't answer that question. Torture is repugnant to me. But I'm not an atomically consistent whole. I might be in a situation where I secretly feel a bit of sympathy for torture while still finding it repugnant as well.
Principles only come in at the point where I create rules to guide my behaviour so as to fulfill a social contract. So that people will be able to know what to expect from me and to trust me and I them in return.
Similarly a society creates principles, rules, laws to institutionalise a form of quid pro quo social contract.
Obviously society is not an atomically consistent whole either.
The morals we uphold are the result of trade offs between opposing values, middle grounds between opposing forces, choices about cut off points. Within ourselves, within society and among societies.

The question seems to have a model of morality that is similar to propositional logic; axioms, derived implications, tautologies and paradoxes.
I think that model is flawed, and that the questions are blatantly trying to lead the respondent into a trap: a conclusion of either consistency coupled with an unintended outcome, or inconsistency. Which I personally don't find an interesting outcome.

Here's one example of a personal inconsistency I experienced recently. I always had a rather abstract idea that abortion should be allowed in the society I live in. The basic structure of the trade off I saw was that 1. we hold that people shouldn't be killed, that every life is in that respect of ultimate value 2. unwanted pregnancies apparently can sometimes cause a lot of undesirable societal effects 3. there's no point where an outgrowth of cells in the uterus suddenly changes into a person with an inviolable life, so society has decided by approximate heuristics and consensus where to place the cut off point. So the law that states that abortion up to a certain gestation time is legal is probably wise.
What changed for me, and has made me a bit inconsistent over time, were two things: I had a daughter recently, and I read that medical possibilities and NICUs have advanced so far that the latest age at which abortion can legally be performed and the youngest age at which a neonatal baby has a quite good chance of survival are getting very close, weeks apart.
So the strength of some of my feelings changed, my empathy for the situation changed and I learned some new information. I've considered whether I should change my convictions, my moral principles, about abortion somewhat. Maybe I will maybe I won't.

My point is that I think we reverse engineer moral principles from the multitude of outcomes we and others have observed, the value that we attach to these outcomes and our analysis of the interplay between the causes of these outcomes. Not the reverse.
This is more a matter of feelings of empathy and of self interest, of knowledge of how society works, how people work. A matter of experience and wisdom and political compromise.
A process of a multitude of changing and approximate forces settling on some middle ground.
Not a matter of timeless context-less crystalline truth.
posted by joost de vries at 12:52 PM on November 24, 2010 [1 favorite]


W^
posted by clavdivs at 1:00 PM on November 24, 2010 [1 favorite]


My critique of "You're Being Tortured in the Morning" scenario:

If you choose to have Body-Person A tortured, the site assumes it's because you think "you" are Body-Person B and want to avoid torture, and chides you for that belief. It completely ignores the possibility that you understand that "you" are still Body-Person A (with someone else's memories), but find the moral choice to be taking the suffering on yourself rather than choosing to have it inflicted on another innocent person.
posted by DevilsAdvocate at 1:02 PM on November 24, 2010 [3 favorites]


A large part of the annoyance I get from this site (as someone with a philosophy degree) seems to be its unbelievably naive view of deontological ethics.

For example: it's possible to believe that killing is wrong, it's possible to believe that killing to maximize other people's happiness is wrong, and it's possible to believe that, given a choice between saving one life and saving five lives, it would be better to save the five. This is not a contradiction if you have a system which can handle the idea of conflicting duties and ways of resolving those conflicts; it's only a contradiction if you're the sort of straw-man absolutist the site (and certain utilitarians, not gonna speculate on the obvious leanings there) tries to set you up to be.
posted by ubernostrum at 1:03 PM on November 24, 2010 [8 favorites]


The fault here is the assumption that torture actually works for anything other than getting false confessions.
posted by empath at 1:26 PM on November 24, 2010 [3 favorites]


Consider the following case:

On Twin Earth, a brain in a vat is at the wheel of a runaway trolley. There are only two options that the brain can take: the right side of the fork in the track or the left side of the fork. There is no way in sight of derailing or stopping the trolley and the brain is aware of this, for the brain knows trolleys. The brain is causally hooked up to the trolley such that the brain can determine the course which the trolley will take.

On the right side of the track there is a single railroad worker, Jones, who will definitely be killed if the brain steers the trolley to the right. If the railman on the right lives, he will go on to kill five men for the sake of killing them, but in doing so will inadvertently save the lives of thirty orphans (one of the five men he will kill is planning to destroy a bridge that the orphans' bus will be crossing later that night). One of the orphans that will be killed would have grown up to become a tyrant who would make good utilitarian men do bad things. Another of the orphans would grow up to become G.E.M. Anscombe, while a third would invent the pop-top can.

If the brain in the vat chooses the left side of the track, the trolley will definitely hit and kill a railman on the left side of the track, "Leftie" and will hit and destroy ten beating hearts on the track that could (and would) have been transplanted into ten patients in the local hospital that will die without donor hearts. These are the only hearts available, and the brain is aware of this, for the brain knows hearts. If the railman on the left side of the track lives, he too will kill five men, in fact the same five that the railman on the right would kill. However, "Leftie" will kill the five as an unintended consequence of saving ten men: he will inadvertently kill the five men rushing the ten hearts to the local hospital for transplantation. A further result of "Leftie's" act would be that the busload of orphans will be spared. Among the five men killed by "Leftie" are both the man responsible for putting the brain at the controls of the trolley, and the author of this example. If the ten hearts and "Leftie" are killed by the trolley, the ten prospective heart-transplant patients will die and their kidneys will be used to save the lives of twenty kidney-transplant patients, one of whom will grow up to cure cancer, and one of whom will grow up to be Hitler. There are other kidneys and dialysis machines available, however the brain does not know kidneys, and this is not a factor.

Assume that the brain's choice, whatever it turns out to be, will serve as an example to other brains-in-vats and so the effects of his decision will be amplified. Also assume that if the brain chooses the right side of the fork, an unjust war free of war crimes will ensue, while if the brain chooses the left fork, a just war fraught with war crimes will result. Furthermore, there is an intermittently active Cartesian demon deceiving the brain in such a manner that the brain is never sure if it is being deceived.

QUESTION: What should the brain do?

[ALTERNATIVE EXAMPLE: Same as above, except the brain has had a commisurotomy, and the left half of the brain is a consequentialist and the right side is an absolutist.]
posted by ymgve at 1:33 PM on November 24, 2010 [14 favorites]


DRINK THE VAT QUICKLY
posted by ook at 1:41 PM on November 24, 2010


Ah, utilitarianism, making decisions with data that cannot be measured. So practical.
posted by GuyZero at 1:49 PM on November 24, 2010 [1 favorite]


QUESTION: What should the brain do?

Go right, obviously. Brains in vats are notoriously evil.

While war crimes are an attractive inducement, the prospect of an unjust war pleasantly crinkles the corpus callosum. Also, the brain needs to set a good example... strike a horrible warning into the medulla oblongata of young Eddorians.
posted by bonehead at 1:58 PM on November 24, 2010


Also, the questions ignore the possibility that you might consider something morally wrong but do it anyway. For example, I imagine a lot of people go through with abortions while thinking that they're doing something more or less morally wrong. I'm pretty sure the guys who ate the cabin boy didn't feel too good about it. Just because you are forced into a decision by circumstances doesn't make it morally correct. It makes it deeply ambiguous.
posted by unSane at 2:02 PM on November 24, 2010


Torture is virtually useless for providing useful data.

The function of torture is to provide a sense of moral justification for the interrogator's power over the detainee.
posted by ovvl at 2:24 PM on November 24, 2010




the main problem for the "You're Being Tortured In The Morning" scenario is that if you hold the opinion that your identity is based on your memories and thoughts, there is no way you could be "reassured" by learning that a procedure before the torture will result in your identity being wholly eliminated

Exactly. The first fact is reassuring because you don't have to experience the anticipation of torture. The other facts are less reassuring, because they actually affect your identity (even just on the second fact: you might think that you'd gain strength from your memories, by thinking of a loved one, say. And that those memories would help you deal with the torture).
posted by Infinite Jest at 2:34 PM on November 24, 2010


QUESTION: What should the brain do?

Maybe instead of pontificating we should just call a brain in a vat and ask it.
posted by en forme de poire at 2:51 PM on November 24, 2010


I enjoyed "Elementary, Dear Wason." I had seen the problem in the abstract before, and I have some familiarity with formal logic, so I managed to get all three right. What was striking to me, though, and what I hadn't seen before, was the difference between the abstract examples (1 and 2) and the real-world example (3). I had to think a few seconds on 1 and 2 to make sure I was doing it right, but I got #3 instantly. Likewise, on the overall results, only 23% (as of this writing) answered #1 and #2 correctly, but 61% got #3 correct, despite the fact that all three are isomorphic. Fascinating.
posted by DevilsAdvocate at 2:54 PM on November 24, 2010
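The abstract version of the puzzle ("if a card has a circle on one side, it has yellow on the other") reduces to checking which hidden sides could falsify the rule. A toy sketch of that logic, with made-up card faces and a hypothetical `must_flip` helper (this is an illustration, not the site's code):

```python
# Toy model of the Wason selection task discussed above.
# Rule: "if a card has a circle on one side, the other side is yellow."
# Four cards show: circle, square, yellow, red. Flip only the cards
# whose hidden side could falsify the rule.

def must_flip(visible_face):
    if visible_face == "circle":  # hidden side might fail to be yellow
        return True
    if visible_face == "red":     # hidden side might carry a circle
        return True
    # "square": the rule says nothing about squares
    # "yellow": the rule does not require yellow cards to carry circles
    return False

cards = ["circle", "square", "yellow", "red"]
print([c for c in cards if must_flip(c)])  # ['circle', 'red']
```

The common mistake, which the vote percentages suggest, is flipping the yellow card (affirming the consequent) while skipping the red one.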


Go right, obviously. Brains in vats are notoriusly evil.

Anything that comes from a vat is evil. This is why I don't eat Quorn.
posted by nebulawindphone at 3:08 PM on November 24, 2010 [1 favorite]


Well, I went through a number of the experiments and all I'm left with now is a terrible fear of seed people.
posted by orme at 3:18 PM on November 24, 2010


The fault here is the assumption that torture actually works for anything other than getting false confessions.

Ah, but as long as we're dealing with bizarrely improbable Hollywood-like scenarios anyway, you can re-establish the dilemma by removing the need to extract information at all. For example, a Saw-like villain has trapped you and another person in a room; you are free to move about the room, which contains various instruments of torture, but the other person is strapped helplessly to a table. The villain, watching over CCTV, informs you that he has planted a nuclear weapon in a major city, and will detonate it, killing millions, unless you torture the other person. Now it's irrelevant whether torture is an effective means of extracting information, but the question of whether torture is sometimes1 moral remains.

1For values of "sometimes" which are "effectively never, but not physically impossible."
posted by DevilsAdvocate at 3:21 PM on November 24, 2010


"Torture, as a matter of principle, is always morally wrong."

then...

"Your response that the fat man should be tortured is in direct contradiction with your earlier claim that torture is always wrong."

This test is broken. But then, the idea that a fat man's body could stop a train makes it pretty impossible to take seriously in the first place...
posted by Chuckles at 3:24 PM on November 24, 2010


If you choose to have Body-Person A tortured, the site assumes it's because you think "you" are Body-Person B and want to avoid torture, and chides you for that belief. It completely ignores the possibility that you understand that "you" are still Body-Person A (with someone else's memories), but find the moral choice to be taking the suffering on yourself rather than choosing to have it inflicted on another innocent person.

It ignores that possibility because it straight up tells you to answer as if you wanted to avoid the torture yourself.
posted by ODiV at 3:25 PM on November 24, 2010


If a card has a circle on one side, then it has the colour yellow on the other side.

For some reason, I don't parse this as if and only if. I got all three "wrong" consistently because of that.
posted by Chuckles at 3:35 PM on November 24, 2010


Actually on further thought, I'm not sure that the writer(s) of the tests have a firm grasp of the concept of morality. Or I don't. I think that the few tests that I took actually pushed me towards moral nihilism. Was that their point? If so, well played. ::golf clap:: Well played.
posted by Splunge at 3:52 PM on November 24, 2010


I balked at "maximize the sum total of human happiness". Suppose I could kill a guy (his resulting happiness, 0%), and make 5 other people deliriously happy (100%). [Let's say due to an oddball will they get everything they ever wanted if the guy dies in the next five minutes.] Whereas ordinarily they're all basically grumpy people whose happiness averages to about 40%. If I do nothing, the group total is 240% of the maximum possible happy-units. If I kill the guy that goes to 500%. Obviously if my goal is to maximize the sum total of human happiness this guy gotta get whacked.

If you're dead, you have no potential happiness. Even if the five sociopaths will be having sex on ecstasy forever in a space bubble over Jupiter as they each write the Great American Novel whose publication will cure world hunger, they have a finite amount of happiness they are capable of experiencing. Happiness does not exist any more for the guy that got whacked, it cannot be measured as such. All I'm trying to say is that death and total happiness are not the extremes of a scale.
posted by seagull.apollo at 4:12 PM on November 24, 2010 [1 favorite]
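The sum-total bookkeeping in the comment above does work out as claimed; here it is as a toy calculation (purely illustrative, percentages as in the comment):

```python
# Toy arithmetic for the "maximize the sum total of happiness" objection.
# Six grumpy people at 40% each, versus one at 0% (whacked) and five at 100%.

def total_happiness(levels):
    return sum(levels)

status_quo = total_happiness([40] * 6)             # six grumpy people
after_whacking = total_happiness([0] + [100] * 5)  # one dead, five delirious

print(status_quo, after_whacking)  # 240 500
```

Which is the commenter's point: once a corpse scores a flat 0% on the same scale as the living, the "sum" is comparing quantities that arguably don't belong on one axis.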


It ignores that possibility because it straight up tells you to answer as if you wanted to avoid the torture yourself.

Ah, fair enough. I didn't read carefully enough, apparently.
posted by DevilsAdvocate at 4:18 PM on November 24, 2010


For some reason, I don't parse this as if and only if.

What? Neither does the site. Their answers are consistent with "if" meaning "if" in the sense it's commonly used in formal logic, which is not "if and only if." Did you mean you did parse "if" as "if and only if?" In which case you would have had to turn over all four cards to verify that the rule (as you understood it) was being followed.
posted by DevilsAdvocate at 4:23 PM on November 24, 2010


The villain, watching over CCTV, informs you that he has planted a nuclear weapon in a major city, and will detonate it, killing millions, unless you torture the other person. Now it's irrelevant whether torture is an effective means of extracting information, but the question of whether torture is sometimes moral remains.

You're still asking people to make a lot of implausible practical assumptions here. They'd have to assume that they somehow knew the killer's promise to be reliable, and knew the bomb wasn't a hoax. They'd also have to assume that the victim was totally unwilling to be a hero about it — that even with complete information, he'd never choose a bit of pain in order to save millions of lives. (After all, if they assume that the victim consents — or even that he would consent were he given the chance — then the moral analogy to nonconsensual torture in the real world starts to break down.)

Someone who believes (quite plausibly) that no promise is perfectly reliable, no threat is perfectly credible, and most people are basically altruistic, can consistently say "I'd torture the guy in that scenario" and "Because that scenario is impossible, torture is still always wrong."
posted by nebulawindphone at 4:33 PM on November 24, 2010


Two years ago, I received an email inviting me to a local rock venue for an evening of edgy rants on experimental philosophy. They presented it like it was this brand new thing, too hot for the stuffy halls of academia, but perfectly suited for a smoky ill-lit concert hall.

I thought it was really funny that they sent the invite around to the psych department, because, uh, we've been doing experimental philosophy for almost two centuries.

Fuck trolleyology
posted by solipsophistocracy at 4:53 PM on November 24, 2010


What? Neither does the site.

Ya, you are right. sort of :)
If a card has a circle on one side, then it has the colour yellow on the other side.
The way I react to the statement is, I do not have to go on a fishing expedition to look for rule violations. Just do the literal: see circle, check for yellow. That's it. Done.

I keep typing "I can see how that is wrong." Then going back and deleting it, because frankly it just has me confused. To me the statement is not equivalent to "If a card has the colour yellow on one side, it has a circle on the other." Was there something in the preamble I'm missing?

And going back to look again, I can't see anything wrong with my reading. The statement does not require a circle under yellow, it only requires yellow under a circle.
posted by Chuckles at 5:02 PM on November 24, 2010


And now I see my mistake :)

You have to check the red card for the circle, and I didn't even read their answer key properly. I don't know why this question confuses me so much...
posted by Chuckles at 5:16 PM on November 24, 2010 [1 favorite]


They'd have to assume that they somehow knew the killer's promise to be reliable, and knew the bomb wasn't a hoax.

OK then, let's remove the killer's promise entirely, and set up a Rube Goldberg-esque device. Also in the room with you are six people strapped, immobile, in chairs. Each has a bow and arrow1 aimed directly at one of their eyes, the tip of the arrow just a millimeter away from their eyeball, aimed in such a way that the arrow will go through the eye and into the brain if fired, leading to almost certain death. The bows are all connected to a mechanical timer, such that they will fire if time expires and no other action is taken. The outside of the timer is transparent so you can see all the inner mechanisms of the timer and verify that it works exactly as it initially appears, but you cannot access the timer, nor the bows and arrows, nor the people in chairs, etc.

There is one way you can stop the arrows from firing: there is a string which, if cut, will trip a mechanism which stops the timer. Most of the string is inaccessible, also encased in plexiglass or whatever, except for one small segment, underneath a small hole in the table upon which our original person is lying. Even there, the string is not accessible from the sides or below (also surrounded by plexiglass). And the hole in the top of the table is blocked by the person's leg which, again, is held immobile.

Above the hole is a knife, attached to a guide such that you can move the knife only in one dimension: plunging the blade through the person's upper leg and cutting the string, stopping the timer, and preventing the arrows from firing, and leaving the person with a painful but probably non-fatal wound. The leg is not perfectly centered over the hole (though still completely blocking it), so you can be sure the knife will not hit the femur and be stopped by the bone as it goes through the leg. It is not possible to remove the knife from the guide, at least not before the timer goes off.

They'd also have to assume that the victim was totally unwilling to be a hero about it — that even with complete information, he'd never choose a bit of pain in order to save millions of lives. (After all, if they assume that the victim consents — or even that he would consent were he given the chance — then the moral analogy to nonconsensual torture in the real world starts to break down.)

The victim is fully conscious and aware of what is going on. As you reach towards the knife (not asking him whether he wants you to use it, for fear that he might say "no" and invalidate your precious assumption that he would consent if you asked him), he realizes exactly what you mean to do and yells "GODDAMMIT DON'T YOU DARE SHOVE THAT KNIFE THROUGH MY LEG I'D RATHER THOSE SIX PEOPLE OVER THERE DIE!"

Someone who believes (quite plausibly) that no promise is perfectly reliable, no threat is perfectly credible, and most people are basically altruistic, can consistently say "I'd torture the guy in that scenario" and "Because that scenario is impossible, torture is still always wrong."

Well, I've now removed any spoken promise or threat, and established that the potential torture victim is decidedly not altruistic. (As an aside, the insistence on a perfectly reliable promise or threat seems odd. Do you mean to suggest that if it were 100% certain that torture would save lives, it would be acceptable, but no such situation is possible, and if it were only 99% certain to save lives it would be unacceptable? Which would not be an inconsistent position, I suppose, but I would find it a bizarre one.)

And yes, I realize this is getting pretty silly, but in my defense I note the original scenario is fairly silly too.

1I thought about using guns rather than bows and arrows here, which might be a more certain means of death, but I wanted to avoid the "How do I know it's loaded? How do I know the bullets aren't blanks?" objections.
posted by DevilsAdvocate at 5:37 PM on November 24, 2010


I have always been an advocate of "thinking outside the box," as trite as that is. Tests like these limit the parameters. If it was something like the Blade Runner VK tests, searching for specific responses from robots that are programmed, that would be one thing.

But asking human beings questions after boxing them in with specific things that they HAVE TO consider... Well it doesn't seem right.

I don't agree with a lot of the premises that the tests start with. It invalidates the final conclusions if I have to "pick the best one, even though". You're skewing the results in the premise.

Might as well have the test be a group of questions where there are two answers, one lets you continue, the other boots you out as useless.

I would be useless guy.
posted by Splunge at 5:38 PM on November 24, 2010


DevilsAdvocate: And yes, I realize this is getting pretty silly, but in my defense I note the original scenario is fairly silly too.

Then what's the point? Of course it's possible to construct a "thought experiment" that's so convoluted it forces an infinitely complicated question into a y/n format. And... so? What is that supposed to illuminate or demonstrate?
posted by dogrose at 5:54 PM on November 24, 2010


Of course it's possible to construct a "thought experiment" that's so convoluted it forces an infinitely complicated question into a y/n format. And... so? What is that supposed to illuminate or demonstrate?

Nothing, from a practical standpoint. Pretty much everyone in this thread (at least among those who have weighed in on the issue), myself included, are agreed that torture is immoral under all plausible scenarios. What is under debate is whether torture might be moral under situations which are highly implausible, but neither unimaginable nor physically impossible. Again, it has no practical bearing on anything, but I (and apparently at least a few others) find it an interesting question all the same. If the question doesn't interest you, you're free to ignore it.
posted by DevilsAdvocate at 6:03 PM on November 24, 2010


Torture question seems to assume that if you think something is morally wrong then you can't do it in a given situation and still be consistent. I guess the straw man has never heard of the lesser of two evils
posted by bonaldi at 6:39 PM on November 24, 2010


That's an interesting take, bonaldi, but it's not what I would understand "morally wrong" to mean. If I were in a situation in which I had only two options, both leading to undesirable outcomes but one clearly more undesirable than the other, I would not consider the option leading to the less undesirable outcome to be "morally wrong."

But all the same it's interesting to consider that "morally wrong" may be an ambiguous term even aside from the question of the specific morals involved. That two people may both say that "torture is always morally wrong" yet mean different things (and not just because of the ambiguity of "always" as nebulawindphone neatly demonstrated).
posted by DevilsAdvocate at 7:00 PM on November 24, 2010


The Monty Hall Problem is not an experiment in philosophy.

No it isn't, and the computer cheated me (I chose door 1 every time).
posted by clorox at 7:16 PM on November 24, 2010
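For what it's worth, a quick simulation backs the standard result rather than the cheating hypothesis. A sketch (door numbering and the always-pick-door-1 strategy are illustrative, as above):

```python
import random

# Monte Carlo sketch of the Monty Hall game: always pick door 1,
# then either stick or switch after the host opens a goat door.

def play(switch, trials=100_000, seed=0):
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        prize = rng.randrange(3)
        pick = 0  # always choose door 1
        # host opens a door that is neither the pick nor the prize
        opened = next(d for d in range(3) if d != pick and d != prize)
        if switch:
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == prize)
    return wins / trials

print(round(play(switch=False), 2))  # about 0.33
print(round(play(switch=True), 2))   # about 0.67
```

Sticking with door 1 wins roughly a third of the time, so losing repeatedly isn't evidence of a rigged game, just of the expected odds.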


DevilsAdvocate: Nothing, from a practical standpoint.

Oh, so you're thinking of it as a freshman-dorm debate, where everything's theoretical and no one really, like, gets hurt.

Totally fine.

Except that people who represented me tortured actual, living people despite the fact that there were no arrows waiting to pierce eyeballs like grapes, no ticking time bombs, nothing at stake except rage and revenge.

Any "thought experiment" that forces people to choose torture as a rational/ moral/ honorable response is a way to normalize vicious, dehumanizing behavior.
posted by dogrose at 7:46 PM on November 24, 2010 [1 favorite]


DevilsAdvocate, your scenario just illustrated the moral necessity of never going anywhere without a Leatherman, duct tape, and a Sawzall.
posted by humanfont at 7:47 PM on November 24, 2010


That's an interesting take, bonaldi, but it's not what I would understand "morally wrong" to mean. If I were in a situation in which I had only two options, both leading to undesirable outcomes but one clearly more undesirable than the other, I would not consider the option leading to the less undesirable outcome to be "morally wrong."

I'm right with bonaldi on this one. It is the point I was trying to make before humiliating myself on the IQ portion of this quiz :)

See, I can understand resorting to torture in the face of a ticking bomb terrorist. That doesn't excuse one from facing the burden of having done something outrageously wrong. As I put it in the past:
I suggest that in such a fictitious scenario it would make sense for the relevant authorities to go right ahead and do the torturing. Once the immediate problem is resolved the authorities can write a complete account and confession, which will result in appropriate punishment. Those authorities can then sit in jail happily dreaming of all the lives they saved, and anticipating the well deserved heroes welcome they will receive upon release from jail.
posted by Chuckles at 8:00 PM on November 24, 2010


Any "thought experiment" that forces people to choose torture as a rational/ moral/ honorable response is way to normalize vicious, dehumanizing behavior.

I have explicitly and repeatedly said that torture is immoral in all plausible circumstances.

If there are people who want to rationalize the use of torture to themselves in real-world situations, they certainly don't need my help to do so. And if they attempt to twist my quite clear words on the subject to mean precisely the opposite of what I have said, it is they, not I, who are "normaliz[ing] vicious, dehumanizing behavior."
posted by DevilsAdvocate at 8:18 PM on November 24, 2010


And if anything, the absurdity of my examples above illustrates just how incredibly implausible a situation must be before torture would be moral. Your argument is analogous to "you shouldn't argue that killing in self-defense is rational/moral/honorable, because it is a way to normalize all killing."
posted by DevilsAdvocate at 8:26 PM on November 24, 2010


DevilsAdvocate: If there are people who want to rationalize the use of torture to themselves in real-world situations, they certainly don't need my help to do so. And if they attempt to twist my quite clear words on the subject to mean precisely the opposite of what I have said, it is they, not I, who are "normaliz[ing] vicious, dehumanizing behavior."

That whistling sound? That's the point, streaking over your head.

Yes, there are people—Americans, representing you and me—who torture other people in real-world situations. No, they don't need your help or mine to actually pulpify someone for no reason. And no one expects you to go out and pulpify anyone.

But "thought exercises" that force you to decide between slaughter v. torture make torture a legitimate choice. Not such a big deal. Not worth objecting to. So when you read about it, you're kinda meh. That's the point of the OP.

Language is powerful. You seem blind to that fact.
posted by dogrose at 9:44 PM on November 24, 2010


On the contrary, dogrose. I agree with you on at least part of your argument. I certainly agree that it is possible, in some cases, for the words of one person to lead to despicable acts by another. I also agree that if it was reasonably foreseeable by the speaker that their words would lead to such actions, the speaker shares in the moral accountability for those actions.

However, I do not believe that there is the slightest possibility in this particular instance that my thought experiments will lead to increased torture. If I did believe that, I would certainly agree it would have been immoral of me to post them; indeed, I would be writing to the mods right now, demanding that my comments be deleted, so as to prevent any more increase in torture beyond what they may have already caused.

Speaking of which, have you flagged my thought experiments, and written to the mods to request their deletion? Because if you truly believe that the presence of my thought experiments on this page might possibly—no matter how remote a possibility—lead to an increase in torture, then morally you ought to do what you reasonably can to have them removed. If you haven't, then at best you are engaging in hyperbole (you do not actually believe that my comments might lead to an increase in torture) and at worst you are a hypocrite (you do believe that my comments might lead to an increase in torture, yet you fail to act in such a way as to try to eliminate that possibility).

(Side note: this holds regardless of whether the supposed link between my comments and actual torture is direct or indirect, whether the causal chain between my thought experiments and actual torture is one link long or fifty.)

You seem to believe that my thought experiments are at the precipice of a slippery slope. But if any part of it is slippery, my examples are so absurd, so far removed from any real-world situation where torture would be considered, that they are akin to a flat, level, frozen pond; yet you will not let your children skate on that pond, because you have heard of another pond, thousands of miles away, which does have an icy slope of the sort you fear.

But "thought exercises" that force you to decide between slaughter v. torture make torture a legitimate choice.

No, no they don't. That whistling sound? That's my point, streaking over your head. If I note that jaywalking is less objectionable than murder, and thus jaywalking may be morally permissible if by doing so one may prevent a murder, do you really think that reading my argument will convince people to start jaywalking solely because they enjoy doing so, and that my argument has blinded them to the immorality of jaywalking in general?
posted by DevilsAdvocate at 11:12 PM on November 24, 2010


CAPSLOCK IS POWERFUL

YOU SEEM BLIND TO THIS FACT
posted by seagull.apollo at 1:24 AM on November 25, 2010


These aren't even problems about torture. The people who wrote the problems just used the worst thing they could think of, because the universal assumption is that TORTURE IS REALLY FUCKING BAD, DON'T DO IT. If you're just here to grind an axe about torture practices in the US Military, then you might as well keep steppin', you're in the wrong place.
posted by seagull.apollo at 1:32 AM on November 25, 2010


hey everybody look at me i just finished phil1001 and so i know there are like these unresolved tensions between utilitarianism and deontology and i totally made a website about it where you're always wrong especially if you're a deontologist you idiot ur such a coward
posted by obiwanwasabi at 2:45 AM on November 25, 2010


I think the annoyance is that the "But now you have to torture the evil man to save a million people" scenario -- the contrived, unrealistic one that everyone rolls their eyes at in this thread -- is actually the rationale used to justify real incidents of torture by our country's government. Like, in public discourse, in debates between presidential candidates, on CNN, etc.

What I find interesting in an unrelated way is the anger at being presented with thought exercises that "force" someone to choose the lesser of two evils when they believe that there are additional options available in "real life." It's worth noting that this is the response of many conservative opponents to sex education and condom promotion when presented with the "Give out condoms/teach safe sex, or pregnancies and abortions will rise" choice. One might be the "lesser" of two evils, they just believe there are additional options that make both unnecessary.
posted by verb at 7:03 AM on November 25, 2010 [1 favorite]


(Lack of) sex education is a real issue. Ticking bomb terrorist threats are not.
posted by ymgve at 7:04 PM on November 25, 2010


The Office:
"Would you steal a loaf of bread in order to feed your children?"
"They're not your children because you are a cuckold. The bread is poisoned."
posted by ovvl at 8:12 PM on November 25, 2010 [2 favorites]

