Lesser-Known Trolley Problem Variations
April 24, 2015 4:23 PM   Subscribe

Lesser-Known Trolley Problem Variations There’s an out of control trolley speeding towards Immanuel Kant. You have the ability to pull a lever and change the trolley’s path so it hits Jeremy Bentham instead. Jeremy Bentham clutches the only existing copy of Kant’s Groundwork of the Metaphysic of Morals. Kant holds the only existing copy of Bentham’s The Principles of Morals and Legislation. Both of them are shouting at you that they have recently started to reconsider their ethical stances.

Fred Clark: Yeah, maybe elaborately constructed and convoluted, semi-sadistic analogies aren’t always helpful for ethical reflection.

Background: doing vs. allowing harm
Previously: trolley problems and autonomous cars
Previously: could the body of a 184lb human stop a trolley?
posted by justsomebodythatyouusedtoknow (51 comments total) 26 users marked this as a favorite
 
Bentham left the better looking corpse...going with Kant.
posted by clavdivs at 4:35 PM on April 24, 2015 [3 favorites]


trolley’s path so it hits Jeremy Bentham instead.

I immediately started laughing. For these I'll need not just "moral calculus" but "moral differential equations" or perhaps "moral real analysis."
posted by the man of twists and turns at 4:36 PM on April 24, 2015 [10 favorites]


Is there a lever I can pull to never deal with the trolley problem again? I am not particularly concerned about what else it does.
posted by ckape at 4:36 PM on April 24, 2015 [18 favorites]


I assumed this meant a fully laden shopping trolley, with the one wonky wheel, on a sloped car park.

It took me less time to figure it out than it did the Little Boy horror, though.
posted by Mezentian at 4:39 PM on April 24, 2015 [3 favorites]


Western philosophy can be so silly sometimes.
posted by doctor_negative at 4:39 PM on April 24, 2015 [1 favorite]


Won't someone think of the trolley!
posted by blue_beetle at 4:40 PM on April 24, 2015 [7 favorites]


I assumed this meant a fully laden shopping trolley,

I always imagined it was a tea trolley, and someone might get a bit scalded if the urn turned over.

I'm not sure it's worth throwing someone off a bridge to stop that. Even if it's just a dental bridge. Or possibly a bridge hand. You definitely shouldn't bid 2 Spades over 1 No Trump if you only have 4 spades. You deserve to get scalded.
posted by GenjiandProust at 4:44 PM on April 24, 2015 [5 favorites]


Also, at the risk of clanging my own bell, what about...?
posted by GenjiandProust at 4:47 PM on April 24, 2015


at the risk of clanging my own bell,

I thought the bells on the trolley went "ding"?
posted by Mezentian at 4:49 PM on April 24, 2015 [3 favorites]


...on a somewhat unrelated note, I guess I'll go listen to that song that klangklangston reminds cortex of.
posted by halifix at 4:59 PM on April 24, 2015 [1 favorite]


In one of the Thursday Next books, she gets stuck in a hypothetical boat in an ethics seminar; it reminds me a bit of these.
posted by jeather at 5:01 PM on April 24, 2015 [1 favorite]


Advanced Trolley Problems (which I got via Scott Alexander's Tumblr, the man is a content machine), including the Trolley Hall Problem.
posted by pw201 at 5:02 PM on April 24, 2015 [16 favorites]


Can I consult with Smith and Jones?
posted by thelonius at 5:09 PM on April 24, 2015 [2 favorites]


I thought the bells on the trolley went "ding"?
Trolleys definitely go clang.
posted by adamrice at 5:16 PM on April 24, 2015 [1 favorite]


Maybe it's because these trolley problems are all in English, since your morals depend on your language (via):
here we report evidence that people using a foreign language make substantially more utilitarian decisions when faced with such moral dilemmas. We argue that this stems from the reduced emotional response elicited by the foreign language, consequently reducing the impact of intuitive emotional concerns. In general, we suggest that the increased psychological distance of using a foreign language induces utilitarianism. This shows that moral judgments can be heavily affected by an orthogonal property to moral principles, and importantly, one that is relevant to hundreds of millions of individuals on a daily basis.
All my problems with trolleys involve infrastructure maintenance.
posted by the man of twists and turns at 5:21 PM on April 24, 2015 [8 favorites]


basically (I may be stating the obvious here), you'd point the trolley at whichever philosopher's ethical stances you considered least in need of revision, with the possible proviso that if you really like the one you're killing, you'll have to be able to justify killing that philosopher in terms of their approach as stated.

also I think maybe the time traveler example might be more interesting if the second worker is the first worker ten minutes previously.
posted by You Can't Tip a Buick at 5:34 PM on April 24, 2015 [1 favorite]


Hey, you can't run over Jeremy Bentham! Think how long it will take to reconstruct him for the auto-icon!
posted by thomas j wise at 5:34 PM on April 24, 2015 [1 favorite]




well so we have to factor the suffering incurred by the auto-icon reconstructors through their labor into our calculations, I guess.

alternately, though, maybe auto-icon reconstructors really enjoy their work and would be made happier if presented with the prospect of a real challenge.
posted by You Can't Tip a Buick at 5:37 PM on April 24, 2015 [1 favorite]


What if it was the trolley from Mr. Rogers' Neighborhood, and, if you pull the lever one way, you will kill Henrietta Pussycat, and, if you pull the lever the other way, you will kill Daniel Striped Tiger. Your only other choice is to throw Lady Aberlin in front of Trolley, which, while rude, will be pretty OK because she is really large compared to Trolley.

You wouldn't expect Mr. Rogers to condone some sort of idiotic thought experiment involving carnage, would you?
posted by GenjiandProust at 5:40 PM on April 24, 2015 [4 favorites]


Trolleys definitely go clang.

Sorry, Judy Garland can't lie.
posted by Mezentian at 5:42 PM on April 24, 2015 [3 favorites]


This made me LOL.

I reflexively hated the Trolley Problem as a student, and most of my students reflexively hated it when I was a professor. THEY'RE NOT FUCKING FOOLED, they know you're deliberately constructing something super-shitty. (And my students, when I'd ask them, "So you think X?" would answer, "Well, I thought I thought X, but now I think you're setting me up so I'm not sure I want to commit anymore.")

It is not that hard to come up with ACTUAL TROLLEY PROBLEMS in the real world and start from there, and it's way more interesting. You can eventually abstract out to the trolley problem. Like, I can make an ebola vaccine but it will require me letting this one guy get sick enough to die, when I know I can cure him. Or what if I can save 9 people with this guy's organs, and he's a terrible person, but you have to actively cut the life support on him? IT IS NOT LIKE LIFE IS LACKING IN TROLLEY PROBLEMS.
posted by Eyebrows McGee at 6:00 PM on April 24, 2015 [11 favorites]


Someone I was friendly with did a lot of work on trolley problems in other cultures, and came to the conclusion that everything written about the trolley problem was bullshit.
posted by jeather at 6:19 PM on April 24, 2015 [1 favorite]


The best part is that there are probably a lot of frustrated murderers among this cohort of university graduates who are wandering around wondering where the trolley levers are.
posted by srboisvert at 6:50 PM on April 24, 2015 [2 favorites]


In the same vein as Advanced Trolley Problems and "Consider the following case", there's The Allegory of the Trolley Problem Paradox.

It is not that hard to come up with ACTUAL TROLLEY PROBLEMS in the real world and start from there, and it's way more interesting. You can eventually abstract out to the trolley problem. Like, I can make an ebola vaccine but it will require me letting this one guy get sick enough to die, when I know I can cure him.

That is a good and less-abstract example of a trolley problem. But I think the point of using trolleys instead of real-world situations (apart from tradition) is that real life has so many extra factors, people can evade the one being examined if it makes them uncomfortable.

I've met at least one person who wouldn't pull the lever in the standard trolley problem, no matter whether the trolley is headed towards the one person or the five, on the grounds that they don't want to be responsible for any death (which leads to an interesting discussion of whether they're responsible for the consequences of a lack of action, given that they had an opportunity to take action and knew it). But getting to that point requires explaining that their only options are pull or don't-pull, and they know the consequences with certainty, and the people on the tracks are undifferentiated, and so on.

It's hard to find a real-world case where people can't just say "But you don't know that letting the guy get sick will help you cure ebola," or "What if the guy is Jesus and the person with ebola is Hitler?" or worse "Only Africans get ebola, so I don't care." The trolley problem gives just enough physical detail to let you imagine the situation, but without letting those details have distracting side-effects (by fiat if necessary).
posted by Rangi at 6:59 PM on April 24, 2015 [4 favorites]


Or what if I can save 9 people with this guy's organs, and he's a terrible person, but you have to actively cut the life support on him? IT IS NOT LIKE LIFE IS LACKING IN TROLLEY PROBLEMS.

The reason I always kind of hated Trolley problems is that it kind of is. This scenario is super unlikely, and is probably not going to be your decision to make. I mean, first of all, you have to be a doctor to ever face such a situation. Then, you have to have someone on life support who will die as soon as you take them off of it, but whose organs will remain usable (most organ transplants are taken from people who are already declared brain-dead). And finally you have to include the organ transplant team on your side and have no worries about the patient's loved ones disputing things.

Most organ transplants are done with everyone's approval, or the problem is more like the Terri Schiavo-type issue where a family member is against recognizing brain death and someone didn't sign off in time... The moral choice of having to kill fewer persons to save more is not likely to show up in most people's everyday lives, except in the removed or abstract. And I guess that's the real issue - can we take ethical responsibility for how many lives we ruin in order to get a cheaper iPhone or better coffee...

The ones we don't see are much easier to shrug off. But if it's set up like a Trolley problem it seems like the only time you're making ethical choices is when you are faced with a hurtling train and people in front of you, when actually living ethically is a lot more complicated and annoying, and it turns out practically everything you do has consequences that could be damaging, and you have to work out how much responsibility you can bear, or should have to bear.
posted by mdn at 7:16 PM on April 24, 2015 [4 favorites]


The ones we don't see are much easier to shrug off. But if it's set up like a Trolley problem it seems like the only time you're making ethical choices is when you are faced with a hurtling train and people in front of you, when actually living ethically is a lot more complicated and annoying, and it turns out practically everything you do has consequences that could be damaging, and you have to work out how much responsibility you can bear, or should have to bear.

Trolley problems, even the "yeah, but this one's real" versions, all seem to take place in a world with virtually no history and a perfectly predictable future. They exist in a world that is oddly without human or social relations, a world of atomized actors in situations with perfectly bounded horizons. This is not our world; these hypothesized actors are not anything like people.

People are irretrievably situated, with the result that ethics isn't physics; it cannot be deduced by modeling a perfect sphere in a vacuum and then adding new variables. In general, Ethics 101 debates are story problems written by people who don't know how to do math.
posted by kewb at 7:44 PM on April 24, 2015 [2 favorites]


I'm honestly confused at Fred Clark's position, and everyone else's (mdn, kewb) that's basically, "Trolley problems aren't real". I...think we all knew that? The point is that modeling a perfectly "spherical" ethical dilemma allows you to examine why you take the stances you take. Problems like these point out unexamined areas of our opinions—areas where if we think about our belief, we might change it. Have we taken the position that reflecting on our moral stances is a bad thing? Or that having moral principles is a bad thing? I'm honestly confused. Trolley problems are the kind of mental exercise that make one stronger at the task of contemplating complex scenarios like Terri Schiavo or how to feel about Chinese iPhone factories.
posted by daveliepmann at 8:08 PM on April 24, 2015 [6 favorites]


People are irretrievably situated, with the result that ethics isn't physics; it cannot be deduced by modeling a perfect sphere in a vacuum and then adding new variables.

Physics itself can't be modeled that way. Newton's laws work well enough for near-perfect spheres in a near-vacuum with only gravity to consider (i.e. orbital mechanics), but throw a bunch of H2O molecules together and you end up with chaotic fluid dynamics. And yet the same principles lie behind both, and with enough careful reasoning and computing power we can predict the weather days in advance.

Trolley problems are like that. A real-world decision may have slight resemblances to a hundred different idealized scenarios, but if you've used those scenarios to figure out what your moral principles are (and to make sure that they don't contradict each other, which gut instincts very well might), you'll be better equipped to make decisions that don't feel arbitrary and that you won't regret when considering them after the fact.
posted by Rangi at 8:30 PM on April 24, 2015 [3 favorites]


We're going to need more trolleys
posted by fallingbadgers at 9:15 PM on April 24, 2015 [1 favorite]


You pull the switch lever half-way in between and derail the train.

(Oh, wait a minute: John Stuart Mill is on it.)
posted by Chocolate Pickle at 9:47 PM on April 24, 2015 [1 favorite]


Rodney McKay or Captain Archer.
posted by clavdivs at 10:03 PM on April 24, 2015


There's an out of control trolley speeding along some tracks. On the tracks stands Stephen King, deep in thought. You have the ability to redirect the trolley onto a siding, but Dan Brown is there, in heated conversation with Kazuo Ishiguro and W.G. Sebald. What do you do?
posted by Emperor SnooKloze at 10:40 PM on April 24, 2015 [1 favorite]


Speak of the devil; here's a real-world trolley problem published soon after this post. Should you prescribe the antidepressant with a 50% chance of killing one's sex drive, or the one with a 1-in-300,000 chance of suddenly killing one's liver? (And of course, since it's the real world, there are alternatives like "try meditation and exercise first" or "don't ask me, I'm not a doctor.")
posted by Rangi at 10:51 PM on April 24, 2015


We're going to need more trolleys

Yes, to exhaustively explore the ethical implications we will need to send a trolley down each path.
posted by aubilenon at 10:55 PM on April 24, 2015 [2 favorites]


Enough of this Newtonian ethics. If I put a screen with two properly placed slits in front of the trolley I can get the trolley to interfere with itself such that the probability waveform comes out to 0 at both locations.
posted by ckape at 11:11 PM on April 24, 2015 [10 favorites]


Eyebrows McGee: "It is not that hard to come up with ACTUAL TROLLEY PROBLEMS in the real world and start from there, and it's way more interesting...IT IS NOT LIKE LIFE IS LACKING IN TROLLEY PROBLEMS."
Well, if you want to make trolley problems more like real life, you have to have an endless succession of consecutive trolley problems, the details of which cannot be fully known until you make a decision about the previous problem. Oh, you also have to decide RIGHT NOW! If you don't decide, a third, much worse, outcome will suddenly pop into existence and be chosen for you.
posted by dg at 11:19 PM on April 24, 2015 [1 favorite]


Is this another Minecraft thing?
posted by yoHighness at 2:20 AM on April 25, 2015 [2 favorites]


Trolley problems are like that. A real-world decision may have slight resemblances to a hundred different idealized scenarios, but if you've used those scenarios to figure out what your moral principles are (and to make sure that they don't contradict each other, which gut instincts very well might), you'll be better equipped to make decisions that don't feel arbitrary and that you won't regret when considering them after the fact.

Well, no, not really, because the reasons we make decisions with ethical considerations in real life rarely resemble the reasons we make the abstractedly "pure" ethical decisions in trolley problems. Others have pointed out that training yourself to use perfect information to make decisions in the absence of most of the relevant information is a really, really bad idea for very obvious reasons. (See also: People who never get past Econ. 101 making economics pronouncements.)

Should you prescribe the antidepressant with a 50% chance of killing one's sex drive, or the one with a 1-in-300,000 chance of suddenly killing one's liver?

What is the mechanism of liver death, and what factors predict it? Ditto for the sex drive thing. What additional resources do we have to help people deal with the consequences of scenario one vs. scenario two? What does this patient want and what risks is that patient willing to assume? If the patient is incapable of consent for some reason, what does the family think? If the patient is incapable of consent, why would the one about sex drive not be the immediate option? Why does your scenario seem to imply a doctor who just picks the medication by making a choice alone? Do doctors in this made-up world not consult with one another or with patients and/or patient families?

Any ethical decision that does not try to answer all of these questions is probably not an ethical decision in the real world. And no trolley problem can survive them.
posted by kewb at 4:40 AM on April 25, 2015 [3 favorites]


And indeed, at the end of part I of Rangi's example:
I don’t want to overemphasize this particular calculation for a couple of reasons. First, SSRIs and nefazodone both have other side effects besides the major ones I’ve focused on here. Second, I don’t know if the level of SSRI-induced sexual dysfunction is as bad as the prostate-surgery-induced sexual dysfunction on the database. Third, there are a whole bunch of antidepressants that are neither SSRIs nor nefazodone and which might be safer than either.

But I do want to emphasize this pattern, because it recurs again and again.
In other words, having strained to find a real-world trolley problem, it turns out the best example the author could find has a range of responses that would not be permitted in a trolley problem. But trust him, he's an amateur Bayesian, and this is actually an argument against malpractice lawsuits rather than an argument about which drugs to use or not use.

But then, the author admits that he doesn't have all the information needed to make a good decision anyway:
[Epistemic status: I am still in training. I am not an expert on drugs. This is poorly-informed speculation about drugs and it should not be taken seriously without further research. Nothing in this post is medical advice.]
Again, it's after this disclaimer and others that the author later insists that this is a pattern that "recurs again and again." Sure, he doesn't know what he's talking about and admits it, and this example may not be what he says it is because he's poorly informed, but the thing he admits he doesn't know enough about is still a great example of a pattern that recurs all the time, which we know because the underinformed author has told us so right after telling us how poorly informed he is.

You know, I'm coming around. Trolley problems do tell us something useful about moral agents; specifically, about the sort of people who construct elaborate, reductive trolley problems.
posted by kewb at 5:50 AM on April 25, 2015 [1 favorite]


I'd say put the trolley, track and philosophers inside a box with two radioactive triggers (one disables the other as it fires), close the lid, and let Schroedinger make the choice.
posted by Twang at 6:24 AM on April 25, 2015 [1 favorite]


What if there were a ticking time bomb on the trolley and one of the potential victims is the only one who knows where it is? Do you throw the switch to kill the innocent worker so that you can then torture the bomber to find the location of the bomb? Of course, once you find the bomb you still have to get rid of it.
posted by TedW at 7:25 AM on April 25, 2015 [1 favorite]


In other words, having strained to find a real-world trolley problem, it turns out the best example the author could find has a range of responses that would not be permitted in a trolley problem.

I also dislike the trolley problem and related stuff like the fat man variant, but there are real-world examples that are relatively close to the pure trolley problem:

Should we require cars to have airbags? Airbags save some people who would have died in their wrecks, but also kill a few people who would have lived. Similarly, they prevent some serious injuries but cause lots of minor injuries.
posted by ROU_Xenophobe at 8:12 AM on April 25, 2015 [1 favorite]


That's not a trolley problem but a bog standard risk analysis.
posted by MartinWisse at 8:32 AM on April 25, 2015 [1 favorite]


Yeah, one of the key findings of the trolley problem was that people make different ethical choices based on how they would have to physically cause their choice to occur. That was above and beyond the useful things it points out about moral intuition as opposed to moral principles. The very fact that the problem is a constrained Gedankenexperiment allowed us to discover how people actually make ethical choices without worrying that the question was spoiled by some detail that's extraneous to one person but vital to the next.
posted by daveliepmann at 8:59 AM on April 25, 2015




You see a trolley rushing towards an overweight person. In front of you is a lever, which lets you redirect the trolley towards the only existing copy of the Bible instead.

Hmmmm.
posted by GenjiandProust at 10:46 AM on April 26, 2015


You see a trolley rushing towards a newborn baby clutching the only existing copy of the design for a machine that can artificially induce perfect happiness. In front of you is a lever, which lets you redirect the trolley towards a clone of Hitler instead.

And better!
posted by GenjiandProust at 10:47 AM on April 26, 2015


"You see a trolley rushing towards the Mona Lisa. In front of you is a lever, which lets you redirect the trolley towards a sentient trolley with feelings and a complex and beautiful inner life."
posted by Eyebrows McGee at 11:28 AM on April 26, 2015 [1 favorite]





