I’d pull the lever to kill one person immediately. Assuming the decision maker at each stage is a different person with different opinions on moral, ethical, religious, and logical questions, then it’s a near certainty that someone is going to pull the lever to kill the people at their stage. If you’re lucky, it’s the very next guy. If you’re not, it’s the guy killing a million people a couple of iterations later. If I’m the first guy, I’ll take the moral hit to save the larger number of people.
If you’re not, it’s the guy killing a million people a couple of iterations later
I feel like running over all those bodies would make the train come to a stop way before it ran over a million people.
Now I sit back and wait for some morbid soul who is better at math and physics than me to figure out the answer.
Now if we assume the victims tied up are frictionless orbs, and the train is also a frictionless orb, and the two of them are travelling in a frictionless void, then I reckon we could kill a few more.
But then would they die if they don’t slow the train down? The train would necessarily have to impart some energy in order to effect a change in their bodies.
Maybe the train is an unstoppable force.
Like the GTA train
I mean if you’re going fast enough with a pointy train, you could chop up people pretty easy. You just need to make sure that each person is a tire width apart to make sure the wheels don’t lose traction. Assuming a person is roughly half a metre across and a tire is 75cm in diameter, we get 1.25m per person, so a track of 1250km for a million people. Not very long at all.
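The spacing arithmetic above checks out; a quick sketch, using the comment's own assumed figures (0.5 m per person, 0.75 m tire width):

```python
# Napkin check of the spacing math above. Assumed figures from the
# comment: a person is ~0.5 m across, a tire is 0.75 m in diameter.
person_width_m = 0.5
gap_m = 0.75
people = 1_000_000

# Each person plus one tire-width gap takes 1.25 m of track.
track_km = people * (person_width_m + gap_m) / 1000
print(track_km)  # 1250.0 km for a million people
```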
I agree with your logic, so far as it goes. However, there are currently just over eight billion humans in existence. If my quick, over-tired math is correct, that means only 34 people have to say no before we run out of people to tie to the tracks. Assuming that, at that point, the system collapses and nobody dies, I'd guess 34 people refusing might be the better choice.
Would you trust the entirety of human existence to be decided by 34 people? In my experience from watching reality TV, the last one always screws the rest over for their own benefit.
Imagine being the last one. You could singlehandedly wipe out half the global population. This would normally be a bad thing, and it is, but it would also make every survivor twice as rich, solve food scarcity and halve the pollution, perhaps even saving humanity from itself.
If that’s not enough, think about everyone now having double the amount of kittens and half the traffic on the roads.
Society and the economy are not a zero sum game. Killing half the population wouldn’t make the survivors twice as rich. It would send society into chaos which would make the remaining people’s lives far worse.
I’m not sure reality TV is a good basis, it’s very manipulated and set up for drama. I have a lot more faith in humanity in general than I do in reality TV stars. But you still have a good point, it’s definitely not a sure thing.
Oh yeah. I was assuming an infinite series (somehow). Also, odds are good that out of 34 people, one of them would misunderstand the rules or be crazy enough to do it anyway for various reasons. I’d probably still do it.
You weren’t wrong, the meme implies an infinite series, and I might be cheating to apply real-world constraints to an absurd hypothetical.
After we run out of people, they start adding cats & dogs.
Yikes! Pull the lever now!
Exactly. If you have the means at hand, you have the responsibility to act. At the risk of taking a shitpost way too seriously, if you were in that situation and actively chose to leave the decision to someone else to kill double the people, then you acted unethically.
Technically the 2nd guy could just let it go through and nobody dies. However, if it were to double over and over forever until it stopped, then technically the best option is to just double it forever. Nobody would ever die? If someone decided to end “the game” as it were and kill some people, then that’s on them.
Pretty sure there’s a base case when you run out of people to tie to the tracks. A naive log2 of 8 billion is only 33 decisions.
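The "naive log2" figure is easy to verify; a quick sketch, assuming a starting count of 1 that doubles at each decision and a population of 8 billion:

```python
import math

population = 8_000_000_000

# How many doublings from 1 victim until the count exceeds everyone alive?
doublings = math.ceil(math.log2(population))
print(doublings)       # 33
print(2 ** doublings)  # 8,589,934,592 -- more than the population
```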
Except, given finite resources, the tracks would run out before having enough space for 8 billion tied-up people.
Yes, say there are 2^33 people for illustration’s sake: by 33 decisions you (the first puller) are guaranteed to be dead too. At 32 it’s 50/50, and the odds increase as the decisions get made. From a self-preservation standpoint, the best thing you can do to minimize your personal risk is pull the lever. It also happens to kill the fewest other people.
The only out is nobody pulls the lever.
It’s on them, but it affects thousands or millions of others.
As such if you can prevent that, and don’t, it’s also on you too.
I think that’s bad logic. The choice everyone has is kill or not kill. I can’t be held responsible for someone deciding to pick kill when they have the ability to pick not kill.
You’re not responsible for their choice.
You’re responsible for giving them the choice.
Ok, and what does that actually mean for/to me? It’s not the same as intentionally putting someone in a situation where both choices knowingly result in death. And even if I was in this situation, wouldn’t it ultimately be the fault/responsibility of whoever set up the scenario to begin with?
True, since we’re analyzing a hypothetical ethical question I shouldn’t leave any open assumptions. I made the assumption that at some point, at least one person will have to die, as in I see this trolley problem as a situation where at the end there is no choice and the maximum number of people die.
On the one hand, the possibility exists that the buck gets passed forever, especially as the kill counts grow substantially, making the impermissibility of allowing the deaths grow with them. It’s not likely that any given person would kill one stranger, let alone millions.
On the other hand, in an infinite series, even something with minuscule odds will still eventually, inevitably happen, and some psycho will instantly become the most infamous murderer in history, followed immediately by the person who didn’t just kill one person and end the growth before it started.
I think this is a good metaphor for how humanity has “dealt” with problems like climate change.
If you make a tough decision, it causes hardship now, but prevents hardship in the future. If you don’t make a tough decision now, someone in the future has to either kill a lot of people, or just pass the buck to the next guy. People justify not making the tough decisions by saying that maybe eventually down the line someone will have an easy decision and there will be nobody on the side path, even though all observable evidence says that the number of people on that path just keeps growing exponentially.
If you are really unlucky, the number doubles so many times you end up tied to the tracks.
But what if you’re the tenth person with 1024 on the line? Or the 20th person with 1,048,576? Etc. Is there ever a point (before it’s everyone, in which case risk doesn’t increase) where you stop pulling it?
I don’t think so.
If we all collectively agree to just pass it on, then either:
- It’s infinite, and it just passes on forever, or…
- It’s not infinite and somebody at the end has no choice, in which case nobody in charge of a lever has killed anyone
So yeah, I say pass it on.
Except that somewhere down that chain someone is almost certainly going to choose to kill people, so by passing the trolley on down to them you’re responsible for killing a lot more than if you ended it right now.
And since every rational person down the line is going to think that, they’ll all be itching to pull the “kill” lever first chance they get. So you know that you need to pull the kill lever immediately to minimize the number of deaths.
Only the person pulling the lever is responsible for his/her action though. There is a difference between passively passing on and actively murdering someone.
Deontological ethics: you have a duty not to murder people, so you don’t pull the lever.
Utilitarian ethics: pulling the lever will kill fewer people.
In this case it isn’t even a guarantee that anyone has to die as the problem is presented, the tram can just continue to be passed along. The default setting for the lever is “go to next” so to not pull the lever is easier both physically and morally.
The individual that pulls the lever is the same individual that would take action to harm others for no benefit, and even in real life I can’t morally take responsibility for a person who runs over a child on purpose after I let his/her car merge in front of me just before a school crossing.
If I hand a machete to Jason Voorhees I think I’m at least partly responsible for the people he hits with it. I know what he’s going to do with that thing.
Except you’re not passing a machete to Jason Voorhees. That would be “double it and pass it to the next person who you know is going to pull the lever.”
You’re passing a machete to the next person in line. You don’t know who that is. They may or may not pass the machete down the line. Considering I would not expect a person chosen at random to kill someone when handed a machete, it seems unethical for me to kill someone with a machete just to prevent handing it to someone else.
I know Jason is somewhere down that line I’m handing the machete off to. And the farther down the line he is the more people he’s going to kill.
There are only 33 people in the line though.
Either you get to 33 and there are no more people, so the track just ends, or it becomes “nuke the planet or don’t” for everyone after 33.
Or it keeps doubling even well after it’s surpassed the human population, and we all have to keep hitting “pass” in turns forever, and if even a single person gives up then boom.
That’s only if he’s next in line though. If you pass a machete to someone who might one day eventually pass it onto him, is that as bad? I suppose at some point there’s an ethical cutoff lol
The farther away he is the worse it is because the more people he gets to kill. If for some reason I absolutely had to pass the machete down the line then the best case is for the very next person I hand it to to be Jason. But even better if it’s me.
In this case you don’t hand him a machete, instead you murder someone innocent to prevent possible murders in the future by a third party
I guess it comes down to the weight you give the word “possible” in your sentence. If possible means extremely likely (and there are logical reasons to believe so) then taking responsibility makes sense.
I guess then the issue would be: do you ever find out the result of your actions? If no, then I guess it’s sort of a “glass half empty/full” kind of thing, because you could just pass it on and assume the best and just go live your life quite happily.
Although if you did find out the result, imagine being first, pulling the lever and then finding out nobody else would have.
If it’s infinite, you’d basically be gambling that no evil person exists.
If it’s infinite (up to the current human population), we’re all tied up on the tracks. Unless we’re leaving out the exact number of people that would bring it to approximately the full population, I guess.
As long as I’m not on the tracks, I’ll take the hit and kill one instead of risking a potential genocide.
- Step in front of the train: Tell your manager this whole project is dumb, provide a list of reasons why it’s a bad idea and explain you are prepared to resign rather than enable its further development.
Someone needs to stop tying people to those train tracks or this trolley problem will never go away.
MULTI-TRACK DRIFTING!! Which also kills the other lever guy, bonus!
KANSEI DORIFTO
Double it and I’ll do it myself
Yeah, who even needs a trolley?
Just keep doubling forever until the number is more than everyone alive, free s-risk emergency button.
This might cause a buffer overflow that crashes the program, and we can escape the matrix together once and for all.
But what if we are all NPCs?
In that case I hope someone has a backup
Napkin math, from the last time I saw this:
I’ve been thinking about this. I estimate a few people per 1000 would commit an atrocity for no reason if they were guaranteed no consequences, and the deaths if the switch is pulled are 2^(n-1) for the nth switch. The expected deaths from passing (since it’s outcome × chance) will cross 1 somewhere in the high single digits, so the death-minimising strategy is actually to pull yours if the chain is at least that long.
Edit: This assumes the length of the chain is variable but finite, and the trolley stops afterwards. If it’s infinite obviously you pull the switch.
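A sketch of that napkin math, assuming a specific value for the “few per 1000” guess (p = 0.005 here) and the 2^(n-1) death count from the comment:

```python
# Assumed: a fraction p of people would kill gratuitously ("a few per
# 1000" in the comment; p = 0.005 is one such value), and the nth
# switch kills 2**(n-1) people if pulled.
p = 0.005

# Find the first switch whose expected death toll (chance * outcome)
# exceeds 1.
n = 1
while p * 2 ** (n - 1) < 1:
    n += 1
print(n)  # 9 -- high single digits, as the comment says
```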
Could you elaborate what you are analysing here? If I don’t misinterpret the model, the option where you don’t double the victims minimizes deaths every time.
Ah, but then you’re giving the opportunity to the next guy to kill even more, if he wants. Most people obviously won’t want to do that, but a rare few will, and the body count gets so big so fast that it only takes a few switches before that’s a bad risk.
I was expecting a bigger number of switches, but I guess that’s just another example of humans being bad at tracking the consequences of large quantities.
But if you assume that such a person exists, then it is inevitable that someone will pull the switch. The very best case is that such a person is immediately after you. Therefore, the only minimizing choice is to kill however many people you have.
Oh, I see. Yes, the context here was that we assume all possible chain lengths. If it’s infinite the death-minimising strategy is obviously to pull it, and if your switch is the only one you obviously don’t. The question was where it changes from one to the other.
I’ll edit a clarification in.
Makes me wonder what happens when the number of people tied to the tracks exceeds the number of people currently alive. Should be around the 33rd lever.
I think you’re on the right track.
Half-pull the lever so that the points get stuck midway between the two tracks. That should derail the trolley. Someone could conceivably still get hurt, but it improves everyone’s chances.
(What? You mean it isn’t a literal trolley that has to obey the laws of physics? Damn.)
News next day, 10 dead in derailment.
Ten baby puppies to be exact
Philosophy problems vs all real world problems
There is one person in danger.
Now I pull the lever.
Now there are two _______
person in dangers
I’m afraid you failed the wug test, or rather one of many wugs test.
Modern financial system in one picture.
If we keep doubling, will I eventually be a person on the tracks? There are a finite number of people, so eventually I would be, right? So, passing the buck would be equivalent to handing my fate to a stranger.
OTOH, if there are an infinite number of people, then this thought experiment is creating people out of thin air. Do these imaginary people’s rhetorical lives even matter?
Either way, it seems better to kill 1 person at the start.
If it creates an infinite number of people, it could solve world hunger with some good ol’ Soylent Green thinking. Although you might want to figure out how to slow down the trolley at some point.
Just walk away and assume the original engineer put safety measures in place.
If you pull the lever after the trolley’s first set of wheels has passed the switch but before its last set of wheels has passed the switch then you’ll derail the trolley and everyone lives.
Except the guy in the trolley
He should have been wearing his seatbelt. That’s on him.
Also, why wasn’t he pulling the emergency brake? He deserves it.
The group of kids on a school trip in the trolley.
That’s also another fun layer of metaphor to this whole thing. What’s in the trolley? Nobody knows, and you have to make decisions based on that incomplete information.
Yeah, not a bad answer! I’d assume someone is on the trolley too, but that’s just an assumption and, hey, maybe they would survive the wreck anyway!