First, the problem, like so many other constructions in philosophy, is enframed by the metaphysics of presence and control. That is to say, if we attempted to give a purely physical description of the event, it would simply be something like "The trolley traveled at speed x along route y, and there was a probability p1 it would hit one person and a probability p2 it would hit the group of people." But the trolley problem says, "Hey you! Yeah, you! This is happening, you're involved, and you have to choose to do something or not." It has to be happening right here, right now, or else, I presume, it just shouldn't concern you very much. I suppose if the Trolley problem were instead something like, "You have a choice to travel more than four lightyears to Alpha Centauri to pull a lever that will kill baby alien Hitler before he grows up to kill the totally innocent baby aliens, or you can just stay on Earth and do nothing while baby alien Hitler eventually grows up to kill a lot of baby aliens (also, all of these aliens eat humans)," the problem might not be so interesting.
Also, setting aside the additional metaphysical problems of free will, responsibility, utility, the categorical imperative, and so on, the problem of technology is implicit in the trolley problem. It is quite challenging to construct a quandary with similar force and urgency without invoking technology.
"Perhaps there is a landslide and the rocks are headed toward them" - well, no, I would need some technology to divert the landslide. Maybe I could just run over and untie their ropes - no, knots are technology too. Maybe the people are just asleep, not bound, and I just have to run to either the one or the five and push them all out of the way - no, that rather loses the spirit of the 'sacrifice one to save more' in the original.
Without adding some form of technology into the mix, natural situations where you'd have to make this sort of choice seem almost inconceivable. And if we replaced the character of the Trolley with a human murderer or a savage, hungry lion, then the problem changes texture: feed this killer one person to save more. The Trolley is selected because it is an unstoppable force that can't be reasoned with, only redirected. And this is the problem with technology in general.
It is taken as a given. This technology exists; you can't stop it or try to reason with it; it just is, and it creates increasing dangers that you have to form new moral and ethical judgments around, judgments that may never be adequate and may conflict with your preexisting sense of self. That is to say, a society may have a certain ethical code that everyone feels pretty good about. Then, after the development of nuclear technology, that same society must become very oppressive to prevent the new horror of some bad guy using a nuclear weapon to destroy everyone. Most people in the society had nothing to do with the creation of the nuclear technology, but they still have to reorder their whole moral character to support or oppose the solutions to the new quandaries created by the tech.
The closest I can come to conceiving a similar quandary without invoking technology is this: your family is lost in the desert and starving; there is no food. You and your partner can eat your five children to survive, or you can feed your partner to your children and they'll survive. But in this scenario it seems like we'd want to place some moral responsibility on the people for getting lost without food. And can't they just eat one child while continuing to search for rescue or food?
And if eating the people is only going to keep the survivors alive for a little while longer and they're all going to die soon enough no matter what, then shouldn't they all just accept their fates and all die together instead of making the terrible choice to eat someone?
This is what is lost in the Trolley problem. The inventors and producers of the Trolley have no moral responsibility placed on them, even though without the technology this sort of event just wouldn't be possible. I don't know what other terrible things are going on in the world that I could have prevented instead of wasting my time deciding whether to pull the Trolley lever. And everyone dies soon enough anyway. All of these things are supposed to be out of my control when I enter the Trolley problem, and if I were caught in a real-life, actually happening trolley crisis, there's no guarantee I would follow any theoretical reasoning in the heat of the moment. Really, I just want to ban Trolleys. Sorry, you won't be able to make new technology that creates new problems to be solved by creating new technologies that create new problems.