“We must never make the problem of pain worse than it is by vague talk about the ‘unimaginable sum of human misery.’ Suppose that I have a toothache of intensity x, and suppose that you, who are seated beside me, also begin to have a toothache of intensity x. You may, if you choose, say that the total amount of pain in the room is now 2x. But, you must remember that no one is suffering 2x; search all time and all space and you will not find that composite pain in anyone’s consciousness. There is no such thing as a sum of suffering, for no one suffers it. When we have reached the maximum that a single person can suffer, we have, no doubt, reached something very horrible, but we have reached all the suffering there ever can be in the universe. The addition of a million fellow-sufferers adds no more pain.”
Wouldn't God himself be the consciousness that suffers the totality of the world's pain, in some sense?
That is an old and thorny theological question: the passibility of God. I am not sure of its origin, but a very orthodox answer, both Greek and Calvinist, is that God is not affected by our suffering. It is assumed that if he were, he wouldn't be perfect. That's one of the complaints of process theology; it's part of what they call God being "static and immovable."
I recall taking a class in ethical theory under law professor Deryck Beyleveld (author of "The Dialectical Necessity of Morality" amongst other things).
He used the same point Lewis is making here as an argument against certain forms of Utilitarianism in ethics. The concept of maximising utility requires the concept of "sums of utility" which, depending on how that utility is conceived, may not be a meaningful one.
"Wouldn't God himself be the consciousness that suffers the totality of the world's pain, in some sense?"
God suffers all the pain in the world, suffering right along with every one of us.
=======
"He used the same point Lewis is making here as an argument against certain forms of Utilitarianism in ethics."
Isn't Utilitarianism "ethics" built around this silly idea that Lewis is critiquing?
The greatest good for the greatest number? John Rawls also tore it down, saying it reduced morality to a business ledger.
Looks like someone is at it again:
The Skeptic Zone: Labeling the Enemy
Lewis: " But, you must remember that no one is suffering 2x..."
Lewis seems to be defending the idea that cutting the finger off of 20 people is no worse than cutting the finger off of one person. After all, no one of the 20 experiences more pain than having a finger removed.
I'd suggest that any moral system that doesn't account for the multiplication of an action fails on both an intuitive and rational level.
Let's call this brick number 784 in why I think Lewis is so disappointing as a supposed philosopher or intellectual of any significance.
It could be morally counterintuitive; I actually think it is. But is it irrational? And how would you show this? Here is an argument, outside the context of the problem of evil, in which a Lewis-like idea is defended:
http://www.pitt.edu/~mthompso/readings/taurek.pdf
"I'd suggest that any moral system that doesn't account for the multiplication of an action fails on both an intuitive and rational level."
I'd suggest that atheism has no moral system applicable to all humans, so it fails on every level.
Lewis is making a point that many others have made before and after him (Deleuze's individuation writings, for example) about a distinction between intensive properties of a person or thing, like temperature, and extensive properties, like mass.
Pain, like temperature, is an intensive property. If you and I both have mild pain, it does not add up to a severe pain.
I would, however, define "total human suffering" as an extensive property, like mass (I would be hard pressed to derive that definition from human experience, though). If two people each suffer a mild amount, total suffering (but NOT total pain intensity) adds up to a greater amount than just one person suffering.
Perhaps if several people are hungry I'd be more likely to get them food than one person, but it depends on the situation (is one of the hungry people going for food already?).
VR: "It could be morally counterintuitive, I actually think it is. But is it irrational? And how would you show this?"
I wrote, "I'd suggest that any moral system that doesn't account for the multiplication of an action fails on both an intuitive and rational level."
Moral systems are about our desires and the desires of others -- they are agreements about how one should behave given a changing set of circumstances. And it would be irrational to subscribe to a moral system in which, say, everyone (including you) feels the pain of punishment whenever anyone is punished.
This isn't a sophisticated insight on my part.
What's irrational about subscribing to a moral system like that if moral systems are about our desires and agreements? I don't see how it's irrational to desire and agree to the situation stated.
Sigh.
A moral system is about agreements.
If you prefer to live in a world where you are punished every time anyone is punished, then you are free to prefer that. And if belonging to that moral system is your greatest desire, then you would be acting rationally.
But I doubt that you would long desire to live in that world. Still, if you want to go there, then you are welcome to it.
Just wanted to confirm that it's not irrational as you originally said it was.
SteveK: "Just wanted to confirm that it's not irrational as you originally said it was."
No, it's irrational.
It's irrational because moral systems are about agreements, and it's irrational to expect that your desires should outweigh the desires of all others in systems that are about agreements.
This is about as elementary as it gets. Imagine my surprise that you should struggle to understand it.
"And if belonging to that moral system is your greatest desire, then you would be acting rationally."
???
Me: "And if belonging to that moral system is your greatest desire, then you would be acting rationally."
SteveK: "???"
Wanting to act according to your desires = personal rationality
Expecting that others will precisely share your every personal desire = irrational
Hence, agreements being a factor in achieving a kind of moral rationality.
Not expecting that others will share your personal desire = rational
In other words, it's not *necessarily* irrational. Imagine my surprise that you struggle to grasp this.
SteveK: "Not expecting that others will share your personal desire = rational / In other words, it's not *necessarily* irrational. Imagine my surprise that you struggle to grasp this."
Yup. You've lost me.
It happens.