Wednesday 2 September 2009

Thinking about Utility...

Here's a post by Eliezer Yudkowsky that I've been thinking about today:

"What's the worst that can happen?" goes the optimistic saying. It's probably a bad question to ask anyone with a creative imagination. Let's consider the problem on an individual level: it's not really the worst that can happen, but would nonetheless be fairly bad, if you were horribly tortured for a number of years. This is one of the worse things that can realistically happen to one person in today's world.

What's the least bad, bad thing that can happen? Well, suppose a dust speck floated into your eye and irritated it just a little, for a fraction of a second, barely enough to make you notice before you blink and wipe away the dust speck.

For our next ingredient, we need a large number. Let's use 3^^^3, written in Knuth's up-arrow notation:

3^3 = 27.
3^^3 = (3^(3^3)) = 3^27 = 7625597484987.
3^^^3 = (3^^(3^^3)) = 3^^7625597484987 = (3^(3^(3^(... 7625597484987 times ...)))).
3^^^3 is an exponential tower of 3s which is 7,625,597,484,987 layers tall. You start with 1; raise 3 to the power of 1 to get 3; raise 3 to the power of 3 to get 27; raise 3 to the power of 27 to get 7625597484987; raise 3 to the power of 7625597484987 to get a number much larger than the number of atoms in the universe, but which could still be written down in base 10, on 100 square kilometers of paper; then raise 3 to that power; and continue until you've exponentiated 7625597484987 times. That's 3^^^3. It's the smallest simple inconceivably huge number I know.

Now here's the moral dilemma. If neither event is going to happen to you personally, but you still had to choose one or the other:

Would you prefer that one person be horribly tortured for fifty years without hope or rest, or that 3^^^3 people get dust specks in their eyes?

I think the answer is obvious. How about you?
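
As an aside before the comments: here's a minimal Python sketch of the up-arrow arithmetic defined above. The recursion is the standard one; the function name `up_arrow` is my own, and only tiny arguments are feasible, since even 3^^4 already has about 3.6 trillion digits.

```python
def up_arrow(a: int, n: int, b: int) -> int:
    """Compute a (n up-arrows) b, e.g. up_arrow(3, 2, 3) = 3^^3."""
    if n == 1:
        return a ** b                       # one arrow is plain exponentiation
    result = 1
    for _ in range(b):                      # n arrows: iterate (n-1) arrows b times
        result = up_arrow(a, n - 1, result)
    return result

print(up_arrow(3, 1, 3))   # 3^3  = 27
print(up_arrow(3, 2, 3))   # 3^^3 = 7625597484987
# up_arrow(3, 3, 3) would be 3^^^3: a tower of 7625597484987 threes,
# hopelessly beyond any computation.
```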

7 comments:

  1. Although neither is happening to you, you can still use the "veil of ignorance" type of reasoning. I vote for dust specks.

  2. I don't think you can. The chance of you being tortured is so small as to be completely negligible. You may as well avoid the dust...

  3. The chance of me being either tortured or getting the dust specks is 0. I'm just an observer-choicemaker; that's how the problem is phrased, no?

    I'd say better to share a tiny burden between everybody than to leave one person to suffer. Since I can imagine myself being that person, I would be desperate to communicate to the choicemaker not to choose torture. I can also imagine myself being any other person who's bothered by the dust - and I wouldn't get terribly annoyed at the choicemaker if he chose dust.

    It's a rather strange question. Reminds me of those movies where a pretty tribal princess is thrown into a volcano to please the gods, so that there's enough rain.

  4. Yeah, but I thought that you had re-envisaged the question as including you when you referred to 'veil of ignorance' type arguments. I don't quite see how they are appropriate if you're not possibly going to experience the chosen option. What are you drawing the veil over, since you always know that one person will be tortured or 3^^^3 people will get dust specks?

    Also bear in mind that you aren't one of the individuals; you're in the position of making the global decision. That means you can't abdicate responsibility for inflicting the entire amount of negative utility.

    Perhaps I should be more explicit about the point of the argument, then. If you would agree that one person being tortured for 51 years is better than ten people being tortured for 50 years and 11 months, I think you'd have to accept that you are, in some sense, a utilitarian.

    The point of the question is to draw attention to the fact that by reducing the pain felt by each person but increasing the number of people it affects, you can lead yourself into some strange scenarios. The author of the original question, for instance, when he refers to an obvious answer, maintains that you should choose the torture: you're simply not comprehending the size of 3^^^3, and no matter how trivial the annoyance, it's always worse when summed over that many people (see the first sketch after the comments).

  5. This might turn into a slightly useful discussion, rather than the present angels-on-pinheads one, if you had two real-world choices as an example. Why not talk about the discount rates we apply to costs falling on future people, as with climate change?

  6. Is the concept "3^^^3 people" totally meaningless in empirical terms? Edward Fredkin has conjectured that nature is finite and digital. If Fredkin is correct, then both Fredkin and Wolfram (in "A New Kind of Science") might have ideas that could revolutionize empirical science. Not only would a completed infinity be empirical nonsense, but ultra-large positive integers would also be empirical fallacies.

  7. As I'm sure you realise (but for the sake of clarity for other readers), 3^^^3 is finite; but I take your point that it's almost equivalent to an infinity. However, I think it's valuable to explore our moral intuitions on the margin: it allows us to build model systems that sharpen those intuitions.

    In this circumstance, you might decide that you would choose the torture over the dust specks. In that case, when you shift back to your realistically finite (I'm not as convinced by digital) universe, you've enhanced your understanding of the considerations you need to take into account when making decisions - namely expected utility (see the second sketch after the comments). Making that calculation is still incredibly difficult by any practical standard, but at least you know what you're aiming at.

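Two of the comments above lean on explicit utility arithmetic, so here are two hedged sketches. First, the additive bookkeeping behind comment 4. Every magnitude is invented purely for illustration; the "pain units" rate and the speck disutility are my assumptions, not anything from the original post.

```python
# Toy additive-utilitarian bookkeeping: compare 50 years of torture at an
# assumed 1e6 "pain units" per second against a per-person speck disutility
# of 1e-100 units. All numbers are invented for illustration.

SECONDS_IN_50_YEARS = 50 * 365 * 24 * 60 * 60            # ~1.58e9 seconds
TORTURE_DISUTILITY = SECONDS_IN_50_YEARS * 1_000_000.0   # total pain units
SPECK_DISUTILITY = 1e-100                                # per specked person

# How many specked people before the speck total overtakes the torture total?
people_needed = TORTURE_DISUTILITY / SPECK_DISUTILITY
print(f"{people_needed:.2e}")   # ~1.58e+115 people

# 1.58e115 is tiny next to 3^(3^^3) = 3^7625597484987, which already has
# about 3.6 trillion digits -- and 3^^^3 towers unimaginably above that.
# Under simple summation, any fixed positive speck disutility over 3^^^3
# people swamps the torture.
```

However you set the two magnitudes, the crossover point is a number you can write down, and 3^^^3 is not; that is the whole force of comment 4's argument.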
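Second, a minimal sketch of the expected-utility rule that comment 7 points at: weight each outcome's utility by its probability and prefer the option with the higher expectation. Both "policies" and all their numbers are hypothetical.

```python
# Choosing between uncertain options by maximising expected utility.
# The two policies and their numbers are hypothetical, purely to show the rule.

def expected_utility(lottery):
    """lottery: list of (probability, utility) pairs; probabilities sum to 1."""
    return sum(p * u for p, u in lottery)

policy_a = [(0.9, -1.0), (0.1, -100.0)]   # usually mild, occasionally awful
policy_b = [(1.0, -15.0)]                 # certain moderate harm

print(expected_utility(policy_a))   # -10.9
print(expected_utility(policy_b))   # -15.0
# Policy A has the higher (less negative) expectation, so an expected-utility
# maximiser picks A despite its catastrophic tail risk.
```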