From: Wei Dai (firstname.lastname@example.org)
Date: Wed Jun 02 2004 - 08:52:00 MDT
On Wed, Jun 02, 2004 at 05:32:38AM -0400, Eliezer Yudkowsky wrote:
> The last question strikes me as irrelevant;
It's relevant to your fifth goal:
5. Avoid creating a motive for modern-day humans to fight over the
initial dynamic.
If you can't convince modern-day humans that collective volition
represents them then naturally they'll want to fight. For example, if
al-Qaeda programmers wrote an AI, "knew more" would mean knowing that only
Allah exists, and they would fight anyone who suggests that "knew more"
should mean a Bayesian probability distribution over a wide range of gods.
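The contrast between a dogmatic prior and a Bayesian one can be made concrete. Here is a minimal sketch (the hypotheses, priors, and likelihood numbers are all invented for illustration): an agent that puts probability 1 on a single hypothesis can never update away from it, while an agent whose prior spreads mass over several hypotheses shifts toward whichever one the evidence favors.

```python
# Illustrative only: hypotheses, priors, and likelihoods are made up.
def posterior(prior, likelihood, evidence):
    """Bayes' rule over a discrete set of hypotheses."""
    unnorm = {h: prior[h] * likelihood[h](evidence) for h in prior}
    total = sum(unnorm.values())
    return {h: p / total for h, p in unnorm.items()}

# A dogmatic prior: probability 1 on one hypothesis.
dogmatic = {"H1": 1.0, "H2": 0.0}
# An open prior: mass spread over several hypotheses.
open_prior = {"H1": 0.5, "H2": 0.5}

# Likelihood of some piece of evidence under each hypothesis.
likelihood = {"H1": lambda e: 0.1, "H2": lambda e: 0.9}

print(posterior(dogmatic, likelihood, "e"))    # H1 keeps probability 1.0
print(posterior(open_prior, likelihood, "e"))  # mass shifts toward H2
```

The point of the sketch is structural: whatever "knew more" is taken to mean, an agent whose prior excludes every alternative cannot be moved by any amount of knowing more.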
> Let me toss your question back to you: What do you think a devout
> Christian should be said to *want*, conditional upon Christianity being
> false? Fred wants box A conditional upon box A containing the dimaond;
> Fred wants box B conditional upon box B containing the diamond. What may a
> devout Christian be said to want, conditional upon Christianity being
> false? I can think of several approaches. The human approach would be to
> *tell* the devout Christian that Christianity was false, then accept what
> they said in reply; but that is the Christian's reaction on being *told*
> that Christianity is false, it is not what the Christian "would want"
> conditional upon Christianity being false. If the Christian is capable of
> seriously thinking about the possibility, the problem is straightforward
> enough. If not, how would one extract an answer for the conditional question?
The thing is, I'm not sure that's the right question to ask, and the
example I chose was meant to show the apparent absurdity of asking it.
So I don't understand what point you're making by tossing the question
back to me.
Have you looked at any of the existing literature on preference
aggregation? For example, this paper: "Utilitarian Aggregation of Beliefs
and Tastes", available at
I think it might be worth taking a look if you haven't already.
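For readers who haven't seen that literature, a minimal sketch of the kind of object it studies (the names, weights, and payoffs below are invented): each individual has subjective beliefs over states and tastes (utilities) over outcomes, and one simple baseline the literature examines ranks options by a weighted sum of individual expected utilities.

```python
# Illustrative sketch of utilitarian aggregation; all data are made up.
def expected_utility(beliefs, utilities, option):
    """Subjective expected utility of an option: sum over states."""
    return sum(beliefs[s] * utilities[(option, s)] for s in beliefs)

def aggregate(individuals, weights, options):
    """Pick the option maximizing a weighted sum of expected utilities."""
    def social_value(opt):
        return sum(w * expected_utility(b, u, opt)
                   for w, (b, u) in zip(weights, individuals))
    return max(options, key=social_value)

# Two individuals with different beliefs over states s1, s2
# and different tastes over options A, B.
alice = ({"s1": 0.8, "s2": 0.2},
         {("A", "s1"): 1, ("A", "s2"): 0, ("B", "s1"): 0, ("B", "s2"): 1})
bob   = ({"s1": 0.3, "s2": 0.7},
         {("A", "s1"): 0, ("A", "s2"): 1, ("B", "s1"): 1, ("B", "s2"): 0})

print(aggregate([alice, bob], weights=[0.5, 0.5], options=["A", "B"]))
```

Note that this naive linear rule is only a starting point: much of the aggregation literature, including the paper cited above, is about what goes wrong when individuals disagree on both beliefs and tastes at once.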
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:47 MDT