From: Wei Dai (email@example.com)
Date: Thu Jun 03 2004 - 10:33:05 MDT
On Wed, Jun 02, 2004 at 12:09:58PM -0400, Eliezer Yudkowsky wrote:
> The point of the analogy is to postulate al-Qaeda programmers smart enough
> to actually build an AI. Perhaps a better phrase in (5) would be, "avoid
> policies which would create conflicts of interest if multiple parties
> followed them". Categorical Imperative sort of thing. I am *not* going to
> "program" my AI with the instruction that Allah does not exist, just as I
> do not want the al-Qaeda programmers programming their AI with the
> instruction that Allah does exist. Let the Bayesian Thingy find the map
> that reflects the territory. So the al-Qaeda programmers would advise me,
> for they know I will not listen if they mention Allah in their advice.
But where does the Bayesian prior come from? Al-Qaeda has its prior, and
you have yours. What to do except fight?
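(As an aside on "let the Bayesian Thingy find the map that reflects the
territory": under mild conditions, sufficiently much shared evidence
dominates the prior, so agents starting from very different priors can end
up close together. A minimal sketch, using a hypothetical Beta-Bernoulli
setup not taken from the thread; the disagreement above is precisely over
whether the relevant priors are of this well-behaved kind.)

```python
# Two agents with sharply different Beta priors over the same Bernoulli
# parameter update on the same shared evidence. The likelihood swamps the
# prior as data accumulates -- though an agent whose prior assigns zero
# probability to a hypothesis never updates toward it.

def beta_posterior_mean(alpha, beta, successes, failures):
    """Posterior mean of a Bernoulli parameter under a Beta(alpha, beta) prior."""
    return (alpha + successes) / (alpha + beta + successes + failures)

optimist = (9.0, 1.0)   # prior mean 0.9
skeptic = (1.0, 9.0)    # prior mean 0.1

# Shared evidence: 700 successes in 1000 trials.
successes, failures = 700, 300

p1 = beta_posterior_mean(*optimist, successes, failures)
p2 = beta_posterior_mean(*skeptic, successes, failures)

print(round(p1, 3), round(p2, 3))  # both near the empirical 0.7
```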
> Reading... read. Relevant stuff, thanks.
Did reading it cause you to change any of your designs? If so, how?
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:47 MDT