Re: Singularity intros (was: SI definition of Friendliness)

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Fri Apr 06 2001 - 16:53:20 MDT


Arona Ndiaye wrote:
>
> That's OK, Eliezer; I never said you should not, you didn't, I shouldn't,
> or I didn't.
> Do you believe it would stop some hacker somewhere from coming up with the
> first seed AI?
> Even assuming a ban on the technologies?

No; I think it would slow down AI long enough for military nanotechnology
to be developed and used. Even if it didn't, it would strongly increase the
probability that the foremost research projects at any given point would be
run by rogue states that are not exactly Friendliness-aware.

-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence
