From: Ben Goertzel (email@example.com)
Date: Mon Jun 25 2001 - 11:51:41 MDT
My view, as has been stated repeatedly, is somewhere between Eli's view and
the view of sunrise2000 ....
I'm not sure we can engineer Friendliness as thoroughly as Eli thinks, but
I'm pretty confident we can do *something* stronger than ignore the whole
issue and blithely create AIs without worrying about their ethical
orientations at all...
> -----Original Message-----
> From: firstname.lastname@example.org [mailto:email@example.com]On Behalf
> Of Eliezer S. Yudkowsky
> Sent: Monday, June 25, 2001 1:39 PM
> To: firstname.lastname@example.org
> Subject: Re: Who is working on Real AI?
> email@example.com wrote:
> > ``Friendliness is irrelevant.'' Dealing with AI, we're dealing with a
> > transhuman technology. Since friendliness is a patently human notion, it
> > will quickly become obsolete in SI circles. There is much more to the
> > issue of engineering FAI (as demonstrated by some interminable SL4
> > threads); so this is just a sufficient-but-not-elegant assertion.
> > My opinion is that ``friendliness'' is just a cover to give SINGINST
> > 501c(3) non-profit status and make strong AI look more - shall we say -
> > ``friendly'' to SL0-SL3 communities.
> I just thought I should go on record as denying this, not because I think
> it's plausible to anyone except this guy, but because someone going over
> the list archives in five years might take silence as an indicator of
> --
> Eliezer S. Yudkowsky http://intelligence.org/
> Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:36 MDT