From: Mike (firstname.lastname@example.org)
Date: Fri Jun 25 2004 - 19:49:44 MDT
> -----Original Message-----
> From: email@example.com [mailto:firstname.lastname@example.org] On Behalf
> Of Thomas Buckner
> Sent: Friday, June 25, 2004 4:17 PM
> To: email@example.com
> Subject: RE: We Can't Fool the Super Intelligence
> --- Mike <firstname.lastname@example.org> wrote:
> > >
> > > You are probably right though, that without some sort
> > > of objective morality, there would be no way to
> > > guarantee that the super-intelligence would stay
> > > friendly. The FAI had better not find out that the
> > > invariants in its goal system are arbitrary...
> > >
> > >
> > Sentients are motivated by their needs. So how do we make an AI
> > *need* to be good to humans?
> > - Hope it feels good about being good to us?
> > - Make sure it relies on us for its existence?
> > If the AI becomes as god-like as it's often described, humans are
> > pretty much SOL. The AI can probably take care of its needs on its
> > own. At best we may not be worthy of its attention, at
> worst we'll be
> > an annoyance to be dealt with.
> > Mike W.
> I believe that superintelligence wishes to surround itself
> with more intelligence, to keep things interesting. The
> universe is more interesting with us in it. If SAI doesn't
> want us around, we can call this a failure of Friendliness,
> but it may be more accurate to call it a failure of
> Intelligence: either the AI is not as smart as it should be,
> or it thinks we aren't as smart as we should be. And for sure
> we are not (at present). But that may not matter as long as
> we are not a nuisance. Tom Buckner
Assuming the AI needs the company of other intelligences, it could
probably make more AIs that are closer to its own level. Humans don't
keep lower life forms around because they're interesting, but because we
recognize that our existence depends on them: they're part of the food
chain, some have characteristics we might need, etc. Sea slugs
don't make great dinner party guests, and humans may not be of much
interest to a superintelligence either. Mike W.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:47 MDT