RE: Fighting UFAI

From: Ben Goertzel (ben@goertzel.org)
Date: Thu Jul 14 2005 - 13:28:17 MDT


> > Eli, this clearly isn't true, and I think it's a
> > poorly-thought-out statement on your part.
> >
> > For instance, consider
> >
> > Goal A: Maximize the entropy of the universe, as rapidly as
> > possible.
> >
> > Goal B: Maximize the joy, freedom and growth potential of all
> > sentient beings in the universe
>
> Saying "sentient beings" instead of "humanity" is a cop-out, Ben.
> For our purposes, they are identical.

Hmmm...

Well, I think that when Eliezer said "humanity" he probably really meant
"humanity." So I won't take your reply as a proxy for his...

How about

Goal C: Migrate to the Andromeda galaxy and use all the mass-energy there to
advance mathematics, science and technology as far as possible; but leave
the Milky Way galaxy alone, ensuring that it evolves over time much as it
would have if no superhuman AI had ever existed.

This goal also doesn't mention humanity explicitly, yet seems much less
dangerous than Goal A.

Of course, you could argue that the Milky Way here is serving as a proxy
for humanity; but humanity is certainly not mentioned explicitly...

-- Ben G


