Re: Game theoretic concerns and the singularity (was RE: Are we Gods yet?)

From: Michael Anissimov (altima@yifan.net)
Date: Fri Aug 02 2002 - 03:02:02 MDT


James Higgins wrote:
>I don't agree. The first AI to reach SI capability could easily remove,
>or at least halt, all others which are behind. In other words, unless
>two AIs end up reaching the Singularity at nearly the same time (within
>days I'd guess) then there will only be one. Assuming that 1st decides
>that it doesn't want competition, at least (which we can't know). Thus,
>I find it very unlikely that we will have multiple SIs, unless the 1st
>SI wants it that way.

Days? I would estimate that the window within which two transhumans
newly initiating a cycle of strong self-improvement could maintain
game-theoretic equivalency would be somewhere in the nanosecond or
microsecond range, not days. In addition, I find it highly likely that
posthuman SIs would choose to reconcile their differences (likely to be
minor, in terms of goals) rather than burn resources in a mutually
wasteful physical conflict. Judging from your rhetoric, you seem to be
assuming that all sentients will necessarily have strong observer bias,
which is a dangerous assumption when seriously considering the
motivations of entities outside the familiar phase space.

>That said, I do believe there will be multiple SIs. I just don't think
>they will be terran. Other civilizations out there *must* be getting to
>this point as well. They may be on the other side of the universe, but
>that won't have anywhere near the effect on SIs that it does on us. So,
>eventually, I believe multiple SIs will encounter one another. Might be
>many (real time) years, or only days. May not even be relevant to us if
>the SI just pops out of our universe for something better, leaving us
>behind.

Wow, you've got a powerful Us/Them complex going on when you talk about
SIs. You talk as if all humans upgrading to Powerhood isn't the only
long-term inevitability, as if an indifferent SI could come into
existence but not see mankind as building blocks, and as if the first
benevolent transhuman won't create a moral singleton to protect
individual rights (in the case of a malevolent or indifferent
transhuman, everything goes black immediately). Out of curiosity, may
I ask whether you've mulled over any of these concepts before? Also,
even if Earth-originating SIs ran into extraterrestrial SIs, wouldn't
that potential occurrence be so insanely far into the subjective future
as to be irrelevant to us today?

Michael Anissimov


