Re: The generalized past (was: Infinite hells)

From: Simon Gordon (sim_dizzy@yahoo.com)
Date: Fri May 09 2003 - 12:48:03 MDT


 --- "Eliezer S. Yudkowsky" wrote:
> Unalterable Level IV Processes, if they exist, may
> be counted as part of the generalized past - though
> of course it would be a terrible error to write them
> off too early. At any rate I do not believe it
> would be necessary to erase the knowledge. We
> already have
> emotional faculties for dealing with mathematically
> unalterable known horrors; we call them the "past".
> There is no definite temporal relation of an
> external Process to Bayesia in any case. Since we
> cannot affect it, why not call it the "past"?

First of all, as a member of the human race who has an absolute belief in the existence of all possible worlds, I can testify that whenever I contemplate conscious beings in enduring hell worlds (for want of a better phrase), it *does* provoke an emotional response in me. This is mainly due to my awareness that whatever I imagine does have a very real existence: the instants of my imagination are all presents, and those presents exist in real time, every bit as real as our present; they are co-happening right now. Given their presentness and realness, I don't see how we can label them members of the "past", or even the generalized past.

Now, since I can certainly imagine things far worse than the horrors of our past, such as the Holocaust, you can see that I already have a hard time dealing with this, and yet my empathy levels are quite low, because I am a mere human (and a male of the species).

Our recollections of the past are very different from the actual past. The actual past is a collection of presents, and those presents co-happen with all the other presents; but the past we remember exists only as information patterns, stored either in our memories or embedded in the world as we know it, and none of it may be accurate as a representation of the actual past. For example, time travel may have allowed our descendants to change everything we know about the past, including the Holocaust or any event before today's date and time, in the majority of QM histories at the very least. Strangely, this means that those doubters of the Holocaust whom many see as ignorant may actually be right when they say it never happened, even if all the history books and historical sources disagree: our probable history may turn out to be very quiescent, or nice, without any significant horrors to speak of.

Anyway, the point I was making was that a highly vivid imagination, plus high levels of empathy, plus a strong awareness of Level IV immutable processes, is a recipe for disaster for a mind which wants to remain sane. The way to get round this problem, of course, is to put a limit on one or more of the above ingredients. Putting limits on posthuman minds may seem unnecessary and contrary to your notion of the freedom that posthuman minds will have, but presumably those limits will be self-imposed. Each mind should have the right to become insane if ve wishes, but I think most would choose sanity over insanity, so there may be many instances of such a balancing act wherever there is a potential conflict between the extreme higher-level cognitive abilities which posthumans will no doubt possess. The above is just one example.

The message seems to be: instead of concentrating on creating artificial superintelligence by whatever means possible, we should focus on replicating the exact nature of the human mind in an artificial substrate. The human mind is a proven model. It is out there, and it functions well in our environment as evolution intended; we know it works. If we instead focus all our energy on creating that all-explosive, Singularity-yielding, self-bootstrapping seed AI with a model that we merely think will work, just because we know it will have the intelligence of humans and the ability to quickly surpass human intelligence, then many things could go wrong. Intelligence could interfere with friendliness, empathy could interfere with imagination, and so on. There may be many such unforeseeable conflicts which could turn our good Singularity into a very disastrous one. Imagine the first supposed FAI quickly gaining intelligence and power, and also quickly becoming insane in the process, perhaps because its liberal ability to evolve in any way it chooses results in a design conflict; humanity would be unlikely to be able to do anything about it, not having an SI psychiatrist on standby.

I put my trust in evolution far more than in human design. Evolution has had millions of years to slowly tinker and experiment, resulting in our very stable and balanced human minds. Replicating the human mind artificially would be another stable stepping stone toward the creation of SIs, and this method is a lot safer. In short, we should learn to walk before we rush ahead.

Simon.

