From: Ben Goertzel (email@example.com)
Date: Thu Jan 08 2004 - 04:25:02 MST
I had a dream last night about the following way of increasing the odds of
human intelligence surviving the Singularity. I wouldn't be surprised if it's
unoriginal, of course, but I can't think of where I've heard it before.
It's much like things in Egan's novel Diaspora, but with differences.
The basic premise is that traveling faster than the speed of light may be
impossible, or at least very difficult, even for superintelligent AIs.
The idea is to create a factory that spits out a vast number of spacefaring
modules, each of which travels very close to light speed, and each of which
contains:
-- a seed AI, triggered to awaken in N years and start evolving toward
superintelligence
-- a simulated world for the seed AI to play in
-- a universe of virtual humans embodied in the simulated world
Statistically, some percentage of these AIs will become friendly and
ultimately create a nice corner of the universe for their simulated
population. And if the ones that evolve into bad AIs can't break the light
speed limit, then they'll have a very hard time catching up with all the
good ones to annihilate them, even if they have the desire to do so. After
all, the good ones can keep on running ... faster and faster as they design
better and better propulsion systems ;-))
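To make the statistical argument concrete (a toy sketch with made-up numbers, not anything from the original post): if each module independently yields a friendly AI with some small probability p, then among N modules the chance that at least one turns out friendly is 1 - (1 - p)^N, which climbs toward 1 quickly once N is large relative to 1/p.

```python
# Toy model of the "launch many seeds" argument.
# Assumption (mine, purely illustrative): each module independently
# produces a friendly AI with probability p. Then the chance that at
# least one of n modules is friendly is 1 - (1 - p)^n.

def p_at_least_one_friendly(p: float, n: int) -> float:
    """Probability that at least one of n independent seeds turns out friendly."""
    return 1.0 - (1.0 - p) ** n

if __name__ == "__main__":
    # Even with a one-in-a-million friendliness rate, a billion modules
    # make at least one friendly outcome nearly certain.
    for n in (10**3, 10**6, 10**9):
        print(n, p_at_least_one_friendly(1e-6, n))
```

The point of the sketch is only that the scheme trades per-seed reliability for sheer numbers; it says nothing about whether the independence assumption holds.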
Obviously, this doesn't increase the probability that a given human-instance
survives the Singularity, but it greatly increases the probability that *some*
human-instance corresponding to a given human (e.g. me) survives the
Singularity.
Also obviously, if the bad uber-AI learns to break the light-speed limit,
then the plan may fail. Or may not, because the good AI may figure out how
first ... or a good AI and bad AI may partition off the universe ... etc.
Anyway -- it was an amusing dream ;))
Does anyone know of any sci-fi that explores this specific territory?
-- Ben G
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:43 MDT