Re: Different View of IA

From: Gordon Worley (redbird@rbisland.cx)
Date: Wed Apr 24 2002 - 10:56:29 MDT


On Wednesday, April 24, 2002, at 05:27 AM, Will Pearson wrote:

> If the intelligent substrate scenario is the only one fed on this list,
> which it seems to be, then it will become very hard to destroy and may
> not be worth the other memes' effort.

Memes have a way of getting fat and lazy. Being the status quo is only
a mildly successful strategy for a meme's continued propagation: it
actually makes the meme easier to attack, since the meme machines
forget what was so great about the meme in the first place.

> So to start off, the belief that human society could not take a form
> that would prevent abuses of nanotechnology. If this is not your belief
> I apologise.
>
> So to let other people do the talking, I recently found an article on
> wired http://www.wired.com/wired/archive/4.12/fftransparent.html by
> david brin about how society could regulate itself by everybody
> basically being able to spy upon each other. Ramp up the technology a
> bit with prehuman intelligences helping each human do the spying. And
> you could get a stable society.

I doubt this. Let's jump in the Way Back Machine to see why ...

So, here I am, in the ancestral environment, amongst a tribe of
about 150 humans. We live close together and can spy on each other
fairly easily. While a few can hide a little, human societies work
because you can't go against society without some consequences, so
there's always someone sticking his or her nose into someone else's
business. One day Unk is caught not sharing the chicken he caught.
Well, everyone knows that Unk's family has gone without meat for a few
weeks, so they let this pass. A few days later, he catches another
chicken and again shares none of it. When he does this the third time,
people are pissed. The solution: beat him. Maybe rape his wife and
kill one of his children, too. Unk is upset and he fights a few of his
neighbors and manages to bloody them a little. After that Unk is a good
human and mostly gets along in society.

Now we take the Way Back Machine into the future (yes, we're going
negatively backwards):

So, here I am in the year 2015 where nanotech spy technology is
everywhere. Just yesterday Knu (the great great great ... great
grandson of Unk--unless of course the village had managed to get his
wife pregnant that time Unk did not share his chicken) was caught by his
neighbors cracking the encryption on a DVC (digital video cube) and
watching every Inspector Clouseau film without paying $5 a minute to the
MPAA like all other good, god-fearing people do. Since the MPAA doesn't
know about this, his neighbors kindly decide to inform them. With the
press of a button, the MPAA sends out nanobots to Knu's home and has it
liquidated. Literally. Knu, soaking wet and pissed, is still fully
capable, unlike his long dead relative Unk who was badly bruised and
only able to throw a punch or two before a fight was over. Using the
assembler at the nearby Kinko's, Knu builds and sends out some nanobots
to liquidate his neighbors' houses to get revenge. But, as it turns
out, Knu is not the best programmer, so his nanobots accidentally
liquidate the Earth. The dolphins enjoy all the extra space (now
they're really thanking humans for all the fish :-P) but humans are dead
or drowning.

Back in the e-mail I'm writing right now:

So, as we see, humans don't really change; only the technology does.
Consequently, humans with technology that can destroy the world are very
dangerous. It is mostly through luck that we have managed not to nuke
ourselves out of existence or back to the Stone Age. One day, someone
with nuclear weapons or nanotech or something similar is probably going
to kill everything on Earth. For many of us, the push to reach the
Singularity as quickly as possible comes from the fear that the longer
we wait, the more likely it is that we'll be caught on Earth when
someone decides to blow it up.

> Side note. Has there been any thought about going towards a standard
> units way of measuring intelligence? For example, a Super Intelligence
> could be a Mega-intelligence, meaning 1 million times more intelligent
> than the average human being. Then a whole new world of
> Exa-intelligences etc. would be available, and prehuman could be
> deci-intelligent or centi :) Or is this just a silly idea?

Well, it's not completely silly, but the problem is finding some way of
measuring intelligence. What kind of tool are you going to use to tell
me the intelligence of one human? What other units is intelligence
defined in (you have to have some standard to measure against)? While
it may be possible to do intelligence ratings, there isn't a clear way
to do it right now.
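For what it's worth, the naming half of the idea is mechanical enough to sketch, even though the measuring half is the hard part. Here is a hypothetical Python sketch of the prefix scheme from the quote above; the threshold table and the function name are my own invention for illustration, not anything established:

```python
# Map a multiple of average human intelligence (human = 1.0) onto the
# metric-prefix naming scheme suggested above. Purely illustrative.
PREFIXES = [
    (1e18, "exa"), (1e15, "peta"), (1e12, "tera"),
    (1e9, "giga"), (1e6, "mega"), (1e3, "kilo"),
    (1.0, ""), (1e-1, "deci"), (1e-2, "centi"),
]

def intelligence_label(multiple_of_human: float) -> str:
    """Return a prefix-style label for an intelligence expressed as a
    multiple of average human intelligence."""
    for threshold, prefix in PREFIXES:
        if multiple_of_human >= threshold:
            return (prefix + "-intelligence") if prefix else "human-level intelligence"
    return "sub-centi-intelligence"

print(intelligence_label(1_000_000))  # mega-intelligence
print(intelligence_label(0.1))        # deci-intelligence
```

Of course, this just puts names on numbers; it assumes the hard problem, a defensible way of assigning that number in the first place, is already solved.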

> I will have to read Vinge, you can't get him on the high street over
> here, although you can get Zindell. Strange isn't it.

/A Fire Upon the Deep/ and /A Deepness in the Sky/ share the cynical
view that I have of humanity's future if it continues at its current
level of intelligence with bigger and badder toys. While fiction is
just that, I think Vinge illustrates an important, observable fact about
intelligence and its technological limits.

--
Gordon Worley                     `When I use a word,' Humpty Dumpty
http://www.rbisland.cx/            said, `it means just what I choose
redbird@rbisland.cx                it to mean--neither more nor less.'
PGP:  0xBBD3B003                                  --Lewis Carroll


This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:38 MDT