From: Ben Goertzel (firstname.lastname@example.org)
Date: Sun Sep 15 2002 - 17:53:57 MDT
> Cliff Stabbert wrote:
> > How big would the temptation be for any current superpower to grab the
> > first workable nanotech or the first usable general AI and use it to
> > wield power over others?
> Stealing the first workable nanotech is one thing. "Stealing" a general
> AI is a bit different from stealing its physical hardware. AI is not a
> tool. It is a mind. Moving an AI from one place to another doesn't make
> it a tool in your hands, any more than moving Gandhi from India to
> Germany causes him to become a Nazi. Now of course potential thieves
> may not know that, but if so, failing to keep track of the distinction
> ourselves is hardly conducive to enlightening them.
Well, there is a difference between humans and general AIs in this context.
If one has a general AI that lacks the power to protect itself from being
stolen (having the hardware it runs on moved to a different place, having
its software copied and replicated elsewhere, etc.), then one probably has
a general AI that can't protect itself from having its code rewritten, its
mind force-fed with false knowledge, and so on.
Brainwashing a kidnapped human is a difficult, and perhaps sometimes
impossible, task. Brainwashing a kidnapped general AI may be much easier,
if the kidnappers are expert programmers and computer/cognitive scientists
with some understanding of the AI's design.
Of course, this risk is most pertinent during the transition period when
near-human-level AI exists but significantly transhuman AI does not...
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:41 MDT