Hello...

From: Nathan Russell (nrussell@acsu.buffalo.edu)
Date: Thu Mar 07 2002 - 21:57:37 MST


Hi,

I'm a sophomore CS major, with a strong interest in transhumanism, and just
found this list.

I just looked through a lot of the past archives of the list, and one of the
basic assumptions seems to be that it is difficult to be certain that any
created SI will be unable to persuade its designers to let it out of the
box, whereupon it would proceed to take over the world.

I find it hard to imagine ANY possible combination of words any being could
say to me that would make me go against anything I had really strongly
resolved to believe in advance.

Additionally, while the SI was trying to convince me of its goals, wouldn't
I show some emotional/facial reaction? What if the head experimenter were
watching my face, with his hand on a switch that would drop a liter of HF
on the SI's motherboard?

The other possibility is that, as soon as the SI becomes able to act
physically, it will do Bad Things to our planet.

Suggestion: Seal it, and one of the experimenters, with sufficient oxygen,
food, battery power, etc., in the dead center of a sphere of titanium oxide
the size of ten city blocks. Wouldn't that prevent any such problems?

Nathan



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:37 MDT