Re: SI definition of Friendliness

From: Chris Cooper (coop666@earthlink.net)
Date: Wed Mar 28 2001 - 17:20:52 MST


Eliezer,
  Thanks for the link. While it cleared up some of my questions, others
remain.

> The Sysop Scenario also makes it clear that individual volition is one of
> the strongest forces in Friendliness; individual volition may even be the
> only part of Friendliness that matters to a Sysop - death isn't
> intrinsically wrong; it's wrong because people don't want to die.

  My scenario of humans in a cushy Sysop-controlled zoo doesn't seem to
conflict with your description of Friendliness. If we have as much freedom
in our new virtual digs as before, if not more, minus the ability to harm
ourselves or others, the Sysop has achieved its goal of Friendliness to
humans. We would still have the individual volition to do anything we could
do pre-Sysop. We couldn't upload or upgrade ourselves without the Sysop's
help, but then we couldn't do that before, either. Thus, no Friendliness
conflict. I still don't understand why a Friendly SI would be interested in
"bootstrapping" humans into the post-Singularity toybox, considering that
it would be less trouble to just copy vimself a few billion times instead.
I'm not trying to be difficult; I'm just trying to understand why our SI
will have any desire to upgrade humans at all. I'm also trying to follow
your wisdom of not thinking anthropomorphically about SI motivation. If the
entire human upload/upgrade scenario rests on the strength of Friendliness
during the AI-to-Transition-Guide-to-Sysop evolution, I hope that everyone
involved does a damn good programming job.

  I'll say it again: I hope I'm terribly wrong about this, but I haven't
been convinced yet.

COOP


