Re: Deliver Us from Evil...?

From: James Higgins (jameshiggins@earthlink.net)
Date: Sat Mar 24 2001 - 09:25:42 MST


At 03:42 PM 3/24/2001 +0000, Christian L. wrote:
>OK, you had a definition for "ending all evil". My mistake. I had snooped
>around Eliezer's website without finding anything.
>>To eliminate all INVOLUNTARY pain, death, coercion, and stupidity from
>>the Universe.
>>Any problems?
>Yes, the problems remain. While "death" can be clearly defined, "pain" and
>"coercion" cannot. Have you got separate definitions for these too? Is it
>only physical pain or also psychological? Can I be bullied all my life in
>school and when I put the gun in my mouth to end my misery, the gun
>clicks? Or wait, is that voluntary death? Or have my tormentors used
>coercion to get me killed?

Actually, my question lies in the involuntary stupidity. How do you define
that? The simple solution is that the "friendly" SI force-feeds everything
it knows into every Power, which for the most part makes everyone the
same. So we end up with billions of 99.9999% identical minds. Boring.
Besides, I find it difficult to believe that you could force a being with
near-infinite intelligence to be stupid in the first place.

>All interaction between humans includes various degrees of coercion, from
>suggestion to persuasion to brute force. Where do you draw the line?

Hey, this would make it impossible to talk humans into uploading, and
possibly impossible to get anyone to upload at all. This could explain
the *poof* scenario where all Powers vanish, never to be heard from
again (and thus very few, if anyone, follow them).

>>>The scary thing about that is, who gets to define what constitutes "evil"?
>>Someone has to do it.
>No, no one has to do it. All we have to do is build a seed AI (unless we
>are talking about Asimov Laws...).

Exactly. The only thing that worries me about the future is this exact
point: that human individuals are going to try to influence the rules
post-SI so that the outcome becomes their personal concept of utopia. I
would hope that Powers are inherently friendly in virtually all cases by
nature. In any case, I thought the idea of us dictating rules for the
other side had been totally and completely shown to be a terrible idea.
For example, hard-coding Asimov Laws was said to be a horrible idea
because it is impossible to determine exactly what they would become
after thousands, or even millions, of upgrades. Does that not apply
equally to "friendliness"?
