From: Brian Phillips (firstname.lastname@example.org)
Date: Mon Feb 04 2002 - 19:20:58 MST
----- Original Message -----
From: Alan Grimes <email@example.com>
Sent: Monday, February 04, 2002 9:16 PM
Subject: Re: Sysop Scenario FAQ
> I had two more thoughts about this.....
> I tend to think of artificial beings in more of an artistic sense, that
> of creating a being to surpass me. Strong attractors, such as
> "friendliness" are incompatible with this vision as they are
> pathological... Just like "faith", strong attractors are _ALWAYS_
I rather suspect the artistic integrity of Eli's vision is not
especially important compared to the possibility of birthing
a major Perversion.
Just a thought :)
Why is a strong attractor a bad thing? Substantiate this assertion.
> The game Civilization: Call to Power presents a possible form of
> government not at all unlike your sysop scenario, where the mechanism of
> control is an implanted brain-control device. Can you argue that your
> AI would not choose this method to control people and provide them with
> the AI's version of happiness?
Well, that would depend on whether it were a truly Friendly AI and what
sort of technology it had available to it. Logically, the only way a
Friendly AI would do this would be if all other alternatives were worse.
Frankly, I think that if a >H AI were sufficiently motivated to tinker with
everyone and had MNT, ve would just upload everyone. That would
be much more efficient than some sort of Borg-style system.
If an AI were sufficiently motivated to make you happy,
why go with a piggyback override? Why not just plug you
into a virtual reality (assuming you don't get uploaded altogether)?
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:37 MDT