From: Gordon Worley (email@example.com)
Date: Thu Jan 31 2002 - 08:39:29 MST
On Thursday, January 31, 2002, at 06:58 AM, Christian Szegedy wrote:
> DarkVegeta26@aol.com wrote:
>> I think the more likely "universal concepts" to remain will be 'fun',
>> in some sense, more 'understanding' in some sense, and more 'increased
>> amount and speed of information processing/complexity' in *some*
>> sense. I think it would be considered unethical to a Next-Level
>> entity to *not* convince a human (which it could do quite easily) to
>> accept the uploading/transcension process.
> Perhaps it is unethical to convince a human to upload and waste valuable
> computational resources, instead of letting them be used by
> some well-tuned AI much more effectively.
> One might even say: if you save one human life by uploading, you kill a
> hyperintelligent/hypersensitive AI in the same turn.
How so? If you mean simply that for every uploaded human there's one
less AI that might have been created, that's true, but it only might have
been created. You can't kill something that was never born. This is just
--
Gordon Worley
http://www.rbisland.cx/
firstname.lastname@example.org
PGP: 0xBBD3B003

`When I use a word,' Humpty Dumpty said, `it means just what I choose
it to mean--neither more nor less.'
--Lewis Carroll
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:37 MDT