From: Michael Vassar (firstname.lastname@example.org)
Date: Fri Aug 11 2006 - 01:20:49 MDT
Err...Martin. Michael was talking about using nanotech to duplicate people,
not time travel or cloning.
>From: "Martin Striz" <email@example.com>
>Subject: Re: Maximize the renormalized human utility function!
>Date: Fri, 11 Aug 2006 02:53:29 -0400
>On 8/11/06, Michael Anissimov <firstname.lastname@example.org> wrote:
>> > Well, only if you completely ignore the effect of the environment on the
>> > individual and all the other consequent effects of that idea.
>>What? If I copy the most benevolent person I know, then they will be
>>benevolent, no matter the environment. Humans are flexible - they
>>adjust. Someone won't automatically turn evil just because they're
>>placed in a slightly different environment.
>I bet they would if, for example, you abused or tortured them,
>especially in childhood. They may not be "evil," but they would be so
>psychologically damaged that they wouldn't be the most benevolent
>person you know anymore - perhaps not benevolent at all.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:57 MDT