Re: Another Take on the Fermi Paradox

From: Samantha Atkins (samantha@objectent.com)
Date: Thu Dec 26 2002 - 02:28:47 MST


Gordon Worley wrote:
>
> On Tuesday, December 24, 2002, at 04:59 PM, Samantha Atkins wrote:
>
>> Gordon Worley wrote:
>>
>>> If we are in a simulation, the whole thing is probably hopeless.
>>
>>
>> How so? It would seem to depend on the nature of the sim and what is
>> possible within it, and perhaps beyond it, to us.
>
>
> I feel like I am very real. While I have not tried myself (though I
> probably should), Eliezer has asked to be let out and no answer came.
> This leads me to believe that we are not living in a Friendly simulation.
>

Illogical. Of course you feel real. You are perfectly real in
the context of the sim. That you are not let out when you ask
could mean a variety of things, like that you really aren't
ready to be let out. It is not necessarily a sign of
"unfriendliness".

> If we are living in a simulation, then an Unfriendly Singularity
> occurred, in which case I wouldn't put too much faith in being still
> around when the simulation ends. So, in my opinion, if this is a
> simulation, we're out of luck.
>

Still doesn't follow.

>>> If we've already been contacted, human ideas of morality are 180
>>> degrees off the mark.
>>
>>
>> Again, how so? I can think of many things you might have had in mind
>> but I am not sure what you were thinking of.
>
>
> Human morality points in the direction of altruism (human morality
> itself is not altruism, but approximates it). Altruism, as we
> understand it, means helping people. As an example, the Prime Directive
> is very selfish, since it's essentially preserving societies as they are
> when they could help them live the good life like everyone in the
> Federation for the sake of anthropological study. If the Prime
> Directive is moral, our understanding of altruism is completely off the
> mark.
>

Well, a lot depends on what really is "helpful". It might be
quite presumptuous to assume that critters who don't know how
to help themselves know exactly what help would or would not
look like from creatures more capable and intelligent. The
Prime Directive is not in the least "selfish" unless you do not
comprehend it. From your remarks, I think you do not. The
Prime Directive exists to keep from, at best, replacing a
civilization with a remake of one's own and, at worst, planting
the seeds of a civilization's destruction, sense of
pointlessness, or other serious malaise by becoming known in
the wrong way and at the wrong time.

Also, I believe that civilizations, like individuals, sometimes
need to make their own mistakes and learn from them or not. It
is no kindness to attempt to free them from having to struggle
and learn. That all too easily produces only a dependent person
or civilization.

- samantha



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:41 MDT