From: Gordon Worley (email@example.com)
Date: Thu Dec 26 2002 - 09:18:05 MST
On Thursday, December 26, 2002, at 04:28 AM, Samantha Atkins wrote:
> Gordon Worley wrote:
>> On Tuesday, December 24, 2002, at 04:59 PM, Samantha Atkins wrote:
>>> Gordon Worley wrote:
>>>> If we are in a simulation, the whole thing is probably hopeless.
>>> How so? It would seem to depend on the nature of the sim and what
>>> is possible within it, and perhaps beyond it, to us.
>> I feel like I am very real. While I have not tried myself (though I
>> probably should), Eliezer has asked to be let out and no answer came.
>> This leads me to believe that we are not living in a Friendly
> Illogical. Of course you feel real. You are perfectly real in the
> context of the sim. That you are not let out when you ask could mean
> a variety of things, like that you really aren't ready to be let out.
> It is not necessarily a sign of "unfriendliness".
I don't know about you, but if this is a simulation, it sucks and I
want out. It's not Friendly to let someone suffer when they don't have
to and have asked for the suffering to stop. I think, in this case,
it's more likely that our theory of Friendliness is correct than that
we are seeing "unfriendly" Friendliness.
>> If we are living in a simulation, then an Unfriendly Singularity
>> occurred, in which case I wouldn't put too much faith in still being
>> around when the simulation ends. So, in my opinion, if this is a
>> simulation, we're out of luck.
> Still doesn't follow.
In a Friendly Singularity, you would not be allowed to keep a mind that
wants out inside a simulation. Since we can't get out, if this is a
simulation, we are very likely not living in a Friendly Singularity.
If we are in an Unfriendly Singularity, the chance of anyone in it
being very friendly decreases (remember, most of the reason that humans
are friendly is evolution; you can still win, though, if you become
roughly 100 times the biggest kid on the block). Hence, lacking strong
indicators of high morals, and given that we're being forced into this
kind of suffering here, I doubt we'll be given some computronium and
told to go enjoy living in the Singularity when the simulation ends.
Also, this is not a logical proof; it's a series of probabilistic
calculations. In the end, P(living in a simulation) << P(not living in
a simulation), in my mind, anyway.
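The probabilistic reasoning above can be sketched as a toy Bayesian update. All the numbers below (the priors and likelihoods) are hypothetical placeholders I've made up to show the shape of the calculation, not values anyone in this thread has claimed:

```python
# Toy Bayesian sketch of the argument: evidence = "we asked to be let
# out of the simulation and nothing happened."  All numbers here are
# hypothetical assumptions for illustration only.

def posterior(prior, likelihood):
    """Normalize prior * likelihood over mutually exclusive hypotheses."""
    joint = {h: prior[h] * likelihood[h] for h in prior}
    total = sum(joint.values())
    return {h: p / total for h, p in joint.items()}

# Assumed priors over the three hypotheses.
prior = {
    "friendly_sim": 0.05,
    "unfriendly_sim": 0.05,
    "not_sim": 0.90,
}

# Assumed likelihood of "no answer" under each hypothesis: a Friendly
# simulator would (by this argument) release a mind that asks out, so
# the evidence is very unlikely under that hypothesis.
likelihood = {
    "friendly_sim": 0.01,
    "unfriendly_sim": 0.90,
    "not_sim": 1.00,  # no simulator exists, so no one could answer
}

post = posterior(prior, likelihood)
for h, p in sorted(post.items(), key=lambda kv: -kv[1]):
    print(f"{h}: {p:.3f}")
```

With these made-up numbers the update pushes weight away from "Friendly simulation" and toward "not a simulation", which is the shape of the conclusion above; different priors would of course give different posteriors.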
>>>> If we've already been contacted, human ideas of morality are 180
>>>> degrees off the mark.
>>> Again, how so? I can think of many things you might have had in
>>> mind but I am not sure what you were thinking of.
>> Human morality points in the direction of altruism (human morality
>> itself is not altruism, but approximates it). Altruism, as we
>> understand it, means helping people. As an example, the Prime
>> Directive is very selfish, since it's essentially preserving
>> societies as they are when they could help them live the good life
>> like everyone in the Federation for the sake of anthropological
>> study. If the Prime Directive is moral, our understanding of
>> altruism is completely off the mark.
> Well, a lot depends on what really is "helpful". It might be quite
> presumptuous to assume that critters that don't know how to help
> themselves know exactly what help would or would not look like from
> creatures more capable and intelligent. The Prime Directive is not in
> the least "selfish" unless you do not comprehend it. From your
> remarks, I think you do not. The Prime Directive is to keep from, at
> best, replacing a civilization with a remake of one's own and, at
> worst, planting the seeds of a civilization's destruction, sense of
> pointlessness, or other serious malaise by becoming known in the
> wrong way and at the wrong time.
I understand the Prime Directive, and I say that Captain Kirk could
have helped a lot more people by dropping a few replicators on
20th-Century-Earth-looking planets than by just ignoring them. I'm
aware that it's possible to do more harm than good, but anthropologists
have come a long way in helping the people they study move toward the
modern world. I think that deciding the best course of action to help
an underdeveloped civilization would be an easy task for an SI.
Maybe a Singularity can't get here to help us. Oh well, back to plan A.
> Also, I believe that civilizations, like individuals, sometimes need
> to make their own mistakes and learn from them or not. It is no
> kindness to attempt to free them from having to struggle and learn.
> This produces only a dependent person or civilization much too easily.
Such changes are not always bad. America was not brought up to its
current level of technology by a more advanced country, yet it is
highly dependent on the rest of the industrialized world. Dependence
is not necessarily a bad thing; it's a sign of an integrated system. I
think a large amount of strife comes from feeling that your tribe ranks
much lower than it once did, because it now sits inside a larger system
of tribes than it used to.
--
Gordon Worley <firstname.lastname@example.org>
http://www.rbisland.cx/
PGP: 0xBBD3B003
"Man will become better when you show him what he is like." --Anton Chekhov
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:41 MDT