Re: AGI funding (was Re: Some bad news)

From: Slawomir Paliwoda (velvethum@hotmail.com)
Date: Sun Nov 10 2002 - 06:01:26 MST


> You're placing too much trust in your moral intuitions.

That's the only heuristic I have. Besides, it's more like common sense.
Perhaps I will acquire some new and better heuristic in the future, but for
now I'm stuck with this one, which in this case tells me very clearly what
needs to be done.
> Generally, if a
> moral compromise instinctively seems like a good idea, it's because in the
> ancestral environment that moral compromise would have promoted your
> *personal* reproductive fitness.

I don't doubt that, except I don't see any compromise here.

> It is not a coincidence that the moral
> compromises that seemed to Stalin to promise the greatest good for the
> greatest number ended up with Stalin as tribal chief and all of the
> supposed beneficiaries miserable.

It's convenient to apply the principles of communism to the morality of its
leaders, but I'm afraid that Stalin didn't start his "career" with the aim
of providing the greatest good for the greatest number. Anyway, in the
absence of a compromise, discussing this further would be just an
unnecessary distraction.

> It seems to your moral intuitions like compromising the message at the
> heart of the Singularity seems like a good idea, something that would work
> to promote the Singularity, and certainly not anything that you are doing
> for the sake of personal fame.

Promotion of the Singularity is not important. Is that shocking? It
shouldn't be, because this ought not to be about the Singularity. Devoting
any more time to this empty concept would be like those heuristics (which I
vaguely remember) that assigned the highest value to themselves. This is not
even about AGI/FAI, but about what humanity might be like in the future. If
your microwave oven could do exactly the same things for humanity that FAI
could, then the goal should still remain the same. You would then be
concerned with microwave ovens, and only afterward, maybe, with the
Singularity; and if you accomplished your goal without a technological
Singularity, why would you still care about that event? Now, assume you
can't talk about the Singularity at all, and think of your message. Would it
still make sense? It should. Otherwise it needs adjustment.

> Why does it seem like a good idea? Is it
> an empirical generalization from the history of postagricultural
> societies? Are you modeling the detailed effect of your moral compromise
> on millions of interacting people in order to predict the outcome of a
> complex social and memetic system? Of course not. It seems like a good
> idea because fifty thousand years ago, people who thought it was a good
> idea tended to end up as tribal chiefs. In the domain of politics, a
> means to an end intuitively seems like a good idea to the extent that
> carrying out that means would have served the purposes of your genes in a
> hunter-gatherer tribe, not to the extent the means would achieve its
> supposed end in our far more complex culture.

So this is why. But what if this is the only right way? Then you would have
one answer with two solutions. Just because one solution is implied to be
derived from "bad" sources doesn't make the same answer, which could be
obtained by following the other solution, any less right. Again, another
distraction.

> It *is* a famous empirical generalization from the history of
> postagricultural societies that people who start out by making moral
> compromises in the service of their ideals usually end up not
> accomplishing anything toward those ideals, although their adaptations may
> (or may not) operate in accordance with ancestral function to place them
> in positions of personal benefit.

There's a mechanism of attachment by which the author's mind forms a strong
bond to his ideas, comparable to the instinct between a parent and a child.
The connection is so strong that severing it would be considered harmful to
the whole existing structure. Meanwhile, this is just an illusion, but the
mind is unable or unwilling to notice it, so it tries to protect the bond at
all costs, even though it might have the capability to observe that whatever
it is connected to so strongly might be just empty space.

Slawek
