From: Eliezer S. Yudkowsky (firstname.lastname@example.org)
Date: Tue Apr 29 2003 - 15:11:26 MDT

Simon Gordon wrote:
> --- "Eliezer S. Yudkowsky" wrote:
>> Yes, the trouble with Tegmark is that it can drive people insane
>> unless they understand how decision theory transfers over into very
>> large or infinite universes. You cannot, of course, cause evil,
>> monstrous acts to stop existing. Everything exists. However, you can
>> try to give evil, monstrous acts a very low measure - decrease their
>> subjective probability in the futures of most sentients.
> Indeed. I expect in the future, many beings will want brain implants
> (or additional software algorithms if they happen to be uploaded),
> which will block their knowledge and understanding of Tegmark's Level
> IV multiverse - perhaps just for some temporary comfort and peace of
> mind, or perhaps it will be necessary for very advanced beings in order
> that they can remain sane while maintaining their useful mainstay
> emotions such as compassion and empathy.

I don't think that compassion and empathy are inherently irrational. They
may have irrational human implementations, but that doesn't mean that
anything essential or valuable about them would be lost if they had
rational humane implementations. What does it mean to have a rational
implementation of an emotion? To a first approximation, it means that the
emotion does not require believing in false statements.

So why would compassion and empathy require me to disbelieve in Tegmark's
Level IV, assuming it to be true?

>> What matters is not whether something "exists" or "does not exist"
>> but its subjective conditional probability.
> Yes, to YOU, the subject, those probabilities matter. But the point is
> that those horrible things DO and MUST happen, and however small the
> quantity of those subjective scenarios relative to the other scenarios,
> that fact doesn't help those poor beings that have to endure the
> unimaginable tortures and suffering that necessarily occur as the
> theory predicts.

And pretending they don't exist is a better way to help them? Pretending
they don't exist means you have no way of helping them at all.

Rationality is rationality, truth is truth, even in the face of hell. If
there's a real but small measure of irreducible hell, then that is the
truth and always was. There is nothing empathic or compassionate about
disbelieving it; that doesn't help anyone.

Or perhaps I wasn't clear on what I meant by "subjective conditional
probability"; I was speaking of the subjective conditional probability of
hell in the lives of all sentients, in every thread in the entire web of
qualia throughout Level IV, not just my own life. There may be an
irreducible minimum of inescapable hells, but if so, compassion is
reducing every reducible probability of hell that correlates with one of
your decisions. Which you can only do if you model the dependency. Which
you can only do if you model hell. Accurately.

--
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:42 MDT