From: Eliezer S. Yudkowsky (firstname.lastname@example.org)
Date: Wed Sep 18 2002 - 11:07:00 MDT
Samantha Atkins wrote:
> I don't think so. If you are having trouble keeping yourself alive
> because of grossly inadequate nutrition, shelter and very spotty water
> supplies of poor quality then your low-level survival drives will, quite
> rightly, swamp your higher level needs for a time. This would be true
> of any conceivable sentient that had low-level survival minimal
> requirements. So please come down to earth just a bit, if you will.
Um... no. I refuse. Life on Earth just before the Singularity should be
about the Singularity because the Singularity is what matters for the next
billion years. This doesn't change regardless of where you live, although
the probability that you are likely to *know* about the Singularity varies
strongly with geographic location.
Now it may be that low-level needs place you at subsistence level and
leave very little free energy, but if you know about the Singularity, you
should still be concerned with it, with whatever free energy you have.
And it may even be that being placed in certain environments is enough to
disrupt or destroy an altruistic goal system. But in this case I would
consider, not quite that "you" have died, but that you have been
transformed into something whose philosophical judgments you should no
longer take into account.
Dedication isn't a trivial thing, Samantha. I don't know how strong my
will is. I don't know what it would take to break my will, whether it's a
little or a lot of adversity. But it doesn't matter. A broken Eliezer,
however little or however much it takes to break Eliezer, is not relevant
to this discussion. While my will continues it will be devoted to the
Singularity because that is the pretty darned unambiguous outcome of
rational altruism. And if I could continue that under whatever conditions
of adversity, I would be personally proud of it - although obviously
Singularity considerations dictate that I should try to encounter as
little adversity as possible; this isn't about pride.
"If Brahma has me burnt, I will spit into the flames. If he has me
strangled, I will attempt to bite the executioner's hand. If my throat is
cut, may my blood rust the blade that does it."
-- Sam, in Roger Zelazny, "Lord of Light"
I don't know if I could, but I know that I *should*, remain focused on the
Singularity on the way to my execution, and shout out my last thoughts
about Friendly AI to the firing squad to take advantage of the very small
but nonetheless real probability that someone will remember them and put
them online. Not out of defiance; because that is, under those
hypothetical conditions, the best remaining pathway to helping humanity
and humanity's future. It's not a *good* pathway, but it's what's
available under those conditions. And I do not deny the possibility of
that kind of dedication to any of the six billion people on this Earth,
regardless of what conditions they live under.
I think you do humanity a disservice if you suppose that human beings are
capable of altruism only under comfortable conditions. Maybe it's true of
me personally. If I do my job competently, it will never be put to the
test. But if I do fail that test, that's a flaw in me, not something that
changes the correct course of action.
>> Why is it, Ben, that you chide me for failing to appreciate diversity,
>> yet you seem to have so much trouble accepting that this one person,
>> Eliezer, could have an outlook that is really seriously different than
>> your own, rather than some transient whim? I don't have any trouble
>> appreciating that others are different from me, even though I may
>> judge those differences as better or worse. You, on the other hand,
>> seem to have difficulty believing that there is any difference at all
>> between you and someone you are immediately talking to, regardless of
>> what theoretical differences you might claim to believe in or respect.
> Why is it that you are beginning to take this attitude of being above it
> all and almost of a different species from the rest of us? As
> wonderfully bright and dedicated as you are I don't believe that this is
> justified, at least not yet.
Why not say: "No matter *how* bright and dedicated you are, or aren't,
that attitude would *never* be justified." This helps to avoid debate
about side issues.
>> Suppose that I did tend to focus more on material poverty if I were
>> experiencing it. That supervention of my wired-in chimpanzee
>> priorities is not necessarily more correct.
> If it is the difference between life and death, then it is higher
> priority in that it is prerequisite to the rest of your goals. There
> must be enough surplus of energy beyond what is needed to survive and
> accomplish some basic functionality before higher goals can be
> addressed. Many people in this world do not have that much today. That
> is also potentially many brains of good potential that are never utilized.
I agree. This is yet another problem that can best be fixed via (drum
roll) the Singularity. That's the most effective, most efficient, fastest
way that I can put any given amount of effort into solving that problem.
If I do it some other way, I fail. If I do it some other way because of a
reason other than my anticipation of maximum benefit to those people, I
fail in altruism.
>> I might as well say to some Third
>> Worlder "You might consider material poverty less serious if you lived
>> here." For that matter, I could also be tortured until I considered
>> ending the pain to be the most important thing in the universe. So
>> what? What does this have to do with the price of tea in China, or to
>> be more precise, the Bayesian Probability Theorem?
> Puh-leze. Can you manage to address anything directly without dragging
> out BPT?
Nay, surely not, for the BPT lies at the very foundations of the universe.
>> How do any of these things change the facts? In what way are they
>> "evidence" about the issue at hand? I run on vulnerable hardware with
>> known flaws, such that there are certain environmental stimuli that
>> would produce very-high-priority signals capable of disrupting more
>> rational means of aligning subgoals with supergoals; environmental
>> stimuli may even result in negative or positive reinforcement in
>> sufficient amounts to overwrite the current goal system. Again, so
>> what? That's just a broken Eliezer, not an enlightened Eliezer.
> The so what is that many billions of people live today in just such
> conditions.
Yes. Six billion, to be precise.
> Do you feel some empathy for them? Is helping them part of
> what drives your work?
It's a big part of it. I care about the vastly greater future as well.
>> "Vastly"? I think that word reflects your different perspective (at
>> least one of us must be wrong) on the total variance within the human
>> cluster versus the variance between the entire human cluster and a
>> posthuman standard of living. I think that the most you could say is
>> that some humans live in very slightly less flawed conditions than
>> others. Maybe not even that.
> Your perspective includes hypotheticals not currently in existence.
Um... yeah. And this is a bad thing because...
> Given current existential conditions, some humans live in vastly more
> flawed conditions than others.
Adversity isn't a relative quantity, at least not as I measure things.
Whether you're doing well or poorly doesn't depend on whether someone else
has more or less. Right now all known sentient beings live under
conditions of tremendous adversity.
>> > As a person of great material privilege, you are inclined to
>> > focus primarily on the limitations and problems we all share.
>> As a student of minds-in-general, I define humanity by looking at the
>> features of human psychology and existence that are panhuman and
>> reflect the accumulated deep pool of complex functional adaptation,
>> rather than the present surface froth of variations between cultures
>> and individuals.
> Hmmm. That froth is where the people live!
No, that froth is what the people focus on, because those variances are
the ones which are adaptively relevant to *differential* reproduction and
hence perceptually salient. If there exists a race which has evolved to
be six miles tall because height is the most powerful evolutionary
advantage, they will, among themselves, focus on the remaining six inches
worth of variance. Hence the IQ wars.
>> > Of course, I agree with you that creating a superhuman AGI can be a
>> > great way to end material poverty as well as to overcome the many
>> > self-defeating characteristics of human nature.
>> It's a way to rewrite almost every aspect of life as we know it. You
>> can take all the force of that tremendous impact and try to turn it to
>> pure light. You can even hypothesize that this tremendous impact,
>> expressed as pure light, would have effects that include the ending of
>> fleeting present-day problems like material poverty. But it is unwise
>> in the extreme to imagine that the Singularity is a tool which can be
>> channeled into things like "ending material poverty" because some
>> computer programmer wants that specifically.
> I think you are attempting to turn yourself into a FAI disconnected from
> your own humanity. I am not at all sure this is a good thing.
1: What the heck *else* do you think I've been trying to do with my mind
for the last six years?
2: If you think that is in any wise, shape, or form a bad thing, you must
have an extremely different conception of FAI than I do.
3: You've never met an FAI or you wouldn't say that. (Neither have I, of
course, but I have a good imagination.)
4: If it's good enough for an FAI, it's good enough for me. The only
question is whether *I* can manage it.
As for being "disconnected from my own humanity"... is this a generic way
of saying "different from what you were last week"? And if not,
disconnected from humanity and connected to what, exactly?
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence