Re: SL4 meets "Pinky and the Brain"

From: James Higgins (jameshiggins@earthlink.net)
Date: Tue Jul 16 2002 - 10:58:51 MDT


Eliezer S. Yudkowsky wrote:
> James Higgins wrote:
>> Mike & Donna Deering wrote:
>> > a human. Or for that matter the status of an AI programmer. How do
>> > we know that Eliezer isn't trying to take over the world for his own
>> > purposes?
>>
>> Um, what else do you think Eliezer IS trying to do if not take over the
>> world for his own purposes? That may not be the way he states it but
>> that is obviously his goal. His previously stated goal is to initiate
>> the Sysop which would, by proxy, take over the world. He states that
>> this is to help all people on the Earth, but it is still him taking over
>> the world for his own purposes.
>
> James, this is pure slander. I've stated before and will state again
> that the Sysop Scenario was one bright stupid idea that I sincerely wish

Eliezer, I did not intend it as such. I believe what I wrote to be
factual; please note that I did not specify what your motives are. You
are, in effect, attempting to take over the world via an SI. This could
be a very good thing, depending on how your implementation turns out. I
apologize if you took it as an insult or anything of the sort.

> I'd never come up with, since it seems to take over people's minds
> faster than a transhuman AI with a VT100 terminal. The Sysop Scenario
> is the Singularitarian equivalent of the Sixties Syndrome or the
> Jetsons; futurism couched in human terms, easy to imagine and therefore
> probably flat wrong.

Even if the SI doesn't actually take over the world (as in locking down
all matter), it will be able to control the world due to its vastly
superior intellect. So, Sysop or not, it is still taking over the world,
so to speak. The difference (in my opinion) is that the Sysop scenario
would provide a nice, comfy cell for all intellects, whereas an SI that
didn't take full control could just be a helper.

> What does matter, and does not change irrespective of whether the result
> of seed AI is a Sysop Scenario or the first very powerful transhuman
> helping the world make a smooth transition to a civilization of equally
> powerful transhumans, is whether the mind you build has *your own* goal
> system, or whether the result is the same mind that would be built by
> *any* AI project which shares the moral belief that the outcome of the
> Singularity shouldn't be made to depend on who implements it. This
> holds true whether it works because of objective morality, or because of
> a "subjective" yet clearly widely shared belief that certain kinds of
> personal morality should *not* be transferred into the first transhuman
> AI. I clearly spend a great deal of time worrying about this. Ben, on
> the other hand, has stated in clear mathematical terms his intention to
> optimize the world according to his own personal goal system, yet you
> don't seem to worry about this at all. I'm not trying to attack Ben,
> just pointing out that your priorities are insane. Apparently you
> don't listen to what either Ben or I say.

For the record, I worry about all manner of things when it comes to the
Singularity, Nanotech, or the like. I also listen very closely to what
both you and Ben say.

This isn't the place to get into the details, but having conversed with
you and Ben for a while, I believe he is significantly wiser than you
(not necessarily more intelligent). Nothing personal, but you're ~20,
and virtually all people that age are full of themselves. It is a
perspective that changes with experience and, I believe, allows a person
to make significantly better judgments. I've said this before, but I'll
repeat it here: over the next 5-15 years you will mature, gain wisdom,
and shift your perspectives. Hopefully in 10 years' time I'll be as
comfortable with your decisions as I am now with Ben's (which is far
from 100%, btw).

Eliezer, please don't take this personally. You agree that a person
can't think like, or fully understand the thinking of, a person more
intelligent than themselves. Well, I believe that, in addition, a person
can't understand things in the same way as a person who has more life
experience (is wiser) than themselves. Pretty much everyone becomes
wiser with age (some more than others, obviously). Assuming you don't
keep yourself locked in a closet thinking about the Singularity, I'll
bet you that 10 years from now you'll agree with this statement. I
sincerely wish I could just explain/teach this to you but,
unfortunately, you just can't start to see over that hill until you've
climbed it for a while. And I expect that my perspective will also
change and mature over the next 10 years (it is a continual process -
unless you intentionally isolate yourself from it).

> Oh, well. I've been genuinely, seriously accused of trying to take over
> the world. There probably aren't many people in the world who can put
> that on their CV.

The way I intended it is that, essentially, anyone working on a seed AI
or the Singularity would qualify. Of course, that is still a very small
group of people...

James Higgins


