From: Ben Goertzel (firstname.lastname@example.org)
Date: Tue Apr 02 2002 - 07:17:37 MST
> Gordon Worley wrote:
> > Correct me if I'm wrong guys, but based on this terminology, here is the
> > difference between what Eliezer and Ben think. Eliezer claims that an
> > AI needs a brain that has the goal of Friendliness on top, hard coded
> > into place (with the usual exception that the spirit of Friendliness is
> > what is hard coded, not the letter of what Friendliness means). Ben,
> > though, thinks that the brain just takes care of very basic stuff and
> > the mind picks the goals. Or, more accurately, there is an extra goal
> > layer between brain and mind and this goal layer decides what the mind
> > can and cannot tell the brain to do, rather than having the brain do its
> > own mind take-over protection.
> Uh, I think this exactly misstates Ben's and my respective positions. From
> my perspective, the way that humans have evolved to offload so much moral
> functionality onto reason instead of the brain - by virtue of being
> imperfectly deceptive social organisms that argue about each
> other's motives
> in adaptive contexts - is a feature, not a bug, and one that
> takes a lot of
> work to duplicate. From my perspective, I worry that Ben seems to be
> proposing goals that are very close to the wiring level, whether they are
> "learned" or "preprogrammed".
> An AI needs a *mind* with Friendliness on top, *not* a brain with
> Friendliness on top.
Eliezer, I feel you are selectively quoting your previously stated positions.
What Gordon is referring to is your oft-made statement that a seed AI should
be created so that its internal goal system has Friendliness as a
*supergoal*. Indeed, according to your past writings this supergoal is
supposed to live at "mind level", not "brain level."
I have doubted whether Friendliness can/should be a supergoal supervening
over all other goals.
The question of how low-level the goals are is a different issue, which we
haven't discussed much. My view is that there are low-level goals, close to
the "wiring" level, which serve as seeds of a sort, around which
higher-level, more abstract and flexible goals form.
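To make the contrast concrete, here is a minimal sketch of the two goal
architectures being debated. All of the names here (the `Goal` class, the
specific goal labels) are illustrative assumptions of mine, not part of any
actual system under discussion:

```python
# Hypothetical illustration only -- the class and goal names are invented
# for this sketch, not taken from any real AI design.

class Goal:
    """A node in a goal hierarchy; children are subgoals."""
    def __init__(self, name):
        self.name = name
        self.children = []

    def add(self, name):
        child = Goal(name)
        self.children.append(child)
        return child

# Architecture 1: Friendliness as an explicit supergoal at the top;
# every other goal is justified only as a subgoal of it.
supergoal = Goal("Friendliness")
supergoal.add("learn about the world")
supergoal.add("self-improve")

# Architecture 2: low-level "seed" goals close to the wiring, around
# which higher-level, more abstract and flexible goals form over time.
seeds = [Goal("seek novelty"), Goal("maintain coherence")]
for seed in seeds:
    seed.add("abstract goal formed around: " + seed.name)
```

In the first sketch every goal inherits its justification from the top node;
in the second, the top-level structure emerges from the seeds rather than
being fixed in advance.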
-- Ben G
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:37 MDT