From: Ben Goertzel (email@example.com)
Date: Wed Dec 13 2000 - 12:10:30 MST
> Under routine circumstances, however, the verbal thoughts are in immediate
> control. (Note that I do not say the conscious mind is in control, since
> emotions can also exert major influence over verbal thoughts.) With
> respect to long-term goals, verbal thoughts are in effectively complete
> control.
I just don't believe you. This is not how my mind feels like it works, nor
does it explain how preverbal children or chimpanzees can do complicated
things....
> > But I was noting one plus: it integrates relatively useful goal systems
> > all through our minds in subtle & complex ways.
> The return on integration is scarcely greater than the investment in
> instinct... maybe less.
I don't know how to make this kind of judgment. But my practical work
building an AI has given me a lot of respect for what evolution has
achieved in integrating goals throughout a complex self-organizing control
structure.
> > If AI systems don't evolve, they'll have to get this some other way,
> > that's all. It's far from impossible.
> What AIs need is the correct decision, the Friendliness. Why do they need
> the tangle to get it? What's wrong with supergoal and subgoal?
Friendliness is not, in reality, a goal that can be expressed by a simple
formula. Like all real concepts, it's a big, complicated mess that can only
be mastered through experience. Gathering this experience, given finite
resources, requires a complex mind in which various goals guide a network
of processes.
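A minimal sketch (mine, not from the email) of the two architectures being debated: a strict supergoal/subgoal tree, where every subgoal exists only to serve one supergoal, versus the tangled network described above, where several goals jointly weight each cognitive process. The goal and process names are purely illustrative.

```python
# Strict hierarchy: one supergoal, each subgoal serves it directly.
supergoal = "friendliness"
subgoals = {"learn_ethics": supergoal, "model_humans": supergoal}

# Tangled network: each process is influenced by several goals at once,
# with weights shaped by experience rather than fixed by decree.
goal_weights = {
    "choose_action": {"friendliness": 0.6, "curiosity": 0.3,
                      "self_preservation": 0.1},
    "allocate_attention": {"curiosity": 0.5, "friendliness": 0.5},
}

def dominant_goal(process):
    """Return the goal with the largest influence on a process."""
    weights = goal_weights[process]
    return max(weights, key=weights.get)

print(dominant_goal("choose_action"))  # friendliness
```

The point of the contrast: in the tree, removing the supergoal removes the system's entire motivation; in the network, influence over behavior is distributed and no single formula captures it.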
As we all know from our real-world lives, ethical behavior isn't about
mastering a rule or having a simply-declared belief; it's an attitude that
has to permeate our being to be really useful in guiding our actions.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:35 MDT