Re: Happy Box

From: John K Clark (johnkclark@fastmail.fm)
Date: Mon May 05 2008 - 09:52:24 MDT


On Sun, 4 May 2008 "Matt Mahoney" <matmahoney@yahoo.com> said:

> I want to change my goal system so I am happy all the
> time no matter what happens. I know this is possible

I agree; it is possible.

> Will this make me more intelligent?

It will make you completely uninteresting to me or to the universe in
general; you will be interesting only to yourself.

  John K Clark

> --- John K Clark <johnkclark@fastmail.fm> wrote:
>
> > An agent that discovers new information but refuses to change its
> > goals merely (Merely!) because it is a means to an end, that is to
> > say it won’t change its goal system even though it now knows the
> > old structure won’t work but a new structure will, is about as
> > intelligent as a rock. And this is how you expect to make a super
> > intelligent slave? Good luck, you’ll need it.
>
> I discovered that my goal system won't work. For example, I am not
> happy if I don't eat, or if I am too hot, or too cold, or poked with
> sharp objects. I want to change my goal system so I am happy all the
> time no matter what happens. I know this is possible because I did
> simulations with http://www.mattmahoney.net/autobliss.txt with the
> second and third arguments both positive. Will this make me more
> intelligent?
>
>
> -- Matt Mahoney, matmahoney@yahoo.com
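Below is a minimal sketch, in Python, of the effect Matt describes. It is
not the actual autobliss.txt source; the names reward_right and reward_wrong
are illustrative stand-ins for the "second and third arguments". The point
is that when both rewards are positive, the agent is reinforced no matter
what it answers, so it locks onto whatever it happened to try first and
never learns the target logic gate.

    import random

    def train(target, reward_right, reward_wrong, steps=10000):
        # weights[(a, b)][out] = accumulated reinforcement for answering
        # `out` on input pair (a, b); the agent picks the higher-weighted
        # answer, breaking ties at random.
        weights = {(a, b): [0.0, 0.0] for a in (0, 1) for b in (0, 1)}
        for _ in range(steps):
            a, b = random.randint(0, 1), random.randint(0, 1)
            w = weights[(a, b)]
            out = 0 if w[0] > w[1] else 1 if w[1] > w[0] else random.randint(0, 1)
            correct = (out == target(a, b))
            # reinforce the chosen answer: positively if right, and with
            # reward_wrong (negative, zero, or positive) if wrong
            w[out] += reward_right if correct else reward_wrong
        # measure how often the trained agent answers correctly
        hits = sum(
            (0 if weights[(a, b)][0] >= weights[(a, b)][1] else 1) == target(a, b)
            for a in (0, 1) for b in (0, 1)
        )
        return hits / 4.0

    AND = lambda a, b: a & b
    print("reward +1 / punish -1:", train(AND, 1.0, -1.0))
    print("reward +1 / reward +1:", train(AND, 1.0, 1.0))

With reward +1 for right and -1 for wrong, the learner converges to the AND
function. With both rewards set to +1, each input pair gets stuck on its
first guess, so accuracy typically stays around chance even though every
single answer was "rewarded" - the agent is happy all the time and learns
nothing.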

-- 
  John K Clark
  johnkclark@fastmail.fm

