Re: Books on rationality

From: Gordon Worley (redbird@rbisland.cx)
Date: Wed Jun 05 2002 - 11:47:06 MDT


On Wednesday, June 5, 2002, at 12:42 PM, Ben Goertzel wrote:

> It seems that this is a matter of prioritizing short-term goals over
> long-term goals.
>
> To do so is not intrinsically irrational. What is irrational is
> *believing
> that one is acting optimally toward one's long-term goals*, whereas
> *actually* one is choosing one's actions toward one's short-term goals

Okay, I guess I failed to communicate that.

> It almost seems that what you're calling "rationalization" is what
> logicians
> call "abduction" or "hypothesis" -- one of the most important forms of
> reasoning.

I was worried when I was writing this that it might sound like that.
Guess I was right. ;-)

> I think that rationalization is, specifically, a *control mechanism for
> inference* that *specifically seeks to consider only evidence in favor
> of
> the truth of a certain proposition, rather than evidence against its
> truth.*
>
> It is not specifically tied to abduction, deduction, or any specific
> mode of
> reasoning, in my view. One can rationalize with pure deduction if one
> wishes.

Yes, but it's harder to rationalize with deduction. It's very easy to
rationalize with abduction, since abduction lets you pick whichever
explanation you like out of many candidates, while a deductive
conclusion is forced by the premises. That's why my example went the
way it did (looking back at what I wrote, I never really got out of
example mode when I started to explain rationalization, sorry).

> Also, I think that *rationalization itself* is a powerful heuristic for
> creative thought.
>
> However, really great creative thinkers may proceed by
>
> 1) rationalizing their way to an interesting conclusion, by careful
> inference control
>
> 2) then, AFTERWARDS, considering their conclusion in a more balanced
> way,
> looking at both positive and negative evidence equally
>
> In other words, I think that rationalization is an important part of the
> creative thought process, but it is dangerous if used alone, rather
> than as
> one stage in a multi-stage process

I think there's a difference between rationalization and *choosing* to
ignore some facts so that you can get a conclusion that you can then go
back and play with. For example, when programming, it's common to start
working on a program by assuming that the input is always in a specific
format and fits certain guidelines. Then, once you have the basic
algorithm working, you go back and generalize your algorithm to work
with more general input.
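
To make the analogy concrete, here's a minimal sketch in Python (the
function names and the comma-separated input format are illustrative
assumptions, not from any real program):

def parse_scores_strict(lines):
    # First pass: assume every line is exactly "name,score" with a
    # well-formed integer score.
    scores = {}
    for line in lines:
        name, score = line.split(",")
        scores[name] = int(score)
    return scores

def parse_scores_general(lines):
    # Later pass: generalize to messier input: stray whitespace,
    # blank lines, missing commas, non-numeric scores.
    scores = {}
    for line in lines:
        line = line.strip()
        if not line:
            continue                 # skip blank lines
        name, sep, score = line.partition(",")
        if not sep:
            continue                 # no comma; ignore the line
        try:
            scores[name.strip()] = int(score.strip())
        except ValueError:
            continue                 # non-numeric score; skip it
    return scores

The first version isn't rationalization; it's a simplification you
*know* you've made and intend to remove, which is exactly the
difference I'm pointing at.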

>> When you are trying to find a reason why your conclusion is true, you
>> begin to feel like you are straining to come up with support. Stop
>> right there. If you don't know, you don't know, so don't make
>> something up, because whatever you make up is not helpful and will
>> confuse matters, which is an irrational choice.
>
> On the contrary, making stuff up is great. The important thing is to,
> afterwards, judge what you've made up by a harsher criterion (a fuller
> consideration of evidence).

This is true. My main objective in this example was to point out that
you have to acknowledge that you don't know and realize the consequences
of making things up. I generalized too much, so here's the situation I
was thinking about when I wrote this:

You are explaining to your friend why you think that theory x is true.
In doing so, he asks a question that you realize you don't know the
answer to. Rather than admit that you don't know, you make up some
excuse on the spot that probably doesn't really hold up but, as you
see it, keeps the argument going.

Of course, later on you should sit down and try to figure out what an
answer to his question might be. Maybe it turns out he found the error
in theory x. Maybe you'll find an answer and it will provide even more
support for x. Maybe the question was irrelevant and your friend was
just trying to distract you because *he* had an emotional dislike for x
and didn't want to even think about reasons why it might or might not
be true.

> To say "It's more important to me to get laid tonight than to graduate
> college" is not illogical, it's just a goal weighting that many people
> consider normatively suboptimal...

This is what I discuss later on, but I think that any particular
example should be as self-contained as possible, so this one needs
some work.

> This is just silly... If I were given this choice and the odds of
> getting
> the extra $500 were 50%, I of course would take the 50% chance of an
> extra
> $500.

Okay, Eli pointed out I got the numbers all wrong. I remembered the
outcome of the situation, but not the details. Sloppy e-mail, as
usual. ;-) Against my better judgment, I tried to pound that e-mail
out in an hour late at night.
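
Whatever the right numbers were, the underlying point is just expected
value. A quick sketch in Python with made-up figures (emphatically
*not* the numbers from my original e-mail):

p = 0.5        # probability of winning the extra money
extra = 500.0  # size of the extra payoff, in dollars

expected_gain = p * extra
print(expected_gain)   # 250.0: the gamble beats any certain
                       # alternative worth less than $250, at least
                       # if you ignore risk aversion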

> I think that, in each case that you describe, the real irrationality
> comes
> down, not to incorrect inference on a micro level, but rather to
> *emotionally biased inference control.*

This is what I think, too, but I've gotten some bad reactions to
explaining the situation that way, so I took a different approach last
night. The people I've tried this explanation on either complain that
I want to eliminate *all* bias, which is certainly a good goal but not
possible depending on how broadly you define bias, or simply reject
the whole idea as impossible. That's why the book project will be laid
out something like this:

Operator's Manual for the Human Mind

I. The Human Brain
II. The Human Mind
III. Ways of Human Thought
IV. Rational Thought

By the time there's any talk of rationality, the reader will have some
understanding of how the human mind works *to the best of our
knowledge*, so that an explanation of rational thought won't sound
impossible.

> What "irrationality" is, in my view, is largely the following. One
> has a
> conclusion that would be emotionally gratifying if it were true (and
> one has
> a goal of gratifying the emotions in question). One then biases the
> information fed into one's logical inference component, in a specific
> way
> oriented toward making the desired conclusion come true. In a sense,
> this
> produces a divided mind, which "intentionally" (though perhaps
> unconsciously, still in a goal-oriented way) hides some knowledge that
> it
> has from some parts of itself.

We're in complete agreement here. As usual, when I let my fingers fly,
something not quite like what I have in mind comes out (combined with
my trying to explain from some angle other than my own understanding).

--
Gordon Worley                     `When I use a word,' Humpty Dumpty
http://www.rbisland.cx/            said, `it means just what I choose
redbird@rbisland.cx                it to mean--neither more nor less.'
PGP:  0xBBD3B003                                  --Lewis Carroll

