RE: Books on rationality

From: Ben Goertzel (ben@goertzel.org)
Date: Wed Jun 05 2002 - 10:42:10 MDT


> You are in college and need to take a class in general relativity. You
> notice, though, that there are lots of members of the opposite sex in
> an oral communication class you could take instead. In fact, there's one
> person in there you like a lot and would like to get to know better.
> Taking the oral communication class will set you back in your academic
> career. Assuming that you are in college to learn and get some kind of
> degree, choosing to take it because it might let you get to know someone
> better is an irrational decision.

It seems that this is a matter of prioritizing short-term goals over
long-term goals.

To do so is not intrinsically irrational. What is irrational is *believing
that one is acting optimally toward one's long-term goals* when *actually*
one is choosing one's actions toward one's short-term goals -- i.e., fooling
oneself by ignoring evidence.

> You have a job providing tech support. After many days working and
> thinking about how bad your job is when you get home, you realize that
> Grand Theory X would justify you being rude to the customers and you'd
> be a lot happier. The first trap of irrationality is acting on Grand
> Theory X without having proven it. If you get past that, there are
> others. So, you are sitting around trying to prove Grand Theory X and
> start to come up with reasons that it must be true. After a while you
> have a whole bunch of reasons that, while they sound a bit implausible
> at times, justify Grand Theory X in your eyes. That's rationalization,
> another form of irrational thought. Telling the difference between
> rationalization and a logical progression of ideas is easy if you know
> what to look for. If you are rationalizing, you start with a conclusion
> and seek out evidence to support that conclusion. Unless you have a
> well-trained mind, you'll ignore the counter-evidence and conclude that
> your conclusion is in fact true. Logical thought occurs when you have
> some evidence and you derive a conclusion from it. Or, possibly, you
> have a conclusion and some evidence, and after sitting around for a
> long time, thinking hard, and trying to come up with counterexamples
> and failing, you find that the evidence really does support
> some conclusion (more likely a slightly different conclusion than what
> you started with).

Hmmmm....

It almost seems that what you're calling "rationalization" is what logicians
call "abduction" or "hypothesis" -- one of the most important forms of
reasoning.

A great many scientific theories begin with someone positing a theory
they'd like to be true and coming up with reasons why it might be true.
This is abduction, in essence.

You are championing deductive inference, which is just one among several
modes of inference. (And when probabilistic evidence is involved, deduction
is not *absolutely* certain either, although it tends to yield
greater-confidence conclusions than induction or abduction.)

I think that rationalization is, specifically, a *control mechanism for
inference* that *seeks to consider only evidence in favor of the truth of a
certain proposition, rather than evidence against it.*

It is not specifically tied to abduction, deduction, or any specific mode of
reasoning, in my view. One can rationalize with pure deduction if one
wishes.

Also, I think that *rationalization itself* is a powerful heuristic for
creative thought.

However, really great creative thinkers may proceed by

1) rationalizing their way to an interesting conclusion, by careful
inference control

2) then, AFTERWARDS, considering their conclusion in a more balanced way,
looking at both positive and negative evidence equally

In other words, I think that rationalization is an important part of the
creative thought process, but it is dangerous if used alone rather than as
one stage in a multi-stage process -- see the toy sketch just below.
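
To make that concrete, here is a toy Python sketch of the two stages. (The
Evidence class, the sample data, and the scoring rule are all invented for
illustration; this is a caricature, not anyone's actual cognitive model.)

    from dataclasses import dataclass

    @dataclass
    class Evidence:
        claim: str
        supports_x: bool  # does this datum favor the cherished conclusion X?

    def rationalize(evidence):
        # Stage 1: biased inference control -- admit only the supporting
        # data, and ride it to an interesting conclusion.
        return [e for e in evidence if e.supports_x]

    def evaluate(evidence):
        # Stage 2: balanced evaluation -- weigh support and counter-evidence
        # equally, over ALL the data this time.
        pro = sum(e.supports_x for e in evidence)
        return pro - (len(evidence) - pro)

    data = [Evidence("customers are rude first", True),
            Evidence("rudeness escalates conflicts", False),
            Evidence("venting feels good", True)]

    print(rationalize(data))  # stage 1: only the flattering subset survives
    print(evaluate(data))     # stage 2: net score over everything, here +1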

> When you are trying to find a reason why your conclusion is true, you
> begin to feel like you are straining to come up with support. Stop
> right there. If you don't know, you don't know, so don't make something
> up, because whatever you make up is not helpful and will confuse
> matters, an irrational choice.

On the contrary, making stuff up is great. The important thing is,
afterwards, to judge what you've made up by a harsher criterion (a fuller
consideration of the evidence).

> You have a big test tomorrow and need to study for it. Right in the
> middle of studying, your girlfriend shows up and wants your attention
> (ahem!). The rational choice is to keep studying, because in the long
> run doing well on a test and learning material are more important than
> making out.

Again, this is not intrinsically irrational; it's a weighting of short-term
over long-term goals.

It's irrational only if one fools oneself as to what one is doing.

To say "It's more important to me to get laid tonight than to graduate
college" is not illogical, it's just a goal weighting that many people
consider normatively suboptimal...

> You are playing a gambling game. You have $500. First you are given a
> choice: you can either be given another $100 or you can try to win $500
> more, but if you don't win, you get nothing. It doesn't matter what the
> odds are; $100 is almost always the more rational choice because it is
> guaranteed, and most people will pick that one.

This is just silly... If I were given this choice and the odds of getting
the extra $500 were 50%, I of course would take the 50% chance of an extra
$500.
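
Spelling out the arithmetic for that case:

    EV(gamble)     = 0.5 * $500 + 0.5 * $0 = $250
    EV(sure thing) = $100

For a risk-neutral agent the break-even probability is $100/$500 = 20%;
only below that, or under a sharply risk-averse utility function, is the
guaranteed $100 the better pick.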

Furthermore, your statement about what most people will do is contradicted
by observed behavior on TV game shows ;>

I think that, in each case that you describe, the real irrationality comes
down, not to incorrect inference on a micro level, but rather to
*emotionally biased inference control.*

What "irrationality" is, in my view, is largely the following. One has a
conclusion that would be emotionally gratifying if it were true (and one has
a goal of gratifying the emotions in question). One then biases the
information fed into one's logical inference component, in a specific way
oriented toward making the desired conclusion come out true. In a sense, this
produces a divided mind, which "intentionally" (though perhaps
unconsciously, still in a goal-oriented way) hides some knowledge that it
has from some parts of itself.
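
As a toy numerical illustration (the likelihood ratios below are invented;
the point is only that the inference engine itself can be perfectly sound
Bayesian updating while the bias lives entirely in the gatekeeping of
evidence):

    def posterior(prior, likelihood_ratios):
        # Odds-form Bayes: multiply prior odds by each likelihood ratio
        # P(e | X true) / P(e | X false), then convert back to a probability.
        odds = prior / (1 - prior)
        for lr in likelihood_ratios:
            odds *= lr
        return odds / (1 + odds)

    # Five observations bearing on the gratifying conclusion X:
    # ratios > 1 favor X, ratios < 1 count against it.
    all_evidence = [3.0, 0.2, 2.0, 0.25, 4.0]

    honest = posterior(0.5, all_evidence)                            # ~0.55
    divided = posterior(0.5, [lr for lr in all_evidence if lr > 1])  # ~0.96

    print(f"posterior on all the evidence:  {honest:.2f}")
    print(f"posterior on filtered evidence: {divided:.2f}")

Same prior, same update rule; the only difference between 0.55 and 0.96 is
which evidence the divided mind lets through to its own inference component.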

-- Ben G


