Rationality and Goals (was Books on rationality)

From: Lee Corbin (lcorbin@tsoft.com)
Date: Sun Jun 09 2002 - 17:54:49 MDT


Mike writes

> Whether a goal is short term or long is not the determining factor of
> precedence. Most things of great value require great effort to achieve;
> therefore, *MOST* long-term goals are more important than short-term ones.

Yes, given the meanings we assign these terms, that's how we've
come to view the situation, or so it seems to me.

> But term length or effort is not the determining factor of
> precedence; [rather] the value of the goal is. [Yes]

Gordon writes
> [Lee wrote]
> > I'm very skeptical that it is even possible, let alone
> > wise, to *always* resist short term temptation. (Strange
> > that evolution never stumbled on endowing some creature
> > with only long term goals.)

Mikko corrected that with

> Not that strange, since thinking in terms of long term goals requires a
> comparatively sophisticated mind, and that's hard enough to evolve as it is.

Yes, although 200,000 years does seem a long time for long-term
goals not to have crowded out short-term motivations in humans.
Still, you could very well be right.

Gordon writes

> There's an over-emphasis here on short term goals and long term goals.
> It's a matter of evaluating what will happen in the future as a result
> of an action and determining which tradeoff is better.

Seems abundantly well put to me.

> In this example, the short term tradeoff is either that I get
> the girl and take an easy class or I learn something interesting
> and take the hard class. Long term, I have a very low chance of
> gaining a lifelong friend or a high chance of learning something
> which will be useful.

I think I understand; but implicitly there are three goals here:
getting the girl (and satisfying your system's incentive to
reproduce?), learning something interesting, and gaining a friend.
It sounds as though the second and third are long term, and in your
value system pursuing them also appears, for some reason, "rational".
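
To make the tradeoff arithmetic concrete, here is a minimal sketch in
Python of the expected-value comparison I take you to be describing.
The probabilities and utility numbers below are entirely my own
invention, just to illustrate the form of the calculation:

# Expected-value comparison of the two choices. All probabilities
# and utilities are made-up illustrations, not values Gordon gave.

def expected_value(outcomes):
    """Sum of probability * utility over the possible outcomes."""
    return sum(p * u for p, u in outcomes)

easy_class = [
    (0.05, 100.0),  # small chance of gaining a lifelong friend
    (0.95, 5.0),    # otherwise, just an easy class
]
hard_class = [
    (0.80, 40.0),   # high chance of learning something useful
    (0.20, 10.0),   # the class turns out not to be useful
]

print("easy class:", expected_value(easy_class))  # 9.75
print("hard class:", expected_value(hard_class))  # 34.0

On numbers like these the hard class wins, but of course everything
turns on the utilities one plugs in.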

> Furthermore, it's not ignoring all short term goals, but those goals
> that result from human biases. For example, I have two books lying on
> my desk. One is a collection of short stories by PKD and the other is
> on system software (the programs that make computers go). How I decide
> which to read will be based on a large number of factors, but none of
> them relate to my ability to reproduce (even if improved reproductive
> fitness results, I give it a weighting of zero).

The only way I can parse the phrase "ability to reproduce" in that
last sentence is by connecting it with my conjecture above: that by
"getting the girl" you meant satisfying your instinct to reproduce
biologically.
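
If I read you right, the decision between the two books amounts to a
weighted sum over factors, with the reproductive-fitness factor
assigned weight zero. A toy sketch in Python; the factor names,
scores, and weights are all invented, and the only thing taken from
your description is the zero weight:

# Weighted-factor scoring of the two books. Factor names, scores,
# and weights are illustrative assumptions only.

weights = {
    "interest": 1.0,
    "usefulness": 1.5,
    "reproductive_fitness": 0.0,  # deliberately zeroed out
}

books = {
    "PKD short stories": {"interest": 9, "usefulness": 3,
                          "reproductive_fitness": 2},
    "system software":   {"interest": 6, "usefulness": 9,
                          "reproductive_fitness": 0},
}

for title, scores in books.items():
    total = sum(weights[factor] * score
                for factor, score in scores.items())
    print(title, total)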

I suspect that you and others here have adequately answered questions
like the following before, but I could still use some help: what makes
some of those goals rational and not others? You clearly identify
with the long-term institution Gordon Worley, and not with the
subsystems that you almost never give in to (by your previous claims).
I'm still quite murky on how rationality fits into all this.

(I will post an appreciation of Ben's characterizations of
"rationality" soon.)

Thanks,
Lee


