From: John Robb (firstname.lastname@example.org)
Date: Tue Dec 10 2002 - 17:44:45 MST
> IQ is no longer defined as a ratio of mental to calendar age, and hasn't
> been for quite some time. Today IQ is actually *defined as* a normal
> distribution, with, if I recall correctly, 16 IQ points being equal to one
> standard deviation. That is, if you have an IQ of 132, it *means that*
> your test scores are in a percentile two standard deviations from the
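For what it's worth, the convention described in the quote is easy to check with a few lines of Python. This is only a sketch of that arithmetic, assuming a mean of 100 and the 16-point standard deviation the quote itself hedges on (some tests use 15):

```python
from statistics import NormalDist

# Assumed convention from the quote above: mean 100, SD 16 (Stanford-Binet style).
iq_dist = NormalDist(mu=100, sigma=16)

# An IQ of 132 is two standard deviations above the mean.
percentile = iq_dist.cdf(132)
print(f"IQ 132 is at roughly the {percentile:.1%} percentile")  # ~97.7%
```

With an SD of 15 instead, the same score lands a bit higher above the mean, which is why the convention matters when comparing scores across tests.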
Don't put too much emphasis on IQ. Here are some reasons why (I won't
explain the implications of many of these points; you are going to have to
figure those out on your own):
1) It doesn't accurately depict the value of the large "base" intelligence
you get for merely being human.
2) It is a relative measure. A car is a car. Some can go faster and some
can go slower. But all travel at a pretty amazing pace relative to...
3) Hard work often trumps IQ. In my experience, it is an 80/20 split in
favor of hard work.
4) Statistical measures of success (such as the stats that show people with
high IQs earn more money than people with low IQs) are biased due to the
fact that IQ (via the SAT and GRE) is used as a means of allocating scarce
resources (higher education at premium schools). A genetic fallacy. Coase
would have a field day here.
5) Common sense is a better survival trait than a high IQ (I used to be a
pilot -- we are all a little Darwinian in our outlook).
6) Education plays a major role in performance on IQ tests (oh, I have seen
this question before...)
There are a lot more.
Basically, this is an intro into why I think it is very difficult (nigh
impossible) to build an AI that is superior to a human (across the board).
There isn't a clear target to aim at, or a meaningful measure of success.
I am much more in favor of IA (intelligence augmentation) and letting an
entrepreneurial mind adapt to the new tools that are made available.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:41 MDT