Re: Risk, Reward, and Human Enhancement

From: Robin Gane-McCalla (robinganemccalla@gmail.com)
Date: Wed Jan 02 2008 - 15:50:47 MST


I think the whole premise of this is a bit messed up because:

1) We aren't really defining IQ. There are many different IQ tests out
there, and none of them are very accurate at predicting things like
future income or contributions to society. If by IQ you mean general
intelligence, I think you'd need to define it better, and I don't
believe it can be measured by a single number.
2) There's no convincing argument that one person with a high IQ is better
than many people with lower IQs. While some people think great
intelligences like Einstein have made huge differences in the
world, people like Einstein wouldn't be possible if it weren't for
people who were slightly less intelligent but who were able to
understand and apply Einstein's theories. If Einstein had been more
intelligent, and everybody else less intelligent, then people wouldn't
have understood his ideas, and they might even have killed him for
such heresy, as they did earlier scientists.
3) The whole situation is contrived and, in my opinion, unlikely to
occur. Even if there were some drug that increased intelligence in
some people and decreased it in others, the intelligent thing to do with
it would be to test the brains (via MRI, EEG, etc.) of all the people and
find out what neurological factors caused their intelligence to
increase or decrease.

I think the age of the lone super-scientist who discovers some amazing
law that revolutionizes the world is over. Real scientific advances
will be made by groups.
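
For what it's worth, here's a quick back-of-the-envelope sketch in Python of
the arithmetic in the scenario quoted below. The numbers (90% chance of -5
points, 10% chance of +10, 10,000 volunteers, re-dose only the ones who
gained) come straight from Byrne's setup; the simulation itself is just my
own illustration, not anybody's actual proposal.

import random

P_GAIN, GAIN, LOSS = 0.10, 10, -5

# Expected change per dose: 0.9 * (-5) + 0.1 * (+10) = -3.5 points,
# which is where the "lose 3.5 IQ points" figure below comes from.
expected_per_dose = (1 - P_GAIN) * LOSS + P_GAIN * GAIN
print("expected IQ change per dose:", expected_per_dose)

def run_cascade(n_volunteers=10_000, seed=None):
    """Dose everyone once, then keep re-dosing only the people who gained,
    until a round produces no winners. Returns (winning_rounds, total gain
    of the luckiest surviving volunteer)."""
    rng = random.Random(seed)
    survivors = n_volunteers   # people still being re-dosed
    total_gain = 0             # cumulative gain of the current survivors
    rounds = 0
    while True:
        winners = sum(1 for _ in range(survivors) if rng.random() < P_GAIN)
        if winners == 0:
            break
        survivors = winners
        total_gain += GAIN
        rounds += 1
    return rounds, total_gain

rounds, best = run_cascade(seed=0)
print(f"{rounds} winning rounds; luckiest volunteer is up {best} IQ points")

Running this, the expected change per dose comes out to -3.5 points
(Samantha's figure), and the winnowing usually peters out after about four
winning rounds, i.e. one very lucky volunteer up around 40 points while the
9,000-odd first-round losers are each down 5.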

On Dec 31, 2007 4:01 PM, Samantha Atkins <sjatkins@gmail.com> wrote:
>
>
> On Dec 13, 2007, at 7:06 PM, Rolf Nelson wrote:
> > On Dec 5, 2007 11:54 AM, Byrne Hobart <sometimesfunnyalwaysright@gmail.com>
> > wrote:
> >
> > > As we get better at directly manipulating human abilities, we're probably
> > > going to encounter situations in which a treatment has uncertain effects.
> > > Consider a new intelligence enhancement drug that, in clinical trials, has
> > > been shown to reduce IQ by 5 points 90% of the time, and raise it by 10
> > > points 10% of the time (and can be repeated indefinitely). For an
> > > individual, this is a pretty bad deal -- but get a group of 10,000 devoted
> > > singularitarians, have each one take the treatment, and then repeat it for
> > > the ones who get enhanced, and you'll end up with one person with an IQ 50
> > > points higher. And one ridiculously smart individual may make enough of a
> > > contribution to outweigh making 9,000 willing volunteers marginally dumber.
> >
> > I'd like to think I would volunteer, if it were the most cost-effective way
> > to help out. (I can't say whether it's probable that I would actually
> > volunteer, since (a) I'm not yet taking time to actually think it over since
> > it's hypothetical, and (b) the human brain has an uncanny knack for
> > rationalizing its way out of actually following through with making
> > sacrifices for strangers.)
>
>
> Why would you like to think that? On the face of it, it is a very poor
> decision. The likely outcome is that you lose 3.5 IQ points. Perhaps we
> get someone whose IQ is raised 50 points, but what good is that to you?
> There are hyper-IQ folks out there now who aren't working on anything you
> think is all that important. What makes you think that possibly getting one
> more such person is worth the real damage to yourself and most of the others
> who volunteer? Would you "like to think that" because it makes you a "good
> Joe", willing to take one for the group, or some such? Why?
>
> The rest of the "all too human" reasoning above says that you (and I mean
> all of us) need all our marbles intact. :-)
>
> - samantha
>
>

-- 
Robin Gane-McCalla
YIM: Robin_Ganemccalla
AIM: Robinganemccalla

