Re: Dangers of human self-modification

From: Samantha Atkins (samantha@objectent.com)
Date: Sat May 29 2004 - 18:42:48 MDT


On May 26, 2004, at 7:39 AM, Ben Goertzel wrote:
> Philip, I understand your point but I don't agree with it. I'm much
> more pessimistic about human nature.
>
> Two things could come from the genetic engineering of wise humans:
>
> 1) a vastly greater incidence of individuals as wise as the
> maximally-wise humans on Earth today
>

Wisdom today seems to be a product of training, development, and intention
more than of genetics per se. So why believe we will get more of it
through genetic engineering in particular?

> 2) the creation of humans who are 2 or 10 times as wise (yeah I know,
> we
> have measurement problems here) as any human alive today
>

Same objection. I don't believe wisdom is significantly amenable to
genetic engineering.

> Either one of these would make a big difference, eh?
>
> I just have very little faith in the ability of sociocultural change to
> elevate the wisdom level of the race.

I don't see that wise people today are produced particularly by
sociocultural conditions. Wisdom seems more individual and exceptional.

>
> In fact, I have little faith that genetic engineering will be used to
> create ultrawise beings either -- I reckon it will be used first to
> create superintelligent killing machines. And my bet is still that
> superhuman AGI will come before any of these things, due to the slower
> pace of experimentation in human genetic engineering enforced by
> society's ethical concerns about experimentation on humans.
>
>

Well, many believe that AGI is likely to be dedicated to
"superintelligent killing machines" too.

- samantha
