From: Eliezer S. Yudkowsky (firstname.lastname@example.org)
Date: Tue Apr 22 2003 - 10:19:54 MDT
Some excerpts from Congressman Brad Sherman's words in the House Committee
on Science hearing on "The Societal Implications of Nanotechnology",
Wednesday, April 9, 2003.
I want to respond to the distinguished chair of the space subcommittee
that long before his subcommittee authorized the program that took us into
space, the poets made us want to go there. It is good to have the
societal elements (or as he would abbreviate the term: nuts) talking to
the scientists at an early stage of this process rather than waiting until
the end. I commend the panel for focusing on the fact that one of the
things that nanotechnology may bring is new orders of intelligence,
whether that's through genetic engineering, perhaps at the nanotechnology
level, or non-organic nanotechnology or some combination.
First, I would point out that intelligence is the most explosive thing in
the universe. There are those who think fusion is the most explosive
thing, until you realize that intelligence is what gave us fusion.
There was less than a decade between when Einstein wrote to Roosevelt of
the possibility of a nuclear explosion and when we had to develop a
nuclear non-proliferation regime. Now we are engaged in a regime change
as part of that regime.
About a hundred thousand years ago we saw the last increase in
intelligence when Cro-Magnon greeted Neanderthal. Perhaps the first thing
a Neanderthal said upon looking at Cro-Magnon is, "Is that us?" And I
don't know. And we may be looking at new entities and wondering whether
the next intelligence is our progeny, our competitor, or a bit of both.
You've pointed out that we are going to see massive increases in the
spread of knowledge and technology, and I'm confident that humans will be
better at curing those things that can be cured by intelligence. If SARS
emerges twenty years from now, you science folks will give us a cure in
weeks instead of years. But there are problems caused by intelligence:
like the fact that we can bombard ... uranium atoms. And those problems
will probably also increase, since their cause, human intelligence, is
increasing.
Mr. Kurzweil, I believe that you have written that it's roughly thirty
years from now until we get a non-biological intelligence that surpasses
human intelligence, and you have suggested that that occurs by reverse
engineering the human brain. Since I'm out of time I'm going to ask each
panelist how many years they think it will take any of the branches of
nanotechnology to give us an intelligence that surpasses any known human
intelligence? Just shout out a number of years and make sure it's longer
than anyone will hold you to account for, because we will forget your
answer in less than a decade.
Winner: Actually, I hope never. One of the concerns about nanotechnology
and science and engineering at this scale is that it is plowing onward to
create a successor species to the human being. I think when word gets out
about this to the general public, they will be profoundly distressed. And
why should public money be spent, I would wonder, to produce an eventual
race of post-humans? Perhaps this needs wider public debate.
Sherman: That's pretty much how we spent the last five minutes.
Kurzweil: If I could just suggest that, since this has gone into a
discussion: we already have people walking around who have computers in
their brains, who have Parkinson's disease or hearing disabilities or a
dozen different neural implants. We have artificial augmentations and
replacements of almost every body system. So the ultimate implication of
these technologies will not be a successor species, but really an
enhancement of our human species. I would define the human species as
that species that inherently seeks to extend our own horizons. We didn't
stay on the ground. We didn't stay on the planet. We're not staying with
the limitations of our biology.
Ms. Peterson: Well, I'll say 25 to 30 years and express my surprise that
this question would come up here, and also say that these kinds of things
are labeled "science fiction". My work is often labeled "science
fiction". But I point out that if you look ahead 30 years, and what you
see sounds like science fiction, you might be wrong. But if it doesn't
sound like science fiction, then you're definitely wrong.
Sherman: Dr. Winner and Mr. Kurzweil have addressed from opposite
standpoints that interesting question: What is a human being? Mr.
Kurzweil puts forward the idea that wherever evolution takes us, if it
produces a self-aware and ambitious exploring entity, that is human;
and Dr. Winner..., well, I guess he wants to count the fingers and count
the toes. And I don't know if there is a way to address this in the
remaining 15 seconds.
Winner: One important question is who gets to do the counting at all.
By the last statistics, I understand there are about six billion-plus
humans on the planet, the overwhelming majority of whom are not involved
in these projects. They might be interested to find out that these plans
are in the works, and they might even want to have a say.
Sherman: I think they would, and it does go back to the question of
whether, if Mr. Kurzweil were a Neanderthal and met that Cro-Magnon, he
would be happy or unhappy. I'll let him respond if he wants to.
Kurzweil: I think that, rather than developing sentient, non-biological
systems (although that will happen), our primary destiny is to enhance
our own capabilities. I mean, I like having 10 fingers and 10 toes.
Incrementally, one step at a time, as we overcome various types of
physical afflictions and limitations of our human capability we'll be
enhancing our civilization. We've done that already. We're doing things
today that wouldn't be possible without the intimate merger with our
technology, and we're going to stay on that path.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:42 MDT