From: Thomas McCabe (email@example.com)
Date: Sun Nov 11 2007 - 19:18:55 MST
On Nov 11, 2007 8:49 PM, Peter de Blanc <firstname.lastname@example.org> wrote:
> On Sun, 2007-11-11 at 19:00 -0600, Damien Broderick wrote:
> > At 07:20 PM 11/11/2007 -0500, Peter de Blanc wrote:
> > >what if someone from 1700 CE asked you how fire works?
> > >Would you say "I know the answer, but I am not able to communicate it
> > >to you"?
> > what if someone from 17000 BCE asked you how fire works?
> > Would you say "I know the answer, but I am not able to communicate it
> > to you"?
> I wouldn't be able to communicate with the person from 17000 BCE because
> I would not know the language ve speaks. This is not analogous to the
> scenario with an AI, because the AI would probably have access to lots
> of information about the languages spoken by modern humans.
That's not the point. Even if you did speak their language, you
couldn't communicate what we mean by "combustion", because they
wouldn't have any of the prerequisite knowledge. In English, you might
define "combustion" as "the heat-producing chemical reaction of a fuel
with the oxygen in air". But they don't know what a chemical reaction
is. They don't know what oxygen is. They don't even know what "heat"
or "air" are. As late as two hundred years ago, people thought that
heat was some kind of all-pervasive, conserved "caloric fluid", and
that air was the all-pervasive medium in which things happened, much
as the Lorentzian manifold is in modern physics. The inferential
distance is simply too great. This is why we need CEV in the first
place. To quote:
"Let's ask Fred's volition a more complicated question: "What should
the code of a Friendly AI look like?" But there's a minor problem,
which is that Fred is a hunter-gatherer from the Islets of Langerhans,
where they have never even heard of Artificial Intelligence. There is
a tremendous distance between the real Fred, and a Fred who might
specify the code of a Friendly AI. Fred needs more than knowledge of
simple facts, and knowledge of many background disciplines not known
to the Islets of Langerhans. Fred needs to answer moral questions he
doesn't know how to ask, acquire new skills, ponder possible future
courses of the human species, and perhaps become smarter. If I started
with the Fred of this passing moment, and began a crash education
course which Fred absorbed successfully, it would be years before Fred
could write a Friendly AI. Different Everett branches of Fred would
not write identical code, and would probably write different AIs.
Would today's Fred recognize any of those future Freds, who had
learned so much about the universe, and rejected so many prevailing
beliefs of the Islets of Langerhans?
Yet today's Fred still has wishes that straightforwardly apply to the
creation of an Artificial Intelligence. Fred may not know what a
paperclip is, or what a solar system is, but nonetheless Fred would
not want anyone to create an Artificial Intelligence that tiled the
solar system with paperclips."
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:00 MDT