From: H C (email@example.com)
Date: Sat Jan 21 2006 - 16:19:03 MST
>From: Ben Goertzel <firstname.lastname@example.org>
>Subject: Re: Why invest in AGI?
>Date: Sat, 21 Jan 2006 17:24:07 -0500
>My impression is that Gates has an attitude roughly similar to that of
>Kurzweil, which is roughly (according to my interpretation, which may
>be wrong): "AGI is definitely going to happen, and may well drive the
>Singularity, but it's most likely to happen as the natural result of a
>few more decades of conventional academic science, so there's no need
>to invest in maverick AGI approaches living out the mainstream."
>I find this attitude frustrating yet not extremely stupid.
>-- Ben G
"extreme" is a relative word here.
All things considered, I would say that it isn't a bad word for
characterizing the expected utility difference between the approach you just
stated, and a more direct approach to bringing about a Singularity.
Why this is true is a key question that needs to be answered for big
investors like these guys. (That is, "Why is the expected utility of direct
action *extremely* higher than the expected utility of natural
progression?")
The answer here is obvious to me: the faster it happens, the faster we can
help people (i.e., stop death, etc.), and the more control a
Friendly-driven researcher has over the Singularity, the more likely we are
to prevent existential disaster.
So their real holdup, as it appears to me, is that, since they can't
envision any way someone could develop and implement a full AGI theory,
they assume nobody else can either (pretty much the most common opposition
held). This proposition also has plenty of empirical backing for them,
given the dismal and public failure of Strong AI in the past. It also
explains their expectation of a "natural progression": the progression
would be quite artificial if we knew it in advance, wouldn't it?
My guess is that what would be enough to convince them is some massive
technical explanation of what the theory is, how it differs from the old
stuff, and why it necessarily gives rise to intelligence. In order to
overcome their presumptions, you would probably ALSO have to point out the
presumption they are making (as discussed above), quite as specifically as
we are here, and explain to them why that assumption is not well calibrated.
Perhaps after some convincing they will say "Holy crap, you people are WAY
smarter than me, and I believe you."
>On 1/21/06, H C <email@example.com> wrote:
> > I really wonder what Bill Gates is thinking. Apparently he is a fan of
> > Kurzweil and the idea of AGI in general. I wonder if Microsoft or Bill
> > are funding any AGI research right now.
> > -hegem0n
> > >From: Ben Goertzel <firstname.lastname@example.org>
> > >Reply-To: email@example.com
> > >To: firstname.lastname@example.org
> > >Subject: Re: Why invest in AGI?
> > >Date: Sat, 21 Jan 2006 10:08:37 -0500
> > >
> > > > The bottom-line is that we might be talking about an uncommon beast,
> > >this
> > younger than 45 year-old worth $13.3M. My bet would be that a large
> > > > percentage of them would be Hollywood actors, sports figures, and
> > > > entertainment types.
> > >
> > >Or -- more relevantly -- software types who cashed out during the
> > >dot-com boom ;-)
> > >
> > >ben g
This archive was generated by hypermail 2.1.5 : Mon May 20 2013 - 04:01:03 MDT