Re: Investing in FAI research: now vs. later

From: Robin Lee Powell (rlpowell@digitalkingdom.org)
Date: Wed Feb 20 2008 - 14:26:43 MST
On Wed, Feb 20, 2008 at 12:57:32PM -0800, Matt Mahoney wrote:
> --- "Peter C. McCluskey" <pcm@rahul.net> wrote:
> > There appears to be a serious lack of communication between
> > people who think we're doomed without FAI and the people who
> > expect a diverse society of AIs. It appears that the leading
> > advocates of one outcome can't imagine how anyone could believe
> > the other outcome is possible. This appears to be a symptom of
> > a serious failure of rationality somewhere. I wish I could lock
> > the leaders of each side of this schism into a room and not let
> > them out until they either reached agreement or came up with a
> > clear explanation of why they disagreed. Presumably part of the
> > disagreement is over the speed at which AI will take off, but
> > that can't explain the certainty with which each side appears
> > to dismiss the other.
>
> I think that both sides can agree that a singularity will result
> in the extinction of humans in their present form and their
> replacement with higher level intelligence. Where they disagree
> is whether this is good or bad. A rational approach does not
> answer the question.

I disagree.

What I'm worried about is humans being replaced with paperclips or
little animated smiley faces. While you could argue that something
that succeeds at doing so is "smarter" than us, that's a really
narrow view of intelligence, IMO.

-Robin

-- 
Lojban Reason #17: http://en.wikipedia.org/wiki/Buffalo_buffalo
Proud Supporter of the Singularity Institute - http://intelligence.org/
http://www.digitalkingdom.org/~rlpowell/ *** http://www.lojban.org/


This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:02 MDT