Re: Syllabus for Seed Developer Qualifications [WAS Re: Some considerations about AGI]

From: Michael Roy Ames (michaelroyames@yahoo.com)
Date: Sun Jan 29 2006 - 00:23:21 MST


paul,

I would have to agree with you somewhat. The article is definitely not
politically correct. I see it as a somewhat tongue-in-cheek overstatement
meant to discourage applicants who are not willing to put in the hard,
tedious work of learning the needed knowledge and gaining the mental
discipline to contribute as a seed AI programmer - not to mention
discouraging less than highly talented individuals. The seed AI programmer
described &/or implied in seed-ai-programmer.html is an imaginary being,
unlikely to be found in the world today, but perhaps not with a probability
of zero. And there lies the rub. Should we alter the entry requirements to
be less stringent, more readily attainable by mere mortals? Should we alter
the article to appear less offensive, more conventional while retaining its
filtering effect?

There is no doubt that SIAI needs more talented and dedicated researchers.
It may be that the best way to attract such individuals, even individuals of
such high standards, is something other than the current formulation of
seed-ai-programmer.html. What would you suggest?

Michael Roy Ames

----- Original Message -----
From: "Paul Bettner" <paulbettner@gmail.com>
To: <sl4@sl4.org>
Sent: Saturday, January 28, 2006 22:01
Subject: Re: Syllabus for Seed Developer Qualifications [WAS Re: Some
considerations about AGI]

Does that page (http://www.intelligence.org/action/seed-ai-programmer.html)
strike anyone else as incredibly elitist? Not to mention highly unrealistic?

I really couldn't say how the takeoff is going to happen, but I highly doubt
it's going to come about as a result of some self-selected group of
"humanitarian geniuses" who only let those they deem worthy into their club.

Most likely it's going to happen at Google, or CMU, or Ben's company.

I'm not trying to start a fight here. I believe intelligence.org's heart is in
the right place. I'm just really put off by the elitist, self-important
attitude of this particular article, and I think it's worthwhile for me to
express my disappointment and my opinion that the article should be removed.
I feel that it does a very poor job of representing the institute and
singularity ideas in general.

- paul

On 1/26/06, Mikko Särelä <msarela@cc.hut.fi> wrote:
>
> On Wed, 25 Jan 2006, Mike Dougherty wrote:
> > Game theory (if that's what it's called) - to understand the difference
> > between the "zero-sum" game of most traditional models, and "infinite
> > resource" (for lack of a better term opposite zero-sum) thinking that
> > will result from a post-Singularity economy.
>
> I'd also add the game theoretic theory of morality. Possibly not
> directly relevant to FAI or AGI, but definitely relevant in understanding
> how human morality has come to be and what are the things that maintain
> it. Might allow one to avoid a few dangerous pitfalls.
>
> --
> Mikko Särelä http://thoughtsfromid.blogspot.com/
> "Happiness is not a destination, but a way of travelling." Aristotle
>



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:55 MDT