Re: Syllabus for Seed Developer Qualifications [WAS Re: Some considerations about AGI]

From: Paul Bettner (paulbettner@gmail.com)
Date: Sat Jan 28 2006 - 23:01:48 MST


Does that page (http://www.intelligence.org/action/seed-ai-programmer.html)
strike anyone else as incredibly elitist? Not to mention highly unrealistic?

I really couldn't say how the takeoff is going to happen, but I highly doubt
it's going to come about as a result of some self-selected group of
"humanitarian geniuses" who only let those they deem worthy into their club.

Most likely it's going to happen at Google, or CMU, or Ben's company.

I'm not trying to start a fight here. I believe intelligence.org's heart is in
the right place. I'm just really put off by the elitist, self-important
attitude of this particular article, and I think it's worth expressing my
disappointment and my opinion that the article should be removed. I feel that
it does a very poor job of representing the institute and singularity ideas in
general.

- paul

On 1/26/06, Mikko Särelä <msarela@cc.hut.fi> wrote:
>
> On Wed, 25 Jan 2006, Mike Dougherty wrote:
> > Game theory (if that's what it's called) - to understand the difference
> > between the "zero-sum" games of most traditional models and the
> > "infinite-resource" (for lack of a better term for the opposite of
> > zero-sum) thinking that will result from a post-Singularity economy.
>
> I'd also add the game-theoretic theory of morality. It is possibly not
> directly relevant to FAI or AGI, but it is definitely relevant to
> understanding how human morality came to be and what maintains it. It
> might allow one to avoid a few dangerous pitfalls. (Both ideas are
> sketched in code below the quoted text.)
>
> --
> Mikko Särelä http://thoughtsfromid.blogspot.com/
> "Happiness is not a destination, but a way of travelling." Aristotle
>
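
A minimal sketch, in Python, of the zero-sum vs. positive-sum distinction
Mike raises above. The two games and their payoff values are hypothetical,
chosen only for illustration:

    # Each game maps (row_action, col_action) -> (row_payoff, col_payoff).

    # Matching pennies: strictly zero-sum; one player's gain is exactly
    # the other's loss in every outcome, so the "pie" is fixed.
    matching_pennies = {
        ("heads", "heads"): (1, -1),
        ("heads", "tails"): (-1, 1),
        ("tails", "heads"): (-1, 1),
        ("tails", "tails"): (1, -1),
    }

    # A simple trade game: positive-sum; mutual cooperation enlarges the
    # total payoff rather than merely redistributing it.
    trade = {
        ("trade", "trade"): (3, 3),
        ("trade", "hoard"): (0, 2),
        ("hoard", "trade"): (2, 0),
        ("hoard", "hoard"): (1, 1),
    }

    def is_zero_sum(game):
        """True if every outcome's payoffs sum to zero."""
        return all(a + b == 0 for a, b in game.values())

    print(is_zero_sum(matching_pennies))  # True: the pie is fixed
    print(is_zero_sum(trade))             # False: cooperation grows the pie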

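A companion sketch of the game-theoretic account of morality Mikko mentions:
in an iterated prisoner's dilemma, a reciprocating strategy (tit-for-tat)
sustains mutual cooperation, which is one standard story for how cooperative
norms are maintained. The payoff values and strategies here are the
conventional textbook ones, not anything from this thread:

    PAYOFFS = {  # (my_move, their_move) -> my payoff
        ("C", "C"): 3, ("C", "D"): 0,
        ("D", "C"): 5, ("D", "D"): 1,
    }

    def tit_for_tat(opponent_history):
        # Cooperate first, then copy the opponent's previous move.
        return opponent_history[-1] if opponent_history else "C"

    def always_defect(opponent_history):
        return "D"

    def play(strategy_a, strategy_b, rounds=10):
        history_a, history_b = [], []  # each side's own past moves
        score_a = score_b = 0
        for _ in range(rounds):
            # Each strategy sees only the opponent's past moves.
            move_a = strategy_a(history_b)
            move_b = strategy_b(history_a)
            score_a += PAYOFFS[(move_a, move_b)]
            score_b += PAYOFFS[(move_b, move_a)]
            history_a.append(move_a)
            history_b.append(move_b)
        return score_a, score_b

    print(play(tit_for_tat, tit_for_tat))    # (30, 30): cooperation holds
    print(play(tit_for_tat, always_defect))  # (9, 14): both do worse than
                                             # under mutual cooperation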

