Re: Syllabus for Seed Developer Qualifications [WAS Re: Some considerations about AGI]

From: H C (lphege@hotmail.com)
Date: Sun Jan 29 2006 - 08:38:20 MST


"You may not be able to become a seed AI programmer. In fact, it is
extremely likely that you can't. This should not break your heart."

"Meanwhile, before the Singularity, consider becoming a regular donor
instead. It will be a lot less stressful, and right now we have many fewer
regular donors than people who have wandered up and expressed an interest in
being seed AI programmers."

"No one who might actually be hired will be scared off by the thought of an
extremely difficult job or harsh competition. If someone is brilliant enough
to have any realistic chance of becoming an AI programmer, and ethical
enough to be accepted, nothing we could possibly say would scare them away
from applying."

-hegem0n

>From: Paul Bettner <paulbettner@gmail.com>
>Reply-To: sl4@sl4.org
>To: sl4@sl4.org
>Subject: Re: Syllabus for Seed Developer Qualifications [WAS Re: Some
>considerations about AGI]
>Date: Sun, 29 Jan 2006 00:01:48 -0600
>
>Does that page (http://www.intelligence.org/action/seed-ai-programmer.html)
>strike anyone else as incredibly elitist? Not to mention highly
>unrealistic?
>
>
>I really couldn't say how the takeoff is going to happen, but I highly
>doubt it's going to come about as a result of some self-selected group
>of "humanitarian geniuses" who only let those they deem worthy into
>their club.
>
>Most likely it's going to happen at Google, CMU, or Ben's company.
>
>I'm not trying to start a fight here. I believe intelligence.org's heart is in
>the right place. I'm just really put off by the elitist, self-important
>attitude of this particular article, and I think it's worthwhile for me to
>express my disappointment and my opinion that the article should be
>removed.
>I feel that it does a very poor job of representing the institute and
>Singularity ideas in general.
>
>- paul
>
>On 1/26/06, Mikko Särelä <msarela@cc.hut.fi> wrote:
> >
> > On Wed, 25 Jan 2006, Mike Dougherty wrote:
> > > Game theory (if that's what it's called): to understand the
> > > difference between the "zero-sum" games of most traditional models
> > > and the "infinite resource" thinking (for lack of a better term for
> > > the opposite of zero-sum) that will result from a post-Singularity
> > > economy. (A short payoff-matrix sketch follows the quoted thread.)
> >
> > I'd also add the game-theoretic theory of morality. Possibly not
> > directly relevant to FAI or AGI, but definitely relevant to
> > understanding how human morality has come to be and what maintains
> > it. Might allow one to avoid a few dangerous pitfalls.
> >
> > --
> > Mikko Särelä http://thoughtsfromid.blogspot.com/
> > "Happiness is not a destination, but a way of travelling." Aristotle
> >
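
For concreteness, here is a minimal sketch (in Python) of the zero-sum
vs. positive-sum distinction Mike raises above. The payoff numbers are
assumptions chosen purely for illustration, not taken from any model in
the thread:

# Minimal illustration of zero-sum vs. positive-sum games.
# All payoff values are assumed for illustration only.

zero_sum = {  # outcomes map to (row player's payoff, column player's payoff)
    ("cooperate", "cooperate"): (0, 0),
    ("cooperate", "defect"): (-1, 1),
    ("defect", "cooperate"): (1, -1),
    ("defect", "defect"): (0, 0),
}

positive_sum = {  # mutual cooperation grows the total payoff
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "defect"): (0, 2),
    ("defect", "cooperate"): (2, 0),
    ("defect", "defect"): (1, 1),
}

def is_zero_sum(game):
    # A game is zero-sum iff the payoffs sum to zero in every outcome.
    return all(a + b == 0 for a, b in game.values())

print(is_zero_sum(zero_sum))      # True
print(is_zero_sum(positive_sum))  # False: the total "pie" can grow

The contrast is the whole point: in the positive-sum game, the total
payoff depends on what the players choose, which is the "infinite
resource" intuition in the quoted paragraph.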


