RE: Rationality and altered states of consciousness

From: Ben Goertzel (ben@goertzel.org)
Date: Fri Sep 20 2002 - 09:13:00 MDT


> I mean I think Ben's full of crap saying implicitly that working toward
> the singularity isn't helping humans concretely in the present world.
> CFAI is a herculean task like the Augean Stables: Ben would have some of
> us throw dung out of the stable with old reliable spoons, while the
> bigger thinker sees the solution (diverting most of a river with a giant
> hollow log or such so that it passes through the stables, according to
> the story).

Firstly, my own AI project (www.realai.net) is also a "herculean" one,
aimed at launching a good and rapid Singularity. At present, in practice,
it is a significantly larger effort than Eliezer's, though both are quite
large in their intended scope.

Secondly, I don't agree with your phrasing about "helping humans
concretely." It's true that Eliezer's work and my own work toward the
Singularity are not "helping humans concretely in the present world."
However, we have reason to believe this work will help humans in the
future, and also (roughly equally important to me) that it will help
create nonhuman sentient beings in the future.

-- Ben
