From: ben goertzel (email@example.com)
Date: Fri Jan 18 2002 - 13:53:22 MST
> Imagine if Eliezer (or some other critical SingInst
> person, should the SingInst be critical) decided to fall in love and run
> off with some girl? This could be a disaster. I just think it would take
> a different kind of shock level to *truly* serve the Singularity over
> anthropocentric desires.
Well, I consider myself highly devoted to the Singularity, yet I persist in
having an active and rewarding personal life as well.
I am not sure the world is best served by having AI researchers devote 100%
of their minds and lives to AI. A certain different perspective is obtained
by having an actual human life in addition to doing singularity-focused
work, and I believe that this different perspective may actually be valuable
for the process of bringing up and teaching the first "real AI."
> I don't think the anthropocentric attitudes of arrogance and hubris would
> be adopted by someone at a truly higher shock level. Higher levels of
> efficiency and productivity, on the other hand, most definitely would.
> This means eliminating things like sex and eating long meals, or even
> chatting with friends in person or going out to the "real world". Maybe
> they'll evolve beyond the need to have "fun" for mental health, because
> new interfaces will be so interesting and engaging that that will be all
> they need.
See, this scares me. I don't want the future of the universe determined by
people with no lives. I think that this leads to a very narrow emotional
condition, which is somehow not the best condition to be in when creating
the next race of AIs. I don't want the first real AI to have that
narrowness built into it.
I'm not afraid of the Singularity, just of this hypothesized subculture of
uberbeings who never want to have sex, eat or have fun. ;>
Of course, this is just *my* intuition which is to some extent colored by
anthropomorphism. But there you go...