From: Stuart Armstrong (firstname.lastname@example.org)
Date: Fri Sep 26 2008 - 03:04:43 MDT
> Bostrom does not seem to offer any good alternatives. In any case, he implicitly assumes that certain forms of intelligence, what he calls eudaemonic (with human-like motivations and "conscious"), are preferable to other types. That is understandable, but it is not our choice to have that preference. It is an evolved trait of humans.
Well yes. We have to base our preferences on something. Even your own
"let evolution just happen, it will decide what is good" is an evolved
and cultural artifact, as are my objections to that idea.
The fact that our preferences are of arbitrary origin doesn't mean
that they are wrong.
> Could someone remind me again, what are we trying to achieve with a singularity?
The survival of some version of humanity. Beyond that, the usual
eternal meaningful happiness and immortality stuff. Beyond that, we all
This archive was generated by hypermail 2.1.5 : Wed May 22 2013 - 04:01:37 MDT