Re: SIAI & Kurzweil's Singularity

From: Michael Vassar (michaelvassar@hotmail.com)
Date: Fri Dec 16 2005 - 05:35:04 MST


I would assert that, to date, more intelligent and otherwise more capable
instances of human beings *are* somewhat more trustworthy than other
humans, at least in proportion to how much power they hold, but the
relationship is only moderately strong and may be specific to Western
culture.

Unfortunately, this doesn't tell us much about radically augmented humans in
any event. The differences among humans are too small to extrapolate to the
difference between humans and radically augmented humans. Also, power
selects for ambition and recklessness almost as much as for intelligence,
both today and in a regime of human recursive self-improvement.

My guess is that a human who understood existential risks well prior to
recursive self-improvement, and who had a substantial head start on other
humans, could manage a slow take-off safely, but I would not want to risk
it. The chance of survival in a scenario of economic or other competition
between agents capable of recursive self-improvement, though, seems close to
zero. The only way I can imagine it working out is if the agents are spread
over a large region of space and protected by light-speed lags.


