From: Brian Atkins (firstname.lastname@example.org)
Date: Thu Apr 05 2001 - 00:27:38 MDT
James Higgins wrote:
> If 1 is the case, I honestly think we're screwed long-term anyway. A sysop
> environment will just breed contempt, jealousy, etc in the SIs within its
> domain. Eventually you end up with either a breakout of the sysop
> environment or a mostly unhappy population.
> How can we say that building the SI is better than any other perceived
> course of action? I can actually imagine life on the other side of
> creating an SI being rather bleak. Imagine a reality where we know
> everything (to a reasonable approximation) and can do anything
> (ditto). This would get boring very quickly. Since the major driving
> force behind human nature seems to be continual learning, what do you do
> when you have learned everything? I think this reality would be more like
> my personal hell than anything else.
Ok James, so what exactly is your ideal future scenario? If you insist on
having some mystery left in the Universe I'm sure the Sysop will oblige
you by modifying your memories and dumping you permanently into a simulated
reality of medieval Europe. If you request such a thing, of course.
Personally I don't see any kind of downside to a Sysop scenario. You've got
access to the most knowledge and technology of any point in history, AND
you have the highest achievable (in the physical Universe) freedom while
maintaining each individual's requested level of safety/property rights.
To have more freedom than that would require an anarchic scenario, which
post-Singularity technologies would render highly unstable.
--
Brian Atkins
Director, Singularity Institute for Artificial Intelligence
http://www.intelligence.org/
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:36 MDT