From: Lee Corbin (firstname.lastname@example.org)
Date: Sun Apr 20 2008 - 09:19:29 MDT
> The whole advantage of private property and prices is that it
> allows efficient transfer of information. When there is no
> need for information - when your economy reduces to
> "everyone needs enough water and food to survive" -
> then command economies work fine.
> Once we move beyond basic survival - then, private property
> comes into its own.
> What is interesting about a "super AI" is that it could run an
> efficient command economy. And it could run it in ways that
> minimise the negative philosophical consequences to people.
But this is true only, of course, because the super-AI is so
superior to people. I can run an aquarium that minimizes
the negative consequences to fish. The hard question is,
of course, what forms either evolve (as Matt would insist)
or are advisable (as others and I want to know) when
there is a *collection* of AIs. Specifically, the speed of
light guarantees separate entities/intelligences.
> Would we want that?
I have found that some people mind, and that some don't.
Me, I don't have any trouble living under the oversight
of an all-knowing AI, and I'm grateful for the runtime it
provides me. Frankly, it feels a bit juvenile for entities to
feel that they *must* be the wisest and best within so
many light years. (They can always send copies out to the
outer reaches, and satisfy their urges that way, but, I guess,
only if they understand that they *are* their copies.)
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:02 MDT