Re: [sl4] Re: Signaling after a singularity

From: Stuart Armstrong (dragondreaming@googlemail.com)
Date: Wed Jun 25 2008 - 03:12:34 MDT


>> The aspect of a post-singularity world I'd like to look at is the
>> absence of signalling. If we had an advanced AI, it should be able to
>> fully understand the personality and abilities of an individual
>> human. If it agreed to reveal this information to humans, then
>> we would live in a society with perfect understanding of each other.
>
> I suppose you're taking the AI-only-approach (Anissimovic) singularity.
> I don't know what an understanding of personality would entail. The
> idea of personality is pop psychology anyway, so saying something like
> this makes me wonder if you know what the AI would be knowing in the
> first place.

Personality may be pop psychology, but it's not a concept devoid of
information. It's useful in assigning certain people to certain jobs,
for instance. Assuming the AI could not simply brute-force the problem
and predict everyone's actions in every circumstance (chaos would
probably forbid this), then it would have to rely on some
simplified model that gives it enough information to make decisions.
"Personality" is one of our simplified models ("abilities" is
another); the AI's simplified models would be much better, but we can
still use "personality" as a shorthand for them.

> You think that showing off is why people do things?

I think it is a major reason a lot of people do things.

> As for the economy, just ignore it. As for the culture, I don't
> see what you mean. Would the information be deleted for some reason?

The old culture would still be there; it's whether there would be new,
ongoing cultures that I'm wondering about.

> Why would dictators be the best way to govern?

I'm not saying they would; I'm just saying that all the assumptions
about which system of government is best may have to be re-examined,
and what we take for granted now (that democracy is best) may not hold
after a singularity.

As for why govern at all: there will still be a finite (though huge)
amount of resources available to anyone, and there will still be the
problem of violence and coercion between agents. Some system of
government would still be needed to adjudicate conflicts. And that
system must have access to a higher level of violence than any
individual agent, if there is any chance that the agent could misbehave.

It might be a collaborative, communal hippy government, or it might be
an AI dictatorship, but it would still be a government.

Of course, if people are no longer people, then it might be possible
to avoid the issue of violence altogether. There would still need to
be a system of adjudication, though.

> You asserted above that this form of talk is a waste of time, but here
> you're wondering whether or not we want/prefer it ? Please make up your
> mind. :-)

Translation: speculation is a waste of time, but it's very fun! :-)

Stuart


