From: Stuart Armstrong (firstname.lastname@example.org)
Date: Mon Jul 07 2008 - 09:30:26 MDT
>> Yes, very good. Exactly. However, I wonder if those who won't change, and
>> so will still want to covet their neighbor's
>> status, unique possessions, or resources will gain the upper
>> hand evolutionarily.
>> Another form of saying the same thing might be that the urge
>> to dominate is probably an ESS. I don't like this, but I don't
>> have any idea of what to do about it.
> The same sense of fairness that leads you to remove your desire to
> dominate others may also lead you to bolster your resolve not to be
> dominated yourself and to help others who are under threat of domination
> by a third party. Thus if everyone has the ability to self-modify it
> will be both easier to resist and harder to indulge your wicked urges,
> making for a salubrious equilibrium where everyone is nice to everyone
> else, and everyone is happy being nice to everyone else.
Sounds like an individual solution to a collective action problem -
i.e. don't count on it. There are far too many ways it can go wrong,
especially if we add evolution to the mix.
Lee mentioned governments a few posts ago. The only justification of
governments is to solve collective action problems (violence being the
king of these problems so far in human history; out of control
evolutionary arms races might be the king issue after a singularity).
I feel we underestimate the importance of this, because we live in a
world saturated by governments (certainly in terms of GDP), so most of
the collective action problems that can be solved have been solved.
And a solved problem does not attract attention.
So purely individual solutions to this sort of problem strike me as
very unlikely. It seems that people's descriptions of a post-singularity
world always boost the aspects that make individuals more
autonomous (total information, extreme mobility, etc.) while
minimising those that would make them more dependent (advances in
offensive weaponry, viral invasions and the need to defend against
them, finiteness of resources, fast evolution filling the world with
competing copies, hacks to make other minds more malleable, etc.)
So, despite the arguments here, a society of humans able to modify
their minds at will is probably not going to gravitate to something
pleasant. On the other hand, it might not take much in terms of
coercive interventions to allow such things to happen. In fact, it may
be enough for the AI to gift humans with many technologies, push them
in certain self-modifying directions, and then turn itself off, to
create a positive dynamic equilibrium.
This archive was generated by hypermail 2.1.5 : Sat May 18 2013 - 04:01:10 MDT