Re: [sl4] Re: Property rights

From: Lee Corbin (lcorbin@rawbw.com)
Date: Sat Jul 12 2008 - 21:13:28 MDT


Stuart writes

>> I agree with your generality, but not with your specifics. "Dysentery" is
>> not a libertarian idea, and as libertarian idealists point out, proper
>> medical prevention *could* be arranged by a free people who were
>> smart enough, well-educated enough, and farsighted enough.
>
> I have no doubt about that; I've looked into some of those solutions
> myself, and most of them look like complicated and risky methods to
> construct an inefficient liberty-based version of something that could
> be accomplished much easier by governmental fiat.

You speak as if arrogating to government the power to
"fiat this" and "fiat that" is without grave risk.

> I'm not saying it's a bad idea to do so, but it carries a large cost
> (that might be compensated by an increase in liberty, but that's
> a philosophical question).

Well, don't forget the enormous costs of an unchecked government.
Currently some enormous fraction of United States GNP is
absorbed by the government, with little result to point to except
interference in free market wealth creation, discouragement of
individual incentive, and various "earmarks" of corruption. What
day is it in June now that Americans can say that they're finally
working for themselves and not the government? Or has it passed
into July? In any case, the U.S. is emblematic of a problem that
in a number of European countries is already much further "advanced".

> To put it in economic terms, these are negative externalities (it is to the
> advantage of every parent to have every child immunized, except their
> own). To solve these you need most people to be not only smart but
> altruistic, and to have some mechanism for them to organise their
> altruistic impulses. If some people are not altruistic enough, and if
> the externalities are sufficiently large, you need compulsion (maybe a
> soft, financial version of compulsion, but compulsion nonetheless).

Yes, a certain, though quite limited, number of examples like
this are, to my way of thinking, correct. But to return to a
relevant angle...
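Stuart's free-rider point can be made concrete with a toy payoff model
(all numbers here are hypothetical, chosen purely for illustration): each
parent weighs a fixed private cost of vaccinating against an infection
risk that shrinks as overall coverage rises, so at high coverage the
individually rational choice flips to skipping vaccination even though
universal coverage is collectively best.

```python
# Toy sketch of the vaccination free-rider externality.
# All numbers are illustrative assumptions, not real epidemiology.

VACCINE_COST = 1.0   # private cost/risk of vaccinating one's own child
DISEASE_COST = 50.0  # cost of the child catching the disease
BASE_RISK = 0.20     # infection probability at zero coverage

def infection_risk(coverage):
    """Unvaccinated child's infection risk, falling linearly with coverage."""
    return BASE_RISK * (1.0 - coverage)

def expected_cost(vaccinate, coverage):
    """Expected cost to one parent, given everyone else's coverage level."""
    if vaccinate:
        return VACCINE_COST
    return DISEASE_COST * infection_risk(coverage)

for coverage in (0.0, 0.5, 0.9, 0.95):
    cv = expected_cost(True, coverage)
    cn = expected_cost(False, coverage)
    best = "vaccinate" if cv < cn else "free-ride"
    print(f"coverage {coverage:4.0%}: vaccinate={cv:5.2f}  skip={cn:5.2f}  -> {best}")
```

With these numbers the individually rational choice flips at 90%
coverage, which is exactly the situation where "soft, financial"
compulsion (or genuine altruism) is needed to hold coverage up.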

Suppose (i) the entire solar system were to fall under the
sway of a single multifaceted intelligence vastly, vastly more
capable than its human creators, (ii) somehow humans as
presently constituted (or equivalent uploads) are allowed
control of some small fraction of solar resources, (iii) a certain
reader permits this discussion to proceed on these four
assumptions without interjecting that we should all drop
these ideas and get to work right now on concrete steps
for improving our lot, and (iv) the resources allocated to
each of some billions of humans are compute intensive
enough to allow for quite possibly very novel developments.

THEN, if we have any say in the matter right now (which
we probably won't have), would it be better or worse for
that Intelligence to allow complete freedom of resource usage
(up to, but realistically not including, threats to itself)?
For example, ought that Intelligence permit one to create
veridical historical reenactments?

Lee



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:03 MDT