Re: Self vs. other (was Re: Balance of power)

From: William Pearson (wil.pearson@gmail.com)
Date: Fri May 02 2008 - 01:38:00 MDT


2008/5/1 Matt Mahoney <matmahoney@yahoo.com>:
> --- William Pearson <wil.pearson@gmail.com> wrote:
>
> > Nothing has free choice; however, not all information channels have
> > an equal effect on the state of the systems involved. I think it
> > possible to create computer systems that have a symbiotic
> > relationship, but with much less ability to change a human than the
> > human has to change them.
>
> If two symbiotic agents with unequally sized saturated memories
> communicate, then both agents must change state at the same rate, as
> measured by conditional algorithmic complexity.

That holds if and only if they send each other no redundant or useless
information, which is unlikely even between subsystems of the same
agent.
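
Purely as an illustration, since K() is uncomputable anyway: if you
stand in compressed length for algorithmic complexity, a redundant
message barely counts as a state change at all. A throwaway Python
sketch, with made-up strings:

import zlib

def c(s: bytes) -> int:
    # compressed length as a crude upper bound on K(s)
    return len(zlib.compress(s, 9))

def cond(x: bytes, y: bytes) -> int:
    # rough proxy for K(x|y): the extra bits needed for x once y is known
    return max(c(y + x) - c(y), 0)

b_state   = b"the quick brown fox jumps over the lazy dog " * 20
novel     = b"a genuinely new observation about something else entirely"
redundant = b"the quick brown fox jumps over the lazy dog "

print(cond(novel, b_state))      # comparatively large: a real state change
print(cond(redundant, b_state))  # close to zero: hardly any change at all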

> However if by "change"
> you mean the percentage of information affected, then the smaller
> agent will change faster relative to its size.
>
> Suppose that A and B are saturated agents and algorithmic complexity
> K(A) > K(B). Consider a message x from A to B, where we don't count
> any bits ignored by B.

Ignoring, forgetting and so on are vital for intelligence to work in
the real world. Otherwise you have stasis.

> Then K(B(t2)|B(t1)) = K(x|B(t1)), where t1 is
> the time before the message and t2 is after. A can forget x (since A
> can always get it back from B), so K(A(t1)|A(t2)) = K(x|A(t2)).
>
> By symbiotic, I mean that communication tends to minimize K(A) + K(B),
> an ideal division of labor. If K(x|B(t1)) < K(x|A(t2)) (B can remember
> x more easily than A could have), then B should keep x, or else send it
> back.

Why remember anything? Why is it important?
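
To be fair, the rule itself is easy to state mechanically. Here is a
sketch of the "who should keep x" decision, reusing compressed length
as a stand-in for K; the function names and data are mine and purely
illustrative:

import zlib

def c(s: bytes) -> int:
    return len(zlib.compress(s, 9))

def cond(x: bytes, y: bytes) -> int:
    # crude upper bound on K(x|y)
    return max(c(y + x) - c(y), 0)

def b_should_keep(x: bytes, a_after: bytes, b_before: bytes) -> bool:
    # Matt's criterion: B keeps x iff K(x|B(t1)) < K(x|A(t2)), i.e. B can
    # remember x more cheaply than A could reconstruct it; otherwise B
    # sends it back.
    return cond(x, b_before) < cond(x, a_after)

# B already holds nearly identical material, so it is likely the cheaper
# home for x (subject to compressor noise on strings this short).
x = b"log entry 4017: pressure nominal"
print(b_should_keep(x, a_after=b"some unrelated notes on gardening",
                    b_before=b"log entry 4016: pressure nominal"))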

> If A and B are both saturated and symbiotic at equilibrium, then
> information is conserved. Both agents must change state at the same
> rate. There must be equal amounts of information flowing in each
> direction because neither agent can hold any more.

Or some of it is discarded. I was considering an open system where one
part of the system was connected to the internet.
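
A toy picture of the difference (mine, not Matt's argument): a store
that is already full can only take new bits by pushing old ones out. In
the closed A<->B case the displaced bits have to go to the other agent;
in the open system I have in mind they can simply be dropped.

from collections import deque

class SaturatedStore:
    # fixed capacity and already full, so every accepted bit displaces one
    def __init__(self, capacity: int):
        self.bits = deque([0] * capacity, maxlen=capacity)
        self.evicted = 0

    def receive(self, incoming):
        for b in incoming:
            self.evicted += 1   # the displaced bit: forward it or discard it
            self.bits.append(b)

small = SaturatedStore(100)
small.receive([1] * 500)
print(small.evicted)  # 500 bits displaced; only the most recent 100 are kept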

> This relation holds even if the agents perform lossy compression. If
> one agent transmitted information faster than the other, then the other
> receiver must discard some of it to make the rates equal. To
> generalize the above argument, let x be the part of the message that is
> not ignored by the receiver. The sender, of course, must continue to
> remember the ignored part.
>
> Now suppose that A is a human and B is a calculator. This is a
> reasonable division of labor because the calculator can do arithmetic
> faster and more accurately than I can. But I can completely change the
> state of the calculator's registers much faster than it can change me
> into a different person. This is also the case with all nonhuman
> agents in existence today. But if B was a superhuman AI, then the
> situation could be reversed.

No, because we are not exactly like calculators and other nonhuman
agents. We can resist state change, as long as things don't also tweak
our motivational systems (pain, pleasure). Just look at how slow
academic fields are to change.

> The agents need not be symbiotic. By a simpler argument, the maximum
> change K(A(t2)|A(t1)) is min(K(A), K(B)), i.e. transfer all of the
> information from B to A until A is full. Then B can change A into a
> different person only if K(B) >= K(A).

There is no such thing, really, as a "different person". There is no
sharp distinction between self and environment. The separate agenthood
I grant to computers is simply an illusion that lets me work with them
more easily.

Do you consider the visual cortex to be trying to change you into a
"different person"? If not, why not? Why is it not separate?

 Will Pearson



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:02 MDT