The cult of infinity

From: J. Andrew Rogers (andrew@ceruleansystems.com)
Date: Mon Aug 23 2004 - 17:11:59 MDT


<meta>
I've been flying around the country and therefore mostly
offline for the last few weeks. I'll be leaving again in a
few days, so if I'm slow to respond to things, that is why.
</meta>

As has probably been fairly apparent, I discarded the
notion of infinities having any practical real existence
years ago for a variety of reasons. Therefore, I suppose
I could be called an "infinite set atheist" even though I
wouldn't generally describe it that way. Infinities are
arguably interesting and useful in mathematics, but do not
have a practical application in real systems other than as
crude approximations. Since I concern myself solely with
systems that exist in our universe, I do not have much use
for the notion. This needs to be explained though, because
the above is different from asserting that the universe is
finite state.

The kinds of systems I assume have two basic properties:

1.) They are countable in all dimensions (i.e. discrete).
For our universe, the Planck dimensions work just fine as a
physical analogue of this.

2.) In finite time, the system can only express finite
algorithms. The speed of light serves this function given
#1. (For all I know, this may follow from #1; I have not
thought about it very hard.)

The implied asterisk above is that countable does not
mandate finiteness, and a system that is bound to finite
state in finite time is a perfectly adequate description of
real systems. There is no point in overly restricting the
model when it is not necessary, and it does not affect the
mathematical treatment of a system. So in a sense, I don't
have a problem with infinities as long as they are never
expressible as a practical matter. In other words, such
systems can be treated as finite state systems for many
purposes even if they technically are not finite under the
usual mathematical definitions. You can only have infinite
algorithms given infinite time (and assuming infinite space).
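
To make this concrete, here is a rough back-of-the-envelope
sketch (toy numbers only, not a real physics calculation) of
why a discrete lattice with a finite signal speed can only
express a finite amount of state in a finite time window,
even if the lattice itself is unbounded:

import math

PLANCK_LENGTH = 1.616e-35   # meters (approximate)
SPEED_OF_LIGHT = 2.998e8    # meters per second (approximate)

def expressible_bits(seconds):
    # Crude upper bound, in bits, on the state reachable within
    # `seconds`: one bit per Planck-scale cell inside the light cone.
    radius = SPEED_OF_LIGHT * seconds
    cells = (4.0 / 3.0) * math.pi * (radius / PLANCK_LENGTH) ** 3
    return cells    # finite for any finite time

for t in (1.0, 3.2e7, 4.3e17):   # ~one second, ~one year, ~age of universe
    print("t = %.1e s -> at most ~%.1e bits, a finite number of states"
          % (t, expressible_bits(t)))

The numbers come out astronomically large, but they are
finite, which is the whole point.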

A fun theoretical point is that the limit of K-complexity
for a given computer varies as a function of time, among
other things. This is generally the more accurate model
even for normal silicon, though in practice the limit is
approximated to a static value. I cannot recall ever seeing
it treated as a variable in theory. Pervasively treating a
variable quantity as static is not unusual in theory -- a
similar situation exists in Black-Scholes -- and is fine as
long as one is aware of the approximation so that it can be
discarded when necessary.
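
A minimal sketch of what I mean, with made-up numbers (the
addressable-memory rate and the static cap below are
illustrative assumptions, not measurements):

def k_limit_bits(t_seconds, addressable_bytes_per_second=1e9):
    # Time-varying bound: the longest description the machine can
    # hold is limited by the memory reachable after t seconds.
    return 8.0 * addressable_bytes_per_second * t_seconds

STATIC_CAP_BITS = 8e12   # the usual "fixed machine size" approximation

for t in (1, 60, 3600, 86400):
    varying = k_limit_bits(t)
    note = ("static approximation holds" if varying < STATIC_CAP_BITS
            else "static approximation has broken down")
    print("t = %6d s: varying bound %.1e bits vs static %.1e bits (%s)"
          % (t, varying, STATIC_CAP_BITS, note))

The static value is fine for most purposes, as with the
Black-Scholes example, until the time scale gets long enough
that it is not.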

This is closely related to the memory latency problem of
hardware. No matter how much RAM you theoretically have,
the size of an algorithm is constrained by the amount of
space you can reference in a finite period of time.
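
A quick sketch of that constraint with illustrative numbers
(the latency and reference size are assumptions for the sake
of the example):

def reachable_bytes(time_budget_s, latency_s=100e-9, line_bytes=64):
    # Upper bound on distinct memory an algorithm can touch in
    # the budget: one cache-line-sized reference per latency.
    return int(time_budget_s / latency_s) * line_bytes

installed_ram = 1 << 40    # a nominal 1 TiB of RAM
budget = 1.0               # one second of work
usable = min(installed_ram, reachable_bytes(budget))
print("Installed: %d bytes; referenceable in %.0f s: %d bytes"
      % (installed_ram, budget, usable))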

Not too much meat here -- I'm on a time restriction -- but
there you have it. I've taken to calling systems that
meet this description "algorithmically finite", as I
consider them a useful and distinct class of system, but
as far as I've been able to tell there is no term in
mathematics for this particular type of tidy theoretical
system.

j. andrew rogers
