analog computing

From: my_sunshine (sun@faclib-0119.unh.edu)
Date: Sat May 12 2001 - 13:12:02 MDT


Some clarifications on the analog/AI issue... this is how I see it.

(1) Digital systems are not limited to the von Neumann model. Most digital
systems today are based on the step-by-step, limited-precision, array-memory,
overgrown-calculator model of computation proposed by John von Neumann in his
First Draft of a Report on the EDVAC. This approach to digital computing is
neither necessary nor, except when such finite constraints are practically
acceptable, desirable. Symbolic math programs, such as Maple or Mathematica,
use digital logic to perform general symbolic computation and, in fact, it can
be shown that any formal system can be represented in discrete form (strings
of symbols, a finite number of transformations, etc.). If you read the First
Draft, von Neumann actually starts off drawing parallels between computational
units (I think he called them "atoms", not sure) and *neurons*. Things like
the adoption of the binary representation of numbers were choices he made only
for the sake of design simplicity. IMHO, it is unfortunate that generations of
computer programmers have, for the most part, blindly followed the provisions
of one of the earliest models of computing. I think that, as this discussion
indicates, the time for a break from the von Neumann model is coming. I also
think that AI will be the main technology to do this. To address an objection
which was raised earlier, the modeling of analog systems (whether the noise is
modeled or not) *does* require a lot of processing *using today's digital
technology*. Change the computing model (e.g., to some kind of cellular
automaton), and full digital modeling of analog systems suddenly becomes
possible -- in real time, even, if you like.
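
To make that concrete, here is a minimal Python sketch of my own (not anything
from von Neumann's report): exact rational arithmetic shows digital logic
escaping fixed-precision constraints, and a toy one-dimensional cellular
automaton (rule 90, with made-up sizes) shows a digital model of computation
that looks nothing like a fetch-decode-execute loop.

    from fractions import Fraction

    # Exact rational arithmetic: digital logic is not stuck with the
    # fixed-width, limited-precision numbers of the usual hardware model.
    ninth = Fraction(1, 9)
    print(sum(ninth for _ in range(9)))    # 1 -- exact, no rounding error
    print(sum(0.1 for _ in range(10)))     # 0.9999... in binary floats

    # A toy elementary cellular automaton (rule 90): local, massively
    # parallel update rules instead of a central fetch-execute loop.
    def step(cells):
        n = len(cells)
        return [cells[(i - 1) % n] ^ cells[(i + 1) % n] for i in range(n)]

    row = [0] * 31
    row[15] = 1
    for _ in range(8):
        print("".join("#" if c else "." for c in row))
        row = step(row)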

(2) There is no fundamental distinction between the "analog" and the "digital".
Analog and digital are fundamentally just two different formal mappings
between physical representations of information and their semantics. In other
words, the difference is in what the reading on the voltmeter is interpreted
to mean. It may be broken up into two ranges for the representation of one bit,
divided into four distinct voltage ranges for the representation of two bits,
or treated as a continuous value (a real number). Digital technology gets all
its "spark" from the the design advantages which digital systems provide over
analog ones. Digital components are basically all the same, digital machines
can easily be reconfigured, and digitization eliminates mechanical noise.
The gear was the first digital device, and Charles Babbage realized its power,
in a rather grand way, by designing the first mechanical (yes, geared) CPU.
From the invention of the pendulum clock to the Internet, digital technology
has fueled a revolution in engineering simply because of design considerations.
Really, the difference between the analog and the digital is the same as the
difference between the integers and the real numbers. (I can see it now: Pi
saying to 10 "I'm better than you are..." :)
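
As a rough sketch of that "mapping" point (the 5-volt scale and the particular
reading below are just made-up numbers), the same measurement can be read as
one bit, as two bits, or as a continuous value:

    def as_bits(voltage, n_bits, v_max=5.0):
        """Interpret the same physical voltage as an n_bits digital value."""
        levels = 2 ** n_bits
        return min(int(voltage / v_max * levels), levels - 1)

    v = 3.37               # one and the same (hypothetical) voltmeter reading
    print(as_bits(v, 1))   # 1    -- one bit: the "high" range
    print(as_bits(v, 2))   # 2    -- two bits: the third of four ranges
    print(v)               # 3.37 -- "analog": treat it as a continuous value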

(3) The human brain fits neither the "analog" nor "digital" characterizations.
The human brain (indeed, the whole nervous system) works with interacting,
phase-coupled oscillators which humans have yet to understand. So far, we may
have invented the analog, and invented the digital, but we have yet to
invent/understand the "Fourier" computing occurring in the human brain -- and,
sorry, understanding
is a prerequisite for deliberate design. Once we invent the kind of component
which the human brain uses, then we may go about designing AIs which employ
them, because we'll be able to translate AI implementations from analog and
digital design spaces into the design space of the new technology. But, until
we crack the human brain, we will just have to implement technologies like AI
using the techniques we do have.
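
For a taste of what "interacting oscillators" might mean, here is a toy
Kuramoto-style sketch in Python -- purely illustrative, with arbitrary
coupling, frequencies, and step size, and certainly not a claim about how
neurons actually compute:

    import math, random

    N, K, dt = 20, 1.5, 0.01
    freqs = [random.gauss(1.0, 0.1) for _ in range(N)]           # natural frequencies
    phases = [random.uniform(0, 2 * math.pi) for _ in range(N)]  # initial phases

    for _ in range(2000):
        # Each oscillator is pulled toward the phases of all the others.
        coupling = [K / N * sum(math.sin(p_j - p_i) for p_j in phases)
                    for p_i in phases]
        phases = [(p + (w + c) * dt) % (2 * math.pi)
                  for p, w, c in zip(phases, freqs, coupling)]

    # Order parameter r: 0 = incoherent, 1 = fully phase-locked.
    r = abs(sum(complex(math.cos(p), math.sin(p)) for p in phases)) / N
    print(f"synchrony r = {r:.2f}")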

(4) The use of a particular representation system is not a prerequisite for
engineering an AI. I think, by now, it is clear that the digital and the analog
are fundamentally the same and practically different. Besides, once an AI
becomes sentient, it will probably re-engineer itself, figure out how the
brain works, or invent a representation with more benefits than the digital,
the analog, or the "neuronal", anyway.

David Montenegro
sunrise2000@mediaone.net


