From: CyTG (firstname.lastname@example.org)
Date: Thu Jan 12 2006 - 07:01:26 MST
I'm trying to wrap my head around this AI thing, and in particular how far along
we are in terms of computational power compared to what's going on in the human brain.
I know many believe that there are shortcuts to be made, even improvements on
the model nature has provided us with, the biological neural network.
Still. Humor me.
Here are my approximate assumptions, based on practical experience with ANNs
and some Wikipedia reading.
Computational power of the human mind:
100*10^9 neurons with ~1000 connections each gives about 100*10^12 = 10^14
connections working _at the same time_. Now, on average a neuron fires about
80 times each second, and that gives us a whopping ~10^16
operations/computations each second.
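Spelled out, the arithmetic looks like this (a back-of-the-envelope sketch;
the neuron count, fan-out, and firing rate are just the rough figures assumed
above):

    # Back-of-the-envelope estimate of the brain's raw throughput,
    # using the rough figures assumed above.
    neurons = 100e9          # ~10^11 neurons
    connections_per = 1000   # ~10^3 connections per neuron
    firing_rate = 80         # ~80 firings/second on average

    synapses = neurons * connections_per      # ~10^14 concurrent connections
    ops_per_second = synapses * firing_rate   # ~8*10^15 ~= 10^16 ops/s
    print(f"{synapses:.0e} connections, {ops_per_second:.0e} ops/s")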
On my machine, a 3GHz workstation, I'm able to run a feedforward network at
about 150,000 operations/second WITH training (backprop). Take training
out of the equation and we may, let's shoot high, land on 1 million 'touched'
neurons/second. Now, going from 10^6 to 10^16 .. that's one hell of a big gap.
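For anyone who wants to reproduce the workstation figure, here is a minimal
sketch of the kind of measurement I mean (a plain numpy feedforward pass;
the layer sizes are illustrative assumptions, not the exact network I used):

    # Minimal sketch: measure 'touched neurons per second' for a plain
    # feedforward pass. Layer sizes below are illustrative assumptions.
    import time
    import numpy as np

    sizes = [100, 100, 100]   # neurons per layer (assumed)
    weights = [np.random.randn(a, b) for a, b in zip(sizes, sizes[1:])]

    def forward(x):
        for w in weights:
            x = np.tanh(x @ w)   # one 'touch' per neuron per layer
        return x

    x = np.random.randn(sizes[0])
    passes = 10_000
    start = time.perf_counter()
    for _ in range(passes):
        forward(x)
    elapsed = time.perf_counter() - start
    touched = passes * sum(sizes[1:])   # neurons activated in total
    print(f"~{touched / elapsed:.0f} touched neurons/second")

Against the ~10^16 ops/s estimate above, even a generous 10^6 touched
neurons/second leaves a gap of roughly ten orders of magnitude.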
Also, thinking about training over several training sets (as is usually the
case), wouldn't I be correct in making an analogy to linear algebra? Think of
each training set as a vector, each set having its own direction. In essence,
two identical training sets would be linearly 'dependent' on each other and
subject to elimination? (I'm thinking there could be a mathematically sound
approach here towards eliminating semi-redundant training data; see the
sketch below.)
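As a concrete version of that intuition, here is a minimal sketch (my own
illustration, not an established pruning method): stack the training vectors
as rows of a matrix and use the matrix rank to spot linearly dependent
examples. The data here is made up for illustration.

    # Sketch: detect linearly dependent training vectors via matrix rank.
    import numpy as np

    a = np.array([1.0, 2.0, 3.0])
    b = np.array([2.0, 4.0, 6.0])    # a scaled copy of a: linearly dependent
    c = np.array([0.0, 1.0, 0.0])

    X = np.vstack([a, b, c])         # one training vector per row
    print(np.linalg.matrix_rank(X))  # prints 2, not 3: one row is redundant

    # Dropping rows one at a time shows which ones add no new direction.
    full_rank = np.linalg.matrix_rank(X)
    for i in range(len(X)):
        reduced = np.delete(X, i, axis=0)
        if np.linalg.matrix_rank(reduced) == full_rank:
            print(f"row {i} is (linearly) redundant")

Of course, real training sets are only ever approximately dependent, so in
practice one would look at small singular values rather than exact rank.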
Hope it's not too far off topic!