From: Mike Dougherty (firstname.lastname@example.org)
Date: Fri Jun 02 2006 - 19:17:44 MDT
On 6/2/06, CyTG <email@example.com> wrote:
> "In my estimate, the implementation in neural hardware is not necessary,
> and I believe that we have the hardware to do it now. In fact, I think
> we have had that hardware for about a decade. That is a rough estimate
> based on my understanding of human cognitive architecture, and my take
> on what the design of the first successful AGI will be like."
> I'm curious, what qualifies you guys to make such a statement?
> Don't get me wrong, if it's just the product of your own
> "uneducated" homegrown think tank, that's quite fine, but one could also get
> the impression that you're working in / educated in the field of CS/AI
> and/or some aspect of the human psyche.
> Just curious :o)
I wonder whether the estimate of hardware complexity for reproducing AI in a
computer is in any way dependent on the complexity of the brain doing the
estimating. Considering neuron density, extra-dimensional topology, carbon
nanotube / quantum-based "intuition", nutritional fluctuations impacting a
lifetime of memories that shape perceptions, etc., the only one ever to
know how much silicon it takes to make an intelligent computer will be the
first computer to declare itself intelligent. Of course, we should treat
that statement the same way we would treat each other... by asking, "What
makes you think you're intelligent?" (or possibly, "Yeah? Prove it.")
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:56 MDT