Re: answers I'd like, part 2

From: Adam Safron (asafron@gmail.com)
Date: Fri Nov 16 2007 - 09:34:44 MST


Well said. From this conversation, I can see two important
implications:

1. Emulating the brain is probably an inefficient (and potentially
unethical or even dangerous) way of achieving human-level AI.

2. Digital uploading may prove so difficult and expensive that it
isn't worthwhile. Instead, we could transcend biology by replacing
our neurons with synthetic computational equivalents. This seems like
a rational first step for a few reasons:
        a) Synthetic neurons can be monitored and replaced.
        b) Synthetic neurons could be robust under a wider range of
conditions (e.g. permanently jacked into a simulated world without an
actual bio-body).
        c) Synthetic neurons are potentially much faster than bio-neurons.
        d) Synthetic neurons can potentially keep logs of their activity
such that we could create "back-ups" of our minds.
        e) If you want to digitize someone's mind, you're going to need to
know the state of a huge number of neurons (and glia, etc.) that are
changing from moment to moment. With synthetic neurons, you would
know that you have all of the important functional details. In fact,
I would argue that this kind of replacement is a necessary
prerequisite to uploading. But even if you don't like uploading,
these data sets will be enormously valuable for understanding the
details of how the brain produces mind.

But first we need advanced nanotechnology and some means of
controlling and programming the nanobots. Or maybe (probably?) the
AGI people will have succeeded long before we reach that kind of
technological sophistication. In that case, the AGI will be able to
tell us whether or not this is the optimal first step for maximizing
human potential.

-a

On Nov 16, 2007, at 9:42 AM, Stathis Papaioannou wrote:

> On 17/11/2007, Adam Safron <asafron@gmail.com> wrote:
>> Everything you say here makes sense to me, but I think I disagree
>> with respect to the necessity of molecular-level modeling for
>> making a functioning intelligence based on an emulated brain.
>> Surely, molecular-level information will be necessary to make an
>> accurate model of any specific intelligence (e.g. uploading).
>> However, we could emulate the functional properties of the brain
>> (though not any specific brain) without keeping track of every
>> molecule. And if we did keep track of every molecule, it seems that
>> the required computational resources would make emulating a brain
>> nearly impossible, even with the ability to harness sl4 levels of
>> computation.
>
> I think you would need both molecular-level imaging and modelling to
> simulate a particular human mind, but a coarser level of imaging
> would do to simulate a generic human mind. But that doesn't mean a
> computationally far simpler human-level intelligence is impossible.
> The brain did not evolve with the goal of making it easy to simulate.
>
>
>
>
> --
> Stathis Papaioannou
>


