From: Bryan Bishop (firstname.lastname@example.org)
Date: Wed Jun 25 2008 - 07:58:23 MDT
On Wednesday 25 June 2008, Anthony Berglas wrote:
> Thanks for your feedback, responses below...
> The ability to make new physical hardware might eventually limit
> exponential growth of intelligence, but at a point so far in the
> distance that I do not think that it is relevant.
There's going to be a maximum for the functionality that can be
expressed within 500 gigabytes, or however large your most recent hard
drive is.
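That bound is just counting: a drive of fixed size can only be in finitely many states. A quick back-of-the-envelope sketch (the 500 GB figure is the one from this thread, not a claim about any particular disk):

```python
import math

# A 500 GB drive holds 500e9 bytes = 8 * 500e9 bits, so it can be in at
# most 2**(8 * 500e9) distinct states -- an astronomically large but
# finite ceiling on the programs (and functionality) it can express.
DRIVE_BYTES = 500 * 10**9  # assumed drive size from the discussion
bits = 8 * DRIVE_BYTES

# 2**bits is far too big to print, so report its size in decimal digits.
digits = math.floor(bits * math.log10(2)) + 1
print(digits)  # roughly 1.2 trillion digits
```

The number is so large that the limit is irrelevant in practice, which is the point being conceded here: it's a maximum, just a very distant one.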
> At 03:05 PM 25/06/2008, Bryan Bishop wrote:
> > > Hardware has certainly become much, much faster, but software has
> > > just become much, much slower to compensate. We think we
> > > understand computers and the sort of things they can do.
> >Are you a programmer, and have you any idea?
> Yes. But in any case, you know that modern machines do not feel
> amazingly fast.
I'm not so sure about that. I get awesome performance out of my
software. It's how I'm able to do so much, really. I ruthlessly
eliminate overhead, and that sometimes means getting rid of the GUI.
> >What we know,
> >as of now, is that the brain is doing something awesome, and that we
> >want to figure out how to do it in other areas too.
> Certainly the brain is cleverly "designed". But it does not contain
> a large amount of arbitrary complexity, only 750K of DNA.
Okay, so let's work with this. Suppose the genome really does amount to
750 kilobytes. The information encoded in the codons specifies amino
acids, which in turn deterministically specify the proteins that get
constructed. In principle we could use our knowledge of protein folding
to simulate all of the proteins being constructed. But the
functionality those proteins provide isn't necessarily encoded in the
genome itself.
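The first step of that chain really is a simple table lookup. A minimal sketch, using only a four-codon subset of the standard genetic code (a real translator needs all 64 codons):

```python
# Tiny subset of the standard genetic code, for illustration only.
CODON_TABLE = {
    "ATG": "M",  # methionine (start)
    "TGG": "W",  # tryptophan
    "GCT": "A",  # alanine
    "TAA": "*",  # stop
}

def translate(dna):
    """Translate a DNA coding sequence into a one-letter protein string."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        aa = CODON_TABLE.get(dna[i:i + 3], "?")  # "?" marks codons not in the subset
        if aa == "*":  # stop codon ends translation
            break
        protein.append(aa)
    return "".join(protein)

print(translate("ATGTGGGCTTAA"))  # -> MWA
```

That's the part the genome spells out directly. Folding the resulting chain, and everything the folded protein then does in a cell, is not written down anywhere in the sequence, which is the gap being pointed at above.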
> For an AI to compete with man it should probably run at at least
> man's thinking speed.
I don't know what compete means in this context.
And I don't know what thinking speed means. Thoughts per second?
How are you measuring a thought in a man?
> > > One major driver will be the need for practical intelligence as
> > > robots leave the factory and start to interact with the real
> > > world.
> >No, that's more a 'driver' for people to come to terms with the
> > problems and realize that they might be interested in working on
> > them, it's nothing about the actuality of solving them
> Exactly. And then, by trying, probably having some success.
> > > In particular cars can already drive themselves over rough desert
> > > tracks and down freeways.
> >You're talking about physical manufacturing and mechanics, tasks
> > that machines can already do. Intelligence isn't really needed for
> > those things.
> Actually, the machine needs to be able to see/sense the environment,
> determine a route through it, and react to changes. Certainly not
> full AI, but much, much smarter than the program that tots up your
> bank balance.
Maybe, but there are other available solutions for those problems, like
automated transportation systems and so on. Having AI solve the driving
problem is a major hack. And I know it's starting to work, but it's all
backwards. The idea is to get AI, not a driving machine. There are many
intelligent people who can't drive, for instance. But if I threw a CPU
into my 2005 Mustang and it started driving itself, I wouldn't be
complaining. But I wouldn't be shouting AI, either.
> > Who cares if you are out of work? The machines are taking care
> > of the necessities of life anyway, yes? Then what's the big deal?
> People pay me because I can do valued work. If they do not pay me I
> may starve. Socialism is all very well, but I would prefer not to
> bet on it.
You'd have machines that give you food, much like they do now. Actually,
at the moment, I think the agricultural industry is still using
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:03 MDT