Re: Flare

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Tue Jul 17 2001 - 16:34:31 MDT


Ben Goertzel wrote:
>
> > Regardless, Flare may not be the perfect tool out of the box, but I'd be
> > really surprised if it isn't (a) a much better starting point and (b)
> > easier to improve.
>
> I'm afraid you may be underestimating the vast amount of work required to
> make a scalable, efficient programming language.

Perhaps. On the other hand, it doesn't need to be a vast amount of work
put in by people on the SIAI payroll.

While Webmind may currently be recoding some operations in C++, I'm not
sure you'd have been as successful if you had set out to work in C++ from
the beginning. I certainly don't think C++ would be a good idea for SIAI.

As I recall from your own discussions, Java isn't scalable. Maybe a more
modern programming language, designed in an era of Beowulf networks, will
be nicely and transparently scalable - if we handle it right. Perhaps
that will take an army of coders, but if an open-source project is cool
enough, it can *get* an army of coders. And our alternative is Python; I
don't know whether that's more or less scalable than Java, but my guess
would be less.

> I think that having the right programming language could make self-modifying
> AI go a bit smoother. But still, I believe that understanding the
> conceptual basis of its own functioning is a much bigger task for a young
> would-be-self-modifying AI to overcome, than understanding the particulars
> of the programming language in which it's implemented. An easy language
> will make the first stages of moving toward intelligent self-modification go
> a bit faster, but it won't help with the hard parts.

I agree. A self-optimizing AI is not the same thing as a seed AI; in
fact, I think that you strongly overestimate the degree to which providing
a mathematical representation for code is progress toward understanding
programs. Nonetheless, I think that self-optimization has a certain role
to play in creating the underlying flexibility required to reach the
infrahuman level of general intelligence needed for the beginnings of
true seed AI.

-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:36 MDT