Re: AGI project planning

From: Ben Goertzel (ben@goertzel.org)
Date: Mon Dec 05 2005 - 11:41:32 MST


Hi,

> Based on your past statements I'm pretty sure you're well aware
> of all that Ben and would acknowledge your statement as hopeful
> optimism; e.g. 'given the above, I feel we would have a
> nontrivial chance of success...', which is fine. I'm just
> reminding everyone else, if they haven't already concluded as
> much from the sheer number of people who've said something like
> 'given enough funding to make my project feel serious and well
> equipped, we can probably build an AGI in X years' and failed
> miserably, not to take these kind of claims as serious
> predictions. It is unfortunate that the funding environment
> strongly rewards those who claim 'Yes, I can definitely build
> an AGI in X years given Y million dollars' and sound convincing
> about it; it encourages both scams and honest self-delusion. I
> am glad that so far I have been able to raise funding without
> having to make such claims.
>
> * Michael Wilson

Michael, yes, I basically agree with these sentiments you have expressed.

This conversation reminded me of a recent negotiation with some
potential investors in Novamente, which did not end successfully.

I suspect that if the Novamente team and I had taken more of an
evangelical, absolutely-certain stance, then we might well have gotten
the funding from these individuals.

But we presented ourselves honestly, as a group of individuals with
different estimates of the time it would take to complete our project
and of our ultimate odds of success. All of us on the team think the
project has a nontrivial probability of achieving human-level AGI
within < 15 years, but each of us has our own specific probability
estimates of the amount that can be achieved within each particular
period of time, and we all realize that our estimates are highly
uncertain. (FYI, I have the highest estimate of the odds of eventual
success of anyone on the team, but perhaps not the shortest estimate
of the expected time to success in the case that success is achieved.
So I am not the biggest optimist by every measure.)

Unfortunately, it seemed that presenting our opinions and attitudes
honestly in this way turned off the investors, who, it appeared, would
have been happier to hear everyone on the team say: "We are all
absolutely certain that we will create a human-level AGI within 7
years using X amount of money and Y amount of computing resources."

I am happy you have been able to find funding for your work while
presenting your case honestly. I will keep presenting my case
honestly as well and hope that this will eventually lead to finding an
adequate amount of funding for the Novamente project. Otherwise,
perhaps someone else with superior salesmanship skills and a
different, also workable, AGI design will succeed in raising funding
for their AGI system first and beat me to the end goal. (If so, I
sure hope they're both benevolent and careful... ;-p)

-- Ben G
