Re: What would an AGI be interested in?

From: maru dubshinki (marudubshinki@gmail.com)
Date: Sun Aug 13 2006 - 23:03:18 MDT


On 8/13/06, Tennessee Leeuwenburg <tennessee@tennessee.id.au> wrote:
> Michael Anissimov wrote:
> > Tennessee,
> >
> > An AGI is not a concrete "thing"; it is a huge space of possibilities.
> > It is a set defined only by the characteristics of general
> > intelligence and being built artificially. There are more possible
> > AGIs than there are bacteria on Earth.
> I beg to request more clarification. Eliezer promotes (for example)
> Bayes as a possibly perfect way of reasoning and inference. If this is
> so, does this not imply that all questions have a correct,
> non-subjective answer? If the correctness of Bayesian reasoning is
> non-subjective, does this not mean that any perfectly reasoning
> AGI would in fact reach the same conclusion?
[...]
> -T

If that sort of thing really interests you, you might find some of
Robin Hanson's work right up your alley. From
http://hanson.gmu.edu/vita.html, I remember "Are Disagreements
Honest?" and "For Bayesian Wannabes, Are Disagreements Not About
Information?" as being applicable here. (I could have sworn I read one
paper along the lines of "Disagreements between two rational actors
are not honest", but I can't seem to find it now.)
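
For concreteness, here is a minimal sketch of the point the thread is
circling (my own illustration, not from either paper, with arbitrary
placeholder numbers): two ideal Bayesian reasoners who share the same
prior and condition on the same evidence necessarily compute the same
posterior, which is the formal core that the "can't agree to disagree"
results build on.

    def posterior(prior_h, p_e_given_h, p_e_given_not_h):
        # Bayes' rule: P(H|E) = P(E|H)P(H) / [P(E|H)P(H) + P(E|~H)P(~H)]
        joint_h = p_e_given_h * prior_h
        joint_not_h = p_e_given_not_h * (1.0 - prior_h)
        return joint_h / (joint_h + joint_not_h)

    # Both agents share a common prior and observe the same evidence
    # (all numbers are arbitrary placeholders for the illustration).
    common_prior = 0.5
    p_e_given_h, p_e_given_not_h = 0.8, 0.3

    agent_a = posterior(common_prior, p_e_given_h, p_e_given_not_h)
    agent_b = posterior(common_prior, p_e_given_h, p_e_given_not_h)

    print(agent_a, agent_b)    # ~0.727 for both
    assert agent_a == agent_b  # no room left for honest disagreement

Roughly, Hanson's papers argue that persistent disagreement must then
come from differing priors, differing information, or something short
of ideal reasoning.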

~maru


