Monday, February 1, 2016

What is General Intelligence?

Several threads on this forum talk about Artificial General Intelligence (AGI), and about how all current (and foreseeable) examples of AI are not AGI; invariably they are highly specialized applications, however impressive in their specific domains. The unstated assumption in all these discussions is that humans possess General Intelligence (of the non-Artificial variety). But is this really true?

We evolved to navigate primate social hierarchies, to deal with the kinds of threats a primate would normally encounter in Africa, and to reproduce. Consequently, our brains are incredibly good at some tasks - namely, tasks we tend to consider trivial despite the enormous amount of computation they often require. And we absolutely suck at other tasks, often objectively less computation-intensive, because performing them did not affect our ancestors’ survival.

For example, we are incredibly good at recognizing human faces, and at distinguishing one face from another. Whereas if presented with a bunch of corn cobs, or pieces of coral, which objectively (i.e. by edge count) differ as much as the human faces in a typical room, most people will have trouble telling them apart, and will have absolutely no chance of picking out a specific corn cob a few days after seeing it. “Recognize faces”, “recognize corn cobs” and “recognize pieces of coral” are really the same problem, from the viewpoint of General Intelligence. But our brains are optimized for only one of them.

Likewise, without specialized training humans tend to be terrible at three-dimensional navigation, at evaluating low-probability threats, and at games like the Prisoner’s Dilemma. Heck, it just occurred to me that the entire JREF forum is pretty much a litany of all the different ways Human Intelligence goes haywire because it is not General Intelligence!
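Since the Prisoner’s Dilemma is named above as a case where untrained human intuition misfires, here is a minimal sketch of why it is genuinely tricky. The payoff values are the conventional textbook ones, not anything from the post: for each individual player defection is the dominant move, yet mutual cooperation pays both players more than mutual defection.

```python
# Standard one-shot Prisoner's Dilemma payoff matrix (textbook values).
# payoffs[(my_move, their_move)] -> my payoff; "C" = cooperate, "D" = defect.
PAYOFFS = {
    ("C", "C"): 3,  # both cooperate: decent outcome for both
    ("C", "D"): 0,  # I cooperate, they defect: I get the sucker's payoff
    ("D", "C"): 5,  # I defect, they cooperate: I get the temptation payoff
    ("D", "D"): 1,  # both defect: poor outcome for both
}

def best_response(their_move):
    """Return my payoff-maximizing move, given the opponent's move."""
    return max("CD", key=lambda my_move: PAYOFFS[(my_move, their_move)])

# Defection is dominant: it is the best response to either opponent move...
assert best_response("C") == "D"
assert best_response("D") == "D"

# ...yet mutual cooperation beats mutual defection for both players.
assert PAYOFFS[("C", "C")] > PAYOFFS[("D", "D")]
```

The clash between the individually rational move and the jointly better outcome is exactly the kind of structure our socially tuned intuitions handle poorly without training.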

So my question is: Is General Intelligence (Artificial or otherwise) even possible? And if it is, how would we recognize it?

Of course, you could use the same cop-out most science fiction writers do, and simply define AGI as “thinks just like a human, except it can calculate orbits in real time, and is maybe bad at reading emotions”. But that’s a cop-out.


via International Skeptics Forum http://ift.tt/1STZD70
