Is intelligence merely the ability to learn, answer questions, and solve logic problems, or is it based on something deeper -- for example, the ability to think creatively, to do unexpected things, to behave irrationally?
I tend to think that the essence of human-like intelligence is the ability to make creative use of both rationality and irrationality. This is something that would be very hard for a logic machine to emulate, because computers simply cannot be irrational. Humans are less limited than computers: our minds can handle fuzziness, incompleteness, inconsistencies, infinities, and paradoxes that computers are simply too brittle to process.

Perhaps more chaotic approaches to AI will be better in the long run. Neural networks, genetic algorithms, and cellular automata use "bottom-up" emergent computation that, on the surface, seems to resemble human cognition more closely than old-fashioned expert systems do. If these systems are suitably complex, their output cannot be determined without simply running them for some period of time -- in other words, there is no way to compress their computations (the sketch below illustrates this). At root, however, these systems are still implemented on top of logic machines, so their potential for implementing human-like intelligence comes down to the question of whether the human brain is itself equivalent to a logic machine.

Ultimately the answer to all of this may be tied to quantum mechanics -- and the Uncertainty Principle. Is there, at a fundamental level, a certain amount of uncertainty in the process of human cognition, and is that uncertainty actually essential to it? If the human brain is doing quantum computation, is it still equivalent to a simple logic machine, or is it doing something that transcends the limits of logic machines?
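To make the point about incompressible, bottom-up computation a bit more concrete, here is a minimal sketch in Python of an elementary cellular automaton. Rule 30 is my choice of example (the discussion above mentions cellular automata in general, not this particular rule): each cell follows a trivial local rule, yet the only known way to find out what row N looks like is to actually compute every row before it.

```python
# Minimal sketch: an elementary cellular automaton (Rule 30) as an example of
# "bottom-up" emergent computation. Each cell's next state depends only on its
# immediate neighborhood, yet the global pattern is effectively unpredictable:
# there is no known shortcut to step N other than running steps 1..N.

RULE = 30  # the update rule, encoded as an 8-bit lookup table

def step(cells):
    """Apply one synchronous update to a row of 0/1 cells (edges wrap around)."""
    n = len(cells)
    nxt = []
    for i in range(n):
        left, center, right = cells[i - 1], cells[i], cells[(i + 1) % n]
        index = (left << 2) | (center << 1) | right  # neighborhood as a 3-bit number
        nxt.append((RULE >> index) & 1)              # look up the cell's new state
    return nxt

def run(width=64, steps=32):
    # Start from a single "on" cell in the middle and just iterate:
    # the only way to learn what row N contains is to compute all rows before it.
    cells = [0] * width
    cells[width // 2] = 1
    for _ in range(steps):
        print("".join("#" if c else "." for c in cells))
        cells = step(cells)

if __name__ == "__main__":
    run()
```

Running it prints the familiar chaotic triangle of Rule 30; nothing in the eight-entry local rule hints at the structure of the global pattern, which is the sense in which the computation cannot be compressed.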