A popular idea among AI people is that the Turing Test is a decent measure of both intelligence and consciousness. I disagree. The Turing Test is not a measure of consciousness, nor does it actually measure anything about the computer being evaluated. If anything, the Turing Test measures the intelligence of the human doing the evaluating. Is the human smart enough to tell that the computer isn't really a human? If a computer passes the Turing Test, that proves nothing about the computer, but it may prove that the human evaluating it is not very smart.