Well, the point of the example (and Searle has elaborated on this since) is that symbol manipulation is not consciousness: i.e., syntactical rules are not sufficient to produce consciousness. Something has to posit a symbol and give it meaning.
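To make the "syntax without semantics" point concrete, here is a toy sketch of the Chinese Room as pure rule-following. The rulebook and symbols are invented for illustration; the point is that the procedure can return an appropriate-looking symbol without any step that assigns meaning.

```python
# A toy "Chinese Room": the rulebook is a bare lookup table.
# These input/output pairings are made up for illustration only.
RULEBOOK = {
    "你好": "你好",          # hand this squiggle back when shown this one
    "你是谁": "我是一个房间",  # the operator needn't know what either string means
}

def chinese_room(symbol: str) -> str:
    """Apply the syntactic rule for an input symbol; echo unknown symbols."""
    return RULEBOOK.get(symbol, symbol)

print(chinese_room("你好"))  # a fluent-looking reply, produced with zero understanding
```

Nothing in `chinese_room` distinguishes a meaningful answer from an arbitrary string substitution, which is exactly Searle's objection to equating rule execution with understanding.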

Here is where Searle and I part (that's why I said I've got a bit of what he calls "strong Artificial Intelligence" in my view): he calls this the "deep" problem of consciousness, whereas I call it a trivial objection. If you look at a grid of light and dark, your visual cortex will be activated in a startlingly similar grid pattern. Your visual cortex will then produce an output accordingly, which will be transformed from that pattern into another pattern as it passes to the associative cortices and other areas, which will further encode, export and weight the signal depending on context. So the central nervous system is already set up to handle encoded material by its very nature. The positing of an external symbol to represent an internal state is therefore not a big deal: all external signals must be converted to internal states in order to be used, so the adoption of an object as having a significance other than its immediate material meaning (i.e., the adoption of a symbol) is philosophically trivial: the symbol is immediately converted to an internal state. The neurobiology of learning will handle the rest: many stimuli can acquire the significance of an apparently unrelated state by a variety of mechanisms (the simplest being temporal association). How we are set up to learn language is a very deep problem for empirical science, but honestly, I see it (in principle) as no more than a reflection of a symbol-manipulating pathway reflected on itself.
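The temporal-association mechanism mentioned above can be sketched in a few lines. This is a minimal Hebbian-style toy, not a biological model: a neutral stimulus is repeatedly paired with an innately effective one, and its connection weight grows until it can drive the response on its own. The learning rate and starting weights are arbitrary illustrative values.

```python
def condition(trials: int, rate: float = 0.2) -> float:
    """Pair a neutral stimulus with an innate one; return the learned weight."""
    w_neutral = 0.0   # neutral stimulus starts with no effect on the response
    w_innate = 1.0    # innate stimulus already drives the response fully
    for _ in range(trials):
        # Both stimuli are present together; the response is capped at 1.0.
        response = min(1.0, w_neutral + w_innate)
        # Hebbian update: strengthen the neutral input in proportion to the
        # co-occurring response, saturating as the weight approaches 1.0.
        w_neutral += rate * response * (1.0 - w_neutral)
    return w_neutral

print(condition(10))  # after repeated pairing, the weight approaches 1.0
```

After ten pairings the neutral stimulus alone carries most of the response, which is the sense in which an initially meaningless signal "acquires the significance" of an unrelated internal state through nothing more than repeated co-occurrence.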

Something has to experience consciousness: it is a subjective quality. Hence whatever runs the set of programs (to my way of thinking) with the appropriate input and output will experience consciousness. However, the syntax is not consciousness. Hence a scientific description of consciousness is of necessity a detailed elaboration of the processes that are necessary and sufficient to be up and running while the organism in question is experiencing consciousness. Consciousness itself cannot be reduced to either circuitry or algorithms, because it is neither.

For a typically lucid recent exposition of Searle's ideas, go here.

I do not agree with him on everything (as you can see), but he stands out as a very clear writer, on this subject or any other.
