Well, the point of the example (and Searle has elaborated on this since) is that symbol manipulation is not consciousness: i.e., syntactical rules are not sufficient to produce consciousness. Something has to posit a symbol and give it meaning. Here is where Searle and I part ways (that's why I said I've got a bit of what he calls "strong Artificial Intelligence" in my view): he calls this the "deep" problem of consciousness, whereas I call it a trivial objection.

If you look at a grid of light and dark, your visual cortex will be activated in a startlingly similar grid pattern. Your visual cortex will then produce an output accordingly, which will be transformed from that pattern to another pattern in the associative cortices and other areas, which will further encode, export, and weight the signal depending on context. So the central nervous system is already set up to handle encoded material by its very nature. The positing of an external symbol to represent an internal state is therefore not a big deal: all external signals must be converted to internal states in order to be used, so the adoption of an object as having a significance beyond its immediate material meaning (i.e., the adoption of a symbol) is philosophically trivial: the symbol is immediately converted to an internal state. The neurobiology of learning will handle the rest: many stimuli can acquire the significance of an apparently unrelated state by a variety of mechanisms (the simplest being temporal association). How we are set up to learn language is a very deep problem for empirical science, but honestly, I see it (in principle) as no more than a symbol-manipulating pathway reflected on itself.

Something has to experience consciousness: it is a subjective quality. Hence whatever runs the set of programs (to my way of thinking), with the appropriate input and output, will experience consciousness. However, the syntax is not consciousness.
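The syntax-versus-semantics point can be made concrete with a toy sketch (the symbols and the rule table below are invented for illustration; Searle's actual thought experiment uses Chinese characters and a man with a rule book): a program that answers purely by shape-matching, attaching no meaning to anything it handles.

```python
# A toy "Chinese Room": purely syntactic symbol manipulation.
# The symbols and rule book are hypothetical, made up for illustration.
RULE_BOOK = {
    "SQUIGGLE": "SQUOGGLE",  # "if you see this shape, emit that shape"
    "SQUOGGLE": "SQUIGGLE",
}

def room_reply(symbol: str) -> str:
    """Look up a response by shape alone; nothing in this function
    knows or represents what any symbol means."""
    return RULE_BOOK.get(symbol, "NO-RULE")

# The room produces well-formed output without understanding any of it.
print(room_reply("SQUIGGLE"))  # -> SQUOGGLE
```

To an outside observer the replies can look competent, but the whole system is syntax: the force of the thought experiment is that no amount of such rule-following, by itself, posits meaning.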
Hence a scientific description of consciousness is of necessity a detailed elaboration of the processes that are necessary and sufficient to be up and running while the organism in question is experiencing consciousness. Consciousness itself cannot be reduced to either circuitry or algorithms, because it is neither.

For a typically lucid recent exposition of Searle's ideas, go here:
http://www.cogsci.soton.ac.uk/~harnad/Papers/Py104/searle.prob.html

I do not agree with him on everything (as you can see), but he stands out as a very clear writer, on this subject or any other.