Author: coolerthanranch
Subject: Re: Consciousness and its relationship to Evolution
Date: 6/6/2001 5:02 PM
Recommendations: 1
Well, the point of the example (and Searle has elaborated on this since) is that symbol manipulation is not consciousness: i.e., syntactic rules are not sufficient to produce consciousness. Something has to posit a symbol and give it meaning.
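
To make that concrete, here is a minimal sketch (my own illustration, in Python, not anything of Searle's) of what pure symbol manipulation looks like: a "room" that answers by looking up input symbols in a made-up rule book. The symbols and rules here are invented; the point is that the program produces the right-looking outputs without anything in it ever assigning them meaning.

# A made-up rule book: input "squiggles" map to output "squoggles".
# Following these rules requires no understanding of what, if anything,
# the symbols mean.
RULE_BOOK = {
    "squiggle-1": "squoggle-7",
    "squiggle-2": "squoggle-3",
}

def room(symbol):
    """Apply the syntactic rule for the incoming symbol."""
    return RULE_BOOK.get(symbol, "squoggle-0")

print(room("squiggle-1"))  # a fluent-looking reply, but no meaning is posited anywhere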

Here is where Searle and I part (that's why I said I've got a bit of what he calls "strong artificial intelligence" in my view): he calls this the "deep" problem of consciousness, whereas I call it a trivial objection. If you look at a grid of light and dark, your visual cortex will be activated in a startlingly similar grid pattern. Your visual cortex will then produce an output accordingly, which will be transformed from that pattern into another pattern in the associative cortices and other areas, which will further encode, export and weight the signal depending on context. So the central nervous system is already set up to handle encoded material by its very nature.

The positing of an external symbol to represent an internal state is therefore not a big deal: all external signals must be converted to internal states in order to be used, so the adoption of an object as having a significance other than its immediate material meaning (i.e., the adoption of a symbol) is philosophically trivial: the symbol is immediately converted to an internal state. The neurobiology of learning will handle the rest: many stimuli can acquire the significance of an apparently unrelated state by a variety of mechanisms (the simplest being temporal association). How we are set up to learn language is a very deep problem for empirical science, but honestly, I see it (in principle) as no more than a symbol-manipulating pathway reflected back on itself.
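
As a toy illustration of the simplest mechanism mentioned above, temporal association, here is a sketch (mine, with made-up parameter values, in the spirit of a Rescorla-Wagner-style update): a neutral stimulus that repeatedly co-occurs with a significant outcome gradually acquires the same predictive weight, i.e., the significance of an apparently unrelated state.

LEARNING_RATE = 0.2   # illustrative value only

def pairings(trials, weight=0.0):
    """Repeatedly pair a neutral stimulus with an outcome of magnitude 1.0."""
    for _ in range(trials):
        prediction_error = 1.0 - weight   # outcome minus current expectation
        weight += LEARNING_RATE * prediction_error
    return weight

for n in (1, 5, 20):
    print("after %2d pairings, associative weight = %.3f" % (n, pairings(n)))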

Something has to experience consciousness: it is a subjective quality. Hence whatever runs the set of programs (to my way of thinking) with the appropriate input and output will experience consciousness. However, the syntax is not consciousness. A scientific description of consciousness is therefore, of necessity, a detailed elaboration of the processes that are necessary and sufficient to be up and running while the organism in question is experiencing consciousness. Consciousness itself cannot be reduced to either circuitry or algorithms, because it is neither.

For a typically lucid recent exposition of Searle's ideas, go here: http://www.cogsci.soton.ac.uk/~harnad/Papers/Py104/searle.prob.html

I do not agree with him on everything (as you can see), but he stands out as a very clear writer, whether on this subject or any other.
