The Functionalist’s Dilemma

George Lakoff

Language, Consciousness, Culture: Essays on Mental Structure. Ray Jackendoff. xxiv + 403 pp. MIT Press, 2007.

Science, as Thomas Kuhn famously observed, does not progress linearly. Old paradigms remain as new ones supplant them. And science is very much a product of the times. The symbol-manipulation paradigm for the mind spread like wildfire in the late 1950s. Formal logic in the tradition of Bertrand Russell dominated Anglo-American philosophy, with W. V. O. Quine as the dominant figure in America. Formalism reigned in mathematics, fueled by the Bourbaki tradition in France. Great excitement was generated by the Church-Turing thesis of the equivalence between Turing machines, formal logic, recursive functions, and Emil Post’s formal languages. The question naturally arose: Could thought be characterized as a symbol-manipulation system?

The idea of artificial intelligence developed out of an attempt to answer this question, as did the information-processing approach to cognitive psychology of the 1960s. The mind was seen as computer software, with the brain as hardware. The software was what mattered. Any hardware would do—a digital computer or the brain, which was called “wetware” and seen (incorrectly) as a general-purpose processor. The corresponding philosophy of mind was called “functionalism,” which claimed that you could adequately study the mind independently of the brain, in terms of its functions, as carried out by the manipulation of abstract symbols.

The time was ripe for Noam Chomsky to adapt the symbol-manipulation paradigm to linguistics. Chomsky’s metaphor was simple: A sentence was a string of symbols. A language was a set of such strings. A grammar was a set of recursive procedures for generating such sets. Language was syntacticized—placed mathematically within a Post system, with abstract symbols manipulated in algorithmic fashion by precise formal rules.
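The grammar-as-recursive-procedures metaphor is easy to make concrete. Below is a minimal sketch in Python of a toy rewrite grammar in the generative tradition: nonterminals expand by formal rules until only terminal words remain, and the set of all strings the procedure can produce is the “language.” The particular rules and vocabulary here are hypothetical, chosen purely for illustration.

```python
import random

# Hypothetical rewrite rules: each nonterminal maps to a list of
# possible expansions (sequences of symbols).
GRAMMAR = {
    "S":   [["NP", "VP"]],
    "NP":  [["Det", "N"]],
    "VP":  [["V", "NP"]],
    "Det": [["the"], ["a"]],
    "N":   [["linguist"], ["grammar"]],
    "V":   [["studies"], ["generates"]],
}

def generate(symbol="S"):
    """Recursively expand a symbol into a list of terminal words."""
    if symbol not in GRAMMAR:          # terminal symbol: emit as-is
        return [symbol]
    expansion = random.choice(GRAMMAR[symbol])
    words = []
    for sym in expansion:
        words.extend(generate(sym))
    return words

# Prints one randomly generated sentence, e.g. a Det-N-V-Det-N string.
print(" ".join(generate()))
```

Note that nothing in the procedure consults meaning, context, or a speaker: the rules manipulate symbols and only symbols, which is precisely the point the review turns to next.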
Since the rules could not look outside the system, language had to be “autonomous”—independent of the rest of the mind. Meaning and communication played no role in the structure of language. The brain was irrelevant. The idea was called “generative linguistics,” and it continues in many US linguistics departments.

By the mid-1970s, there was another paradigm shift. Neuroscience burst onto the intellectual stage. Cognitive science developed beyond formalist cognitive psychology to include neural models. And a linguistic theory committed to viewing language in terms of the brain, and integrated with other aspects of the mind, developed. It was called “cognitive linguistics,” and it has been steadily developing into a rigorously formulated neural theory of language, based on the theory of neural computation and on actual developments in neuroscience. Language and thought are seen as physically embodied and carried out biologically, not as operations of an abstract symbol-manipulation system.