John R. Searle sees gaping cracks in the edifice of the mind constructed by cognitive scientists. Searle, a philosopher at the University of California, Berkeley, peruses the mental rules, representations, and computer programs that buttress the cognitive citadel with the eye of a skeptical contractor. Watch out for falling bricks, he warns; the structure lacks the mortar of consciousness to hold it together.

More than anything else, it is the neglect of consciousness that accounts for so much barrenness and sterility in psychology, the philosophy of mind, and cognitive science, Searle asserts. Although that remark will win him no popularity contests among scientists of the mind, it reflects a recently renewed interest in deciphering the nature of consciousness. From a variety of perspectives, scientists are now trying to define more clearly what they mean when they refer to conscious and unconscious mental activity.

Searle first rankled cognitive scientists in 1980, when he published his widely cited Chinese Room argument, an attack on the notion, promoted by advocates of artificial intelligence, that the mind corresponds to a computer program implemented in the hardware of the brain. Searle compared the computers favored by artificial intelligence enthusiasts to a person who does not speak Chinese but sits in a room with Chinese dictionaries and a filing system. If an outsider slips questions written in Chinese under the door, the person uses the reference works to compose answers in Chinese. Responses emerging from the room might prove indistinguishable from those of a native Chinese speaker, Searle contended, even though the person toiling in the Chinese Room understands neither the questions nor the answers.

The moral of this exercise: A system such as a computer can successfully employ a set of logical rules without knowing the meaning of any of the symbols it manipulates with those rules (a toy sketch of such a rule-following system appears at the end of this section).

Supporters of strong artificial intelligence view the Chinese Room as a flimsy sanctuary from the argument that a properly programmed computer possesses a mind. Philosopher Daniel C. Dennett of Tufts University in Medford, Mass., calls Searle's analogy simplistic and irrelevant. A computer program that could hold its own in a conversation would contain layers of complex knowledge about the world, its own responses, the likely responses of a questioner, and much more, Dennett contends. Indeed, computers have displayed a growing conversational prowess in the last several years. Their increasingly deft dialogue stems from interactions among various strands of information, each of which comprehends nothing on its own, Dennett maintains. Put another way, proper programming transforms a bunch of unreflective parts into a thinking system, whether they reside in a mainframe or a human skull.
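To make Searle's moral concrete, here is a minimal, purely illustrative sketch in Python of the kind of rule-following the Chinese Room describes. It is not drawn from Searle's paper: the rule table and the fallback reply are invented placeholders standing in for the room's dictionaries and filing system. The point it demonstrates is only that the program matches and emits symbols without any representation of what they mean.

```python
# Toy "Chinese Room": answer questions by pure symbol lookup.
# The rulebook is a stand-in for Searle's dictionaries and filing
# system; every entry is invented for illustration.

RULEBOOK = {
    "你好吗?": "我很好，谢谢。",      # "How are you?" -> "Fine, thanks."
    "你会说中文吗?": "会，当然。",    # "Do you speak Chinese?" -> "Yes, of course."
}

def chinese_room(question: str) -> str:
    """Return an answer by matching symbols against rules.

    No meaning is consulted anywhere: the function compares one
    string of symbols against stored strings and hands back another.
    """
    return RULEBOOK.get(question, "对不起，我不明白。")  # fallback: "Sorry, I don't understand."

if __name__ == "__main__":
    for q in RULEBOOK:
        print(q, "->", chinese_room(q))
```

A real conversational program would be vastly more elaborate, which is precisely Dennett's objection: he argues that layers of interacting knowledge, none of which comprehends anything on its own, can add up to a system that does. The sketch illustrates Searle's side of the dispute, not its resolution.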