Abstract

The ability of two bottlenosed dolphins (Tursiops truncatus) to understand imperative sentences expressed in artificial languages was studied. One dolphin (Phoenix) was tutored in an acoustic language whose words were computer-generated sounds presented through an underwater speaker. The second dolphin (Akeakamai) was tutored in a visually-based language whose words were gestures of a trainer's arms and hands. The words represented agents, objects, object modifiers, and actions and were recombinable, according to a set of syntactic rules, into hundreds of uniquely meaningful sentences from two to five words in length. The sentences instructed the dolphins to carry out named actions relative to named objects and named modifiers; comprehension was measured by the accuracy of response to the instructions and was tested within a format that controlled for context cues, for other nonlinguistic cues, and for observer bias. Comprehension, at levels far above chance, was shown for all of the sentence forms and sentence meanings that could be generated by the lexicon and the set of syntactic rules, and included the understanding of: (a) lexically novel sentences; (b) structurally novel sentences; (c) semantically reversible sentences that expressed relationships between objects; (d) sentences in which changes in modifier position changed sentence meaning; and (e) conjoined sentences (Phoenix). Additional abilities demonstrated included a broad and immediate generalization of the lexical items to different exemplars of objects; an ability to modulate the form of response to given action words, in order to apply the action appropriately to new objects, to different object attributes, or to different object locations; an ability to carry out instructions correctly despite changes in the context or location in which a sentence was given, or in the trainer providing the instructions; an ability to distinguish between different relational concepts; an ability to respond correctly to sentences given with no objects present in the tank until 30 seconds after the instruction was given (displacement tests); and an ability to report correctly that the particular object designated in a sentence was in fact not present in the tank, although all other objects were (Akeakamai). These various abilities evidenced that the words of the languages had come to represent symbolically the objects and events referred to in the sentences. The successful processing of either a left-to-right grammar (Phoenix) or of an inverse grammar (Akeakamai) indicated that wholly arbitrary syntactic rules could be understood and that the dolphin could determine the function of words occurring early in a sentence on the basis of succeeding words, including, in at least one case, nonadjacent words. The comprehension approach used was a radical departure from the emphasis on language production in studies of the linguistic abilities of apes; the results obtained offer the first convincing evidence of the ability of animals to process both semantic and syntactic features of sentences. The ability of the dolphins to utilize both their visual and acoustic modalities in these tasks underscored the amodal nature of the sentence understanding skill. Some comparisons were given of the dolphins' performances with those of language-trained apes and of young children on related or relevant language tasks.
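
The inverse-grammar point is easiest to see in miniature. The following is a minimal Python sketch, using an invented lexicon and an illustrative role-assignment rule rather than the study's actual word set or syntax, of why a listener processing such a grammar cannot assign roles to early words until later, possibly nonadjacent, words arrive.

    # Minimal sketch -- the word tokens and the role-assignment rule below are
    # invented for illustration; they are not the lexicon or grammar used in
    # the study.
    ACTIONS = {"FETCH", "TOUCH"}                 # hypothetical action words
    MODIFIERS = {"LEFT", "RIGHT", "SURFACE"}     # hypothetical modifiers
    OBJECTS = {"BALL", "HOOP", "PIPE"}           # hypothetical object words

    def parse_inverse(sentence):
        """Inverse grammar: the action arrives last, so the roles of the
        earlier object words (destination vs. object acted on) can only be
        fixed after the later, possibly nonadjacent, words are known."""
        tokens = sentence.split()
        action = tokens[-1]
        assert action in ACTIONS, "sentence must end with an action word"
        phrases, i = [], len(tokens) - 2
        while i >= 0:                            # walk backwards over object phrases
            obj = tokens[i]
            assert obj in OBJECTS
            mods = []
            while i - 1 >= 0 and tokens[i - 1] in MODIFIERS:
                mods.append(tokens[i - 1])       # a modifier precedes the object it modifies
                i -= 1
            phrases.append((tuple(reversed(mods)), obj))
            i -= 1
        phrases.reverse()
        if len(phrases) == 2:                    # relational sentence: destination phrase first
            return {"action": action, "destination": phrases[0], "object": phrases[1]}
        return {"action": action, "object": phrases[0]}

    # Illustrative gloss: "take the ball to the hoop on the right"
    print(parse_inverse("RIGHT HOOP BALL FETCH"))
    # Illustrative gloss: "touch the ball"
    print(parse_inverse("BALL TOUCH"))

A strictly left-to-right grammar, by contrast, would allow each word's role to be fixed as soon as it occurred; that contrast is what makes successful processing of the inverse ordering (Akeakamai) informative about syntactic processing rather than word-by-word responding.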
