Abstract

Subjects were exposed to a small sample of sentences from an artificial linguistic system and tested on their knowledge of the linear and hierarchical structures of the language. Four reference world conditions were included in order to evaluate the effects of providing semantic encoding of the following types of information: (1) class membership of words, (2) an arbitrary grouping of successive words, (3) phrasal (i.e., constituent) grouping of successive words, and (4) linguistic dependencies among successive words. Results showed, first, that the presence of constituent structure information was the effective variable in facilitating the learning of complex aspects of syntax, particularly linguistic dependencies; providing an explicit semantic representation of these dependencies did not significantly improve syntax learning. Second, only those subjects receiving constituent structure information succeeded in inducing phrase structure grammars. These results suggest that it is only when the input contains a rich set of correlated cues to constituent organization that learners uniformly succeed in inducing coherent grammatical systems; in these circumstances, both linguistic dependencies and hierarchical structure are represented.