Abstract

It has been suggested that external and/or internal limitations paradoxically may lead to superior learning, that is, the concepts of starting small and less is more (Elman, 1993; Newport, 1990). In this paper, we explore the type of incremental ordering during training that might help learning, and what mechanism explains this facilitation. We report four artificial grammar learning experiments with human participants. In Experiments 1a and 1b we found a beneficial effect of starting small using two types of simple recursive grammars: right‐branching and center‐embedding, with recursive embedded clauses in fixed positions and fixed length. This effect was replicated in Experiment 2 (N = 100). In Experiments 3 and 4, we used a more complex center‐embedded grammar with recursive loops in variable positions, producing strings of variable length. When participants were presented with an incremental ordering of training stimuli, as in natural language, they were better able to generalize their knowledge of simple units to more complex units when the training input “grew” according to structural complexity, compared to when it “grew” according to string length. Overall, the results suggest that starting small confers an advantage for learning complex center‐embedded structures when the input is organized according to structural complexity.

Highlights

  • One would think that learners should acquire information better when they are unhindered by internal or external limitations, such as those relating to constraints on memory or input

  • The results of Experiment 1a show that only when the input was presented in a staged fashion, with 0-level of embedding (LoE) strings presented first, did participants show above-chance learning of the artificial grammar

  • Helmert contrasts indicated that the average performance in the starting-small conditions was higher than in the random condition (p < .01), and that the Starting Small-Structure condition resulted in a higher performance than the Starting Small-Length condition (p < .01)


Introduction

One would think that learners should acquire information better when they are unhindered by internal or external limitations, such as those relating to constraints on memory or input. Some proposals, however, take the somewhat paradoxical stance that cognitive limitations and/or reduced input may confer a computational advantage for learning. These theories, the notion that less is more (Newport, 1990) and the importance of starting small (Elman, 1990, 1993), are often couched in terms of language acquisition. Our findings suggest that the facilitative effect of starting small occurs both for simple recursive structures and for more complex ones, and that it is greatest when the input “grows” incrementally according to structural complexity. These findings point to a fundamental moderator of learning that has consequences for language acquisition, development, and inductive learning more generally.
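To make the two grammar types and the staged-input manipulation concrete, the following is a minimal sketch of string generation for a right‐branching and a center‐embedded artificial grammar. The syllable inventories and generator functions here are hypothetical illustrations (the paper's actual stimulus materials are not reproduced above); what the sketch preserves is the structural contrast: in a center‐embedded string each "A" element is closed by its paired "B" element in reverse, nested order, whereas in a right‐branching string each pair is closed immediately.

```python
import random

# Hypothetical word categories; each A-word is paired by index with a B-word.
A = ["ba", "gi", "tu"]
B = ["lo", "ke", "mi"]

def center_embedded(loe):
    """One string with `loe` levels of embedding (LoE):
    A_i ... A_k B_k ... B_i — dependencies are nested, so the
    first A-word is closed by the last B-word."""
    idx = [random.randrange(len(A)) for _ in range(loe + 1)]
    return [A[i] for i in idx] + [B[i] for i in reversed(idx)]

def right_branching(loe):
    """One string with `loe` levels of embedding:
    A_i B_i A_j B_j ... — each dependency is closed immediately."""
    idx = [random.randrange(len(A)) for _ in range(loe + 1)]
    out = []
    for i in idx:
        out += [A[i], B[i]]
    return out

# Starting small by structure: all 0-LoE strings first, then 1-LoE, then 2-LoE,
# so the input "grows" according to structural complexity.
training = [center_embedded(loe) for loe in (0, 0, 1, 1, 2, 2)]
```

A length-based ordering, by contrast, would sort the same training strings only by the number of words per string, which for the variable-length grammar of Experiments 3 and 4 does not coincide with embedding depth.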

Starting-small evidence
Recursive artificial grammars
Experiment 1a
Method
Results and discussion
Experiment 1b
Experiment 3
Experiment 4
General discussion
