Abstract

It is widely held that children's linguistic input underdetermines the correct grammar, and that language learning must therefore be guided by innate linguistic constraints. Here, we show that a Bayesian model can learn a standard poverty-of-stimulus example, anaphoric "one", from realistic input by relying on indirect evidence, without a linguistic constraint assumed to be necessary. Our demonstration does, however, assume other linguistic knowledge; thus, we reduce the problem of learning anaphoric "one" to that of learning this other knowledge. We discuss whether this other knowledge may itself be acquired without linguistic constraints.
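
As a rough illustration of the kind of indirect-evidence reasoning at work, the sketch below applies a Bayesian "size principle" to choose between two hypotheses about the antecedent of anaphoric "one" (the full phrase "red ball" versus the bare noun "ball"). This is a minimal toy, not the paper's actual model: the hypothesis names, extension sizes, and observation counts are invented for illustration.

    # Illustrative sketch only (not the authors' model): Bayesian
    # size-principle learning from indirect evidence.

    # Two hypotheses about the antecedent of anaphoric "one":
    #   N'  : "one" = the full phrase ("red ball") -> small extension
    #   N0  : "one" = the bare noun ("ball")       -> larger extension
    # The extension sizes below are assumed values for illustration.
    hypotheses = {
        "N' (red ball)": 2,
        "N0 (any ball)": 10,
    }
    prior = {h: 0.5 for h in hypotheses}  # uniform prior

    def posterior(n_observations: int) -> dict:
        """Posterior after n observations in which the referent of 'one'
        happens to fall in the smaller (N') set -- indirect evidence,
        since no observation directly rules out the N0 hypothesis."""
        # Size principle: a consistent observation drawn uniformly from a
        # hypothesis's extension has likelihood 1/|extension|.
        scores = {h: prior[h] * (1.0 / size) ** n_observations
                  for h, size in hypotheses.items()}
        z = sum(scores.values())
        return {h: s / z for h, s in scores.items()}

    for n in (1, 5, 10):
        print(n, posterior(n))

As the number of observations consistent with the smaller hypothesis grows, the posterior concentrates on it, even though no single observation directly refutes the larger hypothesis; this accumulation of "suspicious coincidences" is the sense in which the evidence is indirect.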
