Abstract

MaxEnt grammar is a probabilistic version of Harmonic Grammar in which the harmony scores of candidates are mapped onto probabilities. It has become the tool of choice for analyzing phonological phenomena involving probabilistic variation or gradient acceptability, but there is a competing proposal for making Harmonic Grammar probabilistic, Noisy Harmonic Grammar, in which variation is derived by adding random ‘noise’ to constraint weights. In this paper these grammar frameworks, and variants of them, are analyzed by reformulating them all in a common format in which noise is added to candidate harmonies, so that the differences between frameworks lie in the distribution of this noise. This analysis reveals a basic difference between the models: in MaxEnt the relative probabilities of two candidates depend only on the difference in their harmony scores, whereas in Noisy Harmonic Grammar they also depend on the differences in the constraint violations incurred by the two candidates. This difference leads to testable predictions, which are evaluated against data on variable realization of schwa in French (Smith & Pater 2020). The results support MaxEnt over Noisy Harmonic Grammar.

Highlights

  • Stochastic phonological grammars assign probabilities to outputs, making it possible to analyze variation and gradient acceptability in phonology

  • There is a range of evidence that Maximum Entropy (MaxEnt) grammar is empirically superior to a probabilistic version of standard Optimality Theory, Stochastic OT (e.g. Zuraw & Hayes 2017; Hayes 2020; Smith & Pater 2020), but there is much less evidence concerning the relative merits of the different varieties of stochastic Harmonic Grammar, MaxEnt and Noisy Harmonic Grammar (NHG)

  • The difference between the models lies in the nature of the noise that is added to candidate harmonies: in MaxEnt the noise terms are drawn from identical Gumbel distributions, whereas in NHG they are drawn from normal distributions whose variance depends on the number of constraint violations
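The Gumbel characterization of MaxEnt in the last bullet can be checked by simulation. The sketch below (not from the paper; the harmony values are hypothetical) adds i.i.d. Gumbel noise to each candidate's harmony, picks the candidate with the highest noisy harmony, and compares the resulting choice frequencies with the MaxEnt (softmax) probabilities computed directly from the harmonies:

```python
import math
import random
from collections import Counter

def maxent_probs(harmonies):
    """MaxEnt: exponentiate harmony scores and normalize (softmax)."""
    exps = [math.exp(h) for h in harmonies]
    z = sum(exps)
    return [e / z for e in exps]

def gumbel_max_sample(harmonies):
    """Add i.i.d. standard Gumbel noise to each harmony; return the
    index of the candidate with the highest noisy harmony."""
    noisy = [h - math.log(-math.log(random.random())) for h in harmonies]
    return max(range(len(noisy)), key=lambda i: noisy[i])

# Hypothetical harmony scores for three candidates (negative penalties).
harmonies = [-1.0, -2.5, -3.0]
n = 100_000
counts = Counter(gumbel_max_sample(harmonies) for _ in range(n))
empirical = [counts[i] / n for i in range(len(harmonies))]

print([round(p, 3) for p in maxent_probs(harmonies)])
print([round(p, 3) for p in empirical])
# the two lists should agree to within sampling error
```

This is the standard "Gumbel-max" equivalence: identical Gumbel noise on candidate harmonies yields exactly the MaxEnt distribution, which is why MaxEnt fits naturally into the noise-on-harmonies format described above.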


Summary

Introduction

Stochastic phonological grammars assign probabilities to outputs, making it possible to analyze variation and gradient acceptability in phonology. MaxEnt grammar and NHG involve, at least superficially, very different approaches to making Harmonic Grammar stochastic: MaxEnt takes the harmony scores assigned by a Harmonic Grammar and maps them onto probabilities, while NHG derives variation by adding random ‘noise’ to constraint weights. Given this difference we would expect the frameworks to be empirically distinguishable, but while previous work has demonstrated distinct predictions of the two frameworks (Jesney 2007; Hayes 2017), these have not led to clear empirical tests. We review Harmonic Grammar and the two dominant proposals for making it stochastic, MaxEnt and NHG.
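The NHG mechanism described above can be illustrated with a minimal sketch (not from the paper; the weights and violation profiles are hypothetical). On each evaluation, every constraint weight is perturbed with independent normal noise, harmonies are computed from the noisy weights, and the candidate with the highest harmony wins. Because a candidate's noisy harmony sums one noise term per violation, the variance of its harmony grows with its violation counts, which is the property that distinguishes NHG from MaxEnt:

```python
import random

def nhg_sample(weights, violations, sigma=1.0):
    """One NHG evaluation: perturb each constraint weight with
    normal noise (shared across candidates within the evaluation),
    then return the index of the candidate with the highest
    (least negative) harmony."""
    noisy_w = [w + random.gauss(0.0, sigma) for w in weights]
    harmonies = [-sum(w * v for w, v in zip(noisy_w, viols))
                 for viols in violations]
    return max(range(len(violations)), key=lambda i: harmonies[i])

# Hypothetical grammar: two constraints, two candidates with equal
# harmony (-2.0 each) but different violation profiles.
weights = [2.0, 1.0]
violations = [[1, 0],   # candidate A: one violation of C1
              [0, 2]]   # candidate B: two violations of C2
n = 50_000
p_a = sum(nhg_sample(weights, violations) == 0 for _ in range(n)) / n
print(p_a)
```

Even though the two candidates here tie on harmony, B's noisy harmony has larger variance (its single noise term is doubled by the two violations), which is exactly the violation-count sensitivity that the paper's reformulation makes explicit.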

Harmonic Grammar
Noisy Harmonic Grammar
Maximum Entropy Grammar
NHG and MaxEnt as Random Utility Models
The relationship between harmony and probability in Stochastic HGs
Harmony and probability in MaxEnt
Harmony and probability in NHG
Variable realization of schwa in French
The effect on candidate probabilities of adding constraint violations
MaxEnt grammar
Noisy Harmonic Grammar with censored normal noise
Testing the predictions
Fitting the models to the data
Evaluating the fit of the grammars
Interim summary
Calculating candidate probabilities with more than two candidates
Findings
Conclusion

