Abstract

We first review a logical-statistical framework called statistical abduction and identify its three computational tasks, one of which is learning parameters from observations by ML (maximum likelihood) estimation. Traditionally, in the presence of missing values, the EM algorithm has been used for ML estimation. We report that the graphical EM algorithm, a new EM algorithm developed for statistical abduction, achieves the same time complexity as EM algorithms specialized for particular models in individual disciplines, such as the Inside-Outside algorithm for PCFGs (probabilistic context-free grammars). Furthermore, learning experiments on two corpora show that it can outperform the Inside-Outside algorithm by orders of magnitude. We then look specifically into a family of PCFG extensions that incorporate context sensitivity. Experiments show that they are learnable by the graphical EM algorithm in at most twice the time required for plain PCFGs, even though these extensions have higher time complexity.
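Both the Inside-Outside algorithm and the graphical EM algorithm share the same EM iteration for PCFG parameters: an E-step computes expected rule-usage counts from the observed sentences, and an M-step renormalizes those counts per left-hand-side nonterminal. The following is a minimal sketch of such an M-step in Python; the data layout, function name, and toy counts are illustrative assumptions, not the paper's implementation.

```python
from collections import defaultdict

def m_step(expected_counts):
    """M-step of EM for a PCFG: renormalize expected rule counts.

    expected_counts: dict mapping (lhs, rhs) -> expected usage count,
    as produced by an E-step such as Inside-Outside or graphical EM.
    Returns a dict mapping each rule to its updated probability, so
    that probabilities of rules sharing an lhs sum to 1.
    """
    totals = defaultdict(float)
    for (lhs, _rhs), count in expected_counts.items():
        totals[lhs] += count
    return {rule: count / totals[rule[0]]
            for rule, count in expected_counts.items()}

# Toy expected counts for rules S -> S S and S -> 'a' after one E-step.
counts = {("S", ("S", "S")): 1.5, ("S", ("a",)): 3.0}
print(m_step(counts))
# {('S', ('S', 'S')): 0.333..., ('S', ('a',)): 0.666...}
```

The efficiency difference the abstract reports lies entirely in the E-step: the graphical EM algorithm computes the expected counts on a shared (tabulated) explanation graph rather than re-deriving them per sentence structure, which is what allows it to match, and in practice beat, the Inside-Outside algorithm's running time.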

