Abstract
Hidden Markov models (HMMs) are popular models for identifying a finite number of latent states from sequential data. However, fitting them to large datasets can be computationally demanding because most likelihood maximization techniques require iterating through the entire underlying dataset for every parameter update. We propose a novel optimization algorithm that updates the parameters of an HMM without iterating through the entire dataset. Namely, we combine a partial E step with variance-reduced stochastic optimization within the M step. We prove that the algorithm converges under certain regularity conditions. We test our algorithm empirically using a simulation study as well as a case study of kinematic data collected with suction-cup-attached biologgers from eight northern resident killer whales (Orcinus orca) off the western coast of Canada. In both studies, our algorithm converges in fewer epochs, with less computation time, and to regions of higher likelihood compared to standard numerical optimization techniques. Our algorithm allows practitioners to fit complicated HMMs to large time-series datasets more efficiently than existing baselines. Supplemental materials are available online.
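To make the idea summarized above concrete, the Python sketch below illustrates one plausible reading of it: a partial E step (forward-backward on a random subsequence) paired with a variance-reduced, SVRG-style gradient step in the M step. This is not the authors' implementation; the two-state Gaussian HMM, the fixed transition matrix, the unit emission variances, and the choice to update only the state means are simplifying assumptions made here for illustration.

```python
# Illustrative sketch (not the paper's algorithm): partial E step on a window,
# SVRG-style variance-reduced gradient update of the state means in the M step.
import numpy as np

rng = np.random.default_rng(0)

# --- simulate a long observation sequence from a 2-state Gaussian HMM ---
T, K = 20_000, 2
A = np.array([[0.95, 0.05], [0.10, 0.90]])   # transition matrix (held fixed here)
true_mu = np.array([-2.0, 2.0])              # true state means
states = np.zeros(T, dtype=int)
for t in range(1, T):
    states[t] = rng.choice(K, p=A[states[t - 1]])
y = rng.normal(true_mu[states], 1.0)

def posteriors(y_win, mu):
    """Partial E step: scaled forward-backward posteriors on a window (unit variance,
    uniform initial distribution, boundary effects of the window ignored)."""
    n = len(y_win)
    lik = np.exp(-0.5 * (y_win[:, None] - mu[None, :]) ** 2)   # emission densities
    alpha = np.zeros((n, K))
    beta = np.ones((n, K))
    alpha[0] = lik[0] / K
    alpha[0] /= alpha[0].sum()
    for t in range(1, n):
        alpha[t] = lik[t] * (alpha[t - 1] @ A)
        alpha[t] /= alpha[t].sum()
    for t in range(n - 2, -1, -1):
        beta[t] = A @ (lik[t + 1] * beta[t + 1])
        beta[t] /= beta[t].sum()
    gamma = alpha * beta
    return gamma / gamma.sum(axis=1, keepdims=True)

def grad(y_win, gamma, mu):
    """Per-observation gradient of the expected negative complete-data
    log-likelihood with respect to the state means."""
    return -(gamma * (y_win[:, None] - mu[None, :])).mean(axis=0)

# --- M step with SVRG-style variance reduction ---
mu = np.array([-0.5, 0.5])        # crude initial guess for the state means
step, window = 0.5, 500
for epoch in range(5):
    gamma_full = posteriors(y, mu)            # full E step at the snapshot
    g_full = grad(y, gamma_full, mu)          # full-data gradient snapshot
    mu_snap = mu.copy()
    for _ in range(10):
        s = rng.integers(0, T - window)
        y_win = y[s:s + window]
        gamma_win = posteriors(y_win, mu)     # partial E step on the window only
        # variance-reduced gradient: minibatch gradient at the current parameters,
        # corrected by the minibatch gradient at the snapshot plus the full gradient
        g = (grad(y_win, gamma_win, mu)
             - grad(y_win, gamma_full[s:s + window], mu_snap)
             + g_full)
        mu -= step * g
print("estimated state means:", np.sort(mu), "true:", np.sort(true_mu))
```

In this toy setting the control variate (the snapshot gradient on the same window) keeps the minibatch updates close to the full-data direction, which is the mechanism the abstract credits for converging in fewer epochs than full-batch optimization.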