Abstract

We derive an asymptotic expansion for the log-likelihood of Gaussian mixture models (GMMs) with equal covariance matrices in the low signal-to-noise regime. The expansion reveals an intimate connection between two types of algorithms for parameter estimation: the method of moments and likelihood-optimizing algorithms such as Expectation-Maximization (EM). We show that likelihood optimization in the low SNR regime reduces to a sequence of least squares optimization problems that match the moments of the estimate to the ground truth moments one by one. This connection is a stepping stone towards the analysis of EM and maximum likelihood estimation in a wide range of models. A motivating application for the study of low SNR mixture models is cryo-electron microscopy data, which can be modeled as a GMM with algebraic constraints imposed on the mixture centers. We discuss the application of our expansion to algebraically constrained GMMs, among other example models of interest.

© 2022 The Authors. Communications on Pure and Applied Mathematics published by Wiley Periodicals LLC.
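To make the moment-matching picture concrete, below is a minimal, hypothetical sketch of the sequential least squares scheme the abstract describes: mixture centers are fitted by matching the moment tensors of the estimate to target (ground truth) moments one order at a time. This is not the authors' algorithm; the warm-starting scheme, the assumption of known mixing weights, and all function names are illustrative assumptions based only on the abstract's description.

```python
import numpy as np
from scipy.optimize import least_squares

def moment_tensor(centers, weights, k):
    """k-th moment tensor of the mixing distribution sum_i w_i * delta_{mu_i}."""
    d = centers.shape[1]
    M = np.zeros((d,) * k)
    for w, mu in zip(weights, centers):
        T = mu
        for _ in range(k - 1):
            T = np.multiply.outer(T, mu)  # build the k-fold outer product mu^{otimes k}
        M += w * T
    return M

def fit_by_moment_matching(target_moments, n_components, d, weights, seed=0):
    """Fit mixture centers by matching moments M_1, M_2, ... to the targets in turn.

    Each stage solves a least squares problem || M_k(theta) - M_k^* ||^2,
    warm-started from the previous stage's solution. (Warm-starting is a
    crude stand-in for optimizing within the solution set of the earlier
    moment-matching problems.)
    """
    rng = np.random.default_rng(seed)
    centers = rng.standard_normal((n_components, d))
    for k, M_star in enumerate(target_moments, start=1):
        def residual(flat):
            C = flat.reshape(n_components, d)
            return (moment_tensor(C, weights, k) - M_star).ravel()
        centers = least_squares(residual, centers.ravel()).x.reshape(n_components, d)
    return centers

# Example: recover 2 centers in R^2 from the first three moment tensors.
true_centers = np.array([[1.0, 0.0], [-1.0, 2.0]])
w = np.array([0.5, 0.5])
targets = [moment_tensor(true_centers, w, k) for k in (1, 2, 3)]
est = fit_by_moment_matching(targets, n_components=2, d=2, weights=w)
```

In the low SNR expansion, lower-order moments dominate the likelihood, which is why matching them one by one, in increasing order, mirrors likelihood optimization; the sketch above only imitates that ordering and omits the constants and constraint structure of the actual expansion.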
