Abstract
In this paper, we develop a code length principle that is invariant to the choice of parameterization of the model distributions, that is, the code length remains the same under smooth transformations of the likelihood parameters. An invariant approximation formula for easy computation of the marginal distribution is provided for Gaussian likelihood models. We provide invariant estimators of the model parameters and formulate conditions under which these estimators are essentially a posteriori unbiased for Gaussian models. An upper bound on the coarseness of discretization of the model parameters is deduced. We introduce a discrimination measure between probability distributions, use it to construct probability distributions on model classes, and show how this may induce an additional code length term $\frac{k}{4}\log_2 k$ for a $k$-parameter model. The total code length is shown to be closely related to the normalized maximum likelihood (NML) code length of Rissanen when Jeffreys prior is chosen on the model parameters together with a uniform prior on the model classes. Our model selection principle is applied to a Gaussian estimation problem for data in a wavelet representation, and its performance is tested and compared with alternative wavelet-based estimation methods in numerical experiments.
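For reference, the NML code length mentioned above is, in Rissanen's standard formulation (the notation $x^n$, $\hat{\theta}(\cdot)$, and $I(\theta)$ is generic and not taken from this paper),
\[
L_{\mathrm{NML}}(x^n) = -\log_2 p\bigl(x^n \mid \hat{\theta}(x^n)\bigr) + \log_2 \int p\bigl(y^n \mid \hat{\theta}(y^n)\bigr)\, dy^n,
\]
with the well-known asymptotic expansion of the parametric complexity term,
\[
\log_2 \int p\bigl(y^n \mid \hat{\theta}(y^n)\bigr)\, dy^n = \frac{k}{2}\log_2 \frac{n}{2\pi} + \log_2 \int_{\Theta} \sqrt{\det I(\theta)}\, d\theta + o(1),
\]
where $I(\theta)$ is the Fisher information matrix of a single observation. The second integral is the normalizer of Jeffreys prior, which is what ties a mixture code under Jeffreys prior to the NML code length.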