Abstract

The concept and the mathematical properties of entropy play an important role in statistics, cybernetics, and the information sciences; indeed, many algorithms and statistical data-processing tools, with a wide range of targets and scopes, have been designed around entropy. This paper describes two entropy-inspired estimators that cope robustly with multicollinearity, in one case, and with outliers, in the other. The Generalized Maximum Entropy (GME) estimator optimizes Shannon's entropy function subject to consistency and normalization constraints. In regression applications, GME makes it possible, for example, to estimate model coefficients in the presence of multicollinearity. The Least Entropy-Like (LEL) estimator is a novel prediction-error model coefficient identification algorithm that minimizes a nonlinear cost function of the fitting residuals. Because the minimized cost function shares the mathematical properties of entropy, it yields an estimate of the model coefficients corresponding to a positively skewed distribution of the residuals. The resulting estimator is more robust to outliers than standard approaches such as ordinary least squares (OLS). Both the GME and LEL estimation methods are applied to a common case study to illustrate their respective properties.
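To make the LEL idea concrete, the following is a minimal numerical sketch, not the paper's implementation: an entropy-like cost of the normalized squared residuals is minimized by a simple grid search for a one-parameter toy model containing a single gross outlier, and the result is compared with the OLS slope. The cost function form, the toy data, and the grid-search optimizer are all illustrative assumptions; the paper's exact formulation may differ.

```python
import numpy as np

def lel_cost(residuals, eps=1e-12):
    """Entropy-like cost of the normalized squared residuals (illustrative form).

    Low values mean the residual "energy" concentrates on few samples
    (the outliers), leaving the remaining residuals near zero.
    """
    r2 = residuals ** 2
    total = r2.sum()
    if total < eps:
        return 0.0
    q = r2 / total          # normalized squared residuals: q_i >= 0, sum to 1
    q = q[q > eps]          # drop zero terms (0 * log 0 := 0)
    return float(-(q * np.log(q)).sum())

# Toy data: y = 2x plus small noise, with one gross outlier injected.
rng = np.random.default_rng(0)
x = np.linspace(1.0, 10.0, 20)
y = 2.0 * x + 0.01 * rng.standard_normal(20)
y[5] += 30.0  # hypothetical outlier

# OLS slope (closed form for a no-intercept model) is pulled off by the outlier.
ols_slope = float((x @ y) / (x @ x))

# LEL slope via 1-D grid search: the entropy-like cost is smallest where
# all residual energy piles onto the single outlying sample.
grid = np.linspace(1.0, 3.0, 2001)
costs = [lel_cost(y - a * x) for a in grid]
lel_slope = float(grid[int(np.argmin(costs))])
```

In this sketch the LEL estimate stays close to the true slope of 2 while the OLS estimate is biased by the outlier, illustrating the robustness property claimed in the abstract.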
