Abstract

This chapter reviewed the connection of RELR to maximum entropy and maximum likelihood estimation methods. In particular, RELR was reviewed with respect to Jaynes' principle, which has its origins in statistical mechanics. Under Jaynes' principle, the goal of estimation in complex systems is to generate the most likely inference given certain constraints on the available measurements. RELR extends this principle to the most likely inference given the most likely constraints on measurements. The chapter reviewed how RELR's error modeling is a form of errors-in-variables regression, which assumes that error exists in both the independent and dependent variables. RELR's error modeling makes three critical assumptions in addition to the assumption of independent observations: that the logit error follows an extreme value distribution, that positive and negative errors are equally probable across independent variable features, and that positive and negative error probabilities are unbiased across even and odd polynomial features. These assumptions were argued to be reasonable, and they allow a direct estimate of the logit error that standard logistic regression ignores.
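The extreme value logit error assumption mentioned above rests on a standard result, not specific to RELR: the difference of two independent Gumbel (extreme value) random variables follows a logistic distribution, which is why logit models arise naturally from extreme value error terms. A minimal simulation sketch of that result (illustrative only, not the chapter's estimation method):

```python
import math
import random

random.seed(0)

def gumbel():
    """Sample a standard Gumbel variate via inverse transform."""
    u = random.random()
    return -math.log(-math.log(u))

# Difference of two independent Gumbel errors.
n = 100_000
diff = [gumbel() - gumbel() for _ in range(n)]

# Compare the empirical CDF of the difference to the standard
# logistic CDF, 1 / (1 + exp(-x)), at a few test points.
max_err = 0.0
for x in (-3, -2, -1, 0, 1, 2, 3):
    empirical = sum(d <= x for d in diff) / n
    logistic = 1.0 / (1.0 + math.exp(-x))
    max_err = max(max_err, abs(empirical - logistic))

print(f"max CDF discrepancy: {max_err:.4f}")
```

With 100,000 samples, the empirical CDF of the Gumbel difference agrees with the logistic CDF to within sampling error, consistent with the logit error form assumed in the chapter.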
