Abstract

Logistic regression is the closest model, given its sufficient statistics, to the model of constant success probability in terms of Kullback–Leibler information. A generalized binary model has this property for the more general ϕ-divergence. These results generalize to multinomial and other discrete data.
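In notation not taken from the abstract, one common way to formalize the first claim is as a minimum-discrimination-information (I-projection) property: among all fitted success probabilities that reproduce the sufficient statistics of the covariates, the logistic fit is closest in Kullback–Leibler information to a constant-probability model. A sketch of that formulation, with symbols (q, x_i, y_i, β) introduced here for illustration only:

\[
\hat p \;=\; \operatorname*{arg\,min}_{p \in \mathcal P}\ \sum_{i=1}^{n} D_{\mathrm{KL}}\!\bigl(\mathrm{Bern}(p_i)\,\big\|\,\mathrm{Bern}(q)\bigr),
\qquad
\mathcal P = \Bigl\{\, p \in (0,1)^n \;:\; \textstyle\sum_{i=1}^{n} x_i\, p_i = \sum_{i=1}^{n} x_i\, y_i \,\Bigr\},
\]

where q is the constant success probability, x_i the covariate vectors, and y_i ∈ {0,1} the binary responses; under this standard formulation the minimizer takes the logistic form p_i = 1/(1 + e^{-x_i^{\top}\beta}) for some coefficient vector β. The exact divergence direction and constraint set used in the paper may differ. Replacing the Kullback–Leibler terms by a general ϕ-divergence is one route to the generalized binary model mentioned in the abstract.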
