Abstract

We propose a new method for learning a general statistical inference engine, operating on discrete and mixed discrete/continuous feature spaces. Such a model allows inference on any of the discrete features, given values for the remaining features. Example applications include medical diagnosis with multiple possible diseases, fault diagnosis, information retrieval, and imputation in databases. Bayesian networks (BNs) are versatile tools that possess this inference capability. However, BNs require explicit specification of conditional independencies, which may be difficult to assess given limited data. Alternatively, Cheeseman proposed finding the maximum entropy (ME) joint probability mass function (pmf) consistent with arbitrary lower-order probability constraints. This approach is powerful in principle and does not require explicit expression of conditional independence. However, until now, its huge learning complexity has severely limited its use. Here we propose an approximate ME method that also encodes arbitrary low-order constraints while retaining tractable learning. Our method restricts the support of the joint pmf, during learning, to a subset of the feature space. Results on data sets from the University of California, Irvine (UCI) repository reveal performance gains over several BN approaches and over multilayer perceptrons.
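
The abstract leaves the learning procedure to the body of the paper, but the core idea it names, a maximum entropy fit to low-order constraints over a restricted support, can be sketched. Below is a minimal illustration using iterative proportional fitting (IPF), a standard algorithm for ME problems of this form; it is not the authors' learning procedure, and the support set, constrained feature pairs, and target marginal values are invented for the example.

```python
import numpy as np

def ipf_max_entropy(support, constraints, n_iters=200):
    """Approximate the maximum-entropy pmf over a restricted support,
    matching given low-order marginal constraints via IPF.

    support: list of feature tuples allowed to carry probability mass
             (the support restriction described in the abstract).
    constraints: list of (feature_indices, target_marginal) pairs, where
             target_marginal maps sub-tuples to target probabilities.
    """
    # Uniform start over the restricted support = maximum entropy prior.
    p = np.full(len(support), 1.0 / len(support))
    for _ in range(n_iters):
        for idxs, target in constraints:
            # Current marginal over the constrained subset of features.
            cur = {}
            for k, x in enumerate(support):
                key = tuple(x[i] for i in idxs)
                cur[key] = cur.get(key, 0.0) + p[k]
            # Multiplicative update so this marginal matches its target.
            for k, x in enumerate(support):
                key = tuple(x[i] for i in idxs)
                if cur.get(key, 0.0) > 0:
                    p[k] *= target.get(key, 0.0) / cur[key]
        p /= p.sum()
    return p

# Toy example (values are illustrative, not from the paper): three binary
# features, support restricted to 5 of the 8 possible vectors, and two
# pairwise-marginal constraints.
support = [(0, 0, 0), (0, 1, 0), (1, 0, 1), (1, 1, 0), (1, 1, 1)]
constraints = [
    ((0, 1), {(0, 0): 0.2, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.3}),
    ((1, 2), {(0, 0): 0.3, (0, 1): 0.1, (1, 0): 0.4, (1, 1): 0.2}),
]
p = ipf_max_entropy(support, constraints)
print(dict(zip(support, np.round(p, 3))))
```

Note that once the support is restricted, the stated constraints may no longer be exactly satisfiable; IPF then yields only an approximate match, which is consistent with the abstract's framing of the method as an approximate ME approach.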
