Abstract

This chapter discusses relative entropy and artificial intelligence (AI). Various properties of relative entropy have led to its widespread use in information theory, and these properties suggest that relative entropy has a role to play in systems that perform inference in terms of probability distributions. The chapter reviews basic properties of relative entropy and its role in probabilistic inference, and then considers a few existing and potential applications of relative entropy to so-called artificial intelligence. A more fundamental approach is to interpret relative-entropy minimization as a general method of logical inference, to state the properties required of any consistent method of inference, and then to study their consequences. Although AI emphasizes symbolic information processing, numerical information processing will always have an important role, particularly where uncertain information is involved. In this context, there seem to be two areas where relative entropy should have a role to play. The first arises from relative entropy's properties as an information measure: it should be useful for quantifying information gains and losses within probabilistic inference procedures. The second arises from the properties of minimum relative entropy (MRE) as a uniquely consistent inference procedure, which suggests that MRE can be used directly for inference in AI applications.
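Both roles can be made concrete with a small numerical sketch. Relative entropy between distributions q and p is D(q‖p) = Σᵢ qᵢ log(qᵢ/pᵢ), and the MRE principle selects the distribution q that minimizes D(q‖p) subject to the available constraints; under a single moment constraint the minimizer takes the exponential form qᵢ ∝ pᵢ exp(λ fᵢ). The Python snippet below is a minimal illustration of these standard facts, not code from the chapter: the function names, the bisection solver, and the die example (a uniform prior updated to have mean 4.5, in the style of Jaynes' classic die problem) are illustrative choices.

```python
import numpy as np

def relative_entropy(q, p):
    """D(q || p) = sum_i q_i * log(q_i / p_i); assumes q is absolutely
    continuous with respect to p (q_i = 0 wherever p_i = 0)."""
    q, p = np.asarray(q, float), np.asarray(p, float)
    mask = q > 0
    return float(np.sum(q[mask] * np.log(q[mask] / p[mask])))

def mre_update(p, f, target, lo=-50.0, hi=50.0, tol=1e-12):
    """MRE posterior under the moment constraint E_q[f] = target.
    The minimizer has the exponential form q_i ∝ p_i * exp(lam * f_i);
    lam is found by bisection, since E_q[f] is monotone in lam."""
    p, f = np.asarray(p, float), np.asarray(f, float)

    def constrained_mean(lam):
        w = p * np.exp(lam * f)   # exponential tilting of the prior
        w /= w.sum()              # renormalize to a distribution
        return w @ f, w

    q = p
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        m, q = constrained_mean(mid)
        if abs(m - target) < tol:
            break
        if m < target:
            lo = mid
        else:
            hi = mid
    return q

# Illustration: uniform die prior, constrain the mean face value to 4.5.
p = np.full(6, 1 / 6)
faces = np.arange(1, 7)
q = mre_update(p, faces, 4.5)
print("posterior:", np.round(q, 4))
print("information gain D(q||p):", relative_entropy(q, p))
```

The returned D(q‖p) quantifies the information gained by imposing the constraint (the first role above), while mre_update itself is a direct use of MRE as an inference procedure (the second role).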
