Abstract

This paper exploits knowledge made available by an external source, in the form of a predictive distribution, in order to elicit a parameter prior. It adopts the terminology of Bayesian transfer learning, one of many domains concerned with reasoning as coherent knowledge processing. An empirical solution to the addressed problem was provided in [19], based on interpreting the external predictor as an empirical distribution constructed from fictitious data. This paper makes two main contributions. First, the problem is solved via formal hierarchical Bayesian modeling [25], and the knowledge transfer is achieved optimally, i.e. in the sense of minimum Kullback-Leibler divergence (KLD). Second, this hierarchical setting yields a distribution on the set of possible priors, with the choice of [19] acting as the base distribution. This allows randomized choices of the prior to be generated, avoiding costly and/or intractable estimation of that prior. It also provides measures of uncertainty in the prior choice, so that subsequent learning tasks can be assessed for robustness to this choice. The instantiation of the method in already published applications (knowledge elicitation, recursive learning, and flat cooperation of adaptive controllers) is recalled, and prospective application domains are also mentioned.
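The randomized-prior idea can be illustrated with a minimal sketch. The code below is not the paper's method; it merely assumes a discrete parameter grid, takes a hypothetical base prior (standing in for the prior elicited from the external predictor), draws randomized priors from a finite-dimensional Dirichlet centred on that base, and uses the spread of KLDs to the base as a rough measure of uncertainty in the prior choice. The names `base_prior`, `concentration`, and the grid itself are all illustrative assumptions.

```python
import math
import random

random.seed(0)

# Hypothetical base prior on a 3-point parameter grid, standing in for the
# prior elicited from the external predictive distribution (illustrative).
base_prior = [0.5, 0.3, 0.2]
concentration = 50.0  # assumed confidence placed in the base prior

def sample_prior(base, conc):
    """Draw one randomized prior from a Dirichlet centred on `base`,
    using the standard gamma-normalization construction."""
    draws = [random.gammavariate(conc * p, 1.0) for p in base]
    total = sum(draws)
    return [d / total for d in draws]

def kld(p, q):
    """Kullback-Leibler divergence D(p || q) in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Randomized prior choices sidestep estimating a single "best" prior;
# the distribution of KLDs to the base quantifies prior uncertainty.
samples = [sample_prior(base_prior, concentration) for _ in range(1000)]
divergences = [kld(s, base_prior) for s in samples]
mean_kld = sum(divergences) / len(divergences)
```

A downstream learning task could then be re-run with several of these sampled priors to check how sensitive its conclusions are to the prior choice.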

