Abstract
The problem of sequentially transferring a data-predictive probability distribution from a source to a target Bayesian filter is addressed in this paper. In many practical settings, this transfer is incompletely modelled, since the stochastic dependence structure between the filters typically cannot be fully specified. We therefore adopt fully probabilistic design to select the optimal transfer mechanism. We relax the target observation model via a scale-mixing parameter, which proves vital in successfully transferring the first and second moments of the source data predictor. This sensitivity to the transferred second moment ensures that imprecise predictors are rejected, achieving robust transfer. Furthermore, Student-t state and observation models are adopted for both learning processes to handle outliers in all hidden and observed variables. A recursive outlier-robust Bayesian transfer learning algorithm is recovered via a local variational Bayes approximation. The outlier rejection and positive transfer properties of the resulting algorithm are clearly demonstrated in a simulated planar position-velocity system, as is the key property of imprecise-knowledge rejection (robust transfer), unavailable in current Bayesian transfer algorithms. Performance comparison with particle filter variants demonstrates the successful convergence of our robust variational Bayes transfer learning algorithm in sequential processing.
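The abstract relies on the standard representation of the Student-t distribution as a Gaussian scale mixture with a Gamma-distributed precision, which is what makes the scale-mixing relaxation and the variational treatment tractable. The sketch below (illustrative only, not the paper's algorithm; the function name and parameter choices are assumptions) demonstrates this mixture construction and verifies that the sampled variance matches the Student-t value, df/(df - 2):

```python
import numpy as np

def student_t_scale_mixture(df, size, rng):
    """Draw Student-t samples via the Gaussian scale-mixture construction.

    lambda ~ Gamma(df/2, rate=df/2), then x | lambda ~ N(0, 1/lambda)
    yields x ~ Student-t with `df` degrees of freedom. The latent
    precision lambda plays the role of a scale-mixing parameter:
    small lambda inflates the variance, which is how Student-t models
    down-weight outliers relative to a Gaussian.
    """
    # NumPy's gamma is parameterised by shape and scale (= 1/rate).
    lam = rng.gamma(shape=df / 2.0, scale=2.0 / df, size=size)
    return rng.normal(loc=0.0, scale=1.0 / np.sqrt(lam))

rng = np.random.default_rng(0)
df = 5.0
samples = student_t_scale_mixture(df=df, size=200_000, rng=rng)

# For df > 2 the Student-t variance is df / (df - 2); for df = 5 that is 5/3.
print(samples.var())
```

With df = 5 the empirical variance settles near 5/3 ≈ 1.667, noticeably larger than the unit variance of the underlying conditional Gaussians; the heavy tails come entirely from the Gamma mixing over the precision.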