Abstract

This paper develops a mathematical and computational framework for analyzing the expected performance of Bayesian data fusion, or joint statistical inference, within a sensor network. We use variational techniques to obtain the posterior expectation as the optimal fusion rule under a deterministic constraint and a quadratic cost, and study the smoothness and other properties of its classification performance. For a certain class of fusion problems, we prove that this fusion rule is also optimal in a much wider sense and satisfies strong asymptotic convergence results. We show how these results apply to a variety of examples with Gaussian, exponential and other statistics, and discuss computational methods for determining the fusion system's performance in more general, large-scale problems. These results are motivated by studying the performance of fusing multi-modal radar and acoustic sensors for detecting explosive substances, but have broad applicability to other Bayesian decision problems.
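
To make the abstract's central claim concrete, here is a minimal illustrative sketch (not taken from the paper): under a quadratic cost, the Bayes-optimal fusion rule is the posterior expectation, and for a Gaussian prior with independent Gaussian sensor noise this reduces to a precision-weighted average of the prior mean and the measurements. The function name, parameters, and the two-sensor setup below are hypothetical choices for illustration only.

```python
import numpy as np

# Sketch: Bayesian fusion of noisy scalar measurements under quadratic cost.
# With a Gaussian prior and independent Gaussian sensor noise, the posterior
# expectation (the fusion rule discussed in the abstract) is the
# precision-weighted average of the prior mean and the measurements.

def fuse_posterior_mean(prior_mean, prior_var, measurements, noise_vars):
    """Return the posterior mean and variance of x given Gaussian measurements."""
    precision = 1.0 / prior_var + sum(1.0 / v for v in noise_vars)
    weighted = prior_mean / prior_var + sum(z / v for z, v in zip(measurements, noise_vars))
    post_var = 1.0 / precision
    return weighted * post_var, post_var

# Hypothetical example: a noisier radar-like sensor and a more precise
# acoustic-like sensor observing the same scalar state.
rng = np.random.default_rng(0)
x_true = 1.5
z_radar = x_true + rng.normal(scale=0.5)
z_acoustic = x_true + rng.normal(scale=0.2)

x_hat, var_hat = fuse_posterior_mean(
    prior_mean=0.0, prior_var=4.0,
    measurements=[z_radar, z_acoustic],
    noise_vars=[0.5**2, 0.2**2],
)
print(f"fused estimate: {x_hat:.3f} (posterior variance {var_hat:.3f})")
```

The fused estimate pulls toward the more precise sensor, reflecting the general behavior of the posterior-expectation rule; the paper's results concern its optimality, smoothness, and asymptotic behavior in far more general settings.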
