Abstract

This paper presents a new solution for the statistical fusion of multi-sensor information acquired from different fields of view in a centralized sensor network. The focus is on applications that involve tracking an unknown number of objects with time-varying states. Our solution is a track-to-track fusion method in which the information contents of the local posteriors are combined. Existing information-theoretic solutions for track-to-track fusion in sensor networks are commonly devised by minimizing the average information divergence from the local posteriors to the fused one. A common approach is to use the Generalized Covariance Intersection rule for sensor fusion. This approach works best when all sensors detect the same object(s) and performs poorly when the fields of view differ. We instead propose measuring information divergence with the Cauchy–Schwarz divergence and demonstrate that this choice leads to fusion rules that are generally more tolerant to imperfect consensus. We show that the proposed fusion rule for multiple Poisson posteriors is the weighted arithmetic mean of the Poisson densities. Furthermore, we derive the fusion rule for the labeled multi-Bernoulli filter by approximating the labeled multi-Bernoulli density by its first-order moment. Numerical experiments show the superior performance of our solution compared with the Kullback–Leibler averaging method.
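For illustration only, here is a sketch of the two fusion rules in the Poisson case; the notation below is assumed for this sketch rather than taken from the paper. Let $D_s(x)$ denote the intensity (first-order moment) of the Poisson posterior at sensor $s$, $s = 1, \ldots, N$, and let $\omega_1, \ldots, \omega_N$ be nonnegative fusion weights with $\sum_{s=1}^{N} \omega_s = 1$. The Kullback–Leibler-average (Generalized Covariance Intersection) rule yields a fused Poisson posterior whose intensity is a weighted geometric mean of the local intensities,

$$D_{\mathrm{GCI}}(x) \;=\; \prod_{s=1}^{N} \big(D_s(x)\big)^{\omega_s},$$

whereas the rule stated in the abstract, obtained from the Cauchy–Schwarz divergence criterion, reduces to the weighted arithmetic mean of the local intensities,

$$D_{\mathrm{AM}}(x) \;=\; \sum_{s=1}^{N} \omega_s \, D_s(x).$$

Intuitively, the geometric mean vanishes wherever any single sensor assigns (near-)zero intensity, so objects outside one sensor's field of view are suppressed, while the arithmetic mean retains their mass, which is consistent with the improved tolerance to differing fields of view claimed above.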
