Abstract

In this article, a new systematic approach to sensor fusion and state estimation is proposed for extended target tracking in human–robot coexisting environments. The developed method, called human feature-based extended target tracking via multisensor information fusion (HFBETT-MSIF), assimilates information from the onboard camera and sonar sensor of a mobile robot in a unified way while tracking a pair of human shoes. A novel generalized measurement model containing the complete information of the human target is formulated for both sensors, rendering the tracking system potentially robust to the failure of either sensor. The study illustrates how heteroscedastic Gaussian process (HGP) regression can be used to derive the measurement model. It also develops an advanced HGP model, called bias-minimized most likely HGP, to interpret real-world shoe-contour data subject to heteroscedastic noise. Performance evaluations conducted on real-life shoe tracking demonstrate the superiority of HFBETT-MSIF.
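For orientation, the sketch below shows the standard "most likely" heteroscedastic GP scheme (Kersting et al., 2007) that the paper's bias-minimized variant builds on; it is not the authors' algorithm, and the function name and parameters are illustrative. A mean GP is fit with fixed per-point noise levels, empirical noise variances are estimated from posterior samples, a second GP is fit to their logarithms, and the two steps alternate.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def most_likely_hgp(X, y, n_iter=5, n_samples=100, seed=0):
    """Minimal most-likely heteroscedastic GP (not the bias-minimized variant)."""
    rng = np.random.default_rng(seed)
    noise = np.full(len(y), np.var(y))  # initial homoscedastic guess
    for _ in range(n_iter):
        # Mean GP, with the current per-point noise variances held fixed.
        gp_mean = GaussianProcessRegressor(kernel=RBF(), alpha=noise)
        gp_mean.fit(X, y)
        mu, std = gp_mean.predict(X, return_std=True)
        # Empirical noise variance at each input, from posterior samples.
        samples = rng.normal(mu[:, None], std[:, None], (len(y), n_samples))
        z = 0.5 * np.mean((y[:, None] - samples) ** 2, axis=1)
        # Fitting a GP to log-variances keeps predicted noise positive.
        gp_noise = GaussianProcessRegressor(kernel=RBF() + WhiteKernel())
        gp_noise.fit(X, np.log(z))
        noise = np.exp(gp_noise.predict(X))
    return gp_mean, gp_noise

Here X is a 2-D array of inputs (e.g., angles along a shoe contour) and y the noisy observations; the returned pair predicts the mean curve and the input-dependent noise level, which is what makes the model heteroscedastic.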
