Abstract

In this paper, an approach for identifying corresponding image features across different imaging modalities is presented. The method includes spatial alignment of sensor images at short and long distances as well as a probabilistic fusion approach for combining multiple unimodal image features into multimodal image features. An experimental statistical comparison of uni- and multimodal image features is performed using RGB, IR and thermal cameras. For this purpose, the sensors are mounted on an Ackermann-steering platform in a typical industrial environment. The multimodal features are examined with respect to their repeatability, quantity and spatial distribution.

