Abstract

This paper addresses the problems of real-time localization and 3D depth estimation across disparate sensing systems. The sensing systems include wireless microelectromechanical systems (MEMS) sensor networks, such as MICA sensors by Crossbow Inc., radio frequency identification (RFID) tags, and cameras that capture a variety of spectra. Some of the sensing is made adaptive in time and space by using a remotely controlled robot for sensor deployment. The motivation for integrating and analyzing multiple sensing systems and spectral modalities comes from the fact that, in many applications, a single sensing system or modality does not lead to robust and accurate performance. In this work we design systems for localization using RFID tags and for real-time 3D depth estimation from stereo vision, in order to accommodate the power constraints imposed on the deployment of battery-operated wireless MICA sensors. The resulting methods are applied to the development of (a) hazard-aware spaces (HAS) to alert people in the event of fire, and (b) tele-immersive spaces (TEEVE) to enable remote collaboration, training, and art performances. The novelty of our work lies in the power-efficient deployment of wireless sensors for location-aware applications by combining multiple sensors with advanced signal and image processing algorithms.
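For context, depth from stereo vision is commonly recovered by triangulating matched pixels between the two cameras. The sketch below shows only the standard disparity-to-depth relation, Z = f·B/d, with illustrative camera parameters; it is not the specific pipeline described in the paper, and the focal length, baseline, and disparity values are assumptions chosen for the example.

```python
# Minimal sketch of standard stereo triangulation (illustrative, not the
# paper's specific method): depth Z = f * B / d, where f is the focal length
# in pixels, B is the camera baseline in meters, and d is the disparity in
# pixels. All numeric values below are assumed for demonstration only.

def depth_from_disparity(disparity_px: float,
                         focal_px: float = 800.0,    # assumed focal length (pixels)
                         baseline_m: float = 0.12) -> float:  # assumed baseline (meters)
    """Return depth in meters for a single matched pixel's disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

if __name__ == "__main__":
    for d in (8.0, 16.0, 32.0):  # example disparities in pixels
        print(f"disparity {d:5.1f} px -> depth {depth_from_disparity(d):.2f} m")
```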
