Abstract

Three-dimensional (3D) mapping of environments has attracted tremendous attention across academia, industry, and the military, owing to ever-increasing needs for environmental modeling and monitoring. Although many highly effective techniques have been reported, and a few have even become commercial products, none has explored the use of wearable sensors that capture human foot motion, gait, and gait phase for 3D map construction, particularly in the Architecture, Engineering, and Construction (AEC) domain. In this work, we propose a wearable smart shoe, called the "Smart Shoe," that integrates multiple laser scanners and an inertial measurement unit (IMU) to build a 3D map of the environment in real time. The Smart Shoe is a potential tool for floor plan surveying, construction process monitoring, renovation planning, space usage planning, building maintenance management, and other tasks in the AEC domain. It could also assist visually impaired people in navigating and avoiding obstacles in unknown environments, and it may help firefighters quickly model and recognize objects in burning, dark, and smoky buildings where traditional camera-based approaches may not be applicable. We integrate the shoe with a novel foot localization algorithm that produces a smooth and accurate pose and trajectory of human walking, which is the key enabling technique for minimizing registration errors in the laser point cloud.
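The abstract's key claim is that an accurate per-step foot pose is what keeps laser point-cloud registration errors small. The sketch below illustrates that idea only in outline: it is not the authors' implementation, and the pose source (the IMU-based foot localization algorithm) is assumed to be available externally. All function names and the synthetic scan values are hypothetical.

```python
# Minimal sketch (not the paper's implementation): registering a planar laser
# scan into a global point cloud using an externally estimated foot/sensor pose.
# The pose is represented here simply as a rotation matrix R and translation t
# expressing the sensor frame in the world frame.
import numpy as np

def scan_to_points(ranges, angles):
    """Convert a planar laser scan (ranges and beam angles) to 3D points in the
    sensor frame, taking the scan plane as the sensor's x-y plane."""
    x = ranges * np.cos(angles)
    y = ranges * np.sin(angles)
    z = np.zeros_like(ranges)
    return np.column_stack((x, y, z))            # shape (N, 3)

def register_scan(points_sensor, R_world_sensor, t_world_sensor):
    """Transform sensor-frame points into the world frame using the estimated
    pose; the more accurate the pose, the smaller the registration error."""
    return points_sensor @ R_world_sensor.T + t_world_sensor

# Example usage with synthetic data (hypothetical values):
angles = np.linspace(-np.pi / 2, np.pi / 2, 181)   # 1-degree beam spacing
ranges = np.full_like(angles, 2.0)                 # flat surface 2 m away
pose_R = np.eye(3)                                 # identity orientation
pose_t = np.array([0.0, 0.0, 0.1])                 # sensor 10 cm above the floor

cloud = register_scan(scan_to_points(ranges, angles), pose_R, pose_t)
print(cloud.shape)                                 # (181, 3)
```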
