Abstract

The pose of an articulated machine includes the position and orientation of not only the machine base (e.g., tracks or wheels), but also its major articulated components (e.g., stick and bucket). Automatically estimating this pose is a crucial component of technical innovations aimed at improving both safety and productivity in many construction tasks. A computer-vision-based automatic observation and analysis platform using a network of cameras and markers is designed to provide this capability for articulated machines. To model such a complex system, a theoretical framework termed the camera marker network is proposed. A graph abstraction of such a network is developed to both systematically manage observations and constraints and efficiently find the optimal solution. An uncertainty analysis that avoids time-consuming simulation enables network configurations to be optimized to reduce estimation uncertainty, yielding several empirical rules for better camera calibration and pose estimation. Through extensive uncertainty analyses and field experiments, this approach is shown to achieve centimeter-level bucket depth tracking accuracy from as far as 15 m away with only two ordinary cameras (1.1 megapixels each) and a few markers, providing a flexible and cost-efficient alternative to commercial products that use infrastructure-dependent sensors such as GPS. A working prototype has been tested on several active construction sites, confirming the method's effectiveness.
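
To illustrate the graph abstraction described above, the following is a minimal sketch, not the paper's actual implementation: it assumes a simplified 2D setting in which nodes are camera and marker poses (x, y, theta), edges are observed or constrained relative poses, and the optimal solution is found by nonlinear least squares. All names, numbers, and the choice of SciPy's `least_squares` solver are illustrative assumptions.

```python
# Illustrative sketch of a camera-marker network as a pose graph (2D simplification).
# Hypothetical example data; not the authors' implementation.
import numpy as np
from scipy.optimize import least_squares

def relative_pose(a, b):
    """Pose of b expressed in the frame of a, for poses (x, y, theta)."""
    dx, dy = b[0] - a[0], b[1] - a[1]
    c, s = np.cos(a[2]), np.sin(a[2])
    # Rotate the translation into frame a; angle wrapping omitted for brevity.
    return np.array([c * dx + s * dy, -s * dx + c * dy, b[2] - a[2]])

# Graph: node 0 = camera (held fixed), nodes 1-2 = markers on articulated parts.
# Each edge stores (from_node, to_node, measured relative pose, weight).
edges = [
    (0, 1, np.array([2.0,  0.1, 0.05]), 1.0),  # camera observes marker 1
    (0, 2, np.array([3.0, -0.2, 0.10]), 1.0),  # camera observes marker 2
    (1, 2, np.array([1.0, -0.3, 0.05]), 1.0),  # rigid-link constraint between markers
]

def residuals(flat):
    # Camera is fixed at the origin; the unknowns are the two marker poses.
    poses = np.vstack([[0.0, 0.0, 0.0], flat.reshape(-1, 3)])
    res = [w * (relative_pose(poses[i], poses[j]) - z) for i, j, z, w in edges]
    return np.concatenate(res)

x0 = np.zeros(2 * 3)                 # initial guess for the two marker poses
sol = least_squares(residuals, x0)   # Gauss-Newton-style refinement over the graph
print(sol.x.reshape(-1, 3))          # optimized marker poses
```

In this toy setup, each edge contributes a residual between the predicted and observed relative pose, so adding more cameras or markers simply adds nodes and edges; this is the sense in which a graph representation lets observations and constraints be managed systematically while one optimizer refines all poses jointly.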
