Abstract

Structured light (SL) has been extensively researched and developed because it enables fast and highly accurate depth sensing, and many approaches have been proposed toward this goal. In this article, we present new work toward a fast, high-precision depth perception system based on structured light. Our basic ideas are as follows. For slow-motion dynamic scenes, sparse depth maps are obtained from a pair of unstructured rigid patterns of different colors, and high-quality depth images are produced by optimizing them with ridge-line extraction; the two depth maps computed from the differently colored patterns are then merged to obtain the final optimized depth. For high-speed dynamic scenes, we further propose a model that uses structured light from a pair of projectors. Unlike previous approaches, we actively exploit motion blur to estimate depth: patterns are projected from the two projectors, the motion blur of each line is accurately measured, and scene depth is estimated by analyzing the length of the blur. We describe two experiments conducted on two different targets (a ball and a planar board). The first experiment used the planar board to measure a slowly moving object; the second used the ball to obtain the depth of an ultra-fast object. Our experimental results demonstrate that our approach achieves effective real-time measurement and successfully obtains accurate depth information of moving objects.
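To make the slow-motion fusion step concrete, the following is a minimal sketch of merging the two sparse depth maps computed from the differently colored patterns. The fusion rule here (average where both maps are valid, otherwise take the available estimate) is an illustrative assumption, not the paper's method, and the function name `merge_depth_maps` is hypothetical.

```python
import numpy as np

def merge_depth_maps(depth_a, depth_b):
    """Fuse two sparse depth maps (one per color pattern).

    Hypothetical fusion rule, not the paper's: where both maps hold a
    valid (non-zero) estimate, average them; where only one does, take
    it; where neither does, leave the pixel empty (0).
    """
    depth_a = np.asarray(depth_a, dtype=float)
    depth_b = np.asarray(depth_b, dtype=float)
    valid_a, valid_b = depth_a > 0, depth_b > 0
    merged = np.zeros_like(depth_a)
    both = valid_a & valid_b
    merged[both] = 0.5 * (depth_a[both] + depth_b[both])
    merged[valid_a & ~valid_b] = depth_a[valid_a & ~valid_b]
    merged[valid_b & ~valid_a] = depth_b[valid_b & ~valid_a]
    return merged
```

In practice the paper optimizes each map with ridge-line extraction before merging; the sketch above covers only the final combination step.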

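For the high-speed model, the abstract states that depth is estimated from the measured blur length of each projected line. Below is a minimal, hypothetical sketch assuming the blur length observed in the image shrinks roughly inversely with depth under a pinhole model; the mapping `blur_px ≈ k / depth + d0` and the constants `k` and `d0` are illustrative placeholders, not the paper's calibration.

```python
import numpy as np

def depth_from_blur_length(blur_px, k, d0=0.0):
    """Hypothetical mapping from measured blur length (pixels) to depth.

    Assumes an image-plane blur whose length scales inversely with
    depth: blur_px ~ k / depth + d0. Both k and d0 are illustrative
    calibration constants, not values from the paper.
    """
    return k / (np.asarray(blur_px, dtype=float) - d0)

# Example: blur lengths (pixels) measured for three projected lines,
# with a made-up calibration constant k (pixel-metres).
blur_lengths = [12.0, 8.0, 5.0]
print(depth_from_blur_length(blur_lengths, k=24.0))  # -> [2.  3.  4.8]
```

A real system would calibrate this relation per projector and account for the object's motion direction; the point of the sketch is only that longer blur corresponds to nearer, faster-moving surface points under these assumptions.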