Abstract
To improve the accuracy of multi-projection correction and fusion, a multi-projection correction method based on binocular vision is proposed. Most existing methods rely on a single camera, which loses the depth information of the display wall and may fail to capture the details of its geometric structure accurately. The proposed method uses the depth information from a binocular camera to build a high-precision 3D model of the display wall, so the CAD dimensions of the wall do not need to be known in advance, and the method can be applied to display walls of any shape. Calibrating the binocular camera reduces its radial and decentering distortion. The projector projects encoded structured-light fringes; after the binocular camera captures the deformed fringes on the projection screen, the high-precision 3D structure of the screen is reconstructed from the phase relationship. A sub-pixel screen-projector mapping is then established, achieving high-precision geometric correction. In addition, the one-to-one mapping between phase information and 3D space points enables accurate point-cloud matching among multiple binocular phase sets, so the method can be applied to any number of projectors. Experimental results on various irregularly shaped projection screens show that, compared with the single-camera method, the proposed method improves the geometric correction accuracy of multi-projection stitching by more than 20%. The method also offers strong generality, high measurement accuracy, and fast measurement speed, indicating broad application potential across many fields.
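The abstract only summarizes the phase-decoding step of the structured-light pipeline. As a minimal, hedged sketch of how standard N-step phase-shift decoding is typically implemented (not the authors' implementation; the fringe model, the `wrapped_phase` helper, and the toy data below are assumptions for illustration), one camera's captured fringes can be decoded to a wrapped phase map, which is then what the left and right views match against before triangulation:

```python
import numpy as np

def wrapped_phase(fringe_images):
    """Decode the wrapped phase from N phase-shifted fringe images.

    fringe_images: sequence of N grayscale images (H x W arrays), assumed to
    follow I_n = A + B * cos(phi + 2*pi*n/N) for n = 0..N-1.
    Returns the wrapped phase phi in (-pi, pi] per pixel.
    """
    n = len(fringe_images)
    deltas = 2.0 * np.pi * np.arange(n) / n            # phase shifts delta_n
    stack = np.stack([np.asarray(img, dtype=np.float64) for img in fringe_images])
    num = np.tensordot(np.sin(deltas), stack, axes=1)  # sum_n I_n * sin(delta_n)
    den = np.tensordot(np.cos(deltas), stack, axes=1)  # sum_n I_n * cos(delta_n)
    # For the cosine fringe model above, sum I_n sin(delta_n) = -N*B*sin(phi)/2
    # and sum I_n cos(delta_n) = N*B*cos(phi)/2, hence the minus sign.
    return np.arctan2(-num, den)

# Toy usage: synthesize 4-step fringes for a known phase map and recover it.
if __name__ == "__main__":
    h, w, steps = 4, 6, 4
    true_phi = np.linspace(-1.0, 1.0, h * w).reshape(h, w)
    fringes = [128 + 100 * np.cos(true_phi + 2 * np.pi * k / steps)
               for k in range(steps)]
    print(np.allclose(wrapped_phase(fringes), true_phi))  # True
```

In a full system the wrapped phase would still need to be unwrapped (e.g., with multi-frequency fringes), matched between the two calibrated cameras, and triangulated to obtain the 3D point cloud of the screen; those steps are omitted here.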