Abstract

In this research, the authors address the collaborative calibration and real-time three-dimensional (3D) localization problem in multi-view systems. The proposed 3D localization method fuses two-dimensional (2D) image coordinates from multiple views and provides the 3D spatial location in real time, a fundamental step toward obtaining the 3D location of a moving object in computer vision. An improved common perpendicular centroid algorithm is presented to reduce the side effects of shadow detection and improve localization accuracy. Collaborative calibration is used to generate the intrinsic and extrinsic parameters of the multi-view cameras synchronously. The experimental results show that the algorithm achieves accurate positioning in indoor multi-view monitoring while reducing computational complexity.
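
To make the fusion step concrete, here is a minimal two-view sketch of the common perpendicular idea in Python (our own illustration, not the authors' improved centroid variant; the function and variable names are ours): each view back-projects its 2D detection into a 3D ray, and the midpoint of the common perpendicular segment between the two rays is taken as the fused 3D estimate.

    import numpy as np

    def common_perpendicular_midpoint(c1, d1, c2, d2, eps=1e-12):
        """Midpoint of the common perpendicular between two skew rays.

        c1, c2: camera centres; d1, d2: back-projected ray directions.
        """
        d1 = d1 / np.linalg.norm(d1)
        d2 = d2 / np.linalg.norm(d2)
        w0 = c1 - c2
        a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
        d, e = d1 @ w0, d2 @ w0
        denom = a * c - b * b            # ~0 for (near-)parallel rays
        if abs(denom) < eps:
            raise ValueError("rays are nearly parallel")
        s = (b * e - c * d) / denom      # parameter along ray 1
        t = (a * e - b * d) / denom      # parameter along ray 2
        p1 = c1 + s * d1                 # foot of perpendicular on ray 1
        p2 = c2 + t * d2                 # foot of perpendicular on ray 2
        return 0.5 * (p1 + p2)           # fused 3D location estimate

With more than two views, the paper's centroid variant presumably aggregates such pairwise estimates across all view pairs; the shadow-related improvement is specific to the paper and is not reproduced here.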

Highlights

  • Object tracking is a challenging task across various data sets and remains a hot topic of research in computer vision.[1,2,3,4,5] To extract metric information from two-dimensional (2D) images, a flexible calibration technique is proposed[6] (see the calibration sketch after this list).

  • A 3D localization algorithm based on the concept of the common perpendicular has been presented for the experimental platforms of camera networks (CNs) and wireless multimedia sensor networks (WMSNs).

  • It is extended to the dynamic localization of mobile robots, that is, single- or multi-object tracking.
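
For context, here is a minimal sketch of how a single camera would be calibrated with a flexible checkerboard-based technique[6] using OpenCV. The paper's collaborative scheme calibrates all views synchronously, which this single-camera sketch does not attempt; the file names and pattern size are assumptions.

    import cv2
    import numpy as np

    # Checkerboard geometry (assumed 9x6 inner corners, 25 mm squares).
    pattern = (9, 6)
    objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * 25.0

    obj_pts, img_pts = [], []
    for path in ["view0_frame0.png", "view0_frame1.png"]:  # hypothetical images
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        found, corners = cv2.findChessboardCorners(gray, pattern)
        if found:
            obj_pts.append(objp)
            img_pts.append(corners)

    # K: intrinsic matrix; rvecs, tvecs: per-view extrinsic parameters.
    err, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_pts, img_pts, gray.shape[::-1], None, None)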

Summary

Introduction

Object tracking is a challenging task across various data sets and remains a hot topic of research in computer vision.[1,2,3,4,5] To extract metric information from two-dimensional (2D) images, a flexible calibration technique is proposed.[6] Intrinsic parameters do not depend on the camera location, but rather on the internal camera parameters: the focal length $f$; the numbers of pixels per distance unit in the $u$ and $v$ directions, $k_u$ and $k_v$; the skew factor $g$, which equals zero if and only if the $u$ and $v$ directions are perfectly orthogonal; and the image-frame coordinates of the intersection between the optical axis and the image plane, called the principal point $c_0 = (u_0, v_0)$. These parameters define the calibration matrix $K$ of the camera, expressing the linear transformation between the camera frame and the image frame, given by[32]

$$K = \begin{bmatrix} k_u f & g & u_0 \\ 0 & k_v f & v_0 \\ 0 & 0 & 1 \end{bmatrix}$$

If $M_t$ is the projection matrix, which integrates the intrinsic and extrinsic parameters, with $t = 1, \ldots, 4$ indexing the $t$-th camera, the relationship between the image coordinates and the world coordinate system of the cameras is described as follows:

$$Z_{ct} \begin{bmatrix} u_t \\ v_t \\ 1 \end{bmatrix} = M_t \begin{bmatrix} X_{Obj} \\ Y_{Obj} \\ Z_{Obj} \\ 1 \end{bmatrix} \qquad (17)$$
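
A minimal sketch of this projection model in Python (the names and example numbers are ours; in the paper, $M_t$ would come from the collaborative calibration):

    import numpy as np

    def intrinsic_matrix(f, ku, kv, g, u0, v0):
        """Calibration matrix K as defined above."""
        return np.array([[ku * f, g,      u0],
                         [0.0,    kv * f, v0],
                         [0.0,    0.0,    1.0]])

    def project(K, R, t, X_obj):
        """Equation (17): world point -> pixel coordinates in one camera.

        M = K [R | t] integrates intrinsic and extrinsic parameters;
        the depth Z_c acts as the homogeneous scale factor.
        """
        M = K @ np.hstack([R, t.reshape(3, 1)])   # 3x4 projection matrix
        x = M @ np.append(X_obj, 1.0)             # homogeneous image point
        Zc = x[2]
        return x[:2] / Zc, Zc

    # Example: unit focal length, 1000 px/unit, principal point (320, 240).
    K = intrinsic_matrix(f=1.0, ku=1000.0, kv=1000.0, g=0.0, u0=320.0, v0=240.0)
    (u, v), Zc = project(K, np.eye(3), np.zeros(3), np.array([0.1, 0.2, 2.0]))

Inverting this relation per camera yields the back-projected rays that the common perpendicular step fuses into a single 3D location.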