Abstract
This study proposes a mathematical uncertainty model for the spatial measurement of visual features using Kinect™ sensors. The model provides a qualitative and quantitative basis for using Kinect™ sensors as 3D perception sensors. To achieve this objective, we derived the propagation relationship of the uncertainties between the disparity image space and the real Cartesian space using the mapping function between the two spaces. From this propagation relationship, we obtained a mathematical model for the covariance matrix of the measurement error, which represents the uncertainty in the spatial position of visual features measured by Kinect™ sensors. To derive the quantitative model of spatial uncertainty for visual features, we estimated the covariance matrix in the disparity image space from collected visual feature data. We then computed the spatial uncertainty information by applying this covariance matrix and the calibrated sensor parameters to the proposed mathematical model. The spatial uncertainty model was verified by comparing the uncertainty ellipsoids of the spatial covariance matrices with the distributions of scattered matched visual features. We expect this spatial uncertainty model and its analyses to be useful in a wide range of Kinect™ sensor applications.
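In symbols, the propagation step described above follows the standard first-order error-propagation rule; the notation below is illustrative and not taken verbatim from the paper. Writing a feature in the disparity image space as q = (u, v, d)ᵀ and its mapping into the Cartesian space as p = (x, y, z)ᵀ = f(q),

\[
\Sigma_p \;\approx\; J_f(q)\,\Sigma_q\,J_f(q)^{\top},
\qquad
J_f(q) = \frac{\partial f}{\partial q},
\]

where Σ_q is the covariance matrix estimated in the disparity image space, J_f is the Jacobian of the mapping evaluated with the calibrated sensor parameters, and Σ_p is the resulting spatial covariance whose eigenstructure defines the uncertainty ellipsoid.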
Highlights
On 4 November 2010, Kinect™ was launched as a non-contact motion sensing device by Microsoft for the Xbox 360 video game console [1]
In addition to motion sensing for gaming, the use of Kinect™ sensors in various applications has been actively investigated in many research areas such as robotics, human-computer interface (HCI), and geospatial information
We proposed a mathematical model for spatial measurement uncertainty, which can provide qualitative and quantitative analysis for Kinect™ sensors
Summary
On 4 November 2010, Kinect™ was launched as a non-contact motion sensing device by Microsoft for the Xbox 360 video game console [1]. Kinect™ sensors are well suited to these applications because the essential functionalities can be achieved using the disparity and RGB information. These problems can be solved by stochastic optimization methods operating on measurements that contain errors and uncertainties. This study proposes a mathematical model of the spatial measurement uncertainty of Kinect™ sensors, represented by the covariance matrix of the 3D measurement errors in the actual Cartesian space. To achieve this objective, we derive the propagation relationship of the uncertainties between the disparity image space and the real Cartesian space using the mapping function between the two spaces. A quantitative analysis of the spatial measurement of Kinect™ sensors is then performed by applying the covariance matrix in the disparity image space and the calibrated sensor parameters to the proposed mathematical model.
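As a concrete illustration of the propagation step, the following Python sketch back-projects a disparity-space covariance into Cartesian space under an assumed pinhole-plus-baseline disparity model (z = f·b/d). The model, the parameter values, and the function names are illustrative assumptions for this sketch, not the paper's calibrated mapping.

    import numpy as np

    def backproject(q, fx, fy, cx, cy, baseline):
        # Assumed mapping from disparity image space q = (u, v, d) to
        # Cartesian space p = (x, y, z): z = fx * baseline / d,
        # x = (u - cx) * z / fx, y = (v - cy) * z / fy.
        u, v, d = q
        z = fx * baseline / d
        return np.array([(u - cx) * z / fx, (v - cy) * z / fy, z])

    def propagate_covariance(q, sigma_q, fx, fy, cx, cy, baseline, eps=1e-4):
        # First-order propagation: Sigma_p ~= J * Sigma_q * J^T,
        # with the Jacobian J estimated numerically by central differences.
        J = np.zeros((3, 3))
        for i in range(3):
            dq = np.zeros(3)
            dq[i] = eps
            J[:, i] = (backproject(q + dq, fx, fy, cx, cy, baseline)
                       - backproject(q - dq, fx, fy, cx, cy, baseline)) / (2 * eps)
        return J @ sigma_q @ J.T

    # Example with illustrative (not calibrated) sensor parameters.
    q = np.array([320.0, 240.0, 40.0])            # feature (u, v, d) in pixels
    sigma_q = np.diag([0.5**2, 0.5**2, 0.8**2])   # covariance in disparity image space
    sigma_p = propagate_covariance(q, sigma_q, fx=580.0, fy=580.0,
                                   cx=320.0, cy=240.0, baseline=0.075)
    print(sigma_p)  # 3x3 Cartesian covariance; its eigenvectors give the uncertainty ellipsoid

The eigenvalues and eigenvectors of the resulting 3x3 covariance matrix define the axes of the uncertainty ellipsoid that the paper compares against the distribution of scattered matched visual features.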