Abstract
3D cameras that capture range information in addition to color are increasingly prevalent in the consumer marketplace and are available in many consumer mobile imaging platforms. An interesting and important application enabled by 3D cameras is photogrammetry, where the physical distance between points can be computed from captured imagery. However, for consumer photogrammetry to succeed in the marketplace, it must meet users' real-world expectations of accuracy and consistency and perform well under challenging lighting conditions, varying distances of the object from the camera, and so on. These requirements are exceedingly difficult to meet due to the noisy nature of range data, especially when passive stereo or multi-camera systems are used for range estimation. In this paper, we present a novel and robust algorithm for point-to-point 3D measurement using range camera systems. Our algorithm exploits the intuition that users often specify the end points of an object of interest for measurement and that the line connecting the two points also belongs to the same object. We analyze the 3D structure of the points along this line using robust PCA and improve measurement accuracy by fitting the endpoints to this model prior to computing the measurement. We also handle situations where users attempt to measure a gap, such as the span between the arms of a sofa or the width of a doorway, which violates this assumption. Finally, we test the performance of the proposed algorithm on a dataset of over 1800 measurements collected by humans on the Dell Venue 8 tablet with Intel RealSense Snapshot technology. Our results show significant improvements in both accuracy and consistency of measurement, which are critical to making consumer photogrammetry a reality in the marketplace.
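The sketch below is not the authors' implementation; it is a minimal illustration, under assumed inputs, of the core idea the abstract describes: sample the 3D points along the user-drawn segment, fit a dominant line with a simple trimmed-PCA scheme standing in for robust PCA, snap the noisy endpoints onto that line, and report the distance between the snapped endpoints. The function names, parameters, and trimming heuristic are hypothetical.

```python
# Hypothetical sketch of line-constrained point-to-point measurement
# from noisy range data (not the paper's actual algorithm).
import numpy as np

def robust_line_fit(points, trim_fraction=0.2, iterations=3):
    """Fit a 3D line (centroid + unit direction) to the sampled points,
    iteratively discarding the largest residuals to limit the influence
    of range-noise outliers (a crude stand-in for robust PCA)."""
    pts = np.asarray(points, dtype=float)
    centroid, direction = pts.mean(axis=0), np.array([1.0, 0.0, 0.0])
    for _ in range(iterations):
        centroid = pts.mean(axis=0)
        # Principal direction = first right singular vector of the centered points.
        _, _, vt = np.linalg.svd(pts - centroid, full_matrices=False)
        direction = vt[0]
        # Perpendicular distance of each point to the current line estimate.
        offsets = pts - centroid
        residuals = np.linalg.norm(
            offsets - np.outer(offsets @ direction, direction), axis=1)
        # Keep the best-fitting points and refit on the next iteration.
        keep = residuals.argsort()[: max(2, int(len(pts) * (1 - trim_fraction)))]
        pts = pts[keep]
    return centroid, direction

def point_to_point_measurement(p1, p2, line_samples):
    """Project the two user-selected endpoints onto the robustly fitted
    line and return the distance between the projections, in the same
    units as the input 3D coordinates (e.g., meters)."""
    centroid, direction = robust_line_fit(line_samples)
    t1 = (np.asarray(p1, dtype=float) - centroid) @ direction
    t2 = (np.asarray(p2, dtype=float) - centroid) @ direction
    return abs(t2 - t1)
```

The design intuition this illustrates is the one stated in the abstract: because the segment between the endpoints is assumed to lie on the same object, constraining the noisy endpoint depths to the fitted line suppresses per-pixel range noise before the distance is computed. The gap case (e.g., measuring across a doorway), where that assumption fails and the line model should not be applied, is handled separately in the paper and is not shown here.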