Abstract

Calculating and rendering a stereo disparity map in real time at video rate is a challenging problem. In approaches built on the raster-scan video system, correlations are typically measured from a point in the left image to a point in the right image along the 1D raster, and the stereo correspondence for every pixel is searched. The time-warp algorithm based on dynamic programming (DP) optimizes this search over an entire raster scan line. To ensure accuracy to within one pixel, a pixel-to-pixel similarity matrix must be computed, which makes it nearly impossible to produce a dense stereo disparity map at a video rate such as 30 frames/sec. In this paper, a method is proposed to reduce the enormous time required to compute the pixel-by-pixel similarity matrix. The idea is to apply coarse quantization to the luma and chroma images in the YUV color space to capture the global transition points along a 1D raster line, together with reduced sampling in the plateau regions. Such feature sampling naturally yields a sparse representation of feature points at both edges and plateaus. Thus, the size of the similarity matrix for the time-warp algorithm can be reduced dramatically, from, say, 352 × 288 in CIF, by almost two orders of magnitude.
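The sparse sampling idea described above can be sketched along a single luma scanline: coarsely quantize the pixel values, keep the indices where the quantized value changes (edges), and subsample the remaining flat runs at a fixed stride. This is only a minimal illustration of the principle, not the paper's implementation; the function name and the `q_step` and `plateau_stride` parameters are hypothetical choices.

```python
import numpy as np

def sparse_features(row, q_step=32, plateau_stride=8):
    """Return sparse feature indices for one 1D scanline (illustrative sketch).

    Edge points are where the coarsely quantized value changes; plateau
    regions are kept only at every `plateau_stride`-th pixel.
    """
    q = row // q_step                              # coarse quantization
    trans = np.flatnonzero(np.diff(q) != 0) + 1    # transition (edge) points
    plateau = np.arange(0, row.size, plateau_stride)  # sparse plateau samples
    return np.union1d(trans, plateau)              # sorted, unique indices

# Synthetic 352-pixel CIF-width scanline: two flat regions with one step edge.
row = np.concatenate([np.full(175, 40), np.full(177, 200)]).astype(np.uint8)
pts = sparse_features(row)
```

With these assumed parameters, the 352-pixel line collapses to a few dozen feature points, so the per-line similarity matrix for the DP time warp shrinks roughly quadratically in that reduction factor, which is how an almost two-orders-of-magnitude saving can arise.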
