Abstract

This paper presents an efficient image-based rendering system capable of performing online stereo matching and view synthesis at high speed, entirely on the graphics processing unit (GPU). Given two rectified stereo images, our algorithm first extracts the disparity map with a stream-centric dense depth estimation approach. For high-quality view synthesis, multi-label masks are then generated automatically to adaptively post-process occlusions and ambiguously estimated regions. To allow even faster interactive view generation, an alternative forward warping method is also integrated. Experiments show that our algorithm yields photorealistic intermediate views of high image quality. The optimized implementation also delivers state-of-the-art stereo analysis and view synthesis speed, achieving over 47 fps with 450x375 stereo images and 60 disparity levels on an Nvidia GeForce 7900 graphics card.
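
The following is a minimal CPU-side sketch of disparity-driven forward warping of the kind the abstract refers to for fast intermediate view generation. It is an illustration under assumptions, not the paper's GPU implementation: the grayscale image layout, the interpolation factor alpha, the z-buffer conflict resolution, and the hole handling are all hypothetical choices made here for clarity.

```cpp
// Hypothetical sketch: warp the left view toward an intermediate viewpoint
// at interpolation factor alpha in [0, 1] using a per-pixel disparity map.
// Plain C++ for readability; the paper performs this on the GPU.
#include <vector>
#include <cstdint>
#include <cmath>

struct Image {
    int width = 0, height = 0;
    std::vector<uint8_t> pixels;   // grayscale, row-major (assumption)
};

// Each source pixel is shifted horizontally by alpha * disparity.
// When several source pixels land on the same target, the one with the
// larger disparity (nearer surface) wins -- a simple z-buffer test.
Image forwardWarp(const Image& left, const std::vector<float>& disparity, float alpha) {
    Image out;
    out.width = left.width;
    out.height = left.height;
    out.pixels.assign(left.pixels.size(), 0);        // unfilled pixels stay 0 (holes)
    std::vector<float> zbuf(left.pixels.size(), -1.0f);

    for (int y = 0; y < left.height; ++y) {
        for (int x = 0; x < left.width; ++x) {
            const int src = y * left.width + x;
            const float d = disparity[src];
            const int xt = x - static_cast<int>(std::lround(alpha * d));
            if (xt < 0 || xt >= left.width) continue;  // warped outside the target view
            const int dst = y * left.width + xt;
            if (d > zbuf[dst]) {                       // keep the nearest surface
                zbuf[dst] = d;
                out.pixels[dst] = left.pixels[src];
            }
        }
    }
    return out;  // disocclusion holes remain and require separate filling
}
```

In this sketch, disocclusions simply remain as holes; the paper's multi-label masks would identify such regions for adaptive post-processing.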
