Abstract

We present a method of capturing the visual appearance of a real environment such as the interior of a room. We propose a method for generating arbitrary viewpoint images by constructing a light field with an omni-directional camera. In this method, the omni-directional camera positions of the input image sequences are estimated automatically by extending Zhang's homography-based camera calibration method to omni-directional cameras. We also use a B-Tree data structure for the light field to improve the efficiency of virtual view image synthesis. Our method thus allows the user to explore a virtual environment with a wide field of view and a realistic representation. To demonstrate the proposed method, we captured our lab's interior with an omni-directional camera and successfully generated arbitrary viewpoint images for a virtual tour of the lab environment.
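
The abstract does not spell out how the light field is parameterised or indexed, so the following Python sketch only illustrates the general idea: rays captured by the omni-directional camera are quantised and kept in a sorted index so that, at rendering time, the ray needed for each pixel of a virtual view can be fetched with a logarithmic search. The ray parameterisation (theta, phi, cam_x, cam_y), the quantisation step, and the class name RayIndex are assumptions for illustration, not details taken from the paper.

```python
import bisect


class RayIndex:
    """Keep quantised light-field rays in sorted order so that nearby rays can
    be located with O(log n) binary searches, in the spirit of the B-Tree index
    mentioned in the abstract. (Insertion into a plain Python list is O(n);
    a real B-Tree keeps insertion logarithmic as well.)"""

    def __init__(self, step=0.01):
        self.step = step       # quantisation step for ray parameters (assumed)
        self._keys = []        # sorted list of quantised ray keys
        self._colours = []     # pixel colour stored for each key

    def _key(self, theta, phi, cam_x, cam_y):
        # Quantise direction angles and camera position so rays captured from
        # nearby viewpoints in similar directions get neighbouring keys.
        q = self.step
        return (round(theta / q), round(phi / q),
                round(cam_x / q), round(cam_y / q))

    def insert(self, theta, phi, cam_x, cam_y, colour):
        k = self._key(theta, phi, cam_x, cam_y)
        i = bisect.bisect_left(self._keys, k)
        self._keys.insert(i, k)
        self._colours.insert(i, colour)

    def lookup(self, theta, phi, cam_x, cam_y):
        # Return the colour of the stored ray whose key is closest in sort
        # order to the query; a virtual view would be synthesised by issuing
        # one such query per output pixel.
        if not self._keys:
            return None
        i = bisect.bisect_left(self._keys, self._key(theta, phi, cam_x, cam_y))
        i = min(i, len(self._keys) - 1)
        return self._colours[i]
```

In this sketch, populating the index from the calibrated input sequence and then calling lookup once per pixel of the desired viewpoint stands in for the paper's virtual view synthesis step; the actual ray selection and interpolation scheme used by the authors is not described in the abstract.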
