Abstract

Most image-based modeling and rendering systems in the literature rely on input supplied by the user, so it becomes necessary to optimize user interaction while building the 3D model. We present an interactive system for image-based 3D model building from single-view uncalibrated images, based on depth-cueing, which constructs an approximate wireframe from user-specified depth information. The depth information is conveyed by drawing gray-shaded lines whose intensity is highest at vertices closer to the viewpoint and decreases toward vertices farther away. On the rendering side, the perspective distortion of each surface is rectified by a projective transformation, exploiting the presence of symmetric objects (such as circles and squares) in the image to recover the fronto-parallel view. Our study shows that symmetric objects such as circles deform into ellipses under perspective distortion and projective transformation, and we demonstrate the value of such objects in a scene or image for rectifying the distortion. The rectified surfaces are used both to recover the actual 3D coordinates of the wireframe and as texture maps during rendering, yielding photo-realistic results. We use images containing planar surfaces, and the user interaction during wireframe building requires no knowledge of the scene or camera parameters. The results are significant and convincing.
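The rectification step described above amounts to estimating a planar homography. As a minimal sketch (not the authors' implementation), assuming the user identifies the four image corners of a known symmetric shape such as a square, the projective transformation to the fronto-parallel view can be recovered with the standard Direct Linear Transform:

```python
import numpy as np

def estimate_homography(src, dst):
    """Estimate the 3x3 homography H mapping src points to dst points
    via the Direct Linear Transform (DLT). src, dst: (N, 2) arrays, N >= 4."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography is the null vector of A, found via SVD.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_homography(H, pts):
    """Apply H to (N, 2) points in homogeneous form; returns (N, 2) points."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])
    mapped = pts_h @ H.T
    return mapped[:, :2] / mapped[:, 2:3]

# Hypothetical example: four corners of a square seen under perspective
# (an arbitrary quadrilateral in the image), mapped back to a
# fronto-parallel unit square.
quad = np.array([[10.0, 12.0], [95.0, 18.0], [88.0, 90.0], [15.0, 84.0]])
square = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
H = estimate_homography(quad, square)
print(np.allclose(apply_homography(H, quad), square))  # True
```

Warping the surface's pixels through the same homography would then produce the rectified texture map; a circle on the plane, imaged as an ellipse, maps back to a circle under this transformation.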
