Abstract

Most approaches that estimate a scene’s 3D depth from a single image model the point spread function (PSF) as a 2D Gaussian. However, these methods suffer from noise and struggle to recover high-quality depth. We present a simple yet effective approach that accurately estimates the amount of spatially varying defocus blur at edges, based on a Cauchy distribution model for the PSF. The input image is re-blurred twice using two known Cauchy distribution kernels, and the defocus blur amount at edges is derived from the gradient ratio between the two re-blurred images. By propagating the blur amount from edge locations to the entire image via matting interpolation, a full depth map is then recovered. Experimental results on several real images demonstrate both the feasibility and the effectiveness of our non-Gaussian PSF model in providing a better estimate of the defocus map from a single uncalibrated defocused image. The results also show that our method is robust to image noise, inaccurate edge locations, and interference from neighboring edges, and that it generates more accurate scene depth maps than most existing methods based on a Gaussian PSF model.

Highlights

  • Estimation of a scene’s 3D depth is a fundamental problem in computer vision and computer graphics, with applications including robotics, scene understanding, image deblurring and refocusing, and 3D reconstruction

  • We focus on the more challenging problem of recovering the defocus map from a single image captured by an uncalibrated conventional camera, using the defocus blur at edges

  • The point spread function (PSF) can be approximated by a Gaussian function g(x, σ), where the standard deviation σ = k ⋅ c is proportional to the diameter c of the circle of confusion (CoC) and measures the defocus blur amount
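The conventional Gaussian PSF model named in this highlight can be sketched as follows; the constant k and the CoC diameter c below are illustrative, assumed values, not calibrated camera parameters:

```python
import numpy as np

def gaussian_psf(x, sigma):
    """1D Gaussian PSF g(x, sigma), normalized as a probability density."""
    return np.exp(-x ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

k = 0.5          # assumed camera-dependent constant (illustrative)
c = 4.0          # example CoC diameter in pixels (illustrative)
sigma = k * c    # defocus blur amount used by Gaussian-based methods
x = np.linspace(-10, 10, 201)
psf = gaussian_psf(x, sigma)
print(f"sigma = {sigma}, PSF peak = {psf.max():.4f}")
```

The larger the circle of confusion, the larger σ and the flatter the PSF peak, which is what makes σ usable as a per-edge measure of defocus.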


Summary

Introduction

Estimating a scene’s 3D depth is a fundamental problem in computer vision and computer graphics, with applications including robotics, scene understanding, image deblurring and refocusing, and 3D reconstruction. Several methods have been proposed to recover a depth map from a single image; these do not suffer from the correspondence problem of multi-image matching. A defocus map and an all-focused image can be obtained after deconvolution with calibrated blur kernels, but such methods require additional illumination or camera modification to obtain a defocus map from a single image. We focus on the more challenging problem of recovering the defocus map from a single image captured by an uncalibrated conventional camera, using the defocus blur at edges. Namboodiri and Chaudhuri [14] model the PSF of defocus blur as a thermal diffusion process, use inhomogeneous inverse heat diffusion to estimate the defocus blur at edge locations, and apply a graph-cut based method to recover the scene’s depth map. Our method estimates the scene’s depth map with a fairly good degree of accuracy.
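The re-blurring idea described above can be illustrated with a 1D sketch. This is not the paper’s implementation: it assumes an ideal unit step edge, uses the fact that Cauchy scale parameters add under convolution, and reads gradients only at the known edge location; the scales gamma_true, g1, and g2 are arbitrary illustrative choices. Under these assumptions the gradient ratio between the two re-blurred profiles inverts in closed form for the unknown blur amount.

```python
import numpy as np

def cauchy_kernel(gamma, radius, dx):
    """Discrete 1D Cauchy (Lorentzian) PSF, normalized to sum to 1."""
    x = np.arange(-radius, radius + dx / 2, dx)
    k = gamma / (np.pi * (x ** 2 + gamma ** 2))
    return k / k.sum()

dx = 0.25                                  # sub-pixel sampling step
x = np.arange(-1000, 1000 + dx / 2, dx)

# A step edge already defocused by an unknown Cauchy blur: a unit step
# convolved with a Cauchy PSF of scale gamma is the Cauchy CDF.
gamma_true = 3.0
edge = 0.5 + np.arctan(x / gamma_true) / np.pi

# Re-blur twice with two known Cauchy kernels. Cauchy scales add under
# convolution, so the profiles have scales gamma_true + g1, gamma_true + g2.
g1, g2 = 1.0, 2.0
b1 = np.convolve(edge, cauchy_kernel(g1, 500, dx), mode="same")
b2 = np.convolve(edge, cauchy_kernel(g2, 500, dx), mode="same")

# Gradient magnitudes at the edge location x = 0. For a Cauchy-blurred
# step of scale s the peak gradient is 1 / (pi * s), so the ratio
# r = (gamma + g2) / (gamma + g1), which we invert for gamma.
c = len(x) // 2
r = np.abs(np.gradient(b1, dx))[c] / np.abs(np.gradient(b2, dx))[c]
gamma_est = (g2 - r * g1) / (r - 1.0)
print(f"estimated blur scale: {gamma_est:.2f}")   # close to gamma_true
```

In a real image the same ratio would be measured at detected edge pixels and the recovered blur amounts then propagated to the full image, per the matting interpolation step described in the abstract.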

Defocus Model
Edge Defocus Blur Estimation
Whole-Scene Depth Map Extraction
Results
Conclusion
