Abstract

Lambertian photometric stereo (PS) is a seminal computer vision method. However, using depth maps in the image formation model, instead of the surface normals used in PS, reduces the number of model parameters by a third (two parameters per pixel, depth and albedo, instead of three: a unit normal's two degrees of freedom plus albedo), making depth maps preferable from an information-theoretic perspective. The Akaike information criterion (AIC) quantifies this trade-off between goodness of fit and overfitting. Obtaining superior AIC values requires an effective maximum likelihood (ML) method for depth-map and albedo estimation. Recently, the authors published an ML estimation method that uses a two-step approach based on PS. While effective, that method's accuracy is limited by its approximations of the noise distributions and its decoupling of depth-map and albedo estimation. Overcoming these limitations, this paper presents an ML method that operates directly on the images. The previous two-step ML method provides a robust initial solution, which kick-starts a new nonlinear estimation process. An innovative formulation of the estimation task, including a separable nonlinear least-squares approach, reduces the computational burden of the optimization. Experiments demonstrate visual improvements under noisy conditions by avoiding overfitting. In addition, a comprehensive analysis shows that the refined depth maps and albedos produce superior AIC values and achieve better predictive accuracy than methods from the literature. The results indicate that the new method is a promising means of depth-map and albedo estimation with superior information-theoretic performance.
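For context, the AIC referenced above is the standard criterion AIC = 2k − 2 ln L̂, where k is the number of estimated parameters and L̂ the maximized likelihood; under Gaussian noise it reduces, up to an additive constant, to n ln(RSS/n) + 2k. The paper's exact formulation is not given in the abstract, so the sketch below illustrates only the general variable-projection idea behind a separable nonlinear least-squares approach for a simple Lambertian model: for a fixed depth map, the albedo enters the residual linearly and can be eliminated in closed form, so the outer solver optimizes over depth alone. All names, the finite-difference normals, and the synthetic data are hypothetical, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import least_squares

def normals_from_depth(z, shape):
    """Unit surface normals from a depth map via finite differences:
    the unnormalized normal at each pixel is (-dz/dx, -dz/dy, 1)."""
    Z = z.reshape(shape)
    dzdy, dzdx = np.gradient(Z)                 # axis 0 = y, axis 1 = x
    n = np.stack([-dzdx.ravel(), -dzdy.ravel(), np.ones(Z.size)], axis=1)
    return n / np.linalg.norm(n, axis=1, keepdims=True)

def projected_residuals(z, I, L, shape):
    """Variable projection: for a fixed depth map, the least-squares-optimal
    per-pixel albedo has a closed form, so only depth is a free variable.
    (Attached shadows are ignored for simplicity.)"""
    S = L @ normals_from_depth(z, shape).T      # shading terms, m x p
    rho = (S * I).sum(axis=0) / np.maximum((S ** 2).sum(axis=0), 1e-12)
    return (I - rho * S).ravel()                # residuals over all images/pixels

# Hypothetical synthetic setup: m images of a p-pixel grid under known lights.
rng = np.random.default_rng(0)
m, shape = 8, (16, 16)
p = shape[0] * shape[1]
L = rng.normal(size=(m, 3))
L[:, 2] = np.abs(L[:, 2])                       # lights in the upper hemisphere
L /= np.linalg.norm(L, axis=1, keepdims=True)
I = rng.random((m, p))                          # stand-in image stack
z0 = np.zeros(p)                                # in the paper, the two-step
                                                # PS method would supply this
fit = least_squares(projected_residuals, z0, args=(I, L, shape))

# Gaussian-noise AIC up to an additive constant: n*ln(RSS/n) + 2k,
# with k counting depth and albedo parameters (one each per pixel).
r = fit.fun
n, k = r.size, 2 * p
aic = n * np.log((r ** 2).sum() / n) + 2 * k
print(f"AIC (up to const.): {aic:.1f}")
```

Eliminating the albedo in closed form halves the number of variables the nonlinear solver sees (p instead of 2p), which is one reason a separable formulation can reduce the optimization burden, as the abstract claims.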
