Abstract
Many factors lead to spatially varying blur kernels in a blurred image, such as camera shake, moving objects, and scene depth variation. Traditional camera-shake removal methods ignore either the influence of varying scene depth or that of object motion in dynamic scenes, while methods not limited to removing camera shake typically make simplistic assumptions about the camera motion trajectory. We consider these factors in a unified framework, with the aid of an alternate-exposure capture strategy and simultaneously recorded inertial sensor readings. The inertial measurements relate the long-exposure blurred image to the preceding and succeeding short-exposure noisy images, and this special exposure arrangement effectively addresses the problem inherent in reconstructing camera motion from inertial measurements. In addition, the noisy image pair bracketing the blurred image is used for motion detection and initial depth-map estimation, making the proposed method free of user interaction and additional expensive devices. In contrast to previous methods that parametrize the motion blur of the moving foreground layer and the static background layer independently, we exploit the fact that camera shake has a global influence and decompose the motion of the foreground layer accordingly, so that a tighter constraint between the motions of the two layers is established. Given the motion and image data, we propose a single energy model and minimize it with alternating optimization to estimate the spatially varying motion blur and the latent sharp image. Comparative experimental results demonstrate that our method outperforms conventional camera-motion deblurring and object deblurring methods on both synthetic and real scenes.
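To illustrate the alternating-optimization structure mentioned in the abstract, the sketch below shows a generic blind-deblurring loop that alternates between a latent-image update and a blur-kernel update using FFT-based closed-form solutions. This is not the paper's method: it assumes, for simplicity, a single uniform blur kernel and Tikhonov regularization, whereas the paper estimates spatially varying, layer-wise blur guided by inertial measurements and a noisy image pair. All function and parameter names here are illustrative.

```python
# Minimal sketch of alternating optimization for blind deblurring.
# Assumptions (not from the paper): one uniform kernel, Tikhonov priors,
# circular-convolution image formation, and FFT closed-form updates.
import numpy as np
from numpy.fft import fft2, ifft2


def psf2otf(kernel, shape):
    """Zero-pad a small kernel to the image grid and center it at the origin."""
    pad = np.zeros(shape)
    kh, kw = kernel.shape
    pad[:kh, :kw] = kernel
    # Shift the kernel center to (0, 0) so its FFT is a valid transfer function.
    return fft2(np.roll(pad, (-(kh // 2), -(kw // 2)), axis=(0, 1)))


def deblur_alternating(blurred, kernel_size=15, n_iters=20,
                       lam_image=1e-2, lam_kernel=1e-1):
    """Alternately estimate the latent image L and the blur kernel k."""
    H, W = blurred.shape
    B = fft2(blurred)

    # Initialize: latent image = observation, kernel = delta (no blur).
    latent = blurred.copy()
    kernel = np.zeros((kernel_size, kernel_size))
    kernel[kernel_size // 2, kernel_size // 2] = 1.0

    for _ in range(n_iters):
        # Image step: argmin_L ||k * L - B||^2 + lam_image ||L||^2
        K = psf2otf(kernel, (H, W))
        latent = np.real(ifft2(np.conj(K) * B / (np.abs(K) ** 2 + lam_image)))

        # Kernel step: argmin_k ||k * L - B||^2 + lam_kernel ||k||^2
        L = fft2(latent)
        k_full = np.real(ifft2(np.conj(L) * B / (np.abs(L) ** 2 + lam_kernel)))
        # Crop to the kernel support around the origin, enforce k >= 0, sum to 1.
        half = kernel_size // 2
        k_full = np.roll(k_full, (half, half), axis=(0, 1))
        kernel = np.maximum(k_full[:kernel_size, :kernel_size], 0)
        kernel /= kernel.sum() + 1e-12

    return latent, kernel


if __name__ == "__main__":
    # Synthetic check: blur a random image with a horizontal motion kernel.
    rng = np.random.default_rng(0)
    sharp = rng.random((128, 128))
    true_k = np.zeros((15, 15))
    true_k[7, 4:11] = 1.0 / 7.0
    blurred = np.real(ifft2(psf2otf(true_k, sharp.shape) * fft2(sharp)))

    latent, est_k = deblur_alternating(blurred)
    print("reconstruction MSE:", np.mean((latent - sharp) ** 2))
```

In the paper's setting, the image step and kernel (motion) step would additionally be constrained by the inertial readings, the short-exposure noisy images, the depth map, and the foreground/background layer decomposition; the sketch only conveys how the two unknowns are updated in turn within a single energy-minimization loop.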