Abstract

Synthetic aperture imaging is a technique that uses a camera array to mimic a camera with a large virtual convex lens. In the synthesized image, objects on the focal plane appear sharp while objects off the focal plane appear blurry; this refocusing effect is the most important property of synthetic aperture imaging and makes it an ideal tool for handling occlusion. Unfortunately, automatically measuring the degree of focus of a single synthetic aperture image remains a challenging problem, and commonly employed pixel-based methods either compute variance across views or rely on a "manual focus" interface. In this paper, a novel method is proposed to automatically determine whether or not a synthetic aperture image is in focus. Unlike conventional focus estimation methods, which pick the focal plane that minimizes the variance of corresponding pixels captured by the different views of a camera array, our method uses a deep neural network to decide whether a synthetic aperture image is in focus from a single image of the scene, without access to the other views. In particular, our method can be applied to automatically select the focal plane for synthetic aperture images. Experimental results show that the proposed method outperforms both traditional automatic focusing methods for synthetic aperture imaging and other focus estimation methods, and it is more than five times faster than the state-of-the-art methods. By combining with object detection or tracking algorithms, the proposed method can also automatically select the focal plane that keeps moving objects in focus. To the authors' best knowledge, this is the first time that a deep neural network has been used to estimate whether or not a single synthetic aperture image is in focus.
