Abstract

The precision of image-based measurements is limited mainly by the resolution of the camera hardware. Increasing the resolution of measurement images through super-resolution is therefore an effective way to improve the precision of vision measurement. However, most image super-resolution methods rely only on the prediction of spatial information, so the reconstruction quality and localization accuracy in edge regions are often insufficient for precision measurement. To address this, this paper proposes a Curvelet coefficient prediction method for image super-resolution (CPSR) that reconstructs edge regions accurately. First, the image is decomposed into sub-bands of Curvelet coefficients at different scales to extract frequency features. Then, deep residual networks are built and trained to fit the mapping between low- and high-resolution coefficient sub-bands. In addition, a Curvelet loss and an edge localization loss are designed to penalize Curvelet coefficient errors at each scale and sub-pixel edge localization errors. The proposed method is evaluated on public super-resolution datasets and on real precision measurement images. Experimental results show that CPSR produces images with better visual quality and generalizes well to different edge patterns. Furthermore, compared with common deep-learning-based super-resolution methods, vision measurements on images reconstructed by CPSR yield smaller measurement errors, indicating more accurate edge localization and verifying the effectiveness of the proposed method.
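The abstract outlines a transform-domain pipeline: decompose the image into Curvelet sub-bands, regress each low-resolution sub-band to its high-resolution counterpart with a deep residual network, and supervise training with a coefficient loss plus an edge localization loss. The full text is not available here, so the following is only a minimal PyTorch sketch of that idea, not the authors' implementation: the network shape, the Sobel-based stand-in for the localization term, all names (SubbandNet, curvelet_loss, edge_localization_loss), and all hyperparameters are illustrative assumptions, and the Curvelet transform itself (not shown) would come from an external library such as a CurveLab wrapper.

```python
# Hedged sketch of the CPSR idea described in the abstract. Assumptions:
# - The Curvelet decomposition/reconstruction is done externally and each
#   low-resolution sub-band is resampled to the high-resolution grid.
# - Depth, channel counts, and loss weights below are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ResidualBlock(nn.Module):
    """Two 3x3 convolutions with an identity skip connection."""

    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1)

    def forward(self, x):
        return x + self.conv2(F.relu(self.conv1(x)))


class SubbandNet(nn.Module):
    """Deep residual network mapping one low-resolution Curvelet
    coefficient sub-band to its high-resolution counterpart."""

    def __init__(self, channels: int = 64, num_blocks: int = 8):
        super().__init__()
        self.head = nn.Conv2d(1, channels, 3, padding=1)
        self.body = nn.Sequential(
            *[ResidualBlock(channels) for _ in range(num_blocks)]
        )
        self.tail = nn.Conv2d(channels, 1, 3, padding=1)

    def forward(self, lr_band):
        # Global residual: the network only predicts the correction
        # to the (upsampled) low-resolution coefficient sub-band.
        return self.tail(self.body(self.head(lr_band))) + lr_band


def curvelet_loss(pred_bands, gt_bands, scale_weights):
    """Weighted sum of per-scale L1 errors between predicted and
    ground-truth coefficient sub-bands (weights are assumptions)."""
    return sum(
        w * F.l1_loss(p, g)
        for w, p, g in zip(scale_weights, pred_bands, gt_bands)
    )


def edge_localization_loss(pred_img, gt_img):
    """Penalize edge displacement by comparing gradient maps; a
    Sobel-based stand-in for the paper's sub-pixel localization term."""
    kx = torch.tensor(
        [[-1.0, 0.0, 1.0], [-2.0, 0.0, 2.0], [-1.0, 0.0, 1.0]]
    ).view(1, 1, 3, 3).to(pred_img)
    ky = kx.transpose(2, 3)
    gx_p, gy_p = F.conv2d(pred_img, kx, padding=1), F.conv2d(pred_img, ky, padding=1)
    gx_g, gy_g = F.conv2d(gt_img, kx, padding=1), F.conv2d(gt_img, ky, padding=1)
    return F.l1_loss(gx_p, gx_g) + F.l1_loss(gy_p, gy_g)


# Example: predict the finest-scale sub-band of a 64x64 patch.
net = SubbandNet()
hr_pred = net(torch.randn(1, 1, 64, 64))
```

One sub-band network per scale (rather than a single spatial-domain network) matches the abstract's claim that frequency features are regressed separately at each scale; the combined training objective would then be the coefficient loss plus a weighted edge localization term.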
