Abstract

In recent years, the rapid development of deep learning has produced remarkable results across many fields of scientific research. In computer vision in particular, deep learning has reached near state-of-the-art performance on image processing tasks, and related methods have been applied to image inpainting, prompting researchers to address digital image inpainting with deep learning models. The emergence of the generative adversarial network (GAN) has greatly advanced digital image inpainting technology. This paper builds an image inpainting framework based on a generative adversarial network and divides the inpainting process into two parallel stages: reconstruction inpainting and generation inpainting. The network consists of two parts, a generator and a discriminator: the generator produces an image, and the discriminator judges whether the generated image is consistent with the real image. The network is trained with the WGAN-GP loss, and the two networks are updated alternately so that the generated images become more natural and realistic. The algorithm is validated on the Places2 dataset, and the quality of the inpainted images is assessed by combining subjective and objective evaluation methods. Experiments show that the proposed algorithm achieves good inpainting results.
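As a point of reference for the training objective mentioned above, the following is a minimal sketch of the WGAN-GP critic and generator losses, assuming a PyTorch implementation; the function names, the penalty weight `lam`, and the `critic` module are illustrative placeholders, not taken from the paper.

```python
import torch

def gradient_penalty(critic, real, fake, device="cpu"):
    # Sample points on straight lines between real and inpainted images
    eps = torch.rand(real.size(0), 1, 1, 1, device=device)
    interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    scores = critic(interp)
    grads = torch.autograd.grad(
        outputs=scores, inputs=interp,
        grad_outputs=torch.ones_like(scores),
        create_graph=True, retain_graph=True,
    )[0]
    grads = grads.view(grads.size(0), -1)
    # Penalize deviation of the gradient norm from 1
    return ((grads.norm(2, dim=1) - 1.0) ** 2).mean()

def critic_loss(critic, real, fake, lam=10.0, device="cpu"):
    # WGAN-GP critic objective: E[D(fake)] - E[D(real)] + lam * gradient penalty
    return (critic(fake).mean() - critic(real).mean()
            + lam * gradient_penalty(critic, real, fake, device))

def generator_loss(critic, fake):
    # Generator is updated to raise the critic's score on inpainted images
    return -critic(fake).mean()
```

In training, the critic and generator losses are minimized in alternation, which corresponds to the alternating network updates described in the abstract.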
