Abstract

Face editing aims to generate a face image with the target attributes while leaving the identity and other unrelated information unchanged. Current methods achieve considerable performance; however, they cannot effectively retain the face's identity and semantic information while controlling the attribute intensity. Inspired by two human cognitive characteristics, the principle of global precedence and the principle of homology continuity, we propose a novel face editing approach called the information retention and intensity control generative adversarial network (IricGAN). It includes a learnable hierarchical feature combination (HFC) function, which constructs a sample's source space through multiscale feature mixing; this guarantees the integrity of the source space while significantly compressing the network. In addition, the attribute regression module (ARM) decouples the different attribute paradigms in the source space, ensuring that the required attributes are modified correctly while the other regions are preserved. The gradual process of modifying a face attribute can be simulated by applying different control strengths in the source space. In face editing experiments, both qualitative and quantitative results demonstrate that IricGAN achieves the best overall results among state-of-the-art alternatives. Target attributes can be modified continuously by re-feeding the relationship between the source space and the image, and the independence of each attribute is retained to the greatest extent. Code: https://github.com/nanfangzhe/IricGAN
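The sketch below is a minimal, illustrative stand-in for the two ideas the abstract describes: mixing multiscale encoder features into a single source-space representation, and scaling an attribute edit by a strength factor to control intensity. It is not the authors' implementation; all module names, shapes, and the placeholder attribute direction are assumptions made for illustration.

```python
# Hedged sketch: multiscale feature mixing and intensity-controlled editing.
# Everything here (names, shapes, the linear edit) is assumed, not taken from IricGAN.
import torch
import torch.nn as nn
import torch.nn.functional as F


class HierarchicalFeatureMix(nn.Module):
    """Combine feature maps from several encoder scales into one representation."""

    def __init__(self, channels=(64, 128, 256), out_dim=256, size=32):
        super().__init__()
        self.size = size
        # 1x1 convolutions project every scale to a common channel width.
        self.proj = nn.ModuleList(nn.Conv2d(c, out_dim, kernel_size=1) for c in channels)
        # Learnable mixing weights over the scales.
        self.mix = nn.Parameter(torch.ones(len(channels)) / len(channels))

    def forward(self, feats):
        # Project and resize each scale to a shared spatial resolution.
        resized = [
            F.interpolate(p(f), size=self.size, mode="bilinear", align_corners=False)
            for p, f in zip(self.proj, feats)
        ]
        w = torch.softmax(self.mix, dim=0)
        # Weighted sum over scales -> "source space" tensor.
        return sum(w[i] * r for i, r in enumerate(resized))


def edit_with_intensity(source, attr_direction, strength):
    """Apply an attribute edit in the source space, scaled by a strength factor."""
    return source + strength * attr_direction


# Toy usage with random multiscale features standing in for encoder outputs.
feats = [
    torch.randn(1, 64, 64, 64),
    torch.randn(1, 128, 32, 32),
    torch.randn(1, 256, 16, 16),
]
hfc = HierarchicalFeatureMix()
src = hfc(feats)                          # source-space representation
direction = torch.randn_like(src) * 0.1   # placeholder attribute direction
edited = edit_with_intensity(src, direction, strength=0.5)
print(src.shape, edited.shape)
```

Varying `strength` between 0 and 1 in this sketch mimics the continuous attribute-intensity control the abstract attributes to operating in the source space; a decoder (omitted here) would map the edited representation back to an image.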
