Abstract
Steganography is the art of hiding information in carrier media so that its presence goes undetected, while steganalysis is the opposing art of detecting hidden information. With the development of deep learning, several state-of-the-art steganography and steganalysis methods based on deep learning techniques have been proposed to improve hiding or detection capabilities. Steganography based on Generative Adversarial Networks (GANs) directly exploits the minimax game between the generator and the discriminator to automatically produce steganographic algorithms that resist detection by powerful steganalysis. Steganography without embedding (SwE) based on GANs, in which the generated cover images are themselves the stego images carrying the secret information, has shown state-of-the-art steganographic performance. However, GAN-based SwE has serious weaknesses, such as low information recovery accuracy, low steganographic capacity, and poor visual naturalness. To solve these problems, this paper proposes a new SwE method based on an attention-GAN model, with a carefully designed generator, discriminator, and extractor, together with their loss functions and an optimized training mode. The generative model uses attention to improve the correlation among pixels and to correct errors such as image distortion and background abnormality. A soft margin discriminator is used to improve the compatibility between information recovery and the fault tolerance of image generation. Experimental evaluations show that our method achieves very high information recovery accuracy (100% in some cases) while also improving steganographic capacity and image quality.
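To make the loss structure described above concrete, the following is a minimal sketch, not the paper's actual objective: a smooth logistic ("soft margin") discriminator loss, and a hypothetical generator objective that combines an adversarial term with the extractor's message-recovery error. The function names and the weights `lambda_adv` and `lambda_rec` are illustrative assumptions.

```python
import math

def softplus(x: float) -> float:
    # Numerically stable softplus: log(1 + exp(x)).
    return max(x, 0.0) + math.log1p(math.exp(-abs(x)))

def discriminator_loss(d_real: float, d_fake: float) -> float:
    # Soft-margin (logistic) discriminator loss: push D(real) up and
    # D(fake) down with a smooth penalty rather than a hard hinge,
    # tolerating small imperfections in the generated stego images.
    return softplus(-d_real) + softplus(d_fake)

def generator_loss(d_fake: float, rec_error: float,
                   lambda_adv: float = 1.0, lambda_rec: float = 10.0) -> float:
    # Hypothetical combined objective: fool the discriminator while
    # keeping the extractor's message-recovery error low.
    return lambda_adv * softplus(-d_fake) + lambda_rec * rec_error
```

In this sketch, the recovery term dominates (`lambda_rec > lambda_adv`), reflecting the paper's emphasis on high information recovery accuracy; a well-trained discriminator that scores real images high and fakes low drives `discriminator_loss` toward zero.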