Abstract
Hiding covert communication inside media is termed steganography, and uncovering the details of such covert transmission is known as steganalysis. Extracting properties of the hidden message, such as its length, position, and embedding algorithm, forms part of forensic steganalysis. Estimating the length of the payload in a camouflaged exchange is termed quantitative steganalysis and is an indispensable tool for forensic investigators. When the payload length is estimated without any prior knowledge of the cover media or the steganography algorithm used, it is termed universal quantitative steganalysis. Most existing quantitative steganalysis frameworks in the literature work only for a specific embedding algorithm or are domain specific. In this paper we propose USteg-DSE, a deep learning framework for universal quantitative image steganalysis that combines DenseNet with a Squeeze & Excitation module (SEM). In deep learning, deeper networks capture complex statistical properties more easily, but as depth increases they suffer from the vanishing gradient problem. Moreover, classic architectures weight all channels equally when producing feature maps. The proposed USteg-DSE framework overcomes both problems by combining DenseNet and SEM. In a DenseNet, each layer is directly connected to every other layer, which eases information and gradient flow while requiring fewer feature maps. The SEM incorporates a content-aware mechanism that adaptively reweights every feature map. The proposed framework has been compared with existing state-of-the-art techniques in both the spatial and transform domains and shows better results in terms of Mean Absolute Error (MAE) and Mean Squared Error (MSE).
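To make the two building blocks concrete, the sketch below shows a standard Squeeze-and-Excitation block (global average pooling followed by a bottleneck that produces per-channel weights) and a single densely connected layer (its output concatenated onto its input). This is a minimal illustration of the generic mechanisms the abstract names, written in PyTorch; the class names, the reduction ratio of 16, the growth rate of 12, and the framework choice are assumptions for illustration, not details of the authors' USteg-DSE implementation.

```python
import torch
import torch.nn as nn


class SEBlock(nn.Module):
    """Squeeze-and-Excitation: learn a content-aware weight per channel
    and rescale each feature map by it (illustrative, not the paper's code)."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.squeeze = nn.AdaptiveAvgPool2d(1)  # global average pool per channel
        self.excite = nn.Sequential(
            nn.Linear(channels, channels // reduction),  # bottleneck
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),  # per-channel weights in (0, 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.squeeze(x).view(b, c)        # squeeze: (B, C, H, W) -> (B, C)
        w = self.excite(w).view(b, c, 1, 1)   # excitation: adaptive channel weights
        return x * w                          # reweight every feature map


class DenseLayer(nn.Module):
    """One DenseNet layer: the new feature maps are concatenated onto the
    input, so later layers see the outputs of all earlier layers directly."""

    def __init__(self, in_channels: int, growth_rate: int = 12):
        super().__init__()
        self.conv = nn.Sequential(
            nn.BatchNorm2d(in_channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_channels, growth_rate, kernel_size=3, padding=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.cat([x, self.conv(x)], dim=1)  # dense connection


# Illustrative usage: dense connectivity grows channels by growth_rate,
# then the SE module reweights the concatenated feature maps.
x = torch.randn(2, 64, 32, 32)
x = DenseLayer(64)(x)   # -> (2, 76, 32, 32)
x = SEBlock(76)(x)      # same shape, channels adaptively rescaled
print(x.shape)          # torch.Size([2, 76, 32, 32])
```

The concatenation is why DenseNet needs only a small growth rate per layer, and the sigmoid-gated rescaling is the "adaptive regulation" of feature-map weights that the abstract attributes to the SEM.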