Abstract

The prevailing method for steganographic payload location targeting LSB matching is the MAP method, which requires a few hundred stego images whose payload-carrying pixels lie at the same locations and which are embedded at relatively high rates. In practice, however, particularly in security-sensitive communication, steganographers are unlikely to generate stego images with high payloads or to reuse the same embedding key heavily. The requirements of MAP are therefore somewhat unrealistic, and its performance degrades when only a small number of stego images with low embedding rates are available. To address this, we propose a tailored deep neural network (DNN) equipped with an improved feature, the “mean square of adjacency pixel difference”, which remarkably outperforms previous state-of-the-art methods in both accuracy and efficiency. Our approach considerably reduces computational cost because it involves no cover estimate, which is a key component of MAP. This advantage stems from our methodology, which treats payload location as a binary classification problem for each pixel. In addition, our DNN is consistently superior to MAP regardless of the embedding rate. The significance of the main design choices in the DNN and of the improved feature is verified by experimental results. Moreover, our method processes a 256 × 256 pixel image in 82.54 ms on average, which is nearly 14 times faster than MAP. Building on this, incorporating feature extraction into the DNN architecture is likely to enable future researchers to perform payload location in real time.
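
As an illustration of the per-pixel formulation described above, the sketch below computes one plausible reading of the “mean square of adjacency pixel difference” feature and averages it over a set of stego images that share an embedding key. The 4-connected neighbourhood, the function names, and the averaging step are assumptions made for illustration; the paper's exact neighbourhood, normalisation, and network architecture are not specified in the abstract.

```python
import numpy as np


def mean_square_adjacent_diff(img: np.ndarray) -> np.ndarray:
    """Per-pixel mean of squared differences to the 4-connected neighbours.

    Illustrative reading of the "mean square of adjacency pixel difference"
    feature; the paper's exact neighbourhood and normalisation may differ.
    """
    img = img.astype(np.float64)
    h, w = img.shape
    padded = np.pad(img, 1, mode="edge")      # replicate border pixels
    neighbours = np.stack([
        padded[0:h,     1:w + 1],             # neighbour above
        padded[2:h + 2, 1:w + 1],             # neighbour below
        padded[1:h + 1, 0:w],                 # neighbour to the left
        padded[1:h + 1, 2:w + 2],             # neighbour to the right
    ])
    return ((neighbours - img) ** 2).mean(axis=0)


def per_pixel_features(stego_images: list[np.ndarray]) -> np.ndarray:
    """Average the feature over stego images sharing one embedding key,
    yielding one feature value per pixel position (hypothetical pipeline)."""
    return np.mean([mean_square_adjacent_diff(s) for s in stego_images], axis=0)
```

Such per-pixel feature maps could then be fed to a small classifier that labels each pixel as payload-carrying or not, mirroring the binary-classification formulation in the abstract; since no cover estimate is computed, the cost is dominated by a single pass over the images.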

