Abstract

In information theory, the relative entropy (also called information divergence or information distance) quantifies the difference between information flows that follow different probability distributions. In this study, the authors first resolve the asymmetry of the Rényi divergence and the Kullback–Leibler divergence, converting these divergence measures into proper metrics. They then propose an effective metric for detecting distributed denial-of-service attacks, using the Rényi divergence to measure the difference between legitimate flows and attack flows in a network. With the proposed metric, the authors can obtain the optimal detection sensitivity and the optimal information distance between attack flows and legitimate flows by adjusting the value of the order of the Rényi divergence. The experimental results show that the proposed metric clearly enlarges the adjudication distance; it therefore not only detects attacks early but also sharply reduces the false-positive rate compared with the traditional Kullback–Leibler divergence and distance approaches.
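The abstract does not spell out the exact symmetrization the authors apply, but the general idea can be sketched as follows: compute the Rényi divergence of order α between two flow distributions, then symmetrize it so the resulting quantity behaves like a distance. The sketch below assumes discrete distributions over a shared alphabet and uses the common averaged symmetrization; the function names and the choice of symmetrization are illustrative, not the paper's definitions.

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(P || Q) for discrete distributions.

    Defined for alpha > 0, alpha != 1; as alpha -> 1 it converges to
    the Kullback-Leibler divergence. Asymmetric in p and q.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return np.log(np.sum(p ** alpha * q ** (1.0 - alpha))) / (alpha - 1.0)

def renyi_distance(p, q, alpha):
    """Symmetrized variant (assumed form): average of both directions.

    This removes the asymmetry, so renyi_distance(p, q, a) equals
    renyi_distance(q, p, a); the paper's actual construction may differ.
    """
    return 0.5 * (renyi_divergence(p, q, alpha) + renyi_divergence(q, p, alpha))
```

In a detection setting, `p` and `q` would be, for example, empirical packet-rate or source-address distributions for a baseline (legitimate) window and a current traffic window; tuning `alpha` trades off sensitivity to the tails of the distributions.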
