Abstract

In order to cope with an increasing volume of network traffic, flow sampling methods are deployed to reduce the volume of log data stored for monitoring, attack detection, and forensic purposes. Sampling frequently changes the statistical properties of the data and can reduce the effectiveness of subsequent analysis or processing. We propose two concepts that mitigate the negative impact of sampling on the data. Late sampling is based on the simple idea that the features used by the analytic algorithms can be extracted before sampling and attached to the surviving flows. The surviving flows thus carry a representation of the original statistical distribution in these attached features. The second concept we introduce is adaptive sampling. Adaptive sampling deliberately skews the distribution of the surviving data to over-represent rare flows or flows with rare feature values. This preserves the variability of the data and is critical for the analysis of malicious traffic, such as the detection of stealthy, hidden threats. Our approach has been extensively validated on standard NetFlow data, as well as on HTTP proxy logs that approximate the use case of enriched IPFIX for network forensics. Copyright © 2015 John Wiley & Sons, Ltd.
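As a minimal sketch (not the authors' implementation), the Python fragment below illustrates both ideas on a toy single-feature model: features are extracted from every flow before any flow is dropped (late sampling), and the per-flow survival probability is boosted for flows whose feature value is rare (adaptive sampling). The extract_features helper, the base_rate and rarity_boost parameters, and the choice of dst_port as the sole feature are illustrative assumptions, not part of the paper.

```python
import random
from collections import Counter

def extract_features(flow):
    # Hypothetical feature extractor; a real deployment would compute
    # richer statistics (byte counts, durations, HTTP fields, ...).
    return {"dst_port": flow["dst_port"], "bytes": flow["bytes"]}

def late_adaptive_sample(flows, base_rate=0.01, rarity_boost=10.0):
    # Late sampling: features are computed over the complete,
    # unsampled stream, so they reflect the original distribution.
    features = [extract_features(f) for f in flows]
    value_counts = Counter(feat["dst_port"] for feat in features)

    survivors = []
    for flow, feat in zip(flows, features):
        # Adaptive sampling: rarer feature values survive more often.
        rarity = 1.0 / value_counts[feat["dst_port"]]
        p = min(1.0, base_rate * (1.0 + rarity_boost * rarity))
        if random.random() < p:
            # Attach pre-sampling features and an inverse-probability
            # weight so downstream analysis can undo the skew.
            survivors.append(dict(flow, features=feat, weight=1.0 / p))
    return survivors

if __name__ == "__main__":
    flows = [{"dst_port": 80, "bytes": 1200} for _ in range(10000)]
    flows.append({"dst_port": 6667, "bytes": 90})  # one rare flow
    print(len(late_adaptive_sample(flows)), "flows survived")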
