Abstract

The Internet has become an increasingly attractive location for collecting data about cyber threats, driven by the abundance of quality data available and accessible online. As such, researchers and practitioners have turned to automated data collection technologies (ADCT), including 'web crawlers' and 'web scrapers', to study these threats. ADCT have proliferated rapidly, but guidance on their ethical and legal operation has been slow to adapt, and no clear guidelines regulate their use for research. This article identifies the relevant ethical and legal frameworks guiding the deployment of ADCT in Australia for cybersecurity research. This is accomplished through a systematic review of research within this context, coupled with ethical and jurisprudential analysis. We argue that the use of ADCT can be both ethical and legal, but only where mitigating measures are implemented. We provide a series of practical directions to guide researchers and practitioners when navigating this novel terrain.

