Abstract

Fog computing promises improved service scalability and lower latency for IoT systems. The concept bridges the gap between placing full computing capabilities at the network's edge and relying on the centrally located processing infrastructure of cloud systems: the former demands high power at the edge nodes, while the latter incurs high latency as data travels from the edge to the cloud and back. One challenge for a digital forensic investigator examining a fog system is the number of possible data locations, since the node that acts as the server processing the data can be selected from among several nodes in the network. An investigator typically has limited resources for an investigation; the more possible evidence locations there are, the more resources are required to collect and examine them. Triage is thus needed to prioritize the collection and examination of evidence. This work analyzes measures that can identify which fog nodes are more likely to contain data, and it uses simulations to test the measures' precision and sensitivity. The aim is to help digital forensic investigators maximize the utility of their available investigation resources, so that all relevant evidence is found in time.
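
To make the evaluation criteria concrete, the sketch below shows one way precision and sensitivity (recall) could be computed for a triage measure that flags a subset of fog nodes as likely evidence locations. The function, node identifiers, and sets are illustrative assumptions for this sketch, not the paper's actual simulation setup.

```python
def triage_precision_sensitivity(flagged, containing):
    """Evaluate a node-triage measure.

    flagged:    set of node IDs the measure marks as likely to hold data.
    containing: set of node IDs that actually contain relevant data.

    Returns (precision, sensitivity): the fraction of flagged nodes that
    truly contain data, and the fraction of data-holding nodes the
    measure managed to flag.
    """
    true_positives = len(flagged & containing)
    precision = true_positives / len(flagged) if flagged else 0.0
    sensitivity = true_positives / len(containing) if containing else 0.0
    return precision, sensitivity

# Hypothetical example: the measure flags nodes 1, 3, and 7, while in
# the simulated fog nodes 3, 7, and 9 actually hold relevant data.
precision, sensitivity = triage_precision_sensitivity({1, 3, 7}, {3, 7, 9})
print(f"precision={precision:.2f}, sensitivity={sensitivity:.2f}")
# precision=0.67, sensitivity=0.67
```

Under this framing, high precision keeps the investigator from wasting resources on empty nodes, while high sensitivity ensures that few data-holding nodes are overlooked.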
