Evaluating Tamper Resistance of Digital Forensic Artifacts during Event Reconstruction

Abstract

Event reconstruction is a fundamental part of the digital forensic process, helping to answer key questions like who, what, when, and how. A common way of accomplishing that is to use tools to create timelines, which are then analyzed. However, various challenges exist, such as large volumes of data or contamination. While prior research has focused on simplifying timelines, less attention has been given to tampering, i.e., the deliberate manipulation of evidence, which can lead to errors in interpretation. This article addresses the issue by proposing a framework to assess the relative tamper resistance of different data sources used in event reconstruction. We discuss factors affecting data resilience, introduce a scoring system for evaluation, and illustrate its application with case studies. This work aims to improve the reliability of forensic event reconstruction by considering tamper resistance.
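
To make the idea concrete, the sketch below shows how such a scoring system could be applied to rank data sources: each source is rated against a few resilience factors and the weighted result gives a relative tamper-resistance score. The factors, weights, ratings, and source names are hypothetical illustrations, not values from the article.

    # Minimal sketch of a tamper-resistance scoring system for event-reconstruction
    # data sources. The factors, weights, and 0-5 ratings below are hypothetical
    # illustrations, not values from the article.

    FACTORS = {
        "privileges_required": 0.30,  # higher = harder to modify without elevated rights
        "redundancy":          0.25,  # corroborating copies in other sources
        "integrity_checks":    0.25,  # checksums, signatures, append-only journals
        "tool_obscurity":      0.20,  # lack of readily available editing tools
    }

    # Example ratings per data source on a 0 (weak) to 5 (strong) scale.
    SOURCES = {
        "filesystem_timestamps": {"privileges_required": 1, "redundancy": 2,
                                  "integrity_checks": 0, "tool_obscurity": 1},
        "os_event_log":          {"privileges_required": 3, "redundancy": 3,
                                  "integrity_checks": 2, "tool_obscurity": 3},
        "remote_syslog_server":  {"privileges_required": 4, "redundancy": 4,
                                  "integrity_checks": 3, "tool_obscurity": 4},
    }

    def tamper_resistance_score(ratings: dict) -> float:
        """Weighted average of factor ratings, normalised to 0-1."""
        return sum(FACTORS[f] * ratings[f] for f in FACTORS) / 5.0

    for name, ratings in sorted(SOURCES.items(),
                                key=lambda kv: tamper_resistance_score(kv[1]),
                                reverse=True):
        print(f"{name:24s} {tamper_resistance_score(ratings):.2f}")

In this hypothetical scheme, sources whose manipulation requires elevated privileges, leaves corroborating traces elsewhere, or is protected by integrity checks rank higher, mirroring the kind of relative comparison the proposed framework aims to support.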

Similar Papers
  • Conference Article
  • Citations: 16
  • 10.1145/2345316.2345336
LiDAR data management pipeline; from spatial database population to web-application visualization
  • Jul 1, 2012
  • Paul Lewis + 2 more

While the existence of very large and scalable Database Management Systems (DBMSs) is well recognized, it is the usage and extension of these technologies to managing spatial data that has seen increasing amounts of research work in recent years. A focused area of this research work involves the handling of very high resolution Light Detection and Ranging (LiDAR) data. While LiDAR has many real world applications, it is usually the purview of organizations interested in capturing and monitoring our environment where it has become pervasive. In many of these cases, it has now become the de facto minimum standard expected when a need to acquire very detailed 3D spatial data is required. However, significant challenges exist when working with these data sources, from data storage to feature extraction through to data segmentation all presenting challenges relating to the very large volumes of data that exist. In this paper, we present the complete LiDAR data pipeline as managed in our spatial database framework. This involves three distinct sections, populating the database, building a spatial hierarchy that describes the available data sources, and spatially segmenting data based on user requirements which generates a visualization of these data in a WebGL enabled web-application viewer. All work presented is in an experimental results context where we show how this approach is runtime efficient given the very large volumes of LiDAR data that are being managed.

  • Conference Article
  • Citations: 20
  • 10.1109/issa.2015.7335050
Adding event reconstruction to a Cloud Forensic Readiness model
  • Aug 1, 2015
  • Victor R Kebande + 1 more

During post-event response, proactive forensics is of critical importance to any organisation conducting digital forensic investigations in cloud environments. However, there exist no reliable event reconstruction processes in the cloud that can help in the analysis and examination of Digital Evidence (DE) aspects during the Digital Forensic Readiness (DFR) process, as defined in the ISO/IEC 27043:2015 standard. The problem that this paper addresses is the lack of an easy way of performing the digital event reconstruction process when the cloud is forensically ready, in preparation for a Digital Forensic Investigation (DFI). During DFR approaches, event reconstruction helps in the examination and pre-analysis of the characteristics of potential security incidents. As a result, the authors have proposed an Enhanced Cloud Forensic Readiness (ECFR) process model with an event reconstruction process that can support future investigative technologies with a degree of certainty. We also propose an algorithm describing the methodology used to reconstruct events in the ECFR. The main focus of this work is to examine the addition of event reconstruction to the initially proposed Cloud Forensic Readiness (CFR) model, providing a more enhanced and detailed cloud forensic readiness model.

  • Conference Article
  • 10.2523/iptc-11240-abstract
Case Study: Integrated Study for Assessing Production Enhancement From a Matured Large Carbonate Reservoir
  • Dec 4, 2007
  • Eisa Al-Maraghi + 1 more

  • Research Article
  • Citations: 5
  • 10.1080/09617353.2016.1252083
Learning from text-based close call data
  • Jul 2, 2016
  • Safety and Reliability
  • Peter Hughes + 2 more

A key feature of big data is the variety of available data sources, which include not just numerical data but also image or video data, or even free text. The GB railways collect a large volume of free-text data daily from railway workers describing close call hazards: instances where an accident could have – but did not – occur. These close call reports contain valuable safety information which could be useful in managing safety on the railway, but which can be lost in the very large volume of data – much larger than is viable for a human analyst to read. This paper describes the application of rudimentary natural language processing (NLP) techniques to uncover safety information from close calls. The analysis has proven that basic information extraction is possible using these rudimentary techniques, but has also identified limitations that arise when using only basic techniques. Using these findings, further research in this area intends to look at how the techniques proven to date can be improved with the use of more advanced NLP techniques coupled with machine learning.
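
The abstract does not detail which rudimentary NLP techniques were used; as a rough, assumption-laden illustration of that level of analysis, the sketch below performs keyword-based extraction over made-up close call reports.

    # Rough illustration of rudimentary keyword-based extraction from free-text
    # close call reports. The hazard lexicon and example reports are made up.

    import re
    from collections import Counter

    HAZARD_TERMS = {
        "trip": ["trip", "tripped", "uneven surface"],
        "electrical": ["cable", "live wire", "electrocution"],
        "near_miss_train": ["near miss", "train approaching", "on the track"],
    }

    reports = [
        "Worker tripped over an uneven surface near the platform edge.",
        "Crew reported a near miss with a train approaching on the adjacent line.",
    ]

    counts = Counter()
    for report in reports:
        text = report.lower()
        for category, terms in HAZARD_TERMS.items():
            if any(re.search(r"\b" + re.escape(term) + r"\b", text) for term in terms):
                counts[category] += 1

    print(counts.most_common())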

  • Conference Article
  • Citations: 3
  • 10.1109/rtc.2012.6418366
The ALICE high level trigger: The 2011 run experience
  • Jun 1, 2012
  • Thorsten Kollegger

The High Level Trigger (HLT) of the ALICE detector system, one of the four big experiments at the Large Hadron Collider (LHC) at CERN, is a dedicated real time system for online event reconstruction and selection. Its main task is to reduce the large volume of raw data of up to 25 GB/s read out from the detector systems by an order of magnitude to fit within the available data acquisition bandwidth. A dedicated computing cluster of 225 processing nodes, connected by an Infiniband high-speed network, is in operation to provide the necessary computing resources for this task. The available computing power is supplemented by utilizing FPGAs for the first steps of the processing, as well as 64 GPUs which are used at later stages of the event reconstruction. During the 2011 LHC heavy-ion run, the HLT was for the first time actively used to reduce the data volume. For this the raw data of the Time Projection Chamber, the largest data source in ALICE, was replaced by the results of the online FPGA based cluster finder. A further reduction of the data volume by roughly a factor 4 was achieved by optimizing the data format for a subsequent standard Huffman compression. For this, entropy reducing data transformations have been implemented. In this contribution, we will present the experience gained during the 2011 run, both on the technical and operational levels of the system, as well as from a physics performance point of view. Building on the success of the 2011 run, possibilities for even more advanced uses of online reconstruction results in the future will be discussed as well.
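
The ALICE data format itself is not described here; the sketch below only illustrates, with a made-up signal, why an entropy-reducing transform (delta encoding is used as a stand-in) shrinks the output of a subsequent standard Huffman coding step.

    # Generic illustration of an entropy-reducing transform (here, delta encoding)
    # followed by standard Huffman coding. This is not the ALICE HLT data format;
    # the signal below is made up purely to show the effect of the transform.

    import heapq
    from collections import Counter

    def delta_encode(samples):
        """Replace each sample by its difference from the previous one."""
        prev, deltas = 0, []
        for s in samples:
            deltas.append(s - prev)
            prev = s
        return deltas

    def huffman_total_bits(symbols):
        """Total size in bits of an optimal Huffman encoding of `symbols`."""
        freq = Counter(symbols)
        if len(freq) <= 1:
            return len(symbols)          # degenerate case: one distinct symbol
        heap = list(freq.values())
        heapq.heapify(heap)
        total = 0
        while len(heap) > 1:             # each merge adds its weight to the total
            a, b = heapq.heappop(heap), heapq.heappop(heap)
            total += a + b
            heapq.heappush(heap, a + b)
        return total

    signal = [1000 + i + (i % 3) for i in range(10_000)]   # smooth, slowly varying samples
    print("raw   :", huffman_total_bits(signal), "bits")
    print("delta :", huffman_total_bits(delta_encode(signal)), "bits")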

  • Research Article
  • Citations: 19
  • 10.1016/j.future.2017.02.040
A runtime estimation framework for ALICE
  • Feb 28, 2017
  • Future Generation Computer Systems
  • Sarunya Pumma + 4 more

  • Research Article
  • Citations: 21
  • 10.1016/j.diin.2019.07.006
A formal model for event reconstruction in digital forensic investigation
  • Aug 13, 2019
  • Digital Investigation
  • Somayeh Soltani + 1 more

  • Research Article
  • Citations: 4
  • 10.1016/j.fsidi.2023.301624
Post-mortem digital forensic analysis of the Garmin Connect application for Android
  • Sep 18, 2023
  • Forensic Science International: Digital Investigation
  • Fabian Nunes + 2 more

  • Conference Article
  • Citations: 2
  • 10.1109/ieem44572.2019.8978767
Efficient Compression and Preprocessing for Facilitating Large Scale Spatiotemporal Data Mining - A Case Study based on Automatic Identification System Data
  • Dec 1, 2019
  • Hai-Yan Xu + 7 more

The large-scale deployment of sensors, Global Positioning System (GPS) receivers and other mobile devices generates large volumes of spatiotemporal data, which facilitates the understanding of objects' movement trajectories and activities. However, it is very challenging to store, transfer and load such a large volume of data into system memory for processing and analysis. In this study, we look into a case study that processes large-scale Automatic Identification System (AIS) data in the maritime sector, and propose a computational framework to efficiently compress, transfer and acquire the necessary information for further data analysis. The framework is composed of two parts: the first is a lossless compression algorithm that compresses the AIS data into binary form for efficient storage, speedy loading and easy transfer across networks and systems within the organization; the second is an aggregation algorithm which derives movement and activity information of vessels grouped by grid and/or time window from the compressed binary files, thereby improving data accessibility and reducing storage demand. The proposed framework has been applied to extract vessel movement information within Singapore port with a high compression ratio and fast access speed, and it can be broadly applied to other data processing applications.
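
The paper's binary format is not reproduced here; the sketch below illustrates only the second part, aggregating position reports by grid cell and time window, with made-up field names, cell size, and records.

    # Minimal sketch of the aggregation step: grouping vessel position reports by
    # spatial grid cell and time window. Field names, cell size, and window length
    # are assumptions for illustration; the paper's binary format is not shown.

    from collections import defaultdict

    CELL_DEG = 0.01          # grid cell size in degrees (assumed)
    WINDOW_S = 3600          # time window in seconds (assumed)

    def grid_key(lat, lon, ts):
        return (int(lat // CELL_DEG), int(lon // CELL_DEG), int(ts // WINDOW_S))

    # (mmsi, latitude, longitude, unix_timestamp, speed_knots) -- made-up records
    reports = [
        (563000001, 1.2644, 103.8220, 1_700_000_100, 9.5),
        (563000001, 1.2660, 103.8302, 1_700_000_900, 10.1),
        (477000002, 1.2710, 103.8450, 1_700_003_700, 0.2),
    ]

    cells = defaultdict(lambda: {"vessels": set(), "reports": 0, "speed_sum": 0.0})
    for mmsi, lat, lon, ts, speed in reports:
        cell = cells[grid_key(lat, lon, ts)]
        cell["vessels"].add(mmsi)
        cell["reports"] += 1
        cell["speed_sum"] += speed

    for key, c in cells.items():
        print(key, len(c["vessels"]), "vessels,", c["reports"], "reports, mean speed",
              round(c["speed_sum"] / c["reports"], 1), "kn")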

  • Research Article
  • Citations: 3
  • 10.1007/s10844-019-00544-1
Incremental entity resolution process over query results for data integration systems
  • Jan 29, 2019
  • Journal of Intelligent Information Systems
  • Priscilla Kelly Machado Vieira + 2 more

Entity Resolution (ER) in data integration systems is the problem of identifying groups of tuples from one or multiple data sources that represent the same real-world entity. This is a crucial stage of data integration processes, which often need to integrate data at query-time. This task becomes even more challenging in scenarios with dynamic data sources or when a large volume of data needs to be integrated. Then, to deal with large volumes of data, new ER solutions have been proposed. One possible approach consists in performing the ER process over query results rather than in the whole set of tuples being integrated. Additionally, previous results of ER tasks can be reused in order to reduce the number of comparisons between pairs of tuples at query-time. In a similar way, indexing techniques can also be employed to help the identification of equivalent tuples and to reduce the number of comparisons between pairs of tuples. In this context, this work proposes an incremental ER process over query results. The contributions of this work are the specification, the implementation and the evaluation of the proposed incremental process. We performed some experiments and we concluded that the incremental ER at query-time is more efficient than traditional ER processes.
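
As a minimal sketch of the reuse idea, caching pairwise match decisions so that later queries skip comparisons already made, the fragment below uses an illustrative string-similarity matcher and threshold; neither is taken from the paper.

    # Minimal sketch of incremental entity resolution over query results: pairwise
    # match decisions are cached so later queries reuse earlier comparisons.
    # The similarity function and threshold are illustrative assumptions.

    from difflib import SequenceMatcher
    from itertools import combinations

    match_cache = {}   # (id_a, id_b) -> bool, persisted across queries in practice

    def similar(a, b, threshold=0.85):
        return SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio() >= threshold

    def resolve(query_results):
        """Return groups of tuples judged to represent the same real-world entity."""
        pairs = []
        for a, b in combinations(query_results, 2):
            key = tuple(sorted((a["id"], b["id"])))
            if key not in match_cache:              # compare only unseen pairs
                match_cache[key] = similar(a, b)
            if match_cache[key]:
                pairs.append(key)
        # naive union-find over matching pairs
        parent = {t["id"]: t["id"] for t in query_results}
        def find(x):
            while parent[x] != x:
                x = parent[x]
            return x
        for x, y in pairs:
            parent[find(x)] = find(y)
        groups = {}
        for t in query_results:
            groups.setdefault(find(t["id"]), []).append(t["name"])
        return list(groups.values())

    print(resolve([{"id": 1, "name": "ACME Corp."},
                   {"id": 2, "name": "Acme Corp"},
                   {"id": 3, "name": "Globex"}]))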

  • Research Article
  • Citations: 23
  • 10.1016/j.diin.2006.06.013
An empirical study of automatic event reconstruction systems
  • Jul 12, 2006
  • Digital Investigation
  • Sundararaman Jeyaraman + 1 more

  • Research Article
  • Citations: 2
  • 10.1063/1.3099569
Physicists in forensics
  • Mar 1, 2009
  • Physics Today
  • Toni Feder

From faulty products to murder, physicists help figure out what really happened.

  • Research Article
  • Citations: 25
  • 10.1080/01431161.2017.1303218
Accounting for positional uncertainty in historical shoreline change analysis without ground reference information
  • Apr 3, 2017
  • International Journal of Remote Sensing
  • Phillipe Wernette + 3 more

Systematic shifts in shoreline position are important indicators of environmental change. Shoreline position interpreted from historical aerial imagery is frequently used to assess shoreline change. Although most published studies do not formally consider the effects of source error and interpretation error when analysing shoreline change, the effects of these errors may be significant. This article introduces and evaluates a new uncertainty-aware approach to assessing shoreline change in the presence of positional uncertainty and without ground-reference data, which is typical for historical coastline analysis. The overlapping double buffer (ODB) approach extends the epsilon band model to account for the effects of source and interpretation errors on shoreline position, and assesses the degree of overlap in order to distinguish between significant change and noise. This approach improves upon standard shoreline change analytical techniques that use regularly spaced shore-normal transects to measure the direction and magnitude of shoreline change without accounting for error. Shoreline positions interpreted from historical aerial images for four sites along the Michigan (USA) coast between 1938 and 2010 were used to demonstrate the feasibility of the ODB approach. An epsilon band was constructed around each shoreline with radius equal to the combined source and interpretation error for each image. These bands were merged and intersected to test whether the observed change was real or an artefact caused by uncertainty in the data sources. The most significant advantage of the ODB approach is its ability to account for uncertainty in shoreline position where ground-reference information does not exist, as is often the case with historical aerial imagery. Results indicate that the overlapping epsilon band method is viable for analysing change in linear features in the absence of a higher-accuracy reference dataset, and that source error contributed more to the positional uncertainty than did interpretation error.
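
The ODB method itself operates on buffer polygons; reduced to a single shore-normal transect, the decision rule amounts to comparing the observed displacement against the combined error radii, as in the sketch below (distances and errors are made up).

    # Simplified, per-transect view of the overlapping-buffer idea: observed change
    # between two shoreline positions is treated as significant only if their
    # epsilon bands (position +/- combined source and interpretation error) do not
    # overlap. Distances and error radii below are made-up illustrations.

    def significant_change(pos_1938_m, err_1938_m, pos_2010_m, err_2010_m):
        """True if the two uncertainty bands along a transect do not overlap."""
        gap = abs(pos_2010_m - pos_1938_m)
        return gap > (err_1938_m + err_2010_m)

    transects = [
        # (position 1938, error 1938, position 2010, error 2010), in metres
        (120.0, 6.5, 104.0, 3.0),   # 16 m change, 9.5 m combined error -> significant
        (120.0, 6.5, 114.0, 3.0),   # 6 m change within combined error  -> noise
    ]

    for t in transects:
        print(t, "->", "change" if significant_change(*t) else "within uncertainty")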

  • Conference Article
  • Citations: 3
  • 10.1145/1982185.1982230
Towards designing a tool for event reconstruction using Gladyshev Approach
  • Mar 21, 2011
  • Merin Sebastian + 1 more

Event reconstruction is the process that explains the 'how' and 'why' of evidence in a digital investigation. Formalization of this process is essential because the evidence has legal value, which requires logical reasoning. A finite state machine approach and a formalization of the event reconstruction problem were proposed by Gladyshev. The authors present their model for a tool for event reconstruction built upon Gladyshev's formalization. The model is formulated by analyzing a case study of hidden/deleted files, and forms the first step towards the design of a generic tool for event reconstruction.
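
In the spirit of Gladyshev's finite state machine formalization, the toy sketch below enumerates event sequences whose final state matches the observed evidence; the states and transitions are invented for illustration and are not the paper's model.

    # Toy illustration of finite-state-machine event reconstruction in the spirit
    # of Gladyshev's formalization: enumerate event sequences whose final state
    # matches the observed evidence. The states and transitions below are made up.

    from itertools import product

    TRANSITIONS = {                      # (state, event) -> next state
        ("absent", "create"): "present",
        ("present", "hide"): "hidden",
        ("hidden", "unhide"): "present",
        ("present", "delete"): "absent",
        ("hidden", "delete"): "absent",
    }
    EVENTS = {event for _, event in TRANSITIONS}

    def runs_ending_in(observed_state, initial_state="absent", max_len=3):
        """All event sequences up to max_len taking initial_state to observed_state."""
        for length in range(max_len + 1):
            for seq in product(sorted(EVENTS), repeat=length):
                state, ok = initial_state, True
                for event in seq:
                    state = TRANSITIONS.get((state, event))
                    if state is None:
                        ok = False
                        break
                if ok and state == observed_state:
                    yield seq

    # Evidence: the file is currently hidden. Which histories explain that?
    for run in runs_ending_in("hidden"):
        print(run)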

  • Research Article
  • Citations: 1
  • 10.1016/j.fsidi.2024.301759
Was the clock correct? Exploring timestamp interpretation through time anchors for digital forensic event reconstruction
  • Jul 1, 2024
  • Forensic Science International: Digital Investigation
  • Céline Vanini + 3 more
