Effects of chemical & biological warfare agent decontaminants on trace survival: Impact on digital media.


Similar Papers
  • Conference Article
  • Cited by 2
  • 10.1109/asiancon55314.2022.9909100
Secured Data Integrity Scheme for Internet of Things
  • Aug 26, 2022
  • Poornima M Chanal + 1 more

The provision of security for information in an IoT network is a major challenge that must be given top priority in many current and future IoT applications. Traditional data integrity verification approaches secure information with encryption algorithms alone and depend on Third-Party Auditors (TPAs). The blockchain’s basic principle is that information produced by users or nodes is tested for accuracy and cannot be altered once it is recorded on the blockchain, so blockchain-based data integrity schemes can effectively overcome the TPA’s issues. In this paper, we propose a blockchain-based data integrity technique with a bilinear design for IoT information, achieving data integrity through the characteristics of the bilinear design in the form of blockchain communications. The proposed blockchain-based framework for data integrity verification consists of several entities: client, Key Generation Centre (KGC), cloud storage server, and blockchain. The proposed data integrity verification operates in three stages: a Setup stage, a Processing stage, and a Verification stage. We analysed the performance of the proposed scheme using parameters such as end-to-end delay, memory utilization, and accuracy rate.
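
Pairing-based bilinear tags require a dedicated cryptographic library, but the blockchain side of such a scheme can be sketched with standard hashing alone. The following minimal Python sketch is an illustration, not the paper's construction: all names are hypothetical, and plain SHA-256 digests stand in for the bilinear tags. A client publishes a data digest to an append-only hash chain and later runs the verification stage:

```python
import hashlib
import json

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

class IntegrityLedger:
    """Append-only hash chain standing in for the blockchain: each record
    commits to the data digest and to the previous record's hash."""

    def __init__(self):
        self.chain = []

    def publish(self, client_id: str, data: bytes) -> int:
        """Processing stage: record the digest of the client's data."""
        prev = self.chain[-1]["record_hash"] if self.chain else "0" * 64
        record = {"client": client_id, "digest": sha256_hex(data), "prev": prev}
        record["record_hash"] = sha256_hex(json.dumps(record, sort_keys=True).encode())
        self.chain.append(record)
        return len(self.chain) - 1

    def verify(self, index: int, data: bytes) -> bool:
        """Verification stage: recompute the digest and check the chain link."""
        record = self.chain[index]
        if sha256_hex(data) != record["digest"]:
            return False
        prev = self.chain[index - 1]["record_hash"] if index else "0" * 64
        return record["prev"] == prev
```

Because each record commits to its predecessor, altering an earlier digest invalidates every later link, which is the tamper-evidence property the abstract relies on.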

  • Book Chapter
  • 10.1007/978-981-15-5859-7_35
Evaluation of Confidentiality and Data Integrity Based on User Session for IoT Environment
  • Oct 1, 2020
  • Alwi Bamhdi + 4 more

Confidentiality and data integrity are two important security services provided by the most popular VPN protocols, such as IPsec and SSL/TLS. It is well known that security comes at the cost of performance, and that performance is affected by the cryptographic algorithms and their execution. In this study, confidentiality and data integrity processes were implemented as client-server Java applications in a real LAN to assess their performance under different algorithms. The execution time of data encryption, data decryption, and data integrity verification was measured, and total session times were computed for the encryption algorithms AES, Blowfish, 3DES, and RC2 and the digest algorithms MD5 and SHA-1. The results were analysed, and the performance of these encryption algorithms and digest ciphers was compared, interpreted, and outlined in terms of total session time and the individual security operations (encryption, decryption, and data integrity verification).
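
The digest side of such a benchmark can be reproduced with the Python standard library alone (AES, Blowfish, 3DES, and RC2 would need a third-party cryptography package). A minimal timing harness, with all names hypothetical and timings machine-dependent:

```python
import hashlib
import time

def time_digest(name: str, payload: bytes, rounds: int = 1000) -> float:
    """Return the average wall-clock seconds per digest computation
    for one hashlib algorithm over a fixed payload."""
    start = time.perf_counter()
    for _ in range(rounds):
        hashlib.new(name, payload).digest()
    return (time.perf_counter() - start) / rounds

payload = b"x" * 64 * 1024  # a 64 KiB message, as in a file-transfer session
for algo in ("md5", "sha1", "sha256"):
    print(f"{algo}: {time_digest(algo, payload) * 1e6:.1f} microseconds per digest")
```

Averaging over many rounds and using `perf_counter` (a monotonic high-resolution clock) is the usual way to make such per-operation measurements stable.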

  • Conference Article
  • Cited by 307
  • 10.1109/icws.2017.54
Blockchain Based Data Integrity Service Framework for IoT Data
  • Jun 1, 2017
  • Bin Liu + 4 more

It is a challenge to ensure data integrity for cloud-based Internet of Things (IoT) applications because of the inherently dynamic nature of IoT data. The available frameworks for data integrity verification with public auditability cannot avoid Third-Party Auditors (TPAs), yet in a dynamic environment such as the IoT, the reliability of TPA-based frameworks is far from satisfactory. In this paper, we propose a blockchain-based framework for a Data Integrity Service. Under this framework, more reliable data integrity verification can be provided for both Data Owners and Data Consumers without relying on any Third-Party Auditor (TPA). The relevant protocols and a prototype system, implemented to evaluate the feasibility of our proposals, are presented; the performance of the prototype is evaluated and the test results are discussed. The work lays a foundation for our future work on dynamic data integrity verification in a fully decentralized environment.

  • Research Article
  • Cited by 10
  • 10.1016/j.jksuci.2022.07.015
DCIV: Decentralized cross-chain data integrity verification with blockchain
  • Jul 20, 2022
  • Journal of King Saud University - Computer and Information Sciences
  • Jiajia Jiang + 5 more


  • Conference Article
  • 10.1109/iccic.2016.7919623
DVHT: Enabling the efficient data verification using homomorphic authenticable tags
  • Dec 1, 2016
  • G Kalpana + 2 more

Cloud computing has been viewed as the natural solution to the rising storage and service costs of Information Technology applications. Outsourcing data to cloud storage servers eases the client's burden by providing low-cost, secure, location-independent platforms and scalable resources. With the proliferation of cloud technologies, outsourced data moves among nodes, and clients, unaware of the storage location, no longer hold physical possession of their data. Data privacy and integrity are therefore formidable concerns for data owners, and verification of data integrity is vital: an efficient auditing scheme is required to assure data owners of the safety and accuracy of their data. In this paper, we propose a zero-knowledge-proof interactive system for data integrity verification in the cloud. Our method is strengthened by homomorphic encryption tags, generated over encrypted data with zero knowledge revealed to the interacting parties. We place the TPA (Third-Party Auditor) inside the cloud as an audit agent acting on behalf of the client, reducing communication cost. The scheme drastically reduces input and output costs by generating a probabilistic proof of possession from sampled sets of data blocks on the server, and the client maintains only file metadata for verification, which greatly reduces network communication cost. Our theoretical analysis and experimental results show the efficiency of the scheme for integrity proofs.
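
The sampling idea can be illustrated without the homomorphic machinery. In this hypothetical Python sketch, plain SHA-256 tags stand in for the paper's homomorphic authenticable tags, and the client challenges only a random subset of blocks, trading certainty of detection for bandwidth:

```python
import hashlib
import random

def block_tags(blocks):
    """Client-side metadata: one SHA-256 tag per data block
    (standing in for homomorphic authenticable tags)."""
    return [hashlib.sha256(b).digest() for b in blocks]

def challenge(server_blocks, tags, sample_size, seed=None):
    """Probabilistic proof of possession: verify a random sample of
    blocks instead of the whole file. A corrupted block is caught with
    probability that grows with the sample size."""
    rng = random.Random(seed)
    indices = rng.sample(range(len(tags)), sample_size)
    return all(
        hashlib.sha256(server_blocks[i]).digest() == tags[i] for i in indices
    )
```

With real homomorphic tags the server can combine the sampled blocks into one short response instead of returning them all; the per-block hashing here only captures the sampling logic.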

  • Research Article
  • Cited by 53
  • 10.1016/j.jpdc.2020.06.007
Blockchain-based verification framework for data integrity in edge-cloud storage
  • Jun 22, 2020
  • Journal of Parallel and Distributed Computing
  • Dongdong Yue + 4 more


  • Research Article
  • Cited by 24
  • 10.1007/s11280-019-00761-2
Outsourced data integrity verification based on blockchain in untrusted environment
  • Mar 4, 2020
  • World Wide Web
  • Kun Hao + 3 more

Outsourced data, a significant component of cloud services, has been widely used for its convenience, low overhead, and high flexibility. To guarantee the integrity of outsourced data, the data owner (DO) usually engages a third-party auditor (TPA) to execute the data integrity verification scheme. However, the DO cannot fully confirm the reliability of the TPA, and handing verification over to an untrusted TPA may lead to data security threats. In this paper, we focus on integrity verification of outsourced data in an untrusted environment, that is, how to improve the security and efficiency of data integrity verification without relying on an untrusted TPA. To address the problem, we design a decentralized model based on blockchain, consisting of collaborative verification peers (VPs), each of which maintains a replica of the entire blockchain to prevent malicious tampering. Based on this model, we present a data integrity verification algorithm that allows the DO to store and check verification information by writing to and retrieving from the blockchain. In addition, to improve concurrent performance, we extend the algorithm by introducing verification groups (VGs) composed of VPs organized by Inner-Group and Inter-Group consensus protocols. We conduct a complete security analysis as well as extensive experiments, and the evaluation results demonstrate that our approaches achieve superior performance.

  • Research Article
  • 10.52783/jisem.v10i18s.2939
SpinalSAENet: An Intelligent Intrusion Detection and Data Integrity Framework for Cloud Environments
  • Mar 11, 2025
  • Journal of Information Systems Engineering and Management
  • N. Savitha, E. Saikiran

The swift growth of cloud computing has heightened cybersecurity vulnerabilities, demanding robust intrusion detection systems (IDS). Conventional IDS models face challenges, such as excessive false positives and limited flexibility. This study introduces Spinal Stacked AutoEncoder Net (SpinalSAENet), an innovative hybrid deep-learning-based IDS that merges SpinalNet and Deep Stacked AutoEncoders (DSAE) to enhance anomaly detection and data integrity verification. The system employs feature extraction and Chebyshev distance-based fusion to improve classification, while Principal Component Analysis (PCA) is utilised to reduce dimensionality, thereby increasing computational efficiency. When tested on the Bot-IoT dataset, SpinalSAENet demonstrated superior performance with 96.87% accuracy, 95.4% recall, 96.1% precision, and a 95.7% F1-score, surpassing Decision Trees, Random Forests, and Support Vector Machines. The incorporation of SHA-256 hashing and Merkle tree proofs ensures data integrity, offering a multitiered security approach. Its streamlined architecture and cloud-native scalability (Docker and Kubernetes) facilitate real-time deployment in cloud environments. This paper presents a highly precise and scalable IDS framework capable of real-time intrusion detection and data integrity verification. Subsequent research will investigate the resistance to adversarial attacks, explainable AI, and serverless deployment to further enhance cloud security.
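
The SHA-256 and Merkle-tree ingredients mentioned above are standard building blocks; the following is a minimal, illustrative Python sketch (not the paper's implementation, and with hypothetical helper names) of building a Merkle root and checking an inclusion proof:

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def _next_level(level):
    """Pair adjacent nodes; an unpaired last node is promoted unchanged."""
    nxt = [h(level[i] + level[i + 1]) for i in range(0, len(level) - 1, 2)]
    if len(level) % 2:
        nxt.append(level[-1])
    return nxt

def merkle_root(leaves):
    """Root hash committing to every leaf."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        level = _next_level(level)
    return level[0]

def merkle_proof(leaves, index):
    """Sibling hashes, each tagged with its side, proving one leaf's
    inclusion under the root in O(log n) hashes."""
    level = [h(leaf) for leaf in leaves]
    proof = []
    while len(level) > 1:
        if index % 2 == 0 and index + 1 < len(level):
            proof.append(("R", level[index + 1]))
        elif index % 2 == 1:
            proof.append(("L", level[index - 1]))
        level = _next_level(level)
        index //= 2
    return proof

def merkle_verify(leaf, proof, root):
    """Recompute the path from leaf to root using the proof's siblings."""
    node = h(leaf)
    for side, sibling in proof:
        node = h(sibling + node) if side == "L" else h(node + sibling)
    return node == root
```

The appeal for integrity frameworks is that a verifier needs only the root (a single hash) to check any record, which is what makes the "Merkle tree proofs" in the abstract a lightweight, multitiered check.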

  • Conference Article
  • Cited by 4
  • 10.1109/trustcom56396.2022.00119
Data Integrity Verification Scheme Based on Blockchain Smart Contract
  • Dec 1, 2022
  • Kai Zhang + 2 more

In cloud computing environments, users upload data to the cloud server and verify its integrity through a third-party auditor (TPA). However, verifying data integrity is still a computationally intensive and time-consuming operation, and the presence of illegal users or unreliable cloud servers is discovered only from the verification result, wasting computation and time. To solve these problems, the RSA algorithm is used to verify the legitimacy of the user and, once that check passes, a Merkle hash tree is used to filter out unreliable servers. To prevent replay attacks, data integrity is verified through a bilinear mapping. Finally, simulation results show that the scheme can detect the legitimacy of users, filter out unreliable servers, and effectively reduce the computational and time overhead of verifying data integrity.

  • Research Article
  • Cited by 8
  • 10.1080/23311916.2019.1654694
Verification of data integrity and co-operative loss recovery for secure data storage in cloud computing
  • Jan 1, 2019
  • Cogent Engineering
  • Paul R Rejin + 1 more

In cloud computing, data stored on external servers may be tampered with or deleted by unauthorized persons or by selfish Cloud Service Providers (CSPs). Hence, Cloud Data Owners (CDOs) must be assured of the integrity and correctness of the stored data. In this paper, a Verification of Data Integrity and Co-operative Loss Recovery technique for secure data storage in cloud computing is proposed. In this technique, a ciphertext file is split into cipher blocks and distributed to randomly selected CSPs by the CDO. When a cloud data user (CDU) wants to access a file, the corresponding ciphertext is reconstructed from the blocks and downloaded; it can be decrypted if the user's attribute set matches the access policy of the application. Simulation results show that the proposed technique enhances data integrity and confidentiality.
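
The block-splitting and reconstruction step can be sketched in a few lines of Python. The helper names and the pseudo-random placement below are illustrative assumptions, not the paper's protocol:

```python
import random

def split_blocks(ciphertext: bytes, block_size: int):
    """Split a ciphertext file into fixed-size cipher blocks
    (the last block may be shorter)."""
    return [ciphertext[i:i + block_size]
            for i in range(0, len(ciphertext), block_size)]

def distribute(blocks, csp_ids, seed=7):
    """Assign each block to a (pseudo-)randomly chosen CSP, keeping an
    index so the CDO can later locate every block for reconstruction."""
    rng = random.Random(seed)
    return {i: (rng.choice(csp_ids), block) for i, block in enumerate(blocks)}

def reconstruct(placement):
    """Reassemble the ciphertext by concatenating blocks in index order."""
    return b"".join(placement[i][1] for i in sorted(placement))
```

Scattering blocks across providers means no single CSP holds the whole ciphertext, which is the confidentiality argument the abstract makes; decryption under the attribute-based access policy would happen after reconstruction.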

  • Research Article
  • Cited by 3
  • 10.30574/ijsra.2024.12.1.1076
Block-chain based data provenance and integrity verification
  • Jun 30, 2024
  • International Journal of Science and Research Archive
  • Vaghani Divyeshkumar

Blockchain applications face major scalability challenges, hindering their ability to support services with large-scale and frequent transactions, such as the computational and communication overhead involved in integrity verification for large-scale IoT data. To tackle this problem, we propose a Blockchain and Bilinear mapping-based Data Integrity Scheme (BB-DIS) tailored for large-scale IoT data in cloud storage. The paper introduces a blockchain-based framework for integrity verification of large-scale IoT data, including a series of protocols, verification algorithms, and a detailed performance analysis; develops a prototype system incorporating an edge computing processor near the IoT devices to preprocess the data and significantly reduce communication costs and computational burden; and performs multiple simulation experiments on Hyperledger Fabric to compare the computational and communication overhead of BB-DIS with other baseline schemes. Experimental results demonstrate that the proposed BB-DIS surpasses existing blockchain-based methods in computational cost and communication overhead for large-scale IoT data.

  • Research Article
  • Cited by 1
  • 10.3390/s22176496
On the Design and Implementation of the External Data Integrity Tracking and Verification System for Stream Computing System in IoT †
  • Aug 29, 2022
  • Sensors (Basel, Switzerland)
  • Hongyuan Wang + 4 more

Data integrity is a prerequisite for the availability of IoT data and has received extensive attention in the field of IoT big data security. Stream computing systems are widely used in IoT for real-time data acquisition and computation, but the real-time, volatile, bursty, and disordered nature of stream data makes integrity verification difficult; according to our survey, no mature, universal solution exists. To address this, we construct a data integrity verification algorithm for stream computing systems (S-DIV) using a homomorphic message authentication code and the pseudo-random function security assumption. Based on S-DIV, an external data integrity tracking and verification system is built to track and analyse the message data stream in real time. By verifying the integrity of each message over its whole life cycle, data corruption or loss can be found in time, and error alarms and message recovery can be triggered proactively. We then conduct a formal security analysis under the standard model and, finally, implement the S-DIV scheme in a simulation environment. Experimental results show that the scheme can guarantee data integrity in acceptable time without affecting the efficiency of the original system.
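
A homomorphic MAC needs special algebraic structure, but the per-message, whole-life-cycle checking described above can be illustrated with an ordinary HMAC. In this hypothetical Python sketch (names are illustrative; plain HMAC-SHA256 stands in for the homomorphic MAC), sequence numbers make loss and reordering detectable alongside tampering:

```python
import hashlib
import hmac

def tag_message(key: bytes, seq: int, payload: bytes) -> bytes:
    """MAC over (sequence number, payload) so that tampering with either
    the data or its position in the stream invalidates the tag."""
    return hmac.new(key, seq.to_bytes(8, "big") + payload, hashlib.sha256).digest()

def verify_stream(key, tagged_messages):
    """Check every (seq, payload, tag) triple and that sequence numbers
    form an unbroken run, so corruption and loss are flagged in time."""
    expected_seq = 0
    for seq, payload, tag in tagged_messages:
        if seq != expected_seq:
            return False, f"gap or reorder before seq {seq}"
        if not hmac.compare_digest(tag, tag_message(key, seq, payload)):
            return False, f"corrupted message at seq {seq}"
        expected_seq += 1
    return True, "stream intact"
```

`hmac.compare_digest` is used for the tag comparison to avoid timing side channels; the homomorphic variant in the paper would additionally let a verifier check an aggregate of many messages from one combined tag.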

  • Conference Article
  • Cited by 2
  • 10.1109/icices.2016.7518926
A review on public auditing in cloud environment
  • Feb 1, 2016
  • M Thangavel + 5 more

Nowadays, cloud storage lets users remotely store and access their data. Any modification or corruption of that data by unauthorized users leads to an insecure cloud framework, so protecting users' data from unauthorized access or malicious attacks is one of the major research directions for addressing data privacy and integrity in the cloud. To prevent unauthorized access to users' data, even by the Cloud Service Provider (CSP), data integrity verification is performed through a Trusted Third-Party Auditor (TTPA), which must audit the cloud data and ensure its security without knowledge of the actual data stored in the cloud. Many researchers have taken a keen interest in providing a cloud framework that preserves the privacy and ensures the integrity of cloud data. In this paper, various researchers' ideas on privacy-preserving public auditing schemes are analysed as a literature review.

  • Book Chapter
  • 10.1016/b978-0-323-90585-5.00013-8
Chapter 14 - Computationally efficient integrity verification for shared data in cloud storage
  • Jan 1, 2022
  • Edge-of-Things in Personalized Healthcare Support Systems
  • M.B Smithamol + 1 more


  • Research Article
  • Cited by 2
  • 10.1155/2022/4756899
Cloud Data Integrity Verification Algorithm Based on Data Mining and Accounting Informatization
  • Sep 9, 2022
  • Scientific Programming
  • Junli Wang + 2 more

Data integrity verification means that the user can detect any inconsistency in data uploaded to the cloud: beyond the user's own updates, any external factor, including the cloud service provider, that destroys, tampers with, loses, or fails to update the data in a timely manner should be detectable by the user. This article studies cloud data integrity verification algorithms based on data mining and accounting informatization, proposing data mining technology and accounting information as aids to business managers' work. Using Company H as an example, it illustrates an accounting information system and proposes a new information management strategy based on it. The CBF algorithm and a data integrity verification algorithm are used to study a cloud storage data integrity verification protocol: a cloud data integrity verification model is constructed, the data program flow design is analysed, and the time cost of file data insertion is examined. The experiments use 16 standard mathematical calculation formulas to strengthen the analysis. The results show that studying cloud data integrity verification algorithms based on data mining and accounting information benefits the integrity and protection of data: when the number of documents added increases from 0 to 400, the measured values for the protocol show an upward trend, fluctuating roughly between 10 and 80.

More from: Forensic Science International
  • New
  • Research Article
  • 10.1016/s0379-0738(25)00280-4
Editorial Board
  • Nov 1, 2025
  • Forensic Science International

  • New
  • Research Article
  • 10.1016/j.forsciint.2025.112579
A review of the literature on the applications of machine learning in forensic anthropology.
  • Nov 1, 2025
  • Forensic science international
  • Eman Faisal + 1 more

  • New
  • Research Article
  • 10.1016/j.forsciint.2025.112611
Integrated visualization and genetic profiling of latent fingerprints via UVITEX OB and adapted fluorescence microscopy.
  • Nov 1, 2025
  • Forensic science international
  • Sabina Bunescu + 5 more

  • New
  • Research Article
  • 10.1016/j.forsciint.2025.112588
Identification of deceased. Interpol definitions versus police routines in Denmark.
  • Nov 1, 2025
  • Forensic science international
  • Anja Skov + 3 more

  • New
  • Research Article
  • 10.1016/j.forsciint.2025.112581
Investigation of semen deposition time: Spectral analysis using Fourier transform infrared spectroscopy at different temperatures in vivo.
  • Nov 1, 2025
  • Forensic science international
  • Puxu Di + 7 more

  • New
  • Research Article
  • 10.1016/j.forsciint.2025.112708
Using long RNA fragment degradation ratio to estimate the time elapsed since bloodstain deposition
  • Nov 1, 2025
  • Forensic Science International
  • Hiroaki Nakanishi + 4 more

  • New
  • Research Article
  • 10.1016/j.forsciint.2025.112585
Genetic sex prediction from human gut shotgun metagenomic data: An ethical appraisal.
  • Nov 1, 2025
  • Forensic science international
  • Sahid Afrid Mollick

  • New
  • Research Article
  • 10.1016/j.forsciint.2025.112580
Bridging methodological gaps in forensic science: A study of hydrochloric acid and human dentition.
  • Nov 1, 2025
  • Forensic science international
  • Tammy Bracewell + 1 more

  • New
  • Research Article
  • 10.1016/j.forsciint.2025.112587
I just killed someone … where to from here? The environmental context in body disposal of non-organised crime homicide offenders: A geographic profiling perspective.
  • Nov 1, 2025
  • Forensic science international
  • Adam Marsden

  • New
  • Research Article
  • 10.1016/j.forsciint.2025.112586
A methodology for constructing narrative Bayesian networks for the evaluation of forensic fibre evidence given activity level propositions.
  • Nov 1, 2025
  • Forensic science international
  • Victoria Lau + 2 more
