Articles published on Data delivery
3550 Search results
- Research Article
- 10.1016/j.mex.2025.103428
- Dec 1, 2025
- MethodsX
- Seyed Hossein Abrehdari
Leveraging the process mining technique to optimize data preparation time in a database used as an automated data delivery center.
- Research Article
- 10.1038/s41598-025-26056-3
- Nov 24, 2025
- Scientific reports
- Iqra Javid + 6 more
With the introduction of 5G technology, wireless communication is expected to undergo revolutionary changes enabling high-speed connectivity and scalability. Although 5G networks have transformative potential, they also pose new challenges in ensuring the security and integrity of data transfer, especially in Non-IP Data Delivery (NIDD) scenarios. Robust anomaly detection systems become even more critical in this setting to safeguard IoT and other networks that must remain reliable. Anomaly detection has been the subject of much research in network contexts, as it is crucial for identifying hostile activity, system failures, and unusual behavior. The growing dependence on these technologies, particularly with the advent of 5G and its potential to connect nearly everything, makes it imperative to investigate intelligent and efficient techniques that ensure network availability, confidentiality, and integrity. To address botnet infiltration, DDoS attacks, and other intrusions, a novel lightweight intrusion detection model is proposed for 5G and beyond networks, evaluated on the 5G-NIDD dataset. The proposed model is powered by a robust preprocessing pipeline that uses Gini importance for feature selection, and by state-of-the-art classifiers, namely AdaBoost, Easy Ensemble, GRU, 1D-CNN, LSTM, and a hybrid CNN-LSTM, for classification. Two case studies with k-best features are conducted, showcasing the effect of the curse of dimensionality on precision. The model achieved 99.64% accuracy and 0.9830 precision using the 1D-CNN and hybrid LSTM-CNN models.
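The Gini-importance feature selection the abstract describes can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: it scores each feature by the impurity decrease from a single median-threshold split (tree ensembles average many such splits), and the toy traffic features are invented.

```python
# Minimal sketch of Gini-importance feature ranking. Assumption: a single
# median-split per feature approximates the impurity decrease a tree
# ensemble would credit to it; dataset and feature names are illustrative.

def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

def gini_importance(X, y):
    """Impurity decrease from a median split, per feature column."""
    n = len(y)
    parent = gini(y)
    scores = []
    for j in range(len(X[0])):
        col = [row[j] for row in X]
        thr = sorted(col)[n // 2]          # median threshold
        left  = [y[i] for i in range(n) if col[i] <  thr]
        right = [y[i] for i in range(n) if col[i] >= thr]
        child = (len(left) / n) * gini(left) + (len(right) / n) * gini(right)
        scores.append(parent - child)
    return scores

# Toy traffic records: [packet_rate, flow_duration]; label 1 = anomalous.
X = [[900, 0.1], [950, 0.2], [870, 0.1], [10, 5.0], [12, 4.0], [8, 6.0]]
y = [1, 1, 1, 0, 0, 0]
scores = gini_importance(X, y)
top = sorted(range(len(scores)), key=lambda j: -scores[j])[:1]  # keep k-best
```

Keeping only the k highest-scoring features is what drives the "k-best" case studies the abstract mentions.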
- Research Article
- 10.1080/03772063.2025.2568112
- Nov 18, 2025
- IETE Journal of Research
- Venkatesh Sugumar + 3 more
Over the past few years, the application of Wireless Sensor Networks has rapidly increased in the healthcare sector due to their ability to continuously monitor crucial health metrics, including heart rate, blood pressure, and temperature. Traditional approaches have faced challenges in achieving fault tolerance, energy efficiency, and optimal data transmission in healthcare wireless sensor networks. To address these challenges, this study develops an advanced optimization framework. The proposed framework introduces a novel Golden Sine SeaHorse Optimization Boosted Quantum Dirichlet Convolutional Learning model. Golden Sine Optimization, SeaHorse Optimization, and Quantum Dirichlet Convolutional Learning are the advanced optimization techniques incorporated into this framework to address network failures, improve energy efficiency, and optimize data delivery. The Golden Sine Optimization and Quantum Dirichlet Convolutional Learning algorithms select the most economical and energy-efficient routes to balance exploration and exploitation of alternate paths. Network topology and communication paths are dynamically modified through the Quantum Dirichlet Convolutional Learning model. The proposed model achieved a packet delivery ratio of 96%, an energy consumption of 119 millijoules, and a fault recovery time of 45 milliseconds. The proposed framework exhibits significant potential for optimizing healthcare-based wireless sensor networks and fault-tolerant healthcare monitoring systems.
- Research Article
- 10.1177/18333583251389095
- Nov 7, 2025
- Health information management : journal of the Health Information Management Association of Australia
- Gina Helstad + 3 more
The Norwegian Health Archives Registry (NHAR) is a national initiative dedicated to digitising, centralising, and providing access to historical full-text patient health records (PHRs) for research purposes. Established in 2019, NHAR includes PHRs from the deceased population in Norway's specialist healthcare services, offering a unique long-term data source for future research. NHAR has now digitised 1.7 million paper-based PHRs, covering medical history dating back to 1875. The registry is now expanding to include digital-born PHRs. This article describes NHAR's innovation potential as a health registry, its data management processes, and the integration of artificial intelligence (AI) tools to facilitate data management and research in compliance with strict health data regulations. NHAR's data value chain includes structured metadata acquisition, large-scale digitisation and secure data delivery for research. The workflow includes a custom optical character recognition (OCR) tool tailored to Norwegian medical terminology, concept-based search tools for unstructured clinical full text and robust strategies for long-term data management. A novel AI-based de-identification system automatically detects and masks personal identifiers in digitised PHRs. Despite these innovations, challenges persist in processing handwritten and historical PHRs due to OCR limitations and language-specific complexities. Key challenges include improving data quality, enhancing OCR accuracy and refining AI tools for information retrieval, data extraction and de-identification. NHAR offers significant potential for interdisciplinary research across various medical fields.
Implications for health information management practice: NHAR establishes a foundation for secure access to historical health data and introduces advanced data management strategies to facilitate future research.
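The masking step of a de-identification pipeline can be illustrated with a rule-based baseline. NHAR's system is AI-based; this regex sketch only shows what "detect and mask personal identifiers" means mechanically, with the 11-digit Norwegian national ID and a dotted date as assumed example patterns.

```python
# Rule-based masking baseline (illustration only; NHAR uses an AI system).
# Assumed patterns: 11-digit national ID and DD.MM.YYYY dates.
import re

FNR = re.compile(r"\b\d{11}\b")            # Norwegian national ID format
DATE = re.compile(r"\b\d{2}\.\d{2}\.\d{4}\b")

def mask(text: str) -> str:
    """Replace matched identifiers with placeholder tags."""
    text = FNR.sub("[FNR]", text)
    return DATE.sub("[DATO]", text)
```

A learned system generalises this to names, addresses, and free-text identifiers that no fixed pattern can capture.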
- Research Article
- 10.35546/kntu2078-4481.2025.1.2.51
- Nov 5, 2025
- Вісник Херсонського національного технічного університету
- D Rumiantsev
Financial institutions increasingly rely on rapidly processing vast, heterogeneous data streams for effective risk management and regulatory compliance. However, integrating and enriching this data in near real-time presents significant architectural challenges, particularly at scale. This paper details the design, implementation, and impact of a high-availability, microservices-based platform developed to automate data processing for a central Eastern European bank. The primary objective was to create a modular, scalable, and fault-tolerant system capable of managing massive data volumes while ensuring data integrity and low-latency delivery of insights. A case study methodology was employed to document the system's architecture. The platform utilizes an event-driven, asynchronous model built with Java and the Spring Boot framework. Core components include decoupled microservices for data ingestion, normalization, enrichment, and delivery, orchestrated via RabbitMQ message queues. A novel dispatcher service was implemented to manage entity-level concurrency control, preventing race conditions during parallel data processing. The system's performance and health are monitored through an integrated observability stack comprising Prometheus, Grafana, Loki, and Zipkin. The platform successfully processes over 160 million data units daily from over 20 disparate sources. It reduced data processing latency from several days to under 15 minutes and consistently meets a critical service-level agreement (SLA) of delivering search query results in less than 10 seconds. Automation eliminated manual data handling, reduced data quality error rates by approximately 87%, and significantly enhanced the bank’s ability to detect adverse financial events in near real-time. This case study validates the efficacy of a microservices architecture combined with explicit concurrency control for building high-throughput FinTech data platforms. 
The presented design is a practical blueprint for engineering resilient, scalable, and observable systems to address complex data integration challenges in the financial sector and other data-intensive industries.
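The entity-level concurrency control attributed to the dispatcher service can be sketched as per-key serialization: messages for the same entity run strictly one at a time, while different entities proceed in parallel. This is a Python stand-in for the paper's Java/Spring component, with illustrative names.

```python
# Sketch of entity-level concurrency control. Assumption: the dispatcher
# serializes handlers per entity key with one lock per key; names are
# illustrative, not the bank's API.
import threading
from collections import defaultdict

class EntityDispatcher:
    """Serializes handler execution per entity_id using one lock per key."""
    def __init__(self):
        self._locks = defaultdict(threading.Lock)
        self._guard = threading.Lock()      # protects the lock table itself

    def _lock_for(self, entity_id):
        with self._guard:
            return self._locks[entity_id]

    def dispatch(self, entity_id, handler, payload):
        with self._lock_for(entity_id):     # same entity -> strictly serial
            return handler(payload)

# Usage: concurrent enrichment updates to one account stay race-free.
dispatcher = EntityDispatcher()
balance = {"acct-1": 0}

def credit(amount):
    balance["acct-1"] += amount             # read-modify-write needs the lock

threads = [threading.Thread(target=dispatcher.dispatch,
                            args=("acct-1", credit, 1)) for _ in range(100)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

In the production system the same idea is enforced at the message-queue layer, which is what prevents race conditions during parallel processing.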
- Research Article
- 10.1007/s44196-025-01042-9
- Nov 3, 2025
- International Journal of Computational Intelligence Systems
- Savita Jadhav + 4 more
Abstract Traffic management and street lighting optimization increasingly depend on intelligent systems in smart cities. The main objective of this paper is to design Intelligent Traffic and Lighting Systems (ITLS) using the dream optimization algorithm (DOA) for Wireless Sensor Networks (WSNs). The system models single-lane roads, moving vehicles, sensor-equipped streetlights, and a centralized control station. The fitness function of DOA optimizes network performance by balancing energy consumption, reducing congestion, and stabilizing vehicle speed variations. The network adapts to changing traffic conditions, optimizing routes and lighting efficiency. MATLAB simulations show that DOA surpasses traditional rule-based systems by refining traffic flow while reducing energy usage. The performance of the proposed approach, DOA-ITLS, is compared with existing techniques such as IB-SEC and KFFOA-PDES in terms of network lifetime, packet delivery, throughput, and energy efficiency. The protocol enhances packet delivery by 50% and extends network lifetime by effectively delaying node failures. DOA-ITLS proves to be a scalable, robust, and energy-efficient solution for urban traffic and lighting control. By enhancing data delivery and system responsiveness, this framework makes urban mobility more sustainable and efficient.
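A fitness function that "balances energy consumption, reduces congestion, and stabilizes vehicle speed variations" is typically a weighted sum of those three terms. The sketch below is an assumption: the paper does not publish its weights or normalization, so both are illustrative.

```python
# Hedged sketch of a DOA-ITLS-style multi-objective fitness (lower is
# better). Weights and units are illustrative, not taken from the paper.

def fitness(energy_j, queue_len, speeds, w=(0.4, 0.3, 0.3)):
    """Weighted energy use, congestion (queue length), and speed variance."""
    mean = sum(speeds) / len(speeds)
    speed_var = sum((s - mean) ** 2 for s in speeds) / len(speeds)
    return w[0] * energy_j + w[1] * queue_len + w[2] * speed_var
```

An optimizer such as DOA then searches routing and lighting configurations to minimise this score, so smooth traffic at low energy wins over jittery traffic at the same energy.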
- Research Article
- 10.11591/ijres.v14.i3.pp766-784
- Nov 1, 2025
- International Journal of Reconfigurable and Embedded Systems (IJRES)
- Rizki Ananta Dwiyanto + 2 more
This study presents the design and implementation of a REST API, together with a performance analysis, for an Internet of Things (IoT)-based vehicle monitoring system. The system incorporates brake pad sensors, a tire pressure monitoring system (TPMS) for assessing tire pressure and temperature, light detection and ranging (LIDAR) for measuring tire thickness, and radio frequency identification (RFID) for tire identification. Data is gathered by an ESP32 microcontroller and transmitted in real time to the server via a REST API over a wireless network. A JSON Web Token (JWT) authentication mechanism is employed to ensure data security. Testing indicates that the system has an average response time of 4–11 ms, with the fastest response (3.93 ms) recorded for the RFID sensor and the slowest (9.19 ms) for the LIDAR sensor. Load testing with 100 concurrent users demonstrates that the system remains stable with a 100% data delivery success rate. Authentication testing shows that the API is accessible only with a valid token, thereby preventing unauthorized access. These results demonstrate that integrating a REST API with IoT monitoring systems facilitates real-time vehicle monitoring, enhances maintenance efficiency, and offers a viable foundation for future predictive maintenance systems.
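The JWT check that gates the API can be sketched with nothing but the standard library. This is a stdlib HS256 stand-in for illustration (the secret and claims are invented); a real deployment would use a vetted JWT library rather than hand-rolled token code.

```python
# Minimal HS256 JWT sign/verify sketch, stdlib only. Assumption: the system
# uses standard HMAC-signed JWTs; secret and claims are illustrative.
import base64, hashlib, hmac, json

def _b64(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign(claims: dict, secret: bytes) -> str:
    header = _b64(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64(json.dumps(claims).encode())
    sig = hmac.new(secret, f"{header}.{body}".encode(), hashlib.sha256).digest()
    return f"{header}.{body}.{_b64(sig)}"

def verify(token: str, secret: bytes) -> bool:
    """True only if the token's signature matches the shared secret."""
    try:
        header, body, sig = token.split(".")
    except ValueError:
        return False                        # malformed token
    expect = hmac.new(secret, f"{header}.{body}".encode(), hashlib.sha256).digest()
    return hmac.compare_digest(_b64(expect), sig)

token = sign({"sub": "esp32-node-1"}, b"demo-secret")
```

The server simply rejects any request whose `Authorization` token fails `verify`, which is the behaviour the paper's authentication testing exercised.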
- Research Article
- 10.63278/jicrcr.vi.3376
- Oct 28, 2025
- Journal of International Crisis and Risk Communication Research
- Mohamed Rizwan Syed Sulaiman
Rapid growth in the deployment of artificial intelligence applications has exposed inherent shortcomings in traditional cloud computing infrastructures, uncovering performance bottlenecks that reduce the efficacy of deep learning deployments. Data center designs optimized for general-purpose workloads cannot serve the specific needs of neural network inference and training, where computational complexity, memory bandwidth limitations, and communication latency jointly govern system throughput. Purpose-built accelerators with custom tensor processing units have become critical building blocks, providing orders of magnitude more compute than traditional processors through architectural innovations such as systolic array designs and high-bandwidth memory subsystems. Yet computational capability is not enough without commensurate innovation in data pipeline architecture and network infrastructure. Hierarchical storage systems that balance object repositories against parallel file systems provide continuous data delivery to computational clusters, while ring-allreduce communication and optimized interconnect fabrics reduce synchronization overhead in distributed training. The convergence of edge computing with artificial intelligence also raises additional architectural concerns, necessitating hierarchical infrastructures that span cloud facilities, edge servers, and endpoint devices. Peak overall performance requires end-to-end integration across all infrastructure levels, such that dedicated compute resources, high-throughput storage hierarchies, and low-latency networks operate as interconnected elements rather than isolated subsystems. Organizations operating large-scale AI systems should recognize that infrastructure optimization is an ongoing engineering effort rather than a one-time implementation.
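The ring-allreduce communication pattern mentioned above can be illustrated with a single-process simulation. This is a deliberately simplified sketch: one value per chunk, synchronous steps, no overlap; real implementations (e.g. over NCCL or MPI) pipeline chunked sends around the ring.

```python
# Single-process simulation of ring-allreduce on N workers, each holding
# N chunks (here scalars). Phase 1 reduce-scatters partial sums around the
# ring; phase 2 all-gathers the fully reduced chunks.

def ring_allreduce(workers):
    n = len(workers)
    data = [list(w) for w in workers]       # data[w][c]: chunk c at worker w
    # Phase 1: reduce-scatter. After n-1 steps, worker w holds the full
    # sum of chunk (w+1) % n.
    for s in range(n - 1):
        sends = [(w, (w - s) % n, data[w][(w - s) % n]) for w in range(n)]
        for w, c, val in sends:             # apply all "receives" at once
            data[(w + 1) % n][c] += val
    # Phase 2: all-gather the reduced chunks around the same ring.
    for s in range(n - 1):
        sends = [(w, (w + 1 - s) % n, data[w][(w + 1 - s) % n]) for w in range(n)]
        for w, c, val in sends:
            data[(w + 1) % n][c] = val
    return data
```

Each worker sends and receives only 2(n-1)/n of the data size in total, which is why the pattern keeps synchronization overhead nearly constant as the ring grows.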
- Research Article
- 10.63468/sshrr.158
- Oct 25, 2025
- Social Sciences & Humanity Research Review
- Nayyar Bashir + 2 more
Education systems across the globe face unique challenges and opportunities regarding autism spectrum disorder (ASD). In the mixed socio-cultural context of Pakistan, where special education is still in its emergent phase, the delivery of effective data-driven services for children with ASD is a genuine challenge. This paper provides a comprehensive overview of the current state of educational interventions for ASD in Pakistan. It first grounds the discussion in the sociocultural context, including the cultural beliefs and stigma surrounding affected families. It then methodically details both conventional and unconventional therapeutic interventions, covering the relatively new use of ABA and TEACCH programs. Beneath the surface, Pakistan's school system reveals a range of shortfalls in policy, teacher training, and resources. The paper also addresses the key role of the family, including its various burdens and potential for empowerment. Based on these analyses, the paper proposes an overarching plan that includes public awareness, policy transformation, broadened professional development, integration of technology, and expanded inclusive education. Finally, it emphasizes that, while daunting, a multi-level solution involving government intervention, professional expertise, and community participation can help realize an ecosystem in which Pakistani children with ASD can learn, work, live, play, and flourish.
- Research Article
- 10.47604/ajcet.3541
- Oct 17, 2025
- Asian Journal of Computing and Engineering Technology
- Nour Abd + 1 more
Purpose: This article introduces an energy-efficient routing protocol for wireless sensor networks (WSNs) that integrates a dynamic K-means clustering algorithm with Q-learning and adaptive sleep scheduling. The proposed model aims to extend the network's lifetime, reduce energy consumption, and maintain reliable, high data delivery on resource-limited nodes. Methodology: Each sensor node autonomously makes optimal forwarding decisions based on local parameters such as remaining energy, distance, hop count, link quality, and sensory data variation. To increase adaptability, the network periodically re-forms clusters based on node energy and position, while sensor nodes enter sleep mode when no significant data changes are detected, reducing idle communication. Findings: The model was evaluated across 25 scenarios with varying network densities and simulation settings. The best configuration achieved a packet delivery ratio (PDR) of 94.73%, delayed First Node Death (FND) until episode 5080.1, and reduced average energy consumption to as little as 0.0111189 J per episode. Unique Contribution to Theory, Practice, and Policy: Compared to standard protocols such as LEACH and RLBEEP, the proposed method outperforms them on all performance metrics. These results demonstrate the effectiveness of reinforcement learning combined with adaptive clustering and transmission control for achieving durable and intelligent WSN operation.
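The per-node next-hop decision described above is tabular Q-learning at its core. The sketch below is an assumption-laden illustration: the state space, the reward shaping (residual energy minus a distance penalty), and the two candidate hops are invented, but the update rule itself is the standard one.

```python
# Tabular Q-learning sketch for next-hop selection. Assumptions: a toy
# two-action state, and a reward favoring high residual energy and short
# distance to the sink; the paper's exact shaping is not published.

ACTIONS = ["hop_A", "hop_B"]

def q_update(Q, state, action, reward, next_state, alpha=0.5, gamma=0.9):
    """Standard Q-learning backup: Q <- Q + a * (r + g * max Q' - Q)."""
    best_next = max(Q.get((next_state, a), 0.0) for a in ACTIONS)
    old = Q.get((state, action), 0.0)
    Q[(state, action)] = old + alpha * (reward + gamma * best_next - old)

def link_reward(residual_energy, dist_to_sink):
    return residual_energy - 0.1 * dist_to_sink

Q = {}
for _ in range(30):                         # repeated experience with both hops
    q_update(Q, "s0", "hop_A", link_reward(0.9, 1.0), "sink")
    q_update(Q, "s0", "hop_B", link_reward(0.3, 4.0), "sink")
best_hop = max(ACTIONS, key=lambda a: Q[("s0", a)])
```

After enough experience the node greedily forwards via the hop with the higher Q-value, which here is the energy-rich, nearby neighbor.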
- Research Article
- 10.59934/jaiea.v5i1.1740
- Oct 15, 2025
- Journal of Artificial Intelligence and Engineering Applications (JAIEA)
- Muhammad Zulfarhan + 1 more
The Internet of Things (IoT) plays a crucial role in real-time air quality monitoring, yet battery-powered devices face energy constraints that make conventional periodic transmission inefficient. This study proposes the use of the Q-Learning algorithm to optimize adaptive air quality data delivery. A prototype system was built using an ESP32 with MQ-2, MQ-135, DHT22, and INA219 sensors connected to a web-based server. Experimental results showed a decision distribution of 55.6% transmit and 44.4% delay, with the average reward for delay actions (87.44) higher than for transmit actions (54.83). Compared to the periodic method, Q-Learning reduced transmission frequency by 40–50%, lowered energy consumption, and maintained data accuracy. These findings confirm that Q-Learning is effective in designing an energy-efficient, adaptive, and reliable IoT transmission mechanism for air quality monitoring.
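The reported reward asymmetry (delay actions averaging higher reward than transmit actions) follows from a reward that pays for delaying while readings are stable and penalises delaying through a real change. The sketch below is illustrative: the thresholds, costs, and action names are assumptions, not the paper's values.

```python
# Hedged sketch of a transmit/delay reward for adaptive IoT reporting.
# Assumptions: tx_cost models radio energy, miss_penalty models a missed
# air-quality change, and `stable` is an arbitrary change threshold.

def step_reward(action, delta_ppm, tx_cost=0.5, miss_penalty=2.0, stable=0.1):
    if action == "transmit":
        # Useful transmission only if the reading actually changed.
        return 1.0 - tx_cost if delta_ppm > stable else -tx_cost
    # Delay: free while the reading barely changed, costly if we missed one.
    return 1.0 if delta_ppm <= stable else -miss_penalty
```

Trained against such a reward, a Q-learning agent learns to skip transmissions during stable periods, which is exactly the 40–50% frequency reduction the study reports.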
- Research Article
- 10.1038/s41598-025-19748-3
- Oct 15, 2025
- Scientific reports
- Lingaraj K + 5 more
Wireless Sensor Networks (WSNs) are composed of small, cost-effective sensing nodes that are primarily employed for the collection of environmental data. These networks are integral to various applications including industrial pollution monitoring, disaster management, and air quality regulation. However, WSNs encounter significant challenges, such as energy efficiency, end-to-end delay, and packet loss during data transmission. Existing methodologies often fall short in optimizing the network lifespan while ensuring reliable data delivery. To address these limitations, this study introduces FLPSO-AMPS, a novel Fuzzy Logic-based Particle Swarm Optimization (FLPSO) approach aimed at enhancing energy-efficient routing in WSN-based Air Pollution Monitoring Systems (APMS) for Tier-2 smart cities. The proposed approach leverages fuzzy logic principles combined with PSO to intelligently select optimal routing paths, thereby ensuring minimal energy consumption and enhanced network longevity. Unlike conventional methodologies, FLPSO-AMPS incorporates real-time pollutant data collection and mobility-aware optimization to improve network performance. The effectiveness of FLPSO-AMPS was validated through extensive simulations, demonstrating superior performance over existing approaches, particularly with improvements of 10% in energy efficiency, 15% in task delay, 24.5% in packet delivery ratio (PDR), 11.5% in packet loss ratio (PLR), and 20.1% in throughput. These findings underscore the potential of FLPSO-AMPS in establishing an intelligent, resource-efficient air quality monitoring framework for smart cities. Future research will explore security enhancements to safeguard data transmissions in APMS networks.
- Research Article
- 10.1002/dac.70292
- Oct 13, 2025
- International Journal of Communication Systems
- Chao Ma + 2 more
To ensure timely data delivery and quantify the information freshness of unmanned surface vehicles (USVs), age of information (AoI) is adopted as a metric for exploring implementations of a reconfigurable intelligent surface (RIS)-assisted unmanned aerial vehicle (UAV)-USV multi-access edge computing network. In this paper, a set of RIS-carrying UAVs serves USVs via time division multiple access; each RIS is capable of delivering a single reflection per USV within its service duration. The long-term time-averaged AoI (AAoI) minimization problem for USVs is investigated under constraints on USV service duration indicators, the UAV-mounted RIS phase shift vector, the terrestrial base station (TBS) beamforming vector, and UAV trajectories. To solve the formulated problem efficiently, a Lyapunov framework decomposes it into an array of per-slot problems, each of which can be divided into subproblems: the TBS beamforming vector subproblem, the RIS phase shift subproblem, and the joint UAV trajectory and USV service duration indicator subproblem. Each subproblem is then solved iteratively using successive convex approximation, a semidefinite relaxation method, and an enhanced differential evolution algorithm, yielding a feasible solution. The proposed solution reduces USV AAoI by up to 55% while maintaining lower UAV power consumption compared with benchmarks, and it sustains adequate network queue backlogs under typical USV task data sizes.
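The AoI metric at the heart of this formulation has a simple per-slot recursion: age grows by one each slot and resets when a fresh update is delivered. The sketch assumes unit-length slots and a reset to 1 on delivery, which is one common convention; papers differ on the exact reset value.

```python
# Per-slot age-of-information recursion and its time average.
# Assumptions: unit slot length; AoI resets to 1 in a delivery slot.

def aoi_trace(delivered):
    """delivered[t] is True when a USV update reaches the TBS in slot t."""
    age, trace = 0, []
    for ok in delivered:
        age = 1 if ok else age + 1
        trace.append(age)
    return trace

def average_aoi(delivered):
    """Time-averaged AoI, the quantity the AAoI objective minimizes."""
    trace = aoi_trace(delivered)
    return sum(trace) / len(trace)
```

Because age accumulates between deliveries, the average AoI rewards regular, timely updates rather than occasional bursts, which is why trajectory and scheduling decisions are coupled in the optimization.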
- Research Article
- 10.3390/math13193196
- Oct 6, 2025
- Mathematics
- Seyed Salar Sefati + 4 more
Wireless Sensor Networks (WSNs) consist of numerous battery-powered sensor nodes that operate with limited energy, computation, and communication capabilities. Designing routing strategies that are both energy-efficient and attack-resilient is essential for extending network lifetime and ensuring secure data delivery. This paper proposes Adaptive Federated Reinforcement Learning-Hunger Games Search (AFRL-HGS), a Hybrid Routing framework that integrates multiple advanced techniques. At the node level, tabular Q-learning enables each sensor node to act as a reinforcement learning agent, making next-hop decisions based on discretized state features such as residual energy, distance to sink, congestion, path quality, and security. At the network level, Federated Reinforcement Learning (FRL) allows the sink node to aggregate local Q-tables using adaptive, energy- and performance-weighted contributions, with Polyak-based blending to preserve stability. The binary Hunger Games Search (HGS) metaheuristic initializes Cluster Head (CH) selection and routing, providing a well-structured topology that accelerates convergence. Security is enforced as a constraint through a lightweight trust and anomaly detection module, which fuses reliability estimates with residual-based anomaly detection using Exponentially Weighted Moving Average (EWMA) on Round-Trip Time (RTT) and loss metrics. The framework further incorporates energy-accounted control plane operations with dual-format HELLO and hierarchical ADVERTISE/Service-ADVERTISE (SrvADVERTISE) messages to maintain the routing tables. Evaluation is performed in a hybrid testbed using the Graphical Network Simulator-3 (GNS3) for large-scale simulation and Kali Linux for live adversarial traffic injection, ensuring both reproducibility and realism. The proposed AFRL-HGS framework offers a scalable, secure, and energy-efficient routing solution for next-generation WSN deployments.
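The EWMA-on-RTT residual check in the trust module can be sketched with the classic Jacobson/Karels-style estimator used for TCP retransmission timeouts: smooth the mean RTT and its deviation with exponential weights, and flag samples that exceed the mean by several deviations. The gains and the factor k below are the TCP conventions, not necessarily the paper's values.

```python
# EWMA mean/deviation RTT anomaly detector in the style of TCP's RTO
# estimator (alpha=1/8, beta=1/4, k=4 are conventional; the paper's
# parameters are not published).

class RttAnomalyDetector:
    def __init__(self, alpha=0.125, beta=0.25, k=4.0):
        self.alpha, self.beta, self.k = alpha, beta, k
        self.srtt = None                    # smoothed RTT

    def update(self, sample_ms):
        """Returns True when the sample deviates strongly from the baseline."""
        if self.srtt is None:
            self.srtt, self.rttvar = sample_ms, sample_ms / 2
            return False
        anomalous = sample_ms > self.srtt + self.k * self.rttvar
        self.rttvar = (1 - self.beta) * self.rttvar \
                      + self.beta * abs(self.srtt - sample_ms)
        self.srtt = (1 - self.alpha) * self.srtt + self.alpha * sample_ms
        return anomalous
```

Fused with packet-loss residuals, such a score lets the sink discount Q-table contributions from nodes whose links look compromised or congested.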
- Research Article
- 10.1093/neuonc/noaf193.535
- Oct 3, 2025
- Neuro-Oncology
- L Pakzad-Shahabi + 4 more
Abstract BACKGROUND We have designed and carried out a series of previous trials (BrainWear, CaPaBLE, and BrainApp) that assessed novel ways (research accelerometers; the Patient Generated Index; mobile apps) to monitor disease progression and/or quality of life for patients with a malignant brain tumour and their caregivers. Building on these, and integrating feedback and comments from our Patient and Public Involvement and Engagement focus groups, we have designed BrainWear2: a hybrid decentralised master protocol study of observational digital health tools, assessing their feasibility and acceptability, their relative value to patients and clinicians, and the feasibility of providing near-real-time feedback. MATERIAL AND METHODS All adult patients with a primary or secondary brain tumour, and their caregivers, undergoing active treatment will be enrolled into the CORE/CARE stream. They will wear commercially available smartwatches and complete paper-based PROMs (EQ-5D-5L and EORTC BN20, CCSS-NCQ for patients; CARGoQOL and the Zarit Caregiver Burden scale for caregivers) and clinician/patient-reported functional status assessments. A subset of patients will be offered the opportunity to participate in other streams. Our initial streams are: ePROMs (electronic PROMs); eSymp (electronic capture of symptom reporting); eCog (electronic cognitive testing); DTI (addition of DTI imaging sequences); and rContact (a decentralised component assessing the impact of a single brief additional research contact on data completeness). Feasibility and acceptability will be assessed based on participation rates, device wear time, and questionnaire completion. The decentralised trial element will be evaluated by comparing recruitment and retention rates between locally and remotely recruited participants. The feasibility of near-real-time feedback will be assessed based on the timeliness of data collection and delivery.
Pre-planned analyses will be conducted for each stream after reaching specific recruitment milestones. RESULTS Upon study completion, we will report on feasibility, acceptability, decentralized trial aspects, near real-time feedback implementation, and the impact of extra clinical contact. Exploratory analyses will examine relationships between data streams and assess the accuracy of statistical and machine learning models in identifying disease progression and functional decline. CONCLUSION The aim of BrainWear2 is to provide a firm evidential foundation for the use of near-patient sensing technologies in brain tumour patients. This is designed to lead onto a study using wearables technologies to provide early actionable data to intervene in patients who are deteriorating and prevent hospital admission. This work is funded by Brain Tumour Research.
- Research Article
- 10.53625/jirk.v5i5.11406
- Oct 1, 2025
- Journal of Innovation Research and Knowledge
- Eka Rahmawati + 2 more
To ascertain whether the degree of customer loyalty at CV Tirta Fertindo Pratama Semarang is affected by the quality of its delivery service, a descriptive qualitative research approach is employed. The study conducts its data analysis through data reduction, data delivery, and conclusion drawing. According to the findings, CV Tirta Fertindo Pratama's delivery service in Semarang satisfies only three of the five criteria for service quality: confidence, responsiveness, and empathy. Service quality is compromised by late deliveries, delivery operations, and the lack of a tracking system. To overcome these problems, the company can adopt delivery technology, monitor damaged or lost goods, and improve communication with customers.
- Research Article
- 10.11591/ijeecs.v40.i1.pp499-507
- Oct 1, 2025
- Indonesian Journal of Electrical Engineering and Computer Science
- Adil Hilmani + 4 more
The optimization of energy consumption and the assurance of efficient data transmission are critical factors in enhancing the longevity and performance of wireless sensor networks (WSNs). This study introduces an advanced clustering technique aimed at prolonging the network's lifespan while facilitating reliable data delivery. By integrating the Calinski-Harabasz index into the traditional K-Means clustering approach, the methodology evaluates the quality of clusters and determines the optimal number of clusters, which leads to better node organization within the network. Moreover, the selection of routing pathways from cluster heads to the base station is strategically optimized to conserve energy. Simulation results demonstrate that this novel dual enhancement technique surpasses traditional K-Means in multiple areas, including power consumption, network reliability, and successful data delivery. Consequently, the suggested advancements in cluster formation and routing substantially enhance the performance of energy-limited wireless sensor networks, boosting their robustness and reliability in practical applications.
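Choosing the number of clusters with the Calinski-Harabasz index can be sketched end to end in pure Python. This is an illustration under stated assumptions: plain k-means on 2-D node coordinates with deterministic seeding (the paper does not specify its initialization), and a toy network of six nodes in two obvious groups.

```python
# Pure-Python sketch: run k-means for several k, pick the k with the
# highest Calinski-Harabasz score. Seeding and data are illustrative.

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def mean(pts):
    n = len(pts)
    return tuple(sum(c) / n for c in zip(*pts))

def kmeans(points, k, iters=50):
    centers = points[:k]                     # deterministic seeding
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda c: dist2(p, centers[c]))
            groups[j].append(p)
        centers = [mean(g) if g else centers[j] for j, g in enumerate(groups)]
    return centers, groups

def calinski_harabasz(points, centers, groups):
    """Between-cluster dispersion over within-cluster dispersion."""
    n, k = len(points), len(centers)
    overall = mean(points)
    between = sum(len(g) * dist2(c, overall) for c, g in zip(centers, groups))
    within = sum(dist2(p, c) for c, g in zip(centers, groups) for p in g)
    return (between / (k - 1)) / (within / (n - k))

# Toy sensor-node coordinates forming two spatial groups.
nodes = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
scores = {}
for k in (2, 3):
    centers, groups = kmeans(nodes, k)
    scores[k] = calinski_harabasz(nodes, centers, groups)
best_k = max(scores, key=scores.get)
```

A higher score means tighter, better-separated clusters, so the index naturally picks k = 2 for this layout; in practice `sklearn.metrics.calinski_harabasz_score` computes the same quantity.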
- Research Article
- 10.1016/j.jbi.2025.104905
- Oct 1, 2025
- Journal of biomedical informatics
- David S Smith + 12 more
Secondary use of radiological imaging data: Vanderbilt's ImageVU approach.
- Research Article
- 10.28925/2663-4023.2025.29.886
- Sep 26, 2025
- Cybersecurity Education Science Technique
- Petro Klimushyn + 3 more
The Internet of Things (IoT) is a vast source of both data and vulnerabilities. Security therefore becomes an issue for protecting the resources of IoT nodes and the data they exchange. This is complicated by the limited resources of these nodes in terms of computing power, memory, energy, range, and wireless link performance. IoT devices can be deployed in critical environments where any information leakage to an interceptor, or unauthorized penetration into the network, can become a serious security threat, especially in the Internet of Military Things and the Internet of Medical Things. In such networks, cryptographic methods are mainly used to ensure security, and the primary task is to generate cryptographic keys for the IoT devices interacting with each other. Generating one common (session) key for both parties allows the use of symmetric encryption algorithms. To distribute these keys, public-key (asymmetric) cryptography can be used, but it is too computationally complex and energy-intensive to run on resource-constrained IoT devices. A pressing task for implementing secure technologies and security policies in IoT networks is therefore generating and updating symmetric cryptographic keys with high entropy. Alongside this, to simplify the exchange of cryptographic keys in an IoT network, the main issues are the secure delivery of new key material and key updates during exchange. Most of the proposed key generation strategies operate at the IoT physical layer for general wireless environments. The study provides a new taxonomy of key generation systems for IoT, classifying approaches by hardware, which demonstrates the fundamental differences in how IoT devices interact through their components: radio, audio, cameras, inertial measurement unit (IMU) sensors, other hardware, and hybrid approaches.
With this taxonomy, users can easily identify the most suitable method for their application scenario. Physical layer-based key generation for IoT has received extensive research interest and has been applied with several wireless technologies, such as Wi-Fi, ZigBee, and LoRa/LoRaWAN.
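Whatever the entropy source (radio, audio, IMU readings), the final step of such schemes is typically to turn the agreed raw bits into a uniform session key. A common pattern is an HKDF-style extract-and-expand (RFC 5869); the sketch below uses only stdlib HMAC-SHA256, and the shared bits, salt, and context labels are illustrative.

```python
# HKDF-style key derivation from shared entropy, stdlib only.
# Assumptions: both parties already hold identical `shared_bits` (e.g. from
# quantized reciprocal channel measurements); labels are illustrative.
import hashlib, hmac

def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    """Concentrate input keying material into a pseudorandom key."""
    return hmac.new(salt, ikm, hashlib.sha256).digest()

def hkdf_expand(prk: bytes, info: bytes, length: int = 16) -> bytes:
    """Expand the PRK into `length` bytes bound to the `info` context."""
    out, block, counter = b"", b"", 1
    while len(out) < length:
        block = hmac.new(prk, block + info + bytes([counter]),
                         hashlib.sha256).digest()
        out += block
        counter += 1
    return out[:length]

shared_bits = b"\x01\x00\x01\x01\x00\x01\x00\x00"   # toy agreed bits
prk = hkdf_extract(b"session-42", shared_bits)
key_a = hkdf_expand(prk, b"node-A<->node-B", 16)    # derived by node A
key_b = hkdf_expand(prk, b"node-A<->node-B", 16)    # derived by node B
```

Because both sides run the same derivation over the same inputs, they obtain the same symmetric key without ever transmitting it, and rotating the salt or info label yields a fresh key for the next session.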
- Research Article
- 10.1021/acs.est.5c05862
- Sep 24, 2025
- Environmental science & technology
- Jianan Ren + 7 more
Understanding the integrity of water samples between collection and analysis is essential for the delivery of high-quality data. In this study, we evaluated the spiking of isotopically labeled chemicals into environmental samples post-sampling as indicators of sample integrity. Our approach compared the ratio of unstable tracers (those sensitive to poor storage conditions or the matrix) to stable tracers (those insensitive to these factors) in samples to track sample integrity. We systematically evaluated 18 deuterium-labeled pharmaceuticals for this purpose in wastewater samples, classifying them into varying stability levels from highly stable to very unstable. Stability was determined under varying temperatures (37 °C, 20 °C, 4 °C, and -20 °C) for up to 84 days and using common preservation techniques (hydrochloric acid: HCl; sodium metabisulfite: SMS; and untreated). Based on our proposed selection criteria, of the 18 investigated, one was found suitable as an unstable tracer and six as stable tracers in acidified wastewater, while three unstable and two stable tracers were suitable for SMS-preserved wastewater, and four unstable and two stable tracers were suitable when no preservatives are required. This study provides the first assessment of chemical tracers to measure sample integrity for environmental sampling campaigns.
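The unstable-to-stable tracer-ratio idea reduces to a simple arithmetic check: if the unstable tracer degraded in transit, its concentration falls relative to the stable tracer, and the measured ratio drops below what was spiked in. The sketch below is illustrative only; the 0.8 recovery cutoff is an assumption, not a criterion from the study.

```python
# Sketch of the tracer-ratio integrity check. Assumption: a sample is
# flagged when the measured unstable/stable ratio falls below a recovery
# cutoff relative to the spiked ratio; 0.8 is an illustrative threshold.

def integrity_ok(unstable_conc, stable_conc, spiked_ratio=1.0, cutoff=0.8):
    """Compare the measured tracer ratio against the ratio spiked in."""
    measured = unstable_conc / stable_conc
    return (measured / spiked_ratio) >= cutoff
```

Normalising by the stable tracer cancels matrix and recovery effects that would hit both compounds equally, so only storage-sensitive degradation moves the ratio.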