Articles published on Integrity assurance
260 Search results
- New
- Research Article
- 10.4108/eetsis.8828
- Dec 2, 2025
- ICST Transactions on Scalable Information Systems
- Jun Zhang + 1 more
INTRODUCTION: With the rapid development of cloud computing and intelligent technology, traditional accounting informatization systems are gradually migrating to cloud computing platforms, advancing the automation and intelligence of financial management. OBJECTIVES: Data security in the cloud computing environment, especially data integrity, has become one of the core challenges for intelligent accounting informatization systems. These systems rely on the storage and processing of large volumes of financial, tax, and transaction data, and the integrity of these data directly affects an enterprise's financial transparency, compliance, and decision-making accuracy. METHODS: Designing an efficient and scalable cloud data integrity verification algorithm is therefore key to ensuring the reliability and security of intelligent accounting systems. This paper proposes a cloud data integrity verification algorithm for smart accounting informatization based on a cloud computing environment. The algorithm combines the efficient data processing capability of cloud computing with the automation requirements of smart accounting systems to ensure data integrity during storage, transmission, and processing through data validation techniques such as digital signatures, hash algorithms, and blockchain-based distributed ledgers. RESULTS: The algorithm design accounts for the dynamics and scalability of the cloud computing environment and achieves fast, real-time data validation and integrity detection in large-scale data environments. Experimental verification shows that the proposed algorithm performs well on large-scale accounting data, with high verification efficiency and accuracy. CONCLUSION: Compared with traditional validation methods, the algorithm improves the speed of data validation.
It enhances the system's ability to prevent data tampering and loss, providing a new cloud data integrity assurance scheme for intelligent accounting informatization systems.
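The combination of hash digests and signature-style checks that the abstract describes can be illustrated with a minimal keyed-hash integrity check. This is a generic sketch using Python's standard library, not the paper's algorithm; the key and ledger entry are hypothetical values for illustration:

```python
import hashlib
import hmac

def record_digest(record: bytes, key: bytes) -> str:
    """Keyed SHA-256 digest of a stored accounting record."""
    return hmac.new(key, record, hashlib.sha256).hexdigest()

def verify_record(record: bytes, key: bytes, stored_digest: str) -> bool:
    """Recompute the digest and compare in constant time."""
    return hmac.compare_digest(record_digest(record, key), stored_digest)

# Hypothetical key material and ledger entry, for illustration only.
key = b"shared-verification-key"
ledger_entry = b"2025-12-02,INV-1001,4500.00"
tag = record_digest(ledger_entry, key)

print(verify_record(ledger_entry, key, tag))                    # intact record
print(verify_record(b"2025-12-02,INV-1001,9500.00", key, tag))  # tampered amount
```

Any change to the stored record yields a mismatched digest, which is the basic tamper-detection property such verification schemes build on.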
- New
- Research Article
- 10.23939/csn2025.02.134
- Dec 1, 2025
- Computer systems and network
- P.P Petriv + 1 more
The article proposes a comprehensive approach to solving the data protection problem in decentralized distributed information storage systems based on blockchain technology. A conceptual "SecureChain" model has been developed that integrates modern cryptographic protection methods with programmable smart contract logic for automated access management and data integrity assurance. The model employs a multi-level architecture including data layer, smart contract layer, network interaction layer, and user interface. The key innovation is the application of Shamir's threshold encryption schemes (t, n) controlled by smart contracts, combination of symmetric and asymmetric encryption algorithms (AES-256 for data, Curve25519 for keys), and implementation of a "secret disclosure" mechanism to enhance critical information security. Experimental validation of the model was conducted in three typical use cases: medical data storage system, corporate document management system, and electronic voting platform. Results demonstrate significant improvement in key security metrics compared to traditional approaches: resistance to attacks on individual nodes (by 65%), data confidentiality (by 72%), access audit capability (by 90%), and fault tolerance (by 58%) with moderate increases in storage costs (by 15%) and access time (by 10%). Additional scalability analysis showed a near-linear relationship between performance and both data volume and number of users. The proposed model and implementation methodology have significant practical value for organizations working with confidential data and requiring reliable distributed storage systems that meet modern security requirements and regulatory standards. Keywords: smart contracts, cryptographic protection, threshold encryption, distributed systems, key management, data integrity, network security.
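The Shamir (t, n) threshold scheme named above can be sketched in a few lines. This is a generic textbook implementation over a prime field, not the SecureChain code; the field size and secret are illustrative:

```python
import random

PRIME = 2**127 - 1  # Mersenne prime; field large enough for a 16-byte share

def split_secret(secret: int, t: int, n: int):
    """Split `secret` into n shares; any t reconstruct it (degree t-1 polynomial)."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def recover_secret(shares):
    """Lagrange interpolation at x = 0 over the prime field."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        total = (total + yi * num * pow(den, -1, PRIME)) % PRIME
    return total

shares = split_secret(123456789, t=3, n=5)
print(recover_secret(shares[:3]))   # any 3 of the 5 shares suffice
```

In the model described above, the smart-contract layer would gate which parties may present shares; here the threshold arithmetic itself is all that is shown.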
- New
- Research Article
- 10.30574/wjaets.2025.17.2.1499
- Nov 30, 2025
- World Journal of Advanced Engineering Technology and Sciences
- Vaghela Upendra
Background: The pharmaceutical industry operates in an increasingly globalized environment where multinational companies must navigate divergent regulatory frameworks for computerized systems and electronic records. Audit trail review requirements represent a critical component of data integrity assurance, yet significant variations exist between major regulatory authorities. Objective: This manuscript provides a comprehensive comparative analysis of audit trail review requirements under FDA 21 CFR Part 11 and EU GMP Annex 11, examines the challenges multinational pharmaceutical companies face in establishing unified Global Quality Systems (GQS), and proposes harmonization strategies. Methods: A systematic review of regulatory guidance documents, industry publications, warning letters, and current literature was conducted. Additional regulatory frameworks from WHO, PIC/S, MHRA, TGA, and PMDA were analyzed to provide a global perspective. Results: Fundamental differences exist in the scope, frequency, and depth of audit trail review requirements across regulatory jurisdictions. FDA 21 CFR Part 11 mandates comprehensive audit trails for creation, modification, and deletion of electronic records, while EU GMP Annex 11 focuses primarily on modification and deletion of GMP-relevant data. Regulatory interpretations of "regular review" vary significantly, creating implementation challenges for global organizations. Conclusions: Achieving harmonization requires risk-based approaches, leveraging emerging technologies including artificial intelligence and machine learning, and implementing robust governance frameworks. The manuscript concludes with practical recommendations for establishing effective global audit trail review programs.
- Research Article
- 10.34190/icer.2.1.4080
- Oct 31, 2025
- International Conference on Education Research
- Nothando Shiba + 2 more
The emergence of advanced educational technologies such as Artificial Intelligence (AI) has revolutionised learning and teaching methods. For example, at the University of South Africa (UNISA), IRIS is used for exam invigilation. This tool provides educators with assurance of assessment integrity during online and remote assessment. It monitors students’ movement during the exam by recording video of their face and audio, taking screenshots of their computer screens at regular intervals, and reporting any alleged misconduct. However, IRIS often does not detect cases where AI such as ChatGPT was used to generate answers. Furthermore, university policies currently allow the use of the Grammarly and Quillbot apps, which are increasingly incorporating AI features. These apps generate real-time writing suggestions and rephrase information from the internet to avoid plagiarism. In addition, the University uses Turnitin's AI detection software, which gives false positives when a student has written well in the passive voice. Considering that these apps constantly evolve, the university needs to regularly review and mandate their use based on their latest features. ChatGPT, amongst the latest AI writing apps, enables students to easily access pre-written content without actively engaging in critical thinking and learning, potentially leading to widespread plagiarism, which poses a threat to education. In this paper, we present a concise overview of the use of the IRIS and Turnitin invigilation and detection tools for online assessments. In an action study, the co-authors reflect on the implications of the use of these AI apps for the integrity and validity of assessments. Directions for further research are suggested.
- Research Article
- 10.1038/s41598-025-16678-y
- Aug 21, 2025
- Scientific reports
- Bhupesh P Nandurkar + 7 more
Concrete strength prediction is of great relevance for construction safety and quality assurance; however, existing methods often trade off accuracy against interpretability, especially when supplementary cementitious materials such as fly ash are used. This study builds an interpretable, highly accurate model for predicting the compressive and tensile strength of concrete with a hybrid approach based on gradient boosting (XGBoost), deep neural networks (DNNs), and optimization via AutoGluon. The model is placed in a multitask learning (MTL) framework that includes mix design variables, environmental factors, and non-destructive testing (NDT) data. Model predictions are interpreted through SHAP and LIME to quantify global and local feature importance. Results show an R² score of 0.91 on the test set, a 23% reduction in MSE, and LIME fidelity exceeding 0.87, a 10-15% improvement in mean-squared error over existing models. Feature analysis shows that fly ash percentage contributes around 25% to the predictions. The proposed solution thus offers a robust interpretability platform for concrete strength prediction and shows promise for optimization in material design and structural integrity assurance. This work helps bridge the gap between hybrid modeling with automated optimization and explainability for concrete strength prediction.
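The feature-importance idea behind analyses like SHAP and LIME can be illustrated with a simple permutation-importance sketch. This is not the paper's XGBoost/DNN pipeline; the toy strength model, its coefficients, and the variable ranges are invented purely for illustration:

```python
import random

# Hypothetical toy "strength model": strength rises with cement and fly-ash
# content and falls with water content (coefficients are illustrative).
def toy_model(cement, fly_ash, water):
    return 30.0 + 0.5 * cement + 0.3 * fly_ash - 0.4 * water

random.seed(0)
data = [(random.uniform(300, 500),   # cement, kg/m^3
         random.uniform(0, 150),     # fly ash, kg/m^3
         random.uniform(140, 200))   # water, kg/m^3
        for _ in range(200)]
baseline = [toy_model(*row) for row in data]

def permutation_importance(feature_idx):
    """Shuffle one feature and measure mean absolute change in predictions."""
    shuffled = [row[feature_idx] for row in data]
    random.shuffle(shuffled)
    drift = 0.0
    for row, new_val, base in zip(data, shuffled, baseline):
        perturbed = list(row)
        perturbed[feature_idx] = new_val
        drift += abs(toy_model(*perturbed) - base)
    return drift / len(data)

scores = {name: permutation_importance(i)
          for i, name in enumerate(["cement", "fly_ash", "water"])}
print(scores)
```

Features whose shuffling moves the predictions most are ranked as most important; this is the intuition formalized (differently) by SHAP values.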
- Research Article
- 10.30632/pjv66n4-2025a5
- Aug 1, 2025
- Petrophysics – The SPWLA Journal of Formation Evaluation and Reservoir Description
- Abdulaziz Bazaid + 3 more
Cement bond evaluation across unconventional pipes, such as fusion-bonded epoxy (FBE) coated casings, presents many challenges for the existing conventional ultrasonic technology. The coating interface behaves as an additional interface affecting the acoustic impedance measurements from conventional ultrasonic tools. This work details the testing and validation of the flexural and ultrasonic tool to address these challenges. The new flexural and ultrasonic transducers that emit and receive the ultrasonic waves to probe the casing, cement, and cement-to-formation/second casing interface were used at surface in a pressurized safety tank. Four FBE coated pipe samples were tested; the samples varied in outside diameter and the slurry density with which they were cemented. To validate the flexural and ultrasonic tool response in FBE coated casing, the 7- and 9.625-in. FBE coated casings were set up in the tank to simulate measurements for each casing size. The simulation included two scenarios. First, the free pipe measurement involved filling the tank with fresh water (1.48 MRayls), targeting the 7-in. casing and then the 9.625-in. casing, with free pipe outside in the annulus. Second, the cemented pipe scenario involved applying a cement sheath to both casing targets, using 121 pcf (pounds per cubic foot) slurry for the 7-in. casing and 91 pcf slurry for the 9.625-in. casing, while maintaining fresh water (1.48 MRayls) inside the tank. Successful testing of the 7- and 9.625-in. pipe samples was performed with the flexural and ultrasonic tool. By simultaneously acquiring ultrasonic pulse-echo measurements and flexural wave imaging, the flexural and ultrasonic tool provides a high-resolution map of pipe thickness and cement with full azimuthal coverage in coated pipes. The results showed that the casing thickness and casing internal radius were as expected for the tested samples. 
The acoustic impedance of the annulus material obtained from flexural measurements was within the accuracy range of ±0.5 MRayl for both the 7- and 9.625-in. pipe samples in free pipe as well as cemented pipe conditions. Post-validation, the flexural and ultrasonic tool has been successfully deployed in more than 50 wells with coated casing and has helped in delivering enhanced cement and corrosion evaluation for all these wells. This work demonstrates the integration of advanced technology and technical expertise to enable reliable barrier evaluation for well integrity assurance.
- Research Article
- 10.32996/jcsts.2025.7.7.106
- Jul 23, 2025
- Journal of Computer Science and Technology Studies
- Surya Prabha Busi
Confidential computing introduces a sophisticated security framework addressing the protection deficit for data during active computational processes within cloud infrastructure. Contemporary security mechanisms effectively safeguard information in storage repositories and network transit; however, substantial vulnerability persists during processing operations. Through the implementation of hardware-enforced isolated execution environments, confidential computing enables computational operations on protected data without exposure to host systems or administrative credentials. This architectural construct delivers considerable security benefits for entities operating within regulated domains where stringent data protection requirements predominate. The cryptographic verification mechanisms inherent in these systems establish computational integrity assurance before execution commencement. Entities within financial sectors conducting analytical operations, healthcare institutions processing clinical information, and governmental organizations managing classified intelligence derive substantial advantages from these protective capabilities. The technology facilitates protected collaborative initiatives across organizational boundaries while maintaining requisite confidentiality parameters. When integrated with established identity verification protocols, contextual authorization frameworks, and continuous monitoring apparatus, confidential computing significantly enhances a comprehensive security posture. The accelerating adoption across diverse industrial sectors indicates recognition of its efficacy against sophisticated adversarial methodologies targeting privileged access within heterogeneous computational environments.
- Research Article
- 10.21511/gg.06(1).2025.03
- Jul 9, 2025
- Geopolitics under Globalization
- Artem Artyukhov + 1 more
Type of article: Reflexive Preface. This study aims to examine the role of international organizations in promoting academic integrity, analyzing how diverse cultural interpretations of scholarly ethics create challenges for standard consistency across borders and the mechanisms through which international bodies facilitate the coordination and standardization of integrity practices. A comparative approach examines academic integrity across different jurisdictions and cultural contexts, a multi-level analytical framework is used to understand the organizational structure of academic integrity governance, and illustrative case studies demonstrate practical applications. The analysis identifies five interconnected components of academic integrity: educational, research, managerial, professional association, and publishing integrity. The study reveals a sophisticated five-level hierarchical system of academic integrity governance spanning from international organizations to individual practitioners. International organizations function as collaborative facilitators rather than rigid rule-makers, developing flexible frameworks that can be adapted to diverse cultural contexts while maintaining universal ethical principles. Significant variations exist between national approaches, with some countries implementing comprehensive legislative frameworks while others rely on cultural principles and institutional traditions. International organizations facilitate dialogue and consensus-building that transcends national boundaries while respecting institutional autonomy. The hierarchical governance system demonstrates that academic integrity assurance requires top-down coordination to establish consistent standards and bottom-up commitment to implement those standards.
The strength of this system lies in its ability to maintain the universality of core ethical principles while allowing for cultural adaptation in implementation methods, ultimately ensuring that the fundamental commitment to honesty, originality, and fairness in scholarship remains constant regardless of cultural context or geographical location. Acknowledgment: The research was prepared within the framework of the project “OSEE – Open Science and Education in Europe: success stories for Ukrainian academia” (project number ERASMUS-JMO-2022-HEI-TCH-RSCH-101085198). Funded by the European Union. The views and opinions expressed are, however, those of the authors only and do not necessarily reflect those of the European Union or EACEA. Neither the European Union nor the granting authority can be held responsible for them.
- Research Article
- 10.1071/ep24419
- Jun 19, 2025
- Australian Energy Producers Journal
- Tim Thomas
Presented on 28 May 2025: Session 12. Assurance of casing and cement integrity is a key component of well integrity management. Traditional methods to assess casing and cement have largely required rigs or logging units to intervene on a well. Such methods are being enhanced by the introduction of new technologies, especially downhole gauges. Many operators can now gather greater volumes of data than in the past, which has led to significant interest in adopting machine learning (ML)-based applications. This interest has resulted in ML being applied by many operators, especially in the areas of production surveillance, drilling optimisation and reservoir engineering. One area that hasn’t received as much interest is well integrity management systems. This paper helps address this research gap by examining key reasons why ML usage in well integrity management has lagged other areas, reviewing analyses from previous researchers and discussing which assurance activities are suited to ML. Additionally, this paper summarises the results of a collaborative research project conducted with a coal seam gas (CSG) operator to assess the feasibility of implementing an artificial neural network-based application to support its Well Integrity Management System. This research demonstrated that such an application could be implemented in a CSG environment and could improve outcomes by augmenting the current system, especially in risk assessment and the selection of wells for intervention.
- Research Article
- 10.63265/jkti.v3i3.102
- Jun 16, 2025
- JURNAL KESEHATAN TROPIS INDONESIA
- Ajibu Jonas + 2 more
The practice of Ramadan fasting, as a form of intermittent fasting, is growing in popularity because of its potential to prevent cancer development. Fasting induces changes at metabolic, physiological, and epigenetic levels that slow cancer progression and improve response to treatment. This study performs a systematic review of published studies to explore these protective mechanisms, with assurance of scientific integrity and transparency. The authors followed the standards of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA). The research demonstrates how fasting metabolism promotes ketone body production while reducing glucose availability and simultaneously lowering both insulin and insulin-like growth factor-1 (IGF-1) levels; proliferation control of cancer cells depends heavily on these three regulatory factors. The DNA damage response (DDR) becomes more effective, and cellular homeostasis is maintained, because fasting-induced autophagy clears damaged and defective cellular components. The reduction of oxidative stress caused by fasting constrains the development of malignant cell transformations and protects against DNA damage. The epigenetic effects of fasting during Ramadan include non-coding RNA regulation, DNA methylation, and histone remodeling processes that activate beneficial tumor suppressor genes and deactivate cancer-causing oncogenes. The latest research supports the potential role of fasting as an additional form of care for preventing and slowing cancer development. These findings have major implications for clinical applications, dietary remedies, and public health policy within integrative cancer treatment.
- Research Article
- 10.71097/ijsat.v16.i2.5572
- Jun 11, 2025
- International Journal on Science and Technology
- Kothwala Dr Deveshkumar Mahendralal + 3 more
Good Documentation Practices (GDP) are the protocols used in regulated industries to ensure that documents are not only authentic but also accurate, adequate, and what they claim to be. This article gives a deep insight into the importance of GDP in pharmaceuticals, the health sector, and clinical research. We begin by noting that GDP safeguards data integrity and enables organizations to comply with stringent regulatory mandates. At the center of GDP is the ALCOA framework for information recording, which affirms that records should be Attributable, Legible, Contemporaneous, Original, and Accurate. Building on this, the ALCOA+ model adds further attributes—Complete, Consistent, Enduring, and Available—to promote the highest standards of data integrity. We also explore how leading global regulatory agencies, including the U.S. Food and Drug Administration (FDA), the European Medicines Agency (EMA), the World Health Organization (WHO), and the International Council for Harmonisation (ICH), have embedded GDP principles into their regulatory expectations and industry guidelines. Various types of documentation commonly seen in regulated environments are classified, and real-world applications of GDP across industries are discussed with examples (e.g., clinical trial records, manufacturing batch documents, QMS records). The review also addresses the challenges organizations face in implementing GDP – from inadequate training and resistance to digital change, to human errors and insufficient review processes – and the risks of non-compliance. This section outlines best practices and strategies for improving Good Documentation Practices. Key approaches include implementing standardized procedures and leveraging digital tools—such as electronic documentation systems that comply with 21 CFR Part 11—to enhance efficiency and reliability.
Continuous training is also emphasized as a vital component in maintaining high documentation standards. In conclusion, effective GDP is shown to be indispensable for regulatory compliance, operational excellence, and the assurance of data integrity in all activities. Organizations that invest in robust documentation practices and cultural transformation not only meet regulatory expectations but also achieve greater efficiency and trust in their processes and products.
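The ALCOA attributes described above map naturally onto an append-only audit trail: each entry records who acted, when, and what changed, and past entries are never edited. The sketch below is a generic illustration, not a Part 11-validated system; the field names and sample values are hypothetical:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List, Optional

@dataclass(frozen=True)
class AuditEntry:
    """One ALCOA-aligned entry: attributable (user), contemporaneous (timestamp
    captured at creation), original/accurate (both old and new values retained)."""
    user: str
    record_id: str
    action: str                # "create" | "modify" | "delete"
    old_value: Optional[str]
    new_value: Optional[str]
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

class AuditTrail:
    """Append-only log: entries can be added and read, never edited or removed."""
    def __init__(self) -> None:
        self._entries: List[AuditEntry] = []

    def log(self, entry: AuditEntry) -> None:
        self._entries.append(entry)

    def history(self, record_id: str) -> List[AuditEntry]:
        return [e for e in self._entries if e.record_id == record_id]

trail = AuditTrail()
trail.log(AuditEntry("analyst1", "BATCH-042", "create", None, "pH=7.2"))
trail.log(AuditEntry("analyst2", "BATCH-042", "modify", "pH=7.2", "pH=7.1"))
print([e.action for e in trail.history("BATCH-042")])
```

Freezing the entry objects and exposing only append and read operations is what makes each record enduring and reviewable, which is the behavior "regular review" requirements assume.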
- Research Article
- 10.1071/ep24156
- May 22, 2025
- Australian Energy Producers Journal
- Tim Thomas + 1 more
Assurance of casing and cement integrity is a key component of well integrity management. Traditional methods to assess casing and cement have largely required rigs or logging units to intervene on a well. Such methods are being enhanced by the introduction of new technologies, especially downhole gauges. Many operators can now gather greater volumes of data than in the past, which has led to significant interest in adopting machine learning (ML)-based applications. This interest has resulted in ML being applied by many operators, especially in the areas of production surveillance, drilling optimisation and reservoir engineering. One area that hasn’t received as much interest is well integrity management systems. This paper helps address this research gap by examining key reasons why ML usage in well integrity management has lagged other areas, reviewing analyses from previous researchers and discussing which assurance activities are suited to ML. Additionally, this paper summarises the results of a collaborative research project conducted with a coal seam gas (CSG) operator to assess the feasibility of implementing an artificial neural network-based application to support its Well Integrity Management System. This research demonstrated that such an application could be implemented in a CSG environment and could improve outcomes by augmenting the current system, especially in risk assessment and the selection of wells for intervention.
- Research Article
- 10.3233/shti250622
- May 15, 2025
- Studies in health technology and informatics
- Marta Durá + 5 more
Good pharmacy and manufacturing practices identify data integrity as a critical factor for patient safety. However, many processes in hospital pharmacies are still documented manually, which can compromise the integrity of the information. This work presents a software customization prototype for data integrity assurance in hospital pharmacy formulation and dispensing processes. The proposed solution combines the implementation of ALCOA+ principles by design, the GAMP5 V-model, and human-centered design methodologies to adapt validated data integrity software for a hospital pharmacy use case. Moreover, the proposal is being evaluated in the hospital pharmacy environment to achieve a higher readiness level. After complete validation, the software customization can support the digitalization of drug formulation and dispensing processes and improve patient safety.
- Research Article
- 10.55640/ijdsml-05-01-17
- May 15, 2025
- International journal of data science and machine learning
- Shriprakashan L Parapalli
Current Good Manufacturing Practice (cGMP) regulations emphasize stringent control over production processes, personnel, equipment, and documentation in pharmaceutical manufacturing. Meeting cGMP requirements involves meticulous recordkeeping, comprehensive quality control, and robust oversight—processes that are prone to human error when relying on traditional, paper-based approaches. Against this backdrop, Manufacturing Execution Systems (MES) offer a powerful solution for managing production workflows and ensuring regulatory adherence. This paper explores the integration of MES in a cGMP environment to automate compliance assurance and details key strategies including automated validation, data integrity assurance, QMS integration, regulatory reporting, training and competency tracking, risk-based automation, and AI-driven continuous improvement. Through literature reviews and case study analyses, we identify critical process elements where MES adds the most value, such as reducing human error, streamlining documentation, and facilitating digital audit trails. The findings suggest that adopting MES not only enhances operational efficiency but also enables a proactive approach to regulatory compliance, positioning organizations to adapt quickly to evolving industry standards.
- Research Article
- 10.3390/cryptography9020027
- Apr 29, 2025
- Cryptography
- Alex Shafarenko
We present a new construction of a one-time pad (OTP) with inherent diffusive properties and a redundancy injection mechanism that benefits from them. The construction is based on interpreting the plaintext and key as members of a permutation group in the Lehmer code representation after conversion to factoradic. The so-constructed OTP translates any perturbation of the ciphertext to an unpredictable, metrically large random perturbation of the plaintext. This allows us to provide unconditional integrity assurance without extra key material. The redundancy is injected using Foata’s “pun”: the reading of the one-line representation as the cyclic one; we call this Pseudo Foata Injection. We obtain algorithms of quadratic complexity that implement both mechanisms.
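The factoradic/Lehmer-code machinery underlying this construction can be sketched directly. This is the standard correspondence between integers and permutations, not the authors' full OTP or the Pseudo Foata Injection mechanism:

```python
from math import factorial

def int_to_lehmer(n: int, size: int):
    """Factoradic digits of n: digit i ranges over 0..size-1-i (a Lehmer code)."""
    digits = []
    for i in range(size, 0, -1):
        f = factorial(i - 1)
        digits.append(n // f)
        n %= f
    return digits

def lehmer_to_perm(code):
    """Decode a Lehmer code into the permutation it indexes."""
    pool = list(range(len(code)))
    return [pool.pop(d) for d in code]

# The integer 0 maps to the identity permutation; any perturbation of the
# integer (a ciphertext perturbation, in the scheme's terms) selects a
# different permutation.
print(lehmer_to_perm(int_to_lehmer(0, 4)))    # identity
print(lehmer_to_perm(int_to_lehmer(23, 4)))   # last of the 4! = 24 permutations
```

Because each factoradic digit weights a different factorial, even a small change to the integer can flip high-order digits and move many positions of the decoded permutation, which is the diffusion intuition the abstract appeals to.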
- Research Article
- 10.2118/217732-pa
- Mar 27, 2025
- SPE Journal
- C E Mcmillan + 4 more
Summary: The development and application of a fit-for-purpose CO2 injection model (CIM) is presented in the context of a front-end engineering design for a new carbon capture and storage (CCS) project targeting a depleted gas reservoir in the North Sea. The growing trend toward long-term industrial-scale CCS presents challenges for current industry design capabilities. In conjunction with the wider CCS system design, particular engineering design and integrity assurance requirements must be addressed for the wellbore components. An appropriate equation of state (EOS) that accounts for impurities and multiphase flow conditions is required to predict rapid changes in fluid behavior. With a depleted gas reservoir as the storage target, early-life and late-life wellbore conditions can differ significantly. The tubular design workflow was finalized using a CIM deployed via a cloud-based software platform developed in concert with the well engineering design process in an agile manner. Validation of the model was provided by comparison with benchmarks from legacy software. Coupled with drilling and cementing conditions from the well construction phase, the resultant thermal stresses on tubulars, connections, and completion components during gaseous, dense, and multiphase CO2 well operating conditions need to be accurately predicted, as they can be significant. Downhole conditions can be affected by cooling from adiabatic expansion and Joule-Thomson effects across chokes, due to wellbore friction, and at the sandface. Transient operations during shut-in and restart result in low design case temperatures. Low-probability survival conditions under simulated blowout or leakage scenarios must be modeled and can result in worst-case temperature qualification requirements for wellbore equipment. Industry work groups have proposed the GERG-2008 EOS as a standard model for CCS operations and well design.
However, potential limitations noted in technical literature appear evident from detailed well sensitivity analysis. Potential improvements to the GERG-2008 model as well as requirements for an improved fit-for-purpose EOS are outlined.
- Research Article
- 10.3390/rs17071110
- Mar 21, 2025
- Remote Sensing
- Zhaochen Li + 4 more
The BeiDou Satellite-Based Augmentation System (BDSBAS), based on the Radio Technical Commission for Aeronautics (RTCA) protocol, aims to provide high-precision, single-frequency positioning with integrity assurance for civil aviation users in China and surrounding regions. Given the anticipated high solar activity between 2023 and 2025, ionospheric anomalies may degrade positioning accuracy and significantly impact BDSBAS integrity performance. To enhance BDSBAS integrity, this study evaluates and analyzes the system’s ionospheric degradation parameters for 2023. The results indicate that during the active ionospheric period in 2023, the rate of ionospheric grid delay changes exceeding the limits of the currently broadcasted parameters increased by 0.86%, posing potential integrity risks compared to 2022. To address this issue, we propose a novel algorithm for ionospheric degradation parameters and assess its applicability, stability, and effectiveness using BDSBAS single-frequency service message data from IGS monitoring stations in China. Statistical analysis in the localization domain demonstrates that the new method reduces the rate of ionospheric degradation parameters exceeding the threshold by 1.10% in 2023–2024. This approach significantly enhances BDSBAS integrity service capabilities, supporting its performance improvement and official deployment.
- Research Article
- 10.36548/jaicn.2025.1.002
- Mar 1, 2025
- Journal of Artificial Intelligence and Capsule Networks
- Revathy S.P + 3 more
The growing complexity of satellite operations requires an advanced and secure monitoring system to ensure data integrity, system reliability, and operational efficiency. Traditional satellite monitoring frameworks rely on centralized data management, which is vulnerable to cyber threats, data loss, and unauthorized modifications. This study presents a Blockchain-Enabled Prediction System that acts as a supporting tool for existing satellite monitoring infrastructures. The proposed system utilizes Hyperledger Fabric to establish a tamper-proof ledger for storing satellite telemetry data, ensuring data security and traceability. Unlike conventional models, the blockchain architecture guarantees immutability, enabling secure and verifiable satellite performance monitoring. A web-based dashboard is integrated to facilitate real-time alerts and parameter visualization. The implementation of smart contracts allows for automated validation and alert generation when anomalies are detected in satellite parameters. The performance of the system is evaluated in terms of transaction efficiency, security validation, and integrity assurance, demonstrating its feasibility for scalable and secure satellite operations.
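The tamper-evidence idea behind such a ledger can be sketched in a few lines. This is a toy hash chain, not Hyperledger Fabric: record fields, the `battery_alert` rule, and the voltage threshold are illustrative stand-ins for the paper's telemetry schema and smart-contract logic.

```python
import hashlib
import json

class TelemetryLedger:
    """Hash-chained log of telemetry records: each entry's digest covers
    the record plus the previous digest, so any edit breaks the chain."""

    def __init__(self):
        self.chain = []  # list of (record, digest) pairs

    def append(self, record: dict) -> str:
        prev_hash = self.chain[-1][1] if self.chain else "0" * 64
        payload = json.dumps(record, sort_keys=True) + prev_hash
        digest = hashlib.sha256(payload.encode()).hexdigest()
        self.chain.append((record, digest))
        return digest

    def verify(self) -> bool:
        """Recompute every link; tampering anywhere yields False."""
        prev_hash = "0" * 64
        for record, digest in self.chain:
            payload = json.dumps(record, sort_keys=True) + prev_hash
            if hashlib.sha256(payload.encode()).hexdigest() != digest:
                return False
            prev_hash = digest
        return True

def battery_alert(record: dict, min_voltage: float = 3.3) -> bool:
    """Stand-in for a smart-contract rule: flag low bus voltage."""
    return record["bus_voltage"] < min_voltage

ledger = TelemetryLedger()
ledger.append({"t": 0, "bus_voltage": 3.7})
ledger.append({"t": 1, "bus_voltage": 3.1})  # would trigger an alert
assert ledger.verify()
ledger.chain[0][0]["bus_voltage"] = 9.9      # tamper with an old record
assert not ledger.verify()                   # tampering is detected
```

A permissioned blockchain adds distributed consensus and access control on top of this basic chaining property.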
- Research Article
- 10.1177/0926227x241296435
- Feb 21, 2025
- Journal of Computer Security
- Ramadan Abdunabi + 2 more
Body area networks (BANs) frequently generate sensitive healthcare data from sensors and other devices. Security and privacy breaches in BAN systems can compromise information affecting patients’ physical health, emotional state, and financial well-being. The lack of well-defined security perimeters and of qualified personnel to administer security in such dynamic environments requires an authorization framework for protecting patient data, where access depends on the users’ credentials, location, and time. Toward this end, this work defines a secure system architecture that incorporates fine-grained information access management. It also leverages a spatiotemporal attribute-based access control (STABAC) model to enforce location and time constraints in BAN policies, using the required attributes to make access decisions. The BAN policies have various dynamic constraints that may conflict with each other or introduce inconsistencies. Therefore, this work proposes a formal verification framework using timed colored Petri nets to ensure such errors are not introduced. A blockchain network is utilized to maintain policy integrity: STABAC verifies policy integrity against the network through smart contract services before making access decisions. Finally, the policy and attribute management framework ensures that STABAC maintains a verified set of policies and attributes for authorizing uninterrupted care and services.
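The core of a spatiotemporal access decision can be sketched as a conjunction of attribute checks. This is a minimal illustration in the spirit of STABAC, not the paper's model: the policy shape, attribute names, zones, and time window are all hypothetical.

```python
from datetime import time

# Hypothetical policy: who may access, from which location zones,
# and during which time window (day-shift hours).
POLICY = {
    "roles": {"nurse", "physician"},
    "zones": {"ward-3", "icu"},
    "window": (time(7, 0), time(19, 0)),
}

def access_decision(role: str, zone: str, at: time, policy=POLICY) -> bool:
    """Grant access only if role, location, and time all satisfy the policy."""
    start, end = policy["window"]
    return (
        role in policy["roles"]
        and zone in policy["zones"]
        and start <= at <= end
    )

assert access_decision("nurse", "icu", time(9, 30))
assert not access_decision("nurse", "icu", time(22, 0))    # outside window
assert not access_decision("visitor", "icu", time(9, 30))  # wrong role
```

The paper's contribution is in verifying that many such interacting constraints stay consistent (via timed colored Petri nets) and in protecting the policy set itself (via blockchain), not in the per-request check shown here.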
- Research Article
- 10.52783/jisem.v10i12s.1942
- Feb 19, 2025
- Journal of Information Systems Engineering and Management
- Bhagath Chandra Chowdari Marella
Fraud prevention is critical for companies in the modern, evolving digital era. With the growing complexity of fraud methods, companies are forced to reimagine their risk management models to keep confidential data protected and maintain stakeholder trust. Rising cases of cybercrime, social engineering, and advanced persistent threats have compelled companies to adopt a proactive approach to fraud detection and prevention. The paper reviews significant fraud detection, prevention, and mitigation trends, including the application of new technologies such as AI, blockchain, and behavioural analytics. AI has significantly advanced real-time detection capabilities, using machine learning algorithms to identify patterns and anomalies within large datasets. Blockchain technology ensures immutability and transparency, making it a potent tool for transactional integrity assurance, particularly in finance and supply chain management. Behavioural analytics, in turn, provides information on users' behaviour and interactions, enabling companies to identify potentially fraudulent activity through patterns of abnormal behaviour. This paper offers an organisational guide to improving fraud resilience strategies through in-depth analysis and case studies. The study compiles risk mitigation strategies and models crafted to address the varying requirements of different types of enterprises, from small and medium-sized businesses (SMEs) to large multinational companies. It investigates how companies can combine technological innovations with current legislation to create a multilayered fraud defence system. Moreover, the article highlights the importance of predictive analytics and big data in detecting concealed fraud trends so businesses can act before losses occur. Even with all the technological advancements, companies find it difficult to deploy successful fraud resilience models.
Cybersecurity attacks, implementation costs, and skills shortages within the workforce hinder the adoption of such technologies. The paper also discusses the privacy and ethical aspects of AI and other data-driven technologies, calling for a balanced strategy that weighs security against user privacy. The research culminates in strategic recommendations for organizations wishing to create fraud-resilient systems. Key recommendations involve investing in employee training to bridge competency gaps, employing cloud-based fraud detection platforms to reduce costs, and establishing public-private sector partnerships to foster knowledge sharing and innovation. Recommended areas of future work are also given, such as investigating the potential of quantum computing for fraud detection and designing ethical guidelines for AI-based fraud resilience models. In summary, the paper presents an in-depth review of fraud resilience and offers insights that can significantly benefit organizations seeking to shield their operations from a dynamic threat environment. By embracing technological innovation, strategic planning, and ethical conduct, organizations can build an effective defence against an ever-evolving fraud landscape.
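The behavioural-analytics idea of flagging abnormal patterns can be sketched with a simple statistical rule. This is a hedged toy example, not the paper's models: the z-score rule, threshold, and transaction amounts are illustrative assumptions, whereas production systems combine many features and learned models.

```python
import statistics

def flag_anomalies(history, candidates, z_threshold=3.0):
    """Flag candidate transaction amounts that deviate from a user's
    historical pattern by more than z_threshold standard deviations."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    return [x for x in candidates if abs(x - mean) / stdev > z_threshold]

# Synthetic example: a user's typical amounts, then three new transactions.
history = [42.0, 39.5, 41.2, 40.8, 43.1, 38.9, 40.0, 41.5]
suspicious = flag_anomalies(history, [40.5, 39.0, 410.0])
```

Here only the outlying amount is flagged; real behavioural analytics would also weigh timing, location, device, and interaction patterns rather than amount alone.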