Articles published on Data access control
1001 Search results
- New
- Research Article
- 10.1186/s42400-025-00533-8
- Jan 4, 2026
- Cybersecurity
- Fei Tang + 3 more
Abstract Private Information Retrieval (PIR) is a cryptographic technique that allows a Data User (DU) to retrieve data from a server without revealing which specific data item is being accessed. Traditional PIR protocols typically assume that the data is locally stored and directly controlled by the Data Owner (DO), but in real-world scenarios data is often hosted on untrusted third-party servers, making it difficult for the DO to restrict the server's access to the data or to control which DUs are authorized to retrieve it. Consequently, malicious servers or unauthorized DUs may infringe upon the privacy rights of the DO. This paper presents SecOutPIR, a novel outsourced PIR system that addresses two key challenges: privacy preservation for the DO and access control for the DU. SecOutPIR integrates attribute-based encryption for fine-grained retrieval access control, ensuring that only DUs with valid retrieval privileges can access the data, and uses a decentralized identity management system based on decentralized identifiers and verifiable credentials to authenticate DU requests. The proposed system protects the DO's data privacy during both storage and retrieval, and ensures that only authorized DUs can issue retrieval requests, thus preventing unauthorized access. We provide a detailed description of the system model and security requirements, together with an in-depth security analysis. Furthermore, experimental results demonstrate that SecOutPIR significantly enhances the practicality and efficiency of PIR in outsourced settings by enabling fine-grained retrieval access control without degrading query performance. Our implementation shows that the server reply time grows with the dataset size, from 82.5 ms (1000 entries) to 113.8 ms (2000 entries) and 199.6 ms (5000 entries), while the query generation time remains approximately constant at around 2.0 ms.
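The hiding property that PIR provides can be illustrated with the classic two-server XOR scheme; this is a textbook sketch, not SecOutPIR itself (which is a single-server outsourced protocol with attribute-based access control). All names below are illustrative.

```python
import secrets

def pir_query(db_size: int, index: int):
    """Client: build two queries whose XOR isolates `index`.
    Each query on its own is uniformly random, so neither server
    learns which record the client wants."""
    q1 = [secrets.randbits(1) for _ in range(db_size)]
    q2 = q1.copy()
    q2[index] ^= 1  # differs from q1 only at the desired index
    return q1, q2

def pir_answer(db: list, query: list) -> int:
    """Server: XOR together the records selected by the query bits."""
    acc = 0
    for bit, record in zip(query, db):
        if bit:
            acc ^= record
    return acc

def pir_reconstruct(a1: int, a2: int) -> int:
    """Client: XOR the two answers; every record except db[index]
    appears in both sums and cancels."""
    return a1 ^ a2

db = [7, 13, 42, 99]
q1, q2 = pir_query(len(db), 2)
assert pir_reconstruct(pir_answer(db, q1), pir_answer(db, q2)) == 42
```

The scheme is information-theoretically private as long as the two servers do not collude, which is exactly the assumption single-server systems like the one described avoid by using cryptographic hardness instead.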
- New
- Research Article
- 10.3897/biss.10.183778
- Dec 29, 2025
- Biodiversity Information Science and Standards
- Robert Lewis + 4 more
Background and Rationale Despite decades of progress in ecological monitoring, primary biodiversity and environmental data remain unevenly mobilised and poorly interoperable (Hampton et al. 2015, Poisot et al. 2019). Datasets, often gathered with public funds, frequently remain inaccessible or insufficiently described, limiting their reuse in global syntheses (Culina et al. 2018). Ecologists' concerns about trust, transparency, and control of shared data persist, particularly where data production is resource-intensive or socially embedded. These concerns echo the foundational properties of distributed ledgers, where ownership and governance are distributed across peer networks rather than centralised repositories (Lewis et al. 2023). Forests exemplify both the potential and the challenge of such decentralised infrastructures. As globally significant carbon and biodiversity reservoirs, forests are also deeply fragmented across ownership and jurisdictional boundaries. In Europe alone, over half of forested land is privately owned, yet these actors often lack mechanisms to derive tangible value from stewardship. At the same time, digital twins (macroecological models), which integrate in situ and remotely sensed data, are becoming central to forest policy and monitoring frameworks (e.g., the Food and Agriculture Organization of the United Nations (FAO), the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES), and the Global Biodiversity Framework (GBF)). ForestWeb3 (FW3) hypothesises that a decentralised, Findable, Accessible, Interoperable, Reusable (FAIR; Wilkinson et al. 2016, Nosek et al. 2022)-aligned data network can unlock the latent value of underused biodiversity data while building trust and incentives for participation.
Objectives Mobilisation and harmonisation of forest biodiversity and environmental data (Objective 1): to spearhead a shift from data curation to data stewardship through a decentralised data infrastructure built on open-source blockchain frameworks. Incentivisation and uptake (Objective 2): to design transnational pathways through which private forest owners and local communities can be economically rewarded for verifiable ecological data via nature-backed digital assets and regenerative finance (ReFi) mechanisms. Together, these objectives align technical innovation (Objective 1) with behavioural and economic motivation (Objective 2), establishing the groundwork for distributed biodiversity observatories capable of sustaining long-term ecological data flows. Methodological Approach WP 1 develops a blockchain-based data ledger with smart contracts that autonomously manage data registration, access control, and reuse. Metadata and identifiers are immutably recorded on-chain, while primary datasets remain decentralised on contributor-managed nodes. This architecture enables contributors to retain data sovereignty while ensuring transparency and traceability in reuse transactions. WP 2 extends the infrastructure to real-time environmental sensing through the integration of modular Internet of Things (IoT)-based microclimate sensors. These devices stream environmental data at high temporal resolution directly into the distributed ledger, forming a Decentralized Physical Infrastructure Network (DePIN) for ecological data. WP 3 links these data streams to the creation of digital twins of forest ecosystems, combining in situ biodiversity observations with satellite and climate datasets to model ecosystem integrity. These models underpin the valuation of nature-backed digital assets, a form of tokenised evidence of ecological performance, providing the data foundation for voluntary biodiversity and carbon markets.
Finally, WP 4 investigates forest owners' perceptions, motivations, and barriers to adopting regenerative finance (ReFi)-based conservation mechanisms. Through interviews and a pan-European survey, it explores how varying sociocultural and institutional contexts shape engagement with emerging biodiversity credit schemes, drawing parallels to established Payment for Ecosystem Services frameworks (Kaiser et al. 2021). Significance and Legacy FW3 exemplifies the convergence of data decentralisation, digital sensing, and regenerative economics, a triad capable of transforming how ecological knowledge is produced, verified, and valued. By embedding data provenance and attribution within the infrastructure itself, the project addresses long-standing issues of trust and recognition in ecological data sharing. Its incentive mechanisms offer pathways to decouple conservation finance from traditional public funding, potentially scaling stewardship and democratising data mobilisation across millions of hectares of privately owned forest land. The project's legacy lies in demonstrating that data infrastructures can be both scientific and economic commons, capable of sustaining biodiversity monitoring through distributed participation. Beyond its immediate technical deliverables, ForestWeb3 contributes to a broader vision of dynamic, self-sustaining ecological data ecosystems that power both global biodiversity frameworks and locally grounded conservation action.
- Research Article
- 10.58325/ijisct.005.01.00147
- Dec 15, 2025
- International Journal of Information Systems and Computer Technologies
- Nimra Sajjad Hussain + 5 more
Edge computing has become an essential paradigm for real-time data processing in distributed environments, allowing data to be processed near its sources. However, this move towards decentralized computing raises major security and trust concerns, especially regarding data transmission in a decentralized network. This study presents a blockchain-based framework for secure data transmission in edge computing that addresses data integrity, confidentiality, and access control issues. The framework is based on a multi-layered architecture using smart contracts, role-based permissioning, hybrid data classification (on-chain, off-chain, and oracle-based), and secure communication protocols. It provides a tamper-resistant, auditable, and decentralized data transmission design appropriate for resource-constrained and dynamic edge networks. The framework is assessed against the ISO/IEC 27001 information security standard and validated by expert opinion, demonstrating strong alignment with best practices and practical feasibility. The framework can increase data security, minimize dependence on central authorities, and enable scalability and resilience for edge computing applications. The study provides a foundational solution for implementing trustworthy, decentralized data management frameworks in emerging edge-based applications.
- Research Article
- 10.55208/bistek.v18i2.415
- Dec 2, 2025
- Majalah Bisnis & IPTEK
- Asep Ririh Riswaya + 1 more
The patient data management system in healthcare services currently faces serious challenges regarding data security and availability. Many system users can access patient information without proper authorization, which increases the risk of sensitive data leaks and misuse of information. Doctors have not fully implemented comprehensive security protocols to maintain the confidentiality and integrity of data, further worsened by the lack of authentication mechanisms and staff education on digital security. Additionally, disruptions to the system can lead to downtime, reducing data availability and directly impacting healthcare operations. This research aims to enhance the efficiency of healthcare services in medical practices by developing a web-based data management system that presents accurate and up-to-date information. The system development follows the Waterfall method and incorporates One Time Password (OTP) security features to ensure controlled and secure data access. The implementation of this system is expected to improve the quality of healthcare services by speeding up service processes while safeguarding patient information.
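The abstract does not specify which OTP algorithm the system uses; as an assumption, a counter-based one-time password in the style of RFC 4226 (HOTP) can be sketched as follows. All names here are illustrative.

```python
import hmac
import hashlib
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226-style HOTP: HMAC-SHA1 over an 8-byte counter,
    dynamically truncated to a short numeric code."""
    msg = struct.pack(">Q", counter)                  # big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def verify_otp(secret: bytes, counter: int, submitted: str) -> bool:
    """Constant-time comparison, so timing does not leak the code."""
    return hmac.compare_digest(hotp(secret, counter), submitted)

# RFC 4226 test-vector secret; counter 0 yields "755224".
secret = b"12345678901234567890"
assert verify_otp(secret, 0, "755224")
```

In a deployment like the one described, the server would generate the code, deliver it out of band (e.g., SMS or email), and increment the counter on each successful login.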
- Research Article
- 10.59395/ijadis.v6i3.1387
- Dec 1, 2025
- International Journal of Advances in Data and Information Systems
- Herman Herman + 2 more
This research studies the implementation of decentralized applications (DApps) that combine blockchain technology and IPFS for storing patient medical data. The goal is to improve the security, transparency, and access control of stored medical data, ensuring that only legitimate users can access it. The proposed system uses smart contracts on the Ethereum network to manage the access rights of users (doctors, patients, and admins) and to guarantee data integrity through the blockchain's immutability. Patient medical records are kept in IPFS and traced using their Content Identifier (CID). Implementation results show that the system can safely process medical information, keep patients in full control of their data, and restrict data access to scheduled time windows. The system also demonstrates the potential of blockchain- and IPFS-based applications for building a more efficient health ecosystem focused on safeguarding people's data.
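The core pattern here, content-addressed storage plus an on-chain access list, can be sketched in miniature. This is a toy analogue, not the paper's Solidity/IPFS code: a SHA-256 digest stands in for the IPFS CID, and a Python class stands in for the smart contract.

```python
import hashlib

def make_cid(content: bytes) -> str:
    """Stand-in for an IPFS CID: a digest derived from the content itself,
    so any tampering changes the identifier."""
    return hashlib.sha256(content).hexdigest()

class RecordRegistry:
    """Toy analogue of the access-control smart contract: maps each
    record's CID to the roles allowed to read it."""
    def __init__(self):
        self._acl = {}    # cid -> set of allowed roles
        self._store = {}  # cid -> content (stand-in for IPFS)

    def register(self, content: bytes, allowed_roles: set) -> str:
        cid = make_cid(content)
        self._acl[cid] = set(allowed_roles)
        self._store[cid] = content
        return cid

    def fetch(self, cid: str, role: str) -> bytes:
        if role not in self._acl.get(cid, set()):
            raise PermissionError(f"role {role!r} may not read {cid[:8]}")
        content = self._store[cid]
        # Integrity check: the content must still hash to its CID.
        assert make_cid(content) == cid
        return content
```

The real system gains two things this sketch only gestures at: the ACL lives on an immutable ledger rather than in mutable memory, and the content lives on a distributed network rather than one node.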
- Research Article
- 10.23939/csn2025.02.050
- Dec 1, 2025
- Computer systems and network
- O Deineka + 1 more
In a world where the amount of electronic data is growing at a rapid pace every day, businesses face a new challenge: how to maintain control over information, make it secure, yet at the same time accessible and useful. The authors of the article see the answer to this question in the implementation of an ITSM component that ensures compliance with the international SOC 2 Type 2 standard. This standard is a kind of "trust mark" for companies, as it confirms their ability to guarantee data security, confidentiality, and integrity. A key element of the proposed methodology is data classification – a process that allows organizations to identify the most sensitive data and determine the appropriate level of protection. On this basis, a comprehensive information management system is built, which includes data collection, processing, storage, incident response, and access control. Importantly, the approach integrates the best practices of ITSM and ITIL: incident management, change management, knowledge management, and access management, thereby creating a transparent and controlled ecosystem. Special emphasis is placed on the use of automation and intelligent technologies – from large language models for data analysis to ETL processes that ensure information hygiene. The methodology also introduces important roles such as data owner and data steward, who are responsible for accuracy, relevance, and compliance with SOC 2 Type 2 standards. The proposed approach not only reduces risks and ensures regulatory compliance but also increases client and partner trust while strengthening the culture of security within the organization. Despite challenges – such as the need for continuous monitoring and balancing between convenience and security – the methodology demonstrates how a properly implemented ITSM component can become a strategic advantage for business in the era of digital transformation. 
Keywords: SOC 2 Type 2, ITSM, ITIL, data classification, data storage, data processing, data security, access control, information management.
- Research Article
- 10.3390/math13223686
- Nov 17, 2025
- Mathematics
- Kisung Park
The Industrial Internet of Things (IIoT) integrates a wide range of devices and identities, making the protection of sensitive industrial data a critical challenge. However, existing centralized systems still face limitations such as single points of failure, inefficient identity authentication, and dependence on trusted third parties (TTPs). To address these issues, we present a blockchain-based authentication and data access control scheme for IIoT systems. The proposed scheme eliminates TTP involvement by employing decentralized identifiers (DIDs) and key-aggregate searchable encryption (KASE), enabling scalable authentication without requiring all industrial data to be stored on the blockchain. Security robustness is demonstrated through informal analysis, the Real-or-Random (ROR) model, and the AVISPA simulation tool (v1.6). Furthermore, performance evaluation using the Multiprecision Integer and Rational Arithmetic Cryptographic Library (MIRACL) SDK shows that the proposed scheme achieves computational efficiency compared with existing solutions. Overall, the results confirm that the proposed scheme provides secure, efficient, scalable, and TTP-free data management for IIoT environments.
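The searchable-encryption half of KASE can be illustrated with a minimal symmetric index: the server stores deterministic keyword tags and matches trapdoors without ever seeing plaintext keywords. This sketch omits the key-aggregation part entirely and is not the paper's construction; all names are illustrative.

```python
import hmac
import hashlib
import secrets

def tag(key: bytes, keyword: str) -> str:
    """Deterministic keyword tag: the server can match tags,
    but cannot invert them back to keywords without the key."""
    return hmac.new(key, keyword.encode(), hashlib.sha256).hexdigest()

class EncryptedIndex:
    """Server side: maps tags to document ids; learns tags, not keywords."""
    def __init__(self):
        self._index = {}

    def add(self, t: str, doc_id: str):
        self._index.setdefault(t, []).append(doc_id)

    def search(self, trapdoor: str):
        return self._index.get(trapdoor, [])

key = secrets.token_bytes(32)   # data owner's secret key
idx = EncryptedIndex()
idx.add(tag(key, "turbine"), "doc-17")
idx.add(tag(key, "pressure"), "doc-17")

# An authorized user derives a trapdoor; the server resolves it blindly.
assert idx.search(tag(key, "turbine")) == ["doc-17"]
```

Deterministic tags leak search patterns (repeated queries are linkable), one of the trade-offs schemes like KASE manage with more sophisticated machinery.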
- Research Article
- 10.1007/s41019-025-00317-7
- Nov 17, 2025
- Data Science and Engineering
- Achraf Hmimou + 4 more
Abstract Data spaces have recently emerged as an innovative paradigm for cross-organizational data sharing. These decentralized environments require sophisticated data governance protocols to ensure compliance with data standards, roles, and policies. While current policy-based solutions address the enforcement of data access control and usage rights, they lack mechanisms for automated data validation, which is essential for ensuring data quality for collaborative analytics. To address this gap, we present a knowledge graph-based framework to automate data validation in line with data policies. This framework relies on the concept of policy checkers, which represent high-level, technology-agnostic data validation plans that can be dynamically translated into technology-specific user-defined functions (UDFs) for compliance checking. Importantly, using knowledge graphs to describe the policy checkers enhances the transparency and traceability of data validation processes, while the two-stage process (technology-agnostic policy checkers and technology-specific UDFs) accommodates data validation on multimodal data. We accompany the description of our approach with a proof of concept that demonstrates the feasibility of this solution in real data spaces.
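The two-stage idea, a declarative checker compiled into an executable UDF, can be made concrete with a small sketch. Here a plain dict stands in for the knowledge-graph-described policy checker, and the compiled Python function stands in for the technology-specific UDF; the rule names are assumptions, not the paper's vocabulary.

```python
def compile_checker(policy: dict):
    """Stage 2 of the described design: translate a technology-agnostic
    policy checker (here a dict) into an executable UDF."""
    field = policy["field"]
    checks = []
    if "not_null" in policy.get("rules", []):
        checks.append(lambda v: v is not None)
    if "min" in policy:
        lo = policy["min"]
        checks.append(lambda v: v is not None and v >= lo)
    if "max" in policy:
        hi = policy["max"]
        checks.append(lambda v: v is not None and v <= hi)

    def udf(record: dict) -> bool:
        # A record complies only if every compiled check passes.
        value = record.get(field)
        return all(check(value) for check in checks)
    return udf

# Example checker: temperature readings must exist and fall in a range.
temp_ok = compile_checker(
    {"field": "temperature", "rules": ["not_null"], "min": -40, "max": 60})
```

The same declarative checker could just as well be compiled into a Spark UDF or a SQL constraint, which is the point of keeping stage one technology-agnostic.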
- Research Article
- 10.1007/s11704-025-41356-7
- Nov 11, 2025
- Frontiers of Computer Science
- Yan-Qing Yao + 5 more
Traceable and revocable multi-authority ABE supporting decryption outsourcing and policy update for cloud data access control
- Research Article
- 10.1109/mce.2024.3524750
- Nov 1, 2025
- IEEE Consumer Electronics Magazine
- Xinlei Sheng + 4 more
Verifiable Private Data Access Control in Consumer Electronics for Smart Cities
- Research Article
- 10.3389/fcomp.2025.1670473
- Oct 29, 2025
- Frontiers in Computer Science
- Wafa Shujaa + 2 more
Blockchain technology has emerged as a potential solution for securing the rapidly expanding Internet of Things (IoT). This review critically analyzes 49 recent scientific publications to assess the current state of blockchain-based IoT security. We examine the strengths and weaknesses of various approaches, focusing on their ability to address data integrity, authentication, and access control vulnerabilities. The review identifies persistent challenges related to scalability, energy efficiency, and privacy, and proposes actionable future research directions. These directions include the development of context-aware security protocols, adaptive trust models, and privacy-preserving analytics techniques. This paper provides a valuable resource for researchers seeking to advance the field of blockchain-based IoT security.
- Research Article
- 10.22399/ijcesen.4176
- Oct 24, 2025
- International Journal of Computational and Experimental Science and Engineering
- N V L Kashyap Mulukutla
Healthcare organizations today face the critical challenge of harnessing the transformative power of data analytics while maintaining absolute commitment to patient privacy and regulatory compliance. This article examines the complex landscape where healthcare innovation intersects with privacy protection, exploring how organizations can successfully navigate regulatory requirements such as HIPAA while pursuing data-driven insights that improve patient outcomes. The article begins by establishing the fundamental importance of patient trust and the severe consequences that can result from privacy breaches, including financial penalties, reputational damage, and erosion of the patient-provider relationship. Through a comprehensive examination of technical safeguards, process-oriented protections, and organizational governance strategies, the article demonstrates that effective privacy protection requires a multi-layered approach encompassing data anonymization techniques, encryption protocols, access controls, and staff training programs. Real-world case studies illustrate how healthcare institutions have successfully implemented privacy-preserving analytics frameworks that enable collaborative research, support clinical decision-making, and drive operational improvements without compromising patient confidentiality. The article extends to emerging technologies and future considerations, addressing challenges posed by artificial intelligence, Internet of Things devices, and cross-institutional data sharing initiatives. Key findings emphasize that privacy protection and analytical innovation are not mutually exclusive objectives, but rather complementary elements that together strengthen healthcare delivery systems. 
The article concludes that organizations adopting privacy-by-design principles, establishing robust governance frameworks, and maintaining transparent communication with patients will be best positioned to realize the full potential of healthcare analytics while preserving the trust that forms the foundation of effective patient care.
- Research Article
- 10.1109/jiot.2025.3598320
- Oct 15, 2025
- IEEE Internet of Things Journal
- Farooq Ahmed + 5 more
Enhancing Healthcare Data Integrity and Access Control Using Blockchain and Industry 5.0
- Research Article
- 10.38007/ijbmet.2025.060115
- Oct 15, 2025
- International Journal of Business Management and Economics and Trade
Research on Secure Data Notarization and Access Control Algorithms for Supply Chain Finance Based on an On-Chain/Off-Chain Hybrid Storage Architecture and Smart Contracts
- Research Article
- 10.69996/fmep.2025021
- Sep 30, 2025
- Journals Fringe Multi-Engineering Proceedings
- Shiva Narayana Reddy V
In the digital era, identity verification processes such as Know Your Customer (KYC) are critical for ensuring the legitimacy of users in financial and governmental systems. However, conventional KYC mechanisms suffer from centralized storage models, making sensitive user data vulnerable to breaches, unauthorized access, and misuse. This paper presents a blockchain-based solution for secure KYC data sharing, aiming to enhance privacy, transparency, and data control for users. Leveraging the decentralized and immutable nature of blockchain technology, the proposed system enables secure storage, retrieval, and access control of user identity data through smart contracts. The model integrates a notification mechanism that alerts users via email whenever their data is accessed, thereby reinforcing transparency and accountability. Through a series of simulations and user interface walkthroughs, the effectiveness of the model is demonstrated in enabling only authorized institutions to access KYC data while safeguarding it against tampering or leakage.
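The described flow, whitelist-gated access plus a per-access notification to the data subject, can be sketched in miniature. This is a toy in-memory analogue, not the paper's smart-contract code; a list of messages stands in for the email channel, and all names are illustrative.

```python
from datetime import datetime, timezone

class KycVault:
    """Toy sketch: only whitelisted institutions may read a user's KYC
    record, and every access (granted or denied) appends a notification
    that the real system would email to the user."""
    def __init__(self, owner: str, record: dict, authorized: set):
        self.owner = owner
        self._record = record
        self._authorized = set(authorized)
        self.notifications = []   # stand-in for the email alert channel

    def access(self, institution: str) -> dict:
        stamp = datetime.now(timezone.utc).isoformat()
        if institution not in self._authorized:
            self.notifications.append(
                f"{stamp}: DENIED access attempt by {institution}")
            raise PermissionError(institution)
        self.notifications.append(
            f"{stamp}: {institution} read KYC record of {self.owner}")
        return dict(self._record)  # copy, so callers cannot mutate the vault
```

On a blockchain, the notification log would be an immutable event stream, which is what gives the scheme its accountability: a user can audit every access after the fact.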
- Research Article
- 10.36690/2733-2039-2025-3-4-16
- Sep 30, 2025
- Pedagogy and Education Management Review
- Igor Korzhevskyi
Corporate reputation has shifted from a vague intangible to a measurable strategic resource exposed to disinformation waves, generative-AI risks, cyberattacks, and tighter disclosure/privacy rules. In this context, reputation assessors require integrated competencies that combine data analytics, ethics and law, risk management, and operational execution. The article designs and empirically validates an evidence-based, five-factor competency framework - Data & Intelligence; Ethics/Law/Governance; Risk & Resilience; Strategy & Stakeholders; Assurance & Performance - links these competencies to measurable outcomes (accuracy, time-to-decision, incident severity, stakeholder trust), and delivers practical instruments (validated scale, training pathways, governance templates, explainability artifacts, and benchmark datasets). An explanatory, sequential mixed-methods program integrates scoping review and expert Delphi, psychometric development (EFA/CFA, reliability, convergent/discriminant validity, measurement invariance), field measures and crisis simulations, quasi-experimental evaluations of analytics and governance (event studies, difference-in-differences, synthetic controls), randomized usability tests of XAI artifacts, and A/B studies on data-governance ROI. The model exhibits strong fit, reliability, and cross-industry/language invariance; higher competency levels are associated with greater assessment accuracy and trust, faster decisions, and lower incident severity. Quasi-experimental estimates indicate that adopting NLP/graph analytics and implementing MRM controls causally reduces time-to-detect, peak severity, and market impact. SHAP summaries paired with model cards improve practitioner comprehension and decision readiness, while data lineage, DQ rules, and access controls enhance model performance, auditability, and evidentiary robustness with minimal privacy-driven utility loss. Targeted micro-credentials produce durable gains across domains. 
A competency-centric, analytics-enabled, governance-anchored approach transforms reputation assessment into a managed, auditable discipline that organizations can operationalize immediately through the provided scale, governance templates, explainability playbooks, and open benchmarks.
- Research Article
- 10.1142/s0218126625504456
- Sep 27, 2025
- Journal of Circuits, Systems and Computers
- Yuan Ai + 3 more
With the advent of big data and cloud computing, enterprises, organizations and individuals are increasingly interconnected, leading to a geometric increase in data resource sharing among institutions. However, this trend raises significant concerns regarding user data security and access control, particularly within power systems. This paper proposes leveraging deep neural network models integrated with blockchain technology to process and analyze security information in power big data. We introduce an enhanced approach by combining the Hopfield Neural Network (HNN) with the Simulated Annealing (SA) algorithm, addressing the limitations inherent in the traditional HNN model. Our proposed framework, SA-HNN, is designed to improve the adaptive capabilities of power systems through blockchain-based security strategies. In our experiments, we compared SA-HNN with other machine learning models, including XGBoost, LightGBM and Linear SVC, focusing on storage time and data integrity. The results indicate that SA-HNN outperforms these models in both metrics. Specifically, during a power system security defense test, SA-HNN achieved an algorithm recognition rate exceeding 95%. Furthermore, when evaluating the average transaction time consumption in power blockchain transactions, SA-HNN demonstrated superior performance, handling large volumes of transaction data efficiently with shorter processing times. In terms of user attribute revocation efficiency, SA-HNN exhibited greater file processing capacity and shorter time performance compared with other models. This research highlights the potential of integrating advanced neural networks with blockchain technology to enhance the security and efficiency of power systems. Future work will focus on further refining these models and exploring their applications in broader contexts.
- Research Article
- 10.59573/emsj.9(5).2025.35
- Sep 11, 2025
- European Modern Studies Journal
- Rajkumar Sekar
Financial technology institutions face growing challenges in securing sensitive data within serverless architectures due to reduced infrastructure visibility, ephemeral processing environments, and complex regulatory requirements. This work presents an integrated framework that combines automation of sensitive data detection, classification, and access controls specifically tailored for serverless financial applications. Machine learning techniques enable real-time identification of sensitive information across distributed environments, while automated classification systems organize data according to regulatory and organizational policies. Intelligent access control mechanisms, including role-based and attribute-based models, provide enforcement capabilities adapted to ephemeral computing contexts. Cloud-native security implementations leverage provider services and event-driven architectures to create comprehensive protection throughout data lifecycles. Evaluation through case studies demonstrates significant improvements in security posture, compliance adherence, and operational efficiency compared to manual approaches. The integration of these automated components creates a cohesive security framework that addresses the unique challenges of protecting financial data in dynamic serverless environments.
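The detection-then-classification pipeline can be sketched with pattern rules. The paper describes machine-learning detection; the regexes and policy labels below are simplifying assumptions that just make the pipeline's shape concrete.

```python
import re

# Hypothetical detectors; a real deployment would use ML classifiers
# as the abstract describes, not just regexes.
PATTERNS = {
    "ssn":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "card":  re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

# Hypothetical policy mapping each detected kind to a sensitivity tier.
POLICY = {"ssn": "restricted", "card": "restricted", "email": "internal"}

def classify(payload: str) -> str:
    """Return the strictest classification any detector triggers,
    so downstream access controls can enforce the right tier."""
    hits = [kind for kind, pat in PATTERNS.items() if pat.search(payload)]
    levels = {POLICY[h] for h in hits}
    if "restricted" in levels:
        return "restricted"
    if "internal" in levels:
        return "internal"
    return "public"
```

In a serverless setting this function would run as an event-driven hook on each write, tagging objects so that role- or attribute-based policies can be applied to them automatically.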
- Research Article
- 10.1016/j.jisa.2025.104163
- Sep 1, 2025
- Journal of Information Security and Applications
- Suriya U-Ruekolan + 3 more
Enforcing data access control and privacy: The graph-driven data regulatory approach
- Research Article
- 10.69996/jcesh.2025009
- Aug 31, 2025
- Journal of Computing in Education, Sports and Health
- Ramana Reddy N
Cybersecurity in kids' applications is a critical concern in today's digitally connected world, where children increasingly access learning, gaming, and social platforms through mobile and web-based applications. These applications often collect sensitive data such as names, locations, behavior patterns, and usage history, making them potential targets for cyber threats. This paper presents Cyber Kids, a secure, gamified educational platform designed to enhance cybersecurity awareness among children through interactive learning. The system integrates a novel cryptographic framework called SmartCrypt, which utilizes symmetric homomorphic encryption for fine-grained, flexible data access control, ensuring the confidentiality of user data even during processing. To further strengthen data integrity and source authentication, a Homomorphic Message Authentication Code (HomMAC) mechanism is introduced, enabling secure verification of encrypted data streams. The platform features engaging quiz-based challenges and educational games, where access is granted based on correct responses, reinforcing key cybersecurity concepts. Simulation results demonstrate the effectiveness of the platform in terms of learning accuracy, user engagement, and system performance, with high reliability and minimal latency. This work showcases the potential of combining advanced cryptographic techniques with gamified learning to foster safe digital behavior and early cybersecurity literacy in children.
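The property "processing without decrypting" that symmetric homomorphic encryption provides can be illustrated with the simplest additively homomorphic scheme: one-time additive masking mod M. This toy is not SmartCrypt (whose construction the abstract does not detail) and provides no HomMAC; the modulus choice is arbitrary.

```python
import secrets

M = 2 ** 32  # toy plaintext/ciphertext modulus

def enc(key: int, m: int) -> int:
    """One-time additive encryption: c = m + k (mod M)."""
    return (m + key) % M

def dec(key: int, c: int) -> int:
    return (c - key) % M

# Homomorphic addition: adding ciphertexts adds the plaintexts,
# so a server can aggregate values it cannot read. Decryption
# needs the sum of the keys, which only the key holder can form.
k1, k2 = secrets.randbelow(M), secrets.randbelow(M)
c = (enc(k1, 20) + enc(k2, 22)) % M
assert dec((k1 + k2) % M, c) == 42
```

Each key must be used once, like a one-time pad; practical symmetric homomorphic schemes derive per-message keys from a master secret so the key holder can reconstruct the aggregate key cheaply.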