Abstract

With the growing significance of Credit Risk Analysis (CRA) under privacy constraints, there is a pressing demand for a Privacy-Preserving Machine Learning (PPML) decision support system. In this context, we introduce a framework for privacy-preserving credit risk analysis that applies Homomorphic Encryption-aware Logistic Regression (HELR) to encrypted data. The implementation uses the TenSEAL and Torch libraries for Logistic Regression (LR), integrating the proposed HELR with polynomial degrees 3 and 5 across the German, Taiwan, Japan, and Australian credit datasets. The presented model yields satisfactory results compared with non-Homomorphic Encryption (HE) models, with an accuracy difference of only 0.5% to 7.8%. Notably, HELR_g5 outperforms HELR_g3, exhibiting a higher Area Under the Curve (AUC) value. A thorough security analysis further indicates that the proposed system is resilient against various privacy attacks, including poisoning attacks, evasion attacks, membership inference attacks, model inversion attacks, and model extraction attacks at different stages of the machine learning pipeline. Finally, the comparative analysis highlights that the proposed model ensures data privacy, encompassing training privacy and model privacy during the training phase as well as input and output privacy during the inference phase, a level of privacy not achieved by existing systems.
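As a rough illustration of the kind of pipeline the abstract describes, the sketch below shows logistic regression inference over a CKKS-encrypted feature vector using TenSEAL and Torch, with the sigmoid replaced by a low-degree polynomial so it can be evaluated homomorphically. The class name, encryption parameters, feature count, and polynomial coefficients are illustrative assumptions and are not taken from the paper; the paper's degree-5 variant (HELR_g5) would simply use a higher-order polynomial than the degree-3 one shown here.

```python
import tenseal as ts
import torch


class EncryptedLR:
    """Hypothetical sketch of HE-aware logistic regression inference:
    plaintext weights from a trained Torch model are applied to a
    CKKS-encrypted feature vector, and the sigmoid is approximated by
    a degree-3 polynomial (coefficients are illustrative)."""

    def __init__(self, torch_lr: torch.nn.Linear):
        # Extract plaintext weights and bias from the trained model.
        self.weight = torch_lr.weight.data.tolist()[0]
        self.bias = torch_lr.bias.data.tolist()

    def forward(self, enc_x: ts.CKKSVector) -> ts.CKKSVector:
        # Encrypted dot product + bias, then a polynomial stand-in for
        # sigmoid: 0.5 + 0.197*x - 0.004*x^3.
        enc_out = enc_x.dot(self.weight) + self.bias
        return enc_out.polyval([0.5, 0.197, 0, -0.004])


# CKKS context (assumed parameters with enough multiplicative depth
# for one dot product plus the degree-3 polynomial).
ctx = ts.context(
    ts.SCHEME_TYPE.CKKS,
    poly_modulus_degree=8192,
    coeff_mod_bit_sizes=[40, 21, 21, 21, 21, 21, 21, 40],
)
ctx.global_scale = 2 ** 21
ctx.generate_galois_keys()

n_features = 20                                     # illustrative feature count
model = EncryptedLR(torch.nn.Linear(n_features, 1))
enc_sample = ts.ckks_vector(ctx, [0.1] * n_features)  # client encrypts one applicant's features
enc_score = model.forward(enc_sample)                 # server scores without seeing the data
print(enc_score.decrypt())                            # only the client can decrypt the risk score
```

In this sketch the server never sees plaintext features or predictions, which corresponds to the input and output privacy the abstract claims for the inference phase; training-side privacy would additionally require the gradient updates themselves to be computed over encrypted data.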
