Abstract

Privacy has raised considerable concern recently, especially with the information explosion and the numerous data mining techniques developed to explore large volumes of data. Such data are often collected and stored across different institutions (banks, hospitals, etc.), a setting termed cross-silo. In this context, cross-silo federated learning has become prominent for tackling privacy issues: only model updates are transmitted from institutions to servers, without revealing institutions' private information. In this paper, we propose a cross-silo federated XGBoost approach to the federated anomaly detection problem, which aims to identify abnormalities in extremely unbalanced datasets (e.g., credit card fraud detection) and can be viewed as a special classification problem. We design two privacy-preserving mechanisms tailored to federated XGBoost: anonymity-based data aggregation and local differential privacy. In the anonymity-based data aggregation scenario, we cluster data into groups and use cluster-level features to train the model. In the local differential privacy scenario, we design a federated XGBoost framework that incorporates differential privacy in parameter transmission. Experimental results on two datasets show the effectiveness of our proposed schemes compared with existing methods.
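The abstract does not specify the exact protocol, but the local differential privacy mechanism it mentions can be sketched as follows: before a silo transmits its per-bin gradient and Hessian sums (the statistics XGBoost aggregates to score splits), it perturbs them with Laplace noise calibrated to a privacy budget. All function names and the sensitivity bounds below are illustrative assumptions, not the paper's actual design.

```python
import numpy as np

# Hypothetical sketch of local differential privacy applied to the
# gradient statistics a silo would transmit during federated XGBoost
# training; the paper's concrete protocol may differ.

def laplace_mechanism(value, sensitivity, epsilon, rng):
    """Add Laplace noise with scale sensitivity/epsilon (standard mechanism)."""
    scale = sensitivity / epsilon
    return value + rng.laplace(loc=0.0, scale=scale)

def privatize_histogram(grad_sums, hess_sums, epsilon, rng,
                        grad_sensitivity=1.0, hess_sensitivity=0.25):
    """Perturb per-bin gradient/Hessian sums before sending them to the server.

    For logistic loss, each sample's gradient lies in [-1, 1] and its
    Hessian in [0, 0.25], which bounds one sample's contribution to a bin
    and hence the sensitivity of each sum.
    """
    # Split the privacy budget equally between the two statistics.
    eps_half = epsilon / 2.0
    noisy_grad = [laplace_mechanism(g, grad_sensitivity, eps_half, rng)
                  for g in grad_sums]
    noisy_hess = [laplace_mechanism(h, hess_sensitivity, eps_half, rng)
                  for h in hess_sums]
    return noisy_grad, noisy_hess

# Example: one silo's 4-bin histogram for a single feature.
rng = np.random.default_rng(0)
grads = [3.2, -1.1, 0.4, 2.0]
hesss = [0.8, 0.5, 0.2, 0.6]
noisy_g, noisy_h = privatize_histogram(grads, hesss, epsilon=1.0, rng=rng)
```

The server then aggregates the noisy histograms across silos and chooses splits as usual; the noise scale grows as the per-round budget `epsilon` shrinks, trading accuracy for privacy.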
