Abstract

Credit card defaults cost the economy tens of billions of dollars every year, yet financial institutions rarely collaborate to build more comprehensive models because of legal regulations and competition. Federated XGBoost is an emerging paradigm that enables several companies to build a classification model cooperatively without transferring their local data to one another. Conventional Federated XGBoost, however, suffers severely from inverse inference based on splitting-node selection and from the class imbalance problem. Exploiting the characteristics of splitting-point selection, we propose an optional splitting-extraction model that reduces the risk of leaking raw data statistics. Moreover, an adjusted AUPRC (the area under the precision-recall curve) is introduced into the gain function to alleviate the class imbalance problem. Our experimental results show that recall and AUPRC increase by 7%-10% and 4%-8%, respectively, without sacrificing other metrics compared to the existing state of the art. Furthermore, the number of communication iterations also decreases significantly under our proposed method.
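As context for the metric the abstract builds into the gain function, the following is a minimal sketch of how AUPRC can be computed for an imbalanced binary classification task. The function name, data, and step-wise summation style are illustrative assumptions, not taken from the paper's method.

```python
# Illustrative sketch (not the paper's adjusted gain function):
# AUPRC via step-wise precision * delta-recall summation.

def auprc(y_true, scores):
    """Area under the precision-recall curve for binary labels."""
    # Rank examples by predicted score, highest first.
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    total_pos = sum(y_true)
    tp = fp = 0
    area = 0.0
    prev_recall = 0.0
    for i in order:
        if y_true[i] == 1:
            tp += 1
        else:
            fp += 1
        precision = tp / (tp + fp)
        recall = tp / total_pos
        # Accumulate precision weighted by the change in recall.
        area += precision * (recall - prev_recall)
        prev_recall = recall
    return area

# Toy imbalanced example: 2 positives among 8 labels, with one
# negative ranked above the second positive.
labels = [0, 0, 1, 0, 0, 0, 1, 0]
scores = [0.1, 0.8, 0.9, 0.3, 0.2, 0.1, 0.7, 0.4]
print(round(auprc(labels, scores), 3))  # → 0.833
```

Unlike AUROC, this measure focuses on the positive (minority) class, which is why it is a natural fit for default prediction, where defaulters are rare.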

