Abstract
Federated learning has shown unique advantages in many tasks, including brain image analysis. It provides a new way to train deep learning models while protecting the privacy of medical image data held at multiple sites. However, previous studies suggest that domain shift across sites can degrade the performance of federated models. As a solution, we propose a gradient matching federated domain adaptation (GM-FedDA) method for brain image classification, which aims to reduce domain discrepancy with the assistance of a public image dataset and to train robust local federated models for target sites. The method consists of two stages. 1) Pretraining stage: we propose a one-common-source adversarial domain adaptation (OCS-ADA) strategy, i.e., adversarial domain adaptation with a gradient matching loss, to pretrain encoders that reduce domain shift at each target site (private data) with the assistance of a common source domain (public data). 2) Fine-tuning stage: we develop a gradient matching federated (GM-Fed) fine-tuning method to update the local federated models pretrained with OCS-ADA, i.e., pushing the optimization direction of each local federated model toward its own local minimum by minimizing a gradient matching loss between sites. Using fully connected networks as local models, we validate our method on the diagnostic classification of schizophrenia and of major depressive disorder from multisite resting-state functional MRI (fMRI). Results show that the proposed GM-FedDA method outperforms other commonly used methods, suggesting its potential in brain imaging analysis and in other fields that need to use multisite data while preserving data privacy.
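Since the abstract does not give the exact form of the gradient matching loss, the following PyTorch sketch only illustrates the general idea of the fine-tuning stage: each site combines its local task loss with a penalty that aligns its gradient direction with a gradient vector shared by another site. The function names, the cosine-distance form of the penalty, the weighting factor `lam`, and the tensor sizes in the usage example are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch of gradient matching during local federated fine-tuning.
# ASSUMPTIONS: cosine-distance form of the loss, the weight `lam`, and how
# a remote site's gradient vector is obtained are illustrative choices only.
import torch
import torch.nn as nn
import torch.nn.functional as F


def flat_grad(loss: torch.Tensor, model: nn.Module) -> torch.Tensor:
    """Flatten the gradient of `loss` w.r.t. all trainable parameters."""
    params = [p for p in model.parameters() if p.requires_grad]
    grads = torch.autograd.grad(loss, params, create_graph=True)
    return torch.cat([g.reshape(-1) for g in grads])


def gradient_matching_loss(local_grad: torch.Tensor,
                           remote_grad: torch.Tensor) -> torch.Tensor:
    """Cosine distance between this site's gradient and a gradient vector
    shared by another site (raw data never leaves its own site).
    `remote_grad` must have the same length as the flattened local gradient."""
    return 1.0 - F.cosine_similarity(local_grad, remote_grad.detach(), dim=0)


def local_finetune_step(model, x, y, remote_grad, optimizer, lam=0.1):
    """One local step: task loss plus a gradient matching penalty that pulls
    the local update direction toward agreement with another site."""
    optimizer.zero_grad()
    task_loss = F.cross_entropy(model(x), y)
    gm_loss = gradient_matching_loss(flat_grad(task_loss, model), remote_grad)
    (task_loss + lam * gm_loss).backward()
    optimizer.step()
    return task_loss.item(), gm_loss.item()


# Usage with a small fully connected local model (hypothetical sizes).
model = nn.Sequential(nn.Linear(116, 64), nn.ReLU(), nn.Linear(64, 2))
opt = torch.optim.SGD(model.parameters(), lr=1e-2)
x, y = torch.randn(8, 116), torch.randint(0, 2, (8,))
remote_grad = torch.randn(sum(p.numel() for p in model.parameters()))
local_finetune_step(model, x, y, remote_grad, opt)
```

Exchanging flattened gradient vectors rather than raw fMRI features keeps each site's data local, which is consistent with the privacy-preserving federated setting described above.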