Abstract

Satellite communication and Low Earth Orbit (LEO) satellites are key components of the 6G network and are widely used for Earth observation tasks because of their low cost and short revisit period. However, limitations of onboard satellite systems and downlink bandwidth make it infeasible to download all high-resolution image data to ground stations. Even existing federated learning (FL) methods, which exchange trained model parameters rather than raw data, can become a communication bottleneck as model size grows. To address these challenges, we propose FL-M3D, a federated learning framework for LEO satellite communication that employs multi-round decentralized dataset distillation. Each satellite independently distills its local dataset and transmits the resulting synthetic dataset to the ground station instead of exchanging model parameters, so the communication cost depends only on the size of the synthetic dataset and does not increase with model size. However, the heterogeneity of satellite datasets can cause sample ambiguity and slow model convergence; we therefore design the dataset distillation to mitigate the negative effects of data heterogeneity. Experiments on real-world image datasets show that FL-M3D reduces communication volume in a simulated satellite network by approximately 49.84% while improving model performance.
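To make the communication pattern concrete, below is a minimal sketch, assuming distribution-matching dataset distillation on each satellite and centralized training on the pooled synthetic data at the ground station. This is not the authors' released FL-M3D code; the functions `small_cnn`, `distill`, and `ground_station_train`, the network architecture, and all hyperparameters are illustrative assumptions.

```python
# Sketch of the satellite-side distillation / ground-station training flow.
# Assumption: distribution-matching distillation (match class-wise mean embeddings
# under randomly initialized feature extractors); FL-M3D's exact objective may differ.
import torch
import torch.nn as nn
import torch.nn.functional as F

def small_cnn(channels=1, feat_dim=128):
    # Randomly re-initialized embedding network used only to compare feature statistics.
    return nn.Sequential(
        nn.Conv2d(channels, 32, 3, padding=1), nn.ReLU(), nn.AvgPool2d(2),
        nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        nn.Flatten(), nn.Linear(64, feat_dim),
    )

def distill(local_x, local_y, n_classes, ipc=10, steps=500, lr=0.1):
    """Condense one satellite's local data into `ipc` synthetic images per class."""
    syn_x = torch.randn(n_classes * ipc, *local_x.shape[1:], requires_grad=True)
    syn_y = torch.arange(n_classes).repeat_interleave(ipc)
    opt = torch.optim.SGD([syn_x], lr=lr, momentum=0.5)
    for _ in range(steps):
        net = small_cnn(local_x.shape[1])        # fresh random embedder each step
        loss = 0.0
        for c in range(n_classes):
            real_c = local_x[local_y == c]
            if len(real_c) == 0:                 # class absent under data heterogeneity
                continue
            syn_c = syn_x[syn_y == c]
            # Match mean embeddings of real and synthetic samples per class.
            loss = loss + F.mse_loss(net(real_c).mean(0), net(syn_c).mean(0))
        if torch.is_tensor(loss):
            opt.zero_grad()
            loss.backward()
            opt.step()
    # Only this compact payload is downlinked, independent of the model size.
    return syn_x.detach(), syn_y

def ground_station_train(payloads, model, epochs=20, lr=0.01):
    """Train the global model on the union of synthetic datasets from all satellites."""
    x = torch.cat([p[0] for p in payloads])
    y = torch.cat([p[1] for p in payloads])
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        F.cross_entropy(model(x), y).backward()
        opt.step()
    return model
```

In this sketch the downlink cost is fixed by `n_classes * ipc` synthetic images per satellite per round, which is the property the abstract highlights: the traffic does not grow with the number of model parameters.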
