Abstract
The rapid advancement of health monitoring technologies has led to the widespread adoption of fitness training applications that collect and analyze personal health data. This paper presents a two-stage personalized differential privacy-based federated learning (PDP-FL) algorithm. In the first stage, each user's privacy requirement is classified according to their stated preferences, and noise is added locally to match the resulting privacy level, achieving personalized privacy protection; the privacy preference and the corresponding privacy level are then uploaded to the central aggregation server. In the second stage, the server adds noise calibrated to the global differential privacy threshold based on the privacy levels users uploaded, so that the global privacy protection level can be quantified while local and central protection strategies are applied simultaneously to protect the global data. Experimental results demonstrate that the proposed PDP-FL algorithm achieves high classification accuracy. The algorithm addresses the critical issue of health monitoring privacy in fitness training applications: it ensures that sensitive data is handled responsibly and gives users the tools to control their own privacy settings. By achieving high classification accuracy while preserving privacy, the framework balances data utility and protection, benefiting both the health monitoring ecosystem and medical systems.
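To make the two-stage flow described in the abstract concrete, the following Python sketch illustrates one plausible reading of it; it is not the authors' implementation. The mapping from privacy preference to a Laplace noise budget, the clipping bound, and the server-side "top-up" noise rule are all assumptions introduced here for illustration only.

```python
import numpy as np

# Hypothetical mapping from a user's stated preference to a local privacy budget
# (smaller epsilon means stronger privacy, hence more local noise).
PRIVACY_LEVELS = {"low": 8.0, "medium": 4.0, "high": 1.0}

def clip_update(update, clip_norm=1.0):
    """Clip a client's model update to bound its L2 sensitivity."""
    norm = max(np.linalg.norm(update), 1e-12)
    return update * min(1.0, clip_norm / norm)

def local_noisy_update(update, level, clip_norm=1.0):
    """Stage 1 (client side): add personalized noise scaled by the chosen privacy level."""
    epsilon = PRIVACY_LEVELS[level]
    clipped = clip_update(update, clip_norm)
    noise = np.random.laplace(scale=clip_norm / epsilon, size=update.shape)
    return clipped + noise, epsilon

def server_aggregate(noisy_updates, epsilons, global_epsilon=1.0, clip_norm=1.0):
    """Stage 2 (server side): average the client updates, then add extra noise so the
    least-protected client budget is brought down to the global DP threshold."""
    avg = np.mean(noisy_updates, axis=0)
    weakest = max(epsilons)  # largest epsilon = weakest local protection
    if weakest > global_epsilon:
        # Illustrative top-up rule, not a formal DP accounting of the paper's mechanism.
        extra_scale = clip_norm * (1.0 / global_epsilon - 1.0 / weakest) / len(noisy_updates)
        avg += np.random.laplace(scale=extra_scale, size=avg.shape)
    return avg

# Toy round: three clients with different privacy preferences.
rng = np.random.default_rng(0)
updates = [rng.normal(size=10) for _ in range(3)]
levels = ["low", "medium", "high"]
noisy, eps = zip(*(local_noisy_update(u, lvl) for u, lvl in zip(updates, levels)))
global_update = server_aggregate(list(noisy), list(eps), global_epsilon=1.0)
print(global_update)
```

The key design point the sketch captures is that noise is injected twice: once on the client according to each user's own preference, and once at the aggregation server to guarantee that the published global model satisfies a single quantifiable privacy threshold regardless of how permissive individual clients were.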