VQ-FedDiff: Federated Learning Algorithm of Diffusion Models With Client-Specific Vector-Quantized Conditioning.
Modern generative models, particularly denoising diffusion probabilistic models (DDPMs), produce high-quality synthetic images, enabling users to generate diverse, realistic images and videos. However, in many situations, edge devices or individual institutions possess locally collected data that is highly sensitive and must remain private, for example in healthcare and finance. Under such federated learning (FL) settings, various methods for training generative models have been studied, but most assume generative adversarial networks (GANs), and their algorithms do not transfer to other families of generative models such as DDPMs. This paper proposes VQ-FedDiff, a new algorithm for training DDPMs under federated learning: a personalized method for training diffusion models that generates higher-quality images (as measured by FID) while keeping the risk of leaking sensitive information as low as that of locally trained, secure models. We demonstrate that VQ-FedDiff achieves state-of-the-art performance compared with existing federated learning approaches for diffusion models, in both IID and non-IID settings and on benchmark photorealistic and medical image datasets. Our results show that diffusion models can learn efficiently from decentralized, sensitive data, generating high-quality images while preserving data privacy.
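The personalization pattern this abstract describes, a shared diffusion backbone that is federated while each client's vector-quantized conditioning stays local, can be sketched as below. The parameter names (`unet.w`, `codebook.*`) and the plain averaging rule are illustrative assumptions, not VQ-FedDiff's actual update equations.

```python
# Hedged sketch: aggregate only the shared backbone across clients;
# client-specific VQ codebooks never leave their owners.
# Parameters are scalars here purely for illustration.

def split_params(params):
    """Separate shared backbone weights from the private VQ codebook."""
    shared = {k: v for k, v in params.items() if not k.startswith("codebook")}
    private = {k: v for k, v in params.items() if k.startswith("codebook")}
    return shared, private

def aggregate_shared(client_params):
    """Average only the shared part of each client's parameter dict."""
    shared_parts = [split_params(p)[0] for p in client_params]
    n = len(shared_parts)
    return {k: sum(p[k] for p in shared_parts) / n for k in shared_parts[0]}

clients = [
    {"unet.w": 1.0, "codebook.0": 5.0},
    {"unet.w": 3.0, "codebook.0": -5.0},
]
new_shared = aggregate_shared(clients)  # only "unet.w" is averaged
```

Because only the shared part is transmitted, the server never sees the per-client codebooks, which is one way a personalized federated scheme can limit exposure of client-specific information.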
- Research Article
- 10.54254/2755-2721/49/20241161
- Mar 22, 2024
- Applied and Computational Engineering
Preserving user data privacy is paramount for airlines. However, achieving highly accurate predictions of passenger satisfaction while safeguarding the privacy of each company's data remains a considerable challenge. To achieve this objective, we utilized a privacy-preserving machine learning technique known as Federated Learning (FL). FL allows model training on decentralized devices while ensuring data security and privacy. The FL process comprises client training, communication rounds, and weight aggregation. We simulate FL principles using multiple processes to enable distributed learning. Clients preprocess data and train local models, ensuring data privacy. Each communication round involves clients downloading the global model, training locally, and transmitting updates to a central server. Weight-aggregation methods such as Federated Averaging merge these updates while preserving data privacy. Additionally, we leverage Artificial Neural Networks (ANNs) as the foundational technique. ANNs consist of input, hidden, and output layers, with weights adjusted based on differences from true values to achieve accuracy. Our approach combines FL with ANNs to demonstrate FL's potential in privacy-preserving predictive analytics. We use the Airline Passenger Satisfaction dataset for modeling and evaluate the impact of neural-network depth and submodel quantity on prediction accuracy. Our experimental results reveal that neither the depth of the neural networks nor the number of submodels significantly affects prediction accuracy. FL emerges as a promising approach to balancing data privacy and prediction accuracy effectively.
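The round structure this abstract describes (clients train locally, then a server merges weight updates via Federated Averaging) can be sketched as follows. The toy "local training" step, client data, and weight dimensions are illustrative placeholders, not the paper's actual ANN model.

```python
# Minimal sketch of one Federated Averaging (FedAvg) round.
# Assumption: models are flat weight lists; local training is a toy
# update that nudges weights toward the client's data mean.

def local_update(global_weights, client_data, lr=0.1):
    """Toy local training: move each weight toward the client's data mean."""
    target = sum(client_data) / len(client_data)
    return [w + lr * (target - w) for w in global_weights]

def fedavg(client_weights, client_sizes):
    """Weighted average of client models, proportional to local dataset size."""
    total = sum(client_sizes)
    dim = len(client_weights[0])
    return [
        sum(w[i] * n for w, n in zip(client_weights, client_sizes)) / total
        for i in range(dim)
    ]

# One communication round with three simulated clients.
global_weights = [0.0, 0.0]
clients = {"a": [1.0, 1.0, 1.0], "b": [2.0], "c": [3.0, 3.0]}
updates = [local_update(global_weights, data) for data in clients.values()]
sizes = [len(data) for data in clients.values()]
global_weights = fedavg(updates, sizes)
```

Weighting by dataset size is what distinguishes FedAvg from a plain mean: clients with more data pull the global model proportionally harder.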
- Research Article
- 10.55041/isjem02079
- Sep 20, 2024
- International Scientific Journal of Engineering and Management
Federated Learning (FL) is a decentralized machine learning paradigm that enables model training across distributed devices while preserving data privacy. With the proliferation of mobile devices and the increasing demand for privacy-preserving AI, FL has emerged as a promising solution for training models on edge devices. This paper explores the implementation of Federated Learning on mobile devices, highlighting the technical challenges, opportunities, and future directions. By addressing issues such as resource constraints, communication overhead, and heterogeneity, FL can unlock the potential of collaborative learning on mobile platforms while ensuring data privacy and security.
- Research Article
- 10.1016/j.future.2024.04.047
- May 1, 2024
- Future Generation Computer Systems
Elastic Federated Learning with Kubernetes Vertical Pod Autoscaler for edge computing
- Research Article
- 10.1016/j.eswa.2024.123776
- Mar 26, 2024
- Expert Systems with Applications
FedCRMW: Federated model ownership verification with compression-resistant model watermarking
- Research Article
- 10.1016/j.jairtraman.2024.102693
- Oct 24, 2024
- Journal of Air Transport Management
A privacy-preserving federated learning approach for airline upgrade optimization
- Research Article
- 10.1016/j.future.2023.10.013
- Oct 31, 2023
- Future Generation Computer Systems
FederatedTrust: A solution for trustworthy federated learning
- Research Article
- 10.56038/oprd.v5i1.574
- Dec 31, 2024
- Orclever Proceedings of Research and Development
This study examines the benefits of applying federated learning (FL) technology to OPC (Operational Performance Control) systems within industrial automation and data analysis processes. FL enables each production facility to process its data locally while only transmitting model parameters to a central server, thereby preserving data privacy. This approach provides significant advantages in industrial environments, particularly concerning data privacy and communication costs. The study evaluates FL's potential to ensure data privacy, reduce communication costs, improve efficiency in training time, and deliver high performance in predictive maintenance and quality estimation. Model performance was analyzed using accuracy, F1 score, precision, and loss metrics; the results demonstrated that FL achieved a 90% accuracy rate, offering competitive performance compared to centralized modeling. In predictive maintenance and quality analysis specifically, FL achieved 85-88% accuracy while reducing network data load by 65%. These findings validate that FL provides a secure, cost-effective, and efficient solution for industrial data analysis processes by eliminating the need for centralized data collection. In conclusion, FL and OPC integration supports data privacy, cost savings, and communication efficiency in industrial processes. The study highlights that FL could become a prevalent technology in industrial data analysis, establishing a new standard particularly in digital manufacturing processes.
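The abstract above evaluates FL with accuracy, precision, F1, and loss. A minimal sketch of how those classification metrics are computed for binary labels follows; the example labels are made up, not the study's data.

```python
# Hedged sketch: accuracy, precision, and F1 from binary predictions,
# built from the confusion-matrix counts (TP/FP/FN/TN).

def binary_metrics(y_true, y_pred):
    """Compute accuracy, precision, and F1 for binary labels (0/1)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return {"accuracy": accuracy, "precision": precision, "f1": f1}

m = binary_metrics([1, 0, 1, 1, 0], [1, 0, 0, 1, 1])
```

In a federated setting these metrics are typically computed per client on local test data and then averaged, so no raw labels need to leave a facility.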
- Book Chapter
- 10.71443/9789349552111-06
- Mar 17, 2025
The rapid integration of IIoT in power electronics has transformed industrial automation, enabling real-time monitoring, predictive maintenance, and intelligent decision-making. The distributed nature of IIoT-enabled power electronics introduces significant challenges in anomaly detection, including data heterogeneity, privacy concerns, and computational limitations of edge devices. Traditional centralized learning approaches are inefficient in handling these constraints, necessitating the adoption of decentralized learning paradigms. Federated Learning (FL) has emerged as a transformative approach, enabling collaborative model training across edge devices while preserving data privacy. FL faces challenges such as communication overhead, resource constraints, and performance degradation due to non-independent and identically distributed (non-IID) data. To address these limitations, Transfer Learning (TL) was integrated with FL to enhance model adaptability, enabling efficient knowledge transfer across different industrial environments and reducing dependency on extensive labeled datasets. This book chapter presents a comprehensive study on the integration of FL and TL for distributed anomaly detection in IIoT-enabled power electronics. The research explores optimization techniques for scalable FL deployment, including low-latency model aggregation, edge-to-cloud collaboration, and privacy-preserving secure model aggregation using Secure Multi-Party Computation (SMPC). The role of meta-learning in improving FL model generalization for handling heterogeneous data was analyzed. To address computational inefficiencies, the study examines Federated Knowledge Distillation (FKD) as a lightweight learning approach that minimizes resource consumption while maintaining high anomaly detection accuracy. 
The findings highlight the advantages of hybrid FL-TL frameworks in enhancing fault diagnosis, reducing communication overhead, and ensuring energy-efficient real-time anomaly detection. The proposed approach strengthens the reliability and security of industrial automation by providing a scalable and adaptive learning framework for power electronics systems. Future research directions include optimizing FL-TL integration for dynamic industrial environments, developing energy-efficient federated architectures, and enhancing privacy-preserving techniques for large-scale IIoT networks.
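The Federated Knowledge Distillation idea mentioned above, where a lightweight student matches a teacher's softened output distribution instead of exchanging full weights, can be sketched with a standard temperature-scaled KL loss. The temperature value and logits are made-up examples, not the chapter's configuration.

```python
import math

# Hedged sketch of a knowledge-distillation loss: KL divergence between
# temperature-softened teacher and student output distributions.

def softmax(logits, temperature=1.0):
    """Convert logits to a probability distribution, softened by temperature."""
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distill_loss(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) on temperature-softened distributions."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

loss = distill_loss([4.0, 1.0, 0.5], [3.5, 1.2, 0.4])
```

Exchanging logits (or the resulting loss gradients) instead of full model weights is what makes distillation-based FL lighter on communication, which matches the resource argument in the abstract.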
- Research Article
- 10.1016/j.adhoc.2024.103728
- Dec 2, 2024
- Ad Hoc Networks
PopFL: A scalable Federated Learning model in serverless edge computing integrating with dynamic pop-up network
- Research Article
- 10.1016/j.vehcom.2023.100709
- Dec 2, 2023
- Vehicular Communications
A state-of-the-art on federated learning for vehicular communications
- Research Article
- 10.58346/jowua.2025.i3.018
- Sep 30, 2025
- Journal of Wireless Mobile Networks, Ubiquitous Computing, and Dependable Applications
Federated Learning (FL) has emerged as a suitable option for collaborative, user-centric machine learning while protecting sensitive user data. FL differs significantly from traditional centralized frameworks, which collect raw data on a central server; FL instead collects model updates generated locally on edge devices. With centralized approaches, data must be uploaded to a central server for model training. In contrast, with FL, training occurs on the edge and only model updates are sent. This decentralized framework is ideal for ubiquitous applications (systems woven seamlessly into everyday surroundings, such as smart homes, advanced mobile devices, and IoT systems) where sensitive data is constantly and automatically generated. In these systems, user privacy is paramount, and advanced methods must be implemented to ensure both privacy and system efficacy. This paper discusses the integration of FL into ubiquitous environments, addressing the relevance, advantages, and disadvantages of this approach. Particular focus is given to privacy-preserving methods and their resilience against data leakage and adversarial attacks. Case studies illustrate the challenges encountered, lessons learned, and the use of FL within privacy-centric ubiquitous systems. In closing, I summarize the key points, underscore the transformational potential of FL for privacy-sensitive, innovative systems, and promote further research to support its practical relevance. This paper argues that among the many approaches to privacy within pervasive computing, FL is uniquely practical and scalable.
- Research Article
- 10.30955/gnj.07511
- May 5, 2025
- Global NEST Journal
Flooding in coastal regions of smart cities poses significant challenges, including infrastructure damage, economic losses, and threats to public safety. Traditional flood prediction models often suffer from data privacy concerns, limited spatial-temporal generalisation, and computational inefficiencies. To address these challenges, this study proposes an advanced Federated Learning (FL) and CNN-LSTM-based predictive framework for flood forecasting in coastal urban regions. The FL paradigm enables decentralised model training across multiple locations while ensuring data privacy. Convolutional Neural Networks (CNNs) extract spatial flood-related features, while Long Short-Term Memory (LSTM) networks capture temporal dependencies in hydrometeorological data. Various sensors, IoT devices, and geospatial equipment are deployed to monitor and record flood-related environmental factors across coastal regions in smart cities. The generated data is analysed by the CNN and LSTM models to predict flood levels from the estimated flood-influencing factors. The proposed FL-CNN-LSTM model is implemented in Python, its prediction efficiency is verified experimentally, and its performance is compared with earlier methods. The results show that FL-CNN-LSTM provides higher accuracy and promising qualities such as reduced dependency on centralised data storage, adaptiveness, and privacy preservation in flood forecasting systems. Most importantly, it provides a proactive natural-disaster mitigation model, making it suitable for real-time deployment in coastal regions of smart cities.
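The spatial-then-temporal pipeline the abstract describes can be illustrated in miniature: a 1-D convolution stands in for the CNN spatial feature extractor, and a simple decayed recurrence stands in for the LSTM. The kernel values, decay factor, and toy rainfall series are illustrative assumptions, not the paper's architecture.

```python
# Hedged sketch: convolution extracts local (spatial) features,
# then a recurrent accumulator summarizes them over time.

def conv1d(signal, kernel):
    """Valid-mode 1-D convolution over a sensor time series."""
    k = len(kernel)
    return [
        sum(signal[i + j] * kernel[j] for j in range(k))
        for i in range(len(signal) - k + 1)
    ]

def recurrent_summary(features, decay=0.5):
    """Exponentially decayed running state, a stand-in for an LSTM's memory."""
    state = 0.0
    for f in features:
        state = decay * state + (1 - decay) * f
    return state

rainfall = [0.1, 0.3, 0.9, 1.2, 0.8]            # toy hydrometeorological series
features = conv1d(rainfall, [0.25, 0.5, 0.25])  # smoothed local features
flood_score = recurrent_summary(features)       # scalar flood-level proxy
```

In the federated version, each coastal region would run this pipeline on its own sensor data and share only model parameters, per the FL scheme described.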
- Research Article
- 10.61927/igmin294
- Apr 7, 2025
- IgMin Research
Federated learning (FL) has emerged as a promising approach for collaborative model training across multiple institutions without sharing sensitive patient data. In the context of cancer diagnosis and prognosis prediction, FL offers a potential solution to the challenges associated with data privacy and security. This paper reviews the application of FL in cancer diagnosis and prognosis prediction, highlighting its key benefits, limitations, and future research directions. We discuss the potential of FL to improve the accuracy and generalizability of predictive models by leveraging diverse and distributed datasets while preserving data privacy. Furthermore, we examine the technical and regulatory considerations associated with implementing FL in the healthcare domain. Finally, we identify opportunities for future research and development in FL for cancer diagnosis and prognosis prediction.
- Research Article
- 10.1109/tnnls.2022.3166101
- Dec 1, 2023
- IEEE Transactions on Neural Networks and Learning Systems
Federated learning (FL) allows model training from local data collected by edge/mobile devices while preserving data privacy, which has wide applicability to image and vision applications. A challenge is that client devices in FL usually have much more limited computation and communication resources than servers in a data center. To overcome this challenge, we propose PruneFL, a novel FL approach with adaptive and distributed parameter pruning, which adapts the model size during FL to reduce both communication and computation overhead and minimize overall training time, while maintaining accuracy similar to the original model. PruneFL includes initial pruning at a selected client and further pruning as part of the FL process. The model size is adapted during this process by maximizing the approximate empirical risk reduction divided by the time of one FL round. Our experiments with various datasets on edge devices (e.g., Raspberry Pi) show that: 1) we significantly reduce the training time compared to conventional FL and various other pruning-based methods; and 2) the pruned model with automatically determined size converges to an accuracy very similar to that of the original model, and it is also a lottery ticket of the original model.
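The pruning step behind an approach like PruneFL can be sketched with simple magnitude pruning: keep only the largest-magnitude weights so the model shrinks before communication. The fixed keep-ratio here is a simplified stand-in for the paper's adaptive criterion (empirical risk reduction per round time).

```python
# Hedged sketch: zero out all but the top keep_ratio fraction of weights
# by absolute magnitude. Ties at the threshold are all kept.

def prune_by_magnitude(weights, keep_ratio):
    """Return weights with all but the largest-|w| entries set to zero."""
    k = max(1, int(len(weights) * keep_ratio))
    threshold = sorted((abs(w) for w in weights), reverse=True)[k - 1]
    return [w if abs(w) >= threshold else 0.0 for w in weights]

dense = [0.9, -0.05, 0.4, 0.01, -0.7]
sparse = prune_by_magnitude(dense, keep_ratio=0.4)  # keeps 2 of 5 weights
```

Only the surviving (nonzero) weights need to be transmitted each round, which is the source of the communication savings the abstract reports; the adaptive part of PruneFL then adjusts the kept size over the course of training.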
- Conference Article
- 10.1109/wcsp49889.2020.9299730
- Oct 21, 2020
Artificial intelligence is a bellwether of today's new technological revolution. As a branch of artificial intelligence, machine learning faces two main problems in practical application: 1) data owned by most enterprises is difficult to aggregate; and 2) big-data owners pay increasing attention to data privacy and security, which leads to the problem of data islands. Federated learning (FL), a distributed machine learning paradigm that enables all parties to co-build models while ensuring data privacy without data exposure, provides a possible solution to these problems. FL takes full advantage of participants' data and computing power and builds a more robust machine learning model without sharing the data. In an environment with strict data regulation, FL can effectively solve key problems related to data privacy and data rights. However, most existing FL-based frameworks pay no attention to the impact of data-source distribution on FL training. Therefore, this paper proposes a data-oriented FL framework called the Federated AI Engine (FAE), which can solve FL problems without the data leaving the owner's control. The proposed framework provides a method that researchers who intend to try federated learning can use to verify FL quickly.