Federated learning (FL) has gained significant traction across diverse industries because it allows multiple clients or institutions to collaboratively improve model performance while preserving data privacy. In recent years, tensor networks (TNs) have become important in machine learning: they compactly represent high-dimensional tensors by decomposing them into lower-dimensional components with polynomial complexity. Applying TNs to FL is a natural extension, since TNs provide a flexible framework for representing and optimizing models. Inspired by quantum computing principles, we integrate a quantum-inspired tensor network into the FL framework. Our approach, FedTN, trains a one-dimensional matrix product state (MPS) tensor network in a federated setting, with data distributed across homogeneous and heterogeneous partitions among clients. Our experiments demonstrate that tensor-network-based federated learning is practical: FedTN is robust to the unbalanced and non-IID data distributions typically encountered in such settings. We assess the effectiveness and feasibility of the quantum-inspired TN by comparing it with conventional methods, evaluating their performance, and exploring the benefits of incorporating quantum principles in FL settings. Furthermore, we investigate its performance when training for many local epochs (large E) between averaging steps.
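The aggregation step described above can be illustrated with a minimal sketch. The snippet below is an assumption-laden toy, not the paper's implementation: it represents each client's MPS as a list of rank-3 NumPy cores with identical shapes and applies FedAvg-style weighted averaging core by core (the function names `random_mps` and `fedavg_mps` are hypothetical).

```python
import numpy as np

def random_mps(n_sites, phys_dim=2, bond_dim=4, seed=0):
    """Build a toy MPS: a list of rank-3 cores shaped (left_bond, phys, right_bond)."""
    rng = np.random.default_rng(seed)
    dims = [1] + [bond_dim] * (n_sites - 1) + [1]
    return [rng.standard_normal((dims[i], phys_dim, dims[i + 1]))
            for i in range(n_sites)]

def fedavg_mps(client_mps_list, client_sizes):
    """FedAvg-style aggregation: average each MPS core across clients,
    weighted by local dataset size (assumes all clients share core shapes)."""
    total = sum(client_sizes)
    weights = [s / total for s in client_sizes]
    n_sites = len(client_mps_list[0])
    return [sum(w * mps[k] for w, mps in zip(weights, client_mps_list))
            for k in range(n_sites)]

# Three clients with unbalanced data sizes share one global MPS structure;
# after E local epochs each would send its cores for averaging.
clients = [random_mps(6, seed=s) for s in range(3)]
global_mps = fedavg_mps(clients, client_sizes=[100, 50, 10])
print(len(global_mps), global_mps[0].shape)  # 6 cores; first core is (1, 2, 4)
```

In a full training loop, each client would run local gradient updates on its cores for E epochs between such averaging rounds; larger E reduces communication at the cost of greater client drift under non-IID data.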