Distributed optimization, which completes specified tasks through cooperation among the agents of a multi-agent system, has achieved great success in large-scale optimization problems. However, it remains challenging to develop an effective distributed algorithm with theoretical guarantees, especially in the presence of nonconvex constraints. Moreover, high-dimensional data often exhibit inherent structure such as sparsity, which, if exploited accurately, can significantly enhance the capture of their intrinsic characteristics. In this paper, we introduce a novel distributed sparsity-constrained optimization framework over the Stiefel manifold, abbreviated as DREAM. DREAM innovatively integrates the ℓ2,0-norm constraint and the Stiefel manifold constraint within a distributed optimization setting, a combination that has not been investigated in the existing literature. Unlike existing distributed methods, the proposed DREAM not only extracts similarity information among samples but also determines the number of extracted features more flexibly. We then develop an efficient Newton augmented Lagrangian-based algorithm. On the theoretical side, we delve into the relationships among minimizers, Karush–Kuhn–Tucker points, and stationary points, and rigorously prove that the sequence generated by our algorithm converges to a stationary point. Extensive numerical experiments verify its superiority over state-of-the-art distributed methods.
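To make the coupled constraint set concrete, the sketch below is a minimal, hypothetical illustration (not the paper's implementation) of checking feasibility for the two constraints the abstract names: the ℓ2,0-norm constraint, which counts nonzero rows of a matrix X, and the Stiefel manifold constraint XᵀX = I; the function names and tolerances are assumptions for illustration only.

```python
import numpy as np

def row_sparsity(X, tol=1e-10):
    """The l_{2,0} 'norm': number of rows of X with nonzero Euclidean norm."""
    return int(np.sum(np.linalg.norm(X, axis=1) > tol))

def on_stiefel(X, tol=1e-8):
    """Check the Stiefel manifold constraint X^T X = I (orthonormal columns)."""
    p = X.shape[1]
    return np.allclose(X.T @ X, np.eye(p), atol=tol)

# A 6x2 matrix with only 2 nonzero rows and orthonormal columns is feasible
# for ||X||_{2,0} <= s with s = 2 while lying on the Stiefel manifold St(6, 2).
X = np.zeros((6, 2))
X[1, 0] = 1.0
X[4, 1] = 1.0
print(row_sparsity(X), on_stiefel(X))  # 2 True
```

The point of the sketch is that both constraints act on the same variable: the sparsity level caps how many rows may be active, while orthonormality of the columns must hold simultaneously, which is what makes the joint feasible set nonconvex.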