Abstract

In recent years, distributed estimation of graph Laplacian matrices for smooth graph signals has received much attention. Traditional graph Laplacian estimation methods usually estimate the global parameters in a centralized manner, which is computationally intensive and difficult to apply in large-scale networks. In this paper, in order to reduce the computational complexity while maintaining estimation accuracy, we propose a distributed graph Laplacian matrix estimation method called distributed combinatorial graph Laplacian estimation (DCGL). In our method, a local parameter estimation problem is first formulated for each vertex by maximizing the marginal likelihood of the data collected from the neighborhood of that vertex. Then, by analyzing the connectivity and Laplacian properties of the marginal precision matrix, Laplacian and structural constraints are added to each local estimation problem to resolve the non-convexity between the local and global estimates. Finally, through a single, simple message-passing rule, the global graph Laplacian matrix is obtained by extracting, combining, and symmetrizing the locally estimated parameters. Experiments on synthetic and real datasets demonstrate that the proposed distributed estimator is asymptotically consistent in the classical regime while offering advantages in the high-dimensional regime.
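To make the final assembly step more concrete, the following is a minimal illustrative sketch, not the authors' DCGL implementation: it assumes each vertex already holds a hypothetical local precision estimate over its closed neighborhood (obtained, per the abstract, by maximizing a constrained marginal likelihood on locally collected signals), and shows one plausible way the per-vertex rows could be extracted, combined by averaging shared entries, symmetrized, and projected onto the Laplacian structure. All function and variable names are assumptions for illustration.

```python
import numpy as np

def combine_local_estimates(n, neighborhoods, local_estimates):
    """Assemble a global n-by-n Laplacian from per-vertex local estimates.

    neighborhoods[v]   : list of global vertex indices covered by vertex v
                         (vertex v itself must be included).
    local_estimates[v] : symmetric matrix over that neighborhood, the
                         locally estimated (hypothetical) precision matrix.
    """
    L = np.zeros((n, n))
    counts = np.zeros((n, n))

    # Each vertex contributes only the row associated with itself,
    # i.e. the entries L[v, u] for u in its neighborhood.
    for v in range(n):
        idx = neighborhoods[v]
        local = local_estimates[v]
        v_pos = idx.index(v)
        for j_pos, u in enumerate(idx):
            L[v, u] += local[v_pos, j_pos]
            counts[v, u] += 1

    # Average entries estimated by more than one vertex, then symmetrize.
    L = np.divide(L, counts, out=np.zeros_like(L), where=counts > 0)
    L = 0.5 * (L + L.T)

    # Enforce Laplacian structure: non-positive off-diagonals, zero row sums.
    off = np.minimum(L - np.diag(np.diag(L)), 0.0)
    np.fill_diagonal(off, 0.0)
    L = off
    np.fill_diagonal(L, -off.sum(axis=1))
    return L

# Toy usage on a 3-vertex path graph 0 - 1 - 2 with hand-picked local estimates.
neighborhoods = [[0, 1], [0, 1, 2], [1, 2]]
local_estimates = [
    np.array([[1.0, -1.0], [-1.0, 1.0]]),
    np.array([[1.0, -1.0, 0.0], [-1.0, 2.0, -1.0], [0.0, -1.0, 1.0]]),
    np.array([[1.0, -1.0], [-1.0, 1.0]]),
]
print(combine_local_estimates(3, neighborhoods, local_estimates))
```

In this toy case the combined matrix is the path-graph Laplacian with unit edge weights; in a distributed deployment, the row extraction and exchange would correspond to the single message-passing round described above.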
