Abstract

In this paper, we focus on inferring the graph Laplacian matrix from a spatiotemporal signal, referred to as a "time-vertex signal". To this end, we first represent the signal on a joint graph, the Cartesian product of a time-graph and a vertex-graph. By assuming that the signal follows a Gaussian prior distribution on the joint graph, we derive a representation that promotes smoothness of the joint graph signal. By decoupling the joint graph, the graph learning framework is then formulated as a joint optimization problem that combines signal denoising with learning of the time- and vertex-graphs. Two algorithms are proposed to solve this problem: the first uses the discrete second-order difference operator with reversed sign (DSODO) in the time domain as the time-graph Laplacian operator to recover the signal and infer the vertex-graph, while the second estimates both the time-graph and the vertex-graph. Experiments on synthetic and real-world datasets demonstrate that the proposed algorithms effectively infer meaningful time- and vertex-graphs from noisy and incomplete data.
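To make the two building blocks above concrete, the sketch below (a minimal illustration, not the paper's implementation; function names and sizes are chosen for the example) constructs the joint Laplacian of the Cartesian product graph via the standard identity L_J = L_T ⊗ I_N + I_T ⊗ L_G, and shows that the Laplacian of a path graph in the time domain acts as the discrete second-order difference operator with reversed sign (DSODO) on interior samples:

```python
import numpy as np

def path_laplacian(T):
    """Laplacian of a path graph on T nodes. On interior samples of a
    length-T time series it acts as the negated second-order difference:
    (L x)[i] = -(x[i+1] - 2*x[i] + x[i-1]), i.e. the DSODO."""
    A = np.diag(np.ones(T - 1), 1) + np.diag(np.ones(T - 1), -1)
    return np.diag(A.sum(axis=1)) - A

def cartesian_product_laplacian(L_T, L_G):
    """Joint Laplacian of the Cartesian product of a time-graph
    (L_T, T nodes) and a vertex-graph (L_G, N nodes):
    L_J = kron(L_T, I_N) + kron(I_T, L_G)."""
    T, N = L_T.shape[0], L_G.shape[0]
    return np.kron(L_T, np.eye(N)) + np.kron(np.eye(T), L_G)

# Smoothness of a time-vertex signal X (N vertices x T samples),
# column-stacked as x = vec(X), is measured by x.T @ L_J @ x.
```

Since L_J is symmetric positive semidefinite, the quadratic form x.T @ L_J @ x is the smoothness prior induced by the Gaussian assumption on the joint graph.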
