Abstract

This paper proposes efficient batch-based and online strategies for kernel regression over graphs (KRG). The proposed algorithms do not require the input signal to be a graph signal, whereas the target signal is defined over the graph. We first use random Fourier features (RFF) to tackle the complexity issues associated with kernel methods employed in the conventional KRG. For batch-based approaches, we also propose an implementation that reduces complexity by avoiding the inversion of large matrices. Then, we derive two distinct online strategies using RFF, namely, the mini-batch gradient KRG (MGKRG) and the recursive least squares KRG (RLSKRG). The stochastic-gradient KRG (SGKRG) is introduced as a particular case of the MGKRG. The MGKRG and the SGKRG are low-complexity algorithms that employ stochastic gradient approximations in the regression-parameter update. The RLSKRG is a recursive implementation of the RFF-based batch KRG. A detailed stability analysis is provided for the proposed online algorithms, including convergence conditions in both the mean and mean-squared senses. A discussion on complexity is also provided. Numerical simulations include a synthesized-data experiment and real-data experiments on temperature prediction, brain activity estimation, and image reconstruction. Results show that the RFF-based batch implementation offers competitive performance with a reduced computational burden when compared to the conventional KRG. The MGKRG offers a convenient trade-off between performance and complexity through the choice of the number of mini-batch samples. The RLSKRG converges faster than the MGKRG and matches the performance of the batch implementation.
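To make the RFF idea in the abstract concrete, the following is a minimal illustrative sketch (not the paper's exact algorithm, and omitting any graph regularization): random Fourier features approximate a shift-invariant kernel so that kernel regression over the nodes' target signals reduces to regularized linear least squares in the feature space, with a D x D system replacing the n x n kernel system. All data sizes, the Gaussian-kernel bandwidth `sigma`, and the regularization weight `lam` below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def rff_map(X, D, sigma, rng):
    """Map inputs X (n x d) to D random Fourier features approximating a Gaussian kernel."""
    d = X.shape[1]
    W = rng.normal(scale=1.0 / sigma, size=(d, D))   # samples from the kernel's spectral density
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)        # random phases
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

# Hypothetical toy data: inputs X (not required to be graph signals) and
# targets Y, one column per graph node.
n, d, num_nodes, D, sigma, lam = 200, 5, 10, 100, 1.0, 1e-2
X = rng.normal(size=(n, d))
Y = rng.normal(size=(n, num_nodes))

Z = rff_map(X, D, sigma, rng)                        # n x D feature matrix
# Ridge solution in the RFF space: solve a D x D system instead of inverting
# the n x n kernel matrix used by conventional kernel regression.
Theta = np.linalg.solve(Z.T @ Z + lam * np.eye(D), Z.T @ Y)
Y_hat = Z @ Theta                                    # predicted node signals
```

The same feature map underlies the online variants described in the abstract: a stochastic- or mini-batch-gradient update of `Theta`, or a recursive least-squares update, can replace the batch solve above.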
