Abstract

We propose and investigate two new methods to approximate f(A)b for large, sparse, Hermitian matrices A. Computations of this form play an important role in numerous signal processing and machine learning tasks. The main idea behind both methods is to first estimate the spectral density of A, and then find polynomials of a fixed order that better approximate the function f on areas of the spectrum with a higher density of eigenvalues. Compared to state-of-the-art methods such as the Lanczos method and truncated Chebyshev expansion, the proposed methods tend to provide more accurate approximations of f(A)b at lower polynomial orders, and for matrices A with a large number of distinct interior eigenvalues and a small spectral width. We also explore the application of these techniques to (i) fast estimation of the norms of localized graph spectral filter dictionary atoms, and (ii) fast filtering of time-vertex signals.

Highlights

  • Computing f(A)b, a function of a large, sparse Hermitian matrix times a vector, is an important component in numerous signal processing, machine learning, applied mathematics, and computer science tasks

  • As opposed to other methods that incorporate prior knowledge about b into the choice of the approximating polynomial (e.g., [35] considers vectors b from a zero-mean distribution with a known covariance matrix), the polynomial approximations resulting from the methods we propose do not depend on b or any information about b

  • The spectrum-adapted methods we propose in Section 3 utilize both the estimated spectral density pλ(z) and the inverse Pλ−1(y) of its cumulative distribution to focus on regions of the spectrum with higher eigenvalue density when generating polynomial approximations
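The node-warping idea behind this highlight can be illustrated with a toy sketch: pass a uniform grid in (0, 1) through an inverse spectral CDF so that interpolation nodes concentrate where eigenvalues are dense, then fit and apply a fixed-order polynomial. Everything here — the `cdf_inv` callable, the monomial-basis least-squares fit, and the function names — is an illustrative assumption, not the authors' exact construction.

```python
import numpy as np

def spectrum_adapted_nodes(cdf_inv, K):
    # Map a uniform grid in (0, 1) through the inverse spectral CDF so that
    # nodes concentrate in regions of higher estimated eigenvalue density.
    y = (np.arange(K + 1) + 0.5) / (K + 1)
    return cdf_inv(y)

def fit_polynomial(f, nodes, K):
    # Degree-K polynomial matched to f at the warped nodes (least squares on
    # a Vandermonde system; exact interpolation when len(nodes) == K + 1).
    V = np.vander(nodes, K + 1, increasing=True)
    coeffs, *_ = np.linalg.lstsq(V, f(nodes), rcond=None)
    return coeffs

def apply_poly(A, b, coeffs):
    # Evaluate p(A) @ b by accumulating successive matrix-vector products,
    # so the (possibly sparse) matrix A is only touched through A @ vector.
    out = np.zeros_like(b, dtype=float)
    power = np.asarray(b, dtype=float).copy()
    for c in coeffs:
        out += c * power
        power = A @ power
    return out
```

For a diagonal test matrix whose eigenvalues sit exactly at the warped nodes, `apply_poly` reproduces f at those eigenvalues up to interpolation error; in practice one would pair this with a density estimate and a better-conditioned polynomial basis.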


Summary

Introduction

Computing f(A)b, a function of a large, sparse Hermitian matrix times a vector, is an important component in numerous signal processing, machine learning, applied mathematics, and computer science tasks. A further advantage of polynomial approximation is that the ith element of pK(A)b depends only on the elements of b within K hops of vertex i on the graph associated with A (e.g., if A is a graph Laplacian matrix, a nonzero (i, j)th element of A, where i ≠ j, corresponds to an edge connecting vertices i and j in the graph). This localization property is important in many graph-based data analysis applications, such as graph spectral filtering [34] and deep learning [5]. While in our setting we do not have access to the complete set of eigenvalues, our approach in this work leverages recent developments in efficiently estimating the spectral density of the matrix A to adapt the polynomial to the spectrum, achieving better approximation accuracy at the (unknown) eigenvalues.
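The baseline these methods are compared against — a truncated Chebyshev expansion of f evaluated through the three-term recurrence, so that A enters only via matrix-vector products — can be sketched as below. The function name, signature, and the assumption that the spectrum lies in a known interval [lam_min, lam_max] are illustrative, not taken from the paper.

```python
import numpy as np

def chebyshev_fAb(A, b, f, K, lam_min, lam_max):
    """Approximate f(A) @ b with a degree-K truncated Chebyshev expansion.

    A is assumed Hermitian with spectrum contained in [lam_min, lam_max].
    """
    # Affine map taking [lam_min, lam_max] onto [-1, 1].
    c = (lam_max + lam_min) / 2.0
    r = (lam_max - lam_min) / 2.0

    # Chebyshev coefficients of f via Gauss-Chebyshev quadrature.
    M = 2 * (K + 1)
    theta = (np.arange(M) + 0.5) * np.pi / M
    fx = f(c + r * np.cos(theta))
    coeffs = np.array([2.0 / M * np.sum(fx * np.cos(k * theta))
                       for k in range(K + 1)])
    coeffs[0] /= 2.0

    # Three-term recurrence: T_0 = b, T_1 = Ã b, T_{k+1} = 2 Ã T_k - T_{k-1},
    # where Ã = (A - c I) / r is the spectrum-mapped matrix.
    T_prev = np.asarray(b, dtype=float)
    T_curr = (A @ T_prev - c * T_prev) / r
    out = coeffs[0] * T_prev + coeffs[1] * T_curr
    for k in range(2, K + 1):
        T_next = 2.0 * (A @ T_curr - c * T_curr) / r - T_prev
        out += coeffs[k] * T_next
        T_prev, T_curr = T_curr, T_next
    return out
```

Each iteration costs one multiplication by A, so the total work is K matrix-vector products plus vector updates, which is what makes fixed-order polynomial methods attractive for large sparse A.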

Spectral Density Estimation
Spectrum-Adapted Methods
Numerical Examples and Discussion
Application I
Application II
Time-Vertex Signals
Time-Vertex Filtering
Spectrum-Adapted Approximation of Time-Vertex Filtering
Numerical Experiments
Dynamic Mesh Denoising
Conclusions