Transform-domain least mean squares (TDLMS) adaptive filters encompass the class of learning algorithms in which the input data are subjected to a data-independent unitary transform followed by a power normalization stage as preprocessing steps. Because conventional transformations are not data-dependent, this preconditioning procedure has been shown theoretically to improve the convergence of the least mean squares (LMS) filter only for certain classes of input data. Consequently, the transformation must be tailored to the class of data. In practice, however, if the class of input data is not known beforehand, it is difficult to decide which transformation to use. There is thus a need for a learning framework that derives the preconditioning transformation from the input data itself before it is applied. We hypothesize that the underlying topology of the data informs the choice of transformation. Modeling the input as a weighted finite graph, our method, called preconditioning using graph (PrecoG), adaptively learns the desired transform by recursive estimation of the graph Laplacian matrix. We demonstrate the efficacy of the transform as a generalized split preconditioner on linear systems of equations and in Hebbian-LMS learning models. In terms of the improvement of the condition number after applying the transformation, PrecoG significantly outperforms existing state-of-the-art techniques based on unitary and nonunitary transforms.
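As a concrete illustration of the preprocessing the abstract describes (a fixed unitary transform followed by power normalization, acting as a split preconditioner), the following is a minimal NumPy/SciPy sketch. It is not the PrecoG algorithm itself; the AR(1) correlation model, the value of rho, and the choice of the DCT as the fixed transform are assumptions made here for illustration only.

```python
import numpy as np
from scipy.fftpack import dct

# Ill-conditioned input autocorrelation matrix (AR(1) model with
# rho = 0.95; an illustrative stand-in for correlated input data).
n, rho = 32, 0.95
R = rho ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))

# Classical TDLMS preprocessing: a data-independent unitary transform
# (here the orthonormal DCT) followed by per-bin power normalization,
# applied as a split preconditioner to the correlation matrix.
T = dct(np.eye(n), norm='ortho', axis=0)   # orthonormal DCT-II matrix
Rt = T @ R @ T.T                           # transform-domain correlation
D = np.diag(1.0 / np.sqrt(np.diag(Rt)))    # power-normalization stage
Rp = D @ Rt @ D                            # split-preconditioned matrix

print("cond(R)  =", np.linalg.cond(R))     # condition number before
print("cond(Rp) =", np.linalg.cond(Rp))    # condition number after
```

For AR(1)-like inputs the DCT closely approximates the Karhunen-Loève transform, so the condition number drops substantially; for other input classes a fixed transform may help little, which is precisely the gap PrecoG addresses by learning the transform from a recursively estimated graph Laplacian.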