Abstract

We propose a novel nearest-neighbors approach to organize and curb the growth of the radial basis function network in kernel adaptive filtering (KAF). The nearest-instance-centroid-estimation (NICE) kernel least-mean-square (KLMS) algorithm provides an appropriate time-space tradeoff with good performance. Its centers in the input/feature space are organized into quasi-orthogonal regions for greatly simplified filter evaluation. Instead of using all centers to evaluate/update the function approximation at every new point, a linear search among the iteratively updated centroids determines the partial function to be used, naturally forming locally supported partial functionals. Under this framework, the partial functionals that compose the adaptive filter are quickly stored/retrieved based on the input, each corresponding to a specialized “spatial-band” subfilter. The filter evaluation becomes the update of one of the subfilters, creating a content-addressable filter bank (CAFB). This CAFB is incrementally updated for new signal applications under mild constraints, always reusing the past-learned partial filter sums, which opens the door to transfer learning and significant efficiency gains in new data scenarios, avoiding the training from scratch that has been done since the invention of adaptive filtering. Using the energy-conservation relation, we show a sufficient condition for mean-square convergence of the NICE-KLMS algorithm and establish upper and lower bounds on the steady-state excess mean-square error (EMSE). Simulations on chaotic time-series prediction demonstrate accuracy comparable to existing methods, but with much faster computation involving fewer input samples. Simulations on transfer learning using both synthetic and real-world data demonstrate that the NICE CAFB can leverage previously learned knowledge for a related task or domain.
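
To make the evaluation/update flow described above concrete, the following Python sketch illustrates the general idea under stated assumptions: a linear search over centroids selects one "spatial-band" subfilter, and only that subfilter is evaluated and grown. The class and parameter names (step_size, kernel_width, centroid_radius) are hypothetical and not the paper's notation; this is a minimal illustration, not the authors' reference implementation.

```python
import numpy as np

class NiceKlmsSketch:
    """Minimal sketch of a NICE-KLMS-style content-addressable filter bank."""

    def __init__(self, step_size=0.5, kernel_width=1.0, centroid_radius=2.0):
        self.step_size = step_size            # LMS learning rate (assumed)
        self.kernel_width = kernel_width      # Gaussian kernel bandwidth (assumed)
        self.centroid_radius = centroid_radius  # threshold for opening a new region (assumed)
        self.centroids = []                   # one centroid per spatial-band subfilter
        self.subfilters = []                  # each subfilter: list of (center, coefficient)

    def _kernel(self, x, c):
        # Gaussian kernel between input x and stored center c
        return np.exp(-np.sum((x - c) ** 2) / (2.0 * self.kernel_width ** 2))

    def _nearest(self, x):
        # Linear search among centroids; None if no centroid is close enough
        if not self.centroids:
            return None
        dists = [np.linalg.norm(x - c) for c in self.centroids]
        i = int(np.argmin(dists))
        return i if dists[i] <= self.centroid_radius else None

    def predict(self, x):
        # Evaluate only the selected partial functional, not all stored centers
        i = self._nearest(x)
        if i is None:
            return 0.0, i
        y = sum(a * self._kernel(x, c) for c, a in self.subfilters[i])
        return y, i

    def update(self, x, d):
        # One online step: predict, compute error, then grow/adjust one subfilter
        y, i = self.predict(x)
        e = d - y
        if i is None:
            # Open a new spatial-band subfilter with x as its first center and centroid
            self.centroids.append(x.copy())
            self.subfilters.append([(x.copy(), self.step_size * e)])
        else:
            # Add a new center to the selected subfilter and nudge its centroid toward x
            self.subfilters[i].append((x.copy(), self.step_size * e))
            n = len(self.subfilters[i])
            self.centroids[i] += (x - self.centroids[i]) / n
        return e
```

In this sketch, per-sample cost depends on the number of centroids plus the size of one subfilter rather than on the full dictionary, which is the time-space tradeoff the abstract refers to; previously learned subfilters can be kept and reused when the filter is applied to related data.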
