Abstract

A learning task is sequential if its data samples become available over time. Kernel adaptive filters (KAF) are sequential learning algorithms. There are two main challenges in KAF: (1) there is no effective method to determine the kernel sizes in an online learning context; (2) the step-size parameter is difficult to tune. We propose a framework for online prediction using KAF that does not require a predefined set of kernel sizes; instead, kernel sizes are created and updated sequentially online. Further, to improve convergence time, we propose an online technique to optimize the step-size parameter. The framework is tested on two real-world data sets: internet traffic and the foreign exchange market. Results show that, without any specific hyperparameter tuning, our proposal converges faster to relatively low values of mean squared error and achieves better accuracy.
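To make the setting concrete, the sketch below shows a minimal kernel least-mean-squares (KLMS) filter, the most common KAF, with a fixed Gaussian kernel size and a fixed step size. The paper's contribution is adapting both of these online; the class names, parameter values, and update rule here are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def gaussian_kernel(x, c, width):
    # Gaussian kernel between input x and stored center c
    return np.exp(-np.sum((x - c) ** 2) / (2.0 * width ** 2))

class KLMS:
    """Minimal KLMS sketch. `step_size` and `kernel_width` are fixed here;
    the paper proposes updating both online instead."""

    def __init__(self, step_size=0.5, kernel_width=0.5):
        self.step_size = step_size
        self.kernel_width = kernel_width
        self.centers = []   # stored input samples (the "dictionary")
        self.weights = []   # one coefficient per stored center

    def predict(self, x):
        # prediction = weighted sum of kernels over all stored centers
        # (real implementations sparsify this growing sum)
        return sum(w * gaussian_kernel(x, c, self.kernel_width)
                   for w, c in zip(self.weights, self.centers))

    def update(self, x, d):
        # error-driven update: store the new sample as a center,
        # weighted by step_size * prediction error
        e = d - self.predict(x)
        self.centers.append(np.asarray(x, dtype=float))
        self.weights.append(self.step_size * e)
        return e

# usage: one-step-ahead prediction of a noiseless sine signal
rng = np.random.default_rng(0)
f = KLMS()
errors = []
for _ in range(500):
    x = rng.uniform(0.0, 2.0 * np.pi)
    errors.append(abs(f.update(np.array([x]), np.sin(x))))
```

As samples arrive, the absolute prediction error shrinks, which is the convergence behavior the abstract's step-size optimization aims to accelerate.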
