Abstract

Generalization-memorization learning aims to minimize empirical risk while simultaneously reducing expected risk. Although regression typically pays little attention to whether the training samples are memorized accurately, improving generalization through better memorization remains a central goal of regression. To this end, we introduce two new regression models, Least Squares Generalization-Memorization Regression (LSGMR) and Soft Least Squares Generalization-Memorization Regression (SLSGMR), obtained by incorporating memory kernel learning into Least Squares Support Vector Regression (LSSVR). We evaluate these models on synthetic datasets and show that LSSVR can be viewed as a special case of the proposed models. Our experiments demonstrate that, across many problems, the memory-equipped models LSGMR and SLSGMR consistently outperform LSSVR on noisy regression tasks.
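Since the proposed models extend LSSVR, a minimal sketch of the standard LSSVR baseline may help fix ideas. The abstract does not specify the memory kernel, so the sketch below covers only plain LSSVR: its dual reduces to the linear system [0, 1^T; 1, K + I/gamma][b; alpha] = [0; y], which LSGMR and SLSGMR would modify through an additional memory kernel. The RBF kernel choice, the hyperparameters gamma and sigma, and the toy sinc data are illustrative assumptions, not the paper's actual setup.

import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X1 and X2.
    d2 = (np.sum(X1**2, axis=1)[:, None]
          + np.sum(X2**2, axis=1)[None, :]
          - 2.0 * X1 @ X2.T)
    return np.exp(-d2 / (2.0 * sigma**2))

def lssvr_fit(X, y, gamma=10.0, sigma=1.0):
    # Standard LSSVR dual: solve the KKT linear system
    #   [ 0   1^T         ] [b    ]   [0]
    #   [ 1   K + I/gamma ] [alpha] = [y]
    n = X.shape[0]
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]  # alpha, b

def lssvr_predict(X_train, alpha, b, X_test, sigma=1.0):
    # f(x) = sum_i alpha_i * k(x, x_i) + b
    return rbf_kernel(X_test, X_train, sigma) @ alpha + b

# Toy usage on a noisy synthetic curve, loosely mirroring the paper's synthetic tests.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sinc(X).ravel() + 0.1 * rng.standard_normal(50)
alpha, b = lssvr_fit(X, y, gamma=10.0, sigma=0.5)
y_hat = lssvr_predict(X, alpha, b, X, sigma=0.5)

Setting the memory-related terms to zero in the proposed models would recover exactly this system, which is one way to read the claim that LSSVR is a special case of LSGMR and SLSGMR.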
