Abstract

In many fields such as finance and electrical load forecasting, the locally weighted support vector regression (LWSVR) method can achieve better prediction results by constructing a separate sub-model for each prediction target. However, LWSVR selects a single, fixed nearest-neighbor parameter k_global for every query and relies on the traditional k-nearest neighbor (KNN) algorithm, which is suboptimal. The weights that LWSVR assigns to the neighboring sample points also affect the accuracy of the model's predictions. In this paper, we propose an adaptive locally weighted support vector regression with an asymmetric parametric insensitive/margin model, called Asy-par-v-ALWSVR. Because the selection of similar samples for each query point is crucial, we propose a novel local q-neighbor selection algorithm that adaptively chooses the most suitable number of neighbors for each query point based on the Mahalanobis distance, rather than using a fixed k_global; this significantly improves prediction accuracy. The weights of the neighbors of each query point in Asy-par-v-ALWSVR are based on distances within the kernel-induced feature space, which effectively discerns the importance of samples during training. In particular, Asy-par-v-ALWSVR can handle skewed noise with a heteroscedastic structure in regression problems. Finally, we evaluate the proposed model on a synthetic sinc dataset and 14 real datasets. The results demonstrate that Asy-par-v-ALWSVR outperforms six state-of-the-art SVR-based models.
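
The abstract does not give the model's formal details, but the two mechanisms it names, Mahalanobis-distance-based adaptive neighbor selection and kernel-feature-space distance weighting of a local SVR sub-model, can be sketched roughly as follows. This is a minimal Python illustration under stated assumptions, not the paper's algorithm: the mean-distance thresholding rule, the Gaussian weighting function, the parameters gamma, q_max, and tau, and the use of scikit-learn's standard epsilon-SVR in place of the proposed asymmetric parametric-insensitive model are all hypothetical stand-ins.

```python
import numpy as np
from sklearn.svm import SVR

def mahalanobis_distances(X, x_q, VI):
    """Mahalanobis distance from each row of X to the query point x_q."""
    diff = X - x_q
    return np.sqrt(np.einsum('ij,jk,ik->i', diff, VI, diff))

def rbf_feature_space_distance(X, x_q, gamma):
    """Distance in the RBF-kernel-induced feature space:
    ||phi(x) - phi(x_q)||^2 = k(x,x) + k(x_q,x_q) - 2*k(x,x_q) = 2 - 2*k(x,x_q)."""
    k = np.exp(-gamma * np.sum((X - x_q) ** 2, axis=1))
    return np.sqrt(np.maximum(2.0 - 2.0 * k, 0.0))

def predict_query(X, y, x_q, gamma=0.5, q_max=50, tau=1.0):
    # Adaptive q-neighbor selection: keep the samples whose Mahalanobis
    # distance to the query falls below a data-driven threshold, so the
    # neighbor count varies per query (illustrative rule; the paper's
    # exact criterion is not given in the abstract).
    VI = np.linalg.pinv(np.atleast_2d(np.cov(X, rowvar=False)))
    d_m = mahalanobis_distances(X, x_q, VI)
    cand = np.argsort(d_m)[:q_max]
    idx = cand[d_m[cand] <= d_m[cand].mean()]

    # Weight the selected neighbors by their distance in the
    # kernel-induced feature space, so closer samples dominate
    # the local fit (Gaussian decay is an assumed weighting form).
    d_f = rbf_feature_space_distance(X[idx], x_q, gamma)
    w = np.exp(-(d_f ** 2) / tau)

    # Train a weighted local sub-model for this query point; a standard
    # epsilon-SVR stands in for the asymmetric parametric-insensitive
    # model proposed in the paper.
    model = SVR(kernel='rbf', gamma=gamma)
    model.fit(X[idx], y[idx], sample_weight=w)
    return model.predict(x_q.reshape(1, -1))[0]

# Example on noisy sinc data, the kind of synthetic benchmark the
# abstract mentions (data generation here is purely illustrative).
rng = np.random.default_rng(0)
X = rng.uniform(-10, 10, size=(200, 1))
y = np.sinc(X.ravel() / np.pi) + 0.1 * rng.standard_normal(200)
print(predict_query(X, y, np.array([1.5])))
```

Because a separate weighted sub-model is fit per query, prediction cost grows with the number of queries; the adaptive neighbor count keeps each local fit small, which is what makes the per-query training tractable.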
