Abstract

Minimum squared error (MSE) classification is a linear discriminant function-based method that has been used in many applications, such as face and handwritten character recognition. However, MSE may not handle nonlinearly separable data sets well. To address this problem, we improve MSE and propose a new MSE-based local learning algorithm, local MSE (LMSE). For a test sample, LMSE first determines the sample's nearest neighbors in the training set and then uses those neighbors to build a local MSE model that predicts the test sample's class label. Because each model is fitted only to a neighborhood of the test sample, LMSE can effectively capture the nonlinear structure of the data, and it generally outperforms MSE, particularly when the data distribution is nonlinearly separable. Extensive experiments on many nonlinearly separable data sets show that LMSE achieves desirable recognition results.
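The following is a minimal sketch of the local-MSE idea as described in the abstract, not the authors' implementation. The neighborhood size k, the ridge regularization term, the one-hot class encoding, and the bias column are all assumptions introduced for illustration; the paper's exact formulation may differ.

```python
import numpy as np

def lmse_predict(X_train, y_train, x_test, k=10, reg=1e-3):
    """Predict the label of one test sample with a locally fitted MSE model.

    Hypothetical parameters: k (neighborhood size) and reg (ridge term)
    are illustrative choices, not values taken from the paper.
    """
    # 1. Find the k nearest training samples (Euclidean distance).
    dists = np.linalg.norm(X_train - x_test, axis=1)
    idx = np.argsort(dists)[:k]
    X_loc, y_loc = X_train[idx], y_train[idx]

    # 2. Build one-hot (indicator) targets for the classes present locally.
    classes = np.unique(y_loc)
    Y = (y_loc[:, None] == classes[None, :]).astype(float)

    # 3. Solve the regularized least-squares problem W = argmin ||A W - Y||^2,
    #    where A appends a bias column to the local samples.
    A = np.hstack([X_loc, np.ones((X_loc.shape[0], 1))])
    W = np.linalg.solve(A.T @ A + reg * np.eye(A.shape[1]), A.T @ Y)

    # 4. Score the test sample and return the class with the largest response.
    scores = np.append(x_test, 1.0) @ W
    return classes[np.argmax(scores)]
```

In this sketch, the only difference from plain MSE classification is that the least-squares fit in step 3 uses the k nearest neighbors rather than the whole training set, which is what lets the decision rule adapt to local, nonlinear structure.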
