Abstract
The support vector machine (SVM) is gaining much popularity as a powerful machine learning technique. SVM was originally developed for pattern classification and later extended to regression. One of the main features of SVM is that it generalizes maximal margin linear classifiers to high-dimensional feature spaces through nonlinear mappings defined implicitly by kernels in a Hilbert space, so that it can produce nonlinear classifiers in the original data space. On the other hand, the authors developed a family of SVM variants using multi-objective programming and goal programming (MOP/GP) techniques. This paper extends that family of SVMs from classification to regression, and discusses their performance through numerical experiments.
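The kernel mechanism summarized above can be illustrated with a minimal sketch: a decision function that is a kernel-weighted sum over support vectors is linear in the implicit feature space but nonlinear in the original data space. This is a generic illustration of the kernel trick, not the authors' MOP/GP formulation; the RBF kernel, the `gamma` value, and the dual coefficients below are all hypothetical choices made for the example.

```python
import math

def rbf_kernel(x, z, gamma=1.0):
    # k(x, z) = exp(-gamma * ||x - z||^2): an inner product in an
    # implicitly defined high-dimensional feature space (gamma chosen
    # arbitrarily here for illustration).
    sq_dist = sum((xi - zi) ** 2 for xi, zi in zip(x, z))
    return math.exp(-gamma * sq_dist)

def decision_function(x, support_vectors, dual_coefs, bias):
    # f(x) = sum_i (alpha_i * y_i) * k(x_i, x) + b: linear in the
    # kernel-induced feature space, nonlinear in the original space.
    return sum(a * rbf_kernel(sv, x)
               for a, sv in zip(dual_coefs, support_vectors)) + bias

# Toy XOR-like data, which no linear classifier in the original
# 2-D space can separate; the dual coefficients are illustrative,
# not the result of solving an SVM training problem.
svs = [(0.0, 0.0), (1.0, 1.0), (0.0, 1.0), (1.0, 0.0)]
coefs = [1.0, 1.0, -1.0, -1.0]  # stand-ins for alpha_i * y_i
for point in svs:
    label = "+" if decision_function(point, svs, coefs, 0.0) > 0 else "-"
    print(point, "->", label)
```

With these coefficients the kernelized decision function assigns opposite signs to the two diagonals of the XOR pattern, which a linear classifier in the original space cannot do.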