Abstract

First, the all-important no free lunch theorems are introduced. Next, kernel methods, support vector machines (SVMs), preprocessing, model selection, feature selection, SVM software and the Fisher kernel are introduced and discussed. A hidden Markov model is trained on foreign exchange data to derive a Fisher kernel for an SVM; the DC algorithm and the Bayes point machine (BPM) are also used to learn the kernel on foreign exchange data. Further, the DC algorithm is used to learn the parameters of the hidden Markov model in the Fisher kernel, creating a hybrid algorithm. The mean net returns are positive for the BPM; moreover, the BPM, the Fisher kernel, the DC algorithm and the hybrid algorithm all improve on a standard SVM in terms of both gross and net returns, but none achieves net returns as high as the genetic programming approach employed by Neely, Weller, and Dittmar (1997) and published in Neely, Weller, and Ulrich (2009). Finally, two implementations of SVMs for Windows with semi-automated parameter selection are built.
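To make the Fisher-kernel pipeline described above concrete, the following is a minimal illustrative sketch (not the thesis code): a hidden Markov model is fitted to windows of foreign exchange returns, Fisher scores are taken as gradients of the HMM log-likelihood (approximated here by finite differences over the Gaussian emission means only, for simplicity), and the resulting Gram matrix is passed to an SVM with a precomputed kernel. The use of hmmlearn and scikit-learn, the window construction, and the identity approximation to the Fisher information matrix are all assumptions for illustration.

```python
# Minimal sketch of an HMM-derived Fisher kernel feeding a precomputed-kernel SVM.
# Assumptions: hmmlearn + scikit-learn, synthetic "return" windows, finite-difference
# Fisher scores over emission means only, identity in place of the Fisher information.
import numpy as np
from hmmlearn.hmm import GaussianHMM
from sklearn.svm import SVC

def fisher_score(model, window, eps=1e-4):
    """Approximate d/d(means) log P(window | HMM) by one-sided finite differences."""
    base = model.score(window)
    means = model.means_.copy()
    grads = []
    for i in range(means.shape[0]):
        for j in range(means.shape[1]):
            model.means_[i, j] += eps
            grads.append((model.score(window) - base) / eps)
            model.means_[i, j] = means[i, j]  # restore the perturbed parameter
    return np.asarray(grads)

def fisher_kernel(scores_a, scores_b):
    """Practical Fisher kernel: inner product of Fisher scores (identity metric)."""
    return scores_a @ scores_b.T

# Toy usage on synthetic return windows with binary direction labels.
rng = np.random.default_rng(0)
windows = [rng.normal(size=(20, 1)) for _ in range(60)]
labels = rng.integers(0, 2, size=60)

hmm = GaussianHMM(n_components=3, covariance_type="diag", n_iter=50, random_state=0)
hmm.fit(np.vstack(windows), lengths=[len(w) for w in windows])

scores = np.array([fisher_score(hmm, w) for w in windows])
K = fisher_kernel(scores, scores)
svm = SVC(kernel="precomputed").fit(K, labels)
print("training accuracy:", svm.score(K, labels))
```

In a full treatment the Fisher score would be taken with respect to all HMM parameters (initial, transition and emission) and scaled by the inverse Fisher information matrix; the inner-product simplification shown here is the common practical shortcut.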
