Abstract
Due to the complexity of real learning tasks, learning algorithms based on the ERM (empirical risk minimization) principle often fit the training samples well but predict poorly on future samples. The SVM (support vector machine), a kernel-based learning algorithm, embodies Vapnik's SRM (structural risk minimization) principle. It overcomes the problem posed by ERM by optimizing an objective that combines the learning error on the training samples with the capacity of the hypothesis space. We apply nonlinear support vector regression (SVR) to the prediction of financial time series, motivated by the success of BP and RBF neural networks in this domain. Before applying SVR, we use ICA (independent component analysis) for feature extraction. Traditional PCA only accounts for uncorrelatedness between features, whereas ICA requires statistical independence, a stricter condition. Experiments show that our ICA+SVR method is superior to alternatives such as PCA+SVR and KPCA+SVR.