Abstract

Article history: Received April 28, 2012; Accepted July 6, 2012; Available online July 12, 2012

An autoregressive moving average (ARMA) process and a dynamic neural network, namely the nonlinear autoregressive network with exogenous inputs (NARX), are compared by evaluating their ability to predict financial time series, here the S&P500 returns. Two classes of ARMA are considered. The first is the standard ARMA model, which is a linear static system. The second uses a Kalman filter (KF) to estimate and predict the ARMA coefficients; this model is a linear dynamic system. The forecasting ability of each system is evaluated by means of the mean absolute error (MAE) and mean absolute deviation (MAD) statistics. Simulation results indicate that the ARMA-KF system performs better than the standard ARMA alone; thus, introducing dynamics into the ARMA process improves the forecasting accuracy. In addition, the ARMA-KF outperformed the NARX. This result may suggest that the linear component of the S&P500 return series is more dominant than the nonlinear part. In sum, we conclude that introducing dynamics into the ARMA process provides an effective system for S&P500 time series prediction. © 2012 Growing Science Ltd. All rights reserved.
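The core idea of the ARMA-KF system is to let the model coefficients evolve over time and update them recursively as new observations arrive. The paper does not give its exact state-space formulation, so the following is only a minimal NumPy sketch for the simplest case: a time-varying AR(1) coefficient modeled as a random walk and tracked by a scalar Kalman filter. The function name `kalman_ar1` and the noise variances `q` and `r` are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def kalman_ar1(y, q=1e-4, r=1.0):
    """Track a time-varying AR(1) coefficient phi_t with a scalar Kalman filter.

    Assumed (illustrative) state-space model:
        state:       phi_t = phi_{t-1} + w_t,        w_t ~ N(0, q)
        observation: y_t   = phi_t * y_{t-1} + v_t,  v_t ~ N(0, r)

    Returns the filtered coefficient path and one-step-ahead predictions of y.
    """
    n = len(y)
    phi = np.zeros(n)    # filtered coefficient estimates
    pred = np.zeros(n)   # one-step-ahead predictions of y
    P = 1.0              # variance of the state estimate
    for t in range(1, n):
        # Predict step: random-walk state, so the mean carries over
        phi_prior = phi[t - 1]
        P_prior = P + q
        pred[t] = phi_prior * y[t - 1]
        # Update step with the new observation y[t]
        H = y[t - 1]                 # time-varying observation "matrix"
        S = H * P_prior * H + r      # innovation variance
        K = P_prior * H / S          # Kalman gain
        phi[t] = phi_prior + K * (y[t] - pred[t])
        P = (1.0 - K * H) * P_prior
    return phi, pred
```

With `q` small the filter behaves like recursive least squares and converges to a constant coefficient; larger `q` lets the estimate adapt faster to drifting dynamics, which is the dynamic behavior the abstract credits for the improved accuracy.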

Highlights

  • The modeling and forecasting of time series is an appealing yet difficult task in real-world problems

  • The autoregressive moving average (ARMA) process, ARMA coupled with a Kalman filter (ARMA-KF), and the nonlinear autoregressive with exogenous inputs (NARX) neural network were compared for financial time series prediction

  • Empirically, for the test case of S&P500 variations and on the basis of two evaluation criteria, the mean absolute error (MAE) and the mean absolute deviation (MAD), the ARMA-KF, a linear dynamic predictive system, performs best
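The two evaluation criteria above can be sketched in a few lines of NumPy. The paper does not spell out its exact MAD formula, so the version below assumes one common convention: the mean absolute deviation of the forecast errors about their own mean.

```python
import numpy as np

def mae(y_true, y_pred):
    """Mean absolute error between realized values and forecasts."""
    e = np.asarray(y_true, dtype=float) - np.asarray(y_pred, dtype=float)
    return np.mean(np.abs(e))

def mad(y_true, y_pred):
    """Mean absolute deviation of the forecast errors about their mean
    (an assumed convention; the paper does not give its formula)."""
    e = np.asarray(y_true, dtype=float) - np.asarray(y_pred, dtype=float)
    return np.mean(np.abs(e - np.mean(e)))
```

MAE penalizes the raw size of the errors, while this MAD variant measures their spread after removing any systematic bias, so the two can rank models differently when one forecaster is biased but consistent.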

Introduction

The modeling and forecasting of time series is an appealing yet difficult task in real-world problems. A number of techniques to predict time series have been introduced in the literature, including the well-known autoregressive integrated moving average (ARIMA) processes and artificial neural networks (ANNs). Introduced by Box and Jenkins (Box & Jenkins, 1976), ARIMA models combine an autoregressive (AR) part, a moving average (MA) part, and an integrated (I) part that makes the data series stationary by differencing. ARIMA models assume a linear relationship between current and past values of a time series and the white noise terms. They therefore fit linear relations better than nonlinear ones.

