Abstract

Recent studies have shown that there is predictable variation in the returns of financial assets over time. We investigate whether the predictive power of the economic and financial variables employed in those studies can be enhanced if the statistical method of linear regression is replaced by feedforward neural networks trained with backpropagation of error. A shortcoming of backpropagation networks is that too many free parameters allow the neural network to fit the training data arbitrarily closely, resulting in an "overfitted" network. Overfitted networks have poor generalization capabilities. We explore two methods that attempt to overcome this shortcoming by reducing the complexity of the network. The results of our experiments confirm that an "overfitted" network, while making better predictions for within-sample data, makes poor predictions for out-of-sample data. The complexity-reduction methods explored in this paper clearly improve out-of-sample forecasts. We show that one cannot say with any degree of confidence that the linear regression forecasts are conditionally efficient with respect to the neural network forecasts. However, one can say with some confidence that the neural network forecasts are conditionally efficient with respect to the linear regression forecasts.
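To make the overfitting argument concrete, the sketch below trains a small one-hidden-layer feedforward network by backpropagation on synthetic data and compares in-sample and out-of-sample errors with and without an L2 weight-decay penalty. Weight decay is used here only as one possible complexity-reduction technique; the specific methods studied in the paper, as well as the data, network size, and penalty strength in this example, are hypothetical choices for illustration and are not taken from the paper.

```python
# Minimal sketch, assuming a weak linear "predictor -> return" signal with heavy
# noise, mimicking the low signal-to-noise ratio of financial return prediction.
import numpy as np

rng = np.random.default_rng(0)

n_train, n_test, n_features = 60, 60, 4
X_train = rng.normal(size=(n_train, n_features))
X_test = rng.normal(size=(n_test, n_features))
beta = np.array([0.3, -0.2, 0.1, 0.0])          # hypothetical true coefficients
y_train = X_train @ beta + rng.normal(scale=1.0, size=n_train)
y_test = X_test @ beta + rng.normal(scale=1.0, size=n_test)


def train_mlp(X, y, n_hidden, weight_decay, lr=0.01, epochs=5000):
    """One-hidden-layer tanh network, squared-error loss, plain gradient descent."""
    n = X.shape[1]
    W1 = rng.normal(scale=0.5, size=(n, n_hidden))
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(scale=0.5, size=n_hidden)
    b2 = 0.0
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)                 # hidden activations
        pred = h @ W2 + b2                       # network output
        err = pred - y
        # Backpropagation of the squared-error gradient, plus the L2 penalty
        # that shrinks the free parameters (weight decay).
        grad_W2 = h.T @ err / len(y) + weight_decay * W2
        grad_b2 = err.mean()
        delta_h = np.outer(err, W2) * (1.0 - h ** 2)
        grad_W1 = X.T @ delta_h / len(y) + weight_decay * W1
        grad_b1 = delta_h.mean(axis=0)
        W1 -= lr * grad_W1; b1 -= lr * grad_b1
        W2 -= lr * grad_W2; b2 -= lr * grad_b2
    return W1, b1, W2, b2


def mse(params, X, y):
    W1, b1, W2, b2 = params
    pred = np.tanh(X @ W1 + b1) @ W2 + b2
    return float(np.mean((pred - y) ** 2))


# A deliberately over-parameterized network versus a weight-decay-regularized one.
unpenalized = train_mlp(X_train, y_train, n_hidden=30, weight_decay=0.0)
penalized = train_mlp(X_train, y_train, n_hidden=30, weight_decay=0.05)

for name, params in [("no penalty", unpenalized), ("weight decay", penalized)]:
    print(f"{name:12s}  in-sample MSE {mse(params, X_train, y_train):.3f}  "
          f"out-of-sample MSE {mse(params, X_test, y_test):.3f}")
```

Under these assumptions, the unpenalized network tends to achieve a lower in-sample error while the penalized network tends to generalize better out of sample, which is the pattern the abstract describes.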
