Abstract

This study conducts a comparative analysis of traditional and machine learning models for financial option pricing, using historical stock price and interest rate data. Traditional models such as Black-Scholes, Heston, Merton Jump-Diffusion, and GARCH are evaluated against machine learning models including Multi-Layer Perceptrons (MLPs) and Long Short-Term Memory (LSTM) networks. The analysis employs performance metrics such as Mean Squared Error (MSE), Mean Absolute Error (MAE), Root Mean Squared Error (RMSE), and the R value. Results indicate that the GARCH model excels in predictive accuracy due to its ability to capture volatility clustering, while machine learning models, especially the tuned neural network, demonstrate superior flexibility and adaptability in managing complex non-linear relationships in financial data. Traditional models, although theoretically robust, show limitations under varying market conditions. The study underscores the potential of hybrid approaches that combine traditional and machine learning techniques, leveraging their respective strengths for more accurate and reliable option pricing. Future research directions include exploring advanced machine learning architectures and improving model transparency through explainable AI.
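To make the comparison concrete, the sketch below shows the closed-form Black-Scholes price for a European call (the simplest of the traditional models named above) together with the MSE, MAE, and RMSE metrics used to score any pricing model against observed market prices. The parameter values in the usage comment are illustrative, not taken from the study's data.

```python
import math

def black_scholes_call(S, K, T, r, sigma):
    """European call price under the Black-Scholes model.

    S: spot price, K: strike, T: time to maturity (years),
    r: risk-free rate, sigma: annualized volatility.
    """
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    # Standard normal CDF via the error function
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

# The error metrics used to compare model prices against market prices
def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def mae(y_true, y_pred):
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    return mse(y_true, y_pred) ** 0.5

# Illustrative usage: at-the-money call, one year to expiry
price = black_scholes_call(S=100, K=100, T=1.0, r=0.05, sigma=0.2)
# price is approximately 10.45
```

A machine learning model such as an MLP or LSTM would be trained on historical option quotes and scored with the same metrics, which is what makes the head-to-head comparison in the study possible.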
