Articles published on Mathematical finance
- Research Article
- 10.55640/ijefms/volume11issue02-03
- Feb 6, 2026
- International Journal of Economics Finance & Management Science
- Dr Mikhail Urinson
The article examines the Kendall methodology as an integrated quantitative WT/PE architecture that translates heuristic market intuition into a formalized language of signals, portfolio rules, and rigid verification procedures on long historical series. The relevance of the study is driven by the growth of backtest overfitting risks, demands for regulatory and institutional transparency, and the industry's shift toward integrated AI/ML pipelines, in which the verifiability of trading algorithms becomes as significant as their predictive power. The objective of the work is to articulate, in scientific terms, the internal logic of the WaveTech/PortfolioExpert linkage and the core of proprietary indicators (SMA bands 10/21/41, PPM oscillators, the ER metric), and to demonstrate how thematically concentrated Top-ER portfolios with a controlled risk profile are formed through signal confluence and ER-based selection. The scientific novelty consists in a systemic formalization of the robustness-by-design principle for a practical trading platform: multi-level timeframe alignment, a discrete signal refresh regime, outlier exclusion, strict in-/out-of-sample and walk-forward discipline, and a substantiated inclusion of ML-enhanced Genetic Evolution Algorithms as a meta-layer for searching interpretable strategies without transitioning to an opaque black box. The main results show that a portfolio constructed exclusively on ER logic and transferred statically from the in-sample period 2015–2020 into the out-of-sample window 2021–2025 preserves positive dynamics under an expected reduction in returns, which is interpreted as empirical evidence of the construct's transferability and of the limits of its adaptability to changing market regimes. The article will be helpful to researchers and practitioners of quantitative trading, portfolio managers, trading-platform architects, and risk specialists interested in reproducible and auditable algorithmic strategies.
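The selection logic described above can be illustrated with a short sketch. The definitions below are assumptions for illustration only: ER is taken as Kaufman's Efficiency Ratio and "confluence" as price closing above all three SMA bands, which is not necessarily the proprietary WaveTech/PortfolioExpert formulation.

```python
# Illustrative sketch only: the paper's proprietary ER metric, PPM oscillators, and
# confluence rules are not specified in the abstract, so ER is assumed to be
# Kaufman's Efficiency Ratio and the SMA-band filter is a plausible stand-in.
import pandas as pd

def efficiency_ratio(close: pd.Series, window: int = 20) -> pd.Series:
    """Net directional move divided by the summed absolute daily moves."""
    net_move = (close - close.shift(window)).abs()
    path_length = close.diff().abs().rolling(window).sum()
    return net_move / path_length

def sma_band_confluence(close: pd.Series) -> pd.Series:
    """Bullish when price sits above the 10/21/41-day simple moving averages."""
    sma10, sma21, sma41 = (close.rolling(w).mean() for w in (10, 21, 41))
    return (close > sma10) & (close > sma21) & (close > sma41)

def top_er_selection(prices: pd.DataFrame, n: int = 10) -> list[str]:
    """Rank bullish-confluence tickers by their latest ER and keep the top n."""
    latest_er = {t: efficiency_ratio(prices[t]).iloc[-1] for t in prices.columns}
    bullish = [t for t in prices.columns if sma_band_confluence(prices[t]).iloc[-1]]
    return sorted(bullish, key=lambda t: latest_er[t], reverse=True)[:n]
```

A static in-/out-of-sample test in the abstract's spirit would fit the selection on 2015–2020 data and then hold it fixed while evaluating 2021–2025 returns.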
- Research Article
- 10.1016/j.apenergy.2025.127139
- Feb 1, 2026
- Applied Energy
- Silvia Romagnoli + 1 more
A fuzzy-and-fair framework for solar irradiance modeling and derivative pricing: Bridging photovoltaic production risk and climate-linked finance
- Research Article
- 10.3389/frai.2025.1752580
- Jan 21, 2026
- Frontiers in Artificial Intelligence
- Jiarui Chi
The rapid development of robo-advisory and quantitative investment has been accompanied by persistent concerns about limited personalization and the opacity of black-box models operating on multimodal financial information. This paper addresses these issues from a decision-support perspective by constructing FinErva, a multimodal chain-of-thought dataset tailored to financial applications. FinErva comprises 7,544 manually verified question-answer pairs, divided into two economically relevant tasks: contract and disclosure understanding (FinErva-Pact) and candlestick-chart-based technical analysis (FinErva-Price). Building on this dataset, the paper proposes a two-stage training framework, Supervised-CoT Learning followed by Self-CoT Refinement, and applies it to eight vision-language models, each with fewer than 0.8 billion parameters. Empirical results show that these lightweight models approach the performance of finance professionals and clearly outperform non-expert investors. Overall, the findings indicate that appropriately designed multimodal chain-of-thought supervision enables interpretable modeling of key research tasks such as contract review and chart interpretation under realistic computational and deployment constraints, providing new data and methodology for the development of personalized, explainable, and operationally feasible AI systems in investment advisory and risk management.
- Research Article
- 10.1142/s0219024925500220
- Jan 20, 2026
- International Journal of Theoretical and Applied Finance
- Cyril Benezet + 2 more
Dynamic hedging theory only makes sense within the setup of one given model, whereas the practice of dynamic hedging is just the opposite, with models chasing the data through daily recalibration. This is quite a paradox of quantitative finance. In this paper, we revisit the notion of hedging valuation adjustment (HVA), originally intended to deal with dynamic hedging frictions, in the direction of recalibration and model risks. Specifically, we extend to callable assets the HVA model risk approach from earlier work. The classical way to deal with model risk is to reserve the differences between the valuations in reference models and in the local models used by traders. However, while traders' prices are thus corrected, their hedging strategies and their exercise decisions are still wrong, which necessitates a risk-adjusted reserve. We illustrate our approach on a stylized callable range accrual representative of huge amounts of structured products on the market. We show that a model risk reserve adjusted for the risk of wrong exercise decisions may largely exceed a basic reserve accounting only for valuation differences.
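A schematic reading of that reserve logic, in our own notation rather than the paper's: the basic reserve is the valuation gap, and the risk-adjusted reserve adds a tail measure of the losses incurred when locally computed decisions are run against the reference-model dynamics,

\[
R_{\text{basic}} = V^{\text{ref}}_0 - V^{\text{loc}}_0,
\qquad
R_{\text{adj}} = R_{\text{basic}} + \rho\!\left(L^{\text{hedge}} + L^{\text{exercise}}\right),
\]

where \(V^{\text{ref}}_0\) and \(V^{\text{loc}}_0\) are the time-0 valuations in the reference and local models, \(L^{\text{hedge}}\) and \(L^{\text{exercise}}\) denote the losses generated by the locally computed hedging strategy and exercise policy under the reference model, and \(\rho\) stands in for a risk measure such as expected shortfall. The exact form of the adjustment in the paper may differ; this is only meant to fix ideas.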
- Research Article
- 10.63363/aijfr.2026.v07i01.2947
- Jan 18, 2026
- Advanced International Journal for Research
- Jaspal Singh
Accurate stock market forecasting remains one of the most intellectually challenging and financially critical tasks in quantitative finance, driven by the inherent volatility, non-linearity, and chaotic nature of financial time series data. Traditional statistical and econometric models (such as ARIMA) often prove inadequate in capturing the complex, long-term dependencies and multimodal influencing factors, leading to a paradigm shift toward advanced Deep Learning (DL) methodologies. This comprehensive review synthesizes recent research (2022–2025) across diverse global markets, including the S&P 500, the Indian National Stock Exchange (NSE), and cryptocurrency exchanges, to evaluate the efficacy of cutting-edge DL architectures. We focus on the performance of stand-alone Recurrent Neural Network (RNN) variants, namely Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU), alongside Convolutional Neural Networks (CNNs) and emerging hybrid models (e.g., CNN-LSTM, LSTM-DNN, LSTM-GNN, and Attention-based architectures). Furthermore, this paper details the critical role of multimodal data fusion, integrating market sentiment derived from financial news and social media (NLP), and addresses the growing demand for Explainable Artificial Intelligence (XAI) to foster transparency and trust in automated investment systems. The evidence overwhelmingly supports the superior predictive power and robustness of hybrid and deep recurrent models, affirming their role in advanced algorithmic trading and robust portfolio optimization.
- Research Article
- 10.3390/e28010084
- Jan 11, 2026
- Entropy
- Keyue Yan + 6 more
Stock price prediction is a core challenge in quantitative finance. While machine learning has advanced the modeling of complex financial time series, existing methods often rely on single-target predictions, underutilize multidimensional market information, and are disconnected from practical trading systems. To address these gaps, this research develops a hybrid machine learning framework for flexible target forecasting and systematic trading of major American technology stocks. The framework integrates Ensemble Models (AdaBoost, Decision Tree, LightGBM, Random Forest, XGBoost) with Fusion Models (Voting, Stacking, Blending) and introduces a Transfer Learning method enhanced by Dynamic Time Warping to facilitate knowledge sharing across assets, improving robustness. Focusing on ten key stocks, we forecast three distinct momentum indicators: next-day Closing Price Difference, Moving Average Difference, and Exponential Moving Average Difference. Empirical results demonstrate that the proposed Transfer Learning approach achieves superior predictive performance and trading simulations confirm that strategies based on these predicted momentum signals generate substantial returns. This research demonstrates that the proposed hybrid machine learning framework can mitigate the high information entropy inherent in financial markets, offering a systematic and practical method for integrating machine learning with quantitative trading.
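The three forecast targets named in the abstract, together with a plain dynamic-time-warping distance that could be used to pick the closest source asset for transfer, can be sketched as follows. Window lengths and the transfer procedure itself are assumptions for illustration, not the paper's settings.

```python
# Sketch of the three momentum targets and a basic DTW distance; the paper's exact
# definitions, windows, and transfer-learning pipeline are not reproduced here.
import numpy as np
import pandas as pd

def forecast_targets(close: pd.Series, window: int = 10) -> pd.DataFrame:
    """Next-day differences of the close, its SMA, and its EMA."""
    sma = close.rolling(window).mean()
    ema = close.ewm(span=window, adjust=False).mean()
    return pd.DataFrame({
        "close_diff": close.diff().shift(-1),  # next-day Closing Price Difference
        "sma_diff": sma.diff().shift(-1),      # next-day Moving Average Difference
        "ema_diff": ema.diff().shift(-1),      # next-day Exponential MA Difference
    })

def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Classic O(n*m) dynamic time warping between two 1-D return series."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return float(cost[n, m])
```

Pairs of stocks with the smallest DTW distance would be natural candidates for sharing training data or model parameters in the transfer step.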
- Research Article
- 10.20956/j.v22i2.46442
- Jan 10, 2026
- Jurnal Matematika, Statistika dan Komputasi
- Andi Karina Dwi Maharani + 1 more
The accurate valuation of European options is a primary challenge in quantitative finance, particularly when dividend payments influence the underlying stock price. Conventional option pricing models often overlook the dividend variable or face computational complexities that reduce the accuracy and stability of the results. This study aims to determine the value of European-style options using the trinomial method with dividend payments. The trinomial method with dividends was applied to five stocks with the same period and expiration date. Based on the Mean Absolute Percentage Error (MAPE) calculation, it can be concluded that the trinomial model used to predict option prices has varying levels of accuracy. The results show the MAPE values for a 2.25-month dividend as follows: AAPL at 27.11% for calls and 25.86% for puts; MSFT at 10.03% for calls and 4.57% for puts; AON at 7.56% for calls and 6.52% for puts; IBM at 24.96% for calls and 15.22% for puts; and META at 17.46% for calls and 28.59% for puts. Meanwhile, the MAPE calculation for a 3-month dividend yielded: AAPL at 27.13% for calls and 25.86% for puts; MSFT at 10.03% for calls and 4.57% for puts; AON at 7.94% for calls and 7.69% for puts; IBM at 24.96% for calls and 15.22% for puts; and META at 17.46% for calls and 28.59% for puts. Overall, the calculations show aggregate MAPE values of 17.42% for calls and 16.15% for puts (2.25-month dividend), and 17.50% for calls and 16.39% for puts (3-month dividend). This indicates that the trinomial model, which accounts for dividend distributions, produces values that approximate actual option prices.
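As an illustration of the kind of computation the abstract describes, the sketch below prices a European option on a trinomial tree with a single discrete cash dividend handled by the escrowed-dividend adjustment, and computes MAPE against observed prices. The paper's exact dividend treatment and tree parameterization are not given above, so both are assumptions.

```python
# Hedged sketch: Boyle-style trinomial tree for a European option with one discrete
# cash dividend removed via its present value (escrowed-dividend adjustment),
# plus the MAPE metric used in the abstract. Not the paper's exact implementation.
import math
import numpy as np

def trinomial_european(S0, K, r, sigma, T, steps=200, dividend=0.0, t_div=0.0, call=True):
    S0_adj = S0 - dividend * math.exp(-r * t_div)   # escrow the dividend's present value
    dt = T / steps
    u = math.exp(sigma * math.sqrt(2 * dt))         # Boyle (1986) parameterization
    a = math.exp(r * dt / 2)
    b = math.exp(-sigma * math.sqrt(dt / 2))
    c = math.exp(sigma * math.sqrt(dt / 2))
    pu = ((a - b) / (c - b)) ** 2
    pdn = ((c - a) / (c - b)) ** 2
    pm = 1.0 - pu - pdn
    j = np.arange(-steps, steps + 1)                # 2*steps + 1 terminal nodes
    ST = S0_adj * u ** j
    payoff = np.maximum(ST - K, 0.0) if call else np.maximum(K - ST, 0.0)
    disc = math.exp(-r * dt)
    for _ in range(steps):                          # backward induction
        payoff = disc * (pu * payoff[2:] + pm * payoff[1:-1] + pdn * payoff[:-2])
    return float(payoff[0])

def mape(actual, predicted):
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    return float(np.mean(np.abs((actual - predicted) / actual)) * 100)
```

With market option prices as `actual` and tree prices as `predicted`, `mape` yields the kind of per-ticker percentage errors reported in the abstract.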
- Research Article
- 10.36948/ijfmr.2026.v08i01.65636
- Jan 8, 2026
- International Journal For Multidisciplinary Research
- Kartik Garg
The paper systematically examines the conceptual underpinnings of quantitative trading and machine learning, followed by a detailed discussion of data sources and feature engineering practices commonly employed in ML-based trading systems. It reviews the use of supervised, unsupervised, deep learning, and reinforcement learning models for return prediction, portfolio optimization, and trade execution. Particular attention is given to model evaluation frameworks, backtesting methodologies, and performance metrics used to assess economic significance while mitigating risks such as overfitting and data snooping bias. The review highlights that while machine learning models often outperform traditional approaches in predictive accuracy, their real-world effectiveness is constrained by transaction costs, model decay, interpretability challenges, and market microstructure effects. Furthermore, the paper discusses emerging research directions, including explainable artificial intelligence, federated learning, ESG-aware trading, and the integration of macroeconomic and geopolitical data. Overall, the study concludes that machine learning serves as a powerful complement rather than a substitute for financial theory, and that future advancements will depend on hybrid models that balance predictive performance, economic intuition, and ethical responsibility.
- Research Article
- 10.1016/j.ribaf.2026.103329
- Jan 1, 2026
- Research in International Business and Finance
- Manish + 1 more
Bridging behavioral insights and quantitative finance: AI-powered Black-Litterman framework with technical and sentiment signals
- Research Article
- 10.70088/864a3286
- Dec 31, 2025
- GBP Proceedings Series
- Yitao Chen
With increasing volatility in global financial markets and the rapid development of complex trading strategies, the importance of financial mathematics in asset pricing, risk measurement, and investment decision-making has become increasingly evident. Stochastic process models offer systematic methods for characterizing dynamic asset price movements, capturing both short-term fluctuations and long-term trends, while risk assessment and pricing models provide a quantitative foundation for evaluating market uncertainty and guiding investment strategies. This study explores the role of financial mathematics in modern markets by examining the fundamental mechanisms behind price fluctuations and the stability of market risk, supported by empirical analysis that considers the proportion of quantitative model application alongside established market volatility indices. The results demonstrate that an appropriately moderate application of financial mathematical tools can effectively mitigate overall market volatility and promote more efficient allocation of resources across different financial instruments and sectors. Conversely, when the use of quantitative models is highly consistent or overly concentrated, short-term volatility may temporarily increase, exerting transient pressure on market stability. These findings highlight the nuanced impact of financial mathematics, indicating that while these tools are essential for informed decision-making and risk management, their implementation requires careful calibration to avoid unintended destabilizing effects on financial markets.
- Research Article
- 10.61136/038h7f12
- Dec 30, 2025
- Khidmatan
- Mushawir Mushawir + 2 more
This community service program aims to enhance basic sharia financial management literacy among small food vendors around the Sunan Pandanaran Islamic Boarding School Complex in Yogyakarta through the utilization of digital financial applications. Prior to the implementation, key challenges identified included inconsistent bookkeeping practices, the mixing of personal and business finances, and limited understanding of basic financial mathematics. The program employed a Participatory Action Research (PAR) approach, consisting of initial data collection, written interviews, and sharia financial education emphasizing the principle of trust (amanah) in financial recording, the separation of business and personal assets, and basic understanding of commercial zakat (zakat al-tijarah), along with basic financial mathematics training and technical assistance in using Google Sheets and BukuWarung applications. The results indicate an improvement in participants’ ability to record daily transactions, prepare simple profit–loss statements, and apply sharia-based financial management practices. Approximately 70% of participants were able to independently implement digital bookkeeping, although challenges related to digital literacy and internet connectivity remained. Overall, the program proved effective in improving financial governance and efficiency among small food vendors based on sharia financial principles.
- Research Article
- 10.26571/reamec.v13.18973
- Dec 29, 2025
- REAMEC - Rede Amazônica de Educação em Ciências e Matemática
- Ananda Itsu Moraes Conceição + 1 more
The study addresses the identification of the application of Ethnomathematics in the teaching of mathematics in Basic Education. The objective of this research is to identify dissertations and educational products of professional master's degrees that worked with mathematical content combined with ethnomathematical practices. The methodology was qualitative and, as a data collection method, we used assumptions of the Systematic Literature Review according to Mendes and Pereira (2020), searching for professional master's dissertations from 2013 to 2023. The search revealed 36 dissertations whose educational products explored mathematical content such as basic arithmetic operations, plane geometry, quantities and measurements, concepts of financial mathematics, and proportion, presented through diversified educational products such as didactic sequences, comic books, and documentary videos. The research showed that the ethnomathematical practices identified in the productions value local knowledge and connect mathematical content to the students' daily lives, promoting greater engagement and understanding. It is considered that research such as that presented in this text encourages researchers and teachers to teach mathematics in a more dynamic way that values cultural diversity, which can promote the development of more creative and realistic classroom practices.
- Research Article
- 10.36948/ijfmr.2025.v07i06.64761
- Dec 27, 2025
- International Journal For Multidisciplinary Research
- Madhavi Kunchi
Optimization plays a vital role in financial decision-making by providing mathematical frameworks to achieve optimal results under constraints. Modern finance deals with uncertainty, limited resources, and risk-return trade-offs. Optimization techniques help in portfolio selection, asset allocation, risk management, capital budgeting, derivative pricing, and financial planning. This paper presents a comprehensive and systematic study of optimization problems in finance, covering classical and modern techniques such as linear programming, nonlinear programming, quadratic programming, stochastic optimization, and dynamic programming. The study emphasizes theoretical foundations as well as practical relevance in real-world financial systems.
- Research Article
- 10.63593/fms.2788-8592.2025.11.011
- Dec 24, 2025
- Frontiers in Management Science
- Allen Lin
Traditional financial institutions (TFIs), particularly community banks and small asset management firms (SAMFs) with assets under $50 billion, face a trifecta of bottlenecks when accessing Web3: prohibitive technical barriers, fragmented regulatory compliance risks, and cognitive dissonance between crypto asset valuation and traditional financial logic. In the U.S. market, constrained by multi-agency oversight (SEC, OFAC, FinCEN), the adoption rate of Web3 access among these small TFIs remains merely 5.2% (SIFMA, 2025), far below the 37.8% penetration among large institutions with assets exceeding $500 billion. Leveraging my dual expertise in quantitative finance (CFA Level III) and Web3 multi-chain development (Uniswap V3/V4 protocol experience, daos.world multi-chain DAO incubation), this study constructs a three-dimensional synergistic theoretical framework integrating regulatory adaptation, technical simplification, and valuation migration. A low-barrier access pathway is proposed, centered on the “TradFi-Web3 Connector” system—featuring compliant wallet custody based on EIP-4337 account abstraction and a traditional finance-derived Web3 asset valuation model. Empirical validation across 8 U.S. small TFIs (4 community banks, 4 SAMFs) over an 8-month period (March–October 2025) demonstrates that this pathway reduces the average onboarding cycle from 2.8 months to 9.7 days (82.5% improvement), cuts compliance costs by 61.3% (from $95,400 to $37,300 per annum), achieves a 92.4% investment decision accuracy rate, and maintains a 100% pass rate in SEC compliance reviews with zero regulatory incidents. This research fills a critical gap in low-barrier Web3 access for resource-constrained TFIs, provides a replicable paradigm for the digital transformation of U.S. traditional finance, and empirically validates the synergy between regulatory compliance and technical innovation in cross-ecosystem integration.
- Research Article
- 10.54254/2754-1169/2026.ld30815
- Dec 24, 2025
- Advances in Economics, Management and Political Sciences
- Xihao Qin
Volatility forecasting is vital for risk management and quantitative trading, yet accurately predicting movements across diverse global indices remains challenging. This study addresses this by proposing a specialized framework to enhance forecasting precision and practical utility. This paper introduces a novel category-specific eXtreme Gradient Boosting (XGBoost) framework for global index volatility forecasting. Using an extensive Wharton Research Data Services (WRDS) dataset (Oct 2015 - Oct 2025), the study rigorously compares models including Linear Regression, Random Forest, and Long Short-Term Memory (LSTM). Results show that a hyperparameter-optimized XGBoost model achieves superior performance, reducing Mean Squared Error (MSE) by 60.5%. The category-specific approach significantly boosts accuracy across index types, yielding exceptional results for Société des Bourses Françaises (SBF) indices (MSE: 0.00311) and robust performance for Financial Times Stock Exchange (FTSE) and Standard & Poor's (S&P) categories. The framework's practical value is confirmed via a trading strategy that generates a 2.08% annual return with a Sharpe ratio of 0.626, while maintaining strong risk control (max drawdown: -5.00%). The research highlights the critical advantage of tailored modeling and ensemble techniques over generic approaches, substantially advancing financial volatility forecasting capabilities for both academic and practical applications.
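The category-specific idea can be sketched as one XGBoost regressor per index family, trained on lagged realized-volatility features. The feature construction, category labels, and hyperparameters below are illustrative assumptions, not the settings used in the paper.

```python
# Hedged sketch of category-specific volatility forecasting with XGBoost.
import pandas as pd
from xgboost import XGBRegressor
from sklearn.metrics import mean_squared_error

def make_features(returns: pd.Series, lags: int = 5) -> pd.DataFrame:
    vol = returns.rolling(21).std()                           # 21-day realized volatility
    X = pd.DataFrame({f"vol_lag_{k}": vol.shift(k) for k in range(1, lags + 1)})
    X["target"] = vol.shift(-1)                               # next-day volatility
    return X.dropna()

def fit_per_category(returns_by_index: dict[str, pd.Series],
                     category_of: dict[str, str]) -> dict[str, float]:
    """Pool indices of the same family (e.g. FTSE, S&P, SBF) and fit one model each."""
    mse = {}
    for cat in set(category_of.values()):
        frames = [make_features(returns_by_index[name])
                  for name, c in category_of.items() if c == cat]
        data = pd.concat(frames, ignore_index=True)
        split = int(0.8 * len(data))                          # simple train/test split
        train, test = data.iloc[:split], data.iloc[split:]
        model = XGBRegressor(n_estimators=300, max_depth=4, learning_rate=0.05)
        model.fit(train.drop(columns="target"), train["target"])
        pred = model.predict(test.drop(columns="target"))
        mse[cat] = mean_squared_error(test["target"], pred)
    return mse
```

Per-category MSE values of this kind are what the abstract compares against pooled, generic baselines.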
- Research Article
- 10.54254/2754-1169/2025.bj30766
- Dec 24, 2025
- Advances in Economics, Management and Political Sciences
- Yi Gu
This analysis explores how Te Hiku Media, a Māori-led non-profit organization, integrates interdisciplinary strategies to protect and revitalize the Māori language while actively resisting the forces of data colonization driven by large technology corporations. Central to their approach is the development of in-house machine learning models that leverage semi-supervised learning methods, which significantly reduce the need for extensive labeled datasets. By adopting this strategy, Te Hiku maintains linguistic sovereignty, safeguards cultural heritage, and minimizes reliance on external corporate infrastructures. Beyond the technical dimension, the organization applies financial tools such as cost-benefit analysis, Bayesian probability, and multivariate regression to support risk-aware decision-making in areas of data ownership, potential commercialization, and the long-term preservation of cultural value. The initiative also foregrounds critical ethical considerations, including the avoidance of algorithmic bias, the prevention of price discrimination in educational access, and the enforcement of strict community-led governance structures to ensure data privacy and cultural accountability. Te Hiku Media's work demonstrates how localized, ethical applications of artificial intelligence can simultaneously promote cultural continuity and advance social equity. Their model provides a replicable pathway for other Indigenous communities navigating similar technological, cultural, and geopolitical challenges in the rapidly evolving digital age.
- Research Article
- 10.37001/1c94s042
- Dec 22, 2025
- Educação Matemática em Revista - RS
- Arthur Medeiros Barros + 1 more
Research has indicated both the potential of using calculators to approach mathematical content and the challenges involved in making this actually happen, one of which is initial teacher training. Therefore, we asked ourselves: how has research approached calculators in Mathematics Degree Courses? In this text we discuss a bibliographical survey of Brazilian dissertations on calculators in Mathematics Degree Courses, based on works available in the Brazilian Digital Library of Theses and Dissertations. Based on a qualitative bibliographic survey and the use of content analysis techniques, we analyzed six dissertations. The analysis indicates an integration of the use of calculators with other technologies, such as GeoGebra, Maple, and Desmos. Furthermore, the focus is on financial mathematics, geometry, and integral and differential calculus content. We indicate the need for a connection between the use of calculators in Basic Education and in Higher Education, aiming at the training of future teachers.
- Research Article
- 10.1080/03610918.2025.2604845
- Dec 20, 2025
- Communications in Statistics - Simulation and Computation
- Louis O Scott
Square root processes are used in quantitative finance to model state variables that are non-negative. Applications include models for stochastic volatility, stochastic credit spreads, and strictly non-negative interest rates. Applications for derivative pricing typically require simulation of stochastic processes, including the square root process. The purpose of this paper is to revisit methods for fast simulation of the square root process to be applied with parallel processing. The paper reviews the square root process and a variety of simulation methods, including simulation of the non-central chi-squared distribution, the exact distribution for the process over discrete time intervals. Simulation methods for the non-central chi-squared distribution are relatively slow and not well suited for simulation in dynamic financial models. Alternative approximation methods are developed, and a battery of goodness-of-fit statistical tests is applied to the alternative methods. The paper includes a discussion of random number generation for the purpose of simulating square root processes across parallel processors. The alternative simulation methods require only one uniform random number generator per time step and facilitate hedging and parameter calibration with parallel processing. The test results indicate that the approximation methods generally converge and significantly reduce computing time when parallel processing is applied on a GPU.
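The trade-off discussed above can be made concrete with two standard one-step updates for a square-root (CIR) process: exact sampling from the non-central chi-squared transition, and a fast scheme that consumes a single uniform per step. Andersen's quadratic-exponential scheme is used below as a well-known stand-in; it is not necessarily one of the approximations developed in the paper.

```python
# One step of a CIR process dV = kappa*(theta - V)dt + sigma*sqrt(V)dW over dt.
import numpy as np
from scipy.stats import ncx2, norm

def exact_step(v, kappa, theta, sigma, dt, rng):
    """Sample the exact non-central chi-squared transition (slow per draw)."""
    c = sigma**2 * (1 - np.exp(-kappa * dt)) / (4 * kappa)
    df = 4 * kappa * theta / sigma**2
    nc = v * np.exp(-kappa * dt) / c
    return c * ncx2.rvs(df, nc, random_state=rng)

def qe_step(v, kappa, theta, sigma, dt, u, psi_c=1.5):
    """Andersen's quadratic-exponential scheme: one uniform u per step."""
    e = np.exp(-kappa * dt)
    m = theta + (v - theta) * e                                   # conditional mean
    s2 = (v * sigma**2 * e * (1 - e) / kappa
          + theta * sigma**2 * (1 - e) ** 2 / (2 * kappa))        # conditional variance
    psi = s2 / m**2
    if psi <= psi_c:                                              # quadratic branch
        b2 = 2 / psi - 1 + np.sqrt(2 / psi) * np.sqrt(2 / psi - 1)
        a = m / (1 + b2)
        return a * (np.sqrt(b2) + norm.ppf(u)) ** 2
    p = (psi - 1) / (psi + 1)                                     # exponential branch
    beta = (1 - p) / m
    return 0.0 if u <= p else np.log((1 - p) / (1 - u)) / beta
```

Because `qe_step` needs only a uniform draw and elementary functions, it maps cleanly onto per-thread random number streams, which is the parallel-processing setting the paper targets.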
- Research Article
- 10.61173/37xawa31
- Dec 19, 2025
- Finance & Economics
- Zhibo Feng
This study applies machine learning algorithms to the field of quantitative finance. By employing both Random Forest and Extreme Gradient Boosting (XGBoost) models to predict price movements of nine different US exchange-traded funds (ETFs), it assesses the practical performance of machine learning in ETF price forecasting, thereby assisting investors and institutions in better evaluating future ETF trends. The ETF price data used in this research are sourced from the US ETF Prices dataset on Kaggle. Technical indicators such as Bollinger Bands, the Relative Strength Index (RSI), and Moving Averages (MA) were incorporated into the models through feature engineering. The performance of both models was evaluated across different time windows and ETF products. Comparative analysis revealed that both Random Forest and XGBoost perform well within the 5- to 200-day forecasting horizon. The results indicate that larger sample sizes positively impact the goodness-of-fit of the Random Forest model, while excessively large samples may lead to degraded performance in XGBoost. In conclusion, while machine learning algorithms show strong promise in predicting ETF price movements, practitioners should still integrate market experience, sentiment analysis, and multi-factor evaluation to comprehensively assess ETF performance.
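A minimal version of the feature engineering and direction-classification setup described above might look like the following; the indicator windows and the next-day up/down label are common defaults assumed for illustration, and the XGBoost variant is analogous.

```python
# Hedged sketch: Bollinger-band position, RSI, and moving-average gap as features
# for a Random Forest next-day direction classifier.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

def build_features(close: pd.Series) -> pd.DataFrame:
    ma20 = close.rolling(20).mean()
    std20 = close.rolling(20).std()
    delta = close.diff()
    gain = delta.clip(lower=0).rolling(14).mean()
    loss = (-delta.clip(upper=0)).rolling(14).mean()
    rsi = 100 - 100 / (1 + gain / loss)                     # 14-day RSI
    X = pd.DataFrame({
        "ma_gap": close / ma20 - 1,                         # distance from 20-day MA
        "boll_pos": (close - ma20) / (2 * std20),           # position in Bollinger band
        "rsi": rsi,
    })
    X["up_next"] = (close.shift(-1) > close).astype(int)    # next-day direction label
    return X.dropna().iloc[:-1]                             # drop warm-up rows and unlabeled last row

def direction_hit_rate(close: pd.Series) -> float:
    data = build_features(close)
    split = int(0.8 * len(data))                            # chronological train/test split
    train, test = data.iloc[:split], data.iloc[split:]
    clf = RandomForestClassifier(n_estimators=300, random_state=0)
    clf.fit(train.drop(columns="up_next"), train["up_next"])
    return clf.score(test.drop(columns="up_next"), test["up_next"])
```

Swapping `RandomForestClassifier` for `xgboost.XGBClassifier` gives the second model family compared in the study.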
- Research Article
- 10.61173/3ktmwd61
- Dec 19, 2025
- Science and Technology of Engineering, Chemistry and Environmental Protection
- Zeyu Chen
In the financial world, it is always difficult to predict stock prices due to their high volatility and the noise inherent in market data. Because of the advantages of machine learning techniques, researchers and practitioners are increasingly adopting them to improve prediction accuracy and support investment decisions. This paper therefore reviews classification models (random forest, logistic regression), regression models (linear regression, XGBoost regression), time series models (LSTM, Prophet), and other complex machine learning models. It explains their theoretical basis, key functions, and practical applications in finance and stock markets. Furthermore, it explores the deployment of these models in specific contexts such as short-term trading guidance, portfolio rebalancing, and quantitative trading strategy development. It also addresses critical challenges associated with ML-based prediction, including data quality, model overfitting, and the need for robust risk management frameworks. Finally, this paper critically examines the essential advantages and limitations of ML in this sphere and suggests a positive outlook for future research, including the integration of unstructured data and the development of hybrid models that combine statistical and deep learning approaches.