Abstract
Portfolio optimization has attracted substantial interest within the artificial intelligence community due to its significant impact on financial decision-making, risk management, and market analysis. Reinforcement learning fits portfolio optimization well because both aim to maximize cumulative returns. In reinforcement learning, state transition probabilities are often unknown and must be estimated; in portfolio backtesting experiments, however, state transitions are deterministic, making the conventional approach of estimating them suboptimal for portfolio optimization. To address this issue, this study decomposes portfolio optimization into two core tasks, prediction and profit policy optimization, and proposes a novel reinforcement learning framework that assumes deterministic state transitions and comprises three main modules: feature extraction, prediction, and profit policy optimization. To model assets more effectively and comprehensively, we capture their temporal features, relational features, and market state, and we introduce a patch-wise correlation method and an attribute-based gate to enhance feature extraction. In the profit policy module, we adopt a deterministic policy and train the policy network with a recursive reinforcement learning method based on Monte Carlo sampling. This enables dynamic adjustment of asset investment weights to maximize cumulative returns. Extensive experiments on cryptocurrency datasets demonstrate the superior performance of our approach, achieving 36.6%-75.6% improvements in the main metrics.
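To make the profit-policy idea concrete, the sketch below illustrates one plausible reading of the training loop described above: a policy network maps extracted asset features to portfolio weights, and the cumulative log return over a Monte Carlo-sampled price trajectory is maximized directly by gradient ascent. All names (`PolicyNet`, `cumulative_log_return`), the network architecture, and the toy data are assumptions for illustration; the paper's actual feature-extraction modules (patch-wise correlation, attribute-based gate) and recursive training scheme are not reproduced here.

```python
import torch
import torch.nn as nn

class PolicyNet(nn.Module):
    """Hypothetical policy: maps per-asset features to portfolio weights."""
    def __init__(self, feat_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feat_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, features):                    # features: (n_assets, feat_dim)
        scores = self.net(features).squeeze(-1)     # one score per asset
        return torch.softmax(scores, dim=-1)        # long-only weights summing to 1

def cumulative_log_return(policy, feature_seq, return_seq):
    """Cumulative log return of the policy over one sampled trajectory.

    feature_seq: (T, n_assets, feat_dim) extracted/predicted features per period
    return_seq:  (T, n_assets) realized simple returns per period
    """
    total = 0.0
    for feats, rets in zip(feature_seq, return_seq):
        w = policy(feats)                            # deterministic weights for this period
        total = total + torch.log1p((w * rets).sum())
    return total

# Toy usage: maximize cumulative return over randomly sampled episodes.
if __name__ == "__main__":
    T, n_assets, feat_dim = 30, 5, 8
    policy = PolicyNet(feat_dim)
    opt = torch.optim.Adam(policy.parameters(), lr=1e-3)
    for _ in range(100):
        feats = torch.randn(T, n_assets, feat_dim)           # stand-in for extracted features
        rets = 0.01 * torch.randn(T, n_assets)                # stand-in for asset returns
        loss = -cumulative_log_return(policy, feats, rets)    # ascend cumulative log return
        opt.zero_grad()
        loss.backward()
        opt.step()
```

Because the backtest transitions are deterministic, the objective is differentiated end to end through the realized returns rather than through an estimated transition model, which is the design choice the abstract motivates.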