Abstract

In this paper, we study an optimal control problem whose cost function is interval-valued and whose state is governed by a stochastic differential equation. We introduce a generalized version of Bellman's optimality principle for stochastic systems with interval-valued cost functions, and we derive the corresponding Hamilton–Jacobi–Bellman equations and the associated optimal control decisions. Two numerical examples from finance, in which the cost functions are interval-valued, illustrate the effectiveness of the proposed results. The obtained results provide significantly more reliable decisions than those based on a conventional real-valued cost function.
