Abstract

Since the mid-20th century, Dynamic Programming (DP) algorithms have demonstrated the ability to solve optimal decision problems. Nevertheless, the immense number of mathematical operations required to solve complex, high-dimensional problems with DP has limited its use to small or simplified real problems. In recent decades, seeking to overcome the limitations of DP, many new Approximate Dynamic Programming (ADP) algorithms have emerged in different branches of science. ADP algorithms do not enumerate and evaluate every possible state of a system during the optimization process, as DP algorithms do. Instead, they approximate relevant features of the state space, and this approximation is iteratively improved by means of simulation and Monte Carlo methods. This technique allows ADP algorithms to overcome the dimensionality limitations of conventional DP while retaining many of its benefits. This paper considers the stochastic optimization of the dynamic selling strategy of a generator, which is allowed to change during the period of analysis. The consequences that a present decision has on future decisions, and the cost associated with that decision, are taken into account. The model considers a perfectly competitive two-settlement market. The stochastic nature of spot and futures prices is modeled using a spectral representation algorithm, and the availability of the generator is simulated with a four-state Markovian chronological model. The ADP algorithm implemented is validated against a DP algorithm for a simplified case and then used to solve a complete decision model. As the risk measure, a higher-moment risk metric is used to approximate the CVaR.
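To make the availability component concrete, a four-state chronological Markov model can be sketched as repeated sampling from a transition matrix. The state names and transition probabilities below are illustrative assumptions for the sketch, not parameters taken from the paper:

```python
import numpy as np

# Hypothetical 4-state availability model. The states and the one-step
# transition matrix are illustrative placeholders, not the paper's values.
STATES = ["available", "derated", "forced_outage", "maintenance"]
P = np.array([
    [0.97, 0.01, 0.01, 0.01],   # from "available"
    [0.30, 0.65, 0.04, 0.01],   # from "derated"
    [0.20, 0.05, 0.74, 0.01],   # from "forced_outage"
    [0.10, 0.00, 0.00, 0.90],   # from "maintenance"
])

def simulate_availability(steps, start=0, seed=0):
    """Chronologically sample the Markov chain for `steps` periods."""
    rng = np.random.default_rng(seed)
    path = [start]
    for _ in range(steps - 1):
        # Draw the next state index using the row of P for the current state.
        path.append(int(rng.choice(4, p=P[path[-1]])))
    return [STATES[s] for s in path]

path = simulate_availability(24)
```

A chronological sample path like this can then feed a Monte Carlo loop that updates the ADP value-function approximation, with the generator's output in each period constrained by its sampled availability state.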
