Abstract

We demonstrate a density projection approximation method for solving resource management problems with imperfect state information. The method expands the set of partially observable Markov decision process (POMDP) problems that can be solved with standard dynamic programming tools by addressing the dimensionality of the decision maker's belief state. Density projection is suitable for uncertainty over both physical states (e.g., resource stock) and process structure (e.g., biophysical parameters). We apply the method to an adaptive management problem under structural uncertainty in which a fishery manager's harvest policy affects both the stock of fish and the belief state about the process governing reproduction. We solve for the optimal endogenous learning policy—the active adaptive management approach—and compare it to passive learning and non-learning strategies. We show that learning improves long-run efficiency but typically requires a period of costly short-run investment.
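The core idea of density projection can be illustrated with a minimal sketch. The decision maker's belief over an uncertain quantity is, in general, a full probability density (infinite-dimensional), which makes the belief-state dynamic program intractable. Projecting that density onto a low-dimensional parametric family, here the normal family via moment matching (the Kullback–Leibler-minimizing projection onto an exponential family), reduces the belief state to a handful of sufficient statistics. The function name, grid, and example posterior below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def project_to_normal(grid, density):
    """Project a belief density, tabulated on a uniform grid, onto the
    normal family by moment matching (the KL-minimizing projection
    onto an exponential family)."""
    w = density / density.sum()                 # normalize to probability weights
    mu = np.sum(grid * w)                       # first moment
    var = np.sum((grid - mu) ** 2 * w)          # second central moment
    return mu, np.sqrt(var)

# Hypothetical example: a right-skewed posterior belief over a
# reproduction parameter r, tabulated on a fine grid.
grid = np.linspace(0.0, 2.0, 2001)
posterior = grid * np.exp(-3.0 * grid)          # unnormalized density
mu, sigma = project_to_normal(grid, posterior)
# The belief state is now summarized by (mu, sigma): two numbers
# instead of a 2001-point density, so standard dynamic programming
# over the (stock, mu, sigma) state space remains feasible.
```

In the adaptive management setting, the manager's harvest choice changes both the fish stock and the projected belief parameters, which is what makes active learning an endogenous part of the policy.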
