Problems of optimal bounded control for randomly excited systems are studied using the Dynamic Programming approach. An effective hybrid solution method has been developed for the corresponding Hamilton–Jacobi–Bellman (HJB) equations governing the optimized functional of the response energy. These PDEs are shown to admit an exact analytical solution within a certain ‘outer’ domain of the phase space. This solution provides boundary conditions for numerical study within the remaining ‘inner’ domain. The simple ‘dry-friction’ law had previously been shown to be optimal for a single-degree-of-freedom (SDOF) system within the outer domain and to become asymptotically optimal within the whole phase plane for the important case of so-called ‘long-term’ control, whereby the steady-state response is optimized according to an integral cost functional. These results are extended in this paper to multi-degree-of-freedom (MDOF) systems by means of a modal transformation. Thus, the multidimensional dry-friction law is shown to be optimal for the case of long-term control. The expected response energy is predicted for this case by a direct energy balance based on applying stochastic differential equation (SDE) calculus to the energy equation. Stochastic averaging is also used to obtain certain reliability estimates. In particular, numerical results for the expected time to first-passage failure are presented, illustrating the reduction in reliability due to the imposed bound on the control force. Monte Carlo simulation results are presented, demonstrating adequate accuracy of the predictions far beyond the expected applicability range of the asymptotic approaches.
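For orientation, a minimal sketch of the scalar ‘dry-friction’ control law referenced above is given below, under the usual assumption that the control force u is bounded in magnitude by R and acts against the velocity; the notation here is illustrative and not taken from the paper itself:

u(\dot{x}) \;=\; -\,R\,\operatorname{sgn}(\dot{x}), \qquad |u(\dot{x})| \le R .

In this sketch the control simply pushes with maximal admissible magnitude R opposite to the instantaneous velocity \dot{x}, which is why it is referred to as a ‘dry-friction’ (bang-bang) law.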