Abstract

This paper considers a linear discrete-time stochastic system with colored observation noise whose system parameters change randomly due to, say, abrupt load changes. In the system under consideration, the switching system parameters are not identified during adaptive control; instead, parameters identified a priori are used to determine the adaptive control input. First, an algorithm for optimal control from an arbitrary initial state to an arbitrary target state in the state space is derived by extending the dynamic programming technique. An algorithm is also derived for obtaining the state estimate of a system with colored observation noise using a discrete-time Kalman filter. Next, the a posteriori probability of each candidate parameter set is computed from the parameter probabilities at a given time using Bayes' formula, and it is shown that the adaptive control input can be determined as the sum, over the candidate parameters, of the products of the a posteriori probability and the optimal control input calculated by dynamic programming. Finally, the effectiveness of the proposed stochastic adaptive control algorithm is confirmed by computer simulation of a two-dimensional model system and by comparing four different methods of probability calculation when the system parameters are switched randomly.
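The Bayesian mixing step described above can be illustrated with a minimal sketch. This is not the paper's algorithm: it assumes hypothetical scalar models x' = a*x + u + w with a known Gaussian observation-noise variance, and a simple deadbeat law stands in for the paper's DP-derived optimal input. Only the structure is the same: a Bayes update of the parameter probabilities, followed by a probability-weighted sum of per-model control inputs.

```python
import math

# Hypothetical candidate parameters a_i for scalar models x' = a*x + u + w
models = [0.5, 0.9, 1.3]
NOISE_VAR = 0.1  # assumed observation-noise variance

def bayes_update(probs, x_prev, u_prev, y):
    """A posteriori probabilities P(a_i | y) via Bayes' formula,
    using Gaussian likelihoods of the one-step prediction residual."""
    weighted = []
    for a, p in zip(models, probs):
        pred = a * x_prev + u_prev                  # prediction under model i
        resid = y - pred                            # innovation
        like = math.exp(-resid**2 / (2 * NOISE_VAR)) \
               / math.sqrt(2 * math.pi * NOISE_VAR)
        weighted.append(like * p)                   # likelihood x prior
    total = sum(weighted)
    return [w / total for w in weighted]            # normalize to a posteriori

def adaptive_control(probs, x, target=0.0):
    """Adaptive input: sum over models of (posterior probability) x
    (per-model input); here a deadbeat law u_i = target - a_i*x is a
    stand-in for the DP-derived optimal control."""
    u_each = [target - a * x for a in models]
    return sum(p * u for p, u in zip(probs, u_each))

# Example: observation generated by the a = 0.9 model (no noise), so the
# posterior should concentrate on that candidate.
prior = [1.0 / len(models)] * len(models)
post = bayes_update(prior, x_prev=1.0, u_prev=0.0, y=0.9)
u = adaptive_control(post, x=1.0)
```

In the example, the residual is zero for the a = 0.9 candidate, so its posterior weight is largest, and the mixed input is pulled toward that model's control.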