Abstract

This paper proposes a reinforcement learning-based adaptive consensus control method for non-affine nonlinear multiagent systems with semi-Markov switching topologies. The considered systems contain nonlinear dynamics, multiple-source disturbances, and non-affine inputs, which complicates the design of the consensus controller. First, to overcome the adverse influence of the multiple disturbances in the nonlinear system, a reinforcement learning-based adaptive disturbance observer is designed. Second, to handle the unknown nonlinear terms and the non-affine nature of the control inputs, an auxiliary loop is introduced and a reinforcement learning-based adaptive control law is designed. Finally, to address the semi-Markov switching topologies of the multiagent systems, a novel adaptive consensus controller is proposed that integrates the reinforcement learning approximator, the adaptive disturbance observer, and the consensus control law. Through the Lyapunov direct method, the tracking errors are proven to be uniformly ultimately bounded. Moreover, within an acceptable margin of error, the states of all the followers are proved to be consistent with that of the leader. Several numerical experiments illustrate the feasibility and effectiveness of the proposed reinforcement learning-based leader-following consensus control law.