This paper studies the robust mean-square consensus control problem for linear multiagent systems over randomly switching signed interaction topologies. The switching is governed by a time-homogeneous Markov chain with partly unknown transition rates. Sufficient conditions for mean-square consensus, expressed as linear matrix inequalities, are derived via distributed adaptive control based on parameter-dependent Lyapunov functions. The adaptive control protocols require only local neighbor information, and the protocol design algorithm reduces the influence of the communication topology on consensus, thereby mitigating undesirable interaction effects. Moreover, the disturbance rejection problem is addressed as an extension. Finally, two simulation examples illustrate the effectiveness of the proposed algorithms.