This paper studies continuous- and discrete-time consensus problems for multi-agent systems with linear time-invariant agent dynamics over randomly switching topologies. The switching is governed by a time-homogeneous Markov process, each state of which corresponds to a possible interaction topology among the agents. Necessary and sufficient conditions for achieving consensus under a common control protocol are derived for the continuous- and discrete-time cases, respectively. It is shown that the effect of the switching topologies on consensus is determined by the union of the topologies associated with the positive recurrent states of the Markov process. Moreover, the effect of random link failures on discrete-time consensus is investigated. The implications of the results and their relationships with existing work are discussed. Finally, the theoretical results are validated via simulations.
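
As a minimal illustration of the setting (not the paper's construction or protocol), the sketch below simulates single-integrator agents running a common discrete-time consensus update while the interaction topology switches according to a two-state, time-homogeneous Markov chain. The two graphs, their Laplacians, the transition matrix, and the step size are all hypothetical choices for this sketch; neither graph is connected on its own, but since both chain states are positive recurrent, the union of their topologies is connected, which is the situation in which the stated conditions predict consensus.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two hypothetical topologies over 4 agents, given by graph Laplacians.
# L1 has edges (1,2) and (3,4); L2 has only edge (2,3).
# Neither graph is connected alone, but their union is a connected path.
L1 = np.array([[ 1., -1.,  0.,  0.],
               [-1.,  1.,  0.,  0.],
               [ 0.,  0.,  1., -1.],
               [ 0.,  0., -1.,  1.]])
L2 = np.array([[ 0.,  0.,  0.,  0.],
               [ 0.,  1., -1.,  0.],
               [ 0., -1.,  1.,  0.],
               [ 0.,  0.,  0.,  0.]])
laplacians = [L1, L2]

# Irreducible transition matrix of the Markov chain over the two
# topologies, so both states are positive recurrent.
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

eps = 0.3                        # step size below 1/(max degree)
x = np.array([0., 1., 2., 3.])   # initial agent states
state = 0
for _ in range(400):
    x = x - eps * laplacians[state] @ x   # common consensus protocol
    state = rng.choice(2, p=P[state])     # Markovian topology switch

print(np.ptp(x))   # spread of agent states; near zero means consensus
```

With these (illustrative) parameters, `I - eps * L` is a stochastic matrix for each topology, so every step is a convex averaging of neighboring states, and the random alternation between the two graphs mixes information across all four agents.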