We consider dynamic systems that evolve on discrete time domains in which the time steps form a sequence of independent, identically distributed random variables. In particular, we classify the mean-square stability of linear systems on these time domains using quadratic Lyapunov functionals. When the system matrix is a function of the time step, our results agree with and generalize stability results from the Markov jump linear systems literature. When the system matrix is constant, our results generalize and illuminate results from the field of dynamic equations on time scales, extending them to the stochastic realm. To help identify the factors that contribute to stability, we prove a sufficient condition for the solvability of the Lyapunov equation by appealing to a fixed point theorem of Ran and Reurings. Finally, an example using observer-based feedback control demonstrates the utility of the results to control engineers who cannot guarantee uniform timing of the system.
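As a minimal illustration of the setting (a sketch, not taken from the paper): for a scalar Euler-discretized system x_{k+1} = (1 + h_k λ) x_k with i.i.d. steps h_k, independence of h_k and x_k gives E[x_{k+1}²] = E[(1 + h_k λ)²] · E[x_k²], so mean-square stability reduces to the growth factor E[(1 + hλ)²] being less than 1. The Monte Carlo check below assumes uniformly distributed time steps; the distribution and the parameter values are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
lam = -1.0       # system parameter in x' = lam * x (illustrative)
a, b = 0.1, 0.3  # time steps h_k ~ Uniform[a, b] (illustrative)

# Analytic mean-square growth factor E[(1 + h*lam)^2] for h ~ Uniform[a, b].
# The system is mean-square stable iff this factor is < 1.
mean_h = (a + b) / 2
var_h = (b - a) ** 2 / 12
rho = 1 + 2 * lam * mean_h + lam**2 * (var_h + mean_h**2)

# Monte Carlo: simulate many sample paths of x_{k+1} = (1 + h_k*lam) x_k
# and compare the empirical second moment with rho**n_steps.
n_paths, n_steps = 100_000, 50
x = np.ones(n_paths)
for _ in range(n_steps):
    h = rng.uniform(a, b, size=n_paths)
    x *= 1 + h * lam
second_moment = np.mean(x**2)

print(f"rho = {rho:.6f}")  # prints rho = 0.643333, so mean-square stable
print(f"E[x_50^2] ~= {second_moment:.3e}, rho^50 = {rho**50:.3e}")
```

Note that even though every realized step contracts the state here (1 + hλ ∈ [0.7, 0.9]), the same calculation with a wider step distribution can push E[(1 + hλ)²] above 1, which is exactly the kind of phenomenon the quadratic Lyapunov analysis captures.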