This work is concerned with discrete-time Markov chains having countable state spaces and two-time-scale structures. We examine two classes of Markov chains. In the first class, the state space of the chain is nearly decomposable into a finite number of subspaces, each of which has countably many states; in the second class, the state space is nearly decomposable into infinitely many subspaces, each of which has finitely many states. Singular perturbation and two-time-scale methods are used to reduce the computational complexity. Under appropriate conditions, we show that the first class of models is ‘equivalent’ to a continuous-time Markov chain with a finite state space, resulting in a substantial reduction of the computational burden; for the second class of models, a similar ‘equivalence’ is established. These results are obtained using asymptotic expansions of probability vectors and transition matrices, together with properties of the aggregated processes. Moreover, we prove that suitably scaled sequences of occupation measures converge weakly to switching diffusion processes. An application to queueing networks is also presented.
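To make the near-decomposability idea concrete, the following is a minimal numerical sketch (not taken from the paper, and deliberately finite in both directions): a four-state chain with transition matrix P(ε) = P + εQ, where P is block-diagonal with two irreducible 2-state blocks (fast within-group dynamics) and Q is a generator coupling the two groups (slow dynamics). The specific matrices, the coupling strength ε = 0.01, and the power-iteration routine are all illustrative choices, not quantities from the text. The stationary distribution, when aggregated over the groups, behaves like that of a small aggregated chain, and the conditional distribution within each group is close to the stationary distribution of that group's fast block.

```python
# Illustrative sketch of a nearly decomposable transition matrix
# P(eps) = P + eps*Q with two weakly coupled 2-state groups.
# All matrices here are hypothetical examples, not from the paper.

def step(pi, P):
    """One step of the chain: pi <- pi P (row-vector convention)."""
    n = len(P)
    return [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]

eps = 0.01  # small coupling parameter (assumed value)

# Block-diagonal fast part: states {0,1} form group A, {2,3} group B.
P = [
    [0.3, 0.7, 0.0, 0.0],
    [0.6, 0.4, 0.0, 0.0],
    [0.0, 0.0, 0.5, 0.5],
    [0.0, 0.0, 0.2, 0.8],
]

# Slow generator coupling the groups; each row sums to zero, so
# P(eps) below remains a stochastic matrix for small eps.
Q = [
    [-1, 0, 1, 0],
    [0, -1, 0, 1],
    [1, 0, -1, 0],
    [0, 1, 0, -1],
]

P_eps = [[P[i][j] + eps * Q[i][j] for j in range(4)] for i in range(4)]

# Power iteration toward the stationary distribution of P(eps).
pi = [0.25] * 4
for _ in range(20000):
    pi = step(pi, P_eps)

# Aggregate over the two groups: the slow, 2-state picture.
agg = [pi[0] + pi[1], pi[2] + pi[3]]

# Conditional distribution within group A; for small eps it is close
# to the stationary distribution (6/13, 7/13) of the fast block for A.
cond_A0 = pi[0] / agg[0]
print(agg, cond_A0)
```

With this symmetric coupling the exact flow balance between the groups forces the aggregate masses to be equal, while the within-group conditionals match the fast blocks' stationary distributions up to an O(ε) error, which is the mechanism the aggregation results above exploit.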