Estimation of covariance matrices from a finite sample set of data observations plays a central role in signal processing. The true data covariance is rarely known in practice, yet optimized algorithms inevitably depend on such statistics to achieve robust system performance, for example in environments plagued by dominant interference or challenged by complex propagation. Classical (finite) random matrix theory (RMT) facilitates assessment of the finite-sample effects surrounding covariance estimation; results that have proven very useful in this regard under a circular complex Gaussian data assumption are reviewed. Recent advances in RMT explore the limiting behavior of the eigenvalues and eigenvectors of random matrices as the dimensions grow large (referred to as infinite RMT). Defining the notion of an empirical distribution for the eigenvalues deviates from classical treatments, but yields remarkable convergence toward deterministic distributions. Coupled with the Stieltjes transform, powerful tools emerge that provide new insights, especially for signal processing methods intimately tied to the eigendecomposition, e.g., diagonal loading and dominant mode rejection used in adaptive beamforming. Although many of the theorems rest on asymptotic convergence as dimensionality increases, they describe the performance of finite systems quite well. Aspects of infinite RMT are also reviewed and contrasted with classical RMT.
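As an illustrative sketch (not part of the reviewed material), the convergence of the eigenvalue empirical distribution toward a deterministic limit can be seen numerically: for white circular complex Gaussian data, the eigenvalues of the sample covariance matrix concentrate on the support of the Marchenko-Pastur law determined by the aspect ratio c = N/L. The dimensions N and L below are arbitrary choices for the demonstration.

```python
import numpy as np

def marchenko_pastur_support(c):
    """Support edges of the Marchenko-Pastur law for aspect ratio c = N/L (c < 1),
    unit-variance entries."""
    return (1 - np.sqrt(c)) ** 2, (1 + np.sqrt(c)) ** 2

rng = np.random.default_rng(0)
N, L = 100, 400  # dimension N, sample count L; aspect ratio c = N/L = 0.25

# White circular complex Gaussian data, unit variance per entry
X = (rng.standard_normal((N, L)) + 1j * rng.standard_normal((N, L))) / np.sqrt(2)

# Sample covariance matrix (true covariance is the identity)
S = X @ X.conj().T / L
eigs = np.linalg.eigvalsh(S)

# The empirical eigenvalue distribution concentrates on [lam_minus, lam_plus],
# even though the true eigenvalues are all equal to 1 -- a finite-sample effect
# that infinite RMT characterizes exactly.
lam_minus, lam_plus = marchenko_pastur_support(N / L)
print(f"MP support: [{lam_minus:.3f}, {lam_plus:.3f}]")
print(f"empirical eigenvalue range: [{eigs.min():.3f}, {eigs.max():.3f}]")
```

Increasing N and L at fixed c tightens the match between the empirical eigenvalue histogram and the Marchenko-Pastur density, consistent with the observation that the asymptotic theorems describe finite systems well.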