The (k, a)-generalized wavelet transform is a recent addition to the class of wavelet transforms and has quickly become an important tool in time-frequency signal analysis. Since time-frequency analysis is both theoretically interesting and practically useful, in this article we investigate several topics in time-frequency analysis for the (k, a)-generalized wavelet transform. First, we analyze the concentration of this transform on sets of finite measure; in particular, we prove Donoho–Stark and Benedicks-type uncertainty principles. We then establish several versions of Heisenberg-type uncertainty principles for this transform. Furthermore, using reproducing kernel and spectral theories, we investigate time-frequency localization and study the scalogram associated with this wavelet transform. Finally, we prove Shapiro-type mean dispersion theorems.