Any measurement in condition monitoring applications is contaminated by disturbing noise. To date, most diagnostic procedures have assumed a Gaussian distribution for this noise. This paper offers a novel perspective on the problem of local damage detection. The acquired vector of observations is treated as an additive mixture of a signal of interest (SOI) and noise with strongly non-Gaussian, heavy-tailed properties that masks the SOI. The distributional properties of the background noise influence the choice of tools used for signal analysis, particularly for local damage detection; it is therefore essential to recognize and identify possible non-Gaussian behavior of the noise. The problem considered here is more general than classical goodness-of-fit testing. The paper highlights the important role of the variance, since most signal analysis methods assume that the underlying signal follows a finite-variance distribution. This assumption is crucial yet implicit in most indicators used in condition monitoring (the root-mean-square value, the power spectral density, the kurtosis, the spectral correlation, etc.), because infinite variance implies that all moments of order higher than two are also infinite. The problem is demonstrated for three popular classes of non-Gaussian distributions observed in real vibration signals. We show how the distributional properties of the noise in the time domain may change under transformation to the time–frequency domain (spectrogram). In addition, we propose a procedure for checking whether the background noise has infinite variance. Our investigations are illustrated with simulation studies and real vibration signals from various machines.
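As a purely illustrative sketch (not the procedure proposed in the paper), the following Python snippet, assuming only NumPy is available, contrasts the running sample variance of Gaussian noise with that of heavy-tailed Student's t noise with 1.5 degrees of freedom, whose theoretical variance is infinite. The Gaussian estimate stabilizes as the sample grows, while the heavy-tailed one keeps jumping, which is why variance-based indicators such as the RMS or the kurtosis become unreliable in the infinite-variance regime; the distributions and sample sizes here are chosen for illustration only.

```python
# Illustrative sketch only: running sample variance for finite- vs.
# infinite-variance noise. Parameters are chosen for the example and
# are not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

gaussian = rng.standard_normal(n)              # finite variance
heavy_tailed = rng.standard_t(df=1.5, size=n)  # Student's t, df < 2 -> infinite variance

def running_variance(x):
    """Sample variance of x[:k] for k = 1..len(x) (biased estimator)."""
    k = np.arange(1, len(x) + 1)
    mean = np.cumsum(x) / k
    mean_sq = np.cumsum(x ** 2) / k
    return mean_sq - mean ** 2

var_g = running_variance(gaussian)
var_t = running_variance(heavy_tailed)

# The Gaussian estimate settles near 1; the heavy-tailed one does not converge.
for k in (1_000, 10_000, 100_000, n):
    print(f"n={k:>7}: var(Gaussian)={var_g[k-1]:6.2f}  var(t, df=1.5)={var_t[k-1]:10.2f}")
```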