Modern GNSS (Global Navigation Satellite System) receivers can measure at sampling rates of up to 100 Hz and can deliver displacements with a precision of a few millimeters. This has opened up an interesting field of applications in disciplines that monitor dynamic movements, such as seismology and structural health monitoring. However, detecting the smallest possible displacements requires a thorough analysis of both systematic and stochastic errors. In view of this, this paper deals with the detection of small vibrations with amplitudes down to the sub-millimeter level, and with the error modelling necessary to achieve this. The data under investigation are time series of kinematic GNSS displacement measurements. We use parametric time series models of the ARIMA (Autoregressive Integrated Moving Average) type to characterize the GNSS errors and to pre-filter the data based on a calibrated model. With non-parametric spectral estimation methods, namely periodograms, we can then test for the presence of harmonic processes (sinusoids) in the data. Once a sinusoid is detected, its amplitude and phase are estimated by means of harmonic regression. Subsequently, the sinusoid can be removed and the procedure repeated. This algorithm is tested with data collected in a GNSS shake table experiment; we show that it is possible to detect and resolve vibrations down to a level of tens of micrometers from only a few minutes of data. We also highlight some theoretical considerations on the minimum detectable amplitudes of vibrations for measurements of different quality and record lengths.
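To make the detect-estimate-remove procedure concrete, the following is a minimal Python sketch of one plausible realization of the loop described above. It is not the paper's implementation: the function name `detect_vibrations`, the fixed ARMA(2,2) noise model, and the use of a Fisher g-test on the largest periodogram ordinate are all illustrative assumptions; the paper's calibrated ARIMA model and significance test may differ in order and form. The sketch uses `statsmodels` for the ARMA fit and `scipy.signal.periodogram` for the spectral estimate.

```python
import numpy as np
from scipy.signal import periodogram
from statsmodels.tsa.arima.model import ARIMA

def detect_vibrations(x, fs, arma_order=(2, 0, 2), alpha=0.01, max_iter=5):
    """Iteratively detect, estimate, and remove sinusoids in a displacement
    series: ARMA pre-whitening, periodogram peak test, harmonic regression.
    (Illustrative sketch; model orders and test are assumptions.)"""
    x = np.asarray(x, dtype=float)
    t = np.arange(x.size) / fs
    detected = []
    for _ in range(max_iter):
        # Pre-whiten with the calibrated ARMA noise model so the residual
        # periodogram is approximately flat when no sinusoid is present.
        resid = ARIMA(x, order=arma_order).fit().resid
        freqs, pxx = periodogram(resid, fs=fs)
        freqs, pxx = freqs[1:], pxx[1:]      # drop the zero-frequency ordinate
        # Fisher g-test on the largest ordinate (approximate critical value).
        g = pxx.max() / pxx.sum()
        m = pxx.size
        g_crit = 1.0 - (alpha / m) ** (1.0 / (m - 1))
        if g <= g_crit:
            break                            # no significant peak remains
        f0 = freqs[pxx.argmax()]
        # Harmonic regression at f0: least-squares fit of cosine/sine terms
        # (plus an intercept) to recover amplitude and phase.
        A = np.column_stack([np.cos(2 * np.pi * f0 * t),
                             np.sin(2 * np.pi * f0 * t),
                             np.ones_like(t)])
        coef, *_ = np.linalg.lstsq(A, x, rcond=None)
        a, b = coef[0], coef[1]
        detected.append((f0, np.hypot(a, b), np.arctan2(-b, a)))
        x = x - A @ coef                     # remove the fitted sinusoid, repeat
    return detected                          # list of (frequency, amplitude, phase)
```

The loop structure mirrors the abstract: pre-whitening makes the noise-only null hypothesis tractable for the periodogram test, and subtracting each detected sinusoid before re-testing allows weaker vibrations to surface in later iterations.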