Abstract Seismic hazard assessment in low-to-moderate seismicity regions can benefit from knowledge of surface deformation rates to better constrain earthquake recurrence models. This, however, amounts to assuming that the known seismicity rate, generally observed over historical times (i.e., up to a few centuries in Europe), provides a representative sample of the underlying long-term activity. We investigate how this limited sampling can affect the estimated seismic hazard, and whether it can explain the disagreement, sometimes observed in regions with limited activity, between the seismic moment loading rate inferred from present-day Global Navigation Satellite Systems (GNSS) measurements and the seismic moment release rate of past earthquakes. We approach this issue by simulating earthquake time series over very long timescales; the simulations account for temporal clustering and the known magnitude–frequency distribution in such regions, and are constrained so that the geodetic and seismicity-based estimates of the seismic moment rate balance over those timescales. Using southeastern Switzerland as a case study, we show that this sampling issue can indeed explain the disagreement, although other phenomena, including aseismic deformation and changes in strain rate due to erosional and/or glacial rebound, may also play a significant role in the mismatch.
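For illustration, the following is a minimal sketch of the kind of constrained simulation the abstract describes, not the study's actual code: magnitudes are drawn from a truncated Gutenberg–Richter law, occurrence times are clustered, and the long-term event rate is scaled so that the simulated moment release balances an assumed geodetic loading rate. All parameter values (b-value, magnitude bounds, target moment rate, cluster structure) are hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative (hypothetical) parameters -- not the study's actual inputs.
b = 1.0                      # Gutenberg-Richter b-value
m_min, m_max = 4.0, 7.0      # truncation magnitudes of the G-R law
target_moment_rate = 1e15    # assumed geodetic moment loading rate [N*m/yr]
duration = 1e5               # length of the synthetic catalogue [yr]

def sample_gr_magnitudes(n, b, m_min, m_max, rng):
    """Inverse-transform sampling of a doubly truncated Gutenberg-Richter law."""
    beta = b * np.log(10.0)
    c = 1.0 - np.exp(-beta * (m_max - m_min))
    u = rng.random(n)
    return m_min - np.log(1.0 - u * c) / beta

def moment(mw):
    """Seismic moment [N*m] from moment magnitude (Hanks & Kanamori, 1979)."""
    return 10.0 ** (1.5 * mw + 9.1)

# Fix the long-term event rate so that the expected moment release balances
# the assumed geodetic loading rate (Monte Carlo estimate of the mean
# moment per event under the truncated G-R law).
mean_moment = moment(sample_gr_magnitudes(100_000, b, m_min, m_max, rng)).mean()
event_rate = target_moment_rate / mean_moment          # events / yr

# Temporal clustering: Poissonian cluster centres, each followed by a short
# burst of events with exponentially decaying offsets (a crude stand-in for
# aftershock-style clustering).
events_per_cluster = 5
n_clusters = rng.poisson(event_rate * duration / events_per_cluster)
centers = rng.uniform(0.0, duration, n_clusters)
times = np.sort(np.concatenate(
    [c + rng.exponential(scale=2.0, size=events_per_cluster) for c in centers]))

mags = sample_gr_magnitudes(times.size, b, m_min, m_max, rng)

# Long-term balance check: simulated release rate vs. the geodetic target.
print(f"simulated moment release rate: {moment(mags).sum() / duration:.3e} N*m/yr")
print(f"assumed geodetic loading rate: {target_moment_rate:.3e} N*m/yr")
```

Under these assumptions, the balance holds only over the full catalogue: any short observation window (e.g., a few centuries) will typically miss the rare large events that dominate the moment budget, so the apparent release rate falls below the geodetic loading rate, which is the sampling effect the abstract refers to.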