The performance of 5G/6G cellular systems operating in millimeter wave (mmWave, 30–100 GHz) and sub-terahertz (sub-THz, 100–300 GHz) bands is conventionally assessed using static distributions of user locations. The rationale is that the beam tracking procedure keeps the beams of a base station (BS) and user equipment (UE) aligned at all times. However, with the introduction of 3GPP Reduced Capability (RedCap) UEs utilizing the Radio Resource Management (RRM) Relaxation procedure, this may no longer be the case, as UEs are allowed to skip synchronization signal blocks (SSB) to improve energy efficiency. Thus, characterizing the performance of such UEs requires methods that explicitly account for UE mobility. In this paper, we utilize the tools of stochastic geometry and random walk theory to derive the signal-to-noise ratio (SNR), spectral efficiency, and rate, accounting for mmWave/sub-THz specifics, including realistic directional antenna radiation patterns and the micro- and macro-mobility that cause dynamic antenna misalignment. In contrast to other studies in the field, which consider time-averaged performance measures, these metrics are obtained as explicit functions of time. Our numerical results illustrate that macro-mobility determines the overall trend of the time-dependent spectral efficiency, while local dynamics at 1–3 s timescales are mainly governed by micro-mobility. The difference between the spectral efficiency of perfectly synchronized UE and BS antennas and the time-dependent spectral efficiency of a completely desynchronized system is rather small for realistic cell coverages, staying within approximately 5–10% for a wide range of system parameters. These conclusions are insensitive to the antenna array utilized at the BS. However, accounting for realistic radiation patterns is critical for the time-dependent performance analysis of 5G/6G mmWave/sub-THz systems.
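
As a minimal sketch of how the time-dependent metrics relate to one another, consider a Shannon-type link model; the notation below (transmit power $P_T$, time-varying BS and UE antenna gains $G_B(t)$ and $G_U(t)$ under misalignment, path loss $L(d(t))$ along the macro-mobility trajectory $d(t)$, noise power spectral density $N_0$, bandwidth $W$) is illustrative and not necessarily that used in the paper:
$$
\mathrm{SNR}(t) = \frac{P_T\, G_B(t)\, G_U(t)}{N_0 W\, L\bigl(d(t)\bigr)}, \qquad
C(t) = \log_2\!\bigl(1 + \mathrm{SNR}(t)\bigr), \qquad
R(t) = W\, C(t).
$$
In this sketch, micro-mobility drives the gain processes $G_B(t)$ and $G_U(t)$ through random antenna misalignment, which explains the local dynamics at 1–3 s timescales, while macro-mobility drives the distance process $d(t)$ and hence the overall trend of $C(t)$.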