Abstract

Root Mean Square (RMS) is one of the most important signal parameters. RMS is currently measured by digital methods, which require analog-to-digital converters (ADCs) to obtain samples of the input signal. Real ADCs acquire samples at unequal time intervals, which results in an RMS measurement error. This error depends on many factors and cannot be reduced by procedures such as zero adjustment or calibration. The paper discusses the concepts of aperture time and aperture jitter and the reasons for their occurrence. The influence of these parameters on the RMS measurement error is analyzed. The aperture jitter is represented as a random function with a uniform or normal distribution. The influence of the input signal parameters (amplitude, frequency, initial phase) and the measuring system parameters (sampling frequency, measurement time) on the RMS measurement error is analyzed. Analytical dependences are obtained that allow the RMS measurement error caused by aperture jitter, modeled as a random function with a uniform or normal distribution, to be estimated. It is shown that the proposed approach brings the RMS error estimate at least 5 times closer to the simulation results than existing approaches.
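The effect described in the abstract can be illustrated with a minimal simulation: compute the RMS of a sine wave from samples whose nominal instants are perturbed by zero-mean Gaussian aperture jitter, and compare against the theoretical RMS. This is only a sketch under assumed parameter values (amplitude, frequency, sampling rate, jitter standard deviation are all illustrative choices, not values from the paper), not the paper's analytical model.

```python
import math
import random

def rms(samples):
    """Root mean square of a sequence of samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def simulate_rms_error(amp=1.0, freq=50.0, fs=10_000.0, n=10_000,
                       jitter_std=1e-6, seed=0):
    """Relative RMS error of a sampled sine wave under Gaussian aperture jitter.

    The ideal sample instants t_k = k / fs are perturbed by zero-mean
    normally distributed jitter with standard deviation jitter_std (seconds).
    All parameter values are illustrative assumptions.
    """
    rng = random.Random(seed)
    jittered = [amp * math.sin(2 * math.pi * freq *
                               (k / fs + rng.gauss(0.0, jitter_std)))
                for k in range(n)]
    true_rms = amp / math.sqrt(2)  # exact RMS of a sine wave of amplitude amp
    return (rms(jittered) - true_rms) / true_rms

relative_error = simulate_rms_error()
```

Averaging such runs over many seeds (and swapping `rng.gauss` for `rng.uniform` to model uniformly distributed jitter) gives a Monte Carlo estimate of the kind of error the paper bounds analytically.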
