Abstract

This paper presents an analytical method for the performance evaluation of correlation-based impulse radio (IR) receivers, which detect data by synchronizing the received signal with a locally generated template pulse. A phase interpolation technique is employed to maintain synchronization accuracy at low power consumption. The analytical method computes the spectral density of the jitter at the phase interpolator output arising from both flicker and thermal noise components. The analysis accounts for the aliasing caused by sampling of the noise so that the induced jitter is predicted accurately. The bit error rate (BER) of the receiver is also estimated statistically in terms of the jitter induced on the template pulse, the noise contribution of the receiver chain, and the template signal level. The proposed estimation procedure is validated by comparing its results with transient noise simulations. It is shown that, at a given power consumption, increasing the template signal range in the correlator and improving the linearity of the phase interpolation transfer curve enhance the receiver sensitivity by 2 and 3.5 dB, respectively. The BER estimation results also reveal that the effect of template signal jitter on the sensitivity of correlation-based IR receivers becomes dominant as the input signal level increases.
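
To illustrate the estimation flow summarized above, the following Python sketch folds aliased flicker and thermal phase-noise components into an RMS template jitter and then estimates the BER by Monte Carlo simulation. All parameter values, the Gaussian-pulse correlation-loss model, and the helper alias_fold are illustrative assumptions for demonstration only, not the paper's actual models or numbers.

```python
import numpy as np

# --- Illustrative parameters (assumed, not taken from the paper) ---
fs = 500e6                               # phase-interpolator update (sampling) rate [Hz]
f = np.linspace(1e3, fs / 2, 200_000)    # one-sided frequency grid [Hz]

# Phase-noise PSDs at the interpolator output [rad^2/Hz] (assumed levels)
S_thermal = 1e-16 * np.ones_like(f)      # white (thermal) component
S_flicker = 1e-10 / f                    # 1/f (flicker) component
S_phase = S_thermal + S_flicker

def alias_fold(S, f, fs, n_images=20):
    """Fold the images at k*fs +/- f back into [0, fs/2].
    Components beyond the grid are approximated by the thermal floor S[-1]."""
    S_folded = S.copy()
    for k in range(1, n_images + 1):
        S_folded += np.interp(np.abs(k * fs - f), f, S, left=S[0], right=S[-1])
        S_folded += np.interp(k * fs + f, f, S, left=S[0], right=S[-1])
    return S_folded

S_aliased = alias_fold(S_phase, f, fs)

# Integrate the folded PSD to get RMS phase, then RMS timing jitter
f0 = 4e9                                 # assumed pulse center frequency [Hz]
phase_rms = np.sqrt(np.trapz(S_aliased, f))
jitter_rms = phase_rms / (2 * np.pi * f0)   # [s]

# --- Monte Carlo BER of a correlation receiver with template jitter ---
rng = np.random.default_rng(0)
n_bits = 200_000
tau = 2e-9                               # assumed pulse-width parameter [s]
snr_db = 12.0
A = 10 ** (snr_db / 20)                  # signal amplitude relative to unit noise

# Jitter shrinks the correlator output via an assumed Gaussian autocorrelation
dt = rng.normal(0.0, jitter_rms, n_bits)
corr_gain = np.exp(-(dt / tau) ** 2)

bits = rng.integers(0, 2, n_bits)
noise = rng.normal(0.0, 1.0, n_bits)     # lumped receiver-chain noise
decision = A * (2 * bits - 1) * corr_gain + noise
ber = np.mean((decision > 0).astype(int) != bits)

print(f"RMS jitter ~ {jitter_rms * 1e12:.2f} ps, estimated BER ~ {ber:.2e}")
```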
