Abstract

The time-based ADC is an essential block in software radio receivers because it exhibits higher speed and lower power than the conventional ADC, especially in scaled CMOS technologies. In time-based ADCs, the input voltage is first converted to a pulse delay time by a voltage-to-time converter (VTC) circuit, and the pulse delay time is then converted to a digital word by a time-to-digital converter (TDC) circuit. In this paper, an analytical model of the timing jitter and skew due to noise and process variations, respectively, is proposed for the VTC circuit. The derived model is verified against Monte Carlo simulations and Eldo transient noise simulations using an industrial 65-nm CMOS technology. This paper provides new design insights, such as the impact of timing jitter/skew on the ADC resolution and the maximum input voltage frequency. It also shows how the timing jitter/skew can be reduced through circuit design knobs such as the supply voltage and the load capacitance. Circuit designers can therefore use these results to design time-based ADC circuits under power and performance constraints early in the design cycle. These results are particularly important for timing jitter/skew-tolerant designs in sub-micron technologies, especially for low-power operation.
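
As an illustration of the two-stage conversion chain summarized above, the following is a minimal behavioral sketch in Python, assuming a linear VTC transfer with additive Gaussian timing jitter and a uniform TDC time step. All names and parameter values (gain_s_per_v, t_offset, jitter_sigma, t_lsb, n_bits) are hypothetical placeholders for illustration and are not taken from the paper.

```python
# Behavioral sketch of a time-based ADC: VTC (voltage -> delay) followed by
# TDC (delay -> digital code). Parameter values are illustrative only.
import numpy as np

def vtc(v_in, gain_s_per_v=1e-9, t_offset=5e-9, jitter_sigma=2e-12, rng=None):
    """Voltage-to-time converter: maps the input voltage to a pulse delay,
    with additive Gaussian timing jitter modeling circuit noise."""
    rng = rng or np.random.default_rng()
    delay = t_offset + gain_s_per_v * v_in          # ideal linear V-to-T transfer
    return delay + rng.normal(0.0, jitter_sigma)    # noise-induced timing jitter

def tdc(delay, t_lsb=50e-12, n_bits=8):
    """Time-to-digital converter: quantizes the pulse delay into an n-bit code."""
    code = int(round(delay / t_lsb))
    return max(0, min(code, 2**n_bits - 1))         # clamp to the valid code range

# Example: digitize a single 0.3 V sample
rng = np.random.default_rng(0)
print(tdc(vtc(0.3, rng=rng)))
```

In this sketch, the jitter_sigma term plays the role of the noise-induced timing jitter the paper models analytically; increasing it degrades the effective resolution of the quantized code, mirroring the jitter/resolution trade-off discussed in the abstract.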
