Abstract

In this article, we present a 60 GS/s two-stage 8 × 8 time-interleaved sampling circuit in which the second-stage nonlinearity can be controlled through a bias voltage that optimizes the static distortion of the sampler. A calibration algorithm can extract the nonlinear contributions of the stages and compensate for them by setting the optimal bias voltage; the same mechanism can also be used to cancel front-end nonlinear effects. The sampler was implemented in a TSMC 5 nm FinFET process, and a calibration system for a Pulse Amplitude Modulation transceiver that detects and minimizes the nonlinearities is presented. The optimum bias voltage of the sampler was obtained by co-simulating the circuit with the linearity calibration loop implemented in Verilog-A. The histogram of the sampled signal at the slicer input is reported before and after calibration to demonstrate the improvement in the sampled eye opening. Moreover, the resulting bias equals the one that maximizes the total harmonic distortion figure in transient simulations with a 1 GHz input signal, yielding at least 48.5 dB of total harmonic distortion across different PVT conditions.
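To make the described calibration concrete, the sketch below illustrates one plausible form of such a loop: sweep the second-stage bias voltage, measure a total-harmonic-distortion figure (in dB, larger is better) on the sampled output for a single-tone input, and keep the bias that maximizes it. This is only an illustrative assumption, not the paper's algorithm; the actual loop was implemented in Verilog-A, and the names `simulate`, `calibrate_bias`, and `thd_db`, as well as the bias range and step count, are hypothetical.

```python
import numpy as np

def thd_db(samples, fs, f_in, n_harmonics=5):
    """Estimate a signal-to-harmonic-distortion ratio in dB from a sampled tone.

    samples : sampled output of the circuit under a single-tone input
    fs      : sampling rate in Hz; f_in : input tone frequency in Hz
    """
    n = len(samples)
    spec = np.abs(np.fft.rfft(samples * np.hanning(n)))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    bin_of = lambda f: int(np.argmin(np.abs(freqs - f)))
    fund = spec[bin_of(f_in)]
    harm = np.sqrt(sum(spec[bin_of(k * f_in)] ** 2
                       for k in range(2, n_harmonics + 1)))
    return 20.0 * np.log10(fund / harm)

def calibrate_bias(simulate, v_range=(0.3, 0.9), steps=25,
                   fs=60e9, f_in=1e9):
    """Sweep the second-stage bias voltage and keep the value that
    maximizes the measured distortion figure (in dB)."""
    best_v, best_thd = None, -np.inf
    for v_bias in np.linspace(v_range[0], v_range[1], steps):
        samples = simulate(v_bias)   # hypothetical transient-simulation hook
        metric = thd_db(samples, fs, f_in)
        if metric > best_thd:
            best_v, best_thd = v_bias, metric
    return best_v, best_thd
```

Here `simulate(v_bias)` stands in for whatever returns the sampled output at a given bias (a transient co-simulation or on-chip measurement); a gradient-based or successive-approximation search could replace the exhaustive sweep under the same interface.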
