Abstract

The time-to-digital converter (TDC) is the primary instrument for measuring time intervals. A common solution for FPGA-based TDCs is to construct a tapped delay line (TDL) for time interpolation, producing sub-clock time resolution. The jitter, quantization, and the granularity and uniformity of the delay cells in the TDL determine the achievable TDC time resolution and linearity. To achieve higher linearity, this paper proposes a TDL and its encoding method for a compact TDC on the UltraScale FPGA. A dual-sampling method with the TDL is adopted, which improves precision and efficiency by directly encoding the state of the delay line and, in combination with the wave union technique, allows further subdivision of the delay unit. The bin widths of the TDL obtained with our method are measured using the code density calibration method, from which the integral nonlinearity (INL) and differential nonlinearity (DNL) are derived. Based on the proposed chain structure and encoding method, the TDL and encoding layout have been optimized for better compatibility with the UltraScale FPGA and can also be extended to multi-channel TDCs.
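
The code density calibration mentioned above can be illustrated with a minimal sketch (not the authors' implementation): hits uncorrelated with the sampling clock populate a histogram of TDC codes, each bin's width is taken proportional to its share of the counts over one clock period, and DNL/INL follow from the deviation of each bin from the ideal width. The clock period, bin count, and function names used here are hypothetical placeholders.

```python
import numpy as np

def code_density_calibration(codes, n_bins, clock_period_ps=2000.0):
    """Estimate TDL bin widths, DNL, and INL from raw TDC codes.

    codes           : array of measured TDC codes (0 .. n_bins-1)
    n_bins          : number of delay-line bins spanning one clock period
    clock_period_ps : interpolation window (one clock period), in ps (assumed)
    """
    hist, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins))
    total = hist.sum()

    # With uniformly distributed hits, each bin's width is proportional
    # to its fraction of the total counts across one clock period.
    bin_widths = hist / total * clock_period_ps   # measured bin widths, ps
    lsb = clock_period_ps / n_bins                # ideal (average) bin width, ps

    dnl = bin_widths / lsb - 1.0                  # differential nonlinearity, LSB
    inl = np.cumsum(dnl)                          # integral nonlinearity, LSB
    return bin_widths, dnl, inl

# Usage with synthetic data: random hits over a hypothetical 512-bin delay line.
rng = np.random.default_rng(0)
codes = rng.integers(0, 512, size=1_000_000)
widths, dnl, inl = code_density_calibration(codes, n_bins=512)
print(f"mean bin width: {widths.mean():.2f} ps, "
      f"peak |DNL|: {np.abs(dnl).max():.3f} LSB, "
      f"peak |INL|: {np.abs(inl).max():.3f} LSB")
```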
