Abstract

Microcontrollers with embedded timers can directly measure resistive and capacitive sensors by determining the charging or discharging time of an RC circuit that includes the sensor. This time-to-digital conversion is affected by timer quantization and trigger noise, which limit the resolution to an effective number of bits (ENOB). This paper analyses the standard uncertainty and the ENOB of that time-to-digital conversion. When interfacing resistive sensors, if the capacitor C is small, quantization effects predominate over trigger noise effects and the ENOB increases with increasing C. For capacitor values above a given threshold, however, trigger noise effects predominate and the ENOB remains constant regardless of C. Therefore, an optimal time constant yields the best speed–ENOB trade-off. This type of sensor interface was implemented using an AVR microcontroller with an embedded 16-bit timer, connected to a resistor simulating a Pt1000-type temperature sensor. The experimental results agree with the theoretical predictions. When the time was determined from a single observation, the optimal time constant was about 2–3 ms and the ENOB was about 11.5 b, which corresponds to a resolution of 0.22 Ω. By averaging ten observations, the ENOB improved to 13.5 b (0.05 Ω).
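
As an illustration of the measurement principle described above, the following is a minimal sketch of how the discharging time of the RC circuit could be captured with the 16-bit Timer1 input-capture unit of an AVR device. The specific part (ATmega328P at 16 MHz), the pin assignments (PB1 as charge pin, ICP1/PB0 as sense pin) and the wiring of the sensor resistance Rx are assumptions made for illustration only; they are not taken from the paper.

#define F_CPU 16000000UL         /* assumed clock: 16 MHz             */
#include <avr/io.h>
#include <util/delay.h>
#include <stdint.h>

/* Assumed wiring (illustrative only): PB1 charges the capacitor C,
 * whose voltage is also seen by the input-capture pin ICP1 (PB0);
 * the sensor resistance Rx discharges C towards ground.             */

static void charge_capacitor(void)
{
    DDRB  |= _BV(PB1);           /* charge pin as output              */
    PORTB |= _BV(PB1);           /* drive high to charge C            */
    _delay_ms(20);               /* wait well beyond 5*tau            */
}

static uint16_t measure_discharge_ticks(void)
{
    PORTB &= ~_BV(PB1);          /* stop driving the node             */
    DDRB  &= ~_BV(PB1);          /* release pin: C now discharges     */
                                 /* through Rx only                   */
    TCCR1A = 0;                  /* Timer1 in normal mode             */
    TCNT1  = 0;                  /* count from zero                   */
    TIFR1  = _BV(ICF1);          /* clear any stale capture flag      */
    TCCR1B = _BV(CS10);          /* clk/1, capture on falling edge    */

    while (!(TIFR1 & _BV(ICF1))) /* wait for the threshold crossing   */
        ;
    TCCR1B = 0;                  /* stop the timer                    */
    return ICR1;                 /* discharge time in timer ticks     */
}

int main(void)
{
    uint32_t sum = 0;
    for (uint8_t i = 0; i < 10; i++) {   /* average ten observations  */
        charge_capacitor();
        sum += measure_discharge_ticks();
    }
    uint16_t avg_ticks = (uint16_t)(sum / 10);
    (void)avg_ticks;             /* Rx would be derived from avg_ticks */
    for (;;)
        ;
}

Polling the capture flag is used here only for simplicity; the input-capture hardware latches the timer count at the moment of the threshold crossing, so the tick count does not depend on software latency, which is the main reason for using the capture unit rather than reading TCNT1 in software.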
