Abstract

This paper analyzes the digital pre-distortion (DPD) linearization technique when a low-precision analog-to-digital converter (ADC) is used in the feedback path. Owing to its non-linear behavior, the output of a power amplifier (PA) exhibits spurious emissions, spectral regrowth, and intermodulation distortion (IMD) products, so linearization is mandatory to preserve PA performance. DPD trains on the distorted PA output and generates the inverse of the PA characteristic; cascading the pre-distorter with the PA yields an approximately linear response. In a practical system, the PA output must pass through an ADC before digital processing, and a low-resolution ADC degrades the feedback signal and, with it, DPD performance. On the other hand, a low-resolution ADC not only reduces computational complexity in the digital processing but also lowers power consumption and cost, since less hardware is required. The aim of this work is to determine how far the ADC resolution can be reduced without significantly degrading DPD performance. The paper evaluates two DPD systems, full-band DPD and sub-band DPD. Simulations show that for full-band DPD a 1-bit ADC can be used reliably, while for sub-band DPD a 1-bit to 4-bit ADC can be used.
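The training loop the abstract describes (quantized PA feedback, inverse-characteristic estimation, cascade with the PA) can be sketched in a few lines. The snippet below is a minimal illustrative simulation, not the paper's actual system: it assumes a hypothetical memoryless cubic PA model and a simple odd-order polynomial pre-distorter fitted by least squares on the output of a uniform b-bit feedback ADC (an indirect-learning-style arrangement). All coefficients and signal levels are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def pa(x):
    # Hypothetical memoryless PA: linear gain plus a weak cubic
    # (AM/AM) term. Stand-in for the paper's amplifier model.
    return x + 0.2 * x * np.abs(x) ** 2

def adc(x, bits, full_scale=1.5):
    # Uniform mid-rise quantizer modeling a low-resolution feedback ADC.
    levels = 2 ** bits
    step = 2.0 * full_scale / levels
    idx = np.clip(np.floor(x / step) + 0.5,
                  -(levels / 2 - 0.5), levels / 2 - 0.5)
    return idx * step

# Complex-baseband training signal (illustrative Gaussian excitation).
x = (rng.standard_normal(4096) + 1j * rng.standard_normal(4096)) * 0.5 / np.sqrt(2)
y = pa(x)

# Feedback path: quantize I and Q of the PA output with a b-bit ADC.
bits = 4
y_q = adc(y.real, bits) + 1j * adc(y.imag, bits)

# Fit an odd-order polynomial post-inverse on the quantized output,
# then reuse it as the pre-distorter (indirect learning idea).
orders = (1, 3, 5)
Phi = np.column_stack([y_q * np.abs(y_q) ** (k - 1) for k in orders])
coef, *_ = np.linalg.lstsq(Phi, x, rcond=None)

def dpd(u):
    return sum(c * u * np.abs(u) ** (k - 1) for c, k in zip(coef, orders))

# Compare linearization error with and without the pre-distorter.
err_before = np.linalg.norm(pa(x) - x) / np.linalg.norm(x)
err_after = np.linalg.norm(pa(dpd(x)) - x) / np.linalg.norm(x)
```

Lowering `bits` toward 1 raises the quantization noise seen by the least-squares fit, which is exactly the trade-off the paper studies: how coarse the feedback ADC can be before the fitted pre-distorter stops improving the cascade.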

