Abstract

Designing a non-ideal delay line (DL) with phase distortion for a transmitted-reference ultra-wideband system with an autocorrelation receiver is technically challenging. In contrast to the current empirical DL design methods, a semi-analytic approach is proposed that applies a Gaussian approximation to the conditional bit error rate (BER) expression, based on an investigation of the average-BER degradation caused by the group delay ripple range (GDRR) over independent Nakagami-m fading channels. This GDRR-based design method directly evaluates the effect of the GDRR on system performance and determines the acceptable phase-distortion level, trading off BER performance against system complexity.
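As a rough illustration of the semi-analytic idea (a Gaussian-approximated conditional BER averaged over independent Nakagami-m fading), the sketch below runs a Monte Carlo average in Python. The conditional-BER form Q(sqrt(gamma)), the function names, and the unit-mean-power channel model are assumptions for illustration only; the paper's actual expression additionally depends on the GDRR of the delay line.

```python
import numpy as np
from scipy.special import erfc

def q_func(x):
    # Gaussian Q-function, used by the Gaussian approximation of the conditional BER
    return 0.5 * erfc(x / np.sqrt(2.0))

def avg_ber_nakagami(snr_db, m=2.0, n_samples=200_000, seed=0):
    """Monte Carlo average of a Gaussian-approximated conditional BER over
    independent Nakagami-m fading. The channel power |h|^2 is drawn as
    Gamma(m, 1/m), i.e. unit mean power. The placeholder conditional BER
    Q(sqrt(gamma)) stands in for the paper's GDRR-dependent expression."""
    rng = np.random.default_rng(seed)
    snr_lin = 10.0 ** (snr_db / 10.0)
    power = rng.gamma(shape=m, scale=1.0 / m, size=n_samples)  # |h|^2 samples
    gamma = snr_lin * power                                    # instantaneous SNR
    return q_func(np.sqrt(gamma)).mean()

if __name__ == "__main__":
    for snr in (5, 10, 15):
        print(f"SNR = {snr:2d} dB, m = 2.0: avg BER ~ {avg_ber_nakagami(snr):.3e}")
```

In the GDRR-based design loop, the placeholder conditional BER would be replaced by the derived expression parameterized by the phase-distortion level, and the averaged BER would then be compared against the target to decide whether a given GDRR is acceptable.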
