Abstract

It is a common belief that a noise-free frequency divider by D scales down the input phase by a factor of 1/D, and thus the phase-noise power spectral density (PSD) by 1/D². We prove that this behavior does not apply to digital dividers. Instead, the digital divider scales the white phase-noise PSD down by 1/D; phase downsampling and aliasing, inherent in digital frequency division, are the reason. However, the 1/D² law holds asymptotically for flicker noise, where the aliases can be neglected. We propose a new de-aliased divider, which scales the input phase-noise PSD by approximately 1/D². The scheme is surprisingly simple and well suited to CPLD and FPGA implementation.
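A compact way to see the 1/D result for white noise is sketched below. The notation (sample variance σ², input sample rate ν_in) is our own illustration and is not taken from the paper's full text.

```latex
% Sketch of the white-noise scaling argument (assumed notation, not the paper's).
% The input phase samples at rate $\nu_{\mathrm{in}}$ are white with variance
% $\sigma^2$, so their two-sided PSD is flat:
\[
  S_{\mathrm{in}}(f) = \frac{\sigma^2}{\nu_{\mathrm{in}}},
  \qquad |f| < \frac{\nu_{\mathrm{in}}}{2}.
\]
% A digital divider by $D$ produces one output edge per $D$ input cycles: it
% downsamples the phase by $D$ and scales it by $1/D$,
% $\varphi_{\mathrm{out}}[m] = \varphi_{\mathrm{in}}[mD]/D$.
% Downsampling leaves white samples white with variance $\sigma^2/D^2$ after the
% $1/D$ scaling, but folds that power into a Nyquist band $D$ times narrower
% ($\nu_{\mathrm{out}} = \nu_{\mathrm{in}}/D$), so
\[
  S_{\mathrm{out}}(f)
  = \frac{\sigma^2/D^2}{\nu_{\mathrm{in}}/D}
  = \frac{1}{D}\,\frac{\sigma^2}{\nu_{\mathrm{in}}}
  = \frac{S_{\mathrm{in}}(f)}{D}.
\]
% For flicker noise, $S_{\mathrm{in}}(f)\propto 1/f$ is concentrated near $f=0$,
% the aliased replicas at $f + k\,\nu_{\mathrm{out}}$ contribute little, and the
% familiar $1/D^2$ law is recovered asymptotically.
```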
