Abstract

For transmission of digital data over a linear channel with additive white noise, it can be shown that the optimal symbol-decision equalizer is nonlinear. The Kernel Adaline algorithm, a nonlinear generalization of Widrow and Hoff's (1960) Adaline, is capable of learning arbitrary nonlinear decision boundaries while retaining the desirable convergence properties of the linear Adaline. This work investigates the use of the Kernel Adaline as an equalizer for such transmission channels. We show that the performance of the Kernel Adaline approaches that of the optimal symbol-decision equalizer given by Bayes theory and, further, that it still produces useful results when the additive noise is nonwhite. A description and preliminary results of an adaptive version of the Kernel Adaline are also presented.
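To make the setting concrete, the sketch below shows one way a kernel Adaline equalizer can be set up: the decision function is a kernel expansion f(x) = sum_i alpha_i k(x_i, x) over training tap vectors, trained with Widrow-Hoff style error-correction updates and hard-limited to produce symbol decisions. This is a minimal illustration only; the Gaussian kernel, the FIR channel taps [0.5, 1.0, 0.5], the BPSK source, the decision delay, and all hyperparameters are assumptions for the example, not the formulation used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and the rows of Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# --- Simulated linear channel with additive white Gaussian noise ---
N = 500                                        # number of transmitted BPSK symbols
order, delay = 3, 1                            # tap-vector length and decision delay (assumed)
symbols = rng.choice([-1.0, 1.0], size=N)
channel = np.array([0.5, 1.0, 0.5])            # example FIR channel impulse response
received = np.convolve(symbols, channel)[:N] + 0.2 * rng.standard_normal(N)

# Tap vectors x_n = [r_n, r_{n-1}, r_{n-2}] and desired symbols d_n = s_{n-delay}
idx = np.arange(order - 1, N)
X = np.array([received[n - order + 1:n + 1][::-1] for n in idx])
d = symbols[idx - delay]

# --- Kernel Adaline sketch: f(x) = sum_i alpha_i k(x_i, x), LMS-style updates ---
gamma, eta, epochs = 0.5, 0.1, 10
K = rbf_kernel(X, X, gamma)                    # Gram matrix over training tap vectors
alpha = np.zeros(len(X))

for _ in range(epochs):
    for n in range(len(X)):
        err = d[n] - K[n] @ alpha              # Widrow-Hoff style error for sample n
        alpha[n] += eta * err                  # update the coefficient of k(x_n, .)

# --- Symbol decisions: hard-limit the kernel expansion output ---
decisions = np.sign(K @ alpha)
print("training symbol error rate:", np.mean(decisions != d))
```

Because the output is a kernel expansion rather than a linear combination of the tap values, the resulting decision boundary in tap-vector space can be nonlinear, which is what allows such an equalizer to approach the Bayesian symbol-decision boundary that a linear equalizer cannot represent.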
