Abstract

We consider the discrete-time intersymbol interference (ISI) channel model, with additive Gaussian noise and fixed independent, identically distributed (i.i.d.) inputs. In this setting, we investigate the expression put forth by Shamai and Laroia as a conjectured lower bound for the input–output mutual information after application of a minimum mean-square error decision-feedback equalizer receiver. A low signal-to-noise ratio (SNR) expansion is used to prove that the conjectured bound does not hold under general conditions, and to characterize inputs for which it is particularly ill-suited. One such input is used to construct a counterexample, indicating that the Shamai–Laroia expression does not always bound even the achievable rate of the channel, thus excluding a natural relaxation of the original conjectured bound. However, this relaxed bound is then shown to hold for any finite-entropy input and ISI channel, when the SNR is sufficiently high. We derive two conditions under which the relaxed bound holds, involving compound channel capacity and quasiconvexity arguments. Finally, new simple bounds for the achievable rate are proven, and compared with other known bounds. Information-estimation relations and estimation-theoretic bounds play a key role in establishing our results.
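The channel model described above can be illustrated with a minimal simulation sketch. The tap values, block length, and SNR below are illustrative assumptions, not parameters from the paper; the sketch only shows the form of the model, y_n = Σ_k h_k x_{n−k} + z_n, with i.i.d. unit-power inputs and white Gaussian noise scaled to a target SNR.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example channel taps and parameters (assumed for illustration).
h = np.array([0.8, 0.5, 0.3])   # ISI channel impulse response
n = 10_000                      # number of symbols
snr_db = 0.0                    # low-SNR regime discussed in the abstract

# i.i.d. unit-power BPSK inputs (one example of a fixed i.i.d. input).
x = rng.choice([-1.0, 1.0], size=n)

# Scale the additive white Gaussian noise so that signal power / noise power
# matches the target SNR (signal power at the output is sum of |h_k|^2).
snr = 10.0 ** (snr_db / 10.0)
noise_var = np.sum(h ** 2) / snr
z = rng.normal(scale=np.sqrt(noise_var), size=n)

# Channel output: convolution of the input with the taps, plus noise.
y = np.convolve(x, h, mode="same") + z

print(y.shape)
```

An equalizer study (such as the MMSE-DFE receiver the abstract analyzes) would operate on `y`; that receiver itself is not sketched here.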
