Abstract

This paper explores bounds on the minimum number of channel probes required to learn enough about a discrete memoryless channel to establish reliable communication at a given rate below channel capacity. Given a set of discrete channel input-output sample pairs (where, for each discrete input value, the associated aggregate set of discrete output observations is multinomially distributed), we leverage a non-asymptotic probably approximately correct (PAC) bound on the mutual information (channel capacity) between the label (discrete) random variable (RV) and the observation RV to establish a convergence rate for the worst-case channel. Previous bounds (such as those based on Sanov's theorem) provide high-probability (i.e., PAC) bounds on the true mutual information that converge at rate $O(\log(N)/\sqrt{N})$, where $N$ is the number of independent and identically distributed (i.i.d.) samples used to compute the empirical probability mass functions. Using an improved PAC sublevel-set bound, we sharpen the rate of convergence to $O(\sqrt{\log(\log(N))\log(N)}/N)$.
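
As context for the quantity whose convergence the abstract discusses, the following is a minimal Python sketch (not from the paper) of the plug-in mutual-information estimator: empirical joint and marginal probability mass functions are formed from $N$ i.i.d. input-output sample pairs, and the empirical mutual information is computed from them. The function name, the binary symmetric channel used for the demonstration, and the crossover probability of 0.1 are all illustrative assumptions.

```python
import numpy as np

def empirical_mutual_information(x, y):
    """Plug-in estimate of I(X; Y) in nats from N i.i.d. sample pairs.

    x, y: 1-D integer arrays of equal length N holding the discrete
    channel inputs (x) and the corresponding observed outputs (y).
    """
    x, y = np.asarray(x), np.asarray(y)
    N = len(x)
    # Empirical joint PMF over the observed input/output alphabets.
    joint = np.zeros((x.max() + 1, y.max() + 1))
    for xi, yi in zip(x, y):
        joint[xi, yi] += 1.0
    joint /= N
    px = joint.sum(axis=1, keepdims=True)  # empirical input marginal
    py = joint.sum(axis=0, keepdims=True)  # empirical output marginal
    mask = joint > 0                       # skip zero-probability cells
    return float(np.sum(joint[mask] * np.log(joint[mask] / (px @ py)[mask])))

# Example: probe a binary symmetric channel N times and estimate I(X; Y).
rng = np.random.default_rng(0)
N = 10_000
x = rng.integers(0, 2, size=N)             # uniform binary inputs
flip = rng.random(N) < 0.1                 # crossover probability 0.1
y = np.where(flip, 1 - x, x)
print(empirical_mutual_information(x, y))  # ~ ln 2 - H_b(0.1) ≈ 0.368 nats
```

For this channel the estimate concentrates around the true mutual information as $N$ grows; the PAC bounds in the abstract quantify, with high probability, how fast that gap shrinks in the worst case.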
