Abstract

This paper establishes the existence of an optimal amount of training, in terms of achievable mutual information rate, for an output-feedback implicit estimator of finite-state Markov communication channels. Implicit (blind) estimation relies on a measure of how much the input distribution is modified when filtered through the channel transfer function, and it is shown that an input distribution with maximum entropy rate undergoes no such modification. Reducing the input signal entropy rate therefore enables implicit (blind) estimation of the channel process, but it also decreases the information transmission rate. The optimal input entropy rate (the optimal implicit training rate), which achieves the maximum mutual information rate, is found.
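The tradeoff described in the last two sentences can be stated schematically as a one-dimensional optimization over the input entropy rate. The notation below (entropy rate $\bar{H}$, estimate-conditioned mutual information rate $\bar{I}$, channel estimate $\hat{\theta}$) is ours rather than the paper's, and is offered only as an illustrative sketch of the stated result:

\[
\bar{H}^{*} \;=\; \arg\max_{\bar{H} \,\le\, \bar{H}_{\max}} \; \bar{I}\!\left(X;Y \,\middle|\, \hat{\theta}(\bar{H})\right),
\]

where $\hat{\theta}(\bar{H})$ denotes the implicit channel estimate attainable when the input entropy rate is $\bar{H}$. At $\bar{H} = \bar{H}_{\max}$ the input distribution is unmodified by the channel filtering, so blind estimation is not possible; lowering $\bar{H}$ improves the estimate at the cost of transmission rate, and the optimum balances the two effects.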
