Abstract

The channel output entropy property introduced by A.D. Wyner and J. Ziv (ibid., vol. IT-19, pp. 769-772, Nov. 1973) for the binary symmetric channel is extended to arbitrary memoryless symmetric channels with binary inputs and discrete or continuous outputs. This yields lower bounds on the achievable information rates of these channels under constrained binary inputs. Using the interpretation of entropy as a measure of order and randomness, the authors deduce that output sequences of memoryless symmetric channels induced by binary inputs exhibit a higher degree of randomness when the redundancy of the input binary sequence resides in memory rather than in one-dimensional asymmetry. It is of interest to characterize the general class of schemes for which this interpretation holds.
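As a hedged illustration, not taken from the paper itself: the Wyner-Ziv entropy property for the binary symmetric channel, commonly known as Mrs. Gerber's Lemma, lower-bounds the output entropy rate by h(h^{-1}(v) * p), where v is the input entropy rate, p is the crossover probability, h is the binary entropy function, and a * p = a(1-p) + (1-a)p denotes binary convolution. The Python sketch below evaluates this bound numerically; the function names and the bisection-based inverse are illustrative assumptions, not part of the original abstract.

    import math

    def h(x):
        # Binary entropy function in bits; h(0) = h(1) = 0.
        if x <= 0.0 or x >= 1.0:
            return 0.0
        return -x * math.log2(x) - (1.0 - x) * math.log2(1.0 - x)

    def h_inv(v):
        # Inverse of h restricted to [0, 1/2], computed by bisection
        # (illustrative helper, assumed here for the sketch).
        lo, hi = 0.0, 0.5
        for _ in range(100):
            mid = (lo + hi) / 2.0
            if h(mid) < v:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2.0

    def conv(a, p):
        # Binary convolution a * p = a(1-p) + (1-a)p.
        return a * (1.0 - p) + (1.0 - a) * p

    def mgl_output_entropy_bound(v, p):
        # Mrs. Gerber's Lemma lower bound on the BSC output entropy rate,
        # given input entropy rate v (bits/symbol) and crossover probability p.
        return h(conv(h_inv(v), p))

    # Example: input entropy rate 0.5 bit/symbol over a BSC with p = 0.1.
    print(mgl_output_entropy_bound(0.5, 0.1))

The bound quantifies the abstract's theme: a memoryless symmetric channel driven by a redundant binary input still produces an output whose entropy rate cannot fall below a value determined by the input entropy rate alone, regardless of how that redundancy is arranged.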
