Abstract
The joint source and channel coding theorem states that for any stationary ergodic process X₁, X₂, … whose entropy rate H∞(X) lies below the capacity C of the channel, and for any ε > 0, it is possible to find a joint source and channel encoder of dimension n such that the probability of a decoding error is smaller than ε. A joint source and channel encoder in this context is a mapping from source sequences of length n to code sequences of length n, i.e., a rate-1 encoder. The information rate over the channel is the entropy rate of the source. The question we try to resolve is whether there are Markov sources for which a joint source and channel encoder is not necessary: what reliability can be achieved by a decoder that uses the natural redundancy of the source to reconstruct its output when the source output is transmitted uncoded over the channel? Human decoders are able to reconstruct English text when up to half of the letters in the text are missing. Are French or German preferable to English in this respect? What properties of a Markov source make it suitable for uncoded transmission? At equal entropy rates, can one Markov source be better suited than another for uncoded transmission? Is there a good and a bad redundancy?
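As a concrete illustration of the condition H∞(X) < C, the following sketch (an assumption-laden example, not taken from the paper) computes the entropy rate of a symmetric binary Markov source and the capacity of a binary symmetric channel, using only standard formulas: a symmetric binary Markov source with flip probability a has entropy rate h₂(a), and a BSC with crossover probability q has capacity 1 − h₂(q), where h₂ is the binary entropy function. The helper names are illustrative.

```python
import math

def h2(p: float) -> float:
    """Binary entropy in bits; h2(0) = h2(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def markov_entropy_rate(a: float) -> float:
    """Entropy rate of a symmetric binary Markov source with
    transition (flip) probability a; its stationary distribution
    is uniform, so the rate reduces to h2(a)."""
    return h2(a)

def bsc_capacity(q: float) -> float:
    """Capacity of a binary symmetric channel with crossover
    probability q, in bits per channel use."""
    return 1.0 - h2(q)

a, q = 0.1, 0.05
H = markov_entropy_rate(a)  # ≈ 0.469 bits/symbol
C = bsc_capacity(q)         # ≈ 0.714 bits/use
print(H < C)  # True: the theorem guarantees a reliable rate-1 encoder exists
```

Whether *uncoded* transmission can approach this guarantee for such a source is exactly the question the abstract poses.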