Abstract

Motivated by applications in biometric identification and content identification systems, we consider the problem of random coding for channels where each codeword undergoes vector quantization and where the decoder bases its decision only on the compressed codewords and the channel output, which is, in turn, the channel's response to the transmission of an original codeword, before compression. For memoryless sources and memoryless channels with finite alphabets, we propose a new universal decoder and analyze its error exponent, which improves on an earlier result by Dasarathy and Draper (2011), who used the classic maximum mutual information universal decoder. We show that our universal decoder achieves the same error exponent as the optimal maximum likelihood decoder, at least as long as all single-letter transition probabilities of the channel are positive.
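To make the setting concrete, the following is a minimal, illustrative simulation of the identification system described above. All specifics (the quaternary alphabet, the particular channel matrix `W`, the 1-bit scalar quantizer, and the codeword count) are assumptions chosen for the sketch, not parameters from the paper; the decoder shown is the maximum likelihood benchmark with respect to the channel induced through the quantizer, not the paper's new universal decoder.

```python
import numpy as np

rng = np.random.default_rng(0)

M, n = 16, 400                          # number of enrolled codewords, block length
A = 4                                   # channel input/output alphabet {0,1,2,3} (assumption)

# Memoryless channel: keep the symbol with prob. 0.85, else uniform over the rest.
W = np.full((A, A), 0.05)
np.fill_diagonal(W, 0.85)

X = rng.integers(0, A, size=(M, n))     # original i.i.d. codewords (memoryless source)
Z = X // 2                              # compressed codewords: 1-bit scalar quantizer (assumption)

# The channel responds to the ORIGINAL codeword of the true user, before compression.
true_idx = 3
y = np.array([rng.choice(A, p=W[s]) for s in X[true_idx]])

# The decoder sees only the compressed codebook Z and the output y.
# Channel "seen through" the quantizer: V[z, b] = P(Y=b | Z=z), averaging W
# over the two original symbols that quantize to z.
V = np.array([(W[0] + W[1]) / 2, (W[2] + W[3]) / 2])

# Maximum likelihood decoding w.r.t. the induced channel V (the benchmark
# decoder whose error exponent the universal decoder is shown to match).
loglik = np.log(V)[Z, y[None, :]].sum(axis=1)
decision = int(np.argmax(loglik))
```

With these parameters the induced channel is strongly informative, so the ML score of the true codeword dominates the incorrect ones with high probability; the error exponent quantifies how fast that error probability decays in the block length n.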
