Abstract

A joint source-channel multiple description (JSC-MD) framework for signal estimation and communication in resource-constrained lossy networks is presented. To keep the encoder complexity at a minimum, a signal is coded by a multiple description quantizer (MDQ) with neither entropy nor channel coding. The code diversity of the MDQ and the path diversity of the network are exploited by the decoders to combat transmission errors. A key design objective is resource scalability: powerful nodes in the network can perform JSC-MD estimation under the criteria of maximum a posteriori probability (MAP) or minimum mean-square error (MMSE), while primitive nodes resort to simpler MD decoding, all working with the same MDQ code. The application of JSC-MD to distributed estimation of hidden Markov models in a sensor network is demonstrated. The proposed JSC-MD MAP estimator is an algorithm for finding the longest path in a weighted directed acyclic graph, while the JSC-MD MMSE decoder is an extension of the well-known forward-backward algorithm to multiple descriptions. Both algorithms simultaneously exploit the source memory, the redundancy of the fixed-rate MDQ, and the inter-description correlations. They outperform existing hard-decision MDQ decoders by large margins (up to 8 dB). For Gaussian Markov sources, the complexity of JSC-MD distributed MAP sequence estimation can be made as low as that of typical single-description Viterbi-type algorithms.

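The abstract notes that the JSC-MD MAP estimator reduces to finding a longest path in a weighted directed acyclic graph. The sketch below is only an illustration of that general dynamic-programming structure (a Viterbi-like pass over a topologically ordered DAG with backpointers); the graph construction, edge weights, and function names are hypothetical and are not taken from the paper.

```python
# Minimal sketch: longest path in a weighted DAG by dynamic programming
# over a topological order. Edge weights stand in for log-probabilities,
# so the longest path corresponds to a MAP-style sequence choice.
# All identifiers here are illustrative assumptions, not the paper's API.

from collections import defaultdict


def longest_path(num_nodes, edges, source, sink):
    """Return (best_score, best_path) from `source` to `sink`.

    edges: iterable of (u, v, weight) with u < v, so that increasing node
    index is already a topological order (a simplifying assumption).
    """
    adj = defaultdict(list)
    for u, v, w in edges:
        adj[u].append((v, w))

    score = [float("-inf")] * num_nodes   # best accumulated weight per node
    back = [None] * num_nodes             # backpointers for path recovery
    score[source] = 0.0

    # Relax outgoing edges in topological (index) order.
    for u in range(num_nodes):
        if score[u] == float("-inf"):
            continue
        for v, w in adj[u]:
            if score[u] + w > score[v]:
                score[v] = score[u] + w
                back[v] = u

    # Trace the optimal path back from the sink.
    path, node = [], sink
    while node is not None:
        path.append(node)
        node = back[node]
    return score[sink], path[::-1]


if __name__ == "__main__":
    # Toy 4-node trellis; weights mimic log-likelihood increments.
    edges = [(0, 1, -0.2), (0, 2, -1.5), (1, 3, -0.3), (2, 3, -0.1)]
    print(longest_path(4, edges, source=0, sink=3))  # -> (-0.5, [0, 1, 3])
```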