Abstract
This paper addresses the issue of robust and progressive transmission of signals (e.g., images, video) encoded with variable length codes (VLCs) over error-prone channels. It first describes bitstream construction methods offering good properties in terms of error resilience and progressivity. In contrast with related algorithms described in the literature, all proposed methods have a linear complexity as the sequence length increases. The applicability of soft-input soft-output (SISO) and turbo decoding principles to the resulting bitstream structures is investigated. In addition to error resilience, the amenability of the bitstream construction methods to progressive decoding is considered. The problem of code design for achieving good performance in terms of error resilience and progressive decoding with these transmission strategies is then addressed. The VLC code has to be such that the symbol energy is mainly concentrated on the first bits of the symbol representation (i.e., on the first transitions of the corresponding codetree). Simulation results reveal high performance in terms of symbol error rate (SER) and mean-square reconstruction error (MSE). These error-resilience and progressivity properties are obtained without any penalty in compression efficiency. Codes with such properties are of strong interest for the binarization of m-ary sources in state-of-the-art image and video coding systems making use of, for example, the EBCOT or CABAC algorithms. A prior statistical analysis of the signal allows the construction of the appropriate binarization code.
Highlights
Entropy coding, producing variable length codes (VLCs), is a core component of any image and video compression scheme.
The main drawback of VLCs is their high sensitivity to channel noise: when some bits are altered by the channel, synchronization losses can occur at the receiver, so the positions of symbol boundaries are not properly estimated, leading to dramatic symbol error rates (SER).
The performance of the different codes and bitstream construction (BC) algorithms has been assessed in terms of SER, signal-to-noise ratio (SNR), and Levenshtein distance with Source S(1) and Source S(2), introduced in Examples 1 and 5, respectively.
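The Levenshtein distance is a natural assessment metric here because a desynchronized VLC decoder may output more or fewer symbols than were transmitted, so a plain symbol-by-symbol error count is ill-defined. The following is a minimal illustrative Python sketch (not code from the paper) of computing this edit distance between a transmitted and a decoded symbol sequence; the example sequences are hypothetical.

```python
# Symbol-level Levenshtein (edit) distance between transmitted and decoded
# sequences. Unlike a positional SER, it remains meaningful when VLC
# desynchronization inserts or deletes symbols. Illustrative sketch only.

def levenshtein(a, b):
    # Dynamic programming over prefixes: prev[j] holds the edit distance
    # between a[:i-1] and b[:j]; curr[j] between a[:i] and b[:j].
    prev = list(range(len(b) + 1))
    for i, sa in enumerate(a, 1):
        curr = [i]
        for j, sb in enumerate(b, 1):
            cost = 0 if sa == sb else 1
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + cost))  # substitution
        prev = curr
    return prev[-1]

sent    = list("abcabcabca")   # hypothetical transmitted symbols
decoded = list("ababbcabca")   # e.g., output of a desynchronized VLC decoder
print(levenshtein(sent, decoded))   # small edit distance despite shifted boundaries
```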
Summary
Entropy coding, producing variable length codes (VLCs), is a core component of any image and video compression scheme. The main drawback of VLCs is their high sensitivity to channel noise: when some bits are altered by the channel, synchronization losses can occur at the receiver, so the positions of symbol boundaries are not properly estimated, leading to dramatic symbol error rates (SER). This phenomenon has motivated studies of the synchronization capability of VLCs as well as the design of codes with better synchronization properties [1,2,3]. The principle can be pushed further in order to optimize the criteria of resilience, computational complexity, and progressivity.
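To make the desynchronization mechanism concrete, here is a small hypothetical Python sketch (not taken from the paper): a three-symbol prefix-free VLC is decoded after a single bit flip, and the shifted codeword boundaries propagate symbol errors beyond the corrupted bit. The codebook and source sequence are illustrative assumptions.

```python
# Illustrative sketch: how one flipped bit desynchronizes a VLC decoder.

CODE = {"a": "0", "b": "10", "c": "11"}            # hypothetical prefix-free VLC
DECODE = {bits: sym for sym, bits in CODE.items()}

def encode(symbols):
    return "".join(CODE[s] for s in symbols)

def decode(bitstream):
    symbols, buf = [], ""
    for bit in bitstream:
        buf += bit
        if buf in DECODE:            # prefix-free: emit as soon as a codeword matches
            symbols.append(DECODE[buf])
            buf = ""
    return symbols

src = list("abcabcabca")
tx = encode(src)
rx = tx[:3] + ("1" if tx[3] == "0" else "0") + tx[4:]   # flip a single bit

print(decode(tx))   # matches src
print(decode(rx))   # codeword boundaries shift: errors spread past the flipped bit
```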