Abstract

The principle of vector quantization is briefly reviewed. It is pointed out that, for vector quantizers based on random codebooks, memory requirements and computational complexity grow exponentially with transmission rate and vector length. As a possible solution to this problem it is suggested to introduce sufficient algebraic structure into the codebook so as to facilitate a fast, systematic, and nonexhaustive search through a greatly reduced codebook. This goal is achieved by using n-dimensional lattices in real Euclidean space as quantizers. Two construction methods are introduced whereby dense lattices can be constructed from linear binary error-correcting codes. The densest lattices in up to 24 dimensions are presented and their performance as n-dimensional lattice quantizers is evaluated, based on the mean-square error criterion.
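The fast, nonexhaustive search that the abstract alludes to can be illustrated with a small sketch. For the checkerboard lattice D_n (all integer vectors whose coordinates sum to an even number), the nearest lattice point to an arbitrary input can be found in O(n) time without enumerating any codebook, in contrast to the exhaustive search a random codebook requires. The function names below are illustrative, not taken from the paper; the D_n rounding rule shown is the standard fast-quantizing algorithm for this lattice family.

```python
import numpy as np

def quantize_Zn(x):
    """Nearest point in the integer lattice Z^n: round each coordinate."""
    return np.round(x)

def quantize_Dn(x):
    """Nearest point in the checkerboard lattice D_n.

    Round every coordinate to the nearest integer; if the coordinate
    sum comes out odd (so the result is not in D_n), re-round the
    single coordinate with the largest rounding error to its
    second-nearest integer.  Runs in O(n) -- no codebook search.
    """
    x = np.asarray(x, dtype=float)
    f = np.round(x)
    if int(f.sum()) % 2 == 0:
        return f
    # Coordinate farthest from its rounded value pays the smallest
    # mean-square penalty for being pushed to the other integer.
    k = int(np.argmax(np.abs(x - f)))
    f[k] += 1.0 if x[k] > f[k] else -1.0
    return f

# Example: the nearest D_2 point to (0.9, 0.2) is (1, 1), not the
# plain rounding (1, 0), whose coordinate sum is odd.
print(quantize_Dn([0.9, 0.2]))
```

The same idea, applied recursively through the coset structure of denser lattices such as E_8 or the Leech lattice, is what makes lattice quantizers practical at the dimensions the abstract discusses.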
