Abstract

Vector quantization is a powerful technique for image coding. The Generalized Lloyd Algorithm (GLA) is a commonly acknowledged non-neural-network benchmark for vector quantizer design; its main drawbacks are high computational complexity and a large memory requirement. Competitive learning algorithms in neural networks, on the other hand, appear well suited to vector quantization applications. In this paper, we first discuss three existing competitive learning algorithms for training a vector quantizer: the Basic Competitive Learning Algorithm, the Kohonen Self-Organizing Feature Map (KSFM), and the Frequency-Sensitive Competitive Learning Algorithm. We then introduce a new Near-Optimal Learning Algorithm (NOLA) that combines the advantages of the KSFM and K-means algorithms. It generates codebooks on-line with the least disturbance to previously learned results. Finally, we compare NOLA with the Generalized Lloyd Algorithm. Experimental results show that NOLA achieves near-optimal performance with a total execution time only slightly longer than one iteration of the Generalized Lloyd Algorithm, offering the potential for real-time vector quantization. In addition, NOLA does not need to store the entire training set, only the previously trained results.
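To make the competitive learning approach to codebook design concrete, here is a minimal sketch of the generic winner-take-all scheme underlying the Basic Competitive Learning Algorithm mentioned above. This is an illustrative implementation of the standard technique, not the paper's NOLA algorithm: for each training vector, only the nearest ("winning") codeword is moved toward it, so the codebook is updated on-line without storing the whole training set. The function names and the decaying learning-rate schedule are assumptions chosen for the example.

```python
import numpy as np

def competitive_learning_vq(train, codebook_size, epochs=5, lr0=0.5, seed=0):
    """Train a VQ codebook with basic (winner-take-all) competitive learning.

    Illustrative sketch of the generic technique, not the paper's NOLA:
    each training vector moves only its nearest codeword toward itself.
    """
    rng = np.random.default_rng(seed)
    # Initialize codewords from randomly chosen training vectors.
    codebook = train[rng.choice(len(train), codebook_size, replace=False)].copy()
    n_steps = epochs * len(train)
    step = 0
    for _ in range(epochs):
        for x in train[rng.permutation(len(train))]:
            lr = lr0 * (1.0 - step / n_steps)  # linearly decaying learning rate
            # Find the winning (nearest) codeword in squared Euclidean distance.
            winner = np.argmin(np.sum((codebook - x) ** 2, axis=1))
            # Move only the winner toward the input vector.
            codebook[winner] += lr * (x - codebook[winner])
            step += 1
    return codebook

def mse_distortion(train, codebook):
    """Mean squared quantization error of the codebook over a training set."""
    d = np.sum((train[:, None, :] - codebook[None, :, :]) ** 2, axis=2)
    return d.min(axis=1).mean()
```

A quantizer trained this way should reach a much lower distortion than a single-codeword (mean) quantizer on clustered data, though plain winner-take-all learning can under-utilize codewords, which is the problem frequency-sensitive variants address.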
