Abstract

The major problems with finite-state vector quantization (FSVQ) are the lack of accurate prediction of the current state, the difficulty of state codebook design, and the amount of memory required to store all state codebooks. This paper presents a new FSVQ scheme called finite-state residual vector quantization (FSRVQ), in which a neural-network-based state prediction is used. Furthermore, a novel tree-structured competitive neural network is used to jointly design the next-state and state codebooks for the proposed FSRVQ. The proposed FSRVQ scheme differs from conventional FSVQ in that the state codebooks encode residual vectors instead of the original vectors. The neural network predictor predicts the current block from the four previously encoded blocks. The index of the codevector closest to the predicted vector (in the Euclidean distance sense) represents the current state. The residual vector, obtained by subtracting the predicted vector from the original vector, is then encoded using the current state codebook. The neural network predictor is trained using the backpropagation learning algorithm. The next-state codebook and the corresponding state codebooks are jointly designed using the tree-structured competitive neural network. This joint optimization eliminates a large number of unnecessary states, which in turn reduces the memory requirement by several orders of magnitude compared to ordinary FSVQ.
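The encoding step described above (predict the current block, map the prediction to a state via nearest codevector, then encode the residual with that state's codebook) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the NumPy representation of blocks as vectors, and the stand-in predictor are all assumptions.

```python
import numpy as np

def nearest_index(vec, codebook):
    # Index of the codevector closest to vec in the Euclidean distance sense.
    return int(np.argmin(np.linalg.norm(codebook - vec, axis=1)))

def fsrvq_encode_block(original, neighbors, predictor, next_state_cb, state_cbs):
    """One FSRVQ encoding step (illustrative sketch).

    original:      current block, flattened to a vector
    neighbors:     the four previously encoded blocks (stacked as rows)
    predictor:     stand-in for the trained neural network predictor
    next_state_cb: next-state codebook (rows are codevectors)
    state_cbs:     one state codebook per state
    """
    predicted = predictor(neighbors)                  # predict current block
    state = nearest_index(predicted, next_state_cb)   # current state = nearest codevector
    residual = original - predicted                   # residual to be quantized
    code = nearest_index(residual, state_cbs[state])  # encode with the state codebook
    return state, code

# A decoder with the same predictor and codebooks reconstructs the block as
#   predictor(neighbors) + state_cbs[state][code]
```

Here a simple averaging function can stand in for the trained predictor when experimenting; in the paper's scheme the predictor is the backpropagation-trained neural network.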
