Abstract

The goal of vector quantization is to use a small set of reproduction vectors to represent original data vectors while maintaining the necessary fidelity of the data. Distributed signal processing has received much attention in recent years, since in many applications data are collected and stored at distributed nodes over networks, and centralizing all these data at one processing center is sometimes impractical. In this paper, we develop a distributed vector quantization (VQ) algorithm based on the Kullback-Leibler (K-L) divergence. We start from the centralized case and propose to minimize the K-L divergence between the distribution of the global original data and the distribution of the global reproduction vectors; we then obtain an online iterative solution to this optimization problem based on Robbins-Monro stochastic approximation. Afterwards, we extend the solution to distributed cases by introducing diffusion cooperation among nodes. Numerical simulations show that the performance of the distributed K-L-based VQ algorithm is very close to that of the corresponding centralized algorithm. Moreover, both the centralized and distributed K-L-based VQ algorithms are more robust to outliers than the (centralized) Linde-Buzo-Gray (LBG) algorithm and the (centralized) self-organizing map (SOM) algorithm.
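The centralized step described above — minimizing the K-L divergence between the data distribution and a density placed on the reproduction vectors, solved online via Robbins-Monro stochastic approximation — can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: it assumes a Gaussian-kernel density on the codebook and a decreasing step size; the kernel width `sigma`, step-size schedule, and function name are all choices made here for illustration.

```python
import numpy as np

def kl_vq_centralized(data, M=8, sigma=0.5, steps=20000, eta0=0.1, seed=0):
    """Online K-L-based VQ sketch: stochastic gradient descent on
    -log q(x), where q is a Gaussian-kernel density centered at the
    reproduction vectors (a Robbins-Monro-type iteration)."""
    rng = np.random.default_rng(seed)
    # initialize reproduction vectors from random data samples
    W = data[rng.choice(len(data), M, replace=False)].astype(float)
    for t in range(steps):
        x = data[rng.integers(len(data))]
        # kernel responsibilities r_j ∝ exp(-||x - w_j||^2 / (2 sigma^2));
        # subtracting the min squared distance is only for numerical stability
        d2 = np.sum((W - x) ** 2, axis=1)
        r = np.exp(-(d2 - d2.min()) / (2 * sigma ** 2))
        r /= r.sum()
        # decreasing Robbins-Monro step size (sum eta = inf, sum eta^2 < inf)
        eta = eta0 / (1 + t / 1000)
        # gradient of -log q(x) w.r.t. w_j is proportional to -r_j (x - w_j)
        W += eta * r[:, None] * (x - W)
    return W
```

Each sample nudges the reproduction vectors toward it in proportion to their kernel responsibility, so the codebook density drifts toward the data density.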

Highlights

  • Vector quantization is a signal processing method which uses reproduction vectors to represent original data vectors while maintaining necessary fidelity of the data [1,2]

  • There are some crucial parameters in our proposed algorithms, such as the degree of node unbalance, the threshold value, the number of reproduction vectors, the network structure, etc.

  • We consider a network composed of 10 nodes, which are randomly distributed in a region
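The simulated setup in the last highlight — nodes scattered at random over a region, with links between nearby nodes — can be sketched as below. The connection rule and the `radius` parameter are assumptions for illustration; the paper only states that the 10 nodes are randomly distributed in a region.

```python
import numpy as np

def random_geometric_network(n=10, radius=0.4, seed=0):
    """Scatter n nodes uniformly in the unit square and connect two
    nodes when their distance is below 'radius' (assumed rule).
    Returns node positions and a 0/1 adjacency matrix with self-loops."""
    rng = np.random.default_rng(seed)
    pos = rng.random((n, 2))
    diff = pos[:, None, :] - pos[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    adjacency = (dist < radius).astype(int)  # diagonal gives self-loops
    return pos, adjacency
```

Self-loops are kept because diffusion strategies typically let a node combine its own estimate with its neighbors' estimates.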


Summary

Introduction

Vector quantization is a signal processing method which uses reproduction vectors to represent original data vectors while maintaining the necessary fidelity of the data [1,2]. Based on information-theoretic concepts, vector quantization algorithms that aim to minimize the Cauchy-Schwartz (C-S) divergence or the Kullback-Leibler (K-L) divergence between the distributions of the original data and the reproduction vectors have been devised and shown to perform better than the LBG and SOM algorithms [8,9]. In a majority of distributed algorithms, signal processing tasks are accomplished at each node based on local computation, local data, and limited information exchange among neighboring nodes. The existing divergence-based vector quantization algorithms [8,9] cannot be directly extended to the distributed case, because each individual node lacks the data samples needed to estimate the global data distribution (details are provided below). The performance of the distributed algorithm is very close to that of the corresponding centralized algorithm.
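Diffusion cooperation, as used to extend the centralized solution, typically follows an adapt-then-combine pattern: each node takes a local stochastic gradient step on its own data, then averages the intermediate codebooks of its neighbors. The sketch below illustrates that pattern under stated assumptions — it is not the paper's exact algorithm; in particular, it assumes all nodes start from a common initial codebook so that codevector indices stay aligned across the network, and uses uniform combination weights over neighbors.

```python
import numpy as np

def diffusion_kl_vq(node_data, adjacency, M=4, sigma=0.5,
                    rounds=200, eta=0.05, seed=0):
    """Adapt-then-combine diffusion sketch of distributed K-L-based VQ.
    'adjacency' is a symmetric 0/1 matrix with self-loops; combination
    weights are row-normalized (uniform over each node's neighbors)."""
    rng = np.random.default_rng(seed)
    N = len(node_data)
    d = node_data[0].shape[1]
    # common initialization keeps codevector indexing aligned (assumption)
    init = rng.normal(size=(M, d))
    W = [init.copy() for _ in range(N)]
    A = adjacency / adjacency.sum(axis=1, keepdims=True)
    for _ in range(rounds):
        # adapt: each node takes one stochastic K-L gradient step locally
        psi = []
        for k in range(N):
            x = node_data[k][rng.integers(len(node_data[k]))]
            d2 = np.sum((W[k] - x) ** 2, axis=1)
            r = np.exp(-(d2 - d2.min()) / (2 * sigma ** 2))
            r /= r.sum()
            psi.append(W[k] + eta * r[:, None] * (x - W[k]))
        # combine: convex combination of neighbors' intermediate codebooks
        W = [sum(A[k, l] * psi[l] for l in range(N)) for k in range(N)]
    return W
```

Only the intermediate codebooks are exchanged in the combine step, never raw data, which is what keeps the per-round communication cost independent of the local dataset sizes.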

Starting from the Centralized Case
Extended to Distributed Cases
Communication Complexity Analysis
Numerical Experiments
Data Generation and Evaluation Indexes
Results
Conclusions