Abstract

In this paper, we analyze diffusion strategies for distributed estimation in adaptive networks, in which all nodes attempt to estimate a common vector parameter. Under diffusion strategies, each node needs to share processed data with its predefined neighbors. Although internode communication contributes significantly to the convergence performance of diffusion, it consumes a large amount of power for data transmission. In developing low-power diffusion strategies, it is therefore important to reduce the communication cost without significantly degrading convergence performance. To this end, we propose a data-reserved periodic diffusion least-mean-squares (LMS) algorithm in which each node updates and transmits an estimate periodically while reserving its measurement data during non-update times. By applying these reserved data in the adaptation step at update time, the proposed algorithm mitigates the decline in convergence speed incurred by most conventional periodic schemes. For a period $p$, the total communication cost is reduced to $1/p$ of that of the conventional adapt-then-combine (ATC) diffusion LMS algorithm. The loss of combination steps in this process naturally leads to a slight increase in the steady-state error as the period $p$ increases, which we confirm through theoretical analysis. We also prove an interesting property of the proposed algorithm, namely, that it suffers less degradation of the steady-state error than conventional diffusion in a noisy communication environment. Experimental results show that the proposed algorithm outperforms related conventional algorithms and, in particular, outperforms ATC diffusion LMS over a network with noisy links.
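To make the mechanism concrete, the following is a minimal sketch of one plausible reading of the data-reserved periodic scheme described above: every node measures at every time instant and buffers its data, but adapts, transmits, and combines only once per period $p$. All variable names, the network model (uniform combination weights), and the noise levels are illustrative assumptions, not the authors' exact formulation.

```python
# Hypothetical sketch of a data-reserved periodic diffusion LMS update.
# Assumed setup: N nodes, uniform combination weights, scalar measurements
# d_k(i) = u_k(i)^T w_o + noise. Not the paper's exact formulation.
import numpy as np

rng = np.random.default_rng(0)

M = 4          # length of the unknown common parameter vector w_o
N = 10         # number of nodes
p = 3          # update/transmission period
mu = 0.01      # LMS step size
iters = 3000

w_o = rng.standard_normal(M)              # common vector parameter to estimate
A = np.full((N, N), 1.0 / N)              # combination matrix (uniform weights, assumed)
w = np.zeros((N, M))                      # per-node estimates
buffers = [[] for _ in range(N)]          # reserved (u, d) pairs between updates

for i in range(iters):
    # Every node measures at every time instant and reserves the data,
    # even during non-update times.
    for k in range(N):
        u = rng.standard_normal(M)                    # regressor
        d = u @ w_o + 0.01 * rng.standard_normal()    # noisy measurement
        buffers[k].append((u, d))

    if (i + 1) % p == 0:                  # update time: adapt, transmit, combine
        psi = np.empty_like(w)
        for k in range(N):
            # Adaptation step uses all p reserved samples, which mitigates
            # the convergence-speed loss of plain periodic schemes.
            psi[k] = w[k]
            for u, d in buffers[k]:
                psi[k] = psi[k] + mu * (d - u @ psi[k]) * u
            buffers[k].clear()
        # Combination step (ATC style): each node mixes its neighbors'
        # intermediate estimates. This is the only step requiring communication.
        w = A @ psi
```

Since estimates are transmitted only at update times, communication occurs once every $p$ iterations, which is consistent with the stated $1/p$ reduction in total communication cost relative to ATC diffusion LMS; the combination step is likewise performed $p$ times less often, which is the source of the slight steady-state error increase noted above.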
