Abstract

We study online parameter estimation over a distributed network, where the nodes collaboratively estimate a dynamically evolving parameter from noisy observations. The nodes are equipped with processing and communication capabilities and can share their observations or local estimates with their neighbors. Conventional distributed estimation algorithms cannot achieve team-optimal online estimation in the finite-horizon global mean-square error (MSE) sense. To this end, we present a team-optimal distributed estimation algorithm that tracks an underlying dynamic parameter through the disclosure of local estimates. We first show that optimal estimation can be achieved through the diffusion of all time-stamped observations for any arbitrary network, and prove that team optimality through the disclosure of local estimates is possible only for certain network topologies, such as tree networks. We then derive an iterative algorithm that recursively calculates the combination weights of the disclosed information and constructs the team-optimal estimate at each time step. Through a series of simulations, we demonstrate the superior performance of the proposed algorithm over state-of-the-art diffusion distributed estimation algorithms in terms of both convergence rate and finite-horizon MSE. We also show that, while conventional distributed estimation schemes cannot track highly dynamic parameters, the proposed algorithm maintains a stable MSE performance through optimal weight and estimate construction.
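To make the setting concrete, the following is a minimal sketch of a *generic* combine-then-adapt diffusion scheme on a three-node chain (a tree topology), tracking a scalar random-walk parameter from noisy observations. The uniform combination weights, step size, and noise levels here are illustrative assumptions only; they are not the recursively optimized weights derived in the paper.

```python
import numpy as np

# Hypothetical setup: 3-node chain network 0 - 1 - 2 (a tree),
# tracking a scalar parameter that evolves as a random walk.
rng = np.random.default_rng(0)

T = 2000                      # time horizon
theta = 0.0                   # true dynamic parameter
q, r = 0.01, 0.5              # process / observation noise std devs (assumed)
mu = 0.1                      # adaptation step size (assumed)

# each node's neighborhood (including itself)
neighbors = {0: [0, 1], 1: [0, 1, 2], 2: [1, 2]}
# uniform combination weights over each neighborhood -- a placeholder,
# NOT the team-optimal weights the paper constructs
A = {k: np.full(len(v), 1.0 / len(v)) for k, v in neighbors.items()}

est = np.zeros(3)             # local estimates at each node
sq_err = 0.0

for t in range(T):
    theta += q * rng.standard_normal()            # parameter evolves
    obs = theta + r * rng.standard_normal(3)      # noisy local observations
    # adapt: each node moves its estimate toward its own observation
    psi = est + mu * (obs - est)
    # combine: each node averages intermediate estimates from its neighbors
    est = np.array([A[k] @ psi[neighbors[k]] for k in range(3)])
    sq_err += np.mean((est - theta) ** 2)

mse = sq_err / T
print(f"network-average MSE: {mse:.4f}")
```

Combining neighbors' estimates drives the network-average MSE well below the raw observation noise variance (0.25 here); the paper's contribution is replacing the placeholder uniform weights with recursively computed, team-optimal combination weights.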
