Abstract

The problem of detecting the initial state of a network dynamics from noisy local observations is examined. Specifically, a linear synchronisation dynamics defined on a graph is modelled as being initiated by one of two possible initial conditions (or hypotheses) with certain a priori probabilities, to capture two possible evolutions of the network dynamics; an external agent measures the network dynamics at one network component and is tasked with determining which hypothesis is more likely. We find that the external agent's detection performance (specifically, the probability of error in MAP detection) falls into one of three cases, depending on the network's spectrum and graph topology, the hypotheses, and the observation location. The three cases are: (1) a no-improvement case, in which the measured data does not permit improved detection compared to a priori detection; (2) an asymptotically-perfect case, in which the error probability approaches 0 exponentially with increasing measurement horizon; and (3) an improved-but-imperfect case, in which measurements reduce the error probability but do not drive it to 0. Beyond this trichotomy, we obtain spectral characterisations of detector performance in the imperfect-estimation case, which can be translated into graph-theoretic results.
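The setting described above can be illustrated with a minimal numerical sketch. The example below is an assumption-laden toy, not the paper's construction: it uses a hypothetical 4-node path-graph Laplacian dynamics, a Gaussian observation-noise model, and two illustrative initial conditions, and estimates the MAP error probability by Monte Carlo for several measurement horizons.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 4-node path-graph Laplacian (an assumed example network)
n = 4
L = np.diag([1.0, 2.0, 2.0, 1.0])
for i in range(n - 1):
    L[i, i + 1] = L[i + 1, i] = -1.0
eps = 0.1
A = np.eye(n) - eps * L          # discrete-time linear synchronisation map

c = np.zeros(n); c[0] = 1.0      # measurement at one network component (node 0)
sigma = 0.5                      # assumed observation-noise standard deviation
p0 = 0.5                         # a priori probability of hypothesis H0

x0 = np.array([1.0, 0.0, 0.0, 0.0])   # H0 initial condition (illustrative)
x1 = np.array([0.0, 0.0, 0.0, 1.0])   # H1 initial condition (illustrative)

def noiseless_output(x_init, T):
    """Noise-free measurement sequence c^T A^t x_init for t = 0..T-1."""
    x, ys = x_init.copy(), []
    for _ in range(T):
        ys.append(c @ x)
        x = A @ x
    return np.array(ys)

def map_detect(y, T):
    """MAP decision between the two hypotheses given noisy measurements y."""
    m0, m1 = noiseless_output(x0, T), noiseless_output(x1, T)
    # Gaussian log-posteriors up to a common additive constant
    ll0 = -np.sum((y - m0) ** 2) / (2 * sigma ** 2) + np.log(p0)
    ll1 = -np.sum((y - m1) ** 2) / (2 * sigma ** 2) + np.log(1 - p0)
    return 0 if ll0 >= ll1 else 1

def error_probability(T, trials=2000):
    """Monte Carlo estimate of the MAP error probability for horizon T."""
    errors = 0
    for _ in range(trials):
        truth = 0 if rng.random() < p0 else 1
        clean = noiseless_output(x0 if truth == 0 else x1, T)
        y = clean + sigma * rng.standard_normal(T)
        errors += (map_detect(y, T) != truth)
    return errors / trials

# Longer measurement horizons should not increase the error probability
print([round(error_probability(T), 3) for T in (1, 5, 20)])
```

In this toy instance the two initial conditions are distinguishable from the observed node, so the estimated error probability shrinks as the horizon grows; choosing initial conditions whose output sequences coincide at the observation node would instead reproduce the no-improvement case.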
