Abstract

The design and performance of distributed signal detection systems with processor failures are presented. Using a general processor fault model, which also models an imperfect channel, the optimal design given prior failure probabilities is considered. The optimal design provides a performance bound for each system when failures may occur. It is shown that for general distributed systems, a likelihood ratio test is the optimal design for each local processor, provided failures are independent of the received observation vectors. The optimal design performs significantly better than a design that assumes no failures. It is shown that the fusion network performs better than a single centralized processor for many channels. This illustrates that, for the fusion network, the performance gained by mitigating the effect of failures outweighs the performance lost by distributing the processing. The serial network always performs worse than the centralized processor, and its performance is bounded as the number of channels increases.
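The fusion-versus-centralized comparison can be illustrated with a small analytic sketch. The Gaussian observation model, the thresholds, and all parameter values below are illustrative assumptions, not taken from the paper: each of `n` local processors observes one sample of N(0,1) versus N(1,1), applies a likelihood ratio (threshold) test, and its one-bit decision is flipped with failure probability `p_fail`; the fusion center takes a majority vote. The centralized processor sees all `n` samples but its single output bit is exposed to the same failure.

```python
import math
from math import erf, comb

def phi(x):
    # Standard normal CDF.
    return 0.5 * (1.0 + erf(x / math.sqrt(2.0)))

def sensor_correct_prob(p_fail):
    # Local LRT for N(0,1) vs N(1,1) reduces to thresholding at x > 0.5;
    # by symmetry each hypothesis is decided correctly with prob Phi(0.5).
    q = phi(0.5)
    # A processor failure (or channel error) flips the bit with prob p_fail.
    return q * (1.0 - p_fail) + (1.0 - q) * p_fail

def fusion_error(n, p_fail):
    # Majority vote over n independent local decisions (n odd).
    q = sensor_correct_prob(p_fail)
    p_correct = sum(comb(n, k) * q**k * (1.0 - q)**(n - k)
                    for k in range(n // 2 + 1, n + 1))
    return 1.0 - p_correct

def centralized_error(n, p_fail):
    # One processor sees all n samples: LRT error Phi(-sqrt(n)/2),
    # but its single output bit is hit by the same failure mechanism.
    e = phi(-math.sqrt(n) / 2.0)
    return e * (1.0 - p_fail) + (1.0 - e) * p_fail
```

With no failures the centralized test dominates, but as `p_fail` grows the fusion network's majority vote averages out flipped bits while the centralized output remains a single point of failure (e.g. compare `fusion_error(15, 0.1)` with `centralized_error(15, 0.1)`), consistent with the trade-off described in the abstract.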
