Abstract

We consider the quickest detection of idle periods in multiple ON-OFF processes. At each time, only one process can be observed, and the observations are random realizations drawn from two different distributions depending on the current state (ON or OFF) of the chosen process. The objective is to catch an idle period in any of the ON-OFF processes as quickly as possible, subject to a reliability constraint. We show that this problem presents a fresh twist on the classic problem of quickest change detection, which considers only a single stochastic process. A Bayesian formulation of the problem is developed for both an infinite and a finite number of processes, based on the theory of partially observable Markov decision processes (POMDPs). While a general POMDP is PSPACE-hard, we show that the optimal decision rule has a simple threshold structure in the infinite case. For the finite case, basic properties of the optimal decision rule are established, and a low-complexity threshold policy is proposed that converges to the optimal decision rule for the infinite case as the number of processes increases. This problem finds applications in spectrum sensing in cognitive radio networks, where a secondary user searches for idle channels in the spectrum.
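
To make the threshold structure concrete, below is a minimal sketch (not the paper's algorithm) of a Bayesian belief update and threshold stopping rule for a single observed ON-OFF process. It assumes Gaussian observation distributions and a two-state Markov chain for the ON-OFF dynamics; all parameter names and values are illustrative assumptions, and the choice of which of the multiple processes to observe at each time, which the paper's POMDP formulation addresses, is omitted.

```python
import math

# Illustrative parameters (assumptions, not values from the paper).
P_ON_TO_OFF = 0.1       # ON -> OFF transition probability of the ON-OFF chain
P_OFF_TO_ON = 0.2       # OFF -> ON transition probability
MU_OFF, MU_ON, SIGMA = 0.0, 1.0, 1.0  # Gaussian observation models for OFF / ON


def gaussian_pdf(y, mu, sigma):
    """Density of N(mu, sigma^2) at y."""
    return math.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))


def update_belief(p_off, y):
    """One Bayesian update of the posterior probability that the observed
    process is OFF, after the Markov transition and a new observation y."""
    prior_off = p_off * (1 - P_OFF_TO_ON) + (1 - p_off) * P_ON_TO_OFF
    num = prior_off * gaussian_pdf(y, MU_OFF, SIGMA)
    den = num + (1 - prior_off) * gaussian_pdf(y, MU_ON, SIGMA)
    return num / den


def detect_idle(observations, threshold=0.95, p_off0=0.5):
    """Stop and declare an idle period the first time the belief exceeds
    `threshold` (which would be chosen to meet the reliability constraint)."""
    p_off = p_off0
    for t, y in enumerate(observations):
        p_off = update_belief(p_off, y)
        if p_off >= threshold:
            return t, p_off   # stopping time and terminal belief
    return None, p_off        # no declaration within the observation window
```

The threshold plays the role of the reliability constraint on declaring an idle period; the additional layer in the paper is deciding which process to observe at each time, which is what turns the problem into a POMDP rather than a pure stopping problem.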
