Abstract

Level set estimation (LSE) is the process of classifying the region(s) in which the values of an unknown function exceed a certain threshold. It has a wide range of applications, such as spectrum sensing and environmental monitoring. In this paper, we study the optimal LSE of a linear random field that changes with respect to time. A linear sensor network is used to take discrete samples of the spatially-temporally correlated random field in both the space and time domains, and the sensors operate under a total power constraint. The samples are congregated at a fusion center (FC), which performs LSE of the random field using noisy observations of the samples. Under the Gaussian process (GP) framework, we first develop an optimal LSE algorithm that minimizes the LSE error probability. The results are then used to derive the exact LSE error probability with the assistance of frequency-domain analysis. The analytical LSE error probability is expressed as an explicit function of a number of system parameters, such as the distance between two adjacent nodes, the sampling period in the time domain, the signal-to-noise ratio (SNR), and the spatial-temporal correlation of the random field. With the analytical results, we can identify the optimum node distance and sampling period that minimize the LSE error probability.
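To make the setting concrete, the following is a minimal sketch (not the paper's algorithm) of GP-based level set estimation over a one-dimensional sensor grid. The kernel, sensor locations, noise variance, and threshold below are hypothetical choices for illustration; with a Gaussian posterior and equal error costs, the error-probability-minimizing decision reduces to thresholding the GP posterior mean.

```python
import numpy as np

def rbf_kernel(x1, x2, ell=1.0, sigma_f=1.0):
    """Squared-exponential covariance between two sets of 1-D locations (illustrative choice)."""
    d = x1[:, None] - x2[None, :]
    return sigma_f**2 * np.exp(-0.5 * (d / ell) ** 2)

rng = np.random.default_rng(0)
grid = np.linspace(0.0, 10.0, 200)      # locations where the level set is estimated
sensors = np.linspace(0.0, 10.0, 20)    # hypothetical linear sensor network
noise_var = 0.1                         # observation noise variance (sets the SNR)
threshold = 0.0                         # level set threshold

# Draw one realization of a zero-mean Gaussian random field at the sensor locations
K_ss = rbf_kernel(sensors, sensors)
field = rng.multivariate_normal(np.zeros(len(sensors)), K_ss)
obs = field + np.sqrt(noise_var) * rng.standard_normal(len(sensors))

# GP posterior mean over the grid given the noisy sensor observations
K_gs = rbf_kernel(grid, sensors)
alpha = np.linalg.solve(K_ss + noise_var * np.eye(len(sensors)), obs)
posterior_mean = K_gs @ alpha

# Classify each grid point as above/below the threshold by thresholding the posterior mean
level_set_estimate = posterior_mean > threshold
print(f"{level_set_estimate.sum()} of {len(grid)} grid points classified above threshold")
```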
