Abstract
It is desired to determine the worst-case asymptotic error-probability performance of a given detector operating in an environment of uncertain data dependency. A class of Markov data process distributions is considered that satisfies a one-shift dependency bound and agrees with a specified univariate distribution. Within this dependency contamination class, the distribution structure that minimizes the exponential rate of decrease of the detection error probabilities is identified. This is a uniform least-favorability principle, because the least-favorable dependency structure is the same for all bounded memoryless detectors. The error-probability exponential-rate criterion used is a device of large deviations theory. The results agree well with previous results obtained using Pitman's asymptotic relative efficiency (ARE), a more tractable small-signal performance criterion. In contrast to ARE, large deviations theory is closely related to finite-sample error probabilities via finite-sample Chernoff bounds and other exponentially tight bounds and approximations.
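As a point of reference only, a minimal sketch of the exponential-rate criterion for a bounded memoryless detector with test statistic formed from a nonlinearity g and threshold τ (the symbols g, τ, and Λ below are illustrative notation, not taken from the paper): for independent data the standard Chernoff bound gives

\[
P\!\left(\sum_{i=1}^{n} g(X_i) \ge n\tau\right)
\;\le\;
\exp\!\left(-\,n \sup_{s \ge 0}\bigl[s\tau - \Lambda(s)\bigr]\right),
\qquad
\Lambda(s) = \log \mathbb{E}\!\left[e^{s\,g(X_1)}\right],
\]

so the error exponent is the Legendre transform of the cumulant generating function of g(X_1). In standard large deviations treatments of Markov dependence, Λ(s) is replaced by the logarithm of the largest eigenvalue of a tilted transition kernel such as K_s(x, dy) = e^{s g(y)} K(x, dy); the least-favorable dependency structure is then the member of the contamination class that minimizes the resulting exponent.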