Abstract

This paper addresses a situation of practical importance for the analysis of experimental data via a Neural Network (NN) or similar device: Let $N$ events be given, such that $N = N_s + N_b$, where $N_s$ is the number of signal events, $N_b$ the number of background events, and both are unknown. Assume that an NN has been trained so that it tags signal events with efficiency $F_s$ ($0 < F_s < 1$) and background events with efficiency $F_b$ ($0 < F_b < 1$). Applying the NN yields $N_Y$ tagged events. We demonstrate that knowledge of $N_Y$ is sufficient to calculate confidence bounds for the signal likelihood which have the same statistical interpretation as the Clopper-Pearson bounds for the well-studied case of direct signal observation. Subsequently, we discuss rigorous bounds for the a posteriori distribution function of the signal probability, as well as for the (closely related) likelihood that there are $N_s$ signals in the data. We compare these with results obtained by starting from a maximum-entropy-type assumption for the a priori likelihood that there are $N_s$ signals in the data and applying Bayes' theorem.
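As a concrete illustration of the counting setup described above, the sketch below computes exact Clopper-Pearson bounds on the tagging probability $p = f_s F_s + (1 - f_s) F_b$ (with $f_s = N_s/N$) from the observed count $N_Y$, and then inverts them into bounds on the signal fraction. This is a minimal sketch of the idea only, not the paper's construction: the function names, the use of SciPy, and the direct linear inversion (which assumes $F_s > F_b$) are illustrative assumptions.

```python
from scipy.stats import beta


def clopper_pearson(k, n, alpha=0.05):
    """Exact (Clopper-Pearson) confidence interval for a binomial proportion,
    given k successes out of n trials, at confidence level 1 - alpha."""
    lo = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lo, hi


def signal_fraction_bounds(n_tagged, n_total, F_s, F_b, alpha=0.05):
    """Illustrative sketch (not the paper's exact construction): bound the
    tagging probability p = f_s*F_s + (1 - f_s)*F_b with Clopper-Pearson,
    then invert linearly to the signal fraction f_s = N_s / N.
    Assumes F_s > F_b; results are clipped to the physical range [0, 1]."""
    p_lo, p_hi = clopper_pearson(n_tagged, n_total, alpha)
    f_lo = max(0.0, (p_lo - F_b) / (F_s - F_b))
    f_hi = min(1.0, (p_hi - F_b) / (F_s - F_b))
    return f_lo, f_hi


# Example with made-up numbers: N = 1000 events, N_Y = 420 tagged,
# signal efficiency F_s = 0.9, background mistag rate F_b = 0.3.
print(signal_fraction_bounds(420, 1000, F_s=0.9, F_b=0.3))
```

Note that this naive inversion only conveys the flavor of the result: the paper's point is that the resulting bounds on the signal fraction carry the same statistical interpretation as Clopper-Pearson bounds in the direct-observation case.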

