Abstract

Data processing in high energy physics experiments is a multi-tiered process in which raw detector signals are first processed locally into physics objects and then collated into event records that can be scrutinized by a fast online trigger system. The resulting selection of events is reconstructed and passed through a number of software filters before arriving at a final offline analysis, where hard physical constants are extracted. Although sophisticated statistical data analysis techniques are routinely employed in high energy physics, the use of statistical signal processing in the field has until now been rare. Our paper will begin with an overview of a typical high energy physics data acquisition system, outlining the technologies and tradeoffs involved at each stage. We will then argue that the dominant roles of model dependence and systematic errors in final physics analyses render statistical signal processing techniques largely inapplicable at this level. We observe, however, that at the low-level pattern recognition and event reconstruction stages, statistical signal processing techniques have been making inroads in high energy physics for a number of years, and examples from the literature will be cited. The viability of the technique for second-level triggers will be assessed. Parallels to other approaches, such as neural networks, will also be drawn. It will be argued that the falling cost of computing hardware favors the growth of statistical signal processing methods in high energy physics.
