Abstract

The track recognition process (tracking) plays a key role in event reconstruction in high energy physics. Our study is devoted to tracking for the proposed inner detector of the BES-III experiment, which has only three cylindrical GEM stations. This means that if a particle is registered at only two of the three stations, its track in the magnetic field cannot be reconstructed without additional information. Such information can be provided by the coordinates of the primary event vertex, from which all tracks originate. Knowing the location of the primary vertex would also improve the precision of the particle momentum determination. In addition, it significantly reduces the algorithmic complexity of the track-candidate search, from O(n²) to O(n), and can improve the overall track reconstruction efficiency. It should also be noted that tracking is especially complicated for strip GEM detectors: their design specifics produce roughly two orders of magnitude more fake hits on top of the useful ones. To solve the vertex-finding problem for the inner detector of the BES-III experiment, we chose the deep convolutional neural network model Look Once On Tracks (LOOT), which processes the whole event at once, as an image, and after proper training can predict the coordinates of the primary vertex. In this work, preliminary results of primary vertex prediction on BES-III simulated data using the LOOT model are presented.
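The claimed complexity reduction can be illustrated with a toy sketch (not taken from the paper; hit coordinates and function names are illustrative assumptions): without a known vertex, every cross-station hit pair must be tried as a track seed, whereas a known vertex lets each hit be paired with the vertex directly.

```python
# Toy illustration of why a known primary vertex reduces track-candidate
# seeding from O(n^2) to O(n). All names and data here are hypothetical.
from itertools import product

def seeds_without_vertex(hits_station1, hits_station2):
    # No vertex known: try every cross-station hit pair as a seed,
    # giving O(n1 * n2) candidates (quadratic for comparable n).
    return [(h1, h2) for h1, h2 in product(hits_station1, hits_station2)]

def seeds_with_vertex(vertex, hits_station1):
    # Vertex known: the vertex supplies the second point, so each hit
    # yields exactly one seed - O(n1) candidates.
    return [(vertex, h1) for h1 in hits_station1]

# Hypothetical (radius, angle-like) hit coordinates on two GEM stations.
s1 = [(1.0, 0.1), (1.0, 0.4), (1.0, 0.9)]
s2 = [(2.0, 0.2), (2.0, 0.8)]
vertex = (0.0, 0.0)

print(len(seeds_without_vertex(s1, s2)))   # 6 pairs: grows quadratically
print(len(seeds_with_vertex(vertex, s1)))  # 3 seeds: grows linearly
```

With real GEM data the effect is amplified by fake hits, since every fake hit multiplies the pair count in the quadratic case but adds only one candidate in the linear case.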
