Abstract

Semiconductor detectors for high-energy sensing (X/γ-rays) play a critical role in fields such as astronomy, particle physics, spectroscopy, medical imaging, and homeland security. The increasing need for precise detector characterization highlights the importance of developing advanced digital twins, which help optimize the design and performance of imaging systems. Current simulation frameworks primarily focus on modeling the electron-hole pair dynamics within the semiconductor bulk after photon absorption and the resulting current signals induced at the nearby electrodes. However, because of the complexity they add to the physical models, most simulations neglect charge diffusion and Coulomb repulsion, which spatially expand the charge cloud as it propagates. Although these effects are relatively weak, their inclusion is essential for a high-fidelity replication of real detector behavior. A few existing methods incorporate both phenomena at minimal computational cost, notably those developed by Gatti (1987) and by Benoit and Hamel (2009). The present work evaluates these two approaches and proposes a novel Monte Carlo technique that offers higher accuracy in exchange for increased computational time, enabling more realistic performance predictions while remaining within practical computational limits.
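
The abstract gives no implementation details, so the sketch below is only a generic illustration, under standard drift-diffusion assumptions, of the kind of Monte Carlo step such a method involves: each carrier takes a Gaussian diffusion step of standard deviation sqrt(2*D*dt) per axis (Einstein relation D = mu*kB*T/q) plus a drift step in the combined external and mutual Coulomb field. The function name propagate_cloud and all parameter values are hypothetical and are not taken from the paper.

```python
import numpy as np

# Physical constants (SI units)
Q = 1.602176634e-19      # elementary charge [C]
KB = 1.380649e-23        # Boltzmann constant [J/K]
EPS0 = 8.8541878128e-12  # vacuum permittivity [F/m]

def propagate_cloud(pos, mu, temperature, eps_r, e_field, dt, n_steps, rng=None):
    """Illustrative Monte Carlo propagation of a charge cloud with drift,
    thermal diffusion, and mutual Coulomb repulsion (O(N^2) pairwise sum).

    Carriers are treated as positive (holes); for electrons, negate the
    external-field drift term (the mutual repulsion stays outward either way).

    pos         : (N, 3) initial carrier positions [m]
    mu          : carrier mobility [m^2/(V s)]
    temperature : lattice temperature [K]
    eps_r       : relative permittivity of the semiconductor
    e_field     : external drift field [V/m], length-3 vector
    dt, n_steps : time step [s] and number of steps
    """
    rng = np.random.default_rng() if rng is None else rng
    pos = np.asarray(pos, dtype=float).copy()
    e_field = np.asarray(e_field, dtype=float)

    diff = mu * KB * temperature / Q           # Einstein relation D = mu*kB*T/q
    sigma = np.sqrt(2.0 * diff * dt)           # diffusion step std dev per axis
    coul = Q / (4.0 * np.pi * EPS0 * eps_r)    # Coulomb prefactor q/(4*pi*eps)

    for _ in range(n_steps):
        # Pairwise separations r_i - r_j and squared distances
        delta = pos[:, None, :] - pos[None, :, :]
        r2 = np.einsum('ijk,ijk->ij', delta, delta)
        np.fill_diagonal(r2, np.inf)           # no self-interaction
        # Field on carrier i from all others: q/(4*pi*eps) * delta / r^3
        e_rep = coul * np.sum(delta / r2[..., None] ** 1.5, axis=1)
        # Drift in total field plus Gaussian diffusion step
        pos += mu * (e_field + e_rep) * dt + rng.normal(0.0, sigma, pos.shape)
    return pos

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical example: ~1000 carriers starting in a ~1 um Gaussian ball,
    # CdTe-like parameters (mobility, permittivity) chosen for illustration only
    start = rng.normal(0.0, 1e-6, size=(1000, 3))
    end = propagate_cloud(start, mu=0.1, temperature=300.0, eps_r=11.0,
                          e_field=[0.0, 0.0, 1e5], dt=1e-12, n_steps=200,
                          rng=rng)
    rms = np.sqrt(np.mean(np.sum((end - end.mean(axis=0)) ** 2, axis=1)))
    print(f"rms cloud radius after 200 ps: {rms:.3e} m")
```

The brute-force pairwise sum is what makes this kind of approach more expensive than the analytic Gatti or Benoit-Hamel treatments; a production implementation would likely need a faster field summation or a coarser statistical model, which the sketch deliberately omits.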

