Semiconductor detectors for high-energy photon sensing (X- and γ-rays) play a critical role in fields such as astronomy, particle physics, spectroscopy, medical imaging, and homeland security. The growing need for precise detector characterization highlights the importance of advanced digital twins, which help optimize the design and performance of imaging systems. Current simulation frameworks focus primarily on modeling the dynamics of electron–hole pairs in the semiconductor bulk after photon absorption, from which the current signals induced at the nearby electrodes are derived. However, because of the complexity they add to the physical models, most simulations neglect charge diffusion and Coulomb repulsion, two effects that spatially expand the charge cloud as it propagates. Although these effects are relatively weak, including them is essential for a high-fidelity replication of real detector behavior. A few existing methods incorporate both phenomena at minimal computational cost, notably those developed by Gatti in 1987 and by Benoit and Hamel in 2009. The present work evaluates these two approaches and proposes a novel Monte Carlo technique that offers higher accuracy in exchange for increased computational time. Our new method enables more realistic performance predictions while remaining within practical computational limits.
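To make the two neglected effects concrete, the sketch below shows a minimal Monte Carlo simulation of a charge cloud expanding under diffusion and mutual Coulomb repulsion. It is not the method proposed in this work: all parameters (CdTe-like permittivity and electron mobility, pair count, initial cloud radius, time step) are illustrative assumptions, the applied drift field, holes, trapping, and electrode signal induction are omitted, and the electron cloud is represented by weighted macro-particles to keep the pairwise O(N²) repulsion calculation tractable.

```python
import numpy as np

# ---- illustrative transport parameters (assumed, CdTe-like; not from the paper) ----
Q_E  = 1.602e-19            # elementary charge [C]
EPS  = 11.0 * 8.854e-12     # permittivity, eps_r ~ 11 [F/m]
MU   = 0.1                  # electron mobility [m^2/(V s)]
KT_Q = 0.0259               # thermal voltage kT/q at 300 K [V]
D    = MU * KT_Q            # diffusion coefficient (Einstein relation) [m^2/s]

N_PAIRS = 10_000            # e-h pairs created by the photon (assumed)
N_MACRO = 500               # macro-particles representing the electron cloud
WEIGHT  = N_PAIRS / N_MACRO # elementary charges carried by each macro-particle

DT      = 1e-11             # time step [s]
N_STEPS = 1000              # total simulated time: 10 ns
SIGMA0  = 5e-6              # initial cloud radius [m] (assumed)

rng = np.random.default_rng(0)
pos = rng.normal(0.0, SIGMA0, size=(N_MACRO, 3))   # initial Gaussian cloud

def coulomb_drift(pos):
    """Displacement from mutual Coulomb repulsion over one time step."""
    r = pos[:, None, :] - pos[None, :, :]          # pairwise separation vectors
    d2 = np.sum(r * r, axis=-1)                    # squared distances
    np.fill_diagonal(d2, np.inf)                   # exclude self-interaction
    # field at particle i: E_i = sum_j w*q/(4*pi*eps) * r_ij / |r_ij|^3
    e_field = (WEIGHT * Q_E / (4 * np.pi * EPS)) * np.sum(
        r / d2[..., None] ** 1.5, axis=1)
    return MU * e_field * DT                       # drift: v = mu * E

for _ in range(N_STEPS):
    pos += coulomb_drift(pos)                                 # repulsion term
    pos += rng.normal(0.0, np.sqrt(2 * D * DT), pos.shape)    # diffusion term

rms = np.sqrt(np.mean(np.sum(pos**2, axis=1)))
print(f"RMS cloud radius after {N_STEPS * DT * 1e9:.0f} ns: {rms * 1e6:.2f} um")
```

Each step superposes a deterministic repulsive drift (mobility times the pairwise Coulomb field) and an isotropic Gaussian diffusion kick of standard deviation sqrt(2·D·dt) per axis; the growth of the RMS radius over time illustrates why omitting these terms underestimates the lateral extent of the collected charge.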