Abstract

The Compton camera has been proposed as an imaging tool in astronomy, industry, homeland security, and medical imaging, where the source location can be identified from detected Compton-scattered and photoabsorbed photons. However, due to the geometrical complexity of the problem, image reconstruction from Compton camera data is difficult and/or ineffective when using techniques such as filtered backprojection (FBP) or maximum-likelihood expectation maximization (MLEM). In this paper, we propose a novel stochastic origin ensembles (SOE) approach based on Markov chains (previously implemented and tested for PET) for the reconstruction of Compton camera data. The advantages of this method are that it is universal (i.e., it works with any camera geometry), requires no rebinning of the acquired data, is parallelizable, and needs neither forward- and back-projection operations nor voxelization of the image space. During image reconstruction, the origin of each measured event is randomly assigned a location on its conical surface, which is the Compton camera analog of the line of response in PET. The image is therefore defined as an event ensemble holding the coordinates of all possible event origins. During the course of the reconstruction, event origins are randomly moved, and acceptance of a new origin is determined by an acceptance probability proportional to the change in event density; for example, if the event density at the new location is higher than at the previous location, the new position is always accepted. After many iterations, the reconstructed image converges to a quasi-stationary state, which can be voxelized and displayed. Comparison with list-mode MLEM shows that the SOE algorithm achieves similar image quality while clearly outperforming it in reconstruction time.
Corrections for detector energy resolution can be implemented at almost no additional computational cost, which is a major advantage of SOE over other reconstruction methods.
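The update rule described above (randomly move an event's origin along its allowed surface, then accept with probability proportional to the density ratio) can be illustrated with a toy sketch. The code below is not the authors' implementation: it uses a 1D interval per event as a stand-in for the conical surface, a simple histogram as the density estimate, and a Metropolis-style acceptance ratio `(n_new + 1) / n_old`; all function names and parameters are hypothetical.

```python
import random

def soe_reconstruct(intervals, n_bins=20, n_iters=2000, seed=0):
    """Toy 1D stochastic origin ensembles (SOE) sketch.

    Each event's true origin is only known to lie somewhere on
    intervals[i] = (lo, hi), a 1D stand-in for the conical surface
    that a Compton camera provides per event. Positions are assumed
    to lie in [0, 1].
    """
    rng = random.Random(seed)
    # Start: assign each event a random origin on its allowed interval.
    origins = [rng.uniform(lo, hi) for lo, hi in intervals]

    def bin_of(x):
        # Histogram bin index used as a crude event-density estimate.
        return min(int(x * n_bins), n_bins - 1)

    counts = [0] * n_bins
    for x in origins:
        counts[bin_of(x)] += 1

    for _ in range(n_iters):
        i = rng.randrange(len(origins))          # pick a random event
        lo, hi = intervals[i]
        proposal = rng.uniform(lo, hi)           # propose a move on its "cone"
        b_old, b_new = bin_of(origins[i]), bin_of(proposal)
        if b_old == b_new:
            origins[i] = proposal                # same bin: density unchanged
            continue
        # Metropolis-style acceptance: ratio of event densities at the
        # proposed vs. current location. A move to a denser region
        # (ratio >= 1) is always accepted.
        p = min(1.0, (counts[b_new] + 1) / counts[b_old])
        if rng.random() < p:
            counts[b_old] -= 1
            counts[b_new] += 1
            origins[i] = proposal

    # The ensemble of origins is the image; `counts` is its voxelized view.
    return origins, counts
```

After many iterations the origins drift toward mutually consistent, high-density regions, which is the quasi-stationary state the abstract refers to; in the real algorithm the proposal step moves the origin over the event's conical surface in 3D rather than along an interval.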
