Abstract
For correct numerical interpretation of tomographic images, i.e., estimation of the attenuation coefficients of objects, it is important to obtain a high-quality reconstruction, which depends directly on the methods used to process the experimental data. The data processing flow begins with preparing the data for the reconstruction algorithm. This necessary stage includes subtraction of the black (dark) field, normalization with respect to empty (flat-field) data, and taking the logarithm. However, this stage alone is not sufficient for high-quality reconstruction when working with real data, since real data are not ideal: they contain noise and distortions caused by changes in the geometric parameters of the setup during the experiment. We have analyzed two possible types of data distortion arising during an experiment and suggested corrections for them. The first corrects thermal shifts that lead to beam decentering, and the second eliminates the effect of the polychromatic nature of X-ray radiation on the results of tomographic reconstruction. These methods were tested on both real and synthetic data, and both kinds of experiments show that the suggested methods improve reconstruction quality. In real experiments, the automatic parameter adjustment agrees with expert assessments in about 90% of cases.
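The preparation stage named in the abstract (dark-field subtraction, flat-field normalization, and the logarithm) follows the standard Beer–Lambert preprocessing used in X-ray tomography. The sketch below illustrates these steps only; the function name, the `eps` guard, and the clipping are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def preprocess_projection(raw, dark, flat, eps=1e-6):
    """Illustrative tomographic preprocessing (not the authors' code):
    dark-field subtraction, flat-field normalization, log transform."""
    # Subtract the black (dark) field from both the projection
    # and the empty-beam (flat-field) image.
    num = raw.astype(np.float64) - dark
    den = flat.astype(np.float64) - dark
    # Normalize by the flat field; clip to avoid dividing by, or taking
    # the log of, non-positive values produced by detector noise.
    transmission = np.clip(num / np.maximum(den, eps), eps, None)
    # Beer-Lambert law: the line integral of attenuation is -log(I / I0).
    return -np.log(transmission)
```

For a projection identical to the flat field this returns zero attenuation, and for intensities attenuated by a factor of e it returns one, as expected from the Beer–Lambert relation.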
Published in: Optoelectronics, Instrumentation and Data Processing