The use of modern positron emission tomography scanners, in particular those with digital detectors, yields images of better quality, improves the detection of small pathological lesions, and reduces both scanning time and the activity administered to the patient, which in turn lowers the patient dose. However, the values of quantitative image parameters shift upward, which can lead to significant differences from the quantitative assessment obtained on a previous-generation device. To compare quantitative assessments obtained on different generations of PET/CT, it is necessary to harmonise quantitative image parameters and to perform regular quality control. The aim of the present work is to compare different methods of harmonising quantitative image parameters, using the harmonisation of two PET/CT scanners, Biograph mCT 128 and Biograph Vision 600, as an example. A NEMA IEC Body phantom filled with 18F solution was scanned in list mode in two bed positions with overlap in the sphere region, for five minutes per bed position. The recovery coefficient used for harmonisation was measured for each sphere of the phantom. Harmonisation between the Vision and the mCT was performed by two methods: selection of harmonised reconstruction parameters and EQ.PET technology. The acceptable divergence between the recovery coefficients of the Vision and the mCT is ±10% (a 20% range). The recovery coefficients measured for the reconstruction with 4 iterations and 5 subsets, ToF+PSF, a 7 mm Gaussian filter, and a 220x220 matrix fit entirely within the 20% range. The recovery coefficients measured with EQ = 6 mm (the optimal value) fit within the 20% range except for the spheres with diameters of 10 and 13 mm. Both harmonisation methods bring the quantitative assessments closer together; however, EQ.PET has limitations for small lesions. Selecting harmonised reconstruction parameters is the most widely used harmonisation method, whereas EQ.PET allows quantitative assessments to be harmonised without the use of multiple reconstruction protocols and without loss of visualisation quality.
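As an illustration of the acceptance criterion described above, the following is a minimal Python sketch of how per-sphere recovery coefficients could be computed and compared against the ±10% band. The function names, VOI mean values, and true activity concentration below are hypothetical placeholders and are not taken from the study.

```python
# Minimal sketch (not from the paper): recovery coefficients per sphere and
# a ±10% agreement check between two scanners. All numeric values are
# hypothetical placeholders for illustration only.

SPHERE_DIAMETERS_MM = [10, 13, 17, 22, 28, 37]  # NEMA IEC Body phantom spheres


def recovery_coefficient(measured_conc_kbq_ml: float, true_conc_kbq_ml: float) -> float:
    """RC = measured activity concentration in the sphere VOI / true concentration."""
    return measured_conc_kbq_ml / true_conc_kbq_ml


def within_band(rc_test: float, rc_reference: float, tolerance: float = 0.10) -> bool:
    """True if the test scanner's RC lies within ±tolerance of the reference RC."""
    return abs(rc_test - rc_reference) <= tolerance * rc_reference


# Hypothetical VOI means (kBq/mL) and true concentration, for illustration only.
true_conc = 20.0
vision_means = {10: 11.0, 13: 13.5, 17: 15.8, 22: 17.2, 28: 18.4, 37: 19.0}
mct_means = {10: 9.0, 13: 11.8, 17: 15.0, 22: 16.8, 28: 18.0, 37: 18.7}

for d in SPHERE_DIAMETERS_MM:
    rc_vision = recovery_coefficient(vision_means[d], true_conc)
    rc_mct = recovery_coefficient(mct_means[d], true_conc)
    ok = within_band(rc_vision, rc_mct)
    print(f"{d:>2} mm sphere: RC(Vision)={rc_vision:.2f}, RC(mCT)={rc_mct:.2f}, within ±10%: {ok}")
```

Such a per-sphere check mirrors the comparison reported in the abstract, where the smallest spheres (10 and 13 mm) are the ones most likely to fall outside the agreement band.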