Abstract

Maximum-likelihood (ML) reconstruction algorithms are asymptotically unbiased and achieve the lowest possible variance, the Cramér-Rao lower bound (CRLB), in the limit of an infinite number of counts and iterations. This result, however, does not hold for a finite number of counts or iterations. In this study, we focus on the two-dimensional ordered-subsets expectation-maximization (2D OSEM) algorithm with a finite number of counts and iterations, and investigate the question: given its bias, does this algorithm achieve the minimum variance predicted by the Cramér-Rao lower bound? We found a threshold below which the variance significantly exceeds the biased CRLB. We also found that above this threshold, the variance almost equals the biased CRLB, even for a finite number of iterations and in cold regions. Further analysis is needed to determine the reason for the observed difference, which might indicate either that an algorithm with a smaller variance than OSEM exists for the same bias, or that a tighter lower bound could be found.
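To make the algorithm under study concrete, the following is a minimal sketch of one OSEM pass for emission tomography, under generic assumptions not taken from the paper: a dense system matrix `A` (detector bins × image pixels), a measured count vector `y`, and interleaved, equal-size subsets of projections. All names and sizes are illustrative.

```python
import numpy as np

def osem_pass(lam, A, y, n_subsets):
    """One full pass of ordered-subsets EM (OSEM).

    lam       : current nonnegative image estimate (n_pixels,)
    A         : system matrix, A[i, j] = probability that an emission
                in pixel j is detected in bin i (n_bins, n_pixels)
    y         : measured counts per detector bin (n_bins,)
    n_subsets : number of ordered subsets (n_subsets = 1 gives plain ML-EM)
    """
    n_bins = A.shape[0]
    for s in range(n_subsets):
        rows = np.arange(s, n_bins, n_subsets)  # interleaved subset of bins
        As = A[rows]
        proj = As @ lam                          # forward projection on the subset
        ratio = y[rows] / np.maximum(proj, 1e-12)
        sens = np.maximum(As.sum(axis=0), 1e-12) # subset sensitivity per pixel
        lam = lam * (As.T @ ratio) / sens        # multiplicative EM update
    return lam
```

For a simulation study of bias and variance such as the one described here, this update would be run for a fixed number of passes over many Poisson noise realizations of `y`, and the sample variance per pixel compared against the biased CRLB, which in the scalar case reads Var(θ̂) ≥ (1 + b′(θ))² / I(θ), with b(θ) the estimator's bias and I(θ) the Fisher information.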
