Groupwise registration has recently been introduced to register a group of images simultaneously, avoiding the selection of a particular template. To this end, several methods have been proposed that exploit information-theoretic entropy measures based on image intensity. However, voxelwise image intensity alone is not sufficient to establish reliable correspondences, since it lacks important contextual information. We therefore explore the notion of an attribute vector, instead of image intensity, as the voxel signature guiding correspondence detection in groupwise registration. In particular, for each voxel, the attribute vector is computed from its multi-scale neighborhoods in order to capture geometric information at different scales. The probability density function (PDF) of each element of the attribute vector is then estimated from the local neighborhood, providing a statistical summary of the underlying anatomical structure in that local pattern. Finally, with the help of Jensen–Shannon (JS) divergence, a group of subjects can be aligned simultaneously by minimizing the sum of JS divergences across the image domain and all attributes. We have evaluated our groupwise registration algorithm on both real (the NIREP NA0 data set) and simulated data (12 pairs of normal control and simulated atrophic images). The experimental results demonstrate that our method yields better registration accuracy than a popular groupwise registration method.
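The group alignment criterion above rests on the generalized Jensen–Shannon divergence of the per-subject PDFs. As a minimal sketch (not the authors' implementation; the function names and the assumption of discrete, histogram-style PDFs are our own for illustration), the quantity minimized at each voxel and attribute can be computed as the entropy of the mixture minus the mean of the individual entropies:

```python
import numpy as np

def entropy(p, eps=1e-12):
    """Shannon entropy of a discrete PDF (natural log)."""
    p = np.clip(p, eps, None)
    return -np.sum(p * np.log(p))

def js_divergence(pdfs, weights=None):
    """Generalized JS divergence of N discrete PDFs:
    JS = H(sum_i w_i * p_i) - sum_i w_i * H(p_i).
    Zero iff all PDFs coincide, i.e. the group is aligned.
    """
    pdfs = np.asarray(pdfs, dtype=float)
    n = pdfs.shape[0]
    if weights is None:
        weights = np.full(n, 1.0 / n)  # equal weighting across subjects
    mixture = weights @ pdfs           # pooled (group-mean) distribution
    return entropy(mixture) - sum(w * entropy(p) for w, p in zip(weights, pdfs))

# Identical local PDFs across subjects give zero divergence (aligned);
# disjoint PDFs give the maximum, log(N) for N subjects.
p = np.array([0.25, 0.25, 0.5])
print(js_divergence([p, p, p]))
```

In the registration setting, this scalar would be summed over all voxels and all attribute-vector elements, and the deformations driven to reduce that total.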