Abstract

A non-rigid registration method is presented for aligning pre-procedural magnetic resonance (MR) images, with delineated suspicious regions, to intra-procedural 3D transrectal ultrasound (TRUS) images in TRUS-guided prostate biopsy. In the first step, the 3D MR and TRUS images are aligned rigidly using six pairs of manually identified approximate matching points on the boundary of the prostate. The two image volumes are then non-rigidly registered using a finite element method (FEM)-based linear elastic deformation model. A vector of observation prediction errors at selected points of interest within the prostate volume is computed using an intensity-based similarity metric, the modality independent neighborhood descriptor (MIND). The error vector is employed in a classical state estimation framework to estimate the prostate deformation between the MR and TRUS images. The points of interest are identified in the MR images using speeded-up robust features (SURF), which are scale- and rotation-invariant descriptors. Evaluated on 10 sets of prostate MR and TRUS images, the proposed registration method yielded a target registration error of 1.99±0.83 mm and 1.97±0.87 mm in the peripheral zone (PZ) and the whole gland (WG), respectively, using 68 manually identified fiducial points. The Dice similarity coefficient (DSC) was 87.9±2.9, 82.3±4.8, 93.0±1.7, and 84.2±6.2 percent for the WG, apex, mid-gland, and base of the prostate, respectively. Moreover, the mean absolute distance (MAD) between the WG surfaces in the TRUS and registered MR images was 1.6±0.3 mm. These results indicate the effectiveness of the proposed method in improving targeting accuracy in TRUS-guided prostate biopsy.
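
To illustrate the similarity metric mentioned above, the following is a minimal NumPy/SciPy sketch of the MIND self-similarity descriptor for a 3D volume. It is not the authors' implementation: the six-voxel neighborhood, patch radius, wrap-around boundary handling via np.roll, and the normalization constants are assumptions made for brevity.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def mind_descriptor(volume, patch_radius=1):
    """Sketch of the modality independent neighborhood descriptor (MIND).

    Returns a (6, X, Y, Z) array: one self-similarity channel per voxel
    for each of the six axis-aligned unit offsets.
    """
    offsets = [(1, 0, 0), (-1, 0, 0),
               (0, 1, 0), (0, -1, 0),
               (0, 0, 1), (0, 0, -1)]
    patch_size = 2 * patch_radius + 1

    distances = []
    for off in offsets:
        # Patch-wise sum of squared differences between the volume and its
        # shifted copy, computed with a box filter over the squared residual.
        # np.roll wraps at the borders, a simplification of proper padding.
        shifted = np.roll(volume, shift=off, axis=(0, 1, 2))
        ssd = uniform_filter((volume - shifted) ** 2, size=patch_size)
        distances.append(ssd)

    distances = np.stack(distances, axis=0)                    # (6, X, Y, Z)
    variance = np.clip(distances.mean(axis=0), 1e-6, None)     # local variance estimate
    mind = np.exp(-distances / variance)
    mind /= np.maximum(mind.max(axis=0), 1e-6)                 # normalize max channel to 1
    return mind

# Usage sketch: because MIND maps both modalities into a common
# self-similarity space, a simple dissimilarity between an MR and a TRUS
# volume is the mean absolute difference of their descriptors.
# dissimilarity = np.abs(mind_descriptor(mr) - mind_descriptor(trus)).mean()
```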
