Abstract
Statistical shape models of soft-tissue organ motion provide a useful means of imposing physical constraints on the displacements allowed during non-rigid image registration, and can be especially useful when registering sparse and/or noisy image data. In this paper, we describe a method for generating a subject-specific statistical shape model that captures prostate deformation for a new subject given independent population data on organ shape and deformation obtained from magnetic resonance (MR) images and biomechanical modelling of tissue deformation due to transrectal ultrasound (TRUS) probe pressure. The characteristics of the models generated using this method are compared with corresponding models based on training data generated directly from subject-specific biomechanical simulations using a leave-one-out cross validation. The accuracy of registering MR and TRUS images of the prostate using the new prostate models was then estimated and compared with published results obtained in our earlier research. No statistically significant difference was found between the specificity and generalisation ability of prostate shape models generated using the two approaches. Furthermore, no statistically significant difference was found between the landmark-based target registration errors (TREs) following registration using different models, with a median (95th percentile) TRE of 2.40 (6.19) mm versus 2.42 (7.15) mm using models generated with the new method versus a model built directly from patient-specific biomechanical simulation data, respectively (N = 800; 8 patient datasets; 100 registrations per patient). We conclude that the proposed method provides a computationally efficient and clinically practical alternative to existing complex methods for modelling and predicting subject-specific prostate deformation, such as biomechanical simulations, for new subjects. 
The method may also prove useful for generating shape models for other organs, for example, where only limited shape training data from dynamic imaging is available.
Highlights
Statistical shape models (SSMs) of soft-tissue organ motion provide a useful means of imposing physical constraints on the displacements allowed during non-rigid image registration, which is especially useful when registering sparse and/or noisy image data (Hawkes et al., 2005; Heimann and Meinzer, 2009).
A growing body of research has investigated a number of alternative solutions to the problem of non-rigid magnetic resonance (MR)-transrectal ultrasound (TRUS) registration of the prostate, including manual approaches (Kuru et al., 2012; Xu et al., 2008), intensity-based approaches (Mitra et al., 2012; Sun et al., 2013) and surface-based approaches (Narayanan et al., 2009; Sparks et al., 2013; van de Ven et al., 2015), which are commonly employed in commercial image guidance systems (Marks et al., 2013).
The SSM predicts the displacement of all internal points, providing a full 3D displacement field within the organ of interest that can be applied to deform the original MR image and, in particular, to determine the location of MR-visible lesions within the TRUS volume that are targeted during biopsy or treatment.
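The mechanics of such a model can be illustrated with a minimal PCA-based statistical deformation model: training displacement fields are stacked as vectors, principal modes of variation are extracted, and a new displacement field is synthesised as the mean plus a weighted sum of modes. This is a generic sketch of the technique, not the paper's implementation; all names, array shapes, and the random training data are illustrative assumptions.

```python
# Minimal sketch of a PCA-based statistical deformation model.
# Assumption: each training sample is a flattened 3D displacement field
# (x, y, z components for every internal point), one row per subject.
import numpy as np

def build_ssm(displacements):
    """displacements: (n_subjects, 3 * n_points) training matrix."""
    mean = displacements.mean(axis=0)
    centered = displacements - mean
    # Rows of vt are the principal modes of deformation, in decreasing
    # order of explained variance.
    _, s, vt = np.linalg.svd(centered, full_matrices=False)
    variances = s ** 2 / (len(displacements) - 1)
    return mean, vt, variances

def synthesize(mean, modes, coeffs):
    """New displacement field = mean + weighted sum of the first modes."""
    return mean + coeffs @ modes[: len(coeffs)]

# Illustrative data: 10 subjects, 50 internal points (150 components each).
rng = np.random.default_rng(0)
train = rng.normal(size=(10, 3 * 50))
mean, modes, var = build_ssm(train)
field = synthesize(mean, modes, np.array([1.0, -0.5]))
print(field.shape)  # (150,) — a full displacement field for all points
```

Constraining the coefficients (for example, to a few standard deviations of the corresponding mode variances) is what imposes the physically plausible deformations referred to above.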
Summary
Statistical shape models (SSMs) of soft-tissue organ motion provide a useful means of imposing physical constraints on the displacements allowed during non-rigid image registration, which is especially useful when registering sparse and/or noisy image data (Hawkes et al., 2005; Heimann and Meinzer, 2009). We have used this approach successfully in previous work to compensate for prostate deformation due to transrectal ultrasound (TRUS) probe pressure when registering MR and 3D TRUS images of the prostate in the context of MRI-tumour-targeted biopsy and minimally invasive surgical interventions (Hu et al., 2012, 2011). Information on the size, shape and location of a target lesion/tumour, as well as additional information, such as the location of vulnerable structures or surgical margins, both of which are important for treatment applications, can be embedded very naturally within such models by labelling the elements within the finite element model (FEM).