Abstract
In many clinical applications, 3D reconstruction of patient-specific structures is of major interest. Despite great effort put into 2D-3D reconstruction, the gold standard for bone reconstruction, segmentation of CT images, is still the most widely used approach, at the expense of exposing patients to significant ionizing radiation and increased healthcare costs. State-of-the-art 2D-3D reconstruction methods are based on non-rigid registration of digitally reconstructed radiographs (DRRs) and aim at full automation, but their accuracy varies and their errors often exceed clinical requirements. Conversely, contour-based approaches can lead to accurate results but depend strongly on the quality of the extracted contours and have been left aside in recent years. In this study, we revisit a patient-specific 2D-3D reconstruction method for the proximal femur based on contours, image cues, and knowledge-based deformable models. 3D statistical shape models were built from 199 CT scans of total hip arthroplasty (THA) patients, which were also used to generate pairs of high-fidelity DRRs. Convolutional neural networks were trained on the DRRs to investigate automatic contouring. Experiments were conducted on the DRRs and on calibrated radiographs of a pelvis phantom and of volunteers, with an analysis of contour quality and of the automation of contouring. Using manual contours and DRRs, the best reconstruction error was 1.02 mm. With state-of-the-art results for 2D-3D reconstruction of the proximal femur, we highlight the relevance and the challenges of contour-driven reconstruction for yielding patient-specific models.
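The knowledge-based deformable model referred to above rests on a PCA-based statistical shape model (SSM). The following is a minimal sketch of how such a model can be built from aligned training shapes and sampled along its principal modes; the variable names and the placeholder data are purely illustrative and are not the authors' implementation.

```python
import numpy as np

def build_ssm(shapes):
    """PCA-based SSM: returns mean shape, modes of variation, and per-mode std devs.

    `shapes` is assumed to be an (n_shapes, 3*n_points) array of rigidly
    aligned, corresponding vertex coordinates.
    """
    mean_shape = shapes.mean(axis=0)
    centered = shapes - mean_shape
    # SVD of the centered data yields the principal modes of shape variation
    _, singular_values, modes = np.linalg.svd(centered, full_matrices=False)
    std_devs = singular_values / np.sqrt(shapes.shape[0] - 1)
    return mean_shape, modes, std_devs

def generate_shape(mean_shape, modes, std_devs, weights):
    """New shape = mean + weighted sum of modes (weights in units of std dev)."""
    return mean_shape + (weights * std_devs) @ modes

# Example: perturb the mean shape along the first two modes.
rng = np.random.default_rng(0)
training = rng.normal(size=(199, 3 * 1000))  # placeholder for 199 aligned femur meshes
mean_shape, modes, std_devs = build_ssm(training)
weights = np.zeros(len(std_devs))
weights[:2] = [1.5, -0.5]
new_shape = generate_shape(mean_shape, modes, std_devs, weights).reshape(-1, 3)
```

In a contour-driven 2D-3D reconstruction, the mode weights (and a rigid pose) would be optimized so that the projected silhouette of the instantiated shape matches the extracted radiographic contours; the sketch above only covers the shape-generation step.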