Abstract

Image registration is a crucial technology in robot-assisted knee arthroplasty: it provides real-time patient information by registering pre-operative image data with data acquired during the operation. Existing registration methods require surgeons to manually select medical feature points (i.e. anatomical points) in pre-operative images, which is time-consuming and relies on the surgeon's experience. Moreover, different doctors have different preferences in preoperative planning, which may affect the consistency of surgical results. An automatic medical feature point extraction method based on PointNet++, named Point_RegNet, is proposed to improve the efficiency of preoperative preparation and ensure the consistency of surgical results. The proposed method replaces the classification and segmentation layers of PointNet++ with a regression layer that predicts the positions of the feature points. A comparative experiment is conducted to determine the optimal number of set abstraction layers in PointNet++, and the network with three set abstraction layers proves most suitable for extracting feature points. The mean prediction error of our method over the feature points is less than 5 mm, which is 1 mm lower than that of manual marking. In practical application, our method requires less than 3 s to extract all medical feature points, far faster than manual extraction, which usually takes more than half an hour to mark all necessary feature points. Our deep learning-based method can improve surgical accuracy and reduce preoperative preparation time, and it can also be applied to other surgical navigation systems.
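To illustrate the core idea described above (a PointNet++-style encoder whose classification/segmentation head is swapped for a coordinate-regression head), the sketch below is a minimal, hedged example. It is not the paper's implementation: PyTorch, the simplified per-point MLP encoder (standing in for PointNet++'s full set abstraction layers with farthest-point sampling and ball query), the class name `PointRegNetSketch`, and the choice of 10 feature points are all assumptions made for illustration.

```python
# Minimal sketch (assumptions: PyTorch, a simplified encoder in place of the
# real PointNet++ set abstraction backbone, 10 hypothetical feature points).
# The point being illustrated: the final layer regresses K * 3 coordinates
# instead of producing class scores or per-point segmentation labels.
import torch
import torch.nn as nn


class PointRegNetSketch(nn.Module):
    def __init__(self, num_feature_points: int = 10):
        super().__init__()
        self.num_feature_points = num_feature_points
        # Shared per-point MLP (stand-in for the set abstraction layers).
        self.encoder = nn.Sequential(
            nn.Conv1d(3, 64, 1), nn.BatchNorm1d(64), nn.ReLU(),
            nn.Conv1d(64, 128, 1), nn.BatchNorm1d(128), nn.ReLU(),
            nn.Conv1d(128, 1024, 1), nn.BatchNorm1d(1024), nn.ReLU(),
        )
        # Regression head replacing the classification/segmentation layer:
        # outputs K * 3 values, i.e. (x, y, z) for each anatomical feature point.
        self.head = nn.Sequential(
            nn.Linear(1024, 512), nn.ReLU(),
            nn.Linear(512, 256), nn.ReLU(),
            nn.Linear(256, num_feature_points * 3),
        )

    def forward(self, points: torch.Tensor) -> torch.Tensor:
        # points: (batch, N, 3) point cloud sampled from the pre-operative bone model
        x = self.encoder(points.transpose(1, 2))   # (batch, 1024, N)
        x = torch.max(x, dim=2).values             # global feature via max pooling
        coords = self.head(x)                      # (batch, K * 3)
        return coords.view(-1, self.num_feature_points, 3)


if __name__ == "__main__":
    # Training would minimise a distance loss between predicted and
    # surgeon-annotated feature points; mean squared error is used here
    # purely as an example with dummy data.
    model = PointRegNetSketch(num_feature_points=10)
    cloud = torch.randn(2, 2048, 3)    # dummy point clouds
    target = torch.randn(2, 10, 3)     # dummy ground-truth feature points
    loss = nn.functional.mse_loss(model(cloud), target)
    loss.backward()
    print(loss.item())
```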
