Abstract

Purpose
In medical applications, it is crucial to evaluate the geometric accuracy of rapid prototyping (RP) models. Current research on evaluating geometric accuracy has focused on identifying two or more specific anatomical landmarks on the original structure and the RP model and comparing the corresponding linear distances. Such accuracy metrics are ambiguous and may misrepresent the actual errors. The purpose of this paper is to propose an alternative method and metrics for measuring the accuracy of RP models.

Design/methodology/approach
The authors propose an accuracy metric composed of two complementary approaches. First, a global accuracy evaluation uses volumetric intersection indexes calculated over segmented computed tomography (CT) scans of the original object and the RP model. Second, a local error metric is computed from the surfaces of the original object and the RP model. This local error is rendered on a 3D surface using a color code that distinguishes regions where the model is overestimated, underestimated, or correctly estimated. Global and local error measurements are performed after rigid body registration, segmentation, and triangulation.

Findings
The results show that the method can be applied to different objects without modification and provides simple, meaningful, and precise quantitative indexes for measuring the geometric accuracy of RP models.

Originality/value
The paper presents a new approach to characterizing the geometric errors in RP models using global indexes and a local surface distribution of the errors. It requires minimal human intervention and can be applied without modification to any kind of object.
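As a minimal sketch of the global volumetric evaluation described above: a common choice of volumetric intersection index between two co-registered segmented volumes is the Jaccard (overlap) or Dice coefficient. The abstract does not name the specific index used, so treating the segmentations as boolean occupancy grids and computing Jaccard/Dice is an assumption for illustration.

```python
import numpy as np

def intersection_indexes(reference: np.ndarray, model: np.ndarray):
    """Global volumetric agreement between two co-registered binary volumes.

    `reference` and `model` are boolean occupancy grids obtained by
    segmenting CT scans of the original object and the RP model,
    assumed already aligned by rigid body registration.
    Returns (jaccard, dice); both lie in [0, 1], and 1.0 means the
    volumes are identical.
    """
    a = reference.astype(bool)
    b = model.astype(bool)
    intersection = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    jaccard = intersection / union
    dice = 2.0 * intersection / (a.sum() + b.sum())
    return float(jaccard), float(dice)

# Toy example: two 4x4x4 grids whose occupied slabs overlap by half.
ref = np.zeros((4, 4, 4), dtype=bool)
ref[:2] = True   # 32 occupied voxels
rp = np.zeros((4, 4, 4), dtype=bool)
rp[1:3] = True   # 32 occupied voxels, 16 shared with ref
j, d = intersection_indexes(ref, rp)
# j = 16/48, d = 32/64
```

The local error metric would instead operate on the triangulated surfaces (e.g., per-vertex signed distance to the nearest reference surface point), which is what drives the over/under-estimation color coding the abstract describes.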
