Abstract
A fully automated method is presented to classify 3-D CT data into material fractions. An analytical, scale-invariant description relating the data value to its derivatives around Gaussian-blurred step edges (the arch model) is applied. It uniquely combines robustness to noise, global signal fluctuations, anisotropic scale, and noncubic voxels with ease of use, through a straightforward segmentation of 3-D CT images by material fractions. Projection of the noisy data value and derivatives onto the arch yields a robust alternative to standard computed Gaussian derivatives, resulting in superior precision. The arch-model parameters are derived from a small but over-determined set of measurements (data values and derivatives) along a path that follows the gradient uphill and downhill starting at an edge voxel. The model is first used to identify the expected values of the two pure materials (named L and H) and thereby classify the boundary. Second, the model is used to approximate the underlying noise-free material fractions for each noisy measurement. An iso-surface of constant material fraction accurately delineates the material boundary in the presence of noise and global signal fluctuations. This approach enables straightforward segmentation of 3-D CT images into objects of interest for computer-aided diagnosis and offers an easy tool for designing otherwise complicated transfer functions for high-quality visualizations. The method is applied to segment a tooth volume for visualization and to perform digital cleansing for virtual colonoscopy.
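To make the edge model concrete, the equations below sketch the standard Gaussian-blurred step edge between pure material values L and H, which is the setting an arch-type model describes. The position x along the gradient direction, the blur scale \sigma, and the standard normal CDF \Phi are notation introduced here for illustration only; the exact parameterization used in the paper may differ.

    f(x) = L + (H - L)\,\Phi\!\left(\frac{x}{\sigma}\right),
    \qquad
    f'(x) = \frac{H - L}{\sigma\sqrt{2\pi}}\,\exp\!\left(-\frac{x^{2}}{2\sigma^{2}}\right).

Eliminating x relates the gradient magnitude directly to the data value, tracing an arch in the (f, f') plane whose feet lie at f = L and f = H. Under the usual two-material partial-volume assumption, the fraction of material H at a noise-free value f is \alpha = (f - L)/(H - L), so an iso-surface of constant \alpha (for example \alpha = 1/2) delineates the material boundary as described above.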