Additive manufacturing (AM), or 3D printing, has revolutionized the fabrication of complex parts, but assessing their quality remains a challenge. Quality assessment, especially of the interior part geometry, relies on post-print inspection techniques unsuitable for real-time in situ analysis. Vision-based approaches could capture images of any layer during fabrication, and segmentation methods could then identify in-layer features to establish dimensional conformity and detect defects for in situ evaluation of the overall part quality. This research evaluated five image segmentation methods (simple thresholding, adaptive thresholding, Sobel edge detector, Canny edge detector, and watershed transform) on the same platform for their effectiveness in isolating and identifying features in 3D-printed layers under different contrast conditions for in situ quality assessment. The performance metrics used were accuracy, precision, recall, and the Jaccard index. The experimental setup was based on an open-frame fused filament fabrication printer augmented with a vision system. The control system software for printing and imaging (acquisition and processing) was custom-developed in Python, running on a Raspberry Pi. Most of the segmentation methods reliably segmented the external geometry and high-contrast internal features. The simple thresholding, Canny edge detector, and watershed transform methods did not perform well with low-contrast parts and could not reliably segment internal features when the previous layer was visible. The adaptive thresholding and Sobel edge detector methods segmented both high- and low-contrast features; however, their outputs were heavily affected by textural and image noise. The research identified the factors affecting the performance and the limitations of these segmentation methods, contributing to the broader effort of improving in situ quality assessment in AM, including automatic dimensional analysis of internal and external features and the overall geometry.
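The abstract does not describe the authors' actual pipeline or parameter choices; the following is a minimal sketch, assuming OpenCV in Python, of how the five named segmentation methods could be applied to a grayscale layer image and scored with the reported metrics (accuracy, precision, recall, Jaccard index). The file names, threshold values, and helper functions are illustrative assumptions, not the paper's implementation.

```python
# Hypothetical sketch: five segmentation methods applied to a layer image,
# scored pixel-wise against a manually prepared ground-truth mask.
import cv2
import numpy as np

def segment_all(gray):
    """Return a dict of binary masks (uint8, 0/255), one per method."""
    results = {}

    # 1. Simple (global) thresholding, with Otsu's method choosing the level.
    _, simple = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    results["simple_threshold"] = simple

    # 2. Adaptive thresholding: the threshold follows local neighbourhood contrast.
    results["adaptive_threshold"] = cv2.adaptiveThreshold(
        gray, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C, cv2.THRESH_BINARY, 31, 5)

    # 3. Sobel edge detector: gradient magnitude reduced to a binary edge map.
    gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)
    mag = cv2.convertScaleAbs(np.hypot(gx, gy))
    _, results["sobel"] = cv2.threshold(mag, 0, 255,
                                        cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    # 4. Canny edge detector; hysteresis thresholds here are illustrative only.
    results["canny"] = cv2.Canny(gray, 50, 150)

    # 5. Watershed transform seeded from a distance transform of the Otsu mask.
    dist = cv2.distanceTransform(simple, cv2.DIST_L2, 5)
    _, sure_fg = cv2.threshold(dist, 0.5 * dist.max(), 255, cv2.THRESH_BINARY)
    sure_fg = sure_fg.astype(np.uint8)
    unknown = cv2.subtract(simple, sure_fg)
    _, markers = cv2.connectedComponents(sure_fg)
    markers = markers + 1          # background becomes label 1, regions 2..N
    markers[unknown == 255] = 0    # unknown pixels are left for watershed to assign
    markers = cv2.watershed(cv2.cvtColor(gray, cv2.COLOR_GRAY2BGR), markers)
    results["watershed"] = np.where(markers > 1, 255, 0).astype(np.uint8)

    return results

def score(pred, truth):
    """Pixel-wise accuracy, precision, recall, and Jaccard index for binary masks."""
    p, t = pred > 0, truth > 0
    tp = np.sum(p & t); fp = np.sum(p & ~t)
    fn = np.sum(~p & t); tn = np.sum(~p & ~t)
    return {
        "accuracy": (tp + tn) / p.size,
        "precision": tp / (tp + fp) if tp + fp else 0.0,
        "recall": tp / (tp + fn) if tp + fn else 0.0,
        "jaccard": tp / (tp + fp + fn) if tp + fp + fn else 0.0,
    }

if __name__ == "__main__":
    layer = cv2.imread("layer.png", cv2.IMREAD_GRAYSCALE)       # captured layer image
    truth = cv2.imread("layer_mask.png", cv2.IMREAD_GRAYSCALE)  # ground-truth mask
    for name, mask in segment_all(layer).items():
        print(name, score(mask, truth))
```

Edge-based outputs (Sobel, Canny) yield boundary maps rather than filled regions, so in practice they would need contour closing or filling before region-level comparison; the direct pixel-wise scoring above is kept only to illustrate how the four metrics relate binary predictions to ground truth.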