Abstract

Surface defects generated during abrasive belt grinding significantly affect the service performance of parts, so precise extraction of surface defect features helps to improve grinding quality. However, for belt-ground surfaces with complex texture features, effectively extracting and quantitatively characterizing defect features is a challenging task. Therefore, this research proposes a quantitative explanation approach for surface scratch defects in abrasive belt grinding based on deep learning theory. To handle challenges such as the complex texture background of the ground surface and the varying size, area, and depth of defects, an automatic segmentation model (FCSNet) that concentrates on multiscale channel and spatial attention information is proposed. To focus on multiscale channel and spatial information and accurately detect defect features, a residual atrous convolutional pyramid module with a channel and spatial dual attention module (RAPCS) is constructed. To capture global, long-range context and emphasize target regions, a convolutional block attention module (CBAM) concatenation block is inserted into the skip connections between the encoder path and the decoder path. To address the foreground-background imbalance caused by very small segmentation targets, a Focal Tversky hybrid loss function is adopted to guide the model toward the difficult regions of interest (ROIs). At the end of the network, Grad-CAM is applied and statistical knowledge is incorporated to provide a qualitative and quantitative visual explanation of the segmentation results. Finally, a dataset of surface scratch defects in abrasive belt grinding was established, and numerous experiments were carried out on it.
The results demonstrate that the proposed model has excellent segmentation performance and defect quantification ability, achieving 98.80% Accuracy, 81.72% Recall, 81.81% Precision, 81.42% F1-score, and 81.87% mIoU.
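The abstract does not give the exact formulation of the hybrid loss, but the standard Focal Tversky loss it builds on is well documented: a Tversky index that weights false negatives and false positives asymmetrically, raised to a focusing exponent so that hard, small foreground regions dominate the gradient. A minimal NumPy sketch follows; the parameter values (alpha, beta, gamma) are illustrative defaults from the Focal Tversky literature, not values reported by this paper.

```python
import numpy as np

def focal_tversky_loss(pred, target, alpha=0.7, beta=0.3, gamma=0.75, eps=1e-7):
    """Focal Tversky loss for binary segmentation.

    pred:   predicted foreground probabilities, any shape
    target: binary ground-truth mask, same shape as pred
    alpha weights false negatives, beta weights false positives;
    gamma < 1 sharpens the focus on hard, small ROIs.
    """
    pred, target = pred.ravel(), target.ravel()
    tp = np.sum(pred * target)           # soft true positives
    fn = np.sum((1.0 - pred) * target)   # soft false negatives
    fp = np.sum(pred * (1.0 - target))   # soft false positives
    tversky = (tp + eps) / (tp + alpha * fn + beta * fp + eps)
    return (1.0 - tversky) ** gamma
```

Setting alpha > beta penalizes missed foreground pixels more than spurious ones, which is the usual choice when the segmented target (here, a thin scratch) occupies a tiny fraction of the image.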
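Grad-CAM, used here for the visual explanation step, follows a standard recipe: weight each channel of a convolutional feature map by the spatial average of the output's gradient with respect to that channel, sum the weighted maps, and apply a ReLU. A minimal NumPy sketch of that computation, assuming activations and gradients have already been extracted with shape (channels, height, width):

```python
import numpy as np

def grad_cam(activations, gradients, eps=1e-7):
    """Compute a Grad-CAM heatmap from a layer's activations and gradients.

    activations, gradients: arrays of shape (C, H, W)
    returns: heatmap of shape (H, W), normalized to [0, 1]
    """
    # Channel importance: global average pool of the gradients.
    weights = gradients.mean(axis=(1, 2))                      # (C,)
    # Weighted sum of activation maps, then ReLU to keep
    # only features with a positive influence on the target.
    cam = np.maximum(np.tensordot(weights, activations, axes=1), 0.0)
    return cam / (cam.max() + eps)
```

In a segmentation setting such as this paper's, the "target" for the backward pass is typically the predicted mask score of the region of interest rather than a single class logit.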
