Abstract Tool wear status monitoring plays an important role in machine safety and production efficiency. Because of sensor-installation constraints and complex operating conditions, traditional machine vision methods can rarely observe the tool itself directly to obtain wear status information. However, the surface quality of the machined workpiece indirectly reflects the wear status of the tool to a significant degree. Therefore, this paper proposes an indirect tool wear status monitoring method based on workpiece surface quality inversion and EfficientNet. First, surface texture images of the workpiece are acquired with an industrial camera, and the representative features that indirectly reflect tool wear status are highlighted using weighted-average image grayscaling and the Laplacian operator. Second, to monitor tool wear effectively, three key settings of the EfficientNet backbone network (network width, network depth, and input image resolution) are optimized. This optimization improves the network's representation and generalization ability, allowing it to handle diverse surface textures and extract critical information accurately, which in turn raises the accuracy of the tool wear monitoring system. Finally, the improved EfficientNet is used as a pre-trained network to monitor the tool wear state. By combining the representation power of the pre-trained model with the surface texture features of the workpiece, the model can accurately identify and judge the degree of tool wear. Cutting experiments show that the proposed method achieves a recognition accuracy of 97.77%, whereas popular alternatives such as VGG19, ResNet50, and ResNet152 reach training accuracies of 95.66%, 96.37%, and 91.27%, respectively.
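The preprocessing step described above (weighted-average grayscaling followed by a Laplacian edge response) can be sketched as below. This is a minimal NumPy illustration under standard assumptions, not the authors' implementation: the BT.601 luma weights (0.299, 0.587, 0.114) and the 4-neighbor 3x3 Laplacian kernel are common defaults, and the paper does not specify its exact coefficients.

```python
import numpy as np

# 4-neighbor discrete Laplacian kernel (a common choice; the paper's exact
# operator is not specified in the abstract).
LAPLACIAN_KERNEL = np.array([[0,  1, 0],
                             [1, -4, 1],
                             [0,  1, 0]], dtype=np.float64)

def weighted_grayscale(rgb):
    """Weighted-average grayscale using BT.601 luma weights (assumed)."""
    weights = np.array([0.299, 0.587, 0.114])
    return np.asarray(rgb, dtype=np.float64) @ weights

def laplacian_edges(gray):
    """3x3 Laplacian convolution with edge padding to highlight texture."""
    gray = np.asarray(gray, dtype=np.float64)
    padded = np.pad(gray, 1, mode="edge")
    h, w = gray.shape
    out = np.zeros((h, w), dtype=np.float64)
    for i in range(3):
        for j in range(3):
            out += LAPLACIAN_KERNEL[i, j] * padded[i:i + h, j:j + w]
    return out
```

A flat region produces a zero Laplacian response, while texture and edges (the wear-sensitive features) produce nonzero values, which is why this operator helps emphasize the surface information the network learns from.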
These comparative results highlight the strong performance of the proposed method in recognition rate, training time, and generalization.
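The three settings the method tunes (network width, depth, and input resolution) are exactly the axes of EfficientNet's compound-scaling rule, which can be sketched as follows. The coefficients α = 1.2, β = 1.1, γ = 1.15 and the 224-pixel baseline are the published EfficientNet-B0 values, not the tuned settings from this paper:

```python
def compound_scale(phi, alpha=1.2, beta=1.1, gamma=1.15):
    """EfficientNet compound scaling: depth, width, and resolution
    multipliers grow as alpha**phi, beta**phi, gamma**phi (Tan & Le, 2019)."""
    return alpha ** phi, beta ** phi, gamma ** phi

def scaled_config(phi, base_resolution=224):
    """Apply the multipliers to a B0-like baseline (224-px input assumed)."""
    depth_mult, width_mult, res_mult = compound_scale(phi)
    return {
        "depth_mult": round(depth_mult, 3),
        "width_mult": round(width_mult, 3),
        "resolution": int(round(base_resolution * res_mult)),
    }
```

Raising the scaling factor phi enlarges all three axes together, which is the balance the abstract's optimization exploits: higher resolution preserves fine surface texture, while extra depth and width add capacity to represent it.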