Abstract
The accurate determination of the tool-wear state helps workers maximise tool utilisation while reducing waste; it also ensures machining quality and improves machining efficiency. This study proposes a deep learning network based on the self-attention mechanism that enables global modeling and long-term dependency capture for better tool-wear state recognition from cutting signals. The Hierarchical Temporal Transformer Network (HTT-Net) was constructed by adapting the Swin Transformer backbone to the global modeling of input temporal sequence signals. A token-merging layer builds a hierarchical feature map, allowing the model to progressively enlarge its receptive field and strengthen its global modeling capability. Self-attention is computed within partitioned windows of the temporal sequence, which reduces the model's complexity from quadratic to linear in the sequence length, and shifted windows enable information interaction between non-overlapping windows, further improving global modeling. The PHM2010 public cutting dataset and a TC4 milling dataset were used for model training and testing. The results showed that HTT-Net outperformed models from existing studies in tool-wear state recognition on the PHM2010 dataset, with up to a 16.13% improvement in recognition accuracy on the relevant subsets, and reached 98.87% recognition accuracy on the TC4 milling dataset, further verifying the model's practical applicability. Finally, Q-Q plot analysis verified that the model is robust and stable, and ablation experiments verified that each module of the model contributes a positive gain to the recognition results.
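The windowed self-attention described above can be illustrated with a minimal sketch. This is not the authors' implementation: it omits the learned query/key/value projections, multi-head splitting, and the attention mask that Swin-style models apply after the cyclic shift; all names here are illustrative. It only shows why partitioning a temporal sequence into fixed-size windows makes the attention cost linear in sequence length, and how a cyclic shift lets neighbouring windows exchange information.

```python
import numpy as np

def window_attention(x, window_size, shift=0):
    """Simplified windowed self-attention over a 1D temporal sequence.

    x: (seq_len, dim) array; seq_len must be divisible by window_size.
    A nonzero shift cyclically rolls the sequence before partitioning,
    so tokens near window boundaries attend across the old boundary.
    """
    if shift:
        x = np.roll(x, -shift, axis=0)  # cyclic shift before partitioning
    seq_len, dim = x.shape
    windows = x.reshape(seq_len // window_size, window_size, dim)
    out = np.empty_like(windows)
    for i, w in enumerate(windows):
        # Scaled dot-product self-attention computed only inside one window:
        # O(window_size^2) per window, hence O(seq_len) overall for fixed
        # window_size, instead of O(seq_len^2) for full attention.
        scores = w @ w.T / np.sqrt(dim)
        scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
        attn = np.exp(scores)
        attn /= attn.sum(axis=-1, keepdims=True)
        out[i] = attn @ w
    out = out.reshape(seq_len, dim)
    if shift:
        out = np.roll(out, shift, axis=0)  # undo the shift
    return out

rng = np.random.default_rng(0)
x = rng.normal(size=(16, 4))
y = window_attention(x, window_size=4)             # regular windows
y_shift = window_attention(x, window_size=4, shift=2)  # shifted windows
print(y.shape)  # (16, 4)
```

Alternating regular and shifted layers, as in the Swin design the abstract builds on, is what propagates information between non-overlapping windows across depth.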