Abstract
To address the poor recognition performance and low recognition rate of existing belt deviation detection methods, this paper proposes a real-time belt deviation detection method. First, ResNet18 combined with an attention mechanism module is used as the feature extraction network to enhance features in the belt edge region and suppress features elsewhere. A classifier that exploits contextual information in the fully connected layer then predicts the approximate locations of the belt edges from the extracted features. Next, an improved gradient equation is used as a structural loss during training to drive the model's predictions closer to the target values. The least squares method is then applied to fit the set of detected belt edge points and obtain an accurate belt edge line. Finally, a deviation threshold is set according to the requirements of the safety production code, and the fitted results are compared against this threshold to detect belt deviation. The method is compared with four others: ultrafast structure-aware deep lane detection, end-to-end wireframe parsing, LSD, and the Hough transform. The results show that the proposed method is the fastest at 41 frames/s; its accuracy is improved by 0.4%, 13.9%, 45.9%, and 78.8% over the four methods, respectively; and its F1-score is improved by 0.3%, 10.2%, 32.6%, and 72%, respectively, meeting the requirements of practical engineering applications. The proposed method can be used for intelligent monitoring and control in coal mines, logistics and transport, and other scenarios involving belt conveyance.
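As a rough illustration of the final two steps described above (least-squares fitting of the detected edge points and comparison against a deviation threshold), the following Python/NumPy sketch is provided. It is not the authors' implementation: the point coordinates, reference edge position, and threshold value are hypothetical, and in the paper the threshold is set from the safety production code.

```python
# Minimal sketch (assumed, not the authors' code): least-squares fit of belt
# edge points followed by a threshold comparison to flag deviation.
import numpy as np

def fit_edge_line(points):
    """Least-squares fit x = a*y + b to detected edge points (x, y) in pixels.

    Fitting x as a function of y handles the near-vertical belt edges seen
    from a camera looking along the conveyor.
    """
    xs, ys = points[:, 0], points[:, 1]
    a, b = np.polyfit(ys, xs, deg=1)   # slope and intercept of the edge line
    return a, b

def is_deviated(points, reference_x, threshold_px, y_eval):
    """Return True if the fitted edge drifts past the threshold.

    reference_x, threshold_px, and y_eval are illustrative parameters only.
    """
    a, b = fit_edge_line(points)
    edge_x = a * y_eval + b            # fitted edge position at the chosen row
    return abs(edge_x - reference_x) > threshold_px

# Usage with synthetic edge points (hypothetical values):
edge_points = np.array([[240, 100], [242, 200], [245, 300], [247, 400]])
print(is_deviated(edge_points, reference_x=250.0, threshold_px=10.0, y_eval=250))
```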