Abstract

The scrap falling state in the shearing process of a Wide Heavy Plate Mill currently depends on manual monitoring and judgment. The steel plate can only move forward after the scrap has fallen; otherwise, the equipment will be damaged. An experienced operator can judge the falling state from the sound alone. To recognize such sound events automatically, this paper proposes a scheme for sound data acquisition, processing, and recognition to monitor the scrap state during the shearing process. First, a pickup is installed at the monitoring point of the crop shear to collect the sound. The sound data is then processed and fed into a model for recognition. Since many sound segments in the field are unrelated to the key actions, an endpoint detection technique based on a short-term energy threshold with a variable window-length strategy is applied during data processing to isolate the sound segments to be identified. The Log-Mel spectrogram feature of each segment is then extracted. After padding to ensure a uniform data shape, the feature map is fed into MobileNetV2, a lightweight deep convolutional neural network, for fast sound-event recognition. On the shearing-process events, the model reaches an F1 score of 95% in a realistic noise environment and identifies the scrap falling 0.3 s after the action starts. Finally, the model was applied to the UrbanSound8K dataset, where it achieved 95% recognition accuracy, further validating the performance of the method.
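
The following is a minimal sketch of the pipeline the abstract describes (short-term-energy endpoint detection, Log-Mel feature extraction with padding to a uniform shape, classification with MobileNetV2), assuming librosa and torchvision are available. The function names, thresholds, class count, and file name are illustrative and not taken from the paper, and the single fixed energy threshold below stands in for the paper's variable window-length strategy.

```python
# Illustrative sketch only; parameters and names are assumptions, not the paper's implementation.
import numpy as np
import librosa
import torch
from torchvision.models import mobilenet_v2


def detect_endpoints(signal, sr, frame_len=1024, hop=512, energy_thresh=0.01):
    """Return (start, end) sample indices of the region whose short-term
    energy exceeds a fixed threshold (simplified single-threshold version)."""
    frames = librosa.util.frame(signal, frame_length=frame_len, hop_length=hop)
    energy = (frames ** 2).mean(axis=0)          # per-frame short-term energy
    active = np.where(energy > energy_thresh)[0]
    if active.size == 0:
        return None
    start = active[0] * hop
    end = min(len(signal), (active[-1] + 1) * hop + frame_len)
    return start, end


def log_mel(segment, sr, n_mels=64, target_frames=128):
    """Log-Mel spectrogram, zero-padded or cropped to a fixed number of frames."""
    mel = librosa.feature.melspectrogram(y=segment, sr=sr, n_mels=n_mels)
    logmel = librosa.power_to_db(mel)
    if logmel.shape[1] < target_frames:          # pad so every segment has the same shape
        logmel = np.pad(logmel, ((0, 0), (0, target_frames - logmel.shape[1])))
    return logmel[:, :target_frames]


# MobileNetV2 with its classification head replaced for the sound-event classes
# (the class list here, e.g. shear / scrap falling / background, is assumed).
num_classes = 3
model = mobilenet_v2(weights=None)
model.classifier[1] = torch.nn.Linear(model.last_channel, num_classes)
model.eval()

signal, sr = librosa.load("crop_shear_recording.wav", sr=None)   # hypothetical recording
endpoints = detect_endpoints(signal, sr)
if endpoints is not None:
    seg = signal[endpoints[0]:endpoints[1]]
    feat = torch.tensor(log_mel(seg, sr), dtype=torch.float32)
    x = feat.unsqueeze(0).unsqueeze(0).repeat(1, 3, 1, 1)        # 3-channel input for MobileNetV2
    with torch.no_grad():
        logits = model(x)
```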
