Abstract

Machine learning (ML) based condition monitoring and fault detection of industrial equipment is central to maintenance in the era of Industry 4.0. Applying ML techniques for automatic fault detection minimizes unexpected system breakdowns. However, these techniques rely heavily on historical equipment data for training, which limits their widespread industrial application: historical data is not available for every industrial machine, and generating data experimentally for each fault condition is not viable. This paper addresses this challenge for a gear application with a tooth defect. ML algorithms are trained using simulated vibration data of the gearbox and tested with experimental data. The simulated data is generated with a gearbox dynamic model for different operating and fault conditions, covering both normal and faulty gear states. Pink noise is added to the simulated data to bring it closer to actual field data. The simulated data is then processed using Empirical Mode Decomposition and Discrete Wavelet Transform, and features are extracted. These features are used to train well-established ML techniques: Support Vector Machine, Random Forest, and Multi-Layer Perceptron. To validate the approach, the trained ML algorithms are tested using experimental data. The results show more than 87% accuracy with all three algorithms. The performance of the trained models is further evaluated using precision, recall, and ROC curves; these metrics affirm the applicability of this approach to gear fault detection.
