Abstract

Sparse representation (SR) modeling of vibration signals has become popular for bearing fault diagnosis in recent years. However, it remains challenging because the fault feature information in the vibration signal is usually submerged by strong and complex noise. Thus, the noise distribution assumption in the SR model, i.e., noise modeling, needs to be carefully studied. In this article, we propose a new SR model under the assumption that the noise in the signal obeys a mixture of generalized Gaussian (MoGG) distribution. In addition, the $L_{p}$ norm ($0 \leq p \leq 1$) is adopted as the regularization term to keep the proposed method sufficiently sparse and adjustable. Accordingly, the new model is named the MoGG noise distribution enabled SR (MoGG-SR) model. The mixture structure makes the model more adaptive, and the use of the generalized Gaussian function as the basis function makes it more robust to outliers. An algorithm to solve MoGG-SR is then developed based on expectation–maximization (EM) and the alternating direction method of multipliers (ADMM). Through a series of simulations, all parameters in MoGG-SR are determined, and the model achieves much better denoising performance than other classic sparse models under interference from different noise distributions. Finally, the effectiveness of MoGG-SR in bearing fault diagnosis is verified by extracting fault impulses from the vibration signals of two data sets. The simulations and experiments together demonstrate that MoGG-SR is not only theoretically innovative but also valuable in practical bearing fault diagnosis.
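For concreteness, a minimal sketch of the modeling idea summarized above; the symbols ($\mathbf{y}$, $\mathbf{D}$, $\mathbf{x}$, $\mathbf{n}$, $K$, $\pi_k$, $\alpha_k$, $\beta_k$, $\lambda$) are assumptions introduced here for illustration rather than the paper's exact notation. The observed vibration signal is written as a sparse code over a dictionary plus noise, each noise sample is assumed to follow an MoGG density, and the sparse code is estimated under $L_{p}$ regularization:

% Sketch under the assumptions stated above, not the paper's exact formulation.
\begin{align}
  \mathbf{y} &= \mathbf{D}\mathbf{x} + \mathbf{n}, \\
  % MoGG noise density: mixture of K generalized Gaussian components
  % with weights pi_k, scale alpha_k, and shape beta_k
  p(n_i) &= \sum_{k=1}^{K} \pi_k \,
            \frac{\beta_k}{2\alpha_k \Gamma(1/\beta_k)}
            \exp\!\left( -\left| \frac{n_i}{\alpha_k} \right|^{\beta_k} \right),
            \qquad \sum_{k=1}^{K} \pi_k = 1, \\
  % L_p-regularized estimate of the sparse code (0 <= p <= 1)
  \hat{\mathbf{x}} &= \arg\min_{\mathbf{x}}
            \; -\log p(\mathbf{y} - \mathbf{D}\mathbf{x})
            + \lambda \,\|\mathbf{x}\|_{p}^{p}.
\end{align}

Under this reading, EM would be responsible for updating the mixture weights and component parameters of the noise model, while ADMM would handle the non-convex $L_{p}$-regularized subproblem for $\mathbf{x}$; the abstract only states that the solver combines EM and ADMM, so this split is an assumption.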
