Abstract

There are two fundamental problems with Kernel Fisher Discriminant Analysis (KFDA) for nonlinear fault diagnosis. The first is that the classification performance of KFDA between normal data and fault data degrades when overlapping samples exist. The second is that the computational cost of the kernel matrix grows large as the number of training samples increases. To address these two problems, this paper proposes an improved fault diagnosis method based on KFDA (IKFDA). The method improves on KFDA in two respects. First, a variable weighting vector is incorporated into KFDA, which improves its discriminant performance. Second, when the number of training samples becomes large, a feature vector selection scheme based on a geometrical consideration is used to reduce the computational complexity of KFDA for fault diagnosis. Finally, a Gaussian mixture model (GMM) is applied for fault isolation and diagnosis in the KFDA subspace. Experimental results show that the proposed method outperforms traditional kernel principal component analysis (KPCA) and standard KFDA algorithms.
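The following is a minimal sketch, not the authors' implementation, of the baseline pipeline the abstract describes: a plain two-class kernel Fisher discriminant projection followed by class-wise Gaussian mixture models on the projected scores. It omits the paper's two improvements (the variable weighting vector and the geometry-based feature vector selection); the kernel width `gamma`, regularization `reg`, and the toy data are assumptions for illustration only.

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.mixture import GaussianMixture

def kfda_direction(X, y, gamma=0.1, reg=1e-3):
    """Plain two-class kernel Fisher discriminant (0 = normal, 1 = fault)."""
    K = rbf_kernel(X, X, gamma=gamma)        # full kernel matrix over the training set
    n = len(y)
    means, N = [], np.zeros((n, n))
    for c in (0, 1):
        idx = np.where(y == c)[0]
        Kc = K[:, idx]
        means.append(Kc.mean(axis=1))        # kernelized class mean
        H = np.eye(len(idx)) - np.ones((len(idx), len(idx))) / len(idx)
        N += Kc @ H @ Kc.T                   # kernelized within-class scatter
    d = means[0] - means[1]
    M = np.outer(d, d)                       # kernelized between-class scatter
    N += reg * np.eye(n)                     # regularization keeps N invertible
    w, v = np.linalg.eig(np.linalg.solve(N, M))
    return np.real(v[:, np.argmax(np.real(w))]), K   # discriminant coefficients, kernel matrix

# Toy data standing in for normal and fault operating conditions (illustrative only).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (100, 5)), rng.normal(2, 1, (100, 5))])
y = np.array([0] * 100 + [1] * 100)

alpha, K = kfda_direction(X, y)
scores = K @ alpha                           # 1-D projection onto the discriminant direction

# One GMM per class on the projected scores; a new sample is assigned to the
# class whose GMM gives it the higher log-likelihood.
gmms = {c: GaussianMixture(n_components=1, random_state=0).fit(scores[y == c].reshape(-1, 1))
        for c in (0, 1)}
```

In this sketch the kernel matrix is computed over all training samples, which is exactly the cost the paper's feature vector selection scheme is intended to reduce when the training set is large.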
