Abstract

With the rapid development of high-power tractors, fault diagnosis has become increasingly important for ensuring their operating safety and efficiency. Particle swarm optimization (PSO) is an iterative evolutionary optimization algorithm that searches for the optimal solution by updating a population of particles. However, the standard PSO algorithm uses only a single population, so information exchange within the swarm is limited and the search can easily stagnate. In this paper, considering the fault complexity, fault correlation, and multifault concurrency of high-power tractor diesel engines, a diesel engine fault diagnosis method based on a multiswarm coevolution particle swarm optimized BP neural network is proposed. First, a USB-CAN device was used to collect eight diesel engine signals under five different working conditions, and the data were parsed with the SAE J1939 protocol. Then, the BP neural network was reconstructed, and a competitive multiswarm cooperative particle swarm optimizer (COM-MCPSO) was used to optimize its structure and weights. Finally, the results under the five fault conditions show that, compared with the BP neural network and the PSO-optimized BP neural network, the COM-MCPSO-optimized BP neural network not only speeds up network training but also improves generalization ability and recognition accuracy.
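The abstract notes that the raw CAN frames were decoded with the SAE J1939 protocol. As a minimal illustration only, the following Python sketch shows how one engine signal (engine speed, SPN 190 in PGN 61444/EEC1) might be extracted from a 29-bit J1939 identifier and its 8-byte payload; the PGN/byte layout follows the public J1939-71 definition and is an assumption here, not the paper's own parser.

```python
# Minimal sketch: decoding engine speed (SPN 190) from a raw SAE J1939 CAN frame.
# The PGN/byte layout follows the public J1939-71 definition and is an assumption
# for illustration; it is not taken from the paper's parsing code.

def extract_pgn(can_id: int) -> int:
    """Extract the Parameter Group Number from a 29-bit J1939 identifier."""
    pdu_format = (can_id >> 16) & 0xFF
    if pdu_format < 240:            # PDU1: last byte is a destination address
        return (can_id >> 8) & 0x3FF00
    return (can_id >> 8) & 0x3FFFF  # PDU2: full 18-bit PGN

def decode_engine_speed(data: bytes) -> float:
    """SPN 190 (engine speed) in PGN 61444 (EEC1): bytes 4-5, 0.125 rpm/bit."""
    raw = data[3] | (data[4] << 8)  # little-endian, payload bytes 4 and 5
    return raw * 0.125              # rpm

# Example frame as it might arrive from a USB-CAN adapter (hypothetical values)
can_id = 0x0CF00400                 # priority 3, PGN 61444 (EEC1), source address 0x00
payload = bytes([0xFF, 0xFF, 0xFF, 0x68, 0x25, 0xFF, 0xFF, 0xFF])

if extract_pgn(can_id) == 61444:
    print(f"engine speed = {decode_engine_speed(payload):.1f} rpm")  # ~1197.0 rpm
```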

Highlights

  • In existing diesel engine fault diagnosis research, researchers have used engine vibration signals to obtain engine fault data

  • (1) Under five different tractor fault conditions, the CAN bus was used to collect information from eight sensors, such as diesel engine speed, engine load, and air flow; (2) the collected data were parsed with the SAE J1939 protocol; (3) BP and PSO-BP fault diagnosis models were established, and the PSO algorithm was further optimized on this basis; (4) the inertia weight of the PSO algorithm was optimized to establish the linear decreasing weight particle swarm optimization (LDWPSO)-BP neural network; (5) the swarm structure of the PSO algorithm was optimized

  • In the late stage of the algorithm, it can perform a fine search in the region of the optimal value, so that the algorithm has a greater probability of converging to the global optimum [7]. This algorithm is called linear decreasing weight particle swarm optimization (LDWPSO), and its inertia weight is given by formula (3): $\omega = \omega_{\max} - \frac{(\omega_{\max} - \omega_{\min})\, t}{t_{\max}}$, where $t$ is the current iteration and $t_{\max}$ is the maximum number of iterations (a schedule sketched in code below)
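To make the LDWPSO schedule of formula (3) concrete, here is a minimal Python sketch of a velocity update whose inertia weight decreases linearly from $\omega_{\max}$ to $\omega_{\min}$ over the iterations. The parameter values ($\omega_{\max} = 0.9$, $\omega_{\min} = 0.4$, $c_1 = c_2 = 2$) are common defaults assumed for illustration, not values reported by the paper.

```python
import numpy as np

def ldw_inertia(t: int, t_max: int, w_max: float = 0.9, w_min: float = 0.4) -> float:
    """Linear decreasing inertia weight, formula (3): w = w_max - (w_max - w_min) * t / t_max."""
    return w_max - (w_max - w_min) * t / t_max

def ldwpso_velocity_update(v, x, pbest, gbest, t, t_max, c1=2.0, c2=2.0, rng=None):
    """One LDWPSO velocity update for a swarm of shape (n_particles, n_dims)."""
    rng = rng or np.random.default_rng()
    w = ldw_inertia(t, t_max)
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    return w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)

# Early iterations use a large w (coarse global search); late iterations use a small w (fine local search)
print(ldw_inertia(0, 100), ldw_inertia(100, 100))   # 0.9  0.4
```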

Summary

Error backpropagation

The weight between the input layer and the hidden layer is $W_{ji}$ with threshold $\theta_j$, the weight between the hidden layer and the output layer is $W_{kj}$ with threshold $\theta_k'$, and the neuron output of each layer satisfies $x_j' = f(u_j) = f\left(\sum_{i=1}^{n} W_{ji} x_i - \theta_j\right)$. Assume that $W_{sc}$ is the connection weight between any two neurons in the network (that is, $W_{ji}$ and $W_{kj}$), including the thresholds, and that $E$ is a nonlinear error function of $W_{sc}$. In the traditional BP neural network algorithm, the gradient descent method depends on the choice of initial values, the training time is long, the network tends to fall into local minima, and the number of hidden layers of the network structure is difficult to select. The improved PSO algorithm can optimize the thresholds and weights of the BP neural network, alleviating its weaknesses in learning ability and convergence speed while giving full play to its powerful nonlinear mapping ability [11]. The main steps are as follows (sketched in code below): (1) initialize the algorithm parameters and determine the parameters of the BP neural network; (2) take the mean square error of the BP neural network as the fitness function of the PSO algorithm; (3) find the individual and group extreme values.
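As an illustration of steps (1)-(3), the following Python sketch encodes all BP weights and thresholds as one particle position and uses the network's mean square error as the PSO fitness. The network size (8 inputs for the eight sensors, one hidden layer, 5 outputs for the five fault conditions), the swarm size, and the PSO constants are assumptions for illustration; the paper's COM-MCPSO variant additionally splits the swarm into competing cooperative subswarms, which is not reproduced here.

```python
import numpy as np

# Assumed network size for illustration: 8 sensor inputs, 12 hidden neurons, 5 fault classes
N_IN, N_HID, N_OUT = 8, 12, 5
DIM = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT   # W_ji, theta_j, W_kj, theta_k

def unpack(p):
    """Split a particle position vector into the BP weight matrices and thresholds."""
    i = 0
    W_ji = p[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    th_j = p[i:i + N_HID]; i += N_HID
    W_kj = p[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    th_k = p[i:]
    return W_ji, th_j, W_kj, th_k

def mse_fitness(p, X, Y):
    """Fitness = mean square error of the BP forward pass with sigmoid activations."""
    W_ji, th_j, W_kj, th_k = unpack(p)
    f = lambda u: 1.0 / (1.0 + np.exp(-u))
    hidden = f(X @ W_ji - th_j)          # x_j' = f(sum_i W_ji * x_i - theta_j)
    out = f(hidden @ W_kj - th_k)
    return np.mean((out - Y) ** 2)

def pso_train(X, Y, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Plain global-best PSO over the flattened BP parameters (single swarm, not COM-MCPSO)."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-1, 1, (n_particles, DIM))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_f = np.array([mse_fitness(p, X, Y) for p in pos])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos += vel
        fit = np.array([mse_fitness(p, X, Y) for p in pos])
        better = fit < pbest_f
        pbest[better], pbest_f[better] = pos[better], fit[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

# Toy usage with random data standing in for the parsed CAN-bus samples
X = np.random.rand(100, N_IN)
Y = np.eye(N_OUT)[np.random.randint(0, N_OUT, 100)]   # one-hot fault labels
best_weights, best_mse = pso_train(X, Y)
print(f"best MSE after PSO: {best_mse:.4f}")
```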

