Abstract

Although the decision feedback equalizer (DFE) with a multilayer perceptron (MLP) structure can be trained effectively by the backpropagation (BP) algorithm, BP training is prone to becoming trapped in local minima. To mitigate this local-minimum problem and to improve performance under the same MLP structure, we combine a hierarchical approach with the BP algorithm to implement the MLP DFE; we call the new scheme the hierarchical BP (HBP) algorithm. In the hierarchical approach, proceeding from the input layer to the output layer of the MLP, every two layers of neural nodes (forming a sub-network with one hidden layer) are trained by an individual BP algorithm. The entire MLP is therefore trained by several independent BP runs, unlike the standard BP algorithm, which trains the whole MLP structure as a single optimization. Performance evaluations indicate that, at equal computational cost and under the same conditions, the HBP algorithm not only reduces the mean squared error substantially but also achieves a much lower bit-error rate than the standard BP algorithm.
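
The abstract describes the staged training only at a high level. As a rough illustration of the idea, the Python/NumPy sketch below trains a cascade of single-hidden-layer sub-networks, each with its own independent BP loop. The stage widths, learning rate, tanh activations, BPSK desired symbols, toy channel data, and the choice to feed each trained stage's hidden activations to the next stage are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_two_layer_bp(X, d, n_hidden, epochs=200, lr=0.05):
    """Train one sub-network with a single hidden layer using its own
    backpropagation loop; return its parameters and the hidden
    activations, which feed the next stage of the cascade."""
    n_in = X.shape[1]
    W1 = rng.normal(scale=0.5, size=(n_in, n_hidden))
    b1 = np.zeros(n_hidden)
    w2 = rng.normal(scale=0.5, size=n_hidden)
    b2 = 0.0
    for _ in range(epochs):
        # Forward pass through this two-layer sub-network.
        h = np.tanh(X @ W1 + b1)      # hidden-layer activations
        y = np.tanh(h @ w2 + b2)      # scalar equalizer output
        e = y - d                     # error against desired symbol
        # Backward pass: gradients of the mean squared error.
        dy = e * (1.0 - y**2)
        dh = np.outer(dy, w2) * (1.0 - h**2)
        w2 -= lr * (h.T @ dy) / len(d)
        b2 -= lr * dy.mean()
        W1 -= lr * (X.T @ dh) / len(d)
        b1 -= lr * dh.mean(axis=0)
    h = np.tanh(X @ W1 + b1)
    return (W1, b1, w2, b2), h

# Toy training data (hypothetical): noisy channel observations X and
# the desired BPSK training symbols d.
N = 500
d = rng.choice([-1.0, 1.0], size=N)
X = np.column_stack([d + 0.3 * rng.normal(size=N),
                     0.5 * np.roll(d, 1) + 0.3 * rng.normal(size=N)])

# Hierarchical training: each stage runs an independent BP loop, then
# is frozen; its hidden activations become the input of the next stage.
features = X
stages = []
for n_hidden in (8, 4):
    params, features = train_two_layer_bp(features, d, n_hidden)
    stages.append(params)
```

In this sketch every stage is trained against the same desired symbol sequence and then frozen, so the overall network is assembled from several independently converged BP problems rather than one end-to-end optimization, mirroring the decomposition the abstract describes.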
