Abstract

Feedforward neural networks (FFNNs) have attracted great attention in the digital communication area, in particular as nonlinear equalizers at the receiver to mitigate channel distortions and additive noise. The major drawback of FFNNs is their extensive training. We present a new approach that enhances training efficiency by adapting the activation function. The adaptation procedure substantially increases the flexibility and nonlinear approximation capability of the FFNN; as a result, the learning process performs better and the network is kept away from undesired saturation regions. The effectiveness of the proposed method is demonstrated on several challenging channel models, and it performs well even for severe nonlinear channels that are hard to equalize. Performance is measured through convergence properties and the minimum bit error rate achieved. The proposed algorithm was found to converge rapidly and to reach the minimum steady-state value. All simulations show that the proposed method significantly improves the training efficiency of the FFNN-based equalizer compared with standard training.
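As a rough illustration of the core idea, the sketch below trains a small FFNN equalizer in Python/NumPy whose hidden units use an adaptive activation f(z) = a·tanh(b·z), with the amplitude a and slope b updated by gradient descent together with the weights. The channel taps, the cubic distortion, the window length, and all hyperparameters are illustrative assumptions, not the paper's exact setup or algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy nonlinear channel (assumed): linear ISI followed by a memoryless
# cubic distortion and additive Gaussian noise.
h = np.array([0.3482, 0.8704, 0.3482])            # illustrative channel taps
def channel(s, snr_db=20.0):
    x = np.convolve(s, h, mode="same")
    x = x + 0.2 * x**2 - 0.1 * x**3
    return x + rng.normal(scale=10 ** (-snr_db / 20), size=x.shape)

# BPSK symbols; the equalizer sees a sliding window of received samples
# and must reproduce the symbol transmitted `delay` steps earlier.
N, win, delay = 8000, 5, 2
s = rng.choice([-1.0, 1.0], size=N)
r = channel(s)
X = np.lib.stride_tricks.sliding_window_view(r, win)
d = s[delay : delay + X.shape[0]]

# One hidden layer with adaptive activation f(z) = a * tanh(b * z);
# a (amplitude) and b (slope) are per-neuron trainable parameters.
H = 8
W1 = rng.normal(scale=0.3, size=(win, H)); b1 = np.zeros(H)
W2 = rng.normal(scale=0.3, size=H);        b2 = 0.0
a = np.ones(H)
b = np.ones(H)

lr = 0.01
for epoch in range(3):
    for x, t in zip(X, d):
        z = x @ W1 + b1
        th = np.tanh(b * z)
        hid = a * th
        y = hid @ W2 + b2
        e = y - t                                  # instantaneous output error

        # Backpropagation, including gradients w.r.t. the activation
        # parameters a and b.
        g_hid = e * W2
        g_a = g_hid * th                           # dL/da = e * W2 * tanh(b z)
        g_pre = g_hid * a * (1.0 - th**2)
        g_b = g_pre * z                            # dL/db
        g_z = g_pre * b                            # error back through activation

        W2 -= lr * e * hid;          b2 -= lr * e
        W1 -= lr * np.outer(x, g_z); b1 -= lr * g_z
        a  -= lr * g_a;              b  -= lr * g_b

# Hard-decision bit error rate on the same sequence (illustration only).
y_all = (a * np.tanh(b * (X @ W1 + b1))) @ W2 + b2
print("BER:", np.mean(np.sign(y_all) != d))
```

Letting a and b move during training changes each neuron's operating region on the fly, which is one simple way to keep the units out of the flat saturation zones of tanh where gradients vanish.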
