Abstract

Recently, recurrent neural networks (RNNs) have demonstrated superior performance for channel decoding, which motivates us to explore which RNN decoder is the most efficient. In this paper, we propose three RNN decoders, built upon long short-term memory (LSTM), the gated recurrent unit (GRU), and bidirectional gated recurrent units (Bi-GRU), respectively. The performance of these three decoders is evaluated through extensive simulations, which indicate that the GRU decoder, with the simplest structure and the least computational time, achieves bit error rate (BER) performance similar to that of the LSTM decoder. The Bi-GRU decoder has the best BER performance at the expense of more computational time, but it is prone to overfitting. Furthermore, we find that RNN decoders without dropout outperform those with dropout when the decoding models are underfitting, whereas decoders with dropout perform better when the models are overfitting.
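To make the comparison concrete, below is a minimal PyTorch sketch of the three decoder variants the abstract names. The layer sizes, the number of recurrent layers, the sigmoid output head, and the `RNNDecoder` class itself are illustrative assumptions, not the paper's exact architecture; only the choice of recurrent cell (LSTM, GRU, or Bi-GRU) and the optional dropout follow the abstract.

```python
import torch
import torch.nn as nn

class RNNDecoder(nn.Module):
    """Channel decoder built on a recurrent core (hypothetical sketch).

    Maps a sequence of noisy channel observations to per-bit
    probability estimates. Hidden size, depth, and the linear
    output head are assumptions for illustration.
    """

    def __init__(self, cell="gru", input_size=1, hidden_size=64,
                 num_layers=2, dropout=0.0):
        super().__init__()
        bidirectional = (cell == "bi-gru")
        rnn_cls = nn.LSTM if cell == "lstm" else nn.GRU
        self.rnn = rnn_cls(input_size, hidden_size, num_layers,
                           batch_first=True, dropout=dropout,
                           bidirectional=bidirectional)
        out_dim = hidden_size * (2 if bidirectional else 1)
        self.head = nn.Linear(out_dim, 1)  # one bit estimate per step

    def forward(self, y):
        # y: (batch, codeword_length, 1) noisy received symbols
        h, _ = self.rnn(y)
        return torch.sigmoid(self.head(h)).squeeze(-1)  # P(bit = 1)

# The three decoder variants compared in the paper:
lstm_dec  = RNNDecoder(cell="lstm")
gru_dec   = RNNDecoder(cell="gru")     # simplest, least computation
bigru_dec = RNNDecoder(cell="bi-gru")  # best BER, more computation
```

The Bi-GRU variant processes the received sequence in both directions, so each bit estimate can exploit the entire codeword's context, which is consistent with its better BER and higher computational cost reported above.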
