Abstract

In a previous paper, we showed that a recurrent neural network (RNN) can accurately detect radio signal degradations in a cellular network. Unexpectedly, however, we found that accuracy gains diminished as we added layers to the RNN. To investigate this, in this article we build a parallel model that illuminates the internal operation of neural networks (NNs), such as the RNN, which store internal state in order to process sequential inputs. The model is widely applicable, since it can be used with any input domain in which the inputs can be represented by a Gaussian mixture. By viewing RNN processing from a probability density function (pdf) perspective, we show how each layer of the RNN transforms the input distributions to increase detection accuracy, and at the same time we uncover a side effect that limits the improvement in accuracy. To demonstrate the fidelity of the model, we validate it against each stage of RNN processing and against the output predictions. As a result, we are able to explain the reasons for the RNN's performance limits and to offer useful insights for the future design of RNNs and similar types of NN.
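To make the pdf perspective concrete, the following is a minimal, hypothetical sketch, not the model described in the paper: inputs are drawn from an assumed two-component Gaussian mixture, passed through a single randomly initialised vanilla RNN cell, and the empirical distribution of one hidden unit is compared with the input distribution. All mixture parameters, weight scales, and helper names (sample_mixture, run_rnn) are illustrative assumptions.

```python
# Illustrative sketch only: hypothetical parameters, not the paper's actual model.
import numpy as np

rng = np.random.default_rng(0)

# Inputs drawn from a two-component 1-D Gaussian mixture (assumed weights/means/stds).
weights = np.array([0.6, 0.4])
means = np.array([-1.0, 2.0])
stds = np.array([0.5, 0.8])

def sample_mixture(n):
    """Sample n scalar inputs from the Gaussian mixture defined above."""
    comp = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(means[comp], stds[comp])

# A single vanilla RNN cell with hypothetical random weights:
#   h_t = tanh(W_x * x_t + W_h * h_{t-1} + b)
hidden = 4
W_x = rng.normal(scale=0.5, size=(hidden, 1))
W_h = rng.normal(scale=0.5, size=(hidden, hidden))
b = np.zeros((hidden, 1))

def run_rnn(xs):
    """Run the cell over a sequence of scalars; return all hidden states (T, hidden)."""
    h = np.zeros((hidden, 1))
    states = []
    for x in xs:
        h = np.tanh(W_x * x + W_h @ h + b)
        states.append(h.ravel())
    return np.array(states)

# Compare the empirical pdf of the inputs with that of one hidden unit,
# to observe how the layer reshapes the mixture distribution.
xs = sample_mixture(5000)
hs = run_rnn(xs)
in_hist, _ = np.histogram(xs, bins=50, density=True)
out_hist, _ = np.histogram(hs[:, 0], bins=50, density=True)
print("input pdf peak:", in_hist.max(), "| hidden-unit pdf peak:", out_hist.max())
```

In this sketch, the histograms stand in for the layer-by-layer distribution tracking described in the abstract; the paper's model instead propagates the Gaussian-mixture representation itself through each RNN stage.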
