Abstract

Optical neural networks (ONNs) are promising alternatives to conventional electronic neural networks, as they offer ultra-fast data processing with low energy consumption. However, the lack of a suitable optical nonlinearity stands in the way of this goal. While this problem can be circumvented in feed-forward neural networks, the performance of recurrent neural networks (RNNs) depends heavily on their nonlinearity. In this paper, we first propose and numerically demonstrate a novel reconfigurable optical activation function, named ROA, based on adding or subtracting the outputs of two saturable absorbers (SAs). ROA can provide both bounded and unbounded outputs through an electrically programmable adder/subtractor design. Second, with the help of ROA, which can be configured to resemble the Tanh or Sigmoid activation functions, we go a step further and numerically demonstrate OptoRNN, a new design for an all-optical RNN in free-space optics. Moreover, by exploiting a mathematical model of the multiplication noise, together with an altered loss function, we make the vector-matrix multiplication (VMM) in the linear part of the OptoRNN six times more parallel than the conventional optical VMM method. Finally, through comprehensive simulation studies, we demonstrate that the OptoRNN achieves 96.3% training accuracy on the sequential-MNIST dataset.
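The add/subtract idea behind the activation can be illustrated with a toy numerical sketch. This is not the paper's model: the saturable-absorber transfer function, the parameter values (`i_sat`, `t_lin`, `t_max`), and the function names are all illustrative assumptions, used only to show how combining two absorbers with different saturation intensities can yield either an unbounded (add) or a bounded (subtract) response.

```python
import numpy as np

def sa_output(intensity, i_sat, t_lin=0.1, t_max=0.9):
    # Toy saturable-absorber model (assumed form, not from the paper):
    # transmission rises from t_lin toward t_max as the input
    # intensity exceeds the saturation intensity i_sat.
    transmission = t_lin + (t_max - t_lin) * intensity / (intensity + i_sat)
    return transmission * intensity

def roa_sketch(intensity, mode="add", i_sat_a=1.0, i_sat_b=4.0):
    # Reconfigurable activation sketch: an electrically selected
    # adder/subtractor combines two SAs with different saturation levels.
    a = sa_output(intensity, i_sat_a)
    b = sa_output(intensity, i_sat_b)
    return a + b if mode == "add" else a - b

x = np.linspace(0.0, 10.0, 200)
y_add = roa_sketch(x, mode="add")   # monotonically increasing, unbounded
y_sub = roa_sketch(x, mode="sub")   # saturates toward a finite plateau
```

With these assumed parameters, the "add" branch keeps growing with the input (resembling an unbounded activation), while the "sub" branch saturates, giving a bounded, sigmoid-like shape.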
