Abstract

Neuromorphic photonics came to the fore promising neural networks (NNs) with computational speeds orders of magnitude higher than their electronic counterparts. In this direction, research efforts have mainly concentrated on the development of spiking, convolutional and Feed-Forward (FF) NN architectures aimed at solving complex cognitive problems. However, for complex time-series classification and prediction tasks, state-of-the-art deep-learning models in most cases require Recurrent NNs (RNNs) and their gated variants, such as Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) networks. Herein, we experimentally demonstrate the first, to the best of our knowledge, all-optical RNN with a gating mechanism, laying the foundations for all-optical LSTMs and GRUs. The proposed layouts exploit a Semiconductor Optical Amplifier (SOA)-based sigmoid activation within a fiber loop and were validated using asynchronous Wavelength-Division-Multiplexed (WDM) signals with 100 ps optical pulses. In the gated-RNN version, an SOA Mach-Zehnder Interferometer (SOA-MZI) gate was employed, with the RNN output defining the fraction of the input signal that is allowed to enter the RNN. Finally, a complex NN architecture exploiting the proposed non-gated and gated RNNs was trained on the FI-2010 financial dataset, achieving F1 scores of $41.68\%$ and $41.85\%$, respectively, and outperforming Multi-Layer Perceptron (MLP)-based models by $6.49\%$ on average.
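For readers who want a concrete picture of the gating behaviour described above, the snippet below is a minimal behavioural sketch in Python of a gated recurrent update, not the optical implementation itself: the previous recurrent output plays the role of the SOA-MZI gate and sets the fraction of the new input admitted into the loop, while a sigmoid stands in for the SOA-based activation. The weights `w_in` and `w_rec`, the bias `b`, the initial state and the exact gating form are illustrative assumptions, not values from the paper.

```python
import numpy as np

def sigmoid(z):
    # Stand-in for the SOA-based sigmoid activation inside the fiber loop.
    return 1.0 / (1.0 + np.exp(-z))

def gated_rnn_step(x_t, h_prev, w_in=1.0, w_rec=0.8, b=-0.5):
    # Assumed gating form: the previous recurrent output h_prev acts like the
    # SOA-MZI gate, setting the fraction of the incoming sample x_t that is
    # admitted into the recurrent loop (0 = blocked, 1 = fully passed).
    gate = h_prev
    x_gated = gate * x_t
    # Recurrent update with a sigmoid nonlinearity (SOA-like transfer function).
    return sigmoid(w_in * x_gated + w_rec * h_prev + b)

# Toy usage: run a short sequence of normalized input "pulses" through the loop.
h = 0.5  # illustrative initial state
for x in [0.2, 0.9, 0.1, 0.7]:
    h = gated_rnn_step(x, h)
    print(f"input={x:.1f} -> state={h:.3f}")
```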
