Abstract

Optical neural networks (ONNs) have recently attracted extensive interest as potential alternatives to electronic artificial neural networks, owing to their intrinsic capability for parallel signal processing with reduced power consumption and low latency. The parallelism of optical computing has been widely demonstrated by applying wavelength division multiplexing (WDM) to the linear transformations of neural networks. However, interchannel crosstalk has prevented WDM technologies from being applied to the nonlinear activation functions of ONNs. Here, we propose a universal WDM structure called multiplexed neuron sets (MNS), which applies WDM technologies to optical neurons and allows ONNs to be further compressed. We also propose a corresponding backpropagation (BP) training algorithm that alleviates, or even cancels, the influence of interchannel crosstalk in MNS-based WDM-ONNs. For simplicity, semiconductor optical amplifiers are employed as an example of MNS to construct a WDM-ONN trained with the new algorithm. The results show that the combination of MNS and the corresponding BP training algorithm markedly downsizes the system and improves energy efficiency by a factor of 10 while providing performance comparable to that of traditional ONNs.
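
As a rough illustration of the idea summarized above, the sketch below shows how a BP training step can fold a crosstalk model into the forward pass, so that the learned weights compensate for channel mixing. It is not the authors' implementation: the crosstalk matrix `C`, the leakage fraction `leak`, the ReLU activation, and the MSE loss are all hypothetical stand-ins chosen only to make the mechanics concrete.

```python
# Minimal sketch (assumptions, not the paper's method): one WDM layer whose
# forward model includes a hypothetical interchannel-crosstalk matrix C, so
# backpropagation learns per-channel weights that compensate for the mixing.
import numpy as np

rng = np.random.default_rng(0)

n_channels = 4        # WDM wavelength channels multiplexed onto one neuron set
n_in, n_out = 8, 3    # inputs and outputs per channel

# Hypothetical crosstalk model: a small fraction of each channel's signal
# leaks evenly into the other channels (rows sum to 1).
leak = 0.05
C = np.full((n_channels, n_channels), leak / (n_channels - 1))
np.fill_diagonal(C, 1.0 - leak)

# Trainable weights: one linear transform per wavelength channel.
W = rng.normal(scale=0.1, size=(n_channels, n_out, n_in))

def forward(x, W, C):
    """x: (n_channels, n_in). Per-channel linear transform, then channel
    mixing by C, then a ReLU stand-in for the optical nonlinearity."""
    z = np.einsum('coi,ci->co', W, x)      # per-channel linear transform
    z_mixed = C @ z                        # interchannel crosstalk
    return np.maximum(z_mixed, 0.0), z_mixed

def bp_step(x, y_target, W, C, lr=0.05):
    """One BP step whose gradients flow through the crosstalk matrix,
    so the learned W absorbs the effect of the mixing."""
    y, z_mixed = forward(x, W, C)
    grad_y = 2.0 * (y - y_target) / y.size          # MSE gradient
    grad_zmix = grad_y * (z_mixed > 0)              # ReLU derivative
    grad_z = C.T @ grad_zmix                        # back through crosstalk
    grad_W = np.einsum('co,ci->coi', grad_z, x)     # per-channel weight grads
    return W - lr * grad_W, np.mean((y - y_target) ** 2)

# Toy training loop on random data, just to show the mechanics.
x = rng.normal(size=(n_channels, n_in))
y_target = rng.normal(size=(n_channels, n_out))
for step in range(200):
    W, loss = bp_step(x, y_target, W, C)
print(f"final MSE: {loss:.4f}")
```

Because the crosstalk matrix appears in the forward model, the gradient passes through it and the optimizer adjusts each channel's weights to counteract the leakage, which is the general principle a crosstalk-aware BP algorithm relies on.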
