Abstract

A two-dimensional amorphous silicon photoconductor array and a liquid-crystal display form the core components of a hardware system for the implementation of a multilayer perceptron neural network. All connections between layers, as well as the nonlinear transfer characteristics associated with the hidden- and output-layer neurons, are implemented in analog circuitry, so that the network, once trained, behaves as a stand-alone processor. Under a standard backpropagation training algorithm, the network is shown to train very successfully. Training of the network is studied under different levels of weight quantization, neuron output resolution, and random weight-defect probability. A computer simulation of the hardware network is also performed, and excellent agreement is shown between the results of the hardware network and those of the computer simulation. It is concluded that the training capability of the present hardware network is only slightly degraded by its nonidealities, including the level of weight quantization and the limited neuron output resolution.
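The quantized-weight training study described above can be sketched in a minimal software simulation: a small perceptron is trained by batch backpropagation, with the weights clipped and rounded onto a uniform grid after every update to mimic the finite resolution of an analog synapse array. The network size (2-4-1), the XOR task, and the quantization parameters (256 levels over ±8) are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def quantize(w, levels, w_max=1.0):
    """Clip to [-w_max, w_max], then round onto a uniform grid of `levels` values,
    mimicking the finite weight resolution of an analog synapse array."""
    w = np.clip(w, -w_max, w_max)
    step = 2.0 * w_max / (levels - 1)
    return np.round(w / step) * step

def train_xor(levels=256, w_max=8.0, epochs=4000, lr=0.5, seed=1):
    """Train a tiny 2-4-1 perceptron on XOR with batch backpropagation,
    re-quantizing the weights after every update; returns the loss history."""
    rng = np.random.default_rng(seed)
    X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
    y = np.array([[0.], [1.], [1.], [0.]])
    W1 = rng.uniform(-0.5, 0.5, (2, 4)); b1 = np.zeros(4)
    W2 = rng.uniform(-0.5, 0.5, (4, 1)); b2 = np.zeros(1)
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))  # stand-in for the analog neuron transfer curve
    losses = []
    for _ in range(epochs):
        h = sigmoid(X @ W1 + b1)      # hidden-layer activations
        out = sigmoid(h @ W2 + b2)    # output-layer activations
        losses.append(float(np.mean((out - y) ** 2)))
        d_out = (out - y) * out * (1.0 - out)   # output-layer delta
        d_h = (d_out @ W2.T) * h * (1.0 - h)    # hidden-layer delta
        W2 = quantize(W2 - lr * h.T @ d_out, levels, w_max); b2 -= lr * d_out.sum(0)
        W1 = quantize(W1 - lr * X.T @ d_h, levels, w_max); b1 -= lr * d_h.sum(0)
    return losses
```

Sweeping `levels` downward (e.g. 256, 64, 16) in such a simulation is one way to probe how coarse the weight quantization can become before training degrades, which is the kind of question the paper's hardware study addresses.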
