Abstract

This paper presents the design of a new digital neuron architecture for use in feed-forward neural networks (FFNNs) and its subsequent implementation on a chip. The proposed neuron uses a special type of multiplication realized by an AND gate. The usual ways of implementing digital feed-forward neural networks with fixed- or floating-point arithmetic were compared against the novel architecture based on this special multiplication. The investigated FFNN architectures were then implemented in an FPGA and an ASIC, with chip area as the main concern. The chip area and other features of both the new neural network architecture and the standard NN architectures were compared and evaluated.

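The abstract does not detail how the AND-gate multiplication is defined; a common way such a reduction arises is when inputs and weights are restricted to single bits, in which case the product x·w equals x AND w. The following C sketch illustrates that idea for a single neuron with a hard-threshold activation; the function names, bit-width, and threshold are illustrative assumptions, not taken from the paper.

```c
/* Minimal sketch (assumption): a digital neuron whose input-weight
 * "multiplication" reduces to a bitwise AND, as happens when inputs
 * and weights are single bits in {0,1}.  Names, the 8-input width,
 * and the step activation are illustrative, not from the paper. */
#include <stdio.h>
#include <stdint.h>

#define N_INPUTS 8

/* One neuron: sum of AND-ed bits, followed by a hard threshold. */
static uint8_t and_neuron(const uint8_t x[N_INPUTS],
                          const uint8_t w[N_INPUTS],
                          unsigned threshold)
{
    unsigned acc = 0;
    for (int i = 0; i < N_INPUTS; ++i) {
        acc += (unsigned)(x[i] & w[i]);   /* AND gate replaces the multiplier */
    }
    return (uint8_t)(acc >= threshold);   /* step activation */
}

int main(void)
{
    uint8_t x[N_INPUTS] = {1, 0, 1, 1, 0, 1, 0, 1};
    uint8_t w[N_INPUTS] = {1, 1, 0, 1, 0, 1, 1, 0};
    printf("neuron output: %u\n", and_neuron(x, w, 3));
    return 0;
}
```

In hardware terms, each multiplier in the multiply-accumulate path of a conventional fixed- or floating-point neuron is replaced by a single AND gate, which is the kind of substitution that would explain the chip-area savings the paper evaluates.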