Abstract

In this paper, a new neural network structure is proposed that has low computational complexity and is particularly well suited to digital hardware realization. The proposed structure is based on the “polynomial neuron”, a classical additive neuron whose activation function is a polynomial. Such neurons are used to build multilayer networks trainable with an algorithm very similar to backpropagation. Compared with traditional MLPs using sigmoidal activation functions, the adaptive polynomial neural networks (APNNs) achieve a significant reduction in size and computational complexity, in both the learning and forward phases. Extensive experiments have been carried out on both pattern recognition and data processing problems. The relationship of the APNNs to the polynomial Adaline and the Volterra expansion is also discussed.
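
The building block described above, the polynomial neuron, is a standard weighted sum of the inputs followed by a polynomial rather than a sigmoidal activation, with the polynomial coefficients adapted alongside the weights. The sketch below illustrates this idea under stated assumptions; it is not the paper's implementation, and the function name `polynomial_neuron`, the coefficient vector `c`, and the degree-2 example are illustrative choices only.

```python
import numpy as np

def polynomial_neuron(x, w, b, c):
    """Illustrative 'polynomial neuron': an additive neuron whose
    activation is a polynomial instead of a sigmoid.

    x : input vector
    w : synaptic weights (same length as x)
    b : bias
    c : polynomial coefficients, output = c[0] + c[1]*s + c[2]*s**2 + ...
        (in an APNN these would be adapted during training together
        with w and b; hypothetical parameterization)
    """
    s = np.dot(w, x) + b                 # usual weighted sum of inputs
    powers = s ** np.arange(len(c))      # [1, s, s^2, ...]
    return np.dot(c, powers)             # polynomial activation

# Example: a degree-2 neuron with 3 inputs (toy values)
rng = np.random.default_rng(0)
x = rng.normal(size=3)
w = rng.normal(size=3)
b = 0.1
c = np.array([0.0, 1.0, 0.5])            # c0 + c1*s + c2*s^2
print(polynomial_neuron(x, w, b, c))
```

Because evaluating the polynomial requires only multiplications and additions, with no exponentials or look-up tables, the forward pass maps naturally onto digital hardware, which is the suitability claimed in the abstract.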
