Abstract
This paper reviews the non-linear threshold logic developed in collaboration by D. Dubois, G. Resconi and A. Raymondi, a significant extension of the neural threshold logic pioneered by McCulloch and Pitts. The output of their formal neuron is given by the Heaviside function applied to a linear weighted sum of the inputs minus a threshold parameter. Not all Boolean tables can be represented by such a formal neuron: for example, the exclusive OR and the parity problem require hidden neurons to be resolved. A few years ago, Dubois proposed a non-linear fractal neuron that resolves the exclusive OR problem with a single neuron. Dubois and Resconi then introduced the non-linear threshold logic, that is, a Heaviside function applied to a non-linear sum of the inputs, which can represent any Boolean table with a single neuron, Dubois' non-linear neuron model corresponding to a fixed Heaviside function. In this framework the supervised learning is direct, that is, the weights and threshold are computed without recursive algorithms, in relation to the new foundation of threshold logic by Resconi and Raymondi. The paper reviews the main aspects of linear and non-linear threshold logic with direct learning, together with applications in pattern recognition using the TurboBrain software. This constitutes a new tool in the framework of Soft Computing.
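To illustrate the distinction the abstract draws, the minimal sketch below contrasts the McCulloch-Pitts formal neuron (Heaviside of a linear weighted sum minus a threshold), which cannot realize the exclusive OR, with a single neuron whose argument contains one non-linear term. The product term x1*x2 and the particular weight and threshold values are illustrative assumptions only; they are not Dubois' fractal neuron nor the direct-learning rule of Resconi and Raymondi, which are developed in the body of the paper.

```python
import numpy as np

def heaviside(z):
    """Heaviside step function, using the convention H(z) = 1 for z >= 0."""
    return 1 if z >= 0 else 0

def linear_neuron(x, w, theta):
    """McCulloch-Pitts formal neuron: Heaviside of a linear weighted sum minus a threshold."""
    return heaviside(np.dot(w, x) - theta)

def nonlinear_neuron(x, w, w12, theta):
    """Single neuron with one non-linear (product) term added to the argument.
    The coefficients here are hypothetical, chosen only to show that a
    non-linear argument lets one neuron realize XOR."""
    x1, x2 = x
    return heaviside(w[0] * x1 + w[1] * x2 + w12 * x1 * x2 - theta)

if __name__ == "__main__":
    xor_table = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
    # With w = (1, 1), w12 = -2, theta = 0.5 the arguments are
    # -0.5, 0.5, 0.5, -0.5, so the single neuron reproduces XOR exactly.
    w, w12, theta = np.array([1.0, 1.0]), -2.0, 0.5
    for x, target in xor_table.items():
        y = nonlinear_neuron(x, w, w12, theta)
        print(x, "target:", target, "non-linear neuron output:", y)
```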