Abstract

The fusion of artificial neural networks with soft computing makes it possible to construct learning machines that are superior to classical artificial neural networks, because the acquired knowledge can be extracted and explained in the form of simple rules. An efficient method for selecting the optimal structure of a fuzzy neural network architecture is developed. The Vapnik-Chervonenkis (VC) dimension is introduced as a measure of the capacity of the learning machine. The expected error on yet unseen examples is estimated with the help of the VC dimension. The structural risk minimization principle is applied to construct the architecture with the lowest expected error on small data sets. A comparison between the fuzzy neural network and the neural network ARX model is presented.
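
The selection procedure summarized above can be illustrated with a minimal sketch, not taken from the paper: it assumes the standard VC confidence term for the guaranteed-risk bound and uses hypothetical candidate rule-base sizes, training errors, and VC-dimension estimates purely for illustration.

```python
import math

def vc_confidence(h, n, eta=0.05):
    """Vapnik's confidence term sqrt((h*(ln(2n/h)+1) - ln(eta/4)) / n),
    where h is the VC dimension, n the number of training examples,
    and eta the probability with which the bound may fail."""
    return math.sqrt((h * (math.log(2.0 * n / h) + 1.0) - math.log(eta / 4.0)) / n)

def guaranteed_risk(empirical_risk, h, n, eta=0.05):
    """Upper bound on the expected (generalization) error:
    empirical risk plus the VC confidence term."""
    return empirical_risk + vc_confidence(h, n, eta)

def select_structure(candidates, n, eta=0.05):
    """Structural risk minimization: among candidate structures, each given
    as (label, empirical_risk, vc_dimension), pick the one with the
    smallest guaranteed risk."""
    return min(candidates, key=lambda c: guaranteed_risk(c[1], c[2], n, eta))

# Hypothetical candidate architectures: (label, training error, estimated VC dimension)
candidates = [
    ("3 fuzzy rules", 0.12, 25),
    ("5 fuzzy rules", 0.07, 60),
    ("9 fuzzy rules", 0.03, 140),
]
best = select_structure(candidates, n=200)
print("Selected structure:", best[0],
      "guaranteed risk: %.3f" % guaranteed_risk(best[1], best[2], 200))
```

With a small data set the confidence term grows quickly with the VC dimension, so the procedure tends to prefer a more compact rule base even when a larger one fits the training data better.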
