Abstract

This paper discusses the learning of multilayer feedforward neural networks from linguistic knowledge and numerical data. These two kinds of information are utilized simultaneously in the learning of neural networks. We present backpropagation-type learning algorithms for classification problems and function approximation problems. For pattern classification problems, linguistic knowledge is represented by fuzzy if-then rules such as "If x1 is small and x2 is large then Class 1" and "If x1 is large then Class 3". These fuzzy if-then rules are used in the learning of neural networks together with numerical data such as {(x1, x2, x3; class label)} = {(0.1, 0.9, 0.3; Class 1), ..., (0.7, 0.9, 0.8; Class 3)}. For function approximation problems, linguistic knowledge such as "If x1 is small and x2 is large then y is small" is utilized in the learning of neural networks together with numerical data such as {(x1, x2, x3; y)} = {(0.1, 0.8, 0.2; 0.2), ..., (0.2, 0.3, 0.9; 0.9)}. The learning of neural networks from these two kinds of information is illustrated by computer simulations on several numerical examples. The handling of inconsistency within the linguistic knowledge, as well as inconsistency between the linguistic knowledge and the numerical data, is also discussed.
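The paper's own algorithms propagate the fuzzy rules through the network during backpropagation; the sketch below is only a rough, simplified illustration of the general idea of mixing the two kinds of information. It converts each fuzzy if-then rule into one crisp representative pattern (the peak-membership value of each linguistic term, with an assumed "don't care" value for unmentioned inputs) and then runs ordinary backpropagation over both the numerical patterns and the rule-derived patterns. The membership shapes, the importance weights, and all function and variable names are assumptions for illustration, not the authors' formulation.

import numpy as np

# Assumed linguistic terms on the unit interval: each term is reduced to its
# peak-membership value; inputs not mentioned in a rule get a neutral 0.5.
TERMS = {"small": 0.0, "medium": 0.5, "large": 1.0, None: 0.5}

def rule_to_pattern(antecedent, target_class, n_inputs, n_outputs):
    """Turn a fuzzy if-then rule into one crisp training pattern.
    `antecedent` maps input index -> term name; `target_class` is a class index."""
    x = np.array([TERMS[antecedent.get(i)] for i in range(n_inputs)])
    t = np.zeros(n_outputs)
    t[target_class] = 1.0
    return x, t

class MLP:
    """Single-hidden-layer feedforward network trained by plain backpropagation
    on a squared-error cost (a generic setup, not the paper's exact network)."""
    def __init__(self, n_in, n_hidden, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)

    @staticmethod
    def _sig(z):
        return 1.0 / (1.0 + np.exp(-z))

    def forward(self, x):
        h = self._sig(x @ self.W1 + self.b1)
        y = self._sig(h @ self.W2 + self.b2)
        return h, y

    def train_step(self, x, t, lr=0.5, weight=1.0):
        h, y = self.forward(x)
        # Backpropagate the weighted squared error for one pattern.
        delta_out = weight * (y - t) * y * (1.0 - y)
        delta_hid = (delta_out @ self.W2.T) * h * (1.0 - h)
        self.W2 -= lr * np.outer(h, delta_out)
        self.b2 -= lr * delta_out
        self.W1 -= lr * np.outer(x, delta_hid)
        self.b1 -= lr * delta_hid

# Toy data following the abstract's classification example (values are illustrative).
numeric_data = [(np.array([0.1, 0.9, 0.3]), 0),    # Class 1
                (np.array([0.7, 0.9, 0.8]), 2)]    # Class 3
rules = [({0: "small", 1: "large"}, 0),            # If x1 is small and x2 is large then Class 1
         ({0: "large"}, 2)]                        # If x1 is large then Class 3

net = MLP(n_in=3, n_hidden=5, n_out=3)
numeric_patterns = [(x, np.eye(3)[c]) for x, c in numeric_data]
rule_patterns = [rule_to_pattern(a, c, 3, 3) for a, c in rules]

for epoch in range(2000):
    # Numerical patterns and rule-derived patterns enter the same backprop loop;
    # the rule patterns carry an assumed importance weight expressing trust in the expert.
    for x, t in numeric_patterns:
        net.train_step(x, t, weight=1.0)
    for x, t in rule_patterns:
        net.train_step(x, t, weight=0.5)

One natural extension of this sketch is to weight each rule-derived pattern by the degree to which the rule is consistent with the numerical data, which is one way to soften the inconsistencies the abstract mentions.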
