Abstract

A new technique for adapting fuzzy membership functions in a fuzzy inference system is proposed. The pointer technique relies on isolating the specific membership functions that contributed to the final decision and then updating those functions' parameters by steepest descent. The error measure is thus backpropagated from output to input, through the min and max operators used during the inference stage. This is possible because min and max are continuous, piecewise-differentiable functions and can therefore be placed in a chain of partial derivatives for steepest-descent backpropagation adaptation. Interestingly, the partials of min and max act as pointers, with the result that only the function that gave rise to the min or max is adapted; the others are not. To illustrate, let $\alpha = \max[\beta_1, \beta_2, \ldots, \beta_N]$. Then $\partial\alpha/\partial\beta_n = 1$ when $\beta_n$ is the maximum and is zero otherwise. We apply this property to the fine-tuning of membership functions in fuzzy min-max decision processes and illustrate it with an estimation example. The adaptation process can also reveal the need to reduce the number of membership functions. Under the assumption that the inference surface is in some sense smooth, adaptation can reveal overdetermination of the fuzzy system in two ways. First, if two membership functions come sufficiently close to each other, they can be fused into a single membership function. Second, if a membership function becomes too narrow, it can be deleted. In both cases, the number of fuzzy IF-THEN rules is reduced. In certain cases, the overall performance of the fuzzy system can be improved by this adaptive pruning.
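To make the pointer property concrete, the following is a minimal sketch of one steepest-descent step, assuming Gaussian membership functions and a squared-error measure; the parameter values, learning rate, and `gaussian_mf` helper are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def gaussian_mf(x, c, sigma):
    """Gaussian membership function with center c and width sigma (assumed form)."""
    return np.exp(-0.5 * ((x - c) / sigma) ** 2)

# Illustrative parameters: three membership functions on one input variable.
centers = np.array([1.0, 2.5, 4.0])
sigmas = np.array([0.8, 0.8, 0.8])
x = 2.1        # crisp input
target = 0.9   # desired output of the max stage for this toy example
lr = 0.1       # steepest-descent step size

betas = gaussian_mf(x, centers, sigmas)   # beta_1, ..., beta_N
alpha = betas.max()                       # alpha = max[beta_1, ..., beta_N]
winner = betas.argmax()                   # index of the function the max "points" to

# Pointer property: d(alpha)/d(beta_n) = 1 for the maximizer and 0 otherwise.
dalpha_dbeta = np.zeros_like(betas)
dalpha_dbeta[winner] = 1.0

# Chain rule for a squared error E = 0.5 * (alpha - target)^2:
# dE/dc_n = (alpha - target) * dalpha/dbeta_n * dbeta_n/dc_n,
# so only the winning membership function's center receives a nonzero update.
dbeta_dc = betas * (x - centers) / sigmas ** 2
grad_centers = (alpha - target) * dalpha_dbeta * dbeta_dc
centers -= lr * grad_centers
```

In a full min-max inference system the same pointer behavior applies at the min stage, so the backpropagated error reaches only the membership functions that actually determined the output.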
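The fusion and deletion checks described at the end of the abstract could be sketched as follows; the closeness and narrowness thresholds (`FUSE_DIST`, `MIN_SIGMA`) and the parameter-averaging rule for fusion are assumptions for illustration, not the paper's criteria.

```python
import numpy as np

FUSE_DIST = 0.2   # assumed threshold: fuse functions whose centers are this close
MIN_SIGMA = 0.05  # assumed threshold: delete functions narrower than this

def prune(centers, sigmas):
    """Fuse near-duplicate membership functions and drop overly narrow ones."""
    kept_c, kept_s = [], []
    for c, s in sorted(zip(centers, sigmas)):
        if s < MIN_SIGMA:
            continue  # too narrow: delete this membership function (and its rule)
        if kept_c and abs(c - kept_c[-1]) < FUSE_DIST:
            # Fuse with the previous function by averaging parameters.
            kept_c[-1] = 0.5 * (kept_c[-1] + c)
            kept_s[-1] = 0.5 * (kept_s[-1] + s)
        else:
            kept_c.append(c)
            kept_s.append(s)
    return np.array(kept_c), np.array(kept_s)

# Example: the second and third functions are close enough to fuse,
# and the last one has collapsed in width and is deleted.
centers, sigmas = prune(np.array([1.0, 2.45, 2.55, 4.0]),
                        np.array([0.8, 0.7, 0.6, 0.01]))
```

Each fused or deleted membership function removes the corresponding fuzzy IF-THEN rules, which is how the adaptive pruning reduces the rule count.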
