Abstract

We first argue that the extension principle is too computationally involved to be an efficient way for a computer to evaluate fuzzy functions. We then suggest using α-cuts and interval arithmetic to compute the values of fuzzy functions. Using this method of computing fuzzy functions, we then show that neural nets are universal approximators for (computable) fuzzy functions, when we input only non-negative, or non-positive, fuzzy numbers.
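The α-cut approach described above can be sketched in a few lines of code. The example below is a hypothetical illustration (the names `alpha_cut`, `interval_square`, and `fuzzy_square` are not from the paper): it evaluates f(x) = x² on a triangular fuzzy number by squaring each α-cut interval, relying on the non-negativity restriction mentioned in the abstract so that x ↦ x² is monotone on each cut.

```python
# Sketch: evaluating f(x) = x^2 on a triangular fuzzy number via
# alpha-cuts and interval arithmetic. Names and representation
# (a triangular fuzzy number as a triple (a, b, c)) are illustrative
# assumptions, not taken from the paper.

def alpha_cut(a, b, c, alpha):
    """Alpha-cut [lo, hi] of the triangular fuzzy number (a, b, c)."""
    return (a + alpha * (b - a), c - alpha * (c - b))

def interval_square(lo, hi):
    """Image of [lo, hi] under x -> x^2, valid for non-negative intervals."""
    return (lo * lo, hi * hi)

def fuzzy_square(a, b, c, levels=5):
    """Approximate the fuzzy image of (a, b, c) under x -> x^2
    by squaring each of a finite family of alpha-cut intervals."""
    cuts = {}
    for i in range(levels + 1):
        alpha = i / levels
        lo, hi = alpha_cut(a, b, c, alpha)
        cuts[alpha] = interval_square(lo, hi)
    return cuts

# Example: the fuzzy number "about 2", represented as (1, 2, 3).
for alpha, (lo, hi) in fuzzy_square(1.0, 2.0, 3.0).items():
    print(f"alpha={alpha:.1f}: [{lo:.2f}, {hi:.2f}]")
```

Because each α-cut is a closed interval and f is evaluated with ordinary interval arithmetic, this sidesteps the sup-over-preimages computation that the extension principle requires.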
