Abstract
We first argue that the extension principle is too computationally involved to be an efficient way for a computer to evaluate fuzzy functions. We then suggest using α-cuts and interval arithmetic to compute the values of fuzzy functions. Using this method of computing fuzzy functions, we show that neural nets are universal approximators for (computable) fuzzy functions when the inputs are restricted to non-negative, or non-positive, fuzzy numbers.
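To illustrate the α-cut approach described above, the following Python sketch (not code from the paper) evaluates a fuzzy function level by level: each α-cut of the input is an interval, and interval arithmetic gives the corresponding α-cut of the output. The triangular fuzzy number and the example function f(x) = x² are illustrative assumptions; the restriction to non-negative inputs makes f monotone on each cut, so the interval computation is exact.

```python
def alpha_cut_triangular(a, b, c, alpha):
    """Alpha-cut [left, right] of a triangular fuzzy number (a, b, c)."""
    return (a + alpha * (b - a), c - alpha * (c - b))

def interval_square_nonneg(lo, hi):
    """Interval arithmetic for f(x) = x^2 when 0 <= lo <= hi:
    f is increasing there, so the image interval is [lo^2, hi^2]."""
    return (lo * lo, hi * hi)

# Evaluate f on the non-negative triangular fuzzy number (1, 2, 3)
# at a few alpha levels; each output interval is an alpha-cut of f(A).
for alpha in (0.0, 0.5, 1.0):
    lo, hi = alpha_cut_triangular(1.0, 2.0, 3.0, alpha)
    print(alpha, interval_square_nonneg(lo, hi))
```

Compared with the extension principle, which requires an optimization over all preimages at every membership level, this reduces the evaluation to a small number of interval computations, one per chosen α level.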