Abstract
Convolutional Neural Networks (CNNs) are a class of artificial neural networks designed to extract features and identify patterns for tasks such as segmentation, object recognition, and classification. Within a CNN architecture, pooling operations are used to reduce the number of parameters and the computational complexity. Numerous studies have investigated the impact of pooling on CNN performance, leading to the development of various pooling models. Recently, a fuzzy pooling operation based on type-1 fuzzy sets was introduced to cope with the local imprecision of feature maps. However, in fuzzy set theory it is not always accurate to assume that the degree of non-membership of an element in a fuzzy set is simply the complement of its degree of membership, because a hesitation degree may exist, implying an additional level of uncertainty. To overcome this limitation, intuitionistic fuzzy sets (IFS) were introduced to incorporate a degree of hesitation. In this paper, we introduce a novel pooling operation based on intuitionistic fuzzy sets that incorporates the degree of hesitation heretofore neglected by fuzzy pooling operations based on classical fuzzy sets, and we investigate its performance in the context of image classification. Intuitionistic pooling is performed in four steps: bifuzzification (transformation of the data through membership and non-membership maps), first aggregation (transformation of the IFS into a standard fuzzy set), second aggregation (application of a sum operator), and defuzzification of feature map neighborhoods using a max operator. IFS pooling is used to construct an intuitionistic pooling layer that can serve as a drop-in replacement for the current fuzzy (type-1) and crisp pooling layers of CNN architectures. Experiments on multiple datasets demonstrate that IFS-based pooling can enhance the classification performance of a CNN. A benchmarking study reveals that it significantly outperforms even the most recent pooling models, especially in stochastic environments.
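To make the four steps concrete, the following NumPy sketch pools a single feature-map neighborhood. The particular membership and non-membership functions, the transform that collapses the IFS into a standard fuzzy set, and the way the max operator selects the output value are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def mu(patch):
    # Assumed membership function: min-max normalisation of the neighborhood.
    rng = patch.max() - patch.min()
    return (patch - patch.min()) / rng if rng > 0 else np.zeros_like(patch)

def nu(patch, lam=0.5):
    # Assumed non-membership function (Sugeno-type complement); it leaves a
    # hesitation degree pi = 1 - mu - nu >= 0 rather than the strict complement.
    m = mu(patch)
    return (1.0 - m) / (1.0 + lam * m)

def ifs_pool(patch):
    """Intuitionistic fuzzy pooling of one k x k neighborhood (sketch)."""
    # 1) Bifuzzification: membership and non-membership maps of the patch.
    m, n = mu(patch), nu(patch)
    pi = 1.0 - m - n                      # hesitation degree
    # 2) First aggregation: collapse the IFS into an ordinary fuzzy set,
    #    here by redistributing the hesitation (an assumed transform).
    f = m + 0.5 * pi
    # 3) Second aggregation: sum operator over the neighborhood's fuzzy values.
    score = f.sum()
    # 4) Defuzzification: max operator picks the pooled output
    #    (here, the patch value with the largest aggregated membership).
    return patch.flat[np.argmax(f)], score

# Usage: pool a 2x2 neighborhood of a feature map.
patch = np.array([[0.1, 0.7], [0.4, 0.9]])
value, score = ifs_pool(patch)
```

In a full layer, a routine of this kind would be applied to every non-overlapping k x k window of each feature-map channel, in place of a crisp max or average pooling window.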