Abstract

In this paper, the existing enhanced fuzzy min–max (EFMM) neural network is improved with a flexible learning procedure for pattern classification tasks. Four new contributions are introduced. Firstly, a new training strategy is proposed to avoid generating unnecessary overlapped regions between hyperboxes of different classes; it also simplifies the learning phase by eliminating the contraction procedure. Secondly, a new flexible expansion procedure is introduced, which eliminates the user-defined expansion coefficient for determining hyperbox sizes. Thirdly, a new overlap test rule is applied during the test phase to identify containment cases and activate the contraction procedure when necessary. Fourthly, a new contraction procedure is formulated to resolve containment cases while avoiding the data distortion problem. The third and fourth contributions are important for preventing catastrophic forgetting and for supporting the stability–plasticity principle in online learning. The performance of the proposed model is evaluated with benchmark data sets. The results demonstrate its efficiency in handling pattern classification tasks, outperforming related models in online learning environments.
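To make the hyperbox operations referenced above concrete, the following is a minimal sketch of the classical fuzzy min–max building blocks that EFMM variants build on: Simpson's membership function for a hyperbox defined by min/max vertices, and the dimension-wise overlap check between two hyperboxes. The function names, the sensitivity value `gamma`, and the exact formulation are illustrative assumptions, not the paper's proposed procedures.

```python
import numpy as np

def membership(x, v, w, gamma=4.0):
    """Simpson-style fuzzy membership of point x in hyperbox [v, w].

    v, w : min and max vertices of the hyperbox (unit hypercube assumed).
    gamma: sensitivity parameter controlling how fast membership decays
           outside the box (value here is an illustrative choice).
    Returns 1.0 for points fully inside the box, decaying toward 0 outside.
    """
    above = np.maximum(0, 1 - np.maximum(0, gamma * np.minimum(1, x - w)))
    below = np.maximum(0, 1 - np.maximum(0, gamma * np.minimum(1, v - x)))
    return float((above + below).sum() / (2 * len(x)))

def overlaps(v1, w1, v2, w2):
    """True if two hyperboxes overlap in every dimension.

    Overlap in all dimensions is the condition that triggers the
    overlap test (and possibly contraction) between classes.
    """
    return bool(np.all(np.minimum(w1, w2) - np.maximum(v1, v2) > 0))
```

For example, a point inside a hyperbox receives full membership (1.0), and two boxes sharing an interior region in every dimension are flagged as overlapping, while boxes that are disjoint in even one dimension are not.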
