Abstract

Fuzzy neural networks (FNNs), with suitable structures, have been demonstrated to be an effective tool for approximating nonlinearity between input and output variables. However, it is time-consuming to construct an FNN with an appropriate number of fuzzy rules to ensure its generalization ability. To solve this problem, an efficient optimization technique is introduced in this paper. First, a self-adaptive structural optimal algorithm (SASOA) is developed to minimize the structural risk of an FNN, leading to improved generalization performance. Second, with the proposed SASOA, the fuzzy rules of the SASOA-based FNN (SASOA-FNN) are generated or pruned systematically. This SASOA-FNN is able to organize its structure and adjust its parameters simultaneously during the learning process. Third, the convergence of SASOA-FNN is proved for the cases with fixed and updated structures, and guidelines for selecting the parameters are given. Finally, experimental studies of the proposed SASOA-FNN have been performed on several nonlinear systems to verify its effectiveness. Comparisons with other existing methods demonstrate that the proposed SASOA-FNN achieves better performance.
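To make the kind of model described above concrete, the following is a minimal sketch of a generic zero-order TSK-style fuzzy neural network with a naive rule grow/prune step, written in Python with NumPy. It is not the authors' SASOA algorithm; the class name `SimpleFNN`, the thresholds `err_grow` and `act_prune`, and the grow/prune heuristics are hypothetical illustration choices, shown only to clarify what "generating or pruning fuzzy rules while adjusting parameters" can look like in practice.

```python
import numpy as np

class SimpleFNN:
    """Illustrative zero-order TSK fuzzy network with naive rule grow/prune.
    This is a sketch under assumed design choices, not the paper's SASOA."""

    def __init__(self, n_inputs, sigma=0.5):
        self.n_inputs = n_inputs
        self.sigma = sigma
        self.centers = np.empty((0, n_inputs))   # Gaussian rule centers
        self.weights = np.empty(0)               # constant rule consequents
        self.avg_act = np.empty(0)               # running mean firing strength

    def _firing(self, x):
        # Product of Gaussian memberships = exp of squared distance to center
        if self.weights.size == 0:
            return np.empty(0)
        d2 = np.sum((self.centers - x) ** 2, axis=1)
        return np.exp(-d2 / (2.0 * self.sigma ** 2))

    def predict(self, x):
        phi = self._firing(x)
        if phi.size == 0 or phi.sum() < 1e-12:
            return 0.0
        return float(np.dot(phi / phi.sum(), self.weights))

    def fit_step(self, x, y, lr=0.1, err_grow=0.3, act_prune=1e-3, beta=0.01):
        """One online sample: update consequents, then grow or prune rules."""
        phi = self._firing(x)
        err = y - self.predict(x)
        if phi.size > 0 and phi.sum() > 1e-12:
            norm = phi / phi.sum()
            self.weights += lr * err * norm       # gradient step on consequents
            self.avg_act = (1 - beta) * self.avg_act + beta * norm
        if abs(err) > err_grow or phi.size == 0:
            # Grow: add a rule centered at the poorly covered sample
            self.centers = np.vstack([self.centers, x])
            self.weights = np.append(self.weights, y)
            # Optimistic initial activation so new rules are not pruned at once
            self.avg_act = np.append(self.avg_act, 1.0)
        else:
            # Prune: drop rules whose long-run contribution stays negligible
            keep = self.avg_act > act_prune
            if 0 < keep.sum() < self.weights.size:
                self.centers = self.centers[keep]
                self.weights = self.weights[keep]
                self.avg_act = self.avg_act[keep]
        return err


# Example: approximate y = sin(x) online
rng = np.random.default_rng(0)
fnn = SimpleFNN(n_inputs=1)
for _ in range(3000):
    x = rng.uniform(-3.0, 3.0, size=1)
    fnn.fit_step(x, np.sin(x[0]))
print("rules:", fnn.weights.size, "f(1.0) ~", round(fnn.predict(np.array([1.0])), 3))
```

In this toy setup, rules are added when the current prediction error is large (poor coverage of the input region) and removed when their averaged firing strength remains small, which mirrors, at a very coarse level, the structure-organizing behavior the abstract attributes to SASOA-FNN; the paper's actual criterion is based on minimizing structural risk, which this sketch does not implement.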
