Abstract

Artificial neural networks (ANNs) are powerful computational tools designed to mimic the human brain and adopted to solve a variety of problems in many different fields. Fault tolerance (FT), an important property of ANNs, ensures their reliability when significant portions of a network are lost. In this paper, a fault/noise injection-based (FIB) genetic algorithm (GA) is proposed to construct fault-tolerant ANNs. The FT performance of the FIB-GA was compared with that of a common genetic algorithm, the back-propagation algorithm, and the modification of weights algorithm. The FIB-GA showed a slower fitting speed when solving the exclusive OR (XOR) problem and the overlapping classification problem, but it significantly reduced the errors in cases of single or multiple faults in ANN weights or nodes. Further analysis revealed that the fit weights showed no correlation with the fitting errors in the ANNs constructed with the FIB-GA, suggesting a relatively even distribution of the various fitting parameters. In contrast, the output weights of ANNs trained with the other three algorithms demonstrated a positive correlation with the errors. Our findings therefore indicate that a combination of the fault/noise injection-based method and a GA is capable of introducing FT to ANNs, and imply that distributed ANNs demonstrate superior FT performance.

Electronic supplementary material: The online version of this article (doi:10.1007/s13238-016-0302-5) contains supplementary material, which is available to authorized users.
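The fault/noise injection idea can be illustrated with a minimal sketch (a hypothetical illustration, not the authors' implementation): a GA evaluates each candidate weight vector for a small 2-2-1 XOR network on several randomly fault-injected copies, so selection favors weight sets that stay accurate when individual weights are stuck at zero. The network topology, fault model, and GA parameters below are assumptions chosen for brevity.

```python
import random
import math

random.seed(0)

# XOR training data: ((x1, x2), target)
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(w, x):
    # Assumed 2-2-1 network; w holds 9 parameters (6 hidden, 3 output, incl. biases)
    h1 = sigmoid(w[0] * x[0] + w[1] * x[1] + w[2])
    h2 = sigmoid(w[3] * x[0] + w[4] * x[1] + w[5])
    return sigmoid(w[6] * h1 + w[7] * h2 + w[8])

def inject_faults(w, p=0.1):
    # Stuck-at-zero fault model: each weight is zeroed with probability p
    return [0.0 if random.random() < p else wi for wi in w]

def fitness(w, trials=5):
    # Average squared XOR error over several fault-injected copies of the network,
    # so that fitness rewards robustness rather than only fault-free accuracy
    err = 0.0
    for _ in range(trials):
        wf = inject_faults(w)
        err += sum((forward(wf, x) - y) ** 2 for x, y in XOR)
    return err / trials

def evolve(pop_size=40, generations=200):
    # Simple elitist GA: keep the better half, refill with Gaussian mutations
    pop = [[random.uniform(-4, 4) for _ in range(9)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]
        children = [[wi + random.gauss(0, 0.3) for wi in p] for p in parents]
        pop = parents + children
    return min(pop, key=fitness)
```

In the paper's comparison the faults (single or multiple faults in weights or nodes) are applied at test time to all four training schemes; here the same stuck-at-zero model is folded directly into the fitness function, which is the essence of fault/noise injection-based training.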

Highlights

  • The brain is composed of biological neural networks (BNNs) that contain billions of interconnecting neurons with the ability to perform computations

  • All the output neuron weights in artificial neural networks (ANNs) trained with the common genetic algorithm (GA), back-propagation (BP), and modification of weights (MW) algorithms strongly correlated with the errors (Fig. 1D); those in ANNs trained with the fault/noise injection-based (FIB)-GA did not. These results show that in ANN training with the FIB-GA, no parameter set correlated with the fitting error, implying there is no dominant parameter in ANNs trained with an FIB-GA

  • Since the performance of ANNs relies heavily on the number of nodes in the hidden layer (Xu and Xu, 2013; Sasakawa et al., 2014), we investigated whether the number of hidden neurons could affect the fault tolerance (FT) performance of the four ANNs in solving the exclusive OR (XOR) problem

Introduction

The brain is composed of biological neural networks (BNNs) that contain billions of interconnecting neurons with the ability to perform computations. Artificial neural networks (ANNs), mathematical models that mimic BNNs, are typically built as structured node groups with activation functions and connection weights that are adjusted according to the applied learning rules (Hampson, 1991, 1994; Basheer and Hajmeer, 2000; Krogh, 2008). Fault tolerance (FT) performance has become increasingly important, partly because soft errors caused by transient faults are an unavoidable concern in very large-scale integration (VLSI) technology, whose dimensions are approaching the nanoscale (Mahdiani et al., 2012).
