Abstract

The broad learning system (BLS) framework provides an efficient solution for training flat-structured feedforward networks and flat-structured deep neural networks. However, the classical BLS model and its variants consider the faultless situation only, in which the enhancement nodes, feature-mapped nodes, and output weights of a BLS network are assumed to be realized perfectly. When a trained BLS network suffers from coexisting weight/node failures, its performance degrades greatly if no countermeasure is taken. To reduce the effect of weight/node failures on the BLS network's performance, this paper proposes an objective function for enhancing the fault-aware performance of BLS networks. The objective function contains a fault-aware regularizer that handles the weight/node failures, and a learning algorithm is then derived from the objective function. Simulation results show that the proposed fault-aware BLS (FABLS) algorithm outperforms the classical BLS and two state-of-the-art BLS algorithms, namely the correntropy-criterion BLS (CBLS) and the weighted BLS (WBLS).
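To make the idea of a fault-aware regularizer concrete, the following sketch contrasts an ordinary ridge solution for the output weights with a noise-aware one. It uses a common formulation for zero-mean multiplicative weight noise (the expected training error picks up a `sigma2 * beta^T diag(A^T A) beta` term); this is an illustrative assumption, not necessarily the paper's exact objective, and all names (`A`, `sigma2`, `noisy_mse`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy regression data; A stands in for the concatenated feature-mapped and
# enhancement node outputs of a trained BLS network.
n, h = 300, 40
A = rng.standard_normal((n, h))
beta_true = rng.standard_normal((h, 1))
Y = A @ beta_true

sigma2 = 0.25   # variance of the multiplicative weight noise (assumed value)
lam = 1e-4      # small ridge term for numerical stability

G = A.T @ A

# Fault-unaware output weights: ordinary ridge regression.
beta_ls = np.linalg.solve(G + lam * np.eye(h), A.T @ Y)

# Fault-aware output weights: for zero-mean multiplicative noise b_i with
# variance sigma2 on each output weight, the expected training error is
# ||A beta - Y||^2 + sigma2 * beta^T diag(G) beta, giving a modified
# normal equation.
beta_fa = np.linalg.solve(G + sigma2 * np.diag(np.diag(G)) + lam * np.eye(h),
                          A.T @ Y)

def noisy_mse(beta, trials=300):
    """Average MSE when each output weight is perturbed to beta*(1 + b)."""
    errs = []
    for _ in range(trials):
        b = rng.normal(0.0, np.sqrt(sigma2), size=beta.shape)
        errs.append(np.mean((A @ (beta * (1 + b)) - Y) ** 2))
    return float(np.mean(errs))
```

Under this model, `beta_fa` trades a little clean-data accuracy for a markedly lower error once the weights are perturbed, which is the qualitative behaviour the abstract attributes to FABLS.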

Highlights

  • Without doubt, the human brain is capable of handling fault and noise situations [1]

  • As the idea of artificial neural networks (ANNs) is based on the human brain, it was presumed that a trained ANN should have an inherent ability to withstand weight/node failures

  • There is strong evidence that the fault-aware BLS (FABLS) is better than the weighted BLS (WBLS)

Summary

INTRODUCTION

The human brain is capable of handling fault and noise situations [1]. For instance, a human can recognize a partially occluded object without much difficulty. This paper studies the effect of coexisting weight/node failures on the BLS [13]. In [43], fault-tolerant methods based on diversifying learning were proposed to improve the dependability of deep neural networks (DNNs). In the original BLS learning, the connection weights between the inputs and the enhancement nodes are generated randomly. This paper studies the behaviour of the original BLS when it is concurrently affected by node noise, weight noise, and open weight fault. Based on the proposed objective function, a fault-aware BLS (FABLS) is developed to mitigate the effect of network failures. In the original BLS algorithm, the hyperbolic tangent is employed as the activation function for all enhancement nodes.
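The randomly generated enhancement weights and tanh activation described above can be sketched as a minimal fault-free BLS trainer. This is a simplified illustration (no sparse autoencoder fine-tuning of the feature weights, no incremental node addition); the function names and node counts are assumptions, not the paper's notation.

```python
import numpy as np

rng = np.random.default_rng(0)

def bls_train(X, Y, n_feature=20, n_enhance=50, lam=1e-3):
    """Minimal BLS: random feature-mapped nodes, tanh enhancement
    nodes, and ridge-regression output weights."""
    n, d = X.shape
    # Feature-mapped nodes: a random linear map of the input
    # (the original BLS additionally fine-tunes these weights).
    Wf = rng.standard_normal((d, n_feature))
    Z = X @ Wf
    # Enhancement nodes: random map of Z passed through tanh,
    # the activation used in the original BLS.
    We = rng.standard_normal((n_feature, n_enhance))
    H = np.tanh(Z @ We)
    A = np.hstack([Z, H])            # combined hidden-layer output
    # Output weights via regularized least squares (ridge regression).
    beta = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ Y)
    return Wf, We, beta

def bls_predict(X, Wf, We, beta):
    Z = X @ Wf
    A = np.hstack([Z, np.tanh(Z @ We)])
    return A @ beta

# Usage: fit a noisy linear target.
X = rng.standard_normal((200, 5))
Y = X @ rng.standard_normal((5, 1)) + 0.01 * rng.standard_normal((200, 1))
params = bls_train(X, Y)
pred = bls_predict(X, *params)
```

Because only `beta` is learned while `Wf` and `We` stay random, training reduces to a single linear solve, which is the source of the BLS framework's efficiency; it is also why failures in those fixed weights, the topic of this paper, are damaging unless the objective accounts for them.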

CONSTRUCTION OF WEIGHT MATRICES AND VECTORS
MULTIPLICATIVE NOISE AND OPEN WEIGHT FAULT IN BLS
MULTIPLICATIVE NOISE IN OUTPUT WEIGHTS β
NOISE STATISTICS OF THE WEIGHTED INPUTS TO THE OUTPUT NODES
CONCLUSION
