Abstract

Deep learning suffers from several well-known problems, including slow convergence, over-fitting, and long training times. To alleviate these problems, the Broad Learning System (BLS), an alternative learning framework with a non-iterative training mechanism, was proposed: it randomly assigns the parameters of the hidden nodes, freezes them throughout training, and then obtains the output weights using ridge regression. This training scheme gives BLS very high training efficiency. However, solving for the output weights with ridge regression does not guarantee a stable solution in many cases, especially when the number of training samples is large, which can cause over-fitting and instability in BLS models. To address this problem, we propose an improved BLS with a dense architecture in which the ridge regression step is replaced by Proportional-Integral-Differential (PID) control and Adaptive moment estimation (Adam). The new algorithm, called PID-A-DBLS, has two advantages: 1) the dense architecture improves the feature extraction ability of the model; 2) solving the output weights with PID and Adam avoids the drawbacks of ridge regression. Extensive experiments on four benchmark data sets show that PID-A-DBLS achieves much better generalization ability and stability than BLS and its variants.
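To make the contrast concrete, the sketch below (not the authors' code; all names, sizes, and hyperparameters are illustrative assumptions) shows the standard BLS closed-form ridge regression solution for the output weights, W = (AᵀA + λI)⁻¹AᵀY, alongside a plain Adam-based iterative solve of the same least-squares objective, i.e. the kind of gradient-based replacement the abstract describes (the proposed method additionally incorporates PID terms, which are omitted here):

```python
import numpy as np

rng = np.random.default_rng(0)

def ridge_output_weights(A, Y, lam=1e-2):
    """Standard BLS step: closed-form ridge regression for the output weights."""
    n_nodes = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n_nodes), A.T @ Y)

def adam_output_weights(A, Y, lr=0.05, steps=1000, b1=0.9, b2=0.999, eps=1e-8):
    """Illustrative alternative: minimize ||A W - Y||^2 with Adam instead of
    the closed form (the paper's method also adds PID control, not shown)."""
    W = np.zeros((A.shape[1], Y.shape[1]))
    m = np.zeros_like(W)
    v = np.zeros_like(W)
    n = A.shape[0]
    for t in range(1, steps + 1):
        g = (2.0 / n) * A.T @ (A @ W - Y)   # gradient of the mean-squared error
        m = b1 * m + (1 - b1) * g           # first-moment estimate
        v = b2 * v + (1 - b2) * g ** 2      # second-moment estimate
        m_hat = m / (1 - b1 ** t)           # bias correction
        v_hat = v / (1 - b2 ** t)
        W -= lr * m_hat / (np.sqrt(v_hat) + eps)
    return W

# Toy setup: frozen random hidden weights, as in BLS; targets are realizable.
X = rng.standard_normal((200, 5))           # input samples
W_in = rng.standard_normal((5, 30))         # randomly assigned, frozen
A = np.tanh(X @ W_in)                       # hidden-layer output matrix
Y = A @ rng.standard_normal((30, 2))        # targets

W_ridge = ridge_output_weights(A, Y)
W_adam = adam_output_weights(A, Y)
```

The closed form is a single linear solve, which is where BLS's training speed comes from; the iterative solve trades that speed for the stability benefits the abstract claims when AᵀA is ill-conditioned or the sample count is large.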
