Abstract

The conventional training mechanism for deep learning, which is based on gradient descent, suffers from notorious issues such as slow convergence, over-fitting, and long training times. To alleviate these problems, Prof. C. L. Philip Chen proposed the Broad Learning System (BLS), a deep learning algorithm with a different learning mechanism, in 2017. BLS randomly selects the parameters of its feature nodes and enhancement nodes during training and uses ridge regression to solve for its output weights. Because of its high efficiency, BLS has been widely used in many fields. However, a fundamental problem remains unsolved: the appropriate value of the parameter λ in the ridge regression step of BLS is difficult to set properly, which often leads to over-fitting and seriously limits the development of BLS. To solve this problem, we propose a novel Dense BLS based on Conjugate Gradient (CG-DBLS) in this paper, in which each feature node is connected to the other feature nodes and each enhancement node is connected to the other enhancement nodes in a feed-forward fashion. The recursive least squares method and the conjugate gradient method are used to calculate the output weights of the feature nodes and enhancement nodes, respectively. Experimental studies on four benchmark regression problems from the UCI repository show that CG-DBLS achieves much lower error and much higher stability than BLS and its variants.
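The two solvers the abstract contrasts can be sketched numerically. The ridge regression step of BLS computes the output weights as W = (AᵀA + λI)⁻¹AᵀY, where A stacks the feature- and enhancement-node outputs; a conjugate gradient solver reaches the same weights without forming the inverse. The sketch below is illustrative only: the matrix shapes, the value of λ, and the generic CG routine are assumptions, since the abstract does not give the concrete CG-DBLS update rules.

```python
import numpy as np

# Toy stand-in for BLS quantities (shapes and values are illustrative
# assumptions, not taken from the paper): A collects the outputs of the
# feature and enhancement nodes, Y holds the regression targets.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 20))   # 100 samples, 20 node activations
Y = rng.standard_normal((100, 3))    # 3 regression outputs
lam = 1e-3                           # the ridge parameter λ the abstract discusses

# Ridge-regression output weights: W = (A^T A + λI)^{-1} A^T Y.
M = A.T @ A + lam * np.eye(A.shape[1])
B = A.T @ Y
W_ridge = np.linalg.solve(M, B)

# A standard conjugate-gradient solver for the same symmetric positive
# definite normal equations, avoiding the explicit inverse.
def conjugate_gradient(M, b, tol=1e-10, max_iter=200):
    x = np.zeros_like(b)
    r = b - M @ x          # residual
    p = r.copy()           # search direction
    rs = r @ r
    for _ in range(max_iter):
        Mp = M @ p
        alpha = rs / (p @ Mp)
        x = x + alpha * p
        r = r - alpha * Mp
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Solve column by column; the CG solution matches the closed-form one.
W_cg = np.column_stack([conjugate_gradient(M, B[:, j]) for j in range(B.shape[1])])
print(np.allclose(W_ridge, W_cg))  # True
```

Note that CG never materializes (AᵀA + λI)⁻¹, which is one practical reason an iterative solver can sidestep the numerical sensitivity that the choice of λ introduces into the direct ridge solution.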
