Abstract
Randomized sampling-based ensemble learning is emerging as an alternative to deep neural networks (DNNs) because it supports diversity and locality and does not require backpropagation in the learning process. By connecting randomized weak classifiers in a layer-by-layer manner, it is possible to create models with performance similar to that of DNNs, but with less overfitting and better generalizability. Therefore, in this paper, we propose a deep random ferns (d-RFs) model, in which extremely randomized ferns are connected in multiple layers, allowing high classification performance in a lightweight and fast structure. The input vector is first encoded as a transformed feature vector in the feature encoder layer and then fed into the cascade layers. The feature encoding process is similar to DNN convolution and helps improve the performance of the final output layer. Unlike in the backpropagation paradigm, the cascade layer adjusts the number of ferns and layers required for the d-RFs adaptively, using only a small amount of data. The RFs ensemble approach has considerably fewer hyper-parameters than a DNN or deep forest model, and its complexity can be determined automatically in a data-dependent manner. In addition, experimental results show that an ensemble of multiple weak classifiers reduces the bias between models by averaging weakly correlated classifiers, resulting in a better overall model. The proposed lightweight d-RFs model was successfully applied to benchmark datasets and yielded similar or better accuracy with fewer parameters and operations compared with state-of-the-art methods.
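To make the mechanism concrete, the following is a minimal illustrative sketch of a single random fern, not the authors' implementation: a fern applies a fixed set of random binary tests to the input, the test outcomes form a leaf index, and a class histogram is accumulated at each leaf. All names, the threshold scheme, and the Laplace smoothing here are assumptions for illustration.

```python
import numpy as np

class RandomFern:
    """Illustrative random fern (hypothetical sketch): n_tests random
    binary tests map each input to one of 2**n_tests leaves, where
    class histograms are accumulated and later normalized."""

    def __init__(self, n_tests, n_features, n_classes, rng):
        self.idx = rng.integers(0, n_features, size=n_tests)   # feature index per binary test
        self.thr = rng.uniform(0.0, 1.0, size=n_tests)         # random threshold per test
        self.hist = np.ones((2 ** n_tests, n_classes))         # leaf histograms, Laplace-smoothed

    def _leaf(self, X):
        # Each binary test contributes one bit of the leaf index.
        bits = (X[:, self.idx] > self.thr).astype(int)
        return bits @ (1 << np.arange(len(self.idx)))

    def fit(self, X, y):
        # Unbuffered accumulation so repeated leaf indices all count.
        np.add.at(self.hist, (self._leaf(X), y), 1.0)
        return self

    def predict_proba(self, X):
        h = self.hist[self._leaf(X)]
        return h / h.sum(axis=1, keepdims=True)
```

An ensemble prediction is then simply the average of `predict_proba` over many such ferns, which is the averaging of weakly correlated classifiers the abstract refers to.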
Highlights
In the field of machine learning and computer vision, noteworthy progress has been made in the area of deep neural networks (DNNs) in recent years; because of their limitations, new types of deep models are required
Deep random ferns (d-RFs) assign only 55 ferns to an individual layer and use one feature encoder with two RF layers to reduce the number of parameters
We propose a method for constructing lightweight deep random ferns (d-RFs), in which each neuron of a layer is composed of individual RFs and each layer is considered a type of RF
Summary
In the field of machine learning and computer vision, noteworthy progress has been made in the area of deep neural networks (DNNs) in recent years; because of their limitations, new types of deep models are required.
A. CONTRIBUTIONS OF THIS WORK
In this study, we replace the ensemble forests in the model layer with RFs. RFs are based on a completely randomized training process and perform a binary test in a simple fashion, unlike in [1] and [2], where a split function is calculated at every node. For a more efficient d-RFs model, we update the structure and learning process of the previous version to improve the accuracy of a model that can run on a CPU and be compared with DNN-based methods using a small number of parameters, operations, and training data.
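The adaptive cascade described above can be sketched as follows. This is a hedged illustration of the general idea, not the paper's exact procedure: the function names, the stopping rule, and the feature-augmentation scheme (appending each layer's class probabilities to the raw input, as in deep forest models) are assumptions for illustration.

```python
import numpy as np

def grow_cascade(X_tr, y_tr, X_va, y_va, make_layer, max_layers=10):
    """Illustrative adaptive cascade: keep adding layers while held-out
    accuracy improves; each layer's class-probability output is appended
    to the raw features before training the next layer."""
    best_acc, layers = 0.0, []
    F_tr, F_va = X_tr, X_va
    for _ in range(max_layers):
        layer = make_layer()                      # e.g. an ensemble of random ferns
        layer.fit(F_tr, y_tr)
        P_tr = layer.predict_proba(F_tr)
        P_va = layer.predict_proba(F_va)
        acc = (P_va.argmax(1) == y_va).mean()
        if acc <= best_acc:                       # stop when a new layer adds nothing
            break
        best_acc, layers = acc, layers + [layer]
        F_tr = np.hstack([X_tr, P_tr])            # augment inputs for the next layer
        F_va = np.hstack([X_va, P_va])
    return layers, best_acc
```

Because the depth is chosen by this data-dependent stopping rule rather than fixed in advance, model complexity adapts to the dataset without backpropagation, which is the behavior the summary attributes to the cascade layer.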