Abstract

Deep learning has developed rapidly in recent years and has shown excellent performance in many fields. However, deep networks adapt poorly to small sample sizes: they usually require tens of thousands of samples to avoid overfitting. In contrast, kernel methods can handle small samples, and in this paper we train a successful classifier with only dozens of samples. Deep multiple kernel learning (DMKL), which has emerged in recent years, combines the ideas of deep learning and multiple kernel learning. Unfortunately, a DMKL network with a fixed structure cannot adapt to data of different dimensions and sizes. We therefore propose a novel depth-width-scaling multiple kernel learning (DWS-MKL) algorithm that adjusts its architecture according to the input data. We optimize the estimate of the leave-one-out error by using the span bound instead of the dual objective function. Finally, we evaluate DWS-MKL on a large number of UCI benchmark data sets for classification tasks, as well as on the MNIST data set. The experiments show that different frameworks perform differently on the same data set, and a thorough comparison shows that DWS-MKL obtains better classification results than state-of-the-art kernel learning algorithms. These encouraging results demonstrate that our method has the potential to improve the generalization performance of the model.
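The abstract does not give implementation details, but the core idea of multiple kernel learning is to replace a single kernel with a weighted combination of base kernels. The following is a minimal sketch of that idea only, not the paper's DWS-MKL algorithm; the function names, kernel choices, and equal weights are illustrative assumptions. A convex combination of valid kernels is itself a valid (symmetric, positive semidefinite) kernel, which is what makes it usable in an SVM-style learner.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel: exp(-gamma * ||x - y||^2)
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def poly_kernel(X, Y, degree=2, coef0=1.0):
    # Polynomial kernel: (x . y + coef0)^degree
    return (X @ Y.T + coef0) ** degree

def combined_kernel(X, Y, weights=(0.5, 0.5)):
    # Convex combination of base kernels -- the basic MKL construction.
    # In real MKL the weights are learned from data; here they are fixed.
    w_rbf, w_poly = weights
    return w_rbf * rbf_kernel(X, Y) + w_poly * poly_kernel(X, Y)

rng = np.random.default_rng(0)
X = rng.normal(size=(10, 3))
K = combined_kernel(X, X)

# The combined Gram matrix is symmetric positive semidefinite,
# so it remains a valid kernel for a downstream classifier.
print(K.shape)                              # (10, 10)
print(bool(np.allclose(K, K.T)))            # True
print(bool(np.linalg.eigvalsh(K).min() > -1e-8))  # True
```

A deep MKL network stacks such combinations in layers; the paper's contribution is letting the depth and width of that stack scale with the input data rather than fixing them in advance.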
