Abstract

Convolutional neural networks, like most artificial neural networks, are often viewed as fundamentally different from kernel-based methods. In this work we translate several classical convolutional neural networks into kernel-based counterparts. Each counterpart is a statistical model called a convolutional kernel network, whose parameters can be learned from data. We provide an alternating minimization algorithm with mini-batch sampling and implicit partial differentiation to learn these parameters from data. We also show how to obtain inexact derivatives with respect to the parameters using an algorithm based on two intertwined Newton iterations. The models and algorithms are illustrated on benchmark image-classification datasets, where we find that convolutional neural networks and their kernel counterparts often perform similarly. Supplemental appendices and code for the article are available online.
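To illustrate the flavor of alternating minimization with mini-batch sampling mentioned above, the following is a minimal sketch on a toy bilinear least-squares problem. It is not the paper's model or algorithm: the objective, the parameterization `w = u * v`, and the step sizes are all assumptions chosen only to show how two parameter blocks can be updated alternately on sampled mini-batches.

```python
import numpy as np

# Toy illustration (NOT the paper's convolutional kernel network):
# alternating minimization with mini-batch sampling on
#   min_{u, v}  (1/2n) ||X (u * v) - y||^2,
# where the parameters are split into two blocks u and v, and fixing
# one block makes the subproblem in the other block convex.

rng = np.random.default_rng(0)
n, d = 200, 10
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true

u = rng.normal(size=d)
v = rng.normal(size=d)

def loss(u, v):
    r = X @ (u * v) - y
    return 0.5 * np.mean(r ** 2)

loss_init = loss(u, v)
step, batch = 0.05, 32
for it in range(500):
    idx = rng.choice(n, size=batch, replace=False)  # mini-batch sampling
    Xb, yb = X[idx], y[idx]
    # Block 1: gradient step in u with v held fixed
    r = Xb @ (u * v) - yb
    u -= step * (Xb.T @ r) * v / batch
    # Block 2: gradient step in v with u held fixed
    r = Xb @ (u * v) - yb
    v -= step * (Xb.T @ r) * u / batch

final = loss(u, v)
```

In the paper's setting each block update is far richer (kernel feature maps, implicit differentiation through inner solvers), but the outer loop structure, alternating over parameter blocks using sampled mini-batches, is the same idea.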
