Abstract

Training deep neural networks usually relies on backpropagation (BP) of error gradients, which requires long training times even on high-performance computing hardware and considerable expertise from the experimenter. To accelerate and simplify the training of deep networks, non-iterative training procedures have been proposed. Here we present another non-iterative approach, the Feedforward Convolutional Conceptor Neural Network (FCCNN), for training feedforward networks on image classification tasks. Our work makes two major contributions: (1) a conceptor-based classifier designed for non-temporal data; (2) a simple non-iterative neural network model. The architecture combines a Convolutional Neural Network (CNN) whose filters are learned in an unsupervised manner via Principal Component Analysis (PCA), binary thresholding, and the proposed non-temporal conceptor classifier. Across experiments on the MNIST variation datasets, FCCNN achieves classification accuracy comparable to state-of-the-art methods while requiring significantly less training time.
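
As a rough illustration of the conceptor-based classification idea referenced in the abstract (a generic sketch, not the paper's exact FCCNN formulation), the following NumPy code assumes one conceptor C = R(R + α⁻²I)⁻¹ is computed per class from that class's feature vectors, and a test sample x is assigned to the class with the largest positive evidence xᵀCx. The function names and the aperture value are illustrative assumptions.

```python
import numpy as np

def compute_conceptor(X, aperture=10.0):
    """Conceptor C = R (R + aperture^-2 I)^-1 from feature matrix X (n_samples x n_features).

    Assumption: features are non-temporal vectors (e.g. flattened CNN outputs),
    so R is simply their correlation matrix.
    """
    R = X.T @ X / X.shape[0]                     # feature correlation matrix
    n = R.shape[0]
    return R @ np.linalg.inv(R + aperture**-2 * np.eye(n))

def fit_conceptor_classifier(features, labels, aperture=10.0):
    """Learn one conceptor per class, non-iteratively (no gradient descent)."""
    return {c: compute_conceptor(features[labels == c], aperture)
            for c in np.unique(labels)}

def predict(conceptors, x):
    """Classify x by the conceptor giving the largest positive evidence x^T C x."""
    return max(conceptors, key=lambda c: x @ conceptors[c] @ x)
```

In this sketch, training reduces to one matrix inversion per class, which is why a conceptor classifier can be fitted in a single pass without iterative weight updates.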
