Abstract

Deep neural networks (DNNs) have achieved state-of-the-art performance in numerous fields. However, DNNs are computationally expensive, and there is a persistent demand for better performance at lower computational cost. We therefore study the human somatosensory system and design a neural network, SpinalNet, to achieve higher accuracy with fewer computations. Hidden layers in traditional NNs receive inputs from the previous layer, apply an activation function, and transfer the outcomes to the next layer. In the proposed SpinalNet, each layer is divided into three splits: 1) an input split, 2) an intermediate split, and 3) an output split. The input split of each layer receives a portion of the network inputs. The intermediate split of each layer receives the outputs of the previous layer's intermediate split together with the outputs of the current layer's input split. As a result, the number of incoming weights is significantly lower than in traditional DNNs. SpinalNet can also serve as the fully connected or classification layer of a DNN and supports both traditional learning and transfer learning. We observe significant error reductions at lower computational cost in most of the DNNs tested. Traditional learning on the VGG-5 network with SpinalNet classification layers achieved state-of-the-art (SOTA) performance on the QMNIST, Kuzushiji-MNIST, and EMNIST (Letters, Digits, and Balanced) datasets. Traditional learning with ImageNet pre-trained initial weights and SpinalNet classification layers achieved SOTA performance on the STL-10, Fruits 360, Bird225, and Caltech-101 datasets. The training scripts for the proposed SpinalNet are available at https://github.com/dipuk0506/SpinalNet

Impact Statement: Research in deep neural networks (DNNs) has gained significant attention from industry and academia due to their eye-catching performance. DNNs have enabled machines to perform, with high accuracy, myriad tasks that once only humans could do.
Several researchers have recently proposed different types of NNs and achieved high accuracy. The recent success of biologically inspired convolutional neural networks and the remarkable spinal architecture of humans have motivated us to develop a neural network with gradual inputs. We have achieved superior performance on several datasets. Since the first online appearance of the initial version of this paper, several researchers have applied the proposed network to new datasets and reported promising results. We expect numerous novel applications of SpinalNet in the coming years.
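To make the weight-reduction claim concrete, the following is a minimal parameter-count sketch of the gradual-input idea described above. It assumes an illustrative configuration (784 flattened inputs, each sub-layer seeing half of the input plus the previous intermediate split's output); the exact split sizes are a hypothetical choice for this example, not the paper's reported configuration.

```python
def traditional_weights(n_in, n_hidden):
    """Incoming weights of one fully connected hidden layer."""
    return n_in * n_hidden

def spinal_weights(n_in, n_split_hidden, n_layers):
    """Incoming weights of a SpinalNet-style block where each sub-layer
    receives half of the input plus the previous sub-layer's output
    (the first sub-layer receives the input half only)."""
    half = n_in // 2
    total = half * n_split_hidden  # first sub-layer: input half only
    # remaining sub-layers: input half + previous intermediate split
    total += (n_layers - 1) * (half + n_split_hidden) * n_split_hidden
    return total

# Example: 784 inputs (e.g., a flattened 28x28 image).
# One traditional hidden layer of width 80 versus four spinal
# sub-layers of width 20 (same total width of 80).
trad = traditional_weights(784, 80)  # 784 * 80 = 62720
spin = spinal_weights(784, 20, 4)    # 7840 + 3 * 8240 = 32560
print(trad, spin)
```

Under these assumed sizes, the spinal block needs roughly half the incoming weights of the equally wide fully connected layer, which illustrates why the architecture can reduce computation without shrinking total layer width.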
