Abstract

Convolutional neural networks (CNNs) have found great success in many artificial intelligence applications. At the heart of a CNN is the operation of convolution between multi-channel data and learned kernels. It is usually implemented as a large floating-point matrix multiplication, a major bottleneck for computational speed and memory usage. In this paper, we propose a binary outer product expansion (BOPE) method to represent a kernel matrix or tensor as a weighted sum of outer products between binary vectors with values +1 or −1. This allows for network compression and simplified computation at the same time. Our theoretical analysis shows that such a decomposition converges to the original matrix given a sufficient number of binary vectors. We present computational methods to estimate the outer product weights using either optimized or random binary base vectors. Significant data compression can be achieved for a highly redundant matrix, since the weights and binary vectors require less storage than the array elements. In addition, most floating-point multiplications in matrix convolution can be replaced by additions and binary XOR operations, reducing computation and memory requirements. We propose a compact convolutional layer in which highly redundant convolutional kernels are projected onto binary vectors and represented as a weighted sum of outer products. We show that the number of weights in AlexNet, VGG-19 and ResNet-50 can be reduced 3.45, 6.87 and 2.95 times, respectively, with less than a 1% loss in top-1 and top-5 classification accuracy on ImageNet, and that MobileNetV2 can be reduced 2.31 times with a 2% loss. Compared to a standard CNN, this compact convolutional network has fewer trainable weights, is better regularized, and is easier to train from fewer training samples. It is therefore particularly well suited to devices with limited computation, memory, and battery power.
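
To make the expansion concrete, below is a minimal NumPy sketch of the idea described above: a kernel matrix K is approximated as a weighted sum of outer products of random {−1, +1} base vectors, with each weight obtained by projecting the current residual onto the corresponding rank-one binary atom. This greedy residual-projection scheme and the function names (bope_decompose, bope_reconstruct) are illustrative assumptions, not the paper's exact algorithm.

import numpy as np

def bope_decompose(K, num_terms, seed=0):
    """Approximate K (m x n) as sum_r w_r * u_r v_r^T with u_r, v_r in {-1,+1}.

    Uses random binary base vectors and, for each term, the weight that
    minimizes the Frobenius norm of the remaining residual:
        w_r = (u_r^T R v_r) / (m * n),   since ||u v^T||_F^2 = m * n.
    """
    rng = np.random.default_rng(seed)
    m, n = K.shape
    residual = K.astype(float).copy()
    us, vs, ws = [], [], []
    for _ in range(num_terms):
        u = rng.choice([-1.0, 1.0], size=m)   # random binary base vector (rows)
        v = rng.choice([-1.0, 1.0], size=n)   # random binary base vector (cols)
        w = (u @ residual @ v) / (m * n)      # optimal weight for this atom
        residual -= w * np.outer(u, v)        # peel the term off the residual
        us.append(u); vs.append(v); ws.append(w)
    return np.array(ws), np.array(us), np.array(vs)

def bope_reconstruct(ws, us, vs):
    """Rebuild the approximation K_hat = sum_r w_r * u_r v_r^T."""
    return np.einsum('r,ri,rj->ij', ws, us, vs)

if __name__ == "__main__":
    K = np.random.randn(8, 8)                      # toy stand-in for a kernel matrix
    ws, us, vs = bope_decompose(K, num_terms=256)
    K_hat = bope_reconstruct(ws, us, vs)
    print("relative error:", np.linalg.norm(K - K_hat) / np.linalg.norm(K))

In this sketch the storage cost per term is one floating-point weight plus m + n sign bits, which is how the claimed compression arises for redundant kernels; reconstruction and convolution then need only additions, sign flips, and a single scaling per term.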
