Abstract

In this paper, we consider applying the concept of the Saak (Subspace approximation with augmented kernels) transform to convolutional neural networks (CNNs). In CNNs, the ReLU (rectified linear unit) activation function is widely used for image and signal processing applications, e.g., image classification and image super-resolution. Activation functions such as ReLU discard the negative values of their input to achieve a nonlinear input-output relation. In CNNs, therefore, ReLU discards the negative values of the filter outputs, even though those negative values may be as important as the positive ones in image processing. The Saak transform was proposed to exploit the information carried by these negative values. In this paper, we consider CNN architectures that utilize the concept of the Saak transform by introducing a modified ReLU that discards positive values instead. We then show that several CNN architectures can be constructed based on the Saak concept and the modified ReLU. To examine the potential of the proposed architectures, we apply them to image classification and discuss their validity based on the results.
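As a minimal sketch of the idea described above: standard ReLU keeps only positive responses, the modified ReLU keeps only negative ones, and a Saak-style kernel augmentation applies both so that no sign information is lost. The function names below (`neg_relu`, `saak_style_activation`) are illustrative, not the paper's notation.

```python
import numpy as np

def relu(x):
    # Standard ReLU: keeps positive values, discards negatives.
    return np.maximum(x, 0.0)

def neg_relu(x):
    # The "modified ReLU" described in the abstract: keeps negative
    # values and discards positives (equivalently, -ReLU(-x)).
    return np.minimum(x, 0.0)

def saak_style_activation(x):
    # Saak-style augmentation (a sketch): rectify both the response and
    # its negation and concatenate them, so information carried by
    # negative filter outputs survives the nonlinearity.
    # This is lossless in the sense that x = ReLU(x) - ReLU(-x).
    return np.concatenate([relu(x), relu(-x)], axis=-1)

x = np.array([1.5, -2.0, 0.5, -0.25])
y = saak_style_activation(x)  # length 2*len(x): positive part, then negative part
```

Because `x = ReLU(x) - ReLU(-x)`, the augmented representation can always be inverted, which is the property that motivates keeping both halves.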
