Abstract

Activation functions are an integral part of convolutional neural networks. Through extensive experiments we find that the ReLU and Tanh activation functions have complementary properties: the output of the Tanh function can increase the values activated by ReLU units and restore information in the values clipped by ReLU units. By replacing the ReLU activation function with a weighted sum of the ReLU and Tanh activation functions, the networks obtain a substantial improvement. We conduct a series of experiments on several datasets, and the results show that our method improves the accuracy of ResNet and Inception by a large margin while adding only two parameters per convolutional layer.
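A minimal sketch of the idea, assuming a PyTorch-style module; the exact formulation and parameter initialization are not specified in the abstract, and the names ReLUTanh, alpha, and beta are illustrative only.

import torch
import torch.nn as nn


class ReLUTanh(nn.Module):
    """Weighted sum of ReLU and Tanh with two learnable scalars per layer."""

    def __init__(self, alpha_init: float = 1.0, beta_init: float = 0.0):
        super().__init__()
        # The two extra parameters added for each convolutional layer
        # (initial values here are an assumption, not from the paper).
        self.alpha = nn.Parameter(torch.tensor(alpha_init))
        self.beta = nn.Parameter(torch.tensor(beta_init))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Weighted combination of the two activations.
        return self.alpha * torch.relu(x) + self.beta * torch.tanh(x)


# Usage: replace the ReLU that follows a convolution with this activation.
layer = nn.Sequential(nn.Conv2d(3, 16, kernel_size=3, padding=1), ReLUTanh())
out = layer(torch.randn(1, 3, 32, 32))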
