Abstract

The activation function is a key component of deep neural networks. The most popular activation functions at present are Sigmoid, Sin, the rectified linear unit (ReLU), and several ReLU variants, but each has its own weaknesses. To improve network fitting and generalization ability, a new activation function, TSin, is designed. The basic idea behind TSin is to rotate the Sin function 45° counterclockwise and then fine-tune the resulting curve so that it gains several properties desirable in an activation function: nonlinearity, global differentiability, non-saturation, zero-centered output, monotonicity, quasi-identity transformation, and so on. The TSin function is first derived theoretically; three experiments are then designed to test its performance. The results show that, compared with several popular activation functions, TSin offers advantages in training stability, convergence speed, and convergence precision. The study of TSin not only provides a new choice of activation function for deep learning but also suggests a new approach to activation function design.
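
The rotation construction can be made concrete with a short numerical sketch. Rotating the graph of sin counterclockwise by an angle theta maps each point (t, sin t) to (t·cos θ − sin t·sin θ, t·sin θ + sin t·cos θ); for θ = 45° the new x-coordinate is non-decreasing in t, so the rotated curve is still a single-valued function and can be evaluated by inverting that map. The Python/NumPy code below is a minimal illustration under these assumptions; the function name rotated_sin and the bisection-based inversion are choices made here for clarity, and this is not the paper's final TSin formula, which is obtained only after the fine-tuning step described in the derivation.

    import numpy as np

    def rotated_sin(x, theta=np.pi / 4, iters=60):
        # Rotating the graph of sin counterclockwise by theta maps each point
        # (t, sin t) to (c*t - s*sin t, s*t + c*sin t), with c = cos(theta),
        # s = sin(theta). For theta = 45 deg the new x-coordinate
        # g(t) = c*t - s*sin(t) is non-decreasing (g'(t) = c - s*cos(t) >= 0),
        # so the rotated curve is single-valued; evaluate it by inverting g.
        c, s = np.cos(theta), np.sin(theta)
        x = np.asarray(x, dtype=float)
        lo = (x - s) / c   # since |sin t| <= 1, g((x - s)/c) <= x
        hi = (x + s) / c   # and g((x + s)/c) >= x, so the root is bracketed
        for _ in range(iters):
            mid = 0.5 * (lo + hi)
            go_right = c * mid - s * np.sin(mid) < x
            lo = np.where(go_right, mid, lo)
            hi = np.where(go_right, hi, mid)
        t = 0.5 * (lo + hi)
        return s * t + c * np.sin(t)   # rotated y-coordinate at the solved t

For example, rotated_sin(0.0) returns 0.0, and for θ = 45° the curve satisfies y − x = √2·sin t, so it oscillates around the identity line y = x, which matches the quasi-identity transformation property mentioned above. Note also that the raw rotated curve still has vertical tangents wherever sin has slope 1 (t = 2kπ), which is presumably among the defects the paper's fine-tuning step removes to achieve global differentiability.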
