Abstract

Activation functions play an increasingly important role in deep learning: by introducing non-linearity, they give a neural network the ability to learn complex patterns. Complex-valued activation functions for complex-valued neural networks are typically extended from real-valued activation functions such as the sigmoid, tanh, rectified linear unit, and exponential linear unit. Currently, the most widely used complex-valued activation function is the real-imaginary-type activation function (RIAF), which applies a separate real-valued activation function to the real and imaginary parts of a neuron. This separate activation ignores the internal relationship between the real and imaginary parts of the complex input and loses the integrity of the activation. In this work, a complex-valued activation function called the Gaussian-type activation function (GTAF) is proposed to activate the real and imaginary parts of a neuron jointly, enhancing both learning ability and learning speed. GTAF combines an ordinary real-valued activation function with a Gaussian function and can be applied to any complex-valued neural network. Two polarimetric SAR (PolSAR) image classification experiments on complex-valued convolutional neural networks show that GTAF converges faster than RIAF and achieves higher accuracy.
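
For concreteness, below is a minimal PyTorch sketch of the RIAF baseline described above, together with a hypothetical joint Gaussian-gated activation standing in for GTAF. The abstract does not give GTAF's exact formula, so the joint form below, including the gtaf_sketch name and the sigma width parameter, is an illustrative assumption rather than the paper's definition.

    import torch

    def riaf(z: torch.Tensor, act=torch.tanh) -> torch.Tensor:
        """Real-imaginary-type activation (RIAF): apply a real-valued
        activation separately to the real and imaginary parts."""
        return torch.complex(act(z.real), act(z.imag))

    def gtaf_sketch(z: torch.Tensor, act=torch.tanh, sigma: float = 1.0) -> torch.Tensor:
        """Hypothetical GTAF-style joint activation (assumption: the
        paper's exact formula is not given in the abstract). A Gaussian
        of the complex magnitude gates the activated real and imaginary
        parts together, so each output component depends on both parts."""
        gauss = torch.exp(-(z.abs() ** 2) / (2.0 * sigma ** 2))  # joint Gaussian gate
        return torch.complex(act(z.real) * gauss, act(z.imag) * gauss)

    # Usage on a small batch of complex activations
    z = torch.randn(4, 8, dtype=torch.cfloat)
    print(riaf(z).shape, gtaf_sketch(z).shape)

The contrast the sketch is meant to show: in riaf each output component is a function of its own input part alone, whereas the Gaussian gate in gtaf_sketch makes both output components depend on the full complex magnitude, which is the joint treatment the abstract says RIAF lacks.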
