Abstract

In this paper, nonlinear activation functions based on fluid dynamics are presented. We propose two types of activation functions by applying the so-called parametric softsign to the negative region. We apply the activation functions to a CNN (Convolutional Neural Network) that performs image recognition and evaluate them on multiple benchmark datasets such as MNIST and CIFAR-10. Numerical results demonstrate the workability and validity of the present approach through comparison with other numerical performances.

Highlights

  • The appropriate choice of the activation functions for neural networks is a key factor in the deep learning framework

  • New activation functions have been proposed such as PReLU [2] and ELU [3]

  • The purpose of this paper is to propose activation functions based on the concept of the fluid dynamics framework


Summary

Introduction

The appropriate choice of the activation functions for neural networks is a key factor in the deep learning framework. The widely used ReLU suffers from the so-called dying unit issue because its slope in the negative region is always 0. New activation functions have been proposed, such as PReLU [2] and ELU [3]. PReLU mitigates this problem by applying a parametric slope to the negative region. ELU is more robust against noise than ReLU and PReLU because it extends the negative region nonlinearly. We present two types of activation functions by applying the so-called parametric softsign to the negative part of ReLU. We apply the activation functions to a CNN (Convolutional Neural Network) that performs image recognition, and we evaluate the approach on multiple benchmark datasets such as MNIST and CIFAR-10. Numerical results demonstrate the workability and validity of the present approach through comparison with other numerical performances.
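As a rough illustration of the negative-region behaviours discussed above, the sketch below compares ReLU, PReLU, and ELU (using their standard published forms) with a softsign-based variant. The function name parametric_softsign_relu, the parameter a, and the exact negative branch a·x/(1 + |x|) are assumptions modelled on the standard softsign x/(1 + |x|), not the paper's exact definition.

import numpy as np

def relu(x):
    # Standard ReLU: slope is 0 for x <= 0, which causes the dying unit issue.
    return np.maximum(0.0, x)

def prelu(x, a=0.25):
    # PReLU [2]: a learnable linear slope a in the negative region.
    return np.where(x > 0, x, a * x)

def elu(x, alpha=1.0):
    # ELU [3]: smooth nonlinear extension of the negative region.
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def parametric_softsign_relu(x, a=0.25):
    # Hypothetical sketch of the proposed idea: keep the identity for x > 0
    # and replace the negative part of ReLU with a parametric softsign
    # a * x / (1 + |x|), which saturates smoothly instead of going to zero slope.
    return np.where(x > 0, x, a * x / (1.0 + np.abs(x)))

if __name__ == "__main__":
    x = np.linspace(-5.0, 5.0, 11)
    for f in (relu, prelu, elu, parametric_softsign_relu):
        print(f.__name__, np.round(f(x), 3))

Unlike ReLU, the softsign-style negative branch keeps a nonzero gradient for x < 0 while bounding the negative output, which is the same motivation behind PReLU and ELU.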

Nonlinear Activation Functions
Conclusions
