Abstract

In this paper, we study Tsallis’ fractional entropy (TFE) in a complex domain by applying the definition of complex probability functions. We derive upper and lower bounds for TFE in terms of some special functions. Moreover, applications to complex neural networks (CNNs) are illustrated to assess the accuracy of CNNs.
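For orientation, recall the classical Tsallis entropy of a discrete distribution p₁, …, pₙ with entropic index q ≠ 1: S_q = (1 − Σᵢ pᵢ^q)/(q − 1), which recovers the Shannon entropy in the limit q → 1. The Python sketch below computes this classical quantity only; the reduction of a complex-valued probability function to a real distribution via moduli is an illustrative assumption, not the paper’s definition of TFE.

```python
import numpy as np

def tsallis_entropy(p, q):
    """Classical Tsallis entropy S_q(p) = (1 - sum_i p_i**q) / (q - 1).

    Sketch only: the paper's complex-domain fractional extension (TFE)
    is not reproduced here. For a complex probability function, one
    plausible reduction (an assumption) is to apply S_q to the
    renormalized moduli |p_i|.
    """
    p = np.asarray(p, dtype=complex)
    m = np.abs(p)        # moduli of the (possibly complex) weights
    m = m / m.sum()      # renormalize to a real probability distribution
    if np.isclose(q, 1.0):
        # q -> 1 limit recovers the Shannon entropy
        return float(-(m[m > 0] * np.log(m[m > 0])).sum())
    return float((1.0 - (m ** q).sum()) / (q - 1.0))

# Example: uniform distribution over 4 outcomes
print(tsallis_entropy([0.25, 0.25, 0.25, 0.25], q=2))  # 0.75
print(tsallis_entropy([0.25, 0.25, 0.25, 0.25], q=1))  # log(4) ~ 1.386
```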

Highlights

  • Entropy is a strategic quantity in information theory

  • Its applications appear in many fields, such as thermodynamics, chaos, artificial neural networks, image processing, complex systems, and information theory

  • One of the important applications of complex probability theory is in realistic quantum mechanics [17]; for example, the two-slit experiment, where a source releases a single particle, which travels to a wall with two slits and is detected at position χ on a screen placed behind the wall


Summary

Introduction

Entropy is a strategic quantity in information theory. It measures the amount of uncertainty in the value of a random variable or in the outcome of a random process. The advantage of complex probability theory is that it adds a supplementary dimension (the imaginary part) to an event observed in the real laboratory dimension (the real part), and it represents physical quantities of complex networks in terms of currents, complex potentials, and impedance. The typical argument that an interference pattern on the screen implies the particle did not pass through one slit or the other rests on the probability relation P(χ) = P₁(χ) + P₂(χ), where P₁(χ) and P₂(χ) are the probabilities of passage via the first and second slit, respectively; the failure of this additivity in the presence of interference is the critical point.
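As a brief numerical illustration (a sketch under assumed values, not taken from the paper: the amplitudes psi1 and psi2 below are hypothetical), quantum-mechanical probabilities are moduli squared of complex amplitudes, P(χ) = |ψ₁(χ) + ψ₂(χ)|², so the classical additivity P(χ) = P₁(χ) + P₂(χ) generally fails by the interference term 2 Re(ψ₁ ψ₂*):

```python
import numpy as np

# Hypothetical complex amplitudes for passage through slit 1 and slit 2
# at a fixed detection position chi (values chosen only for illustration).
psi1 = 0.6 * np.exp(1j * 0.3)
psi2 = 0.5 * np.exp(1j * 2.1)

p1 = abs(psi1) ** 2             # P1(chi): slit 1 alone
p2 = abs(psi2) ** 2             # P2(chi): slit 2 alone
p_both = abs(psi1 + psi2) ** 2  # P(chi): both slits open

interference = 2 * (psi1 * np.conj(psi2)).real

print(f"P1 + P2           = {p1 + p2:.4f}")
print(f"P (both slits)    = {p_both:.4f}")
print(f"interference term = {interference:.4f}")
# P = P1 + P2 + interference, so additivity fails unless the term vanishes.
assert np.isclose(p_both, p1 + p2 + interference)
```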

Results
Fractional Sigmoid Function (FSF)
Complex-Valued Neural Networks
Discussion
Conclusions and Future Research