Extracting meaningful information from signals has always been a challenge. Because of environmental noise, collected signals often exhibit nonlinear characteristics, rendering traditional metrics inadequate for capturing their dynamic properties and complex structures. To address this challenge, this study proposes an innovative metric for quantifying signal complexity: dispersion network-transition entropy (DNTE), which integrates the concepts of complex networks and information entropy. Specifically, we assign single cumulative distribution function values to network nodes and use Markov chains to represent the links, transforming nonlinear signals into weighted directed complex networks. We then assess the importance of the network's nodes and links and apply the information-entropy formula to compute the DNTE value, which quantifies the complexity of the original signal. Through extensive experiments on simulated chaotic models and real underwater acoustic signals, we confirm the strong performance of DNTE. The results indicate that, compared with Lempel-Ziv complexity, permutation entropy, and dispersion entropy, DNTE not only reflects changes in signal complexity more accurately but also offers higher computational efficiency. Importantly, DNTE achieves the best performance in distinguishing different categories of chaotic models, ships, and modulation signals, demonstrating its significant potential for extracting useful information from signals.
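The abstract does not give the exact DNTE formula; the following minimal sketch, written under the assumption of a normal-CDF class mapping (as in dispersion entropy) and a first-order transition matrix serving as the weighted directed network, only illustrates the general pipeline described above. The function name dnte_sketch and the parameter n_classes are hypothetical and not taken from the paper.

```python
# Illustrative sketch (not the authors' reference implementation): map a signal
# to discrete classes via the normal CDF, build a weighted directed transition
# network, and compute a Shannon-type entropy over the transition (link) weights.
import numpy as np
from scipy.stats import norm

def dnte_sketch(x, n_classes=6):
    x = np.asarray(x, dtype=float)
    # Step 1: normal-CDF mapping of samples, then quantization into classes 1..n_classes
    y = norm.cdf(x, loc=x.mean(), scale=x.std())
    z = np.clip(np.round(n_classes * y + 0.5).astype(int), 1, n_classes)
    # Step 2: Markov-chain (first-order transition) counts between consecutive classes,
    # i.e. the link weights of the directed network whose nodes are the classes
    T = np.zeros((n_classes, n_classes))
    for a, b in zip(z[:-1], z[1:]):
        T[a - 1, b - 1] += 1
    # Step 3: normalize the link weights and compute Shannon entropy over the transitions,
    # scaled by the maximum possible entropy log(n_classes**2)
    p = T.flatten() / T.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p)) / np.log(n_classes ** 2)

# Example: a regular signal should score lower than white noise
t = np.linspace(0, 10, 2000)
print(dnte_sketch(np.sin(2 * np.pi * t)))   # periodic signal
print(dnte_sketch(np.random.randn(2000)))   # white noise
```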