Abstract
The Jensen–Shannon divergence is a symmetrized and smoothed version of the Kullback–Leibler divergence. Recently it has been widely applied to the analysis and characterization of symbolic sequences. In this paper we investigate a generalization of the Jensen–Shannon divergence, carried out in the framework of non-extensive Tsallis statistics. We study its basic properties and investigate its applicability as a tool for segmenting symbolic sequences.
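As a rough illustration of the quantities the abstract refers to, the sketch below computes a Jensen–Shannon-type divergence built from the Tsallis entropy. This is an assumption about the form of the generalization (the common construction in the non-extensive literature, where Shannon entropy is replaced by the Tsallis q-entropy); the paper's exact definition may differ in weights or normalization. The function names and the entropic index `q` are illustrative, not taken from the paper.

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis q-entropy H_q(p) = (1 - sum_i p_i^q) / (q - 1).

    Reduces to the Shannon entropy (in nats) in the limit q -> 1.
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # 0 * log 0 and 0^q contribute nothing
    if abs(q - 1.0) < 1e-12:
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def jensen_shannon_q(p, r, q=1.0, w1=0.5, w2=0.5):
    """Jensen-Shannon-type divergence based on the Tsallis entropy:

        JS_q(p, r) = H_q(w1*p + w2*r) - w1*H_q(p) - w2*H_q(r)

    For q = 1 this is the ordinary Jensen-Shannon divergence;
    the form for general q is an assumption for illustration.
    """
    p = np.asarray(p, dtype=float)
    r = np.asarray(r, dtype=float)
    m = w1 * p + w2 * r  # mixture distribution
    return (tsallis_entropy(m, q)
            - w1 * tsallis_entropy(p, q)
            - w2 * tsallis_entropy(r, q))
```

For q = 1 and two disjoint distributions the divergence attains its maximum of ln 2 (with equal weights); it vanishes whenever the two distributions coincide. In sequence segmentation, such a divergence is typically evaluated between the symbol-frequency distributions on either side of a candidate cut point.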