We demonstrate a technique for scaling detection based on evaluating the Shannon entropy of the diffusion process obtained by converting the time series under study into trajectories. This method, called diffusion entropy (DE), affords information that cannot be derived from the direct evaluation of waiting times. We apply it to the analysis of the distribution of the time distance tau between two nearest-neighbor solar flares. This traditional part of the analysis rests on the direct evaluation of the distribution function psi(tau), or of the probability Psi(tau) that no time distance smaller than a given tau is found. We adopt the paradigm of inverse power-law behavior and focus on determining the inverse power index mu, without ruling out different asymptotic properties that might be revealed, at larger scales, with the help of richer statistics. We then use the DE method with three different walking rules and focus on the regime of transition to scaling. Both this transition regime and the value of the scaling parameter delta depend on the walking rule adopted, a property of interest for shedding light on the slow process of transition from dynamics to thermodynamics that often occurs under anomalous statistical conditions. With the first two rules the transition regime extends over a large time interval, and the information contained in the time series is transmitted, to a great extent, to this transition regime as well as to the scaling regime. With the third rule, on the contrary, the same information is essentially conveyed to the scaling regime, which in fact emerges very quickly after a fast transition process. We show that the DE method not only brings out the long-range correlation associated with mu < 3, and hence a basin of attraction different from the ordinary Gaussian one, but also reveals the presence of memory effects induced by the time dependence of the solar flare rate. When this memory is annihilated by shuffling, the scaling parameter delta is shown to fit the theoretically expected function of mu. All this leads us to the compelling conclusion that mu = 2.138 +/- 0.01.
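For readers unfamiliar with the procedure, the Python sketch below illustrates one possible implementation of diffusion entropy analysis: an event series is converted into diffusing trajectories with a simple walking rule (a unit step at each event time), the Shannon entropy S(t) of the displacement distribution is estimated from a histogram, and the scaling parameter delta is read off the slope of S(t) versus ln t. The walking rule, the bin count, and the Pareto surrogate data are illustrative assumptions only; they are not the three rules or the solar flare record analyzed in the paper.

```python
import numpy as np

def diffusion_entropy(event_signal, window_sizes, n_bins=60):
    """Diffusion entropy analysis (sketch).

    event_signal : 1-D array, e.g. 1 in time bins containing an event, 0 otherwise
                   (one possible walking rule; the paper compares three rules).
    Returns (t, S(t)); a scaling regime appears as S(t) ~ A + delta * ln(t).
    """
    xi = np.asarray(event_signal, dtype=float)
    entropies = []
    for t in window_sizes:
        # Trajectory displacements: sums over all overlapping windows of length t
        # (moving-window prescription used in standard DE analysis).
        x = np.convolve(xi, np.ones(int(t)), mode="valid")
        # Estimate p(x, t) with a histogram and compute the (differential)
        # Shannon entropy  S(t) = -sum p ln(p) dx.
        p, edges = np.histogram(x, bins=n_bins, density=True)
        dx = edges[1] - edges[0]
        p = p[p > 0]
        entropies.append(-np.sum(p * np.log(p)) * dx)
    return np.asarray(window_sizes), np.asarray(entropies)

if __name__ == "__main__":
    # Hypothetical surrogate data: waiting times drawn from a Pareto-like law
    # with tail index mu = a + 1 ~ 2.14 (not the actual solar flare record).
    rng = np.random.default_rng(0)
    waits = np.ceil(rng.pareto(1.14, size=20_000)).astype(int)
    times = np.cumsum(waits)
    signal = np.zeros(times[-1] + 1)
    signal[times] = 1.0

    sizes = np.unique(np.logspace(0.5, 3.5, 30).astype(int))
    t, S = diffusion_entropy(signal, sizes)

    # Fit S(t) = A + delta * ln(t) over the larger windows to estimate delta.
    delta = np.polyfit(np.log(t[10:]), S[10:], 1)[0]
    print(f"estimated delta ~ {delta:.3f}")
```

In this sketch the estimate of delta depends on where the fit window is placed, which mirrors the point made above: the transition regime can absorb much of the information carried by the series, so the choice of walking rule and fitting range matters.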