Abstract

A framework is presented for the theoretical analysis of the error in estimating the capacity dimension $D_c$ of an attractor (or repellor) embedded in $\mathbb{R}^n$. Given a set of $M$ points sampled from the attractor independently according to its invariant measure, the error is divided into the sampling error, due to the presence of low-density regions of the attractor, and the truncation error, which arises because computation halts at some finite resolution $\log(1/\varepsilon)$. The sampling error decays if, given the sample size $M$, the (smallest) grid size $\varepsilon$ is chosen so that the expected number of sample points in each occupied box goes to infinity as $M \to \infty$, $\varepsilon \to 0$. This condition is conveniently formulated as a lower bound on $\log M/\log(1/\varepsilon)$. The lower bound is approximately computable for a set that is the repellor of a one-dimensional piecewise linear map with a Bernoulli probability measure, and the truncation error can also be analyzed in this case. The results are summarized in a theorem giving sufficient conditions for the convergence of least-squares estimates of $D_c$. The results suggest that convergence may fail for sets with large variation in the pointwise dimension, or for sets for which the power-law relation fails to hold.
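To make the estimation procedure concrete, the following is a minimal Python sketch (not from the paper; all function names are illustrative) of a least-squares box-counting estimate of $D_c$. The test set is the middle-third Cantor set, which is the repellor of a one-dimensional piecewise linear map and carries a natural Bernoulli measure, matching the setting analyzed in the abstract. The sketch samples $M$ points, counts occupied boxes $N(\varepsilon)$ at several grid sizes, and fits $\log N(\varepsilon)$ against $\log(1/\varepsilon)$.

```python
import numpy as np

def cantor_sample(m, digits=30, seed=None):
    """Sample m points from the middle-third Cantor set by drawing
    random ternary digits in {0, 2} (an i.i.d. Bernoulli(1/2, 1/2)
    measure on the digit sequence)."""
    rng = np.random.default_rng(seed)
    d = 2 * rng.integers(0, 2, size=(m, digits))  # each digit is 0 or 2
    powers = 3.0 ** -np.arange(1, digits + 1)
    return d @ powers  # points in [0, 1]

def capacity_dimension(points, eps_list):
    """Least-squares estimate of D_c: slope of log N(eps) versus
    log(1/eps), where N(eps) is the number of occupied boxes of
    side eps."""
    log_inv_eps, log_n = [], []
    for eps in eps_list:
        n_boxes = np.unique(np.floor(points / eps)).size
        log_inv_eps.append(np.log(1.0 / eps))
        log_n.append(np.log(n_boxes))
    slope, _ = np.polyfit(log_inv_eps, log_n, 1)
    return slope

# Large M relative to the finest grid, so that the expected number of
# points per occupied box stays large (the abstract's condition on
# log M / log(1/eps)).
points = cantor_sample(200_000, seed=0)
eps_list = [3.0 ** -k for k in range(2, 8)]
d_est = capacity_dimension(points, eps_list)
print(f"estimated D_c = {d_est:.3f}")  # theory: log 2 / log 3 ~ 0.631
```

If $M$ is made small while the finest $\varepsilon$ stays fixed, many boxes of positive measure go unsampled and the fitted slope is biased low, which is exactly the sampling error the framework quantifies.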
