Most of the dynamical mass loss from star clusters is thought to be caused by the time variability of the tidal field (“tidal shocks”). Systematic studies of tidal shocks have been hampered by the fact that each tidal history is unique, implying both a reproducibility and a generalization problem. Here we address these issues by investigating how star cluster evolution depends on the statistical properties of the cluster's tidal history. We run a large suite of direct N-body simulations of clusters with tidal histories generated from power spectra of a given slope and with different normalizations, which determine the timescales and amplitudes of the shocks, respectively. At fixed normalization (i.e., the same median tidal field strength), the dissolution timescale is nearly independent of the power spectrum slope. However, the dispersion in dissolution timescales, obtained by repeating simulations for different realizations of statistically identical tidal histories, increases with the power spectrum slope. This result means that clusters experiencing high-frequency shocks have more similar mass-loss histories than clusters experiencing low-frequency shocks. The density–mass relation of the simulated clusters follows a power law with slope between 1.08 and 1.45, except for the lowest normalizations (for which clusters effectively evolve in a static tidal field). Our findings suggest that star cluster evolution can be described statistically from a time-series analysis of the cluster's tidal history, which is an important simplification for describing the evolution of the star cluster population during galaxy formation and evolution.
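To illustrate the kind of stochastic tidal history described above, the sketch below shows one possible way to draw a time series whose power spectrum has a prescribed slope and normalization, using an inverse-FFT colored-noise approach. This is not the authors' code; the function name and all parameter values (`slope`, `normalization`, `n_steps`, `dt`) are illustrative assumptions.

```python
import numpy as np

def tidal_history(slope, normalization, n_steps=4096, dt=1.0, seed=None):
    """Draw one realization of a tidal-field time series whose power
    spectrum follows P(f) ~ normalization * f**(-slope)."""
    rng = np.random.default_rng(seed)
    freqs = np.fft.rfftfreq(n_steps, d=dt)
    # Mode amplitudes follow the target power-law spectrum; the
    # zero-frequency (mean) mode is left at zero for simplicity.
    amplitude = np.zeros_like(freqs)
    amplitude[1:] = np.sqrt(normalization * freqs[1:] ** (-slope))
    # Random phases make each realization statistically identical to,
    # but different in detail from, any other with the same parameters.
    phases = rng.uniform(0.0, 2.0 * np.pi, size=freqs.size)
    spectrum = amplitude * np.exp(1j * phases)
    return np.fft.irfft(spectrum, n=n_steps)

# Two realizations with identical slope and normalization differ only in
# their random phases; this realization-to-realization scatter is what
# drives the dispersion in dissolution timescales discussed above.
history_a = tidal_history(slope=2.0, normalization=1.0, seed=1)
history_b = tidal_history(slope=2.0, normalization=1.0, seed=2)
```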