Abstract

We investigate the effects of self-interacting dark matter (SIDM) on the tidal stripping and evaporation of satellite galaxies in a Milky Way-like host. We use a suite of five zoom-in, dark-matter-only simulations: two with velocity-independent SIDM cross sections, two with velocity-dependent SIDM cross sections, and one cold dark matter (CDM) simulation for comparison. After carefully assigning stellar mass to satellites at infall, we find that stars are stripped at a higher rate in SIDM than in CDM. In contrast, the rate of total bound dark matter mass loss is minimally affected, and subhalo evaporation has a negligible effect on satellites for viable SIDM models. Centrally located stars in SIDM haloes disperse out to larger radii as cores grow. Consequently, the half-light radius of satellites increases, stars become more vulnerable to tidal stripping, and the stellar mass function is suppressed. We find that the ratio of core radius to tidal radius accurately predicts the relative strength of enhanced SIDM stellar stripping. Velocity-independent SIDM models show a modest increase in the stellar stripping effect with satellite mass, whereas velocity-dependent SIDM models show a large increase in this effect towards lower masses, making observations of ultra-faint dwarfs prime targets for distinguishing between and constraining SIDM models. Because the largest satellites in velocity-dependent SIDM develop only small cores, no identifiable imprint is left on the all-sky properties of the stellar halo. While our results focus on SIDM, the main physical mechanism of enhanced tidal stripping of stars applies similarly to satellites with cores formed via other means.
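For concreteness, the core-to-tidal-radius diagnostic mentioned above can be written with a standard Jacobi-type estimate of the tidal radius. This is a minimal sketch in LaTeX notation, assuming an approximately circular orbit; the precise definitions of the core radius $r_c$ and tidal radius $r_t$ adopted in the analysis may differ:

r_t \simeq d \left( \frac{m_{\rm sub}}{3\, M_{\rm host}(<d)} \right)^{1/3},
\qquad
x \equiv \frac{r_c}{r_t},

where $d$ is the orbital distance of the satellite, $m_{\rm sub}$ its bound mass, $M_{\rm host}(<d)$ the host mass enclosed within $d$, and $r_c$ the SIDM-induced core radius. In this sketch, a larger $x$ corresponds to a stronger enhancement of stellar stripping relative to CDM.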
