Abstract

A tidal disruption event (TDE) occurs when a star is destroyed by the strong tidal shear of a massive black hole (MBH). The accumulation of TDE observations over recent years has revealed that post-starburst galaxies are significantly overrepresented in the sample of TDE hosts. Here we address the post-starburst preference by investigating the decline of TDE rates in a Milky Way-like nuclear stellar cluster featuring either a monochromatic ($1\,\mathrm{M}_{\odot}$) or a complete, evolved stellar mass function. In the former case, the decline of TDE rates with time is very mild, generally amounting to at most a factor of a few over 10 Gyr. Conversely, if a complete mass function is considered, a strong TDE burst over the first 0.1–1 Gyr is followed by a considerable rate drop, by at least an order of magnitude over 10 Gyr. The decline starts after a mass-segregation time-scale, and it is more pronounced for a more top-heavy initial mass function and/or an initially denser nucleus. Our results thus suggest that the post-starburst preference can be accounted for in realistic systems featuring a complete stellar mass function, even in moderately dense galactic nuclei. Overall, our findings support the idea that starbursting galactic nuclei are characterized by a top-heavy initial mass function; we speculate that accounting for this can reconcile the discrepancy between observed and theoretically predicted TDE rates even in quiescent galaxies.
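For context, the disruption condition referred to in the opening sentence is usually expressed through the standard order-of-magnitude tidal radius; this is textbook material and not a result of the paper. A star of mass $M_{\star}$ and radius $R_{\star}$ is torn apart when its orbital pericentre $r_{\mathrm{p}}$ falls within the tidal radius of a black hole of mass $M_{\mathrm{BH}}$:

$$ r_{\mathrm{t}} \simeq R_{\star} \left( \frac{M_{\mathrm{BH}}}{M_{\star}} \right)^{1/3}, \qquad \text{disruption if } r_{\mathrm{p}} \lesssim r_{\mathrm{t}} . $$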
