Abstract

We compare observed far infra-red/sub-millimetre (FIR/sub-mm) galaxy spectral energy distributions (SEDs) of massive galaxies ($M_{\star}\gtrsim10^{10}$ $h^{-1}$M$_{\odot}$) derived through a stacking analysis with predictions from a new model of galaxy formation. The FIR SEDs of the model galaxies are calculated using a self-consistent model for the absorption and re-emission of radiation by interstellar dust based on radiative transfer calculations and global energy balance arguments. Galaxies are selected based on their position on the specific star formation rate (sSFR) - stellar mass ($M_{\star}$) plane. We identify a main sequence of star-forming galaxies in the model, i.e. a well defined relationship between sSFR and $M_\star$, up to redshift $z\sim6$. The scatter of this relationship evolves such that it is generally larger at higher stellar masses and higher redshifts. There is remarkable agreement between the predicted and observed average SEDs across a broad range of redshifts ($0.5\lesssim z\lesssim4$) for galaxies on the main sequence. However, the agreement is poorer for starburst galaxies at $z\gtrsim2$, selected here to have sSFRs elevated more than $10\times$ above the main sequence value. We find that the predicted average SEDs are robust to changing the parameters of our dust model within physically plausible values. We also show that the dust temperature evolution of main sequence galaxies in the model is driven by star formation on the main sequence being more burst-dominated at higher redshifts.
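
As a rough illustration of the selection described above, the sketch below shows how galaxies in a given redshift bin might be split into main-sequence and starburst sub-samples on the sSFR-$M_{\star}$ plane and their SEDs stacked. This is not the paper's actual pipeline: the power-law main-sequence form, the median stacking, and all function and parameter names (`select_and_stack`, `ms_norm`, `ms_slope`, `burst_factor`) are assumptions made for illustration only.

```python
import numpy as np

def select_and_stack(mstar, ssfr, seds, z_bin_mask,
                     ms_norm, ms_slope, burst_factor=10.0):
    """Illustrative selection on the sSFR-Mstar plane and SED stacking.

    mstar        : array of stellar masses [h^-1 Msun]
    ssfr         : array of specific star formation rates [yr^-1]
    seds         : (N_gal, N_band) array of FIR/sub-mm fluxes per galaxy
    z_bin_mask   : boolean mask selecting galaxies in one redshift bin
    ms_norm, ms_slope : parameters of an assumed power-law main sequence,
                        sSFR_MS(M*) = ms_norm * (M*/1e10)^ms_slope
    burst_factor : sSFR threshold, in units of the main-sequence value,
                   above which a galaxy is labelled a starburst (10x here)
    """
    # Assumed power-law form of the main sequence at this redshift
    ssfr_ms = ms_norm * (mstar / 1e10) ** ms_slope

    # Mass cut corresponding to the M* >~ 1e10 h^-1 Msun selection
    massive = (mstar >= 1e10) & z_bin_mask
    on_ms = massive & (ssfr < burst_factor * ssfr_ms)
    burst = massive & (ssfr >= burst_factor * ssfr_ms)

    # "Stack" by taking the median SED of each sub-sample in every band
    sed_ms = np.median(seds[on_ms], axis=0)
    sed_burst = np.median(seds[burst], axis=0)
    return sed_ms, sed_burst
```

In practice the observed stacks are built from survey maps rather than per-galaxy fluxes, so this median over model SEDs should be read only as a schematic of the main-sequence/starburst split.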
