ABSTRACT
The tight relationship between total infrared luminosity ($L_\mathrm{TIR}$) and 1.4 GHz radio continuum luminosity ($L_\mathrm{1.4\, GHz}$) has proven useful for understanding star formation free from dust obscuration. Infrared emission in star-forming galaxies typically arises from recently formed, dust-enshrouded stars, whereas radio synchrotron emission is expected from subsequent supernovae. By leveraging the wealth of ancillary far-ultraviolet–far-infrared photometry from the Deep Extragalactic VIsible Legacy Survey (DEVILS) and Galaxy and Mass Assembly (GAMA) surveys, combined with 1.4 GHz observations from the MeerKAT International GHz Tiered Extragalactic Exploration (MIGHTEE) survey and the Deep Investigation of Neutral Gas Origins (DINGO) project, we investigate the impact of time-scale differences between far-ultraviolet–far-infrared and radio-derived star formation rate (SFR) tracers. We examine how the spectral energy distribution (SED)-derived star formation histories (SFHs) of galaxies can be used to explain discrepancies between these SFR tracers, which are sensitive to different time-scales. Galaxies exhibiting an increasing SFH have systematically higher $L_\mathrm{TIR}$ and SED-derived SFRs than predicted from their 1.4 GHz radio luminosity. This indicates that insufficient time has passed for subsequent supernova-driven radio emission to accumulate. We show that backtracking the SFR(t) of galaxies along their SED-derived SFHs to a time several hundred Myr prior to their observed epoch both linearizes the SFR–$L_\mathrm{1.4\, GHz}$ relation and reduces its overall scatter. The minimum scatter in the SFR(t)–$L_\mathrm{1.4\, GHz}$ relation is reached at 200–300 Myr prior, consistent with theoretical predictions for the time-scales required to disperse the cosmic ray electrons responsible for the synchrotron emission.
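The backtracking test described above can be illustrated with a minimal numerical sketch: evaluate each galaxy's SED-derived SFH at a grid of lookback times, fit a linear relation between log SFR(t) and log $L_\mathrm{1.4\, GHz}$ at each step, and find the lookback time that minimizes the residual scatter. The code below uses synthetic toy SFHs and radio luminosities purely for demonstration; all array and function names (`sfh_grid`, `scatter_at_lookback`, etc.) are hypothetical and do not reflect the authors' actual pipeline.

```python
import numpy as np

def scatter_at_lookback(sfr_t, l_radio):
    """Vertical scatter about a linear fit in log-log space."""
    log_sfr = np.log10(sfr_t)
    log_l = np.log10(l_radio)
    slope, intercept = np.polyfit(log_l, log_sfr, 1)
    residuals = log_sfr - (slope * log_l + intercept)
    return np.std(residuals)

# Lookback-time grid: 0-500 Myr before the observed epoch.
lookback_myr = np.arange(0, 501, 50)

# Toy data (NOT real survey measurements): sfh_grid holds SFR(t) for each
# galaxy sampled on lookback_myr; l_radio is a mock 1.4 GHz luminosity that
# traces the SFR ~250 Myr earlier, plus lognormal noise.
rng = np.random.default_rng(0)
sfh_grid = rng.lognormal(mean=0.5, sigma=0.4, size=(200, lookback_myr.size))
l_radio = 1e21 * sfh_grid[:, 5] * rng.lognormal(0.0, 0.2, size=200)

# Scatter of the SFR(t)-L_1.4GHz relation at each lookback time.
scatters = [scatter_at_lookback(sfh_grid[:, i], l_radio)
            for i in range(lookback_myr.size)]
best = lookback_myr[int(np.argmin(scatters))]
print(f"Minimum scatter reached at a lookback time of {best} Myr")
```

By construction the toy radio luminosities trace the SFR at index 5 (250 Myr), so the sketch recovers a minimum near that lookback time; with real SED-derived SFHs, the same procedure yields the 200–300 Myr minimum reported in the abstract.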