We introduce a simple parametric model of the radio–infrared correlation (i.e., the logarithmic ratio of the IR luminosity to the 1.4 GHz radio luminosity, q_IR) by considering the energy loss rate of high-energy cosmic-ray (CR) electrons, governed by radiative cooling (synchrotron, bremsstrahlung, and inverse Compton scattering), ionization, and adiabatic expansion. The energy loss rate of each process is computed explicitly and compared with the others. Using the relevant scaling relations, we recast each energy loss rate as a function of gas surface density and redshift. By combining these rates, we compute the fraction of the total energy loss carried by synchrotron emission as a function of gas surface density and redshift, and use it to extrapolate the well-established "local" radio–infrared correlation to the high-redshift Universe. The locally calibrated q_IR is thus reformulated as a function of redshift and gas surface density and applied to interpret the observed distribution of the radio–infrared correlation of high-redshift galaxies in I. Delvecchio et al. Our model predicts that q_IR is anticorrelated with gas surface density and that its redshift dependence varies with the gas surface density of galaxies, which captures the observed trend of q_IR values for stellar-mass-selected star-forming galaxies with minimal impact from radio–infrared selection bias.
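To make the structure of such a model concrete, the minimal sketch below combines per-process CR-electron loss rates into a synchrotron fraction and shifts a local q_IR calibration by the change in that fraction. Every scaling index and normalization here (the B–Σ_gas coupling, the (1+z)^4 inverse Compton term off the CMB, the reference surface density, and the local value q0 = 2.64) is an illustrative assumption, not the paper's calibrated model.

```python
import numpy as np

# Illustrative sketch of a q_IR(Sigma_gas, z) model of this type.
# All scaling indices and normalizations are placeholder assumptions,
# NOT the calibrated values of the paper.

def loss_rates(sigma_gas, z):
    """Relative CR-electron energy loss rates (arbitrary common units) at a
    fixed electron energy, as functions of gas surface density sigma_gas
    (Msun/pc^2) and redshift z."""
    sigma0 = 10.0                           # reference surface density (assumed)
    # Synchrotron: b_sync ~ U_B ~ B^2; assume B ~ Sigma_gas^0.5 (hypothetical index)
    b_sync = (sigma_gas / sigma0) ** 1.0
    # Inverse Compton off the CMB: U_CMB ~ (1 + z)^4
    b_ic = 0.3 * (1.0 + z) ** 4
    # Bremsstrahlung and ionization: both scale with gas density; assume n ~ Sigma_gas
    b_brems = 0.5 * (sigma_gas / sigma0)
    b_ion = 0.2 * (sigma_gas / sigma0)
    # Adiabatic losses: held constant here for simplicity (assumption)
    b_adia = 0.1
    return b_sync, b_ic, b_brems, b_ion, b_adia

def f_sync(sigma_gas, z):
    """Fraction of the total loss rate carried by synchrotron emission."""
    rates = loss_rates(sigma_gas, z)
    return rates[0] / sum(rates)

def q_ir(sigma_gas, z, q0=2.64):
    """q_IR shifted from a local calibration q0 (a typical local value, assumed)
    by the change in the synchrotron fraction relative to a local reference:
    a smaller synchrotron fraction means a fainter radio output and a larger q_IR."""
    f_local = f_sync(10.0, 0.0)             # local reference point (assumed)
    return q0 - np.log10(f_sync(sigma_gas, z) / f_local)

if __name__ == "__main__":
    # q_IR versus redshift at three gas surface densities
    for sigma in (10.0, 100.0, 1000.0):
        print(sigma, [round(float(q_ir(sigma, z)), 2) for z in (0.0, 1.0, 2.0, 3.0)])
```

Even with these toy scalings, the sketch reproduces the qualitative behavior stated above: q_IR decreases with gas surface density (denser gas boosts the synchrotron fraction), while the redshift evolution, driven by the (1+z)^4 inverse Compton term, is strongest at low surface density where the CMB term competes with the other losses.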