Abstract

The far-infrared–radio correlation connects star formation and magnetic fields in galaxies, and has been confirmed over a large range of far-infrared luminosities. Recent investigations indicate that it may even hold in the regime of local dwarf galaxies, and we explore here the expected behavior in the regime of star formation surface densities below 0.1 M_sun kpc^{-2} yr^{-1}. We derive two conditions that can be particularly relevant for inducing a change in the expected correlation: a critical star formation surface density required to maintain the correlation between star formation rate and magnetic field strength, and a critical star formation surface density below which cosmic ray diffusion losses dominate over their injection via supernova explosions. For rotation periods shorter than 1.5x10^7 (H/kpc)^2 yr, with H the scale height of the disk, the former correlation will break down before diffusion losses become relevant, as higher star formation rates are required to maintain the correlation between star formation rate and magnetic field strength. For high star formation surface densities Sigma_SFR, we derive a characteristic scaling of the ratio of non-thermal radio to far-infrared/infrared emission with Sigma_SFR^{1/3}, corresponding to a scaling of the non-thermal radio luminosity L_s with the infrared luminosity L_{th} as L_{th}^{4/3}. The latter is expected to change when the above processes are no longer steadily maintained. In the regime of long rotation periods, we expect a transition towards a steeper scaling with Sigma_SFR^{2/3}, implying L_s ~ L_{th}^{5/3}, while the regime of fast rotation is expected to show considerably enhanced scatter. These scaling relations explain the increasing thermal fraction of the radio emission observed within local dwarfs, and can be tested with future observations by the SKA and its precursor radio telescopes.
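The quoted relations can be summarized in a short numerical sketch. Only the power-law indices (Sigma_SFR^{1/3} above the transition, Sigma_SFR^{2/3} below it in the slow-rotation regime) and the rotation-period threshold 1.5x10^7 (H/kpc)^2 yr come from the abstract; the normalization constants and the function names are hypothetical, chosen only so the two branches join continuously at the critical surface density.

```python
def critical_rotation_period(H_kpc):
    """Rotation period (yr) below which the SFR-magnetic-field correlation
    breaks down before diffusion losses matter: T_crit = 1.5e7 (H/kpc)^2 yr
    (threshold taken from the abstract)."""
    return 1.5e7 * H_kpc**2


def radio_to_ir_ratio(sigma_sfr, sigma_crit=0.1, c_high=1.0):
    """Ratio of non-thermal radio to infrared emission versus star formation
    surface density Sigma_SFR (M_sun kpc^-2 yr^-1), in the slow-rotation case.

    High-Sigma regime:  ratio ~ Sigma_SFR^(1/3)  (i.e. L_s ~ L_th^(4/3))
    Low-Sigma regime:   ratio ~ Sigma_SFR^(2/3)  (i.e. L_s ~ L_th^(5/3))

    c_high is an arbitrary normalization; the low-Sigma normalization is
    fixed so both branches agree at sigma_crit.
    """
    c_low = c_high * sigma_crit ** (1.0 / 3.0 - 2.0 / 3.0)
    if sigma_sfr >= sigma_crit:
        return c_high * sigma_sfr ** (1.0 / 3.0)
    return c_low * sigma_sfr ** (2.0 / 3.0)
```

For example, increasing Sigma_SFR by a factor of 8 in the high-density regime raises the radio-to-infrared ratio by 8^{1/3} = 2, whereas in the low-density regime the same factor changes the ratio by 8^{2/3} = 4, reflecting the steeper scaling.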
