We report the relationship between the luminosities of active galactic nuclei (AGNs) and the rates of star formation (SF) for a sample of 323 far-infrared (FIR)-detected AGNs. The sample covers the redshift range $0.2 < z < 2.5$ and spans three orders of magnitude in luminosity, $L_{\rm X} \sim 10^{42\mbox{-}45}$\,erg\,s$^{-1}$. We find that in AGN hosts the total IR luminosity (8-1000\,$\mu$m) has a significant AGN contribution ($\sim$20\% on average), and we suggest using the FIR luminosity (30-1000\,$\mu$m) as a more reliable star formation rate (SFR) estimator. Monochromatic luminosities at 60 and 100\,$\mu$m are also good SFR indicators: they have negligible AGN contributions and are less sensitive than integrated IR luminosities to the shape of the AGN SED, which is uncertain at $\lambda > 100\,\mu$m. Significant bivariate $L_{\rm X}$-$L_{\rm IR}$ correlations are found, and they remain significant in the combined sample when residual partial correlation analysis is used to account for the inherent redshift dependence. No redshift or mass dependence is found for the ratio between SFR and black hole accretion rate (BHAR), which has a mean and scatter of $\log({\rm SFR/BHAR}) = 3.1 \pm 0.5$, consistent with the local mass ratio between supermassive black holes and their host galaxies. The large scatter in this ratio and the strong AGN-SF correlation found in these IR-bright AGNs are consistent with a scenario in which AGN activity and star formation depend on a common gas supply, regardless of the evolutionary model.