The radio and far-infrared luminosities of star-forming galaxies are tightly correlated over several orders of magnitude; this is known as the far-infrared radio correlation (FIRC). Previous studies have shown that a host of factors conspire to maintain a tight and linear FIRC, despite many models predicting deviation. This discrepancy between expectations and observations is concerning, since a linear FIRC underpins the use of radio luminosity as a star-formation rate indicator. Using LOFAR 150 MHz, FIRST 1.4 GHz, and Herschel infrared luminosities derived from the new LOFAR/H-ATLAS catalogue, we investigate possible variation in the monochromatic (250 $\mathrm{\mu m}$) FIRC at low and high radio frequencies. We use statistical techniques to probe the FIRC for an optically selected sample of 4,082 emission-line-classified star-forming galaxies as a function of redshift, effective dust temperature, stellar mass, specific star-formation rate, and mid-infrared colour (an empirical proxy for specific star-formation rate). Although the average FIRC at high radio frequency is consistent with expectations based on a standard power-law radio spectrum, the average correlation at 150 MHz is not. We see evidence for redshift evolution of the FIRC at 150 MHz, and find that the FIRC varies with stellar mass, dust temperature, and specific star-formation rate, whether the latter is probed using MAGPHYS fitting or using mid-infrared colour as a proxy. Using a Bayesian partial correlation technique, we can explain the variation seen in the FIRC over mid-infrared colour, to within 1$\sigma$, by a combination of dust temperature, redshift, and stellar mass.
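The FIRC is conventionally quantified by a logarithmic infrared-to-radio luminosity ratio, often denoted $q$. A minimal sketch of a monochromatic $q$ and of a residual-based partial correlation (a simple frequentist stand-in for the Bayesian partial correlation technique mentioned above, with hypothetical variable names and synthetic data) might look like:

```python
import numpy as np

def q_250(l_250, l_radio):
    """Illustrative monochromatic FIRC parameter: the log ratio of the
    250-micron luminosity to the radio luminosity (same units, e.g. W/Hz)."""
    return np.log10(l_250 / l_radio)

def partial_corr(x, y, controls):
    """Pearson correlation of x and y after regressing out the control
    variables with ordinary least squares; a frequentist stand-in for
    the Bayesian partial correlation described in the text."""
    Z = np.column_stack([np.ones(len(x))] + list(controls))
    res_x = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    res_y = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return np.corrcoef(res_x, res_y)[0, 1]

# Synthetic demonstration: here q and mid-IR colour correlate only through
# a shared dependence on dust temperature, so the partial correlation that
# controls for temperature should be consistent with zero.
rng = np.random.default_rng(0)
t_dust = rng.normal(size=2000)                 # stand-in for dust temperature
q = t_dust + 0.5 * rng.normal(size=2000)       # FIRC parameter
colour = t_dust + 0.5 * rng.normal(size=2000)  # mid-IR colour proxy
raw = np.corrcoef(q, colour)[0, 1]
partial = partial_corr(q, colour, [t_dust])
```

In this toy setup the raw correlation between `q` and `colour` is strong, while the partial correlation controlling for `t_dust` is near zero, mirroring the logic of attributing FIRC variation over mid-infrared colour to underlying physical parameters.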