The low-temperature thermal conductivity κ₀/T of d-wave superconductors is generally thought to attain a "universal" value, independent of disorder, at sufficiently low temperatures, providing an important measure of the magnitude of the gap slope near its nodes. We discuss situations in which this inference can break down due to competing order and quasiparticle localization. Specifically, we study an inhomogeneous BCS mean-field model with electronic correlations included via a Hartree approximation for the Hubbard interaction, and show that the suppression of κ₀/T by localization effects can be strongly enhanced by magnetic moment formation around potential scatterers.
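For context, the universal-limit result referred to above can be written schematically as follows (a sketch of the standard quasi-2D d-wave expression; here v_F is the nodal Fermi velocity, v_Δ the gap slope at the node, and d the interlayer spacing, with prefactors of order unity depending on convention):

```latex
% Universal low-temperature thermal conductivity of a d-wave superconductor
% (schematic; prefactor conventions vary between references)
\frac{\kappa_0}{T} \simeq \frac{k_B^2}{3\hbar d}
  \left( \frac{v_F}{v_\Delta} + \frac{v_\Delta}{v_F} \right)
```

Since v_F ≫ v_Δ in the cuprates, this reduces to κ₀/T ∝ v_F/v_Δ, which is why the universal limit, when it holds, serves as a direct measure of the gap slope at the nodes and is insensitive to the impurity scattering rate.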