Nuclear burning plays a key role in a wide range of astrophysical stellar transients, including thermonuclear, pair-instability, and core-collapse supernovae, as well as kilonovae and collapsars. Turbulence is now understood to also play a key role in these transients. Here, we demonstrate that turbulent nuclear burning may lead to large enhancements above the uniform background burning rate, since turbulent dissipation gives rise to temperature fluctuations, and nuclear burning rates are in general highly sensitive to temperature. Using probability distribution function methods, we derive results for the turbulent enhancement of the nuclear burning rate under the influence of strong turbulence in the distributed burning regime in homogeneous isotropic turbulence. We show that the turbulent enhancement obeys a universal scaling law in the limit of weak turbulence. We further show that, for a wide range of key nuclear reactions, such as ^{12}C(^{16}O, α)^{24}Mg and triple-α, even relatively modest temperature fluctuations, of the order of 10%, can lead to enhancements of 1–3 orders of magnitude in the turbulent nuclear burning rate. We verify the predicted turbulent enhancement directly against numerical simulations and find very good agreement. We also present an estimate of the criterion for the onset of turbulent detonation initiation, and discuss implications of our results for stellar transients.
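The core mechanism can be illustrated with a minimal numerical sketch (not the paper's actual derivation): averaging a steeply temperature-dependent burning rate over a distribution of temperature fluctuations. Here we assume, purely for illustration, a power-law rate λ(T) ∝ T^ν with a hypothetical exponent ν = 40 (of the order of the sensitivity of reactions like triple-α near 10^8 K) and a Gaussian PDF of fractional temperature fluctuations; the paper's PDF and rate forms may differ.

```python
import numpy as np

def turbulent_enhancement(nu: float, sigma: float,
                          zmax: float = 8.0, n: int = 20001) -> float:
    """Return <lambda(T)> / lambda(T0) for an illustrative power-law rate
    lambda ~ T**nu, averaging over T = T0*(1 + sigma*Z), Z ~ N(0, 1).

    nu    : assumed temperature-sensitivity exponent (hypothetical here)
    sigma : rms fractional temperature fluctuation (e.g. 0.10 for ~10%)
    """
    z = np.linspace(-zmax, zmax, n)              # standardized fluctuation
    t = np.clip(1.0 + sigma * z, 1e-12, None)    # T/T0, guarded against T <= 0
    pdf = np.exp(-0.5 * z**2) / np.sqrt(2.0 * np.pi)
    dz = z[1] - z[0]
    return float(np.sum(t**nu * pdf) * dz)       # trapezoid-like Riemann sum

# ~10% fluctuations: enhancement reaches a few hundred, i.e. 2-3 orders
# of magnitude, consistent with the range quoted in the abstract.
enh = turbulent_enhancement(nu=40.0, sigma=0.10)
print(f"enhancement for nu=40, sigma=0.10: {enh:.0f}x")

# Weak-fluctuation limit: for small sigma the enhancement approaches
# 1 + nu*(nu - 1)*sigma**2 / 2, quadratic in the fluctuation amplitude.
small = turbulent_enhancement(nu=40.0, sigma=0.01)
print(f"sigma=0.01: {small:.4f} vs quadratic estimate {1 + 40*39*0.01**2/2:.4f}")
```

The quadratic small-amplitude behavior shown at the end is the generic leading-order result of expanding any smooth rate about the mean temperature; it is meant only to make the existence of a weak-turbulence scaling law plausible, not to reproduce the paper's universal scaling law.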