To investigate the effect of electronic energy losses on nuclear damage in UO2, we use a simple Rate Theory (RT) model based on the time evolution of single point defects, governed by their absorption at the surfaces of TEM lamellae and by the nucleation of interstitial-type dislocation loops, which are characterized solely by their number and average size. We first parametrize the model by fitting six experimental datasets spanning different temperatures, ion types, and energies (0.39 MeV Xe and 4 MeV Au ion irradiations at 93, 298, and 873 K), in which defect evolution in UO2 is dominated by displacement damage caused by nuclear energy losses. The model suggests that dislocation evolution kinetics is driven by monomer diffusion at 873 K. At lower temperatures (93 and 298 K), monomer diffusion has little impact, and the evolution is governed by the nucleation of loops within the collision cascades. Analyzing four additional experimental datasets at 93 and 298 K in which electronic energy losses are strong (single 6 MeV Si and dual simultaneous Xe & Si ion irradiations) required modification of the monomer diffusion coefficients. In the case of single Si ion irradiation, the evolution is temperature independent, and the enhanced diffusion due to electronic excitations and ionizations is best captured by adding an athermal component to the monomer diffusion coefficients. In the case of dual Xe & Si ion irradiation, the electronic excitation caused by the Si ions acts on the defects pre-generated by the Xe ions and enhances defect diffusion through local heating induced by the thermal spikes of the Si ions. This effect is best captured by artificially raising the irradiation temperature in the model.
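To make the structure of such a model concrete, the sketch below integrates a deliberately simplified RT system: one interstitial-monomer concentration lost to the lamella surfaces and to growing loops, a loop population described only by its number density and average radius, in-cascade loop nucleation, and a monomer diffusion coefficient with an optional athermal component. This is a minimal illustration under stated assumptions, not the paper's implementation; all numerical values (`G`, `D0`, `Em`, `eta`, `n0`, the lamella thickness, the sink-strength forms) are hypothetical placeholders rather than the fitted parameters of this work.

```python
# Minimal rate-theory sketch (assumptions only, not the paper's code):
# C_i  = interstitial-monomer concentration (atomic fraction)
# N_l  = loop number density (m^-3), r_l = average loop radius (m)
import numpy as np
from scipy.integrate import solve_ivp

kB = 8.617e-5            # Boltzmann constant (eV/K)
Omega = 4.09e-29         # atomic volume of UO2 (m^3), approximate
b = 0.39e-9              # Burgers vector magnitude (m), approximate
L = 100e-9               # TEM lamella thickness (m), placeholder
k_s2 = (np.pi / L)**2    # thin-foil surface sink strength (m^-2)
G = 1.0e-4               # defect generation rate (per site per s), placeholder
eta = 1.0e-3             # fraction of defects nucleating loops in-cascade
n0 = 4.0                 # interstitials per freshly nucleated loop, placeholder

def D_i(T, D0=1e-7, Em=1.0, D_ath=0.0):
    # Thermal Arrhenius diffusion plus an optional athermal term D_ath,
    # mimicking the electronic-excitation-enhanced diffusion invoked for
    # the single 6 MeV Si irradiations.
    return D0 * np.exp(-Em / (kB * T)) + D_ath

def rhs(t, y, T, D_ath):
    C_i, N_l, r_l = y
    D = D_i(T, D_ath=D_ath)
    k_l2 = 2.0 * np.pi * r_l * N_l               # loop sink strength (m^-2)
    dC_i = (1.0 - eta) * G - D * C_i * (k_s2 + k_l2)  # generation - losses
    dN_l = eta * G / (n0 * Omega)                # in-cascade nucleation (m^-3 s^-1)
    dr_l = D * C_i / b                           # growth by monomer capture,
                                                 # consistent with the k_l2 term
    return [dC_i, dN_l, dr_l]

# Purely thermal kinetics at 873 K vs. athermal-enhanced kinetics at 298 K.
for T, D_ath in [(873.0, 0.0), (298.0, 1e-18)]:
    sol = solve_ivp(rhs, (0.0, 1e4), [1e-10, 1e16, 0.5e-9],
                    args=(T, D_ath), method="LSODA")
    C_i, N_l, r_l = sol.y[:, -1]
    print(f"T = {T:4.0f} K, D_ath = {D_ath:.0e} m^2/s -> "
          f"N_l = {N_l:.2e} m^-3, r_l = {r_l * 1e9:.2f} nm")
```

In this toy setting, setting `D_ath > 0` stands in for the athermal component used for the single Si beam, while simply raising `T` in the call stands in for the artificially increased irradiation temperature used to represent thermal-spike heating in the dual Xe & Si case.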