In situ irradiations with 1 MeV Kr ions at 50–613 K up to a fluence of 6.25 × 10¹⁴ ions/cm² (~1.25 displacements per atom, dpa) have been performed on pre-aged dilute Cu-0.9%Co, Cu-0.9%Fe and Cu-0.8%Cr alloys containing uniform matrix dispersions of coherent precipitates, in order to study the effects of initial precipitate sink strength, damage dose and irradiation temperature on radiation-induced coherency loss of precipitates. Coherent precipitates with different point-defect sink strengths (2πNd, where N and d are the precipitate number density and diameter) were used in this work to examine potential differences in atomic relaxation during absorption of point defects. In all cases, irradiation to low doses (≲1 dpa) was very effective at inducing loss of precipitate coherency. At low sink strengths (~10¹³ m⁻²), loss of precipitate coherency could be induced at doses of ~0.01 dpa. This suggests an efficient, medium-range strain-induced bias for preferential absorption of interstitial defects, arising from the tensile strains emanating from the undersized precipitates, which drives relatively rapid loss of coherency at low precipitate sink strengths. High-sink-strength conditions (~10¹⁴ m⁻²) were relatively resistant to radiation-induced loss of coherency (requiring higher doses approaching ~1 dpa), possibly because nearly equal numbers of interstitial and vacancy defects arrive at the precipitate interface under such conditions. The precipitate coherency loss was observed to have a weak dependence on irradiation temperature. Molecular dynamics simulations confirm a strong effect of precipitate sink strength on the probability of interstitial absorption at precipitates.