ABSTRACT

The growth of oxide films on Fe-Cr-Ni alloys in simulated pressurized water reactor (PWR) primary system environments was investigated, with a focus on the influence of alloy composition and temperature. Oxide film thicknesses were measured on 90 specimens prepared from compact tension (CT) test pieces, and cross-sectional observations were also conducted on coupon specimens using scanning transmission electron microscopy (STEM). The behavior of Ni-based alloys, specifically Alloy 600, and Fe-based alloys, specifically Type 316 stainless steel, was compared. Ni-based alloys exhibit superior corrosion resistance owing to slower Fe diffusion kinetics than in Fe-based alloys, which promotes the formation of a nickel-enriched protective layer at the metal-oxide interface. The oxidation rates of Ni-based alloys follow an Arrhenius-type temperature dependence. In contrast, Fe-based alloys show more complex temperature-dependent behavior, with reduced oxidation rates at higher temperatures that are likely due to changes in oxide dissolution. In both alloy types, higher chromium concentrations improve the protectiveness of the oxide film against corrosion by promoting a more compact inner oxide layer. The study identifies three primary factors influencing oxide film growth: reactions at the metal-oxide interface, diffusion within the oxide layer, and processes at the oxide-solution interface. The relative significance of these processes varies with alloy composition and temperature.
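For reference, the Arrhenius-type temperature dependence noted above for the Ni-based alloys has the standard form (the symbols here are generic and not taken from the paper's own notation)

\[
k(T) = A \exp\!\left(-\frac{Q}{RT}\right),
\]

where \(k(T)\) is the oxidation rate constant at absolute temperature \(T\), \(A\) is a pre-exponential factor, \(Q\) is the apparent activation energy, and \(R\) is the gas constant. A plot of \(\ln k\) against \(1/T\) is accordingly linear with slope \(-Q/R\) for the Ni-based alloys, whereas the Fe-based alloys deviate from this linearity at higher temperatures.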