To reduce the massive CO2 emissions from coal-fired power plants, oxy-fuel combustion is considered one of the most effective and promising technologies. In this process, coal is burnt in pure O2 rather than air, so the resultant flue gas is mainly CO2 and H2O, making subsequent CO2 capture and sequestration feasible. Unfortunately, CO2 gas has been found to be highly corrosive to the steels used for critical heat-exchanging components in boilers, causing severe oxidation and carburisation degradation. In addition, operating temperatures must be increased to improve boiler heat efficiency and meet continuously growing energy demand, e.g. in advanced ultra-supercritical power generation. Under these circumstances, nickel-based alloys are candidate materials owing to their superior creep strength and corrosion resistance at higher temperatures compared with iron-based alloys. However, little is known about the corrosion behaviour of Ni-based alloys in the CO2-rich gases at the high temperatures associated with oxy-fuel technology.

This paper investigated the effect of temperature on the corrosion of model binary Ni-Cr alloys containing 5, 10, 15, 20, 25 and 30 wt% Cr in Ar-20 vol.% CO2 gas at 650, 700 and 800°C. Oxidation kinetics were determined by weight gain measurements. Figure 1a shows that the weight gain kinetics at 700°C followed approximately a parabolic rate law. Reaction rates of the alloys with 5, 10, 15 and 20 wt% Cr were closely similar, with the Ni-15Cr alloy slightly higher. Increasing the Cr content to 25 wt% decreased the rate of weight gain, and a further increase to 30 wt% led to very low weight gains. Phase identification by XRD showed the reaction products were predominantly NiO and Cr2O3, with trace amounts of NiCr2O4 detected in some samples. Metallographic cross-section analyses showed duplex or multiple oxide layers on the alloy surfaces.
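The parabolic rate law mentioned above, (ΔW/A)² = k_p·t, is conventionally fitted by plotting the squared area-specific weight gain against exposure time and taking the slope as the rate constant k_p. A minimal sketch of that fit is shown below; the times, weight gains and the rate constant are hypothetical illustrative values, not measured data from this study.

```python
import numpy as np

# Parabolic oxidation kinetics: (dW)^2 = k_p * t,
# where dW is weight gain per unit area (mg/cm^2) and t is time (h).
# Hypothetical example data, NOT measurements from this work:
t = np.array([10.0, 25.0, 50.0, 100.0, 150.0])   # exposure time, h
k_p_true = 4.0e-3                                # assumed k_p, mg^2 cm^-4 h^-1
dw = np.sqrt(k_p_true * t)                       # resulting weight gain, mg/cm^2

# Least-squares slope of (dW)^2 vs t through the origin gives k_p.
k_p_fit = np.sum(dw**2 * t) / np.sum(t**2)
print(f"fitted k_p = {k_p_fit:.2e} mg^2 cm^-4 h^-1")
# -> fitted k_p = 4.00e-03 mg^2 cm^-4 h^-1
```

Forcing the fit through the origin is the usual choice when the parabolic regime holds from the start of exposure; with an initial transient, an intercept term would be added.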
SEM-EDS analysis revealed that the outer layer was NiO, while the inner layer was composed of Cr2O3, NiO and, where present, NiCr2O4 spinel. An internal oxidation zone (IOZ) was observed in all alloys except Ni-30Cr, which formed a continuous Cr2O3 layer on the surface. Thus the minimum Cr concentration for protective Cr2O3 scale formation in Ni at 700°C is 30 wt%.

Temperature affected both the oxidation behaviour of the Ni-Cr alloys and the critical Cr concentration required for the transition from non-protective to protective oxidation. Figure 1b compares the weight gains of the alloys at the three temperatures after 150 h of exposure. In general, weight gain increased with increasing temperature for the alloys containing 5 to 20 wt% Cr, whereas for Ni-25Cr and Ni-30Cr the weight gains were low and reached their lowest values at 800°C. At 650 and 700°C, the weight gains showed no significant change between 5 and 20 wt% Cr. At 700°C, the weight gain decreased gradually as the Cr concentration increased from 20 to 25 wt%, and decreased further at 30 wt%. At 650°C, however, the weight gain continued to increase with Cr concentration up to 25 wt% before decreasing at 30 wt%. At 800°C, the weight gain increased slightly from 5 to 15 wt% Cr, then decreased substantially from 15 to 25 wt% Cr; a further increase to 30 wt% lowered the weight gain only slightly more. Metallographic cross-sections in Fig. 2 reveal that a protective Cr2O3 scale formed on the Ni-25Cr alloy at 800°C but not at 700°C, where a higher Cr level was required; at 650°C, no exclusively protective Cr2O3 scale formed even at 30 wt% Cr. Within the range examined, the critical Cr concentration required for protection therefore decreases with increasing temperature.