Abstract

The purpose of this work is to study the influence of heat treatment on the corrosion resistance of a newly developed white cast iron, intended primarily for corrosion- and wear-resistant applications, and to attain the microstructure most suitable from the corrosion-resistance point of view. The composition was selected with the aim of having an austenitic matrix in both the as-cast and heat-treated conditions. The difference in electrochemical potential between austenite and carbide is smaller than that between austenite and graphite; additionally, the graphitic corrosion frequently encountered in gray cast irons is absent in white cast irons. These basic facts encouraged us to undertake this work. Optical metallography, hardness testing, X-ray diffractometry, and SEM–EDX techniques were employed to identify the phases present in the as-cast and heat-treated specimens of the investigated alloy and to correlate microstructure with corrosion resistance and hardness. Corrosion testing was carried out in 5 pct NaCl solution (approximately the chloride content of seawater) using the weight-loss method. In the investigated alloy, austenite was retained in both the as-cast and heat-treated conditions, as confirmed by X-ray diffraction and EDX analyses. The stability and volume fraction of austenite increased with increasing heat-treatment temperature/time, with a simultaneous decrease in the volume fraction of massive carbides. The decrease in the volume fraction of massive carbides made alloying elements available; with increasing heat-treatment temperature, or with longer soaking at a given temperature, these elements dissolve in the austenite, which consequently becomes enriched and more stable. On cooling after shorter soaking periods or from lower temperatures, the enriched austenite decomposes into less enriched austenite and a dispersed second phase, owing to the decreasing solid solubility of the alloying elements with decreasing temperature. This dispersed second phase, precipitated from the austenite, adversely influenced corrosion resistance through its unfavorable morphology and enhanced galvanic action. Corrosion rate and hardness were found to decrease with increasing heat-treatment temperature/soaking period, essentially because of the increase in the volume fraction and stability of the austenitic matrix and the favorable morphology of the second-phase carbides. The corrosion resistance of the investigated alloy, heat treated at 1223 K (950 °C) for 8 hours, was comparable to that of Ni-Resist iron. Thus, a microstructure comprising austenite and nearly spherical, finer carbides is the most appropriate from a corrosion point of view. The literature reveals that the same microstructure is also well suited from a wear point of view, confirming that the investigated alloy will be suitable for corrosive-wear applications.
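
For context on the weight-loss method cited above, mass loss is conventionally converted to a corrosion rate using the standard ASTM G1 relation given below; this formula is included for illustration only, and the constant, symbols, and units shown are the conventional ones rather than values reported in this study:

CR = (K · W) / (A · t · ρ)

where CR is the corrosion rate (mm/year when K = 8.76 × 10^4), W is the mass loss (g), A is the exposed specimen area (cm²), t is the exposure time (h), and ρ is the alloy density (g/cm³).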
