Abstract

Subcritical heat treatment, in which alloyed cast iron is held at a temperature below the pearlite transformation temperature (A1), has been used to improve matrix hardness without conventional hardening heat treatment. Two series of hypoeutectic 16 and 26 wt% Cr cast irons containing 0, 1, 2 and 3 wt% molybdenum (Mo) were used to investigate the behavior of hardness and retained austenite during subcritical heat treatment. As-cast test pieces were held at subcritical temperatures, at 50 K intervals from 773 K (500 °C, 932 °F) to 873 K (600 °C, 1112 °F), for 21.6 ks to 64.8 ks and then cooled to room temperature by fan cooling. Hardness and the volume fraction of retained austenite (Vγ) were measured. In the as-cast state, the hardness decreased gradually while the Vγ increased greatly as the Mo content increased in both the 16 and 26 wt% Cr cast irons. After subcritical heat treatment, the hardness first increased and then decreased with increasing holding time. This behavior is attributed to hardening caused by the precipitation of secondary carbides and by martensite transformation from the destabilized austenite during cooling. At the same Mo content, the degree of hardening was greater in the 16 wt% Cr cast iron than in the 26 wt% Cr cast iron. The Vγ decreased with increases in both holding time and holding temperature. The maximum hardness in subcritical heat treatment (HSTmax) was obtained when the specimens were held at temperatures from 823 K (550 °C, 1022 °F) to 873 K (600 °C, 1112 °F) for 50.4 ks. The HSTmax increased gradually with Mo content in the 16 wt% Cr cast iron but changed little in the 26 wt% Cr cast iron. The highest HSTmax value, 760 HV30, was obtained in the 16 wt% Cr cast iron with 3 wt% Mo, where the Vγ was less than 10%.
