A series of heat treatments was employed to vary the microstructure of four commercially important alloy white cast irons, the wear resistance of which was then assessed by the ASTM jaw-crusher gouging abrasion test. Compared with the as-cast condition, standard austenitizing treatments produced a substantial increase in hardness, a marked decrease in the retained austenite content of the matrix, and, in general, a significant improvement in gouging abrasion resistance. The gouging abrasion resistance tended to decline with increasing austenitizing temperature, although the changes in hardness and retained austenite content varied, depending on alloy composition. Subcritical heat treatment at 500 °C following hardening reduced the retained austenite content to values less than 10 pct, and in three of the alloys it caused a significant fall in both hardness and gouging abrasion resistance. The net result of the heat treatments was the development of optimal gouging abrasion resistance at intermediate levels of retained austenite. The differing responses of the alloys to both high-temperature austenitizing treatments and subcritical heat treatments at 500 °C were related to the effects of the differing carbon and alloying-element concentrations on changes in the Ms temperature and on secondary carbide precipitation.