The use of white wear-resistant cast iron instead of steel 110G13L for the production of rapidly wearing parts of crushing and milling equipment increases their service life [1]. The choice of rational alloying of cast iron can be based on a correct understanding of the proportion of the excess phases, their morphology, and the nature of alloying of the metallic matrix, which should provide maximum wear resistance under abrasion. It is known that cast iron of the type ICh250Kh16 alloyed with molybdenum has the highest wear resistance when the matrix consists of martensite with a certain amount of metastable retained austenite [2]. The presence of even a small amount of nonmartensitic products of the decomposition of austenite, i.e., pearlite or troostite, in the structure of the matrix markedly decreases the wear resistance of the cast iron. There is an opinion that the contribution of retained austenite to the wear resistance is determined by its stability. The occurrence of a martensitic transformation in metastable austenite promotes an increase in the wear resistance of the cast iron, whereas in the presence of stable retained austenite both the initial hardness of the cast iron and its wear resistance decrease [2].

The aim of the present work consisted in determining the role of the phase composition of the matrix and of the stability of retained austenite in providing the wear resistance of sparingly alloyed wear-resistant chromium cast irons under abrasive wear, and in developing optimum regimes for their heat treatment.