Abstract

Elemental powders of iron (Fe), molybdenum (Mo), and carbon (C) were mixed in a pot mill to obtain the compositions of C45, C45–1% Mo, and C45–2% Mo steels. They were then compacted and sintered. The sintered preforms had a density of 75% of the theoretical density (TD). The sintered preforms were then densified by forging to two density levels, 80% and 85% TD. The sintered and densified preforms of the alloy steel were subsequently machined to obtain the required wear test specimens. The experiments were conducted on a pin-on-disc tribometer, conforming to the ASTM G99 standard, against a rotating EN32 disc. Dry sliding wear experiments were planned using an L27 orthogonal array in Minitab 16 software. The percentage TD of the specimens (%TD + %porosity = 100%), percentage Mo addition, load, and sliding velocity were taken as input parameters, and mass loss was the output parameter. It was observed that increasing the density of the alloy steel adversely affects its wear resistance, and thus the mass loss is increased. The addition of Mo to the C45 steel improves the wear resistance irrespective of density, owing to hard-phase carbides present in the microstructure. Empirical correlations for mass loss with respect to the input parameters were developed using regression analysis. The hardness of the alloy steel was directly related to its density, and Mo addition contributed to a further increase in hardness. Optical images of the wear pattern showed that the C45 steel is subjected to uniform wear, as an evenly spread wear track appeared in the images. On the other hand, the C45–Mo-alloyed steel exhibited non-uniform wear because of the hard phases present in the microstructure.
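The design-of-experiments and regression workflow described above can be sketched in code. The following is a minimal illustration only: the factor levels and regression coefficients are assumed placeholders, not the study's measured data, and the 27-run layout is generated by a simple base-3 modular scheme in the spirit of a 3-level fractional design rather than the exact standard L27 column assignment.

```python
import numpy as np

# Assumed 3-level settings for the four input parameters (illustrative values,
# not taken from the paper except %TD and %Mo, which the abstract states):
levels = {
    "td":   [75.0, 80.0, 85.0],   # % theoretical density
    "mo":   [0.0, 1.0, 2.0],      # % Mo addition
    "load": [10.0, 20.0, 30.0],   # applied load, N (assumed)
    "vel":  [1.0, 2.0, 3.0],      # sliding velocity, m/s (assumed)
}

# 27 runs: three factors cycle through all 3^3 combinations; the fourth
# column is generated modularly, as fractional 3-level designs confound
# a column with interactions of the others (a sketch, not the exact L27).
runs = []
for i in range(27):
    a, b, c = i // 9, (i // 3) % 3, i % 3
    d = (a + b + c) % 3
    runs.append([levels["td"][a], levels["mo"][b],
                 levels["load"][c], levels["vel"][d]])
X = np.array(runs)

# Synthetic mass-loss response built from assumed coefficients, purely to
# demonstrate fitting a linear empirical correlation by least squares.
true_beta = np.array([0.05, -0.8, 0.12, 0.9])
y = 2.0 + X @ true_beta

A = np.column_stack([np.ones(len(X)), X])      # prepend intercept column
beta, *_ = np.linalg.lstsq(A, y, rcond=None)   # fitted empirical correlation
print(np.round(beta, 3))
```

On noiseless synthetic data the fit recovers the generating coefficients exactly; with real wear measurements one would also inspect R², residuals, and main-effects plots, which is the role Minitab plays in the study.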
