Abstract

The effect of heating rate and grain size on the melting behavior of Nb–47 mass% Ti is measured and modeled. The experimental method uses rapid resistive self-heating of wire specimens at rates between ∼10² and ∼10⁴ K/s and simultaneous measurement of radiance temperature and normal spectral emissivity as functions of time until specimen collapse, typically between 0.4 and 0.9 fraction melted. During heating, a sharp drop in emissivity is observed at a temperature that is independent of heating rate and grain size. This drop is due to surface and grain boundary melting at the alloy solidus temperature even though there is very little deflection (limited melting) of the temperature–time curve from the imposed heating rate. Above the solidus temperature, the emissivity remains nearly constant with increasing temperature and the temperature vs time curve gradually reaches a sloped plateau over which the major fraction of the specimen melts. As the heating rate and/or grain size is increased, the onset temperature of the sloped plateau approaches the alloy liquidus temperature and the slope of the plateau approaches zero. This interpretation of the shapes of the temperature–time curves is supported by a model that includes diffusion in the solid coupled with a heat balance during the melting process. There is no evidence of loss of local equilibrium at the melt front during melting in these experiments.
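To illustrate why a sloped plateau appears between the solidus and liquidus, the sketch below integrates a simple heat balance for a binary alloy under constant net heating power, using an equilibrium (lever-rule-like) liquid fraction. This is a minimal sketch, not the authors' diffusion-coupled model, and all property values and the linear solidus–liquidus relation are placeholder assumptions chosen only to show the shape of the temperature–time curve.

```python
# Minimal sketch (not the authors' model): heat balance during melting of a
# binary alloy under constant net specific heating power, with an assumed
# equilibrium (lever-rule-like) liquid fraction between solidus and liquidus.
# All numerical values below are placeholder assumptions, not measured data.

T_SOL, T_LIQ = 2100.0, 2350.0   # assumed solidus / liquidus temperatures, K
CP = 500.0                      # assumed specific heat, J/(kg K)
LATENT = 3.0e5                  # assumed latent heat of fusion, J/kg
Q_NET = CP * 1.0e3              # net specific power giving ~10^3 K/s in the solid, W/kg

def liquid_fraction(T):
    """Assumed equilibrium liquid fraction, linear in T between solidus and liquidus."""
    return min(1.0, max(0.0, (T - T_SOL) / (T_LIQ - T_SOL)))

def dfdT(T):
    """Derivative of the assumed liquid fraction with respect to temperature."""
    return 1.0 / (T_LIQ - T_SOL) if T_SOL < T < T_LIQ else 0.0

# Explicit Euler integration of the heat balance:
#   q = (c_p + L * df/dT) * dT/dt   =>   dT/dt = q / (c_p + L * df/dT)
dt, t, T = 1.0e-4, 0.0, 1800.0
history = []
while liquid_fraction(T) < 0.9:          # stop near a typical collapse fraction
    dTdt = Q_NET / (CP + LATENT * dfdT(T))
    T += dTdt * dt
    t += dt
    history.append((t, T, liquid_fraction(T)))

print(f"Heating rate drops from {Q_NET / CP:.0f} K/s in the solid to "
      f"{Q_NET / (CP + LATENT / (T_LIQ - T_SOL)):.0f} K/s on the melting plateau.")
```

In this equilibrium limit the plateau begins at the solidus; coupling the heat balance to finite solid-state diffusion, as in the paper's model, delays the bulk of the melting toward the liquidus at high heating rates or large grain sizes.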
