Abstract

Quantitatively monitoring the thickness reduction of a pipe using the magnetic flux density method remains a major challenge because the magnetic flux density changes only slightly and gradually over a long distance along the pipe. This small signal change is easily obscured when the detector itself has a high temperature dependence and an unstable background signal. In this paper, we verify the effectiveness of the magnetic flux density method for quantitatively monitoring thickness reduction through both simulation and experiment. Experimental results show a magnetic signal change of 1.4 mT per 10% wall loss. To support this quantitative approach, we optimise an original ultra-high-sensitivity planar Hall sensor with high thermal stability and a time-stable background signal for measuring the magnetic field over a wide range from 0 to 50 mT. The quantitative measurements were validated on test pipes with wall-thickness steps of 0.5 mm.
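As a minimal sketch of how the reported proportionality could be applied, the snippet below converts a measured flux-density change into an estimated wall-loss percentage, assuming the linear relation stated above (1.4 mT per 10% wall loss). The constant and function names are illustrative, not taken from the paper, and real measurements would also need the background-signal correction the abstract describes.

```python
# Illustrative only: assumes the linear relation from the abstract,
# 1.4 mT of flux-density change per 10% wall loss.
MT_PER_10_PERCENT_LOSS = 1.4  # mT per 10% wall loss (assumed constant)

def estimate_wall_loss_percent(delta_b_mT: float) -> float:
    """Estimate wall loss (%) from a measured flux-density change in mT."""
    return delta_b_mT / MT_PER_10_PERCENT_LOSS * 10.0

# Example: a 2.8 mT signal change corresponds to roughly 20% wall loss.
print(estimate_wall_loss_percent(2.8))  # -> 20.0
```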
