The traditional probability of detection (POD) method, as described in the Department of Defense Handbook MIL-HDBK-1823A for nondestructive evaluation systems, does not account for the time dependency of data collection. When applied to in situ sensors for the measurement of flaw sizes, such as fatigue-induced crack length and corrosion-induced mass loss, the validity and reliability of the traditional method are unknown. In this paper, the POD for in situ sensors and the associated reliability assessment of detectable flaw sizes are evaluated using a size-of-damage-at-detection (SODAD) method and a random parameter model (RPM). Although applicable to other sensors, this study focuses on long-period fiber grating (LPFG) corrosion sensors with a thin Fe–C coating. The SODAD method uses the corrosion-induced mass loss at the first successful detection by each sensor, while the RPM accounts for the randomness of, and differences between, the mass-loss datasets from different sensors. The Fe–C coated LPFG sensors were tested in 3.5 wt.% NaCl solution until the wavelength of the transmission spectrum no longer changed. For 70% of the tested sensors, the wavelength shift ranged from 6 to 10 nm. Given a detection threshold of 2 nm in wavelength, the mass losses at 90% POD are 31.87%, 37.57%, and 34.00%, which are relatively consistent, and the upper-bound mass losses at the 95% confidence level are 33.20%, 47.30%, and 40.83% from the traditional, SODAD, and RPM methods, respectively. In comparison with the SODAD method, the RPM method is more robust to departures from model assumptions since significantly more data are used. For the 90% POD at the 95% confidence level, the traditional method underestimated the mass loss by approximately 19%, which is unconservative in engineering applications.
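The paper's own statistical models are not reproduced in the abstract, but the kind of quantity it reports (a flaw size at 90% POD with a 95% upper confidence bound) can be illustrated with a minimal sketch. The snippet below assumes a SODAD-style dataset of mass losses at first detection, treats them as normally distributed, and computes the 90th percentile plus a one-sided tolerance bound via the noncentral t distribution; the function name, the distributional assumption, and the sample data are illustrative, not the authors' method.

```python
import numpy as np
from scipy import stats

def size_at_pod(sizes, pod=0.90, conf=0.95):
    """Sketch of a SODAD-style estimate: flaw size (e.g., % mass loss)
    detected with probability `pod`, plus a one-sided upper bound at
    confidence `conf`, assuming normally distributed detection sizes."""
    sizes = np.asarray(sizes, dtype=float)
    n = sizes.size
    mu = sizes.mean()
    sigma = sizes.std(ddof=1)          # sample standard deviation
    z = stats.norm.ppf(pod)            # standard-normal quantile for POD
    a_pod = mu + z * sigma             # size at `pod` POD (point estimate)
    # One-sided tolerance factor: k = t'_{conf, n-1, z*sqrt(n)} / sqrt(n),
    # where t' is the noncentral t quantile. The bound covers at least
    # `pod` of the population with `conf` confidence.
    k = stats.nct.ppf(conf, df=n - 1, nc=z * np.sqrt(n)) / np.sqrt(n)
    a_pod_conf = mu + k * sigma        # e.g., the "a90/95" value
    return a_pod, a_pod_conf

# Hypothetical mass losses (%) at first detection from eight sensors
losses = [28.0, 31.5, 25.2, 30.1, 27.8, 29.4, 26.6, 32.0]
a90, a90_95 = size_at_pod(losses)
```

With a small sample, the tolerance factor k is noticeably larger than z, so the 95% bound sits well above the point estimate, mirroring the gap the abstract reports between the 90% POD mass losses and their upper bounds.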