Abstract

According to databases such as CONCAWE and PHMSA, corrosion failures of onshore pipelines accounted for about 16% of all incidents between 2004 and 2011. Corrosion monitoring has therefore become a major objective within the oil industry. The most popular technique used for this purpose is In-Line Inspection (ILI), which is used to determine overall pipeline status (e.g., inner and outer condition of the pipe and wall thickness). It is widely used for risk management with standards such as ASME B31G or API 579-1/ASME FFS-1. However, these approaches do not take into account the uncertainty associated with ILI inspection tools (e.g., MFL and UT). Several investigations have been conducted to reduce the noise generated and to measure metal losses accurately, but important deficiencies remain in evaluating both the spatial and the time-dependent variability of damage. This work seeks to use data obtained from ILI to better support risk-based decision making. The proposed approach uses corrosion growth models to estimate the remaining life of the pipeline based on mechanistic models (e.g., pressure failure criteria). This information, combined with pattern recognition techniques, clustering, and reliability concepts, is then used to obtain base failure probabilities. Actual data from two ILI measurements were used to validate the model. The results show that, by using statistical clustering of data, the failure probability may increase by 10% in comparison with the corresponding defects evaluated individually. This result has an important impact on any decision regarding life-cycle analysis.
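The abstract's central claim, that a cluster of interacting defects yields a higher failure probability than the same defects evaluated individually, can be illustrated with a minimal Monte Carlo sketch. This is not the authors' model: the wall thickness, depth-based failure criterion, linear growth rates, and defect parameters below are all illustrative assumptions, and the cluster is treated as a simple series system (the colony fails if any member fails).

```python
import random

random.seed(0)

WALL = 10.0      # wall thickness in mm (assumed)
CRITICAL = 0.8   # failure when depth exceeds 80% of wall (assumed criterion)
YEARS = 15       # projection horizon (assumed)
N = 20000        # Monte Carlo samples

def sampled_depth(d0, rate_mu, rate_sigma, years):
    """Linear corrosion growth with an uncertain, non-negative rate (assumed model)."""
    rate = max(0.0, random.gauss(rate_mu, rate_sigma))
    return d0 + rate * years

# Two nearby defects with hypothetical (initial depth mm, mean rate, rate std) values.
defects = [(5.0, 0.18, 0.06), (4.5, 0.20, 0.06)]

fail_individual = [0, 0]
fail_cluster = 0
for _ in range(N):
    depths = [sampled_depth(d0, mu, s, YEARS) for d0, mu, s in defects]
    failed = [d > CRITICAL * WALL for d in depths]
    for i, f in enumerate(failed):
        fail_individual[i] += f
    # Series-system view of the cluster: it fails if any member fails.
    fail_cluster += any(failed)

p_ind = [f / N for f in fail_individual]
p_clu = fail_cluster / N
print("individual:", p_ind, "cluster:", p_clu)
```

Because the cluster fails whenever any member fails, its probability is always at least as large as the largest individual probability, which is the qualitative effect the abstract reports (the paper's actual 10% figure comes from its own growth and clustering models, not from this sketch).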
