Abstract

Phased array ultrasonic testing (PAUT) requires highly trained and qualified personnel to interpret and analyze images. Comprehending the generated images takes a solid understanding of wave propagation physics, so the inspector's judgment and level of experience have a significant impact on the outcome of the analysis. In addition, the procedure is laborious and prone to error. AI has been shown to be effective in computer vision across a variety of classification and detection tasks. Regarding PAUT, studies have also demonstrated that machine learning may be able to identify defects with a level of accuracy on par with, or even superior to, that of trained and qualified inspectors. Nonetheless, the use of computer vision in PAUT remains very limited. The primary cause is the challenge of accessing large databases of labelled inspections, as machine learning requires a considerable amount of training data. While sizeable labelled databases of MRI scans or photographs, for instance, are easy to access, that is not the case in PAUT because inspection results are usually confidential. In this project, a large database was generated using mock-ups commonly used to train and evaluate inspectors, and the defects contained in these mock-ups were used to train a machine learning model. The data was acquired with several probes centered at different frequencies, and each acquisition was performed using Full Matrix Capture (FMC). Post-processing the data contained in an FMC acquisition makes it possible to compute any sectoral scan from its focal laws. As a result, a comprehensive database of hundreds of thousands of sectoral scans was generated from these few FMC acquisitions. The completeness of this database enabled robust training of a defect detection model for PAUT sectoral scans. Evaluation of the model demonstrated its ability to generalize even to defect types it had never been trained on, and its detection performance remained consistent even in high-noise conditions where the Contrast-to-Noise Ratio (CNR) was very low.
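The core data-augmentation idea in the abstract, computing arbitrary sectoral scans from a single FMC acquisition by applying focal laws in post-processing, can be illustrated with a delay-and-sum sketch. The code below is a minimal illustration rather than the authors' implementation: it assumes a 1-D linear array in direct contact with the part, a constant sound speed (no wedge or refraction), nearest-sample indexing instead of interpolation, and FMC data stored as a NumPy array of shape (n_elements, n_elements, n_samples); the function and parameter names are hypothetical.

```python
import numpy as np

def fmc_to_sectoral_scan(fmc, pitch, c, fs, angles_deg, ranges):
    """Delay-and-sum a sectoral scan (S-scan) from FMC data.

    Hypothetical sketch: assumes a 1-D contact array and constant sound
    speed; real implementations add interpolation, apodisation and
    wedge/refraction corrections.

    fmc        : (n_elem, n_elem, n_samples) A-scans, one per tx/rx pair
    pitch      : element pitch [m]
    c          : sound speed [m/s]
    fs         : sampling frequency [Hz]
    angles_deg : steering angles of the sectoral scan [degrees]
    ranges     : radial distances along each beam [m]
    """
    ranges = np.asarray(ranges, dtype=float)
    n_elem, _, n_samples = fmc.shape
    # Element x-positions, centred on the array midpoint.
    x = (np.arange(n_elem) - (n_elem - 1) / 2) * pitch

    scan = np.zeros((len(angles_deg), len(ranges)))
    for i, theta in enumerate(np.deg2rad(angles_deg)):
        # Focal points along the beam steered at angle theta.
        px, pz = ranges * np.sin(theta), ranges * np.cos(theta)
        for j in range(len(ranges)):
            # Focal law: round-trip time of flight for every tx/rx pair.
            d = np.sqrt((x - px[j]) ** 2 + pz[j] ** 2)  # element-to-point
            tof = (d[:, None] + d[None, :]) / c         # tx path + rx path
            idx = np.round(tof * fs).astype(int)
            valid = idx < n_samples                     # drop late arrivals
            tx, rx = np.nonzero(valid)
            scan[i, j] = np.abs(fmc[tx, rx, idx[valid]].sum())
    return scan

# Illustrative call: a 64-element probe sampled at 100 MHz in steel.
# sscan = fmc_to_sectoral_scan(fmc, pitch=0.6e-3, c=5900.0, fs=100e6,
#                              angles_deg=np.arange(-45, 46),
#                              ranges=np.linspace(5e-3, 60e-3, 256))
```

Because the focal laws are applied purely in post-processing, sweeping the steering angles, the aperture, or the active sub-aperture over the same FMC dataset yields a new sectoral scan each time, which is how a handful of acquisitions can be expanded into hundreds of thousands of training images. As for the noise robustness claim, CNR in ultrasonic NDT is commonly taken as the difference between the mean defect amplitude and the mean background amplitude divided by the standard deviation of the background, though the exact definition varies between studies.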
