Abstract

Ultrasonic Time-of-Flight Diffraction (TOFD) has proved highly effective for the inspection of welds, providing accurate positioning and sizing of defects. Currently, most TOFD data interpretation is performed off-line by a trained operator using interactive software aids. This processing is highly dependent on operator experience, alertness and consistency, and it is cumbersome and time-consuming. Results typically suffer from inconsistency and slight inaccuracies, particularly when dealing with large volumes of data. The recent trend in the related disciplines of remote sensing and medical imaging is to automate the data processing and interpretation as far as possible, relieving the expert, to some extent, of unnecessary or repetitive tasks. It is anticipated that TOFD interpretation could benefit from similar automation, which would add robustness, accuracy and consistency to the interpretation procedures. This can be achieved by discriminating between subtle variations in the visual and spectral properties of the data, resulting in savings in time, effort and cost. This paper presents a scheme for the automatic processing of TOFD data and detection of weld defects as part of a comprehensive TOFD inspection and interpretation aid. A number of signal and image processing tools have been specifically adapted for use with ultrasonic TOFD data and developed to function autonomously, without the need for continuous operator intervention. It is hoped that this will form the basis for a new paradigm in ultrasonics: fully automatic batch processing and interpretation.
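For illustration only, the sketch below shows the kind of autonomous processing step such a scheme might contain: it gates the part of a TOFD B-scan lying between the lateral wave and the back-wall echo, computes the signal envelope, and flags scan positions whose response exceeds a noise-based threshold. This is not the method described in the paper; the array layout, gate limits, threshold factor and function names are all illustrative assumptions.

import numpy as np
from scipy.signal import hilbert

def detect_indications(bscan, gate=(100, 900), k=5.0):
    """Flag scan positions whose gated envelope exceeds k times the noise level.

    bscan : 2-D array, shape (n_time_samples, n_scan_positions)
    gate  : (start, stop) sample indices between the lateral wave and the
            back-wall echo, where diffracted defect signals are expected
    k     : threshold factor relative to the median envelope amplitude
    """
    envelope = np.abs(hilbert(bscan, axis=0))   # amplitude envelope of each A-scan
    gated = envelope[gate[0]:gate[1], :]        # keep only the region of interest
    noise = np.median(gated)                    # robust estimate of the noise floor
    peaks = gated.max(axis=0)                   # strongest response at each scan position
    return np.where(peaks > k * noise)[0]

# Usage with synthetic data: background noise plus one simulated diffracted arrival.
rng = np.random.default_rng(0)
bscan = 0.05 * rng.standard_normal((1000, 200))
t = np.arange(60)
pulse = np.sin(2 * np.pi * t / 12) * np.hanning(60)
bscan[400:460, 95:105] += np.outer(pulse, np.ones(10))
print(detect_indications(bscan))   # expected: scan positions around 95-104

In practice the gate limits would themselves be derived automatically from the detected lateral wave and back-wall echo, which is the sort of task the autonomous tools described in the paper are intended to perform without continuous operator intervention.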
