Abstract

Deep learning was applied to sorting nanoindentation-induced acoustic emission (AE) events. The AE events were triggered by the onset of plasticity and dislocation phenomena observed in an electropolished W (100) sample during nanoindentation tests. The acoustic signal was recorded by a specialized sensor integrated into the nanoindenter tip, conditioned with analog/digital electronics, and post-processed by advanced signal processing routines that included entropy filtering and the Continuous Wavelet Transform (CWT). Pseudo time-frequency plots were constructed by plotting the CWT coefficients in those two domains to create topography maps, presenting the AE event data in a commonly used graphic format, JPEG. A deep learning technology originally developed for generic image recognition, which operates on 224 × 224 × 3 JPEG images, was then deployed to sort the acoustic events: the GoogLeNet deep neural network was trained on predefined classes and applied to the raw acoustic signal data sets. The proposed deep learning AE event sorting methodology successfully differentiated the W (100) plasticity onset from other types of nanoscale contact acoustic interactions.
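The core preprocessing step described above, converting a one-dimensional AE signal into a 224 × 224 × 3 time-frequency image suitable for GoogLeNet, can be sketched as follows. This is a minimal illustration, not the authors' actual pipeline: it uses a simple Ricker-wavelet CWT implemented in plain NumPy, a synthetic decaying-sinusoid burst standing in for a real AE event, and nearest-neighbour resizing; the function names (`cwt`, `scalogram_image`) and all parameter choices (number of scales, burst frequency) are illustrative assumptions.

```python
import numpy as np

def ricker(points, a):
    # Ricker ("Mexican hat") wavelet of width parameter a,
    # a common choice for a simple real-valued CWT.
    t = np.arange(points) - (points - 1) / 2.0
    A = 2.0 / (np.sqrt(3.0 * a) * np.pi ** 0.25)
    return A * (1.0 - (t / a) ** 2) * np.exp(-0.5 * (t / a) ** 2)

def cwt(signal, scales):
    # Continuous wavelet transform via convolution, one row per scale.
    out = np.empty((len(scales), len(signal)))
    for i, a in enumerate(scales):
        n = min(10 * int(a), len(signal))
        out[i] = np.convolve(signal, ricker(n, a), mode="same")
    return out

def scalogram_image(signal, n_scales=64, size=224):
    # Build a pseudo time-frequency "topography map" from |CWT| coefficients
    # and resize it to the 224 x 224 x 3 shape GoogLeNet expects.
    coeffs = np.abs(cwt(signal, np.arange(1, n_scales + 1)))
    coeffs = 255.0 * (coeffs - coeffs.min()) / (np.ptp(coeffs) + 1e-12)
    rows = np.linspace(0, coeffs.shape[0] - 1, size).astype(int)
    cols = np.linspace(0, coeffs.shape[1] - 1, size).astype(int)
    img = coeffs[np.ix_(rows, cols)].astype(np.uint8)
    return np.stack([img, img, img], axis=-1)  # grayscale replicated to 3 channels

# Synthetic AE-like burst: an exponentially decaying sinusoid in noise.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 2048)
burst = np.exp(-5.0 * t) * np.sin(2.0 * np.pi * 150.0 * t)
sig = burst + 0.1 * rng.standard_normal(t.size)

img = scalogram_image(sig)
print(img.shape)  # (224, 224, 3)
```

In practice the image would be saved as a JPEG and fed to a GoogLeNet whose final layer has been retrained on the predefined AE event classes (transfer learning), rather than classified directly from the array.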
