Artificial intelligence (AI) was applied to the two-dimensional localization of acoustic emission (AE) events. AE events were generated by breaking pencil leads on a granite plate with dimensions of 50 cm × 50 cm × 6 cm. A total of 128 measurements were carried out at each of 81 positions on the top surface of the plate. The signals recorded on 16 channels were processed using the continuous wavelet transform (CWT), and the resulting RGB images served as input data for a convolutional neural network (CNN). The transformed images of the measured AE signals were stacked along an additional depth axis to form a four-dimensional input, allowing the network to capture the measured data more completely. The output was analysed with binary classifiers following the one-versus-all scheme. However, processing the four-dimensional data was challenging given the limited computational capacity and required considerable programming effort. The results of this investigation provide insights into the potential of AI for the localization of AE events.
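The stacking of the per-channel scalogram images into a single four-dimensional network input can be sketched as follows. This is a minimal illustration with NumPy; the image size and the synthetic data are assumptions for demonstration, not values from the study:

```python
import numpy as np

# Assume each of the 16 channels yields one RGB scalogram image from
# the CWT, here of illustrative size 224 x 224 (not from the study).
n_channels, height, width = 16, 224, 224

# Synthetic stand-ins for the CWT-derived RGB images, one per channel.
rng = np.random.default_rng(0)
scalograms = [rng.random((height, width, 3)) for _ in range(n_channels)]

# Stack the per-channel images along a new depth axis, giving one
# four-dimensional sample of shape (height, width, RGB, channel).
sample = np.stack(scalograms, axis=-1)
print(sample.shape)  # (224, 224, 3, 16)
```

Treating the 16 sensor channels as an extra depth axis, rather than concatenating them side by side, lets the network see all channel images of one event jointly.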