Abstract

Correctly determining the onset of fracture is crucial when performing mechanical experiments. While this task is commonly carried out by visual inspection, an image-based machine learning approach is proposed here to classify cracked and un-cracked specimens. It has the potential to objectify and automate crack detection, thereby removing sources of uncertainty and error from the post-processing of experiments. More than 30,000 speckle-pattern images obtained from 77 experiments on three specimen geometries are evaluated, comprising uniaxial tension, notched tension, and axisymmetric V-bending experiments. Statistical texture features are extracted from all images, including both first-order statistics (variance, skewness, kurtosis) and higher-order statistical texture features, i.e. Haralick features. The discriminatory power of the texture information is evaluated based on Fisher's Discriminant Ratio, and feature correlations are identified and quantified. Subsets of image texture features with high discriminatory power are used as inputs to neural network architectures of different complexities, ranging from a simple perceptron to feed-forward and cascade neural networks. It is found that a small subset of the investigated texture features is highly significant for all experiments. Using this feature subset in conjunction with multi-layer, non-linear, low-complexity feed-forward network architectures, classification accuracies on the order of 99% are obtained. At the same time, it is shown that linear classifiers are not sufficient to robustly distinguish the state of the specimens, even when features of high discriminatory power are used.
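
To illustrate the feature-extraction and feature-ranking steps summarized above, the following Python sketch computes the first-order statistics and a few GLCM-based (Haralick-type) texture features for a single grayscale speckle image, and evaluates Fisher's Discriminant Ratio for one feature across the two classes. This is not the authors' implementation; the choice of scikit-image, the GLCM distances, orientations, and property subset, and all function names are illustrative assumptions.

import numpy as np
from scipy.stats import skew, kurtosis
from skimage.feature import graycomatrix, graycoprops  # scikit-image >= 0.19

def texture_features(image_u8):
    """First-order and GLCM-based (Haralick-type) features for one 8-bit grayscale image."""
    pixels = image_u8.ravel().astype(float)
    feats = {
        "variance": pixels.var(),
        "skewness": skew(pixels),
        "kurtosis": kurtosis(pixels),
    }
    # Gray-level co-occurrence matrix at a 1-pixel distance for four orientations.
    glcm = graycomatrix(image_u8, distances=[1],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=256, symmetric=True, normed=True)
    # A few common GLCM properties (a subset of the full Haralick feature set).
    for prop in ("contrast", "homogeneity", "energy", "correlation"):
        feats[prop] = graycoprops(glcm, prop).mean()  # average over orientations
    return feats

def fisher_discriminant_ratio(feature_cracked, feature_uncracked):
    """FDR = (mu_1 - mu_2)^2 / (sigma_1^2 + sigma_2^2) for one scalar feature."""
    mu1, mu2 = np.mean(feature_cracked), np.mean(feature_uncracked)
    return (mu1 - mu2) ** 2 / (np.var(feature_cracked) + np.var(feature_uncracked))

In such a workflow, the features ranked highest by this ratio would serve as inputs to a small feed-forward classifier; the specific GLCM parameters and feature subset used in the study are not given in the abstract.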
