Accurately determining the three-dimensional (3D) shape of granular particles typically requires laborious X-ray microtomography combined with image-processing techniques. As an alternative, this study presents a 3D convolutional neural network (3D-CNN) methodology that generates realistic 3D particle models from a number of distinct two-dimensional (2D) projections. First, 3D synthetic particles with targeted statistical distributions of the relevant geometric parameters, such as sphericity, elongation, aspect ratio, and flatness, were randomly generated using Fourier shape descriptors (FSDs). The database was built from the statistical distributions of the geometric parameters of Avicel® PH 200, obtained through dynamic image analysis. Next, 2D particle projections from multiple viewing angles were extracted for model training. Particle shape was then predicted with a convolutional neural network (CNN) through the following sequential steps: a) the model is trained on the 2D images; b) an encoder extracts features from the 2D images and a decoder generates the 3D voxel model; c) a marching cubes algorithm converts the 3D voxel model into a mesh model; d) the accuracy of the prediction is evaluated by comparing the geometric parameters of the predicted models with those of the FSD-generated particles. The methodology was validated on the database of 3D synthetic particles with targeted statistical distributions of sphericity, elongation, aspect ratio, and flatness; the mean absolute percentage errors (MAPE) of the AI model for these parameters were 2.24%, 7.01%, 10.93%, and 7.67%, respectively. The 3D-CNN methodology has potential for broad application in fields such as rock and mineral analysis, battery materials, pharmaceuticals, and space exploration, offering a practical and affordable solution for particle shape analysis. It can significantly improve the efficiency and accuracy of numerical simulations and analyses, leading to new insights and innovations across a wide range of fields.
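To illustrate the encoder-decoder and meshing steps described above, the following is a minimal sketch, not the authors' implementation: it assumes PyTorch and scikit-image, three grayscale 64×64 projections, a 32³ voxel grid, and illustrative layer sizes; the trained network in the paper will differ.

```python
# Hypothetical sketch of the described pipeline: a shared 2D encoder compresses
# multiple particle projections into a latent vector, a 3D decoder expands it
# into a voxel occupancy grid, and marching cubes converts the voxels to a mesh.
import torch
import torch.nn as nn
from skimage.measure import marching_cubes

class MultiView2DTo3D(nn.Module):
    def __init__(self, n_views=3, latent_dim=256):
        super().__init__()
        # Encoder: 2D CNN applied to each 64x64 projection (assumed input size).
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(64 * 8 * 8, latent_dim),
        )
        # Decoder: 3D transposed convolutions producing an occupancy grid.
        self.fc = nn.Linear(n_views * latent_dim, 128 * 4 * 4 * 4)
        self.decoder = nn.Sequential(
            nn.ConvTranspose3d(128, 64, 4, stride=2, padding=1), nn.ReLU(),  # 4 -> 8
            nn.ConvTranspose3d(64, 32, 4, stride=2, padding=1), nn.ReLU(),   # 8 -> 16
            nn.ConvTranspose3d(32, 1, 4, stride=2, padding=1), nn.Sigmoid(), # 16 -> 32
        )

    def forward(self, views):  # views: (batch, n_views, 1, 64, 64)
        feats = [self.encoder(views[:, i]) for i in range(views.shape[1])]
        z = self.fc(torch.cat(feats, dim=1)).view(-1, 128, 4, 4, 4)
        return self.decoder(z)  # (batch, 1, 32, 32, 32) occupancy probabilities

# Usage: predict a voxel model from three projections and mesh it with marching cubes.
model = MultiView2DTo3D()
projections = torch.rand(1, 3, 1, 64, 64)            # placeholder 2D images
voxels = model(projections)[0, 0].detach().numpy()   # (32, 32, 32) occupancy grid
# A trained occupancy grid would normally be thresholded at 0.5; the mean is used
# here only so the untrained example always yields a valid iso-surface.
verts, faces, normals, values = marching_cubes(voxels, level=float(voxels.mean()))
```

From the resulting mesh (or the voxel grid), the geometric parameters of the predicted particle can be computed and compared with those of the FSD-generated ground truth, for example via the reported mean absolute percentage error.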