Laser-induced breakdown spectroscopy (LIBS) is a rapid method for detecting total iron (TFe) content in iron ores. However, the accuracy of univariate regression analysis in LIBS is limited by factors such as laser energy fluctuations and spectral interference. To address this, multivariate regression combined with feature selection or extraction is needed to reduce redundant information, decrease the correlation between variables, and quantify the TFe content of iron ores accurately. In total, 339 batches of iron ore samples from five countries were obtained at Chinese ports during discharging, and 2034 representative spectra were collected. A convolutional neural network (CNN) model for predicting the TFe content of iron ores was established, and the performance of variable importance random forest (VI-RF), variable importance back-propagation artificial neural network (VI-BP-ANN), and CNN-assisted LIBS in predicting the TFe content of iron ores was compared. The coefficient of determination (R2), root mean square error (RMSE), mean relative error (MRE), and modeling time were selected as evaluation criteria. The results show that variable importance screening significantly enhances quantitative accuracy and reduces modeling time compared with traditional BP-ANN and RF models. Moreover, the CNN model outperformed the manual feature selection methods (VI-BP-ANN and VI-RF), achieving the shortest modeling time, the highest R2, and the lowest RMSE and MRE. The CNN's characteristic weight sharing and local connectivity make it well suited to analyzing high-dimensional LIBS data in multivariate regression analysis. These results demonstrate the effectiveness of machine learning and deep learning approaches in improving the accuracy of LIBS for TFe content prediction in iron ores, and the CNN-assisted LIBS method holds great potential for practical applications in the mining industry.
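The three evaluation metrics named above have standard definitions; a minimal numpy sketch (with hypothetical TFe values in wt%, not the paper's data) shows how they would be computed for any of the compared models:

```python
import numpy as np

def regression_metrics(y_true, y_pred):
    """Compute R2, RMSE, and mean relative error (MRE) for a regression model."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    ss_res = np.sum((y_true - y_pred) ** 2)          # residual sum of squares
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)   # total sum of squares
    r2 = 1.0 - ss_res / ss_tot
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
    mre = np.mean(np.abs(y_true - y_pred) / np.abs(y_true))  # TFe is nonzero
    return r2, rmse, mre

# Illustrative reference vs. predicted TFe values (wt%), not from the study
y_true = [62.1, 58.4, 65.0, 60.2]
y_pred = [61.8, 59.0, 64.5, 60.6]
r2, rmse, mre = regression_metrics(y_true, y_pred)
```

Higher R2 and lower RMSE/MRE indicate better agreement with the reference chemical analysis, which is the basis on which the CNN model is ranked above VI-BP-ANN and VI-RF.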
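The weight-sharing and local-connectivity argument can be made concrete: in a 1-D convolutional layer, one small filter slides across the whole spectrum, so the parameter count is the filter width rather than the spectral dimension. A toy numpy sketch (illustrative only; it is not the paper's architecture):

```python
import numpy as np

rng = np.random.default_rng(0)
spectrum = rng.random(128)             # toy stand-in for one LIBS spectrum
kernel = np.array([0.25, 0.5, 0.25])   # one shared 3-point filter

# "Valid" 1-D convolution: the same 3 weights are applied to every local
# window of the spectrum (weight sharing + local connection), producing a
# feature map with just 3 learnable parameters for 126 outputs.
feature_map = np.convolve(spectrum, kernel[::-1], mode="valid")
```

A fully connected layer over the same input would need 128 weights per output unit, which is why convolutional layers scale better to high-dimensional spectra.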