Core logging is the geological study, recording, and classification of petrophysical attributes of drill hole samples, such as lithology, alteration, or mineralogical assemblage. Geological logging is qualitative and subject to errors because of its visual nature and other factors inherent to logging, such as low drill hole recoveries, difficulties in estimating the volumetric contents of minerals, or differing logging criteria among geologists. To date, various tools for the quality control and validation of geological logging have been developed, based on geological knowledge, statistics, geostatistics, image analysis, neural networks, and data mining. This paper presents an alternative approach based on geostatistical modeling for identifying and reclassifying potentially mislogged samples when quantitative covariates from geochemical analyses or metallurgical tests are available. The principle of this approach is to: (i) define geological domains for each quantitative variable by an adequate grouping of the logged classes; (ii) transform the quantitative variables into normal scores, accounting for the previously defined domains; (iii) model the spatial correlation structure of the normal scores; (iv) perform leave-one-out cross-validation to obtain predictions of the normal variables and the associated variance-covariance matrices of the prediction errors; (v) calculate a measure of consistency for each sample and each possible logged class under a multivariate normal assumption; and (vi) compare these measures of consistency with the actual logged classes to detect suspicious logs. The methodology is demonstrated in a case study of an iron ore deposit, with data on rock type logged by geologists and seven quantitative variables (grades of elements of interest, loss on ignition, and granulometry).
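Steps (v)–(vi) above could be sketched as follows. This is a minimal illustration, not the paper's implementation: under a multivariate normal assumption, the squared Mahalanobis distance between a sample's normal scores (computed as if it belonged to each candidate class) and its leave-one-out prediction follows a chi-square distribution, so the upper-tail probability can serve as a consistency measure. All numbers, class names ("itabirite", "hematite"), and the diagonal error covariance are hypothetical placeholders; in practice the predictions and covariance matrices would come from cokriging cross-validation (step iv).

```python
import numpy as np
from scipy import stats

def consistency_measures(z_obs_by_class, z_pred, cov):
    """For each candidate class, compute the squared Mahalanobis distance
    between the sample's normal scores under that class and its
    leave-one-out prediction, then return the chi-square upper-tail
    probability as a consistency measure (step v)."""
    cov_inv = np.linalg.inv(cov)
    k = len(z_pred)
    out = {}
    for cls, z_obs in z_obs_by_class.items():
        r = z_obs - z_pred
        d2 = float(r @ cov_inv @ r)                 # squared Mahalanobis distance
        out[cls] = 1.0 - stats.chi2.cdf(d2, df=k)   # consistency measure
    return out

# Hypothetical sample with 3 covariates: its normal scores under two
# candidate rock-type domains, plus a cross-validation prediction and
# an assumed (diagonal) error covariance matrix.
z_by_class = {"itabirite": np.array([0.2, -0.1, 0.4]),
              "hematite":  np.array([2.5,  2.1, 2.8])}
z_pred = np.array([0.0, 0.0, 0.3])
cov = np.diag([0.5, 0.6, 0.4])

scores = consistency_measures(z_by_class, z_pred, cov)

# Step (vi): if the logged class is far less consistent than an
# alternative class, flag the sample as potentially mislogged.
logged = "hematite"
best = max(scores, key=scores.get)
suspicious = scores[logged] < 0.05 and best != logged
```

In this toy setup the sample's surroundings agree with the "itabirite" scores, so a "hematite" log would be flagged as suspicious and "itabirite" suggested for reclassification.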