Conventional bench-scale geochemical exploration is frequently plagued by uncertainty due to limited resource definition and grade control drilling, resulting in diminished processing plant efficiency. The Measure-While-Drilling (MWD) system comprises a set of sensors that gather data on the performance of mining blast hole drill rigs. Previous MWD research relied primarily on penetration rate to identify rock type and estimate grade. This investigation applies data fusion and machine learning (ML) algorithms to field-scale MWD data to characterize geochemical orebody quality values, such as iron percentage and phosphorus, sulfur, alumina, and silica content, thereby facilitating efficient mine planning and excavation. Feature importance algorithms identified novel significant MWD variables, such as force on bit and the ratio of bit air pressure to penetration rate, which provide valuable insight into subsurface geochemical properties. The performance of several ML algorithms was compared, with the Random Forest algorithm achieving the highest coefficients of determination (up to 0.96), indicating accurate prediction of field and laboratory results. This work therefore demonstrates that MWD data can provide high-resolution orebody knowledge prior to mining, improving subsurface geochemistry prediction. Such knowledge can optimize the excavation of high-grade material by minimizing dilution from lower-grade or waste rock entering the processing plant.
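The workflow described above, regressing a geochemical grade on MWD variables with a Random Forest and inspecting feature importances, can be sketched as follows. The data here is entirely synthetic, and the feature names, units, and hyperparameters are illustrative assumptions, not the study's actual dataset or configuration.

```python
# Illustrative sketch: Random Forest regression of a geochemical grade
# (e.g. Fe %) from MWD-style features, with feature importances.
# All data is synthetic; names, units, and hyperparameters are
# assumptions for demonstration only.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 500

# Synthetic MWD variables (hypothetical units).
penetration_rate = rng.uniform(10, 60, n)    # m/h
force_on_bit = rng.uniform(50, 300, n)       # kN
bit_air_pressure = rng.uniform(200, 600, n)  # kPa
pressure_rate_ratio = bit_air_pressure / penetration_rate

X = np.column_stack([penetration_rate, force_on_bit,
                     bit_air_pressure, pressure_rate_ratio])

# Synthetic target: Fe % driven mostly by force on bit and the
# pressure-to-penetration-rate ratio, plus measurement noise.
y = (40 + 0.05 * force_on_bit + 0.3 * pressure_rate_ratio
     + rng.normal(0, 1.0, n))

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Coefficient of determination on held-out data.
print("R^2:", round(r2_score(y_test, model.predict(X_test)), 3))

# Rank features by importance, as a feature importance algorithm would.
names = ["penetration_rate", "force_on_bit",
         "bit_air_pressure", "pressure_rate_ratio"]
for name, imp in sorted(zip(names, model.feature_importances_),
                        key=lambda t: -t[1]):
    print(f"{name}: {imp:.3f}")
```

In this sketch the importance ranking recovers the variables that actually drive the synthetic target, which mirrors how the study uses such rankings to flag force on bit and the bit-air-pressure-to-penetration-rate ratio as informative predictors.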