Outcrop analogues play a pivotal role in resolving meter-scale depositional facies heterogeneity of carbonate strata. Two-dimensional outcrops, however, are insufficient to decipher the 3D heterogeneity of carbonate facies. Near-surface geophysical methods, notably ground-penetrating radar (GPR), can extend outcrop observations into the third dimension, behind the outcrop face. However, interpreting geophysical images requires specific geophysical expertise that field geologists, who know the actual rock better than the geophysical data, often lack. We present a novel generative adversarial network (GAN) application that constructs a photorealistic 3D virtual outcrop model behind the outcrop face. The method combines GPR forward modeling with a conditional generative adversarial network (CGAN) and exploits the apparent similarities between the outcrop expressions of lithofacies and their radargram counterparts. We demonstrate the methodology on open-source GPR data acquired from a Late Oxfordian-Early Kimmeridgian Arabian carbonate outcrop. We interpret a 4 km long outcrop photomosaic from a digital outcrop model (DOM) for its lithofacies, populate the DOM with GPR properties, and forward model the synthetic GPR response of these lithofacies. We pair the synthetic GPR sections with the DOM lithofacies and train a CGAN on these pairs; similarly, we pair the DOM lithofacies with outcrop photos and train a second CGAN. We chain the two trained networks and apply them to construct an approximately 2 km long 2D section and an approximately 60 m² 3D volume of a photorealistic artificial outcrop model. This model operates in a visual medium familiar to outcrop geologists, providing a complementary instrument for visualizing and interpreting rock rather than geophysical signals. The virtual outcrop replicates the visual character of outcrop-scale lithofacies features, such as intricate bedding contacts and the outlines of reef geobodies.
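To make the chained workflow concrete, the minimal PyTorch sketch below shows how two independently trained image-to-image generators could be composed at inference time: one mapping a GPR radargram to a lithofacies image, and one mapping lithofacies to a photorealistic outcrop image. This is not the authors' code; the class name `TinyGenerator`, the simple encoder-decoder layout standing in for a pix2pix-style U-Net, and all sizes are illustrative assumptions.

```python
# Hypothetical sketch of chaining two CGAN generators at inference time.
# Architecture and names are assumptions, not the paper's implementation.
import torch
import torch.nn as nn


class TinyGenerator(nn.Module):
    """Toy encoder-decoder generator standing in for a pix2pix-style U-Net."""

    def __init__(self, in_channels: int, out_channels: int):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, 64, kernel_size=4, stride=2, padding=1),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(64, 128, kernel_size=4, stride=2, padding=1),
            nn.BatchNorm2d(128),
            nn.LeakyReLU(0.2, inplace=True),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(128, 64, kernel_size=4, stride=2, padding=1),
            nn.BatchNorm2d(64),
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(64, out_channels, kernel_size=4, stride=2, padding=1),
            nn.Tanh(),  # outputs scaled to [-1, 1], as in pix2pix-style training
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(x))


# Two generators, assumed to be trained separately on the paired data
# described in the abstract, then chained:
# radargram -> lithofacies image -> photorealistic outcrop image.
gpr_to_facies = TinyGenerator(in_channels=1, out_channels=3).eval()   # 1-channel radargram in
facies_to_photo = TinyGenerator(in_channels=3, out_channels=3).eval() # RGB facies map in

radargram = torch.randn(1, 1, 256, 256)  # placeholder for a field or synthetic GPR section
with torch.no_grad():
    facies_image = gpr_to_facies(radargram)
    virtual_outcrop = facies_to_photo(facies_image)
print(virtual_outcrop.shape)  # torch.Size([1, 3, 256, 256])
```

In this reading of the workflow, the first generator is trained on (synthetic GPR, DOM lithofacies) pairs and the second on (DOM lithofacies, outcrop photo) pairs, so field radargrams can be pushed through the chain to produce the photorealistic behind-the-outcrop images described above.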