Abstract

Generative Adversarial Networks (GANs) have shown great potential not only in producing acceptable realizations of geologically complex models but also in successfully reparametrizing them. Training GANs, however, is challenging, and one prominent difficulty is mode collapse. When generating realizations of a spatial property, mode collapse reduces variability relative to the input training dataset, so the realizations become spatially biased at specific locations. To address this issue, we developed a new GAN architecture in which a regularization term is introduced to maintain variability and reduce mode collapse. This is achieved by using a probability map to evaluate the variability and spatial bias of the generated realizations and by modifying the GAN loss function to minimize this bias. We applied the new architecture to a binary channelized permeability distribution and compared the results with those generated by a Deep Convolutional GAN (DCGAN) and a Wasserstein GAN with gradient penalty (WGAN-GP). Our results show that the proposed architecture significantly enhances variability and reduces the spatial bias induced by mode collapse, outperforming both DCGAN and WGAN-GP in generating subsurface property distributions.
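The loss modification described in the abstract can be illustrated with a minimal sketch, assuming PyTorch and a binary (channel / non-channel) facies model: the batch-averaged occurrence of the channel facies at each pixel is compared with a probability map estimated from the training realizations, and the mismatch is added to the generator loss as a weighted penalty. The function names, the mean-squared form of the penalty, and the weight `lam` are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F


def probability_map_penalty(fake_batch, target_prob_map):
    """Spatial-bias penalty (illustrative form, not the paper's exact term).

    fake_batch:      (N, 1, H, W) generator outputs in [0, 1]
    target_prob_map: (1, H, W) pixel-wise channel-facies probability
                     estimated from the training realizations
    """
    # Pixel-wise probability of the channel facies across the generated batch
    batch_prob_map = fake_batch.mean(dim=0)
    # Mean-squared mismatch against the training probability map
    return torch.mean((batch_prob_map - target_prob_map) ** 2)


def generator_loss(d_fake_logits, fake_batch, target_prob_map, lam=10.0):
    # Standard non-saturating adversarial term
    adv = F.binary_cross_entropy_with_logits(
        d_fake_logits, torch.ones_like(d_fake_logits))
    # Regularization term discouraging mode collapse / spatial bias
    reg = probability_map_penalty(fake_batch, target_prob_map)
    return adv + lam * reg
```

In this sketch the penalty only affects the generator update; the discriminator loss is left unchanged, and `lam` would be tuned to balance adversarial realism against variability of the generated realizations.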
