Abstract

X-ray micro-computed tomography (micro-CT) has been widely leveraged to characterise the pore-scale geometry of subsurface porous rocks. Recent developments in super-resolution (SR) methods using deep learning allow the digital enhancement of low-resolution (LR) images over large spatial scales, producing SR images comparable to high-resolution (HR) ground-truth images. This circumvents the common trade-off between resolution and field of view. An outstanding issue is the use of paired LR and HR data, which is often required in the training step of such methods but is difficult to obtain. In this work, we rigorously compare two state-of-the-art SR deep-learning techniques, using both paired and unpaired data, against like-for-like ground-truth data. The first approach requires paired images to train a convolutional neural network (CNN), while the second uses unpaired images to train a generative adversarial network (GAN). The two approaches are compared on a micro-CT carbonate rock sample with complex micro-porous textures. We applied a range of image-based and numerical verifications, together with experimental validation, to quantitatively evaluate the physical accuracy and sensitivities of the two methods. Our quantitative results show that the unpaired GAN approach can reconstruct SR images as accurately as the paired CNN method, with comparable training times and dataset requirements. This unlocks new applications for micro-CT image enhancement using unpaired deep-learning methods: image registration is no longer needed during data processing, and decoupled images from data-storage platforms can be exploited to train networks for SR digital rock applications. This opens up a new pathway for applications involving multi-scale flow simulation in heterogeneous porous media.
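The practical distinction between the two training regimes lies in their objectives. As a rough illustration only (not the paper's actual networks, and with hypothetical toy data), the paired CNN minimises a pixel-wise loss against a co-registered HR image, which is why image registration is required, whereas the unpaired GAN relies on a discriminator's realism score with no pixel correspondence:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins: an SR output patch and a co-registered HR ground-truth
# patch (hypothetical random data, in place of real micro-CT slices).
sr = rng.random((8, 8))
hr_paired = rng.random((8, 8))

# Paired (CNN-style) objective: a pixel-wise loss against the registered
# HR image -- this is the term that demands paired, aligned LR/HR data.
l1_paired = float(np.mean(np.abs(sr - hr_paired)))

def discriminator(img):
    # Hypothetical stand-in for a trained critic: returns a score in (0, 1)
    # estimating how "HR-like" the input looks. No HR counterpart of this
    # particular patch is ever needed.
    return 1.0 / (1.0 + np.exp(-float(img.mean())))

# Unpaired (GAN-style) objective: the generator is penalised only when the
# discriminator judges its output unrealistic.
adv_loss = -np.log(discriminator(sr) + 1e-8)
```

Cycle-consistent unpaired variants additionally add a reconstruction term mapping the SR output back to the LR input, but the defining property above is the same: no aligned HR target enters the loss.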