Abstract

Recent developments in deep learning have shown significant improvement in the accuracy of acoustic impedance inversion results. However, the conventional gradient-based optimizers used in deep learning frameworks, such as root-mean-square propagation (RMSProp), momentum, and adaptive moment estimation (ADAM), inherently tend to converge to the nearest local optimum, compromising the solution by failing to attain the global minimum. We apply a hybrid global optimizer, genetic-evolutionary ADAM (GADAM), to address the issue of convergence at a local optimum in a semisupervised deep sequential convolution network-based learning framework for solving the nonconvex seismic impedance inversion problem. GADAM combines the advantages of the adaptive learning of ADAM and the genetic evolution of the genetic algorithm, which facilitates faster convergence and avoids becoming trapped in local minima. The efficacy of GADAM is tested on synthetic benchmark data and field examples. The results are compared with those obtained from the widely used ADAM optimizer and the conventional least-squares method. In addition, uncertainty analysis is performed to assess the effect of the choice of optimizer on obtaining efficient and accurate seismic impedance values. The results show that the level of uncertainty and the minima of the loss function attained using the GADAM optimizer are lower than those for ADAM. Thus, we demonstrate that the hybrid GADAM optimizer is more efficient than the extensively used ADAM optimizer for impedance estimation from seismic data in a deep learning framework.
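The hybrid scheme described above can be illustrated with a minimal sketch: a population of candidate models, each refined by a few ADAM gradient steps (the adaptive-learning phase), followed by genetic selection, crossover, and mutation (the evolutionary phase) on a nonconvex toy loss standing in for the inversion misfit. This is an assumed, simplified illustration of the GADAM idea, not the authors' actual implementation; the loss, population size, and hyperparameters are all placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy nonconvex (Rastrigin-like) loss standing in for the seismic misfit.
def loss(x):
    return np.sum(x**2) + 2.0 * np.sum(1.0 - np.cos(3.0 * x))

def grad(x):
    return 2.0 * x + 6.0 * np.sin(3.0 * x)

def adam_steps(x, steps=20, lr=0.05, b1=0.9, b2=0.999, eps=1e-8):
    """Local phase: a few standard ADAM updates from x."""
    m = np.zeros_like(x)
    v = np.zeros_like(x)
    for t in range(1, steps + 1):
        g = grad(x)
        m = b1 * m + (1 - b1) * g
        v = b2 * v + (1 - b2) * g**2
        mhat = m / (1 - b1**t)          # bias-corrected first moment
        vhat = v / (1 - b2**t)          # bias-corrected second moment
        x = x - lr * mhat / (np.sqrt(vhat) + eps)
    return x

def gadam(pop_size=10, dim=4, generations=15, sigma=0.3):
    """Global phase: evolve a population whose members are ADAM-refined."""
    pop = rng.uniform(-3.0, 3.0, size=(pop_size, dim))
    for _ in range(generations):
        pop = np.array([adam_steps(x) for x in pop])   # ADAM refinement
        fit = np.array([loss(x) for x in pop])
        elite = pop[np.argsort(fit)[: pop_size // 2]]  # selection
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = elite[rng.integers(len(elite), size=2)]
            mask = rng.random(dim) < 0.5               # uniform crossover
            child = np.where(mask, a, b)
            child = child + sigma * rng.normal(size=dim)  # mutation
            children.append(child)
        pop = np.vstack([elite, np.array(children)])
    return pop[np.argmin([loss(x) for x in pop])]

best = gadam()
print(loss(best))
```

The evolutionary phase lets well-converged parameter values from different population members recombine, so the search can escape basins that would trap a single ADAM run, which is the mechanism the abstract credits for avoiding local minima.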
