Abstract
Three-dimensional prestack depth migration and depth residual picking in common-image gathers (CIGs) are the most time-consuming parts of 3D migration velocity analysis. Most migration-based velocity analysis algorithms need the spatial coordinates of reflection points and CIG depth residuals at different offsets (or angles) to provide updated velocity information. We propose a new algorithm that analyzes 3D velocity quickly and accurately. Spatial coordinates and orientations of reflection points are provided by a 3D prestack parsimonious depth migration; the migration involves only the time samples picked from the salient reflection events on one 3D common-offset volume. Ray tracing from the reflection points to the surface provides a common-reflection-point (CRP) gather for each reflection point. Predicted (nonhyperbolic) moveouts for local velocity perturbations, chosen by maximizing the stacked amplitude, give the estimated velocity update for each CRP gather. The velocity update for each voxel in the velocity model is then obtained by averaging over all predicted velocity updates for that voxel. Prior model constraints may be used to stabilize the velocity updating. Compared with other migration velocity analyses, this approach limits traveltime picking to a single common-offset volume (done only once) and requires no intensive 3D prestack depth migration. Hence, the computation time is orders of magnitude less than that of other migration-based velocity analyses. A 3D synthetic data test shows the algorithm works effectively and efficiently.