Marine broadband seismic results typically have observed amplitude spectra that are inconsistent with the earth reflectivity spectra measured from spatially coincident well data. The most notable inconsistency is a visible bias towards stronger ultra-low frequency amplitudes in the 0–10 Hz range of the seismic data. Whilst useful in principle for more accurate elastic seismic inversion with less dependence upon low-frequency model (LFM) building, and for more stable full waveform inversion (FWI), this low-frequency bias will degrade seismic resolution if it is not balanced properly during processing. Furthermore, the observed seismic amplitude spectra on pre-stack data vary with increasing offset, and the phase can vary strongly in a frequency-dependent manner. These variations are attributed to three first-order processes: (1) progressive attenuation of higher frequencies along longer travel paths, (2) NMO stretch, and (3) offset-dependent tuning; they may easily be compounded by inappropriate parameterization of various processing and imaging steps. A key component of elastic seismic inversion is the extraction and scaling of several angle-dependent and depth-dependent wavelets. These wavelets are ideally independent of the processes above, but inevitably they are not, and the angle ranges used in the inversion determine how significant NMO stretch and offset-dependent tuning become. Even if all of these considerations can be accommodated during processing, the resolution of seismic images remains limited by the resolution of the velocity model, the scattering assumptions of the imaging algorithms used, and the a priori information used to constrain imaging and inversion.

After reviewing the principles of seismic wavefield propagation in the contemporary context of broadband seismic methods, largely pursued with the ambition of removing sea surface-related ghost effects, I discuss the additional uncertainties that broader bandwidth data introduce into seismic signal processing, notably at the low-frequency end. I consider two rather different ‘broadband seismic’ perspectives: (1) the industry must progress towards higher-fidelity, resource-consuming measurements of every source event and every receiver measurement in order to effectively deconvolve the ‘system response’ from the data; or (2) high-end imaging solutions can automatically eliminate the ‘source term’ without requiring detailed source information, provided that wavefield separation has robustly accounted for dynamic variations in receiver geometry. A longer-term consideration of imaging suggests that the classical sequential processing paradigm is in fact dead, that the definition of ‘noise’ is changing, and that advances in hardware are enabling solutions to long-standing challenges with cross-talk artifacts and irregular illumination. The addition of appropriate a priori information to joint migration and inversion allows the historical assumption that the unknown velocity model has a smooth background to be dismissed, and step changes in velocity model resolution may be achievable. I conclude by discussing how higher-resolution velocity models will translate into fewer non-physical processing efforts that can corrupt pre-stack amplitude, pre-stack phase and (elastic) image resolution.
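To give a feel for the first of the three first-order processes above, the sketch below evaluates the standard constant-Q attenuation model A(f) = exp(−π f t / Q) for a few frequencies and travel times. It is not taken from the paper; the Q value and travel times are hypothetical illustration values chosen only to show why longer (far-offset) travel paths lose proportionally more of their high-frequency content.

```python
import numpy as np

# Constant-Q amplitude attenuation exp(-pi * f * t / Q) along a travel path of
# two-way time t: higher frequencies decay faster, and longer paths decay more.
# Q and travel times below are hypothetical illustration values.
def q_attenuation(freq_hz, traveltime_s, q=100.0):
    return np.exp(-np.pi * freq_hz * traveltime_s / q)

freqs = np.array([5.0, 20.0, 40.0, 80.0])
for t in (1.0, 1.5):                          # e.g. near- vs far-offset travel times
    loss_db = 20.0 * np.log10(q_attenuation(freqs, t))
    print(f"t = {t:.1f} s: " +
          ", ".join(f"{f:.0f} Hz -> {d:5.1f} dB" for f, d in zip(freqs, loss_db)))
```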
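The second process, NMO stretch, can be illustrated with an equally minimal sketch, assuming hyperbolic moveout t(x) = sqrt(t0² + x²/v²) and the common first-order approximation that the post-NMO dominant frequency is lowered by the stretch factor (t(x) − t0)/t0. The event time, velocity and wavelet frequency are assumed values, not results from the paper.

```python
import numpy as np

# First-order NMO stretch for hyperbolic moveout t(x) = sqrt(t0^2 + x^2/v^2).
# The stretch factor (t_x - t0)/t0 approximates the relative lowering of the
# wavelet's dominant frequency after NMO correction at each offset.
# Parameters below are hypothetical, chosen only to show the trend with offset.
def nmo_stretch(t0_s, offset_m, vnmo_ms):
    tx = np.sqrt(t0_s**2 + (offset_m / vnmo_ms)**2)
    return (tx - t0_s) / t0_s

t0, vnmo, f0 = 1.0, 2000.0, 40.0              # 1 s event, 2000 m/s, 40 Hz wavelet
for offset in (0.0, 1000.0, 2000.0, 3000.0):
    s = nmo_stretch(t0, offset, vnmo)
    f_eff = f0 / (1.0 + s)                    # approximate post-NMO dominant frequency
    print(f"offset {offset:6.0f} m: stretch {100*s:5.1f} %, "
          f"~dominant frequency {f_eff:5.1f} Hz")
```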
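Finally, as background to the sea surface-related ghost discussion, the following sketch computes the receiver-side ghost amplitude response, assuming a flat sea surface with reflection coefficient −1, vertical incidence and 1500 m/s water velocity. The streamer depths and the function name are illustrative only; the point is that the ghost both creates spectral notches and suppresses the lowest frequencies, which is why deghosting strongly reshapes the low-frequency end of the spectrum.

```python
import numpy as np

# Receiver-side ghost amplitude response |1 + r * exp(-i*2*pi*f*tau)|,
# with tau = 2*d*cos(theta)/c the two-way delay to the free surface.
# Assumed: flat sea surface, r = -1, water velocity c = 1500 m/s.
def ghost_amplitude(freq_hz, depth_m, angle_deg=0.0, c=1500.0, r=-1.0):
    tau = 2.0 * depth_m * np.cos(np.radians(angle_deg)) / c
    return np.abs(1.0 + r * np.exp(-2j * np.pi * freq_hz * tau))

freqs = np.linspace(0.0, 125.0, 1001)
for depth in (8.0, 15.0, 25.0):               # illustrative shallow vs deep streamer depths
    g = ghost_amplitude(freqs, depth)
    notch = 1500.0 / (2.0 * depth)            # first non-zero ghost notch frequency (Hz)
    low = g[freqs <= 10.0].mean()             # average response in the 0-10 Hz band
    print(f"depth {depth:4.1f} m: first notch {notch:6.1f} Hz, "
          f"mean 0-10 Hz response {low:4.2f} (max 2.0)")
```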