1. Technical Field
Examples of the subject matter disclosed herein generally relate to methods and systems for seismic exploration and, in particular, to methods and systems for seismic data acquisition and seismic data processing directed to de-ghosting.
2. Description of the Background
To map subsurface geology during exploration for oil and gas or other minerals and fluids, a form of remote sensing is employed to construct three-dimensional images of the subsurface. The method relevant to this specification is known as seismic surveying, in which an energy source transmits pressure pulses into the earth, which may be reflected by geological interfaces within it and subsequently recorded at the surface by arrays of detectors. Such surveys are conducted on land using geophone detectors, which vibrate and displace a magnet within a coil to induce an electric potential, or in a marine environment using hydrophones, which detect pressure changes due to the reflected wave-field and likewise produce an electric signal that may be recorded. In the marine environment, the source's energy is transmitted to and received from the subsurface via the water layer, the surface of which, at the air-water interface, acts like a mirror by reflecting energy. Because both the energy source and the hydrophone detectors lie within the water layer, reflections of the source energy and arrivals at the receivers are re-reflected from the water surface and contaminate the recorded wave-field.
Thus, there are two main contaminants which cause problems in processing and interpreting the sub-surface data: complete reflections of the subsurface interfaces, caused by energy bouncing between the water bottom and the air-water surface, which are known as multiples; and shorter-period bounces known as ghosts. There are source-side ghosts, caused by the source energy reflected back from the surface, and receiver-side ghosts, incident at the detectors as down-going surface reflections of the up-going energy from the sub-surface. Of the two contaminants, the ghosts are the more damaging to the reflection signal characteristics, because the ghost periodicity is so close to the wavelengths of the reflections themselves.
These ghost reflections do more than complicate the subsurface image. The mirror effect which produces them changes the phase of the reflection by 180 degrees, so that in some circumstances the energy constructively interferes with the desired signal to magnify it, and in others it destructively interferes and nullifies the signal. Water-surface reflections of very long wavelength, low-frequency seismic waves destructively interfere, so there is always a null or notch at 0 Hz. The gradation from destructive to constructive interference manifests itself as if a sloped bandpass filter had been applied to the temporal frequency spectrum of the received traces. The width of this apparent bandpass filter is predictable. For example, in the case of the receiver-side ghost, the second notch occurs when the seismic wavelength is equal to twice the receiver depth, as this is when the up-going and down-going reflections are 180 degrees out of phase. The temporal frequency (F) at which this occurs may be determined if the velocity of propagation (V) of the wave in the fluid medium is known. Higher harmonics of this fundamental frequency naturally occur when multiples of their wavelengths are also equal to twice the receiver depth (Z), so all may be predicted by the relation:

F_n = nV/(2Z), where n = 1, 2, 3, . . .  (1)

The change between the extremes of this constructive and destructive effect occurs smoothly, imparting a sine-squared-taper-shaped filter to the amplitude of the recorded wave-field.
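As a concrete illustration of relation (1) and the sine-squared taper it implies, the following sketch computes the notch frequencies and the ghost amplitude response for a receiver at depth Z. The depth of 10 m, the water velocity of 1500 m/s, and the assumption of a perfect (-1) sea-surface reflection are illustrative values only, not part of this disclosure.

```python
import math

def ghost_notch_frequencies(receiver_depth_m, n_max, velocity_ms=1500.0):
    """Notch frequencies from relation (1): F_n = n*V/(2*Z), n = 1, 2, 3, ..."""
    return [n * velocity_ms / (2.0 * receiver_depth_m) for n in range(1, n_max + 1)]

def ghost_amplitude(freq_hz, receiver_depth_m, velocity_ms=1500.0):
    """Amplitude of the receiver-ghost response, |1 - exp(-2*pi*i*f*(2Z/V))|
    = 2*|sin(2*pi*f*Z/V)|, assuming an ideal -1 sea-surface reflection.
    It is zero at 0 Hz and at every notch F_n, and peaks between notches."""
    return 2.0 * abs(math.sin(2.0 * math.pi * freq_hz * receiver_depth_m / velocity_ms))

# Example: a streamer towed at 10 m depth, V ~ 1500 m/s in water.
print(ghost_notch_frequencies(10.0, 3))           # [75.0, 150.0, 225.0]
print(round(ghost_amplitude(75.0, 10.0), 6))      # 0.0 -> the first non-zero notch
print(round(ghost_amplitude(37.5, 10.0), 6))      # 2.0 -> constructive peak between notches
```

Note that a shallower receiver pushes the first notch to a higher frequency, which is why conventional surveys tow shallow; the trade-off is the sloped low-frequency roll-off toward the ever-present 0 Hz notch.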
These notches and the associated filtering limit the extent to which subsurface reflections or events can be resolved. This damaging process results in blurred images at best and, at worst, fictitious reflections when the ghost energy lags significantly behind the primary reflection energy. This loss of resolution degrades the ability with which geophysicists, geologists and engineers can map the subsurface, possibly obscuring tell-tale details and characteristics which might be clues to the geological environment. To counter the effect of the notches themselves, conventional seismic surveys are designed to use shallow source and receiver arrays to ensure that the second notch lies at high frequencies. However, the resultant sloped bandpass filter is a pervasive problem, as it causes seemingly irretrievable damage to low-frequency information, which is increasingly being sought in the industry to deliver extra value in the interpretation process.
To discern subsurface rock properties and predict content, seismic surveying systems routinely use arrays of detectors arranged at different offset distances from the source, so that a dynamic distortion effect known as move-out is observed and recorded. This distortion is routinely compensated during processing and simultaneously delivers information about the velocity of energy propagation through the subsurface. In a marine environment, these arrays are towed in the water in a line behind a survey vessel, which often, but not always, also tows a deployment of one or more seismic sources. To maximize the surface coverage for any one transit, it is also normal to tow multiple arrays of detectors deployed parallel to each other, perpendicular to the direction of the vessel's progression through the water. Each of the sources fires in turn to deliver energy into the water column, and thence transmitted and reflected back from the rock interfaces in the earth. The alternating sources illuminate distinct grids of mid-points, each instantaneously positioned notionally halfway between a source and a detector. The returning signal is recorded at a high fixed data rate for a time period of, for example, 10 seconds, arranged to be slightly less than the time it takes for the vessel to tow all the source and receiver arrays to the next desired source firing position. Each recorded trace, as it is called, notionally sounds the vertical position below the so-called mid-point of its detector and source. In this manner, a huge quantity of data traces is recorded, all contaminated by ghost energy as described.
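The move-out distortion and its compensation can be sketched with the standard hyperbolic move-out relation, t(x) = sqrt(t0^2 + x^2/v^2), where t0 is the zero-offset two-way time, x the offset, and v the propagation velocity. The constant-velocity model and the numerical values below are illustrative assumptions, not part of this disclosure.

```python
import math

def moveout_time(t0_s, offset_m, velocity_ms):
    """Hyperbolic move-out: arrival time at offset x for a reflector whose
    zero-offset two-way time is t0, t(x) = sqrt(t0^2 + x^2 / v^2),
    under a constant-velocity assumption."""
    return math.sqrt(t0_s ** 2 + (offset_m / velocity_ms) ** 2)

def nmo_correction(t0_s, offset_m, velocity_ms):
    """Time shift removed by normal move-out (NMO) correction so that the
    same reflection aligns across offsets before stacking."""
    return moveout_time(t0_s, offset_m, velocity_ms) - t0_s

# A reflector at t0 = 2 s seen by a receiver 1000 m from the source, v = 2000 m/s:
print(round(moveout_time(2.0, 1000.0, 2000.0), 4))   # 2.0616
print(round(nmo_correction(2.0, 1000.0, 2000.0), 4)) # 0.0616
```

Because the measured delay depends on v, fitting this hyperbola to recorded arrivals is what "simultaneously delivers information about the velocity of energy propagation," as noted above.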
Three data acquisition variations designed to tackle the ghost problem are representative: slanted arrays of detectors; parallel arrays of detectors arranged vertically one above another at the same horizontal position, known as “over-under”; and mixtures of different types of detectors at a coincident position. The first two techniques exploit variations in the recorded ghost effect, which can be processed together to de-ghost the signals. The third exploits the fact that up-going and down-going energy exhibit different polarity, which one type of detector is able to observe, whereas the other type is not. This allows the ghost energy to be removed by careful summation of the two signals, because the ghost is of opposite polarity in one of the recorded datasets. The slanted streamer technique deploys current standard streamer equipment, whereas the other two data acquisition techniques use an increased number of streamers or sets of duplicate detectors, often called ‘dual sensors’. These therefore increase, doubling at maximum, the number of data traces recorded.
Once recorded, the data are routinely processed in a computer. The term de-ghosting describes the computer-based step to ameliorate or remove the ghost effect from the data. Fundamentally, de-ghosting involves either adaptive summation to extinguish the ghost by polarity difference, or adaptive summation of differently ghosted waveforms to recombine the primary signal present in both, in essence to infill the spectral notches.
Existing mechanisms to de-ghost seismic data all fundamentally rely on the recording of alternate views of the same data. Data are deliberately acquired with different ghost characteristics so that when combined there is improved signal spectrum coverage and the damaging notches are filled.
As ghosting occurs on both the source and receiver sides, a receiver-side de-ghosting solution alone does not entirely compensate. Most of the existing techniques described below relate to receiver-side de-ghosting. Source-side solutions tend to rely on redesigned source arrays to minimize their reflected ghost.
Ray et al. (U.S. Pat. No. 4,353,121, issued Oct. 5, 1982, to Fairfield Industries, Inc., "High resolution, marine seismic stratigraphic system") pioneered the use of streamers slanted from shallow to deep over the offset range to obtain a large variation in the ghost characteristics. After adjusting for the datum difference and application of NMO, the variation or diversity of the ghost characteristics essentially fills spectral notches with primary energy once the data are stacked (summed). Additionally, they showed that optimum alignment of the datum-shifted ghost energy, after a suitable polarity change, could be used separately or in conjunction with the primary data to produce de-ghosted stack datasets.
One drawback of this approach is that the pre-stack data are at mixed datums, which can complicate the analysis of key properties, such as velocity, that are fundamental for processing.
R. Soubaras et al. extended this concept by deferring de-ghosting until the last step in data processing: migration, pre- or post-stack. They exploit so-called mirror migration in a similar manner to Ray et al., aligning and imaging with polarity-inverted ghost energy from virtual receivers at height Z above sea level instead of primaries at depth Z below sea level. Optimally focusing the ghost energy in this way forms the down-going wave-field, which is then used to de-convolve the up-going wave-field, the product of a conventional migration process. Deferring the de-ghosting step until after migration has the apparent benefit of focusing both the primary and ghost energy to be more coincident in X, Y, Z than when recorded on common mid-point traces in the field.
The drawback to this approach is that the de-ghosting is deferred to this late stage of the processing, and that the rest of the processing is somewhat complicated by the data being acquired at a mixture of datums. Once again, the derivation of the velocity field is complicated by this deferment, yet it is crucial to the migration which precedes the de-ghosting.
Dual-sensor wave-field separation is a technique which exploits the polarity difference measured by two coincident but different types of sensor: a hydrophone detecting pressure variations and a geophone sensing particle motion, the latter therefore able to discriminate between up-going and down-going energy. After appropriate compensation for their different amplitude responses, the two signals are summed to remove the ghost from the pressure measurement, because the down-going ghost has opposite polarity on the geophone trace. This is often referred to as the PZSUM technique and provides de-ghosting and de-multiple for towed streamer and ocean-bottom acquisition systems.
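The polarity-cancellation idea behind PZ summation can be sketched as follows. The toy traces, the single scalar calibration factor, and the function name are illustrative assumptions standing in for the amplitude compensation described above, not the implementation of any particular system.

```python
def pz_sum(pressure_trace, velocity_trace, scale=1.0):
    """Illustrative PZ summation: the down-going ghost has opposite polarity
    on the (calibrated) particle-motion trace, so summing keeps up-going
    primary energy and cancels the ghost. `scale` stands in for the
    amplitude-response compensation applied before summation."""
    return [p + scale * z for p, z in zip(pressure_trace, velocity_trace)]

# Toy traces: an up-going primary (same polarity on both sensors) followed
# by a down-going ghost (polarity flipped on the geophone).
p = [0.0, 1.0, 0.0, -1.0, 0.0]   # hydrophone: primary, then ghost
z = [0.0, 1.0, 0.0,  1.0, 0.0]   # geophone: ghost arrives with opposite sign
print(pz_sum(p, z))              # [0.0, 2.0, 0.0, 0.0, 0.0] -> ghost removed
```

The primary doubles while the ghost cancels; in practice the calibration between the two sensor types, rather than a single scalar, is the hard part.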
Other methods require a minimum of two traces with different ghost characteristics and seek to combine their energies to de-ghost the data or fill the notches. There are several techniques to effect this combination. One is frequency splicing: predict the notch frequencies and design frequency mute transitions to merge energy from one trace to the next (U.S. Pat. No. 5,148,406 to Brink, et al.), such that the desired frequencies are spliced together. Another is de-phase and sum: for each receiver's trace, design an inverse operator, based on the receiver depth and ghost reflectivity, to undo the effect of its ghost, then sum together the results of this operation on the two traces to infill their respective notches. The de-phase and frequency-weighted summation method (B. J. Posthumus, Geophysical Prospecting 41, p. 267-286, 1993; first presented at the 52nd EAGE meeting, Copenhagen, May-June 1990) went further, compensating for the sum of the amplitudes of the de-phasing filters to deliver de-ghosted data.
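The de-phase and weighted-summation idea can be sketched in the frequency domain: each trace's spectrum is de-phased by the conjugate of its own ghost operator and weighted by that ghost's power, so a notch on one trace is filled by the other, without division blow-up at the notches. The function names, the ideal sea-surface reflectivity of -1, and the water velocity of 1500 m/s are illustrative assumptions; this is a least-squares-style sketch of the general principle, not the exact method of any cited reference.

```python
import cmath

def ghost_operator(freq_hz, depth_m, velocity_ms=1500.0, reflectivity=-1.0):
    """Frequency response of the receiver ghost for a sensor at depth Z:
    G(f) = 1 + r * exp(-2*pi*i*f * 2Z/V), with sea-surface reflectivity r ~ -1.
    G vanishes at the notch frequencies F_n = n*V/(2*Z)."""
    delay_s = 2.0 * depth_m / velocity_ms
    return 1.0 + reflectivity * cmath.exp(-2j * cmath.pi * freq_hz * delay_s)

def weighted_sum(freq_hz, spec_shallow, spec_deep, z_shallow, z_deep):
    """Combine one frequency sample from two receivers at different depths:
    de-phase each spectrum by the conjugate of its own ghost operator, then
    normalize by the summed ghost power. The weight of a trace vanishes
    exactly where its ghost does, so the other trace fills the notch."""
    g1 = ghost_operator(freq_hz, z_shallow)
    g2 = ghost_operator(freq_hz, z_deep)
    return (spec_shallow * g1.conjugate() + spec_deep * g2.conjugate()) / (
        abs(g1) ** 2 + abs(g2) ** 2
    )
```

For example, with receivers at 6 m and 9 m, the shallow trace has a notch at 125 Hz (relation (1)) while the deep trace does not, so the combined output there comes entirely from the deep trace and the primary spectrum is recovered.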
Obtaining the candidate trace pairs is done in a variety of ways. One way is over-under streamer acquisition, where coincident receiver traces sample the wave-field at two different depths and are combined to provide a de-ghosted wave-field. Although elegant, reliably and safely deploying such configurations is fraught with practical difficulties, e.g., ensuring that over-under streamers lie in the same vertical plane, and an increased risk of tangling streamers.
Another way is called “sub-sampled over-under,” where only sparse pairs of over-under streamers are towed to reduce equipment costs and minimize deployment difficulties. Missing data required for input to the frequency-splicing method are reconstructed using interpolation (US Pat. App. Pub. 2010/0074049 A1 to Kragh, et al.). The cross-line separation of the deep streamers is much coarser than that of the shallow streamers, such that interpolation of the sub-sampled deep data becomes band-limited to lower temporal and spatial frequencies. This is reasoned to be acceptable because the deeper data contain the desired low frequencies.
Another way is called “quasi over-under,” where vertically staggered source and/or receiver arrays are deployed in a V or W pattern. DeKok (U.S. Pat. No. 6,493,636, issued 2002) describes a method to reduce the ghost effect for quasi over-under data by subtracting cross-line spatially filtered estimates of the ghost periodicity from the recorded data. However, the method requires gathering a sufficient number of cross-line-ordered traces to exhibit a repetitive pattern, sorting those gathers according to the data's cross-line position, and then filtering using a filter that removes cross-line trace-to-trace variation.
Accordingly, it would be desirable to dispense with the requirements imposed by previous de-ghosting techniques and, instead, to perform de-ghosting in a different way, thereby avoiding the costs and pitfalls associated with these techniques.