Autoscan shift: A new core-rock data integration technique to overcome the shortcomings of conventional regression


Journal of Petroleum Science and Engineering 51 (2006) 275–283. www.elsevier.com/locate/petrol

Mohammad R. Awal ⁎, Mohammed A. Mohiuddin

Center for Petroleum and Minerals, The Research Institute, King Fahd University of Petroleum and Minerals, P.O. Box 182, Dhahran, Saudi Arabia

Received 19 February 2005; received in revised form 13 October 2005; accepted 14 January 2006

⁎ Corresponding author. E-mail address: [email protected] (M.R. Awal).

doi:10.1016/j.petrol.2006.01.006

Abstract

For integrating rock mechanical data from cores and well logs, a new, two-stage technique is presented. The first stage consists of a lesser-known criterion for filtering spurious data (outliers). It was developed and applied in astrophysics by Chauvenet [Chauvenet, W., 1863. Theory and Use of Astronomical Instruments: Method of Least Squares, pp. 558–566] but has not been applied in the E&P industry. The second stage is a new calibration method devised by us. The new calibration method has a unique characteristic: it preserves the shape of the depth vs. parameter profile of well logs. The conventional calibration technique uses the well-known Least Squares-based regression technique, which usually results in some distortion of the shape of the log-based data profile. The distortion is more serious whenever the coefficient of correlation is low, and it can entirely mask the true variability of the measured parameter as obtained from the well log. In addition, the presence of any spurious data in the core data itself can render the calibration process meaningless. Calibrating continuous log-based parameters (e.g., rock mechanical elastic moduli) with the help of a limited number of core-based data points is a routine job in designing the mud-weight window for a stable borehole, ensuring sand-free hydrocarbon production, and enhancing productivity by hydraulic fracturing. The proposed technique will help make more accurate designs of these important exploration and production operations. It can be commercialized as an independent package or by embedding it in an existing petroleum engineering software package. A preliminary version has already been delivered to Saudi Aramco to help design the hydraulic fracturing currently being performed in the Hawiya/Haradh Gas Initiative.

© 2006 Elsevier B.V. All rights reserved.

Keywords: Calibration; Regression; Rock mechanics; In situ stress

1. Introduction

Traditionally, measurements of rock properties based on cores have been considered more reliable than their log counterparts. However, the high cost of coring has often increased reliance on logging. With more demand for cores to determine rock mechanical properties in recent years, and also the availability of full-wave acoustic logs, the reliance on log-based properties has increased. Today, rock mechanical (RM) data, such as rock elastic moduli (e.g., Poisson's ratio, Young's modulus), are needed for the design and implementation of many drilling and production engineering operations. Errors in estimating these elastic parameters can significantly affect the accuracy of well designs, including optimized borehole trajectory, sanding prediction, and hydraulic fracturing for well stimulation. Therefore, the need for correctly integrating core–log data for RM properties estimation can hardly be overemphasized.

In the petroleum-related rock mechanics literature, the most common method found is the linear regression technique based on the method of Least Squares (Ahmed et al., 1991). However, the Least Squares process alters the quality of the log-based data, not only in magnitude but also in trend. It is the second aspect of data alteration that is of particular concern: it may smear the true variability of rock mechanical properties as a function of depth. It was also found, from actual well data, that the smearing of the data trend may be as severe as producing a flat response where the actual rock mechanical property shows significant variability with depth. Maintaining the true rock property trend with depth, for example the uniaxial compressive strength (UCS), can be crucial in designing a successful hydraulic fracture job. Since it is highly desirable to contain the fracture height within the payzone, accurate UCS values in the over- and underburden rock strata must be known relative to those in the payzone.

For these reasons, two new methods for core–log data calibration have been developed. These methods can be applied independently or in tandem (in a sequence of two stages). The first method filters the core-based parameter data set, while the second uses the core-based data to calibrate the log-based parameter data set. Applying both minimizes the error in the resultant, calibrated data set. It is demonstrated that the new approach to calibration may result in up to an order of magnitude of difference. The second-stage technique is described first, highlighting that it can be applied without applying the first-stage technique a priori (e.g., when there are no outliers in the raw data set).

2. The Autoscan method

In order to remove this defect of the data processing involved in calibration, a simple yet high-fidelity technique, called Autoscan, has been devised. The Autoscan method resulted from working extensively with a computer algorithm developed by the authors that sought the optimal translational shift (freeze-and-shift) of the log data profile vis-à-vis the core data points in the depth interval of interest (Awal and Mohiuddin, 2002). The algorithm showed that there exists a unique shift at which the sum of the squares of errors between the core-based measurements and the corresponding log-based measurements (∑err) is minimum. The computer program generated from the algorithm was run thousands of times with synthetically generated core and log data. Fig. 1A shows a set of such error plots for the two most important rock mechanical elastic moduli: Poisson's ratio (PR) and Young's modulus (YM). The Autoscan search algorithm plots the ∑err values generated as the frozen log-based data profile is shifted from near the lowest core-based data value to the highest value. The concave-upward shape of the plotted ∑err function in each data set confirms the uniqueness of the optimal shift. When the log-based data profile is finally shifted to this optimal position, not only is the true variability of rock properties as a function of depth preserved, but the ∑err is also lower than that obtained with the traditional Least Squares method. To demonstrate the uniqueness of the solution by the Autoscan method, the numerical experiment was repeated with synthetic data many times. In Fig. 1B, the results of 39 cases and 32,767 cases are shown. The concave-upward shape of the error plots for each data set is clearly discernible. Two actual case studies are shown to illustrate these points.

2.1. Case study #1

Fig. 2 shows the (a) YM and (b) PR for a sandstone reservoir. The actual data are (a) E-log (dark blue solid line) and E-core (yellow triangles); and (b) PR-log (dark blue solid line) and PR-core (yellow triangles). The first step is to ensure that the depth shift in the field data (i.e., between the log and the core data) is resolved. This was done by comparing the spectral gamma-ray (spectral GR) on cores against the GR log at selected intervals (from where the cores were taken). In both cases, the Autoscan method gives superior results. For PR, the regression method gives a totally unacceptable, flat result. For YM, notice the excursion by the regression method with respect to the uncalibrated log profile. For comparison, the result from another recent method, called FORMEL (Raaen et al., 1996), is also plotted.

2.2. Case study #2

This case study shows the implications of the calibrated data when they are used to compute other quantities, e.g., the minimum in situ stress (σh). The minimum in situ stress, σh, is a critical parameter in the design of hydraulic fracturing.


Fig. 1. (A) Optimal shift of the original log-based data profile, with 100% preservation of data variation with depth, obtained by the Autoscan method. The red line shows where the minimum sum of squares error occurs. Case (a): Poisson's ratio profile. Case (b): Young's modulus of elasticity. (B) Demonstration that the Autoscan method shifts the log data set to a position that uniquely corresponds to the minimum error between the core and log data sets. In the left panel, error plots from 39 pairs of Poisson's ratio data sets show a unique minimum. In the right panel, error plots for 32,767 pairs of data sets show the same uniqueness. All data were synthetic. (For interpretation of the references to colour in this figure legend, the reader is referred to the web version of this article.)
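For readers who wish to reproduce the behavior shown in Fig. 1, the following minimal Python sketch (our illustration, not the authors' original program; all names and values are hypothetical) freezes the shape of a synthetic log profile, scans a range of uniform shifts, and records ∑err against core points at matched depths. The resulting error curve is concave upward with a single minimum, as in the figure.

```python
import numpy as np

# Synthetic "log" profile (shape to be frozen) and sparse "core" points.
# All names and values here are illustrative, not from the paper.
rng = np.random.default_rng(0)
depth = np.arange(0.0, 100.0, 0.5)                 # log resolution: 0.5 ft
log_vals = 0.25 + 0.05 * np.sin(depth / 7.0)       # e.g., a Poisson's ratio log
core_idx = rng.choice(len(depth), size=12, replace=False)
core_vals = log_vals[core_idx] + 0.04 + rng.normal(0.0, 0.01, 12)  # biased + noisy

# Freeze-and-shift scan: slide the whole log profile by d, record the sum of
# squared errors against the core points at the matched depths.
shifts = np.linspace(-0.1, 0.2, 601)
sum_err = [np.sum((core_vals - (log_vals[core_idx] + d)) ** 2) for d in shifts]

best = shifts[int(np.argmin(sum_err))]
print(f"optimal shift from scan: {best:.4f}")      # concave-up curve, one minimum
print(f"mean core-log offset:    {np.mean(core_vals - log_vals[core_idx]):.4f}")
```

The scan's minimizer coincides with the mean core–log offset, which Section 3 derives in closed form (Eq. (2b)).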

In Fig. 3a, the selected rock mechanical parameter is the PR, which is required to compute the minimum in situ stress (σh) using the following equation:

σh = [ν/(1 − ν)](σz − βpo) + βpo    (1)

where ν is Poisson's ratio (PR); σz is the overburden stress, obtained by integrating the density log; β is Biot's poro-elastic constant (0–1, assumed 1 here); and po is the pore pressure.
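As a worked illustration of Eq. (1), with hypothetical values and consistent stress units (and β = 1, as assumed above):

```python
def sigma_h(nu, sigma_z, po, beta=1.0):
    """Minimum in situ stress, Eq. (1): sigma_h = nu/(1-nu)*(sigma_z - beta*po) + beta*po."""
    return nu / (1.0 - nu) * (sigma_z - beta * po) + beta * po

# Hypothetical numbers: nu = 0.25, overburden 9000 psi, pore pressure 4000 psi.
print(sigma_h(0.25, 9000.0, 4000.0))   # (0.25/0.75)*5000 + 4000 = 5666.7 psi
```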

The unconfined compressive strength (UCS) data from a sandstone formation were obtained.


Fig. 2. Comparison of the Autoscan (red) and conventional regression (green) methods for core–log data calibration, using actual field data. For YM, notice the excursion by the regression method with respect to the uncalibrated log profile. For PR, notice the flat response of the regression method. In both cases, the Autoscan method gives the better result. (For interpretation of the references to colour in this figure legend, the reader is referred to the web version of this article.)

The UCS plots are shown for all four cases: core, log, Least Squares-calibrated, and Autoscan-calibrated. The Least Squares method is found to give a flat UCS profile, which is unrealistic for the highly variable strength profile of the formation. The flat response of the conventional regression method stems from the poor correlation between the core and the log data points.


Fig. 3. The Autoscan method (red in both (a) and (b)) shows true fidelity with respect to the shape of the log profile (blue in (a), green in (b)). The conventional Least Squares method (green in (a), blue in (b)) shows a flat response, a severe distortion of the original log profile. (For interpretation of the references to colour in this figure legend, the reader is referred to the web version of this article.)


In other words, when the correlation between the two data sets is good, regression also gives a good result, preserving the log-based data variability. The Autoscan method, on the other hand, always preserves the data variability with depth as seen by the log. One may question the wisdom of proceeding with calibration (by regression or by the Autoscan method) when the core and log data points correlate poorly. In such cases the engineer usually has no alternative: repeating either the core measurements or the log is rarely practical. The Autoscan method then provides an avenue for salvaging the data.

3. Mathematical basis of the Autoscan method

Having established the case for the Autoscan method, an attempt is made here to establish its mathematical basis. Another objective was to develop the Autoscan method as a direct, fast algorithm. The mathematical equivalent of the computerized experiment was derived, as shown by Eqs. (2a) and (2b):

Z′(j) = Z(j) + d    (2a)

where

d = (1/N)∑[Y(i) − Z(j)]    (2b)

Z′(j) represents the calibrated (desired) values, obtained by calibrating the log-based values Z(j) with the core-based values of the variable, Y(i); the sum runs over the N matched core–log pairs. The derivation is presented in Appendix A. To facilitate a comparison, the Least Squares equations are also presented:

Z′(j) = mZ(j) + c    (3a)

where m and c are the regression constants, given by:

m = [N∑XY − ∑X∑Y] / [N∑X² − (∑X)²]    (3b)

c = (∑Y − m∑X)/N    (3c)

with X = Z(j) and Y = Y(i) summed over the N matched pairs (see Appendix A).

4. Comparison of the two methods

1. The Autoscan method has significant qualitative advantages over the conventional method. The shape of the raw log-based variable (PR or YM) is preserved in the Autoscan method, while the conventional regression method distorts the shape. In other words, the new method honors the variation of the parameter values with depth with 100% fidelity.
2. The new method is computationally more efficient, because the only equation to be computed involves just two arithmetic operations.
3. With actual data, both the Autoscan method and the conventional regression method give good quantitative results if the correlation between the core and log data sets is good.
4. The accuracy of both methods is affected by outliers, i.e., by core data points that are spurious or that pertain to a different rock type (lithology) in the depth interval under investigation. A numerical sketch illustrating points 1 and 3 is given below.
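The following sketch, using synthetic data of our own construction rather than the paper's, illustrates points 1 and 3: the Autoscan shift (Eqs. (2a)–(2b)) leaves the depth variability of the log untouched, while the regression slope m of Eqs. (3a)–(3c) rescales it, collapsing toward a flat profile when the core–log correlation is poor.

```python
import numpy as np

rng = np.random.default_rng(1)
Z = 0.25 + 0.05 * np.sin(np.linspace(0.0, 20.0, 200))   # log values Z(j)
idx = rng.choice(200, size=15, replace=False)            # depths with core data
# Adverse case: core values scatter about a level, nearly uncorrelated with the log.
Y = 0.29 + rng.normal(0.0, 0.02, 15)                     # core values Y(i)

# Autoscan (Eqs. 2a-2b): uniform shift by the mean core-log offset.
d = np.mean(Y - Z[idx])
Z_autoscan = Z + d

# Conventional Least Squares (Eqs. 3a-3c) over the N matched pairs.
X, N = Z[idx], len(Y)
m = (N * np.sum(X * Y) - X.sum() * Y.sum()) / (N * np.sum(X ** 2) - X.sum() ** 2)
c = (Y.sum() - m * X.sum()) / N
Z_ls = m * Z + c

print(f"std of raw log profile: {Z.std():.4f}")
print(f"std after Autoscan:     {Z_autoscan.std():.4f}  (identical: shape preserved)")
print(f"std after regression:   {Z_ls.std():.4f}  (m = {m:.3f}, typically near zero here)")
```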

5. Filtering the core data

In order to obtain accurate calibration, the importance of having accurate core data can hardly be overemphasized. Errors in core data may enter through measurement errors in the laboratory and/or a failure to isolate zonal data. The problem is complicated by the fact that the core data may span a large range over a rather short depth interval, owing to a multitude of depositional, lithological, and structural factors. Nevertheless, the extreme values in the reported core data, called outliers, must be detected and accounted for before the data are used to calibrate the log-based data.

To the authors' knowledge, there is no method for detecting outliers in core data in the oil and gas industry literature. To address the problem, an ad hoc method was first devised, based on the deviation of a particular data point from the sample mean. Allowing deviations of one to three standard deviations, it was found that the calibration could be improved, by both the Least Squares and the Autoscan methods. Encouraged by these initial results, a multi-disciplinary literature survey was conducted to find a formal treatment of outliers, and Chauvenet's criterion was found to fall along our line of analysis. Chauvenet (1863), a 19th-century astronomer, developed the following criterion for omitting a bad data point: a reading may be rejected if the probability of obtaining the particular deviation from the mean is less than 1/(2N), where N is the size of the sample. According to this criterion, the ratio of the maximum acceptable deviation (dmax) to the standard deviation (σ) for various values of N can be tabulated as shown in Table II-1 (Appendix B).

Table II-1. Chauvenet's criterion for rejecting a data point

  Number of data points, N    Max. acceptable deviation ratio, dN = dmax/σ
  3                           1.38
  4                           1.54
  5                           1.65
  6                           1.73
  7                           1.80
  10                          1.96
  15                          2.13
  25                          2.33
  50                          2.57
  100                         2.81
  300                         3.14
  500                         3.29
  1000                        3.48

Before applying Chauvenet's criterion, one must ensure that there is no depth shift between the core and log data sets. Once the outliers are detected, the Autoscan calibration is performed. As stated before, the outliers are not 'bad data' as such, but are isolated as irrelevant. This irrelevancy can be due to a change in lithology or other intrinsic differences in the rock at the corresponding depth. Thus, application of Chauvenet's criterion affords a means of extracting the relevant core data before calibration. If the number of outliers is significant, and/or they pertain to a continuous depth interval, then those data (from both core and log) can be isolated and calibrated separately.
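Chauvenet's criterion is straightforward to implement. The sketch below is our own minimal rendering, assuming the standard normal model that underlies Table II-1: the threshold dN is the deviation whose two-sided tail probability equals 1/(2N), and a single filtering pass drops any point whose normalized deviation reaches dN. For N = 10 it reproduces the tabulated value of 1.96; the core sample shown is hypothetical.

```python
from statistics import NormalDist, mean, stdev

def chauvenet_threshold(n: int) -> float:
    """d_N such that the two-sided normal tail probability equals 1/(2n)."""
    return NormalDist().inv_cdf(1.0 - 1.0 / (4.0 * n))

def chauvenet_filter(data):
    """One pass of Chauvenet's criterion: drop points with |x - mean|/sigma >= d_N."""
    mu, sigma = mean(data), stdev(data)
    d_n = chauvenet_threshold(len(data))
    return [x for x in data if abs(x - mu) / sigma < d_n]

# Reproduces Table II-1 (e.g., N=10 -> 1.96, N=100 -> 2.81, N=1000 -> 3.48):
for n in (3, 10, 100, 1000):
    print(n, round(chauvenet_threshold(n), 2))

# Hypothetical core UCS sample (psi) with one suspect reading:
core = [8200, 8350, 8100, 8400, 8250, 8300, 8150, 8280, 8220, 15500]
print(chauvenet_filter(core))   # the 15,500 psi outlier is rejected
```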

Fig. 4. Effect of filtering on the minimum in situ stress profile (σh).


The effect of filtering the core data set is shown in Fig. 4a and b, for the Autoscan and the Least Squares methods of calibration, respectively. The data used are the minimum in situ horizontal stress (σh), which is required in many petroleum engineering applications, such as hydraulic fracturing, directional/horizontal drilling, designing the optimum mud-weight window, etc. From the figures, the maximum differences between the filtered and unfiltered calibrated σh profiles were found to be as follows:

  Least Squares method: 2244 psi
  Autoscan method: 808 psi

This clearly demonstrates the superiority of the proposed Autoscan method for core–log data calibration. It also underscores the importance of filtering the input core data set. One further distinction between the two methods can be observed: the Least Squares (filtered − unfiltered) difference profile is jerky, while that of the Autoscan method is relatively smooth. This indicates that the Least Squares method introduces greater artifacts into the calibration process.

6. Conclusions

This paper contributes to the oil and gas industry by presenting two new, serial techniques for obtaining a reliable data profile from core- and log-based measurements. The first technique applies statistical data processing to identify outliers (questionable data points) in the reference data set (the core data), and the second is a high-fidelity core–log data calibration method. Each of the two techniques may improve the calibrated data profile by an order of magnitude. The following specific conclusions are drawn from this study:

1. The proposed Autoscan calibration technique has demonstrated superiority over the conventional Least Squares method, both in terms of a smaller sum of squares error and in the preservation of the variation of the data with depth.
2. A statistical outlier identification technique has been imported for a quantum improvement in the calibrated data profile.
3. The application of these two techniques in tandem has been demonstrated for the accurate determination of rock mechanical data versus depth, which is important in well operations such as thief zone identification, water shut-offs, hydraulic fracturing, and optimum mud-weight window design for avoiding costly drilling failures.

Acknowledgement

The authors are grateful to the Research Institute of King Fahd University of Petroleum and Minerals, Dhahran, Saudi Arabia, for permission to publish the paper.

Appendix A. Derivation of the Autoscan method

The derivation parallels that of the conventional regression method, which is as follows:

Z′(j) = mZ(j) + c    (I-1)

where Z′(j) are the calibrated (desired) values; Z(j) are the original log values, j = 1, 2, …, M; and m and c are the regression constants.

The core-based values are represented by Y(i), i = 1, 2, …, N (N < M). First, match the N (Y, Z) pairs of core–log data, each pair corresponding to the same depth (approximately, since the log data has a resolution of only 0.5 ft). Then the parameters m and c are given by:

m = [N∑XY − ∑X∑Y] / [N∑X² − (∑X)²]    (I-1a)

c = (∑Y − m∑X)/N    (I-1b)

where

∑X = ∑Z(j)    (I-1c)
∑X² = ∑[Z(j)]²    (I-1d)
∑Y = ∑Y(i)    (I-1e)
∑XY = ∑[Y(i)Z(j)]    (I-1f)

The summations (∑) are carried over the N pairs of data.

A.1. The Autoscan method

Here, we seek to minimize the sum of the squares of the differences between each pair of core and log data. To achieve this, shift the log data, Z(j), by an amount δ. The shift magnitude, δ, is the same for all log data points, ensuring that the shape of the original log data is frozen. Let X(j) represent the shifted log data. Thus,

X(j) = Z(j) − δ    (I-2)

It follows that ∑err is given by:

∑err = ∑[Y(i) − X(j)]²    (I-3)
     = ∑[Y(i) − (Z(j) − δ)]²
     = ∑[P(j) + δ]²,  where P(j) = Y(j) − Z(j)
     = ∑[P²(j) + 2P(j)δ + δ²]
     = ∑P²(j) + 2δ∑P(j) + Nδ²    (I-4)

To minimize ∑err, differentiate Eq. (I-4) with respect to δ and equate to zero:

∂(∑err)/∂δ = 2∑P(j) + 2Nδ = 0  ⇒  δ = −∑P(j)/N = (1/N)∑[Z(j) − Y(i)]    (I-5)

Eq. (I-5) gives the amount by which the original log data must be shifted to obtain the minimum ∑err; the calibrated profile is then Z′(j) = Z(j) − δ = Z(j) + (1/N)∑[Y(i) − Z(j)], in agreement with Eqs. (2a) and (2b). Interestingly, the same result can be obtained by slightly modifying the Least Squares regression method: the Autoscan method is equivalent to LS regression when m = 1, i.e., when the regression equation has the form

Z′(j) = Z(j) + c    (I-6)

Following the same steps as above, one obtains the following expression for the parameter c:

c = (1/N)∑[Y(i) − Z(j)]    (I-7)
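A quick numerical check of this derivation (with made-up numbers) confirms that ∑err of Eq. (I-4) is minimized exactly at the shift given by Eq. (I-5):

```python
import numpy as np

rng = np.random.default_rng(2)
Z = rng.normal(0.25, 0.03, 10)           # matched log values Z(j)
Y = Z + 0.05 + rng.normal(0.0, 0.01, 10) # matched core values Y(i)

P = Y - Z                                # P(j) = Y(j) - Z(j)
delta_star = -np.mean(P)                 # Eq. (I-5): delta = -sum(P)/N

# Brute-force check of Eq. (I-4): sum_err(delta) = sum((P + delta)^2)
deltas = np.linspace(delta_star - 0.1, delta_star + 0.1, 2001)
errs = [np.sum((P + d) ** 2) for d in deltas]
print(np.isclose(deltas[int(np.argmin(errs))], delta_star, atol=1e-4))  # True
```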

Appendix B. Chauvenet's criterion for data filtering

In a series of counts, one or two values may differ from the mean by so much that doubt is cast on the validity of those particular values. For example, in counting radioisotopes this situation occasionally arises when electrical interference is picked up by the amplifiers used in electronic counting apparatus, particularly in laboratories where radio-frequency or microwave power is in use. Various criteria have been suggested for deciding whether a suspect data point should be discarded. One of these is Chauvenet's criterion, which calls for the rejection of a data value Xi if

|Xi − X̄|/σ ≥ dN    (II-1)

where dN has a value that depends on the total number of observations, N, having a standard deviation σ. Some of the values of dN are given in Table II-1. Once a suspect data value has been rejected, the mean and variance of the remaining data may be recalculated.

References

Ahmed, et al., 1991. Enhanced In-Situ Stress Profiling Using Microfrac, Core, and Sonic Logging Data. SPE Paper No. 19004.
Awal, M.R., Mohiuddin, M.A., 2002. Introducing an Enhanced Calibration Technique for Core–Log Rock Mechanical Parameters: New Shift Calibration and Recursive Filtering Techniques. SPE 78593, poster presented at ADIPEC, Abu Dhabi.
Chauvenet, W., 1863. Theory and Use of Astronomical Instruments: Method of Least Squares, pp. 558–566.
Raaen, A.M., Hovem, K.A., Joranson, H., Fjaer, E., 1996. FORMEL: A step forward in strength logging. SPE Paper No. 36533, presented at the 1996 SPE Annual Technical Conference and Exhibition, Denver, Colorado, U.S.A.