Effective integration of core and log data

Paul F. Worthington
Gaffney, Cline & Associates, Bentley Hall, Blacknest, Alton, Hants GU34 4PU, UK

Marine and Petroleum Geology 1994, Volume 11, Number 4
Received 15 February 1993; revised 6 August 1993; accepted 19 August 1993

The primary purpose of core to log data integration is to reduce the uncertainty associated with formation evaluation. In so doing, advantage is taken of both the higher precision of core data and the larger scale of investigation of log data. It is particularly important that when tying logs back to core, the calibration algorithm is as well defined as possible. A basic requirement is the definition of a common reference depth scale. A second requirement is to reconcile the different vertical resolutions by the depth averaging of core data, the signal enhancement of log data, or both. Essential to this process is the adoption of 'key intervals' as control zones for data integration. These procedures can result in a reduced uncertainty, which transmits through to reservoir appraisal through a better petrophysical definition of constituent lithological units. Important new developments will facilitate the integration of core and log data where core recovery is incomplete. These include the ultrasonic monitoring of core entry into the core barrel, the multi-sensor scanning of whole cores, and the interactive display of both core and log data in interrogative workstation mode. Implementation of the optimum data integration strategy and of the new upcoming developments will require cross-discipline collaboration between those responsible for core acquisition, core analysis and wireline log interpretation. A key factor will be the ability to develop interpretative algorithms for cross-scale application.

Keywords: core logging; well logging; data integration

Effective practice is a prerequisite for achievement in the oil industry of today. The word effective can be defined as 'fit for work or service' (Oxford English Dictionary, 1933). Roget's Thesaurus (Dutch, 1962) offers the synonyms profitable and successful. In modern business parlance, the concept of being effective is expanded to doing the right things, developing new and innovative options, optimizing the use of resources and increasing profits. From an oil industry standpoint, these practices translate to the recognition of commercial opportunities and the quantification, reduction and management of risks, uncertainty and costs. More specifically, in the context of reservoir evaluation, greater effectiveness is achievable through the conjunctive use, or integration, of the relevant sources of geoscientific data to develop a descriptive reservoir model in a way that respects the attributes of the input data and yet discourages data redundancy. A fundamental aspect of integrating data that contribute to the reservoir model is the reconciliation of data measured at different scales. This is necessary because smaller scale, higher spatial resolution data constitute the building blocks for larger scale units that are progressively increased to the seismic and flow unit scales. A key component of this reservoir description process is the integration of core and log data, which is seen here in a broader context than the core calibration of log data, as reviewed by Rafdal (1991). The primary purpose of core-log integration is to control log evaluation using data measured under laboratory conditions. This 'tying back to core' usually involves plotting measured core values of a given

parameter against log-derived values of the same parameter, after some form of depth merging. The resulting data distribution is then used to generate a regression algorithm that can be applied to log data. It is reasoned that such an algorithm will compensate for departures in the well logs due to poor tool design, erroneous shop and field calibration, engineering error and imperfect environmental corrections (Cooke-Yarborough, 1987). Thus we seek to take advantage of both the higher precision of core data and the larger scale of investigation of log data. This approach presupposes that the core data are representative of the reservoir, in terms of both sampling frequency and sample integrity. This supposition is most appropriate when the formation is clean, consolidated, homogeneous and vertically and laterally extensive with well defined boundaries, and when the core data are free of environmental, instrumental and operational errors. These conditions are rarely satisfied. Even if the data can be measured accurately and precisely, there have often been doubts about the representativeness of core data, especially where the lithology is markedly variable (Marchant and White, 1968). A key question concerns the disparity in sample size between core and log data (Enderlin et al., 1991). If the effects of this disparity can be reduced, an improved data correlation might be achievable. This paper examines the mechanism of core-log integration and identifies ways in which uncertainty might be reduced. Throughout, the context is one of using the smaller scale core data to control the interpretation of the larger scale log information, and to allow the latter to be scaled up to the seismic and flow-unit scales. Thus the logs become, for all practical purposes, continuous extensions of the core data.


This drive to gross up and average geoscientific information is contrary to the more established philosophy of seeking the sharpest attainable spatial resolution. The aim is to develop a strategy for an improved data reconciliation for better defined reservoir description. The paper therefore focuses on specific issues of reservoir characterization at the mesoscopic (bedding) scale as discussed more generally by Worthington (1991).

The structure is first to identify prerequisites for effective core-log integration, then to list elements of an idealized data integration strategy, to discuss briefly how this strategy might be impeded by a poor understanding of the physics of scaling, to describe some contemporary developments that would facilitate implementation of the strategy, and finally to place the technical strategy for effective core-log integration into the broader perspective of reservoir description.

Prerequisites for core-log integration

The following prerequisites are usually presumed to be fully satisfied, even though they rarely are. They are discussed here to emphasize that complete core-log data reconciliation might not be achievable in all instances, even with the introduction of innovative data handling procedures.

Matched depths
Core data are identified by drilled depths, logs by wireline depths. The primary vehicles for depth merging are natural gamma ray measurements. Core plug locations should be identified on core gamma records. The various logging runs should be depth-matched to a common wireline depth scale by comparing the natural gamma logs from the relevant tool combinations (a simple cross-correlation sketch of this step is given at the end of these prerequisites). Thereafter, core and log data should be merged to a common reference depth scale. The determination of a reference depth scale is one of the most difficult tasks facing the data integration specialist. As the log data are continuous, and can benefit from excellent depth control under favourable conditions (Theys, 1991), a possible approach might be to use a wireline depth scale that has been reconciled with the drillers' depth. This merging procedure might not be feasible where core recovery is incomplete and the natural gamma data are featureless. However, the latter qualification implies a degree of homogeneity that might render the procedure less critical. An alternative procedure has been to match core to electrical images of the wellbore obtained using the formation microscanner (Ekstrom et al., 1987); this has the additional benefit of core orientation. The growing interest in slimhole continuous coring might lead to opportunities for evaluating and comparing these various approaches.

Representative scales of measurement
Core plugs should be sufficiently large to contain a meaningful representation of the pore structure of each lithological unit or bed that is sampled. From a practical standpoint, the smallest plug size should be at least an order of magnitude greater than the largest pore diameter. Conventional core plug sizes (2.54 cm x 5 cm) usually satisfy this requirement for clastic rocks but not for vuggy carbonates, for which whole core is needed. The logs should effectively resolve the beds from which core has been taken. In the case of thinly laminated sand-shale sequences, the sand layers will only be resolved by high resolution logs such as micrologs. The beds are presumed to be horizontal and laterally consistent in terms of physical properties, at least to the depth of investigation of the logging tools. Where the beds are not horizontal or laterally homogeneous, core and log data will show more pronounced residual departures (Enderlin et al., 1991). These departures might be reconcilable through forward modelling of tool response using any available core information about formation anisotropy. However, in extreme cases, core and log data might always be poorly correlated (Marchant and White, 1968). Obvious difficulties can be expected in the cases of highly deviated and horizontal wells.

Equivalent textures
Rock textures can be described as equivalent when they relate to the same grain and pore geometries. Core data should be measured on samples that have been undamaged by the acquisition, handling and cleaning processes. Recovered core material is markedly variable in how induced changes in hydration, stress and temperature affect its reservoir properties. In particular, routine core analysis practice can induce damage when interface-sensitive clay minerals are present (Heaviside et al., 1983; Pallatt et al., 1984; Worthington et al., 1988). Log data should describe zones that have not been damaged by drilling and other operational processes, e.g. stress-release or thermally-induced fractures. Formation damage is, of course, most likely to occur close to the well bore.

Comparable measurements
The same parameter should be measured both downhole and in the laboratory if the data are to be effectively integrated. For example, electric and induction logs measure horizontal resistivity at given frequencies; it is important that the corresponding laboratory measurements on horizontal core plugs are recorded at similar frequencies to avoid problems arising from the frequency dependence of electrical data. Furthermore, the correlated data should be reported in the same parametric system. In the case of porosity, one might use total interconnected porosity or free fluid porosity, but not both.
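As an illustration of the depth-merging step described under 'Matched depths' above, the bulk shift between a core gamma record and the corresponding wireline gamma log can be estimated by cross-correlation. The sketch below is illustrative only: it assumes both records have already been resampled to equal-length arrays on a common 15 cm increment, and the function name, sampling interval and search window are assumptions rather than an operator's actual workflow.

```python
import numpy as np

def estimate_depth_shift(core_gamma, log_gamma, sample_interval_m=0.15, max_shift_m=3.0):
    """Estimate the depth shift (m) that best aligns a core gamma record with a
    wireline gamma log. Both inputs are assumed to be equal-length arrays sampled
    on the same regular depth increment. A positive shift means the core record
    must be moved deeper to match the log."""
    max_lag = int(max_shift_m / sample_interval_m)
    # Normalize both records; a featureless gamma record (near-zero variance)
    # would make this step unstable, which mirrors the caveat in the text.
    core_z = (core_gamma - np.mean(core_gamma)) / np.std(core_gamma)
    wire_z = (log_gamma - np.mean(log_gamma)) / np.std(log_gamma)
    best_lag, best_corr = 0, -np.inf
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = core_z[:len(core_z) - lag], wire_z[lag:]
        else:
            a, b = core_z[-lag:], wire_z[:len(wire_z) + lag]
        n = min(len(a), len(b))
        if n < 10:
            continue
        corr = float(np.dot(a[:n], b[:n]) / n)
        if corr > best_corr:
            best_corr, best_lag = corr, lag
    return best_lag * sample_interval_m, best_corr
```

The returned correlation coefficient gives a rough indication of how trustworthy the match is before the shift is applied to the core depths.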

Elements of an idealized data integration strategy

Key intervals
Of the various prerequisites described here, the ones over which we have no control are the relative bed thickness and lateral homogeneity. We have a choice of core plug sizes and can select logging tools to sense different volumes around the near-wellbore zone, but we cannot alter the intrinsic vertical resolution of logging tools without departing from the standard logging suite. Even then, it may not be possible to measure the same parameter(s) with a sharper resolution, simply because of tool availability. A beneficial option is to restrict the core-log integration to the central parts of relatively thick beds in the first instance. This would allow a 'calibration' of the log data to be effected to the exclusion of wireline data close to bed boundaries. The calibration should be restricted to selected key intervals. A key interval is one which consists of beds whose thicknesses are significantly greater than the coarsest vertical resolution within the logging suite and which has been fully cored. Algorithms for tying logs back to core should be established only for key intervals. No other data should be considered. The criteria for selecting key intervals can be based on Table 1. There, the tool intrinsic resolution is an engineering design characteristic (e.g. source-detector spacing), log sensitivity is the minimum bed thickness for significant manifestation in the logging curve where there are non-extreme contrasts between beds, and log parametric resolution is the minimum bed thickness needed for full manifestation in the logging curve, i.e. the log faithfully records the parametric value for the bed in question (Worthington, 1990). Bed thickness should be several times greater than the minimum needed for the logging tool to achieve parametric resolution, i.e. to make a measurement that is as close to the truth as other environmental factors will allow, subject to perfect tool operation; a simple screening sketch along these lines follows Table 1. Obviously some compromise will be needed for the induction log in resistive beds, which is justified by, for example, the success of Olson (1986), Dyos et al. (1988) and Dawe and Murdock (1990) in reconciling core-derived water saturations with those determined from deep induction logs. In some instances, e.g. thinly laminated sand-shale sequences, it may not be possible to identify key intervals: the alternative, but related, practice is to use high resolution logs with core samples providing calibration.

Data characteristics
Table 2 compares the characteristics of core and log data and indicates which are superior, from the standpoint of petrophysical input to reservoir description, in terms of continuity of sampling, sample volume, accuracy and precision, measurement conditions and evaluation philosophy. Ideally, we should try to merge the data to derive the maximum benefit from the collated database. Thus we take advantage of the greater accuracy and precision of core data and the fact that these data can be interpreted directly whereas log data cannot. On the other hand, by merging the data, we link these benefits to the continuous nature and larger sampling volume of log data, and to the advantage that log data relate to in situ conditions. We shall now consider the integration of core and log data in terms of each of these data characteristics.

Table 2 Relative merits of core and log data

Characteristic            Core             Log            More desirable for scaling up
Evaluation procedure      Direct           Indirect       Core
Measurement conditions    Experimental     In situ        Log
Accuracy and precision    Higher           Lower          Core
Continuity                Discontinuous    Continuous     Log
Sampling volume           Smaller          Larger         Log

The final column indicates the more desirable data from the standpoint of scaling up

Evaluation procedure
Logs are records of physicochemical characteristics that can be measured downhole; they are not records of the reservoir properties we need to know. Therefore the approach to reservoir evaluation using well logs is necessarily one of indirectness, i.e. we measure a physical parameter that is known to be related to the target reservoir parameter such as porosity or water saturation. In contrast, core data are presented directly in terms of the reservoir parameters. After environmental corrections the log data may need to be further corrected for shale or light hydrocarbon effects. An algorithm is then used to link the corrected physical measurements to porosity, water saturation or possibly permeability, all in appropriate parametric units. The algorithm can be conceptual [e.g. the time-average equation for relating acoustic velocity to porosity (Wyllie et al., 1956)], empirical [e.g. the Archie equations (Archie, 1942)] or theoretical [e.g. the application of the Hanai-Bruggemann theory to the evaluation of water saturation in shaly sands (Bussian, 1982)]. It can also be universal, reservoir-specific, or even variable within a reservoir. The form of the algorithm must be clearly established and verified through core control. This is the first level of core calibration of log data.

Table 1 Vertical resolutions of logging tools

Log          Tool intrinsic resolution* (m)   Log sensitivity† (m)   Log parametric resolution† (m)
Gamma ray    0.2-0.3                          0.3-0.4                0.6-0.75
Density      0.4                              0.25-0.3               0.46-0.6
Induction    2.1-2.4                          0.75-0.9               1.2-1.5 (conductive beds); 10 (resistive beds)
Neutron      0.4                              0.5-0.6                2.4-2.6
Sonic        0.6                              0.6-0.8                1.3-1.4

* From Allen et al. (1988)
† Assumes no environmental effects, no noise
Notes: Tool intrinsic resolution is a tool design characteristic; log sensitivity is the minimum bed thickness that can be seen as an event in the logging curve, with moderate parametric contrasts with the shoulder beds; and log parametric resolution is the minimum bed thickness needed for full manifestation of the logged parameter(s) in the logging curve
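By way of illustration, the key-interval test described above can be reduced to a screening of bed thicknesses against the coarsest parametric resolution in the logging suite. The sketch below is a minimal version of that idea: the resolution values loosely follow Table 1 (using the conductive-bed case for the induction log), the safety factor of three stands in for the 'several times greater' criterion in the text, and the bed boundaries and function names are hypothetical.

```python
# Hypothetical parametric resolutions (m), loosely following Table 1.
PARAMETRIC_RESOLUTION_M = {
    "gamma ray": 0.75,
    "density": 0.6,
    "induction": 1.5,   # conductive-bed case
    "neutron": 2.6,
    "sonic": 1.4,
}

def key_intervals(bed_boundaries_m, logging_suite, fully_cored, safety_factor=3.0):
    """Return (top, base) pairs of beds that are thick enough, relative to the
    coarsest parametric resolution in the suite, to serve as key intervals,
    provided the bed has been fully cored."""
    coarsest = max(PARAMETRIC_RESOLUTION_M[tool] for tool in logging_suite)
    threshold = safety_factor * coarsest
    beds = zip(bed_boundaries_m[:-1], bed_boundaries_m[1:])
    return [(top, base) for (top, base), cored in zip(beds, fully_cored)
            if cored and (base - top) >= threshold]

# Example: four beds defined by five boundary depths, with recovery flags per bed.
print(key_intervals([2000.0, 2004.5, 2006.0, 2013.2, 2015.0],
                    ["gamma ray", "density", "sonic"],
                    fully_cored=[True, True, True, False]))
```

With a sonic-dominated threshold of about 4.2 m, only the first and third beds would qualify in this hypothetical example.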


As an example, let us consider the case of a quartz sand from a North Sea field with isolated micropores within the grains. The bulk density is 2.3 g cm⁻³. The mineral density is 2.65 g cm⁻³ but the grain density is only 2.55 g cm⁻³. The conceptual relationship linking porosity φ_D to bulk density ρ_b is, for the particular case of the density log,

\[ \phi_D = \frac{\rho_{ma} - \rho_b}{\rho_{ma} - \rho_f} \tag{1} \]

where ρ_ma, ρ_f are the matrix and fluid densities, respectively. The question is: 'which matrix density do we use?'. If we follow standard practice and use 2.65 g cm⁻³, φ_D is calculated as an absolute porosity of 21%. This figure includes the isolated micropores and would not correctly indicate the reservoir porosity for practical purposes. However, if we are enlightened, we will use the grain density of 2.55 g cm⁻³, for which φ_D is calculated as a total interconnected porosity of 16%. This figure excludes the isolated micropores that are represented within the correctly measured grain density, the use of which allows good correlation between φ_D and porosity determined in the laboratory from helium expansion measurements. Thus algorithm (1) is correctly characterized. This example emphasizes the need to determine precisely those characterizing parameters, such as ρ_f and ρ_ma, which govern the form of the (conceptual) algorithms relating log measurements to reservoir properties. Another example, this time of an empirical core-calibrated algorithm for permeability prediction from nuclear magnetism and density logs in Western Canada, is that of Logan (1989). This example is particularly interesting because it describes a regional adaptation of the universal algorithm of Timur (1969).
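A minimal numerical check of this example is sketched below, assuming a pore-fluid density of 1.0 g cm⁻³ (not stated explicitly above); the function name is arbitrary.

```python
def density_porosity(rho_matrix, rho_bulk, rho_fluid=1.0):
    """Equation (1): porosity from the density log."""
    return (rho_matrix - rho_bulk) / (rho_matrix - rho_fluid)

rho_bulk = 2.30                                          # log bulk density, g/cm3
phi_absolute = density_porosity(2.65, rho_bulk)          # mineral density -> about 0.21
phi_interconnected = density_porosity(2.55, rho_bulk)    # measured grain density -> about 0.16
print(round(phi_absolute, 2), round(phi_interconnected, 2))
```

The 5% difference between the two answers comes entirely from the choice of matrix density, which is the point of the example.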

Measurement conditions
Core-log integration is more effective where the core data themselves relate to simulated reservoir conditions. Ideally, data should be acquired through pressure coring with rapid preservation, with a compatibility of measurement directions where appropriate, and using simulated reservoir fluids, tectonic stresses and reservoir temperatures. Although this requirement strictly encompasses pressure and temperature, it is general industry practice to consider only effective overburden stresses on samples that have been allowed to depressurize during recovery. At the very least, a reservoir should be studied to establish the influence of stress on porosity and permeability. If there is no significant dependence, the problem falls away. However, if there is a stress effect, some form of correction to core data is needed before these data can be used to calibrate log data. In such instances the specification of experimental conditions is a crucial step in ensuring the accuracy and representativeness of core data. It is worth mentioning that some core data (in the public and private domains) seem to be independent of environmental factors. A synthesis of these data and investigation of common characteristics remain priorities of applied research.

Accuracy and precision
It is the practice to assume correctly functioning systems both in the laboratory and in situ. Uncertainties in log data are caused by the remoteness and the environment of tool measurement. Corrections for borehole effects, invasion effects and bed thickness all have inherent errors. The error associated with the bed thickness correction is greater than that for borehole correction. To overcome this problem, log readings should be chosen on plateaux where appropriate. Other errors are partly accommodated in tying back to core. Log data can be greatly improved by signal enhancement or deconvolution processing (Dyos, 1986). This is an area which will continue to advance rapidly. In contrast, core data are seen as relatively accurate and precise (Hamilton and Stewart, 1983).

As an example, Figure 1 illustrates the tying back to core of a density log. This log was run in a shallow cored hole drilled for coal exploration through thin beds with excellent depth control. There was a negligible mud cake and therefore only the long spaced density data are shown. Figure 1a contains unprocessed log data that show departures in terms of accuracy (gradient ≠ 1) and precision (degree of data scatter). The signal-enhanced density log (Figure 1b) shows a marked increase in both accuracy and precision. In this example, the excellent depth control allows the benefit of signal enhancement to be strongly evident. The agreement would be improved still further if the vertical resolutions of log and core data could be rendered more compatible.

Figure 1 Examples of cross-plots of core versus log density for (a) conventional log data and (b) signal-enhanced log data (courtesy C. J. Dyos, personal communication)
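The tying-back exercise of Figure 1 amounts to fitting a calibration line through core versus log values and inspecting its gradient and scatter. A minimal sketch is given below, assuming depth-merged arrays of core and log densities; in practice the fit would be restricted to key intervals and to readings away from bed boundaries.

```python
import numpy as np

def tie_log_to_core(core_values, log_values):
    """Least-squares fit core = a*log + b, returning the calibration
    coefficients and the RMS residual as a measure of scatter."""
    a, b = np.polyfit(log_values, core_values, 1)
    residuals = np.asarray(core_values) - (a * np.asarray(log_values) + b)
    rms = float(np.sqrt(np.mean(residuals ** 2)))
    return a, b, rms

# A gradient near 1 and a small RMS residual suggest the log needs little
# correction; marked departures point to the calibration or environmental
# problems discussed above.
```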

Continuity and sampling volume
These two data characteristics are considered separately. Logs hold the advantage in both instances, being continuous and sensing a larger volume, i.e. they are closer to the flow unit scale at which we have to characterize a formation for reservoir description purposes. We therefore need to manipulate core data to bring them closer to the logs. In this way, the logs become an extension of the core data. The continuity of core data is limited by the sampling interval chosen for routine core analysis, although special core analysis studies have more flexibility. The sampling interval for plugs must no longer be considered in isolation from the logs. Key intervals should be chosen by considering log data in conjunction with core natural gamma records. Identified key intervals should have a plug sampling interval of


15 cm to correspond to the conventional log reporting interval. Plugs should be taken away from bed boundaries. This strategy would provide the same continuity in core data as in conventional log data over the most important intervals. It is worth mentioning that the resolution of dual detector nuclear logging tools can be enhanced through the processing of data sampled digitally at the smaller interval of 3 cm.

Plug sampling volume is usually much less than 1% of the volume sensed by conventional logs used in the formation evaluation process, although the comparison is influenced by tool type and environmental conditions. For example, a conventional core plug samples about 25 cm³ of formation. In contrast, a natural gamma ray tool run in a 20 cm hole with a 30 cm depth of investigation and a vertical resolution of 0.75 m samples about 3.7 × 10⁵ cm³ of rock, but not with uniform weighting. Yet again, a micro-resistivity device with an 8 cm depth of investigation and a vertical resolution of 5 cm samples about 2.3 × 10³ cm³ of rock. The locations of plugs for special core analysis should be guided by X-ray computed tomography scanning where possible to avoid localized heterogeneities, which a log would suppress. For purposes of scaling up, plugs are intended to be representative of the different layers sensed by a logging tool and localized heterogeneities that are atypical of the rock mass can prevent this objective from being achieved. Plugs should be taken horizontally for resistivity and permeability measurements and vertically for acoustic measurements. This would allow directional compatibility with the corresponding well logs or formation tests. However, even this refinement only goes part of the way towards representing the complexity of the reservoir. These issues become even more poignant in the case of horizontal wells.

For scaling-up purposes and to facilitate data comparisons, the core data should be averaged over a distance that corresponds to the vertical resolution of the appropriate logging tool. If the tool has been subjected to signal enhancement, so that the vertical resolution is processed to be sharper, it is the latter distance over which plug averages should be determined. This procedure should take place over selected key intervals for which plugs have been taken every 15 cm. Such an interval is already half that used in conventional core analysis within the petroleum industry. The simplest approach would be a five-point running mean, which would bring the resolution of the core data closer to that of conventional log data. It can be shown that if the core plugs are from the same bed, this procedure carried out with equal weightings will significantly reduce the uncertainty. The averaging of core data can be designed to reflect the response curve of the corresponding logging tool by using appropriate weightings (Barlai, 1979). This refinement might increase the physical significance of the resulting mean. Certainly, the agreement between core and log data can be enhanced through judiciously selected filtering of the core data (Dawe and Murdock, 1990). Agreement can also be enhanced by processing the log data through some form of deconvolution, as we saw earlier. However, as we are primarily concerned with scaling up, rather than scaling down, this aspect is not developed further here.
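The depth-averaging step described above can be sketched as a running mean of the plug values, optionally weighted to approximate a tool response curve. The 15 cm plug spacing is taken from the text; the weights and plug values below are assumptions, not a prescribed filter.

```python
import numpy as np

def average_core_to_log_scale(plug_values, window=5, weights=None):
    """Running mean of core plug values (assumed sampled every 15 cm) over
    'window' points, so that a five-point window spans about 60 cm and
    approaches the vertical resolution of a conventional density log.
    Unequal weights can be supplied to mimic a tool response curve
    (cf. Barlai, 1979)."""
    if weights is None:
        weights = np.ones(window)
    weights = np.asarray(weights, dtype=float)
    weights /= weights.sum()
    return np.convolve(plug_values, weights, mode="valid")

plugs = np.array([2.35, 2.37, 2.30, 2.55, 2.52, 2.50, 2.48, 2.51])
print(average_core_to_log_scale(plugs))                            # equal weights
print(average_core_to_log_scale(plugs, weights=[1, 2, 3, 2, 1]))   # response-shaped weights
```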

As an example of the benefits of core data smoothing, Figure 2 depicts a simulated density log over a model interval that contains six porous beds, with porosities 0.10, 0.15, 0.20, 0.25, 0.30, 0.35 and with shale strata between them. Each bed is 2.4 m thick and each has been sampled at 15 cm intervals starting 7.5 cm into the bed. Thus plugged intervals are 7.5, 15, 22.5 ... cm, measured from a bed boundary. The core data have been specified to show a cyclicity of bulk density. It is presumed that each core plug samples a horizontal homogeneous layer of thickness 15 cm. The simulated density log averages the core data, a feature which is clearly evident in Figure 2. In a real situation, we would be confronted with the density log and would cross-plot this against the core data, as shown in Figure 3a. The inputs to this figure are the 'measured' core data for the specified levels and the digitized log data sampled at those same levels. The net result is a scatter of data about the line of unit gradient with outliers due to shoulder-bed effects. In contrast, Figure 3b shows the same plot but with the core data averaged by a five-point running mean over a vertical distance of 60 cm, which is approaching the vertical resolution of the density tool (cf. Table 1). Now the scatter has almost disappeared. Thus, although it is not possible to account for lateral near-wellbore heterogeneities that the tool might see but the core would not, it is possible to account for heterogeneities vertically even though log and core data have different vertical resolutions.

Figure 2 Simulated density log for alternating sand-shale model sequence


Figure 3 Cross-plots of core density versus log density for (a) level by level correlation and (b) five-point depth-averaging of core data for model of Figure 2

This approach will be less effective in markedly heterogeneous and/or thinly bedded formations; for such instances a strip-sampling approach has been proposed (Dahlberg and Fitz, 1988). This involves the strip-sampling of core over a distance that approximates the spatial resolution of the logging tool. This technique involves destructive testing. It allows the sample to be mixed over the length of the strip. However, it is not standard practice.

Core-derived algorithms for log interpretation

There have been numerous published examples of core data being compared with the interpretations of log data. Some interesting recent examples are Nieto and Yale (1991), Bouvier and Maquignon (1991), Marion and Jizba (1993) and Trewin and Morrison (1993). These examples draw on the strategic elements of the previous section to varying degrees. Invariably, the form of the interpretative algorithm is determined at the core scale and then applied at the log scale. Although this approach has often worked well, there have been occasions where data reconciliation has not been possible, even where the formation seems laterally homogeneous. These mismatches have been responsible for the view that core and log data might have a residual incompatibility, which cannot be reconciled even under favourable conditions. In the case of core validation of log interpretation, a possible contributive explanation might be the sensitivity to upscaling of the empirical algorithms used in petrophysics.

To illustrate this argument, let us consider a unit block of porous material that is divided into halves, each of which is homogeneous and isotropic (Figure 4). The two halves are characterized by porosities and formation resistivity factors φ₁, F₁ and φ₂, F₂, respectively. We want to relate F₁ and F₂ to the formation factor F of a unit block that is homogeneous and is electrically equivalent to the divided block. In logging terms, we might imagine the characteristics of the uniform block to be the interpretation of logs that cannot resolve separately the two halves, which would be characterized by core analysis. The problem therefore simulates the relationship of core data to log data.

There are two ways in which we can establish the relationship between F₁ and F₂, on the one hand, and F, on the other. One approach is to use the concept of resistors in parallel to calculate F directly; this approach is permissible because the formation factors are scaled resistivities. We find that

\[ \frac{2}{F} = \frac{1}{F_1} + \frac{1}{F_2} \tag{2} \]

The second approach is to use the well established averaging algorithm for porosity and to substitute formation factors for porosities using Archie's first law (Archie, 1942). In this instance, the starting point is the expression

\[ \tfrac{1}{2}(\phi_1 + \phi_2) = \phi \tag{3} \]

Figure 4 Example of non-robust algorithm for cross-scaling (Archie equation): route 1 (resistors in parallel) and route 2 (porosity averaging with the formation factor power law) give estimates that disagree unless m = 1


If we adopt the following form of Archie's first law

\[ F = \frac{1}{\phi^m} \tag{4} \]

where m is an empirical exponent that is related to pore geometry and is assumed to be constant (default value = 2), we can substitute for φ₁, φ₂ and φ in Equation (3) as follows

\[ \tfrac{1}{2}\left[\left(\frac{1}{F_1}\right)^{1/m} + \left(\frac{1}{F_2}\right)^{1/m}\right] = \left(\frac{1}{F}\right)^{1/m} \tag{5} \]

Equations (2) and (5) represent two different approaches to the averaging or scaling up of petrophysical properties, the former based on the law of physics, the latter on an empirical algorithm in routine use. It can be seen that Equations (2) and (5) are identical only if m = 1. This condition is known to be satisfied only for systems of parallel capillaries, which have not been introduced here. Otherwise, Equations (2) and (5) provide different estimates of F. Thus, in the general case, our system of equations is not robust in terms of cross-scale application, i.e. different estimates of F are obtained by taking routes 1 and 2 (Figure 4). Possible explanations are that the parallel resistor concept cannot be applied to resistors that are in electrical contact or, perhaps more likely, that the empirical origin of Archie's law makes the exponent m a function of the scale of measurement. In any case, although Archie's first law is in standard use in the petroleum industry, it is frequently abused because interpreters lose sight of the limitations associated with its empirical foundation (Worthington, 1993).

As a second illustration, let us consider another divided unit block (Figure 5), the two halves being characterized by porosities and densities φ₁, ρ₁ and φ₂, ρ₂, respectively. We want to relate ρ₁ and ρ₂ to the density ρ_b of a homogeneous unit block that is materially equivalent to the divided block. Again, there are two ways in which this task might be addressed. The first approach is based on a summation of masses, so that

\[ \tfrac{1}{2}(\rho_1 + \rho_2) = \rho_b \tag{6} \]

Figure 5 Example of robust algorithm for cross-scaling (density-porosity equation): route 1 (summation of masses) and route 2 (mean density via the density-porosity equation) always give the same estimate

The second approach is to use Equation (3) in conjunction with the density mixing law for porous media, i.e.

\[ \rho_b = \phi\rho_f + (1 - \phi)\rho_{ma} \tag{7} \]

where ρ_f, ρ_ma are, again, the densities of the interstitial fluid and the rock matrix, respectively. Equation (7) can be written

\[ \phi = \frac{\rho_{ma} - \rho_b}{\rho_{ma} - \rho_f} \tag{8} \]

Equation (8) has the same form as Equation (1). We can use Equation (8) to substitute for φ in Equation (3), so that

\[ \tfrac{1}{2}\left[\frac{\rho_{ma} - \rho_1}{\rho_{ma} - \rho_f} + \frac{\rho_{ma} - \rho_2}{\rho_{ma} - \rho_f}\right] = \frac{\rho_{ma} - \rho_b}{\rho_{ma} - \rho_f} \tag{9} \]

Equation (9) reduces to Equation (6). This means that both approaches lead to the same prediction of ρ_b. Thus the system of equations is robust in terms of cross-scale application, i.e. the same estimates of ρ_b are obtained by taking routes 1 and 2 (Figure 5). It is important to note that these equations are conceptual, in contrast with the previous example where the system was partly founded on empiricism. As petrophysics continues to move strongly towards cross-discipline integration in pursuit of improved reservoir description, the question of the scale compatibility of petrophysical algorithms will be subject to further scrutiny. Even if core analysis practised in accordance with a sound technical strategy does not prove to be fully reconcilable with log data, it does provide some ground-truthing which, if properly used, can aid in the quantification of uncertainty and the management of risk. Further progress will no doubt be forthcoming from the technical advances that are in hand.
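The contrast between the two illustrations can be checked numerically. The sketch below evaluates F for the divided block by route 1 (parallel resistors, Equation (2)) and by route 2 (porosity averaging with Archie's first law, Equations (3)-(5)), and then the corresponding density routes (Equations (6)-(9)); the porosities, densities and m value are arbitrary choices for illustration.

```python
def archie_F(phi, m=2.0):
    return 1.0 / phi ** m                          # Equation (4)

phi1, phi2, m = 0.10, 0.30, 2.0
F1, F2 = archie_F(phi1, m), archie_F(phi2, m)

# Route 1: resistors in parallel, Equation (2)
F_route1 = 2.0 / (1.0 / F1 + 1.0 / F2)
# Route 2: average the porosities (Equation (3)), then apply Archie's law (Equation (4))
F_route2 = archie_F(0.5 * (phi1 + phi2), m)
print(F_route1, F_route2)                          # 20.0 vs 25.0: differ unless m = 1

# Density analogue, Equations (6)-(9): both routes agree exactly
rho_ma, rho_f = 2.65, 1.0
rho1 = phi1 * rho_f + (1 - phi1) * rho_ma          # Equation (7)
rho2 = phi2 * rho_f + (1 - phi2) * rho_ma
rho_route1 = 0.5 * (rho1 + rho2)                   # Equation (6)
rho_route2 = 0.5 * (phi1 + phi2) * rho_f + (1 - 0.5 * (phi1 + phi2)) * rho_ma
print(rho_route1, rho_route2)                      # identical (2.32, 2.32)
```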

Contemporary developments


The strongest technical drive to integrate core and log data at the present time is to be found in the technology and engineering development function of the Ocean Drilling Program (Meyer, 1991). The ODP was initiated in 1983 and it has already been responsible for many scientific expeditions (legs) to explore the structure and history of the earth as revealed beneath the oceans. The exploration strategy has been to drill fully cored boreholes at target sites and then to run a full suite of wireline logs at each site. The database of core and log data is therefore voluminous and, if not managed effectively, it might become difficult to handle, even during the course of a single leg. The present objective is to provide an integrated database of core and log data for each drilled site in as close to real time as possible. The following ongoing technical developments are directed at achieving this goal. The developments are reported here because, as they mature, they


will contribute strongly to the improved integration of core and log data.


Sonic core monitor
In cases of incomplete core recovery it has been the practice to 'hang' the recovered core from the top of the core barrel as though recovery had been continuous over the uppermost portion of the interval and zero over the rest. The sonic core monitor, which is part of a prototype hard-rock core orientation system (Rhinehart, 1993), is designed to record the rate of core recovery so that, by comparing this rate with the rate of bit penetration, the correct location of each core piece within the core barrel can be determined. The system comprises an ultrasonic source/sensor at the top of the core barrel and a reflector positioned on the top of the advancing core (Figure 6). The length of the core column is continuously recorded by measuring the travel time of the reflected pulse. An example of data output is shown in Figure 7. The zones of poor core recovery are immediately recognizable. Now that the concept is proved, the prototype design is being upgraded through the development of a second tool for routine field use in consolidated rocks.
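The principle can be reduced to a simple travel-time calculation: the reflector rides on top of the advancing core, so the two-way travel time of the ultrasonic pulse gives the distance from the sensor to the core top and hence the length of core in the barrel. The sketch below is purely illustrative; the barrel length, the sound speed in the barrel fluid and the function names are assumptions, not details of the ODP prototype.

```python
def core_column_length(two_way_time_s, barrel_length_m=9.5, sound_speed_m_s=1500.0):
    """Length of the core column, from the two-way travel time of the pulse
    reflected off the target riding on top of the core (assumed values)."""
    distance_to_core_top = 0.5 * two_way_time_s * sound_speed_m_s
    return barrel_length_m - distance_to_core_top

def recovery_vs_penetration(times_s, bit_penetration_m):
    """Pair each bit-penetration reading with the core length recovered at that
    time; intervals where penetration advances but core length does not are
    zones of poor recovery (cf. Figure 7)."""
    return [(pen, core_column_length(t)) for t, pen in zip(times_s, bit_penetration_m)]
```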

Figure 6 Sonic core monitor: phase 1 (courtesy D. Huey, personal communication)

Figure 7 Sonic core monitor: bit penetration versus corrected core height (courtesy D. Huey, personal communication)

Multi-sensor track
The oil industry traditionally records core natural gamma counts by running the recovered core on a belt through a gamma ray detector. The ODP has extended this concept to include other continuous measurements such as magnetic susceptibility, gamma ray attenuation and P-wave velocity. This so-called 'multi-sensor track' is a screening facility; it provides a reference scale to which other core data can be related, e.g. physical properties measurements on core plugs. As such, it provides a reference core depth scale for subsequent matching with the reference log depth scale. Following the success of ODP deployment, multi-sensor tracks of various designs are starting to be used commercially.

Interactive data display
The aim is to integrate laboratory data and downhole measurements within a common user-friendly system. The system is required to be interrogatable and it must have the capability to select and display in interactive mode all those data that are relevant to a particular interpretation exercise. There are four essential stages in this process: (i) integrate core data; (ii) integrate log data; (iii) merge core and log data; and (iv) display core and log data. At the present time item (i) is receiving the highest priority. Data are being stored as computerized barrel sheets for which the display can be varied. An example is shown in Figure 8. The intention is to merge the log data so that they too can be displayed on call in the available left-hand tracks. This would mean that any log or core data could be called up and displayed with a common depth scale subject only to track availability. This facility would provide the opportunity for interactive data interpretation using the core-log database during the course of an ODP leg.
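Stage (iii), merging core and log data, essentially means posting discrete core measurements onto the continuous log depth frame so that both can be displayed in the same tracks. A minimal sketch using pandas is shown below; the column names and the 10 cm matching tolerance are assumptions and bear no relation to the ODP database structure.

```python
import pandas as pd

def merge_core_onto_log(log_df, core_df, tolerance_m=0.10):
    """Attach each core plug measurement to the nearest log sample depth,
    provided the depths agree within the stated tolerance."""
    log_sorted = log_df.sort_values("depth_m")
    core_sorted = core_df.sort_values("depth_m")
    return pd.merge_asof(log_sorted, core_sorted, on="depth_m",
                         direction="nearest", tolerance=tolerance_m)

log_df = pd.DataFrame({"depth_m": [2000.00, 2000.15, 2000.30],
                       "density_log": [2.41, 2.43, 2.40]})
core_df = pd.DataFrame({"depth_m": [2000.14, 2000.31],
                        "density_core": [2.45, 2.38]})
print(merge_core_onto_log(log_df, core_df))
```

Unmatched log samples simply carry missing core values, so the continuous log frame is preserved for display.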

Petrophysical definition of lithological units


There can be little doubt that a primary research goal of this decade is the establishment of a robust procedure for the integrated description of petroleum reservoirs. Many different avenues are being explored. The techniques can be deterministic, analytical or probabilistic, or some combination of these. There is no clear indication at present as to which approach with its facilitating software will emerge as the scientifically preferred option. However, we can be absolutely certain of one thing: whatever method is used, the approach will require the definition of basic lithological building blocks based on the conjunctive use of core and log data. The procedures outlined here are


intended to provide a sound foundation for the integration of core and log data so that these building blocks can be identified with confidence. However, there are other aspects that are worthy of mention within the framework of reservoir description. First, although the culture has to be one of scaling up to seismic and simulator grid scales, we should not lose sight of the sharp spatial resolution that is afforded by core data and microresistivity logs. These data contribute strongly to the definition of bed boundaries, which governs the scaling-up process. They are also foundations for the description of the reservoir architecture. In the particular case of thinly laminated sands and shales that are far below the limits of resolution of most conventional logging tools, there are no obvious key intervals. Here, high resolution tools will have to be used to characterize, as far as possible, the sand laminations, with core data being used to calibrate tool response over the sand intervals. The role of core data is strengthened further as strata thicknesses approach the core scale. Second, the scale of the lithological building blocks will vary according to the degree of heterogeneity of the reservoir. In some instances the building blocks will be comparable with seismic scales; more generally, they will be much smaller. Third, in clastic reservoirs we might expect that the building blocks will scale up to flow units in


some (perhaps as yet undefined) way. In (fractured) carbonate reservoirs the link will be much more tenuous, the massive rock acting as a 'hinterland' for the secondary conduits. Finally, the extension of core-log integration into the broader realm of reservoir description will bring together people from traditionally distinct disciplines. If the data integration initiative in its broadest sense is to be successful, we will require a subpopulation of geoscientists with a specialism in one of the recognized earth sciences and the background and ability to think laterally. These comments are provided with the object of placing the technical strategy for effective core-log integration into a broader perspective. To say more would exceed the scope of this paper.

Conclusions

Core-log integration is an integral component of reservoir characterization practice. It can be enhanced by the judicious selection of key intervals consisting of relatively thick, fully cored beds. Key intervals should be targeted on the basis of seismic and geological zonation and the character and quality of log data. Plug selection should be driven by the need for compatibility between the different scales of measurement, principally the core and log scales.


Figure 8 Example of a computerized core barrel sheet (courtesy T. Janecek, personal communication)

Plug locations should be chosen on the basis of log responses and X-ray computed tomography scanning, the latter to investigate any heterogeneities in selected core pieces. For maximum effectiveness the plugs should be selected by the petrophysicists who are ultimately responsible for log evaluation. This means that geologists and petrophysicists must work as a team, not in isolation.

The clear goal is a significant reduction in uncertainty. The key to reducing uncertainty is to see the drilling, core selection and measurement, and wireline logging as components of a coupled system which must be dovetailed if maximum benefits are to be derived. However, the goal will not be fully achieved unless there is also an integration of those people who are active in these traditionally distinct areas.

Acknowledgements

This paper is an extension of that presented at the Society of Core Analysts Annual Technical Conference in San Antonio, Texas in August 1991. The author acknowledges the original support of the Ocean Drilling Program and of BP Exploration and BP Research in publishing this formulation of views. In particular, thanks are due to Andy Fisher, Dave Huey and Tom Janecek for providing the illustrations of ODP technology, and to Dick Woodhouse, Nadia Pallatt and Dave Buller, all formerly of BP, for providing helpful philosophical input.

References

Allen, D., Anderson, B., Barber, T., Des Ligneris, S., Everett, B., Flaum, C., Hemingway, J. and Moriss, C. (1988) Advances in high resolution logging Schlumberger Tech. Rev. 36(2), 4-14
Archie, G. E. (1942) The electrical resistivity log as an aid in determining some reservoir characteristics Trans. AIME 146, 54-62
Barlai, Z. (1979) Some theoretical aspects of how to correlate well-log-evaluated data to core-determined reservoir properties Transactions of the SPWLA 20th Annual Logging Symposium, E1-20
Bouvier, L. and Maquignon, S. M. (1991) Reconciliation of log and laboratory derived irreducible water saturations in a double porosity reservoir. In: Advances in Core Evaluation II (Eds P. F. Worthington and D. Longeron), Gordon and Breach, Reading, 275-292
Bussian, A. E. (1982) A generalized Archie equation Transactions of the SPWLA 23rd Annual Logging Symposium, E1-12
Cooke-Yarborough, P. (1987) A method for reconciliation of log-derived and core-derived porosity Transactions of the SPWLA 28th Annual Logging Symposium, DD1-24
Dahlberg, K. E. and Fitz, D. E. (1988) Comparing log-derived and core-derived porosity and mineralogy in thinly-bedded reservoirs: an integrated approach Transactions of the SPWLA 29th Annual Logging Symposium, W1-18
Dawe, B. A. and Murdock, D. M. (1990) Laminated sands: an assessment of log interpretation accuracy by an oil-base mud coring programme SPE Paper 20542, Society of Petroleum Engineers, Richardson, 12 pp
Dutch, R. A. (1962) Roget's Thesaurus of English Words and Phrases, Longman, London, 1309 pp
Dyos, C. J. (1986) Inversion of well log data by the method of maximum entropy Transactions of the SPWLA 10th European Formation Evaluation Symposium, H1-15
Dyos, C. J., Petler, J. S., Jones, M. R. O., Cuddy, S. and Wilkinson, D. (1988) Reconciliation of Sw from logs and core in the North Sea Magnus Jurassic sandstone reservoir Transactions of the SPWLA 11th European Formation Evaluation Symposium, D1-12
Ekstrom, M. P., Dahan, C. A., Chen, M. Y., Lloyd, P. M. and Rossi, D. J. (1987) Formation imaging with microelectrical scanning arrays Log Analyst 28, 294-306
Enderlin, M. B., Hansen, D. K. T. and Hoyt, B. R. (1991) Rock volumes: considerations for relating well log and core data. In: Reservoir Characterization II (Eds L. W. Lake, H. B. Carroll Jr and T. C. Wesson), Academic Press, New York, 277-288
Hamilton, J. M. and Stewart, J. M. (1983) Thin bed resolution and other problems in matching log and core data Transactions of the SPWLA 24th Annual Logging Symposium, 11-13
Heaviside, J., Langley, G. O. and Pallatt, N. (1983) Permeability characteristics of Magnus reservoir rock Transactions of the SPWLA 8th European Formation Evaluation Symposium, A1-29
Logan, W. D. (1989) Bridging the gap between core permeability and log-derived permeability Transactions of the SPWLA 30th Annual Logging Symposium, V1-23


Marchant, L. C. and White, E. J. (1968) Comparison of log and core analysis results for an extremely heterogeneous carbonate reservoir Transactions of the SPWLA 9th Annual Logging Symposium, L1-16
Marion, D. and Jizba, D. (1993) Acoustic properties and their dependence on porosity, mineralogy and saturation: applications to field-scale measurements. In: Advances in Core Evaluation III (Eds P. F. Worthington and C. Chardaire-Riviere), Gordon and Breach, Reading, 43-61
Meyer, W. M. (1991) Large-scale integration of core and logging data at the Ocean Drilling Program SPE Paper 22993, Society of Petroleum Engineers, Richardson, 10 pp
Nieto, J. A. and Yale, D. P. (1991) Integration of core and downhole acoustic measurements: shear and compressional. In: Advances in Core Evaluation II (Eds P. F. Worthington and D. Longeron), Gordon and Breach, Reading, 215-237
Olson, D. M. (1986) Calibration of log and core saturation data: case history from the San Ardo Field Transactions of the SPWLA 27th Annual Logging Symposium, Z1-12
Oxford English Dictionary (1933) Oxford University Press, London
Pallatt, N., Wilson, J. and McHardy, B. (1984) The relationship between permeability and morphology of diagenetic illite in reservoir rocks J. Petrol. Technol. 36, 2225-2227
Rafdal, J. (1991) Core analysis to calibrate well log interpretation. In: Advances in Core Evaluation II (Eds P. F. Worthington and D. Longeron), Gordon and Breach, Reading, 35-51
Rhinehart, W. C. (1993) A prototype hard rock core orientation system for partial core recovery Preprint 4th Canadian Conference on Marine Geotechnical Engineering, St John's, Newfoundland, 27-30 June 1993, 7 pp
Theys, P. R. (1991) Log Data Acquisition and Quality Control, Editions Technip, Paris, 330 pp
Timur, A. (1969) Pulsed nuclear magnetic resonance studies of porosity, movable fluid and permeability of sandstones J. Petrol. Technol. 21, 775-786
Trewin, B. and Morrison, S. (1993) Reconciliation of core and log residual oil saturation through application of in situ saturation monitoring. In: Advances in Core Evaluation III (Eds P. F. Worthington and C. Chardaire-Riviere), Gordon and Breach, Reading, 197-213
Worthington, P. F. (1990) Sediment cyclicity from well logs. In: Geological Applications of Wireline Logs (Eds A. Hurst, M. A. Lovell and A. C. Morton), Spec. Publ. Geol. Soc. London No. 48, 123-132
Worthington, P. F. (1991) Reservoir characterization at the mesoscopic scale. In: Reservoir Characterization II (Eds L. W. Lake, H. B. Carroll Jr and T. C. Wesson), Academic Press, New York, 123-165
Worthington, P. F. (1993) The uses and abuses of the Archie equations, 1: the formation factor-porosity relationship J. Appl. Geophys. 30, 215-228
Worthington, P. F., Toussaint-Jackson, J. E. and Pallatt, N. (1988) Effect of sample preparation upon saturation exponent in the Magnus Field, UK North Sea Log Analyst 29, 48-53
Wyllie, M. R. J., Gregory, A. R. and Gardner, L. W. (1956) Elastic wave velocities in heterogeneous and porous media Geophysics 21, 41-70
