Quality Assurance in Environmental Monitoring


PII: S0025-326X(99)00016-8

Marine Pollution Bulletin Vol. 39, Nos. 1–12, pp. 23–31, 1999. © 1999 Elsevier Science Ltd. All rights reserved. Printed in Great Britain. 0025-326X/99 $ - see front matter

G. E. BATLEY*

Centre for Advanced Analytical Chemistry, CSIRO Energy Technology, Private Mail Bag 7, Bangor, NSW 2234, Australia

Over the past decade, there has been a growing awareness of the need for accountable quality assurance (QA) and control in chemical analysis. In environmental monitoring programs, especially those studying metal contaminants at trace and ultratrace concentrations in natural waters, the sampling, sample storage and sample preparation stages prior to analysis represent potentially greater sources of error than the chemical analysis. The improvements in data reliability and detection limits when appropriate QA is employed allow meaningful relationships between the measured concentrations and other physical and chemical parameters to be more reliably interpreted. Published data generated from both field and laboratory studies frequently show no evidence of QA, yet such data are being used to formulate quality guidelines and as the basis for expensive management actions. To ensure that only quality data form the basis of regulatory action, it is essential that monitoring data not be published without demonstrable evidence of quality practices in all aspects of the monitoring exercise. © 1999 Elsevier Science Ltd. All rights reserved.

Quality assurance (QA), quality control (QC) and associated good laboratory practice should be essential components of all environmental monitoring. The need for good QA/QC in the chemical analysis of environmental samples has been well recognised (Quevauviller et al., 1995) and has been tested in a number of international QA exercises (Wells et al., 1997; Wells and Cofino, 1997). Such diligence needs also to be applied to other components of the monitoring exercise, since in many instances these may represent a greater source of error. Data that are not based on adequate QA/QC can be in error, and their misuse can lead to wrong environmental management decisions.
By definition, QA refers to all of the actions, procedures, checks and decisions undertaken to ensure the representativeness and integrity of samples, and the accuracy and reliability of analytical results. QC comprises those actions that monitor and measure the effectiveness of QA procedures with respect to defined objectives.

*Fax: +61-9710-6837; e-mail: [email protected]

This might include checking of equipment cleanliness, duplicate sampling, measurement of field and laboratory blanks, and the analysis of replicates and reference materials. QC is therefore an essential component of a QA program, and both are usually jointly recognised by referring to the implementation of good QA/QC as an essential part of good laboratory practice. The essential steps in a sampling program for a monitoring exercise are outlined in Fig. 1, and described in greater detail by Maher et al. (1994). The exercise should begin with a definition of the issue/question or the hypothesis to be tested. In environmental studies of natural water systems, the issue typically involves assessing the impacts of some activity on water quality, the extent of contamination of sediments, or the levels of contaminants in biota that might affect their suitability for human consumption. Following problem definition, a sampling strategy is required that will provide data that specifically address the problem or hypothesis. The subsequent stages involve field sampling, chemical analysis and data analysis, and it is in all of these field and laboratory operations that appropriate standards of QA/QC are required. Environmental monitoring, as a basis for environmental management, is undertaken by regulatory agencies, by industry, and under contract by commercial consultants. Chemical analyses are frequently sub-contracted to analytical laboratories, with such contracts often awarded on the basis of price rather than quality. Competition and cost-cutting by laboratories have an inevitable impact on the levels of QA/QC that are necessary to have confidence in their results. The cost of QA represents some 15–20% of the total cost. While most commercial analytical laboratories are fully conversant with, and hopefully apply, appropriate standards of QA/QC to their analyses, not all monitoring is undertaken by accredited commercial laboratories.
Indeed, it is entirely probable that a significant percentage of the environmental monitoring studies that are reported in the scientific literature do not have adequate levels of QA/QC. Monitoring data are assessed against benchmarks with reference to data from reference sites or conditions, as well as by comparison with accepted guidelines for water, sediment or food quality.

Fig. 1 Steps in a monitoring exercise.

The development of regulatory guidelines and standards relies upon published data from laboratory monitoring of human and environmental toxicology, which are also often the subject of less than satisfactory standards of QA/QC. Analytical chemistry is a major component of such toxicity testing, yet the rigorous protocols for both sampling and analysis that are required for field environmental monitoring are often absent. Data in toxicity databases are largely derived from published literature in which there is often little or no evidence of the extent of the reliability of the chemical analyses, or indeed of other QA aspects of the toxicity testing. As a first cut, any data that are based on nominal rather than measured toxicant concentrations should certainly be rejected. Beyond this, it is often impossible to ascertain the levels of QA/QC that are applied. This has important consequences: guidelines based on unreliable data can be, on the one hand, under-protective of ecosystems or, on the other, overprotective, with the result that there are added cost penalties on society to achieve these levels of protection. All of these factors highlight the need for greater attention to quality issues in all aspects of environmental measurement, in both laboratory and field. This will need to involve not only researchers, but also journal editors and reviewers, environmental agencies and environmental consultants. The cost penalty to industry, and ultimately the taxpayer, for decisions based on poor quality data can be large. This paper addresses those areas where vigilance is required to ensure acceptable standards of environmental monitoring data. It focuses on the QA that needs to be applied in a field monitoring exercise that involves trace metals in aquatic ecosystems, i.e. in waters, sediment and biota.
Although it deals with metals and not organic contaminants or nutrients, similar practices will apply to these analytes, as they do to ecotoxicity studies in both field and laboratory. It will be instructive for the reader to consider how satisfactorily such practices have been employed in studies with which they are familiar. Hopefully, journals publishing environmental monitoring data will start requiring evidence of acceptable QA/QC in published papers, so that in future the data can be used with confidence by environmental scientists.

Field Sampling

Field sampling is the first area in the monitoring exercise where QA is required (Kramer, 1994). The sampling will follow a design process, which considers the necessary spatial and temporal resolution, the number of samples and the number of replicates, to satisfactorily address the study objectives. Cost is often a limiting factor in designing sampling programs. Pooling of samples can be employed to reduce costs; however, a critical determinant is the statistical power of the design to address the problem questions (Green, 1979; Fairweather, 1991). Sampling program design has been thoroughly reviewed elsewhere (e.g. Maher, 1994; Mudroch and Azcue, 1995) and will not be discussed further. It should nevertheless be recognised that inadequate sampling design can result in a monitoring program whose findings do not adequately reflect the environmental risk associated with particular contaminants. In that sense, the quality of the program is jeopardised. In terms of obtaining a result that reflects the concentration of the contaminant in the natural environment, the sampling process itself can be a major source of errors, especially where trace contaminants are being measured. It is frequently argued that analytical chemists should be involved in the sampling process, especially of waters, because of their awareness of the problems of contamination and adsorptive losses of metals at trace and ultratrace concentrations (Batley, 1996). For sediments and biota, where metal concentrations are appreciably higher, the likely magnitude of errors when QA is absent will be less (Fig. 2). With water samples, contamination and/or adsorptive losses can occur through inappropriate sample container selection, inadequate container preparation, contamination from the sampling device during the sampling operation, or inappropriate methods of sample preservation and storage.
It is well recognised that polyethylene or Teflon containers are essential for trace metal studies (Batley and Gardner, 1977; Batley, 1989b); however, the need for rigour in container preparation and sampling is often underestimated. Ahlers et al. (1990) presented a most elegant demonstration of contamination problems during sample collection in studies of dissolved zinc in New Zealand waters.

Fig. 2 An example of possible errors in environmental monitoring.

Their container preparation protocol involved soaking Nalgene bottles in hot 50% nitric acid for two days, rinsing with high-purity water, then leaching in 1% nitric acid for two weeks. The bottles were then emptied and wrapped in two polyethylene bags, the inner being acid-cleaned. During the sampling operation, the bottles were sequentially unwrapped by two people wearing polyethylene gloves. The first 'dirty' assistant removed the outer bag and handed the bottle to the 'clean' assistant, who removed the inner bag and undertook the sampling. Bottles were then filled using an appropriately cleaned depth sampler, by hand immersion, or with the bottle immersed from the end of a pole (Batley, 1989a). Using a less rigorous bottle preparation and handling procedure, measured zinc concentrations showed extreme variability (Fig. 3). With the rigorous protocol, measured concentrations were almost a factor of three lower, and there was little difference between sample sites. Confidence in the latter zinc concentration values was reinforced by the fact that the smoother profile observed paralleled those for dissolved sodium and potassium concentrations. A similar protocol, combined with appropriate analytical rigour, applied to the analysis of dissolved copper in estuarine waters, resulted in significant improvement in the concentration vs. salinity plot, to the point where the classic conservative dilution of copper during estuarine mixing could be clearly demonstrated (Fig. 4).

Fig. 3 Concentrations of dissolved zinc in river water with and without a special sampling protocol (adapted from Ahlers et al., 1990).

To determine the extent of contamination or losses during field sampling, QA measures should include field blanks, container blanks and trip blanks. Field blanks are bottles filled with distilled water (or clean seawater) that are taken on the field trip and subjected to all of the operations applied to a sample container. Container blanks measure the contribution of the container to the same water. Trip blanks measure the effects that transportation of a blank sample has on its measured concentration. Typically, a field blank will encompass the contributions of the other two blank measurements, and only if it is high will the other measurements help identify the source of contamination. The spiking of samples in the field with key analytes will allow losses during transportation and storage to be identified. Sample replication will also be important to characterise the magnitude of sample (and analytical) variability. Other sources of error in field sampling include locational uncertainty (the same site must be able to be resampled), sample labelling and logging, and the recording of appropriate field observations and measurements. The latter should include a measure of temperature and other parameters, such as salinity (rainfall) and turbidity, that might affect levels of trace contaminants.
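The blank hierarchy described above lends itself to a simple diagnostic: the field blank is checked first, and the component blanks are consulted only if it is high. The sketch below illustrates this logic; the function name, concentration values and acceptance threshold are all hypothetical, not from the original study.

```python
# Sketch of field/container/trip blank interpretation, as described above.
# All concentrations (e.g. ng/l) and the threshold are hypothetical values.

def assess_blanks(field_blank, container_blank, trip_blank, threshold):
    """Return diagnostic messages for a set of blank results.

    The field blank normally encompasses the container and trip
    contributions; only when it is high do the component blanks
    help locate the source of contamination.
    """
    messages = []
    if field_blank <= threshold:
        messages.append("field blank acceptable; no further action")
        return messages
    messages.append("field blank high; checking component blanks")
    if container_blank > threshold:
        messages.append("container preparation is a likely source")
    if trip_blank > threshold:
        messages.append("transport/storage is a likely source")
    if container_blank <= threshold and trip_blank <= threshold:
        messages.append("field handling itself is the likely source")
    return messages

print(assess_blanks(field_blank=25.0, container_blank=2.0,
                    trip_blank=3.0, threshold=5.0))
```

In this hypothetical case the field blank is high while both component blanks are clean, pointing at the field handling itself.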

Fig. 4 Dissolved copper as a function of salinity in estuarine waters, (a) without, and (b) with rigorous sampling and analysis procedures.

Sample Storage

For trace metal analyses, waters are typically stored at 4°C, while sediments and biota are frozen. Freezing of water samples, particularly seawater, is not recommended, as this can lead to selective concentration of metals and possible losses during thawing (Batley, 1989a,b). Storage at higher temperatures can enhance bacterial growth in solution and on the container surfaces, resulting in losses of metals. Acidification of water samples will inhibit bacterial growth, but is only recommended if 'total' acid-soluble metals are being measured, because of the solubilisation of particulate metals. For filtered samples acidification is appropriate, unless speciation measurements are being undertaken. Losses from solution during the storage of filtered, unacidified natural water samples are often checked by tests using ionic metal spikes. It has been shown that such spikes do not represent the natural species of metals in most water samples and are indeed more prone to losses than natural colloidal and complexed metal species, thereby overestimating potential storage losses (Batley and Gardner, 1977). Ideally, analyses of unacidified samples should be undertaken as soon as possible after collection to minimise such problems. Excessive storage times, longer than several months, even at 4°C, are not recommended for unacidified samples, although acidified samples are likely to be stable for this period and longer. For sediments, frozen storage is preferable because at higher temperatures samples are more readily oxidised, resulting in changes in metal speciation (Kersten and Forstner, 1989). Biota samples are also better preserved by freezing.
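A storage-stability spike check of the kind described above reduces to a percent-recovery calculation: measure the sample before spiking, spike it, store it, and measure again. The sketch below uses hypothetical concentrations; bear in mind the caveat above that ionic spikes tend to overestimate losses of natural metal species.

```python
# Sketch of a storage-stability spike check: an ionic metal spike is
# added to a stored, filtered sample and its recovery computed after
# the storage period. All values are hypothetical.

def spike_recovery(unspiked, spiked, spike_added):
    """Percent recovery of the added spike (all in the same units, e.g. ug/l)."""
    return 100.0 * (spiked - unspiked) / spike_added

# e.g. 0.50 ug/l before spiking, and 1.35 ug/l measured after storage of
# a sample spiked with 1.00 ug/l:
recovery = spike_recovery(unspiked=0.50, spiked=1.35, spike_added=1.00)
print(f"{recovery:.0f}% recovery")  # low recovery would suggest storage losses
```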

Sample Preparation

Sample preparation is the necessary sample manipulation before the sample can be presented for instrumental analysis. In the case of waters, this generally involves filtration; for sediments and biota, homogenisation and sample digestion. Unless the membrane filter and filtration apparatus used for water samples are rigorously cleaned by soaking in dilute acid followed by distilled water, contamination can be a major problem. Appropriate QA to determine

the extent of contamination can involve the filtration of a standard reference water sample. Because of the high cost of such samples, a better approach is to measure the metal contents of successive aliquots of filtered samples. For ultratrace analyses, test filtrations of pristine seawater stock solutions are recommended to ensure an absence of contamination. In addition, there are quality issues related to the volumes of water being filtered. The effective filter pore size can change during the filtration of large volumes, especially if there are appreciable suspended solids (Batley, 1989a,b). If too high a pressure is used during pressure filtration, rupture of cells can also add intracellular metals to the dissolved fraction (Batley, 1989a,b). The whole question of the significance of 0.45 µm filtration as an arbitrary definition of the dissolved fraction is itself an issue. Following filtration, appropriate water sample preservation techniques are required to prevent further losses or changes. Where total dissolved metals are being analysed, the sample is usually preserved by acidification. Metal concentrations in the added acid will need to be determined to assess their significance. The preparation of sediment and biota samples typically involves homogenisation of the wet samples, after thawing. In some instances freeze drying or oven drying is carried out prior to homogenising. In both cases poor homogenisation is generally the major source of variability. Analysis of replicate samples will reveal the


extent of this. Errors in the drying stages include contamination or loss of metals by volatilisation. With sediment samples it is usual to undertake a coarse screening of the moist sample to remove large (>2 mm) particles, which can significantly bias analyses, especially where small sample masses are used. The remaining sample is then homogenised. It is usual to work with a moist sample because drying of the sample is likely to alter metal speciation (Kersten and Forstner, 1989). Additionally, drying may lead to losses of more volatile elements, a further source of errors. Homogenisation of dried samples is relatively easy after grinding with a mortar and pestle or in a mechanical mill. Wet sieving of moist samples to separate sediment on the basis of grain size has the potential both to contaminate the sample (from the sieve or the added water) and to leach metals from the sample (as a result of the water addition). The latter can be compensated for by separately analysing the water associated with each separated sediment fraction. Verifying the significance of contamination or losses will be part of the QA process. When working with moist samples, homogenisation becomes especially difficult. Separate aliquots of moist sample are taken for analysis and moisture content determination. This too is a source of error that requires the use of replicate samples. The moist sample must be carefully mixed before each aliquot is taken. With biota samples, particularly smaller organisms, there are many opportunities for errors. With oysters, for example, a rigorous sample preparation would involve cleaning the shell to remove excess sediment or other particulates that may contaminate the flesh. The flesh is then separated from the shell and excess fluid removed using a clean filter paper or tissue.
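The moist-sample bookkeeping described above implies a simple calculation: the moisture content determined on a separate aliquot converts a wet-weight analytical result to a dry-weight concentration. A minimal sketch, with hypothetical values:

```python
# Sketch of the wet-weight to dry-weight conversion implied above: a
# separate aliquot gives the moisture content, which corrects the
# analytical result. Concentration and moisture values are hypothetical.

def dry_weight_conc(wet_conc, moisture_fraction):
    """Convert a wet-weight concentration (e.g. mg/kg wet) to a dry-weight basis."""
    if not 0.0 <= moisture_fraction < 1.0:
        raise ValueError("moisture fraction must be in [0, 1)")
    return wet_conc / (1.0 - moisture_fraction)

# 12 mg/kg wet-weight zinc in a sediment aliquot with 60% moisture:
print(dry_weight_conc(12.0, 0.60))  # 30.0 mg/kg dry weight
```

The sensitivity of the result to the moisture value is one reason replicate aliquots, and careful mixing before each aliquot is taken, matter.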
Because metals are non-uniformly distributed in the body parts of most biota, including oysters, effective tissue homogenisation is important, especially if small sample masses are taken for analysis. Failure to remove the excess fluids in a wet-weight analysis of an oyster can also seriously bias the analytical result downwards for comparison against a guideline for human consumption. Similar constraints apply to other biota samples. Appropriate QA might include replication of analyses on a percentage of samples (10%). Given the high cost of analyses, duplication of all samples would generally be prohibitively expensive. Incomplete sample dissolution is a particular source of error in sediment and biota sample preparation. This is best identified by the analysis of certified reference materials (CRMs). It should be noted that the reference value may differ from the measured value unless 'total' dissolution procedures, for complete release of metals from particles, are used. For example, the US EPA Method 200.8 for the acid digestion of sediment samples, which involves refluxing the sediment in a mixture of dilute nitric and hydrochloric acids, gives incomplete

dissolution of a number of metals present in the mineral lattice of sediment CRMs. It is also notable that CRMs are usually dried powders and can therefore behave differently during acid digestion to moist sediment or biological tissue. For biota, the degree of complete release or mineralisation of metals that is required will depend on the method of analysis. For example, with spectrometric techniques involving the decomposition of the sample using flames, plasmas or furnaces, complete mineralisation is not required. Hydride generation followed by spectrometric detection, on the other hand, will require complete mineralisation to ensure complete reaction with added borohydride. Microwave digestion procedures are increasingly being used; however, their efficiency is a function of the time and power of the applied microwave signal. It is important to establish that the conditions chosen are sufficient to fully release all metals into solution. During routine analysis, a CRM should be included in each batch of samples. Reagent blanks are a further source of errors. Trace metal concentrations in any additions to the sample must be known. These can be determined by analysis of the stock solutions, or by measuring the effects of multiple additions of reagents to the sample. The latter procedure is only acceptable provided the measured response is not affected by any matrix changes (pH, ionic strength) that multiple additions might induce.

Quality Assurance in Sample Analysis

The sources of error in the chemical analysis of environmental samples are well understood (Quevauviller et al., 1995). These include: (i) the use of methods with inadequate detection limits; (ii) matrix interferences; (iii) measurements that are species specific; (iv) errors due to signal resolution; (v) measurement including a non-analyte response; and (vi) high background signals. Analytical detection limits for trace metals in natural waters have been dramatically lowered over the past 50 yr. This has resulted from a combination of more sensitive analytical instrumentation, cleaner laboratory environments, and the use of added reagents in lesser quantities and higher purity. This is reflected in historical data for lead in seawater (Batley, 1996) (Table 1). That there has not been an actual lowering of lead concentrations over the past 60 yr has been confirmed by the analysis of ice cores from Greenland (Wolff, 1990). Pre-industrial ice contained 1 ng Pb/kg, compared to 200 ng/kg in 1960–1970 and 28 ng/kg in 1983–1984. The above lead data beg the question: are the lowest measured concentrations always correct? In fact this is usually the case, although not

TABLE 1
Historical data for lead in seawater.

Year  Lead, µg/l  Separation                          Analytical method
1938  3.5–8       Evaporation, CuS coprecipitation    Spectrophotometry with dithizone
1940  5           Evaporation, HgS coprecipitation    Spectrographic
1957  4–9         Evaporation, PbS precipitation      Spectrophotometry
1958  <0.2        Dithizone extraction                Isotope dilution
1960  0.4–1.8     Dithiocarbamate extraction          Spectrophotometry with dithizone
1963  0.08–0.4    Dithizone extraction                Isotope dilution
1981  0.01        –                                   Differential pulse anodic stripping voltammetry
1995  0.009       APDC extraction                     Graphite furnace atomic absorption spectrometry
1995  0.009       –                                   Cathodic stripping voltammetry

necessarily so. It is important to demonstrate that adsorptive losses to surfaces have not occurred, and conversely, for higher results, that there has been no contamination. One of the biggest issues in ultratrace analysis is memory effects from samples containing high metal concentrations. This is particularly so with techniques such as anodic stripping voltammetry (ASV), and to a lesser extent with spectroscopic techniques such as inductively coupled plasma atomic emission spectrometry (ICPAES), ICP mass spectrometry (ICPMS) and graphite furnace atomic absorption spectrometry (GFAAS), that involve a high surface area in contact with the analyte solution. A sample containing analyte concentrations as little as an order of magnitude higher than those in a test sample, if analysed immediately prior to the test sample, will most likely increase the measured signal of the test sample above its true value, even though appropriate washing of the cell or nebuliser apparatus is conducted between measurements. It is important to avoid such high-concentration intrusions. For ultratrace concentrations, appropriate QA is to undertake the analysis of repeat aliquots of the sample until constant low signals are obtained. While it is typical to analyse samples in a random sequence, in ultratrace analysis it is desirable that the lowest-concentration samples be analysed first. Separating samples into expected concentration ranges is therefore an important part of QA/QC for ultratrace analysis. The avoidance of matrix interferences is critical in the trace analysis of waters. Small-volume standard additions to the analyte solution will avoid these interferences. The running of separate calibration standards that are not matrix-matched can be a source of significant errors. In all cases the use of single-point standard additions is to be avoided. The linearity of the concentration response in the vicinity of the measured concentration must be demonstrated.
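The multi-point standard-additions approach recommended above can be sketched as follows: a line is fitted to signal versus added concentration, and the original sample concentration is recovered from the x-axis intercept. The data values are hypothetical; linearity should be confirmed before the extrapolation is trusted.

```python
# Sketch of multi-point standard-additions calibration: fit signal vs
# added concentration, then extrapolate to the x-axis intercept to
# estimate the concentration in the unspiked sample. Values hypothetical.

def standard_additions(added_conc, signals):
    """Least-squares fit of signal vs added concentration.

    Returns (estimated sample concentration, slope, intercept).
    """
    n = len(added_conc)
    mx = sum(added_conc) / n
    my = sum(signals) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(added_conc, signals))
    sxx = sum((x - mx) ** 2 for x in added_conc)
    slope = sxy / sxx
    intercept = my - slope * mx
    return intercept / slope, slope, intercept

# Hypothetical data: sample alone, then three small spikes (ug/l):
conc, slope, intercept = standard_additions([0.0, 1.0, 2.0, 3.0],
                                            [10.0, 15.0, 20.0, 25.0])
print(conc)  # 2.0 ug/l estimated in the original sample
```

A single-point addition would give the same arithmetic but no check on linearity, which is why it is to be avoided.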
The measurement of metal concentrations, particularly with spectroscopic techniques, is typically a total metal measurement. Some methods, including ASV, are species-specific, and unless appropriate pretreatment and analysis conditions are used, less than a total response will result. In ASV, only labile metals, those that

are dissociated and deposited at a mercury electrode under the chosen operating conditions, are measured. Total metals can only be determined after converting bound metals to labile forms, usually by acidification and UV irradiation (Batley and Farrar, 1978). In the determination of arsenic by hydride generation followed by quartz furnace atomic absorption spectrometry, the efficiency of conversion of arsenic species to hydrides differs for As(III), As(V) and organoarsenic acids. This is sensitive to pH and other solution conditions, while arsenobetaine, trimethylarsine oxide, tetramethylarsonium ion and related compounds do not form hydrides. The measured total hydride-forming arsenic concentrations may therefore differ unless solution conditions are chosen where all of the arsenic species yield a similar response (Gunn, 1983). A critical aspect of the QA should be to check the response of all species, not just As(III). In addition, it should be ascertained whether complete mineralisation or merely dissolution is required. The former usually converts all species to a common form. The resolution of target analyte peaks from those of non-target metals or matrix components can be a significant source of errors in ASV, as well as in other techniques. Where possible, this must be overcome by changing solution conditions, e.g. pH and dissolved oxygen for ASV. In spectroscopic measurements, non-analyte contributions to the signal at the measurement wavelength must be overcome by choosing an alternative wavelength, or by using appropriate background correction procedures. High background signals are a further source of error. It is important to instrumentally minimise the size of the background signals, and to demonstrate that limits of determination appropriate to the objectives of the monitoring survey can be obtained in excess of a small background.
The limit of determination (LOD), or limit of quantitation, is considered a practical limit of measurement, generally defined as 10 times the standard deviation of the blank. The limit of detection is defined as 3 times the standard deviation of the blank. Internal standards, standard additions and CRMs form the major controls on QA in chemical analyses.
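The blank-based limits defined above reduce to a calculation on replicate blank measurements: 3 times their standard deviation for detection, 10 times for determination (quantitation). A minimal sketch with hypothetical blank values:

```python
# Sketch of the blank-based limits defined above: limit of detection as
# 3x, and limit of determination (quantitation) as 10x, the standard
# deviation of replicate blank measurements. Blank values hypothetical.
import statistics

blanks = [0.010, 0.012, 0.011, 0.009, 0.013, 0.011]  # blank results, ug/l
sd = statistics.stdev(blanks)  # sample standard deviation

limit_of_detection = 3 * sd
limit_of_determination = 10 * sd

print(f"limit of detection     = {limit_of_detection:.4f} ug/l")
print(f"limit of determination = {limit_of_determination:.4f} ug/l")
```

Results reported between the two limits are detectable but not reliably quantifiable.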


Internal standards are metals that are not expected to be present in the sample and which do not interfere with the quantification of the target analyte. They are typically used in chromatographic and mass spectrometric techniques, including ICPMS, to check the consistency of the analytical response. Analyte concentrations may be determined with increased precision by measuring the ratio of the analyte response to that of the internal standard. Internal standards can only be used for calibration purposes, however, if the concentration responses of both the standard and the target analyte are known. Standard additions are the most usual form of calibration, as discussed earlier, and are more reliable than separate standard solutions. The latter should only be used in an identical matrix, or where it has been validated that there are no matrix effects. To avoid errors, such additions should be made at concentrations less than an order of magnitude above the analyte concentration. It is of course assumed that the added standard is in the form of the species that is being measured. A major opportunity for errors arises where the standard addition interacts with a component of the sample matrix, distorting the response. Such an instance arises in stripping voltammetry where, because of the complexing ability of natural organics in a water sample, the expected labile metal signal is diminished as a result of complexation (Fig. 5). Only when the complexing capacity of the solution is exceeded is a linear concentration response for added metal seen. Using the initial slope, a gross overestimation of labile metal in the water sample is obtained. CRMs are the most reliable check on QA. The range of environmental CRMs is now extensive and it is generally possible to select ones that match both the matrix and concentration range of interest for waters, sediments, soils and biota.
They are available from a number of reputable sources including the National Institute of Standards and Technology (NIST) in the US, the National Research Council of Canada (NRC), and the International Atomic Energy Agency (IAEA).
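The internal-standard ratio approach described above can be sketched as a response-factor calculation: a calibration standard fixes the ratio of analyte to internal-standard response per unit concentration ratio, and unknowns are then quantified from their measured signal ratio. All names and numbers below are hypothetical.

```python
# Sketch of internal-standard quantitation as used in ICPMS or
# chromatography: the analyte response is ratioed to that of an added
# internal standard to cancel drift in instrument sensitivity.
# Signals and concentrations are hypothetical.

def quantify_with_is(analyte_signal, is_signal, response_factor, is_conc):
    """Analyte concentration from the analyte/internal-standard signal ratio.

    response_factor is (analyte signal / IS signal) per unit
    (analyte conc / IS conc), determined from a calibration standard.
    """
    ratio = analyte_signal / is_signal
    return (ratio / response_factor) * is_conc

# Calibration: 1.0 ug/l analyte with 10.0 ug/l IS gave signals 500 and 2000:
rf = (500 / 2000) / (1.0 / 10.0)              # response factor = 2.5
# Unknown sample, with the same 10.0 ug/l IS addition:
print(quantify_with_is(300, 2400, rf, 10.0))  # 0.5 ug/l
```

Because both signals drift together, the ratio is insensitive to changes in overall instrument sensitivity between runs.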

The preparation of CRMs is a specialised art. Attempting to prepare homogeneous spiked 'reference' samples as internal reference standards, or for interlaboratory analysis, is fraught with difficulties and should only be undertaken by experienced laboratories. Otherwise, deficiencies in CRM preparation can be unfairly attributed to poor analytical performance. It is generally accepted that a minimum of 10% of samples should be calibration or control checks. These checks should be within an agreed percentage of the certified value, which will depend on the concentrations being measured. Lower precision would be acceptable for ultratrace concentrations than for trace concentrations or above. Control charts are an important internal QA measure, and have been referred to as the heart of a QA/QC program (Nadkarni, 1991). The graphical representation of performance data as a function of time makes it easy to identify deviation from a state of control. Deviations outside of statistically derived limits are used to trigger remedial action. It is typical to show a bias line at 1σ, a warning line at 2σ and an action line at 3σ. Data near the warning line indicate that the system should be watched for possible loss of control. Data beyond the action line are out of control and require corrective action. A typical control chart for ASV measurements of the peak potential of a copper control highlights the potential drift and impending failure of a reference electrode (Fig. 6). Other checks on a laboratory's competence can be obtained through participation in interlaboratory trials. The results from the Quality Assurance of Information in Marine Environmental Monitoring in Europe (QUASIMEME) interlaboratory exercises featured in several issues of this journal (Volume 29, Nos. 4–5 and Volume 35, Nos. 1–6) reflect a general improvement in laboratory performance.
Such exercises should not, however, be viewed as a reliable indicator of a laboratory's capabilities for the analysis of environmental monitoring samples. It is highly likely that, because of the prestige associated with performing well in interlaboratory trials, the care taken in analysing such samples is considerably greater than would be applied to routine analyses.
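The control-chart logic described above amounts to classifying each control result against warning (2σ) and action (3σ) limits around the established mean. A minimal sketch, with hypothetical control values for a copper control:

```python
# Sketch of the Shewhart-style control-chart logic described above:
# each control result is judged against warning (2 sigma) and action
# (3 sigma) limits around the established mean. Values hypothetical.

def chart_status(value, mean, sigma):
    """Classify one control result against warning and action limits."""
    deviation = abs(value - mean)
    if deviation > 3 * sigma:
        return "action: out of control, corrective action required"
    if deviation > 2 * sigma:
        return "warning: watch for possible loss of control"
    return "in control"

# Copper control with established mean 10.0 ug/l and sigma 0.2 ug/l:
for result in (10.1, 10.5, 10.7):
    print(result, "->", chart_status(result, mean=10.0, sigma=0.2))
```

A run of results drifting toward the warning line, as in the reference-electrode example above, signals impending failure before the action limit is breached.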

Data Handling and Analysis

Fig. 5 Distortion of an ASV calibration graph for labile copper in a river water by copper-complexing dissolved organic matter.

Most analytical laboratories are aware of the errors that can be introduced during data management and analysis. Preliminary data management activities include ensuring appropriate precision, preliminary screening for outliers, and checking samples of concern. The generation of data involves the recording of a measurement response that can be related to concentration. A major error can arise during the measurement of peak heights or areas and is associated with baseline placement and peak resolution. Most instruments have software that draws the baselines on signal peaks. It is worth checking that this is being done correctly; the option for manual setting of baseline points is available on most instruments. Peak heights are measured in some instances, but their use assumes that peak shapes (half-widths) remain constant. While this is usually the case, it should be verified where heights are measured (e.g. in ASV) as a quality check; otherwise peak areas are a more reliable measure.

Fig. 6 Control chart for peak potential in ASV measurement of copper in river waters.

It goes without saying that there should be appropriate maintenance of laboratory records. These should enable the traceability of results from the final reports back through work books to the original samples or sampling records. Errors in calculation and data transcription are less common now that LIMS systems in many laboratories take data directly from the instrument to result sheets; nevertheless, appropriate quality checks are required to ensure that such errors do not occur.

Data interpretation is an equally important component of a monitoring exercise, in which measured contaminant concentrations are assessed in terms of their ecological or human health risk. In many studies, as indicated by Maher and Bean (1992), the necessary supporting information describing the sample is not recorded or has not been collected. In the case of biota, this might include the size, weight, age and sex of the species. In the interpretation of temporal trends in metal bioaccumulation, for example, incorrect conclusions could be reached unless specimens of equal size were being compared, or the effect of size on metal concentrations was known.
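The baseline-placement and peak-area measurement discussed earlier in this section can be illustrated with a short sketch. The function name and data are hypothetical; the baseline here is simply a straight line joining the endpoints of the peak window, which is one common convention instrument software applies:

```python
def peak_area(potentials, currents):
    """Trapezoidal area between a signal peak and a linear baseline
    drawn between the first and last points of the peak window."""
    n = len(currents)
    i0, i1 = currents[0], currents[-1]
    # Straight baseline interpolated between the window endpoints
    baseline = [i0 + (i1 - i0) * k / (n - 1) for k in range(n)]
    signal = [c - b for c, b in zip(currents, baseline)]
    # Trapezoidal rule, tolerant of uneven spacing along the x-axis
    area = 0.0
    for k in range(n - 1):
        dx = potentials[k + 1] - potentials[k]
        area += 0.5 * (signal[k] + signal[k + 1]) * dx
    return area

# Symmetric triangular peak on a flat baseline (illustrative values)
print(peak_area([0.0, 1.0, 2.0], [0.0, 1.0, 0.0]))  # 1.0
```

Because the area is computed above the fitted baseline, it is insensitive to changes in half-width that would bias a peak-height measurement, which is why areas are the more reliable measure when peak shape cannot be assumed constant.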

Conclusion

Although the need for appropriate QA/QC in environmental analysis is well understood, other components of the environmental monitoring program represent potentially greater sources of error, especially where trace and ultratrace contaminant concentrations are being studied. Such programs are costly, and even more so are the management actions that may be required to remediate a contaminated environment. It is the responsibility of the environmental scientist to ensure that such actions will be based on quality data,

reliably interpreted. It will be equally important that journal editors, reviewers and researchers ensure that published results include evidence of appropriate QA/QC, so that users of the data can have confidence in their reliability.

The author is grateful to Associate Professor Bill Maher and Dr Simon Apte for helpful comments.

References

Ahlers, W. W., Reid, M. R., Kim, J. P. and Hunter, K. A. (1990) Contamination-free sample collection and handling protocols for trace elements in natural fresh waters. Australian Journal of Marine and Freshwater Research 41, 713–720.
Batley, G. E. and Gardner, D. (1977) Sampling and storage of natural waters. Water Research 11, 745–756.
Batley, G. E. (1989a) Collection, preparation and storage of samples for speciation analysis. In Trace Element Speciation, Analytical Methods and Problems, ed. G. E. Batley, pp. 1–24. CRC Press, Florida.
Batley, G. E. (1989b) Physicochemical separation methods for trace element speciation in aquatic samples. In Trace Element Speciation, Analytical Methods and Problems, ed. G. E. Batley, pp. 43–76. CRC Press, Florida.
Batley, G. E. (1996) Analysis and environmental solutions. Chemistry in Australia 63, 164–166.
Batley, G. E. and Farrar, Y. J. (1978) Irradiation techniques for the release of bound heavy metals in natural waters and blood. Analytica Chimica Acta 99, 283–292.
Fairweather, P. G. (1991) Statistical power and design requirements for environmental monitoring. Australian Journal of Marine and Freshwater Research 42, 555–567.
Green, R. H. (1979) Sample Design and Statistical Methods for Environmental Biologists. Wiley, New York.
Gunn, A. M. (1983) An automated hydride generation atomic absorption spectrometric method for the determination of total arsenic in raw and potable waters. Water Research Centre Technical Report TR 191, p. 31.
Kramer, K. J. M. (1994) What about quality assurance before laboratory analysis? Marine Pollution Bulletin 29, 222–227.
Kersten, M. and Förstner, U. (1989) Speciation of trace elements in sediments. In Trace Element Speciation, Analytical Methods and Problems, ed. G. E. Batley, pp. 245–318. CRC Press, Florida.
Maher, W. A. and Bean, W. (1992) A database for published measurements of trace metal concentrations in Australian marine organisms. In Proceedings of a Bioaccumulation Workshop, ed. A. G. Miskiewicz, pp. 273–288. Australian Marine Sciences Association, Sydney.
Maher, W. A., Cullen, P. W. and Norris, R. H. (1994) Framework for designing sampling programs. Environmental Monitoring and Assessment 30, 139–162.
Mudroch, A. and Azcue, J. M. (1995) Manual of Aquatic Sediment Sampling. Lewis Publishers, Florida.

Nadkarni, R. A. (1991) The quest for quality in the laboratory. Analytical Chemistry 63, 675A–682A.
Quevauviller, P., Maier, E. A. and Griepink, B. (1995) Quality assurance for environmental analysis. In Quality Assurance for Environmental Analysis, eds P. Quevauviller, E. A. Maier and B. Griepink, pp. 1–25. Elsevier, Amsterdam.
Wells, D. E., Aminot, A., Boer, J., Cofino, W., Kirkwood, D. and Pedersen, B. (1997) A review of the achievements of the EU project 'QUASIMEME' 1993–1996. Marine Pollution Bulletin 35, 3–17.

Wells, D. E. and Cofino, W. P. (1997) Developments in quality assurance in marine environmental monitoring. The QUASIMEME II laboratory performance studies and EU quality assurance of sample handling (QUASH) project. Marine Pollution Bulletin 35, 146–155.
Wolff, E. (1990) Evidence of historical changes in global metal pollution: snow and ice samples. In Heavy Metals in the Marine Environment, eds R. W. Furness and P. S. Rainbow, pp. 205–217. CRC Press, Florida.
