ATSR infrared radiometric calibration and in-orbit performance


Dave Smith a,⁎, Chris Mutlow a, John Delderfield a, Bob Watkins b, Graeme Mason c

a STFC, Rutherford Appleton Laboratory, Chilton, Didcot, OX11 0QX, United Kingdom
b Department of Atmospheric Oceanic and Planetary Physics, University of Oxford, Parks Road, Oxford, OX1 3PU, United Kingdom
c ESA, ESRIN, Via Galileo Galilei, Casella Postale 64, 00044 Frascati, Roma, Italy

⁎ Corresponding author.

Article history: Received 26 July 2010; Received in revised form 25 January 2011; Accepted 29 January 2011; Available online 30 June 2011.

Keywords: ATSR; ATSR-2; AATSR; Along-Track Scanning Radiometer; Calibration; Radiometer; Sea Surface Temperature; Blackbody; Thermal Infrared

Abstract

Three Along Track Scanning Radiometers (ATSRs) form a series of space-borne instruments specifically optimised to provide accurate remotely sensed measurements of Sea Surface Temperature (SST), which is a key geophysical parameter required to inform the debate on climate change and global warming. These sensors' well-calibrated, high-quality data have wide applicability and are being used in a much wider range of Earth observation studies and applications, in addition to the planned SST mission. Each successive instrument has been an incremental improvement over its predecessor. Since early 1991 the ATSR sensors have provided global observations from the European Space Agency's Earth Observation satellites; namely ATSR-1 on ERS-1 (European Remote-sensing Satellite), then ATSR-2 on ERS-2 and then AATSR (Advanced ATSR) on ENVISAT. The missions have been operated with good overlaps between successive sensors; AATSR is currently the operational instrument. The fundamental requirement for each ATSR instrument is a design that is capable of delivering absolutely calibrated infrared data; therefore this paper concentrates on how the sensors provide the calibrated radiometric observations required for the SST retrieval algorithms to work. It does not discuss the validation of the algorithms used to produce SST from these basic observations. Described are the rigorous pre-launch measurements over a range of simulated flight environments which verify that this aim has been achieved, a calibration which exercises the same brightness temperature algorithm that is then used when calibrating in-flight measurements. Crucially, the ATSRs measure calibration source radiances in flight without interrupting Earth viewing, which permits continuous gain and offset monitoring and calibration. Where it is thought necessary for a data user's better understanding of the calibration of flight data, some detail is provided concerning the differences between the three ATSRs.

Crown Copyright © 2011 Published by Elsevier Inc. All rights reserved.

1. Introduction

Each ATSR senses a ±250 km ground swath symmetrically arranged across the sub-satellite track by scanning four single-pixel, spatially co-registered channels (centred at 1.6, 3.7, 10.8 and 12 μm) that are recorded by cryogenic detectors within its Focal Plane Assembly (FPA). For ATSR-2 and AATSR an additional three shorter wavelength channels (at 0.55, 0.65 and 0.85 μm) are included; these are recorded with ambient-temperature silicon detectors. They are housed in their own compartment, as part of an enhanced FPA, and optically configured to use the full aperture of the other four channels and share the same common, single-pixel field-stop. This paper concentrates on the thermal infrared channels, which are selected to match atmospheric transmission "windows" at centre wavelengths of 3.7, 10.8 and 12 μm.

All three spacecraft are in sun-synchronous orbits with a near 3-day repeat cycle and approximately 14 orbits per day. Each ATSR, with its combined day and night coverage and 500 km swath width, provides virtually complete global coverage every three days, except for the two polar regions excluded by the orbit's inclination and the impact of cloud coverage. Note that coverage refers here to data collection rather than retrieved SST, since clouds are opaque in the infrared and SST can only be sensed through cloud-free atmospheric columns. At the sub-satellite point, each sensor's Instantaneous Field of View (IFOV) is 1 km square and data are sampled on a nominal 1 km square grid. ATSRs produce observations of the surface at two angles over their ground swath, one viewing nearly vertically through the atmosphere and the other viewing obliquely. (Because ATSRs are conical scanning radiometers, the term "swath" is also used to describe the telescope viewing vectors that are swept out by the scan mirror, as opposed to a temporally combined ground coverage strip.) To meet the scientific goals over the dynamic range used for SST, radiances reaching the satellite must be measured to an absolute accuracy equivalent to better than 0.1 K in brightness temperature for a stated spectral response. Also, to avoid noise multiplication in the retrieval algorithm, the calibrated, single-pixel, random radiometric noise must be less than 0.08 K at 3.7 μm and 0.05 K at 11 μm and 12 μm, for a scene temperature of 265 K (relaxed to 270 K for AATSR).



2. Instrument outline and scanning geometry

The reader can find detailed descriptions of the ATSR-1 and -2 instruments in Edwards et al. (1990) and Stricker et al. (1995). The instruments have a small IFOV so they can achieve good image quality with a single f/5 off-axis parabolic mirror feeding a single on-axis field-stop at prime focus. An inclined scan mirror covers the telescope's slightly diverging beam, and it is rotated at 400 rpm with a constant angular velocity vector parallel to this beam's primary ray to generate a scanned cone. The angle of incidence at the scan mirror is ~11.7° and four times this, the full scan cone angle, is ~47°. With its minimised optical surfaces this configuration is well suited to the IR, and the scan plus paraboloid mirrors only need to be cooled to ~−10 °C for their emitted photon noise to fit within its allocation in the noise budget. Low angles of incidence and high mirror reflectivity mean that these Earth-imaging fore-optics are essentially non-polarising.

A conical scanner offers a number of advantageous features:

• Automatic generation of two Earth-viewing swaths.
• Space between these two swaths to accommodate full-aperture in-flight calibration sources that are viewed completely each scan, without discarding any useful Earth views or requiring any additional optical components (Fig. 1).
• Chopped operation compatible with maintaining IR offset calibration.
• Both the Earth views are calibrated with reference to the same pair of blackbodies using the same electro-optical system, removing the possibility of relative calibration errors when combined in the SST retrieval.

It is of the essence in the ATSRs' design that all the optical elements are used with constant areas and at the same angles for collecting both IR calibration data and each of the dual-swath Earth signals. Clearly, as no additional optical elements are required for calibration, no artefacts are introduced by extra optics, by area/angle variations, or by operating cycle timing changes; the same is not true of all radiometer configurations (e.g. MODIS; Barnes et al., 1998).

Fig. 1. The ATSR scan pattern showing the positions of the on-board calibration targets and Earth views.

3. Calibration principles

The IR radiometric calibration concept inherent in the ATSR design is that two on-board blackbody sources, at differing temperatures optimised to cover the important range of Earth scene temperatures, are used to establish two reference radiances. These reference signals are combined with other characterisation data determined by pre-launch calibration to derive the IR radiometric response of the sensor continuously in time for scene pixels around the scan. Built into the instrument design are several features that make it possible to simplify the calibration schema and reduce the number of assumptions that need to be made in the process. Of these the key features are:

• Cooled fore-optics to minimise any background signals and ensure any small residual signals remain stable over each "scan cycle" (i.e. during sampling of the two Earth view swaths and two calibration sources).
• The same optical path used to view the Earth and the on-board sources.
• Stray-light baffles and other stray-light control features.
• Use of a blackbody calibration system employing two blackbodies at temperatures covering the cold and hot extremes of SST. (This significantly reduces the effects of system non-linearity in the range of expected SST temperatures compared to those obtained by instruments which calibrate using a "zero" point (offset) derived from viewing cold space.)

Hence the measured signal C_scene for a scene of radiance L_scene can be expressed as:

C_scene = gain · L_scene + offset.    (1)

This is known as the "ATSR Radiometric Model". Using this model (excluding for the moment any detector non-linearity) we can then determine the gain and offset for each spectral channel by using two sources of known radiance carried into orbit. The ATSRs have a cold and a hot blackbody, at temperatures T_cold and T_hot respectively, that provide the known calibration radiances L_cold,λ and L_hot,λ. These temperatures have been chosen to span the range of expected SST, and to minimise the need for extrapolation and therefore the effects of non-linearity over the range from 260 K to 300 K. To convert from temperature to radiance, and vice versa, we integrate the Planck function with the spectral response as a function of wavelength, R_λ(λ), to give the integrated in-band radiance as:

L_λ(T) = ∫ R_λ(λ) B(λ, T) dλ,    (2)

where for a perfect blackbody with emissivity ε = 1.0, the spectral radiance B(λ, T) at wavelength λ emitted by a blackbody at temperature T, per unit area, per steradian, per second, per unit wavelength is given by:

B(λ, T) = 2hc² / [λ⁵ (exp(hc/λk_b T) − 1)],    (3)

where h is Planck's constant, c is the velocity of light and k_b is Boltzmann's constant. The radiometric calibration is dependent only on the radiances determined from the on-board calibration sources and does not require an absolute knowledge of the AΩ (throughput). The ATSRs work because the calibration sources are located ahead of all the optical components and are sampled as part of the normal measurement cycle, so that the complete electro-optical chain is calibrated. However, note that it is essential that there be no observable contribution to the measured radiance from other sources, and so stray-light control and entrance baffle clearances on optical beams are vital factors in the instrument design and performance verification.
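To make the two-point calibration scheme of Eqs. (1)–(3) concrete, the sketch below implements it in Python under stated assumptions: the spectral response is a crude illustrative top-hat rather than a real ATSR response, the blackbody temperatures and counts are placeholder values, and the function names are ours, not those of the operational processor.

```python
import numpy as np

# Physical constants (SI units)
H = 6.62607015e-34   # Planck constant (J s)
C = 2.99792458e8     # speed of light (m/s)
KB = 1.380649e-23    # Boltzmann constant (J/K)

def planck(wav_m, temp_k):
    """Planck spectral radiance B(lambda, T), Eq. (3), in W m^-2 sr^-1 m^-1."""
    return 2.0 * H * C**2 / wav_m**5 / (np.exp(H * C / (wav_m * KB * temp_k)) - 1.0)

def in_band_radiance(temp_k, wav_m, response):
    """Integrated in-band radiance L(T) = integral of R(lambda) B(lambda, T), Eq. (2).

    Simple rectangle-rule integration on a uniform wavelength grid.
    """
    dlam = wav_m[1] - wav_m[0]
    return np.sum(response * planck(wav_m, temp_k)) * dlam

def two_point_calibration(counts_cold, counts_hot, rad_cold, rad_hot):
    """Gain and offset of the linear model C = gain * L + offset, Eq. (1)."""
    gain = (counts_hot - counts_cold) / (rad_hot - rad_cold)
    offset = counts_cold - gain * rad_cold
    return gain, offset

# Illustrative 10.8 um channel: crude top-hat spectral response (placeholder only)
wav = np.linspace(10.0e-6, 11.6e-6, 400)
resp = ((wav > 10.3e-6) & (wav < 11.3e-6)).astype(float)

rad_cold = in_band_radiance(263.0, wav, resp)   # floating (cold) blackbody, example temperature
rad_hot = in_band_radiance(302.0, wav, resp)    # heated (hot) blackbody, example temperature

# Placeholder blackbody counts; in flight these come from the two calibration views each scan
gain, offset = two_point_calibration(1200.0, 3100.0, rad_cold, rad_hot)

# A scene pixel's counts are then converted to a linearly calibrated in-band radiance
scene_radiance = (2450.0 - offset) / gain
```

Because the two reference radiances bracket the expected SST range, this linear inversion needs little extrapolation; the remaining non-linearity is handled by the correction described next.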


Non-linearity is an expected characteristic of the photoconductive HgCdTe detectors for the 10.8 μm and 12 μm channels (Baker et al., 1978). Essentially, the electron–hole recombination rate increases as the number of carriers (electrons and holes) increases; the result is a fall-off in the detector's response as the photon flux increases. The approach for determining and correcting for the effects of non-linearity was established for ATSR (Mason, 1991) and has been used for the other two sensors. Here the non-linearity is expressed as the fractional fall-off in response, f_λ(T), compared to the linear response function, such that

f_λ(T) = z_0,λ + z_1,λ (L_λ(T)/L_λ(320 K)) + z_2,λ (L_λ(T)/L_λ(320 K))².    (4)

This fall-off shape is applied within the calibration algorithm by considering a corrected radiance, L′_λ(T), where:

L′_λ(T) = f_λ(T) L_λ(T).    (5)
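A minimal sketch of how the fall-off parameterisation of Eqs. (4) and (5) might be applied, assuming the z coefficients and the 320 K reference radiance are already known for a channel; the numerical values below are placeholders, not flight coefficients.

```python
def falloff(radiance, radiance_320k, z0, z1, z2):
    """Fractional fall-off in response f(T) relative to a linear response, Eq. (4)."""
    r = radiance / radiance_320k
    return z0 + z1 * r + z2 * r * r

def corrected_radiance(radiance, radiance_320k, z0, z1, z2):
    """Corrected radiance L'(T) = f(T) * L(T), Eq. (5)."""
    return falloff(radiance, radiance_320k, z0, z1, z2) * radiance

# Placeholder coefficients and radiances for illustration only; the real values are
# channel-specific and come from the low-radiance (LN2) runs described in Section 5.3.
rad_320 = 9.5                                               # in-band radiance at 320 K (arbitrary units)
corrected = corrected_radiance(6.2, rad_320, z0=1.00, z1=0.02, z2=-0.06)
```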

The dates of the calibration campaigns for the three instruments were as follows:

ATSR: Oxford University, April 1989 to July 1989
ATSR-2: Oxford University, November 1992 to March 1993
AATSR: RAL, October 1997 to December 1997 and October 1998 to December 1998¹

¹ The AATSR calibration campaign was repeated after some rework on the instrument to rectify a technical problem that had been identified during the first test campaign.

4. On-board blackbody sources

The on-board blackbody sources are described in detail in Mason et al. (1996); two of these highly accurate blackbody sources are built into each ATSR, and provide the basis for in-flight absolute calibration of the thermal infrared channels. During every scan, each of the two blackbodies is viewed in turn by all of the spectral channels, to provide the cold and hot reference radiances. One blackbody temperature 'floats' cold at the optimised temperature of the instrument fore-optics enclosure, whilst the other is heated at a constant power to provide a hot reference. The blackbodies and optical system are designed such that the instrument's full optical beam has an unobstructed view of the base-plate and is not clipped by any aperture or the blackbody cavity walls. Each base-plate has been designed to maintain a uniform temperature. The cavities provide a very high effective emissivity (ε > 0.999) by a combination of Martin Marietta black coating and a re-entrant cone base geometry. The temperatures of the blackbody bases are measured with high-accuracy precision platinum resistance thermometers (PRTs). At unit level, these are calibrated with their flight electronics against a transfer-standard PRT traceable to ITS-90.

Even the high emissivity of ε = 0.999 is not unity, so the blackbody radiances will have a small reflected component originating from the instrument optical enclosure, which itself behaves as a blackbody source. The calibrated radiance from a blackbody target is:

L_λ(T_bb) = ε_λ ∫ R_λ(λ) B(λ, T_bb) dλ + (1 − ε_λ) ∫ R_λ(λ) B(λ, T_inst) dλ.    (6)

Eq. (6) can be used in conjunction with Eqs. (1), (2) and (3) to provide all the information needed to convert measured signal counts to a linearly calibrated radiance.
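The sketch below evaluates Eq. (6) for an on-board blackbody, reusing the planck()/in_band_radiance() helpers from the Section 3 sketch; the emissivity and temperatures are illustrative values only, not the measured figures for any particular unit.

```python
def onboard_bb_radiance(t_bb, t_inst, emissivity, wav_m, response):
    """Effective blackbody radiance, Eq. (6): the emitted term plus the small
    reflected contribution from the instrument enclosure at temperature t_inst."""
    return (emissivity * in_band_radiance(t_bb, wav_m, response)
            + (1.0 - emissivity) * in_band_radiance(t_inst, wav_m, response))

# Example: a hot blackbody at 302 K inside a ~263 K (-10 degC) fore-optics enclosure
rad_hot_bb = onboard_bb_radiance(302.0, 263.0, emissivity=0.999, wav_m=wav, response=resp)
```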

5. Pre-launch radiometric calibration

The principal steps in the pre-launch calibration applied to the thermal infrared channels of each of the three ATSR instruments are:

• Measurement of the spectral response for each channel at FPA sub-system level.
• Operation of the instrument in a simulated flight environment with its hot blackbody power set to give the required temperature differential with respect to the ambient (cold) one.
• Verification of the "on-board" radiometric calibration over the full range of scene temperatures to be measured in flight (i.e. between 210 K and 315 K).
• Measurement of system non-linearity and formulation of a parameterisation to correct it.
• Assessment of the measurement noise and derivation of the equivalent brightness temperature difference.
• Confirmation of radiometry around the scanned swaths to establish that it is free from around-scan effects.
• Inversion of the derived calibration algorithms to show they fit back to the raw results.
• Investigation of radiometric performance under different general thermal conditions.
• Verification that the algorithms return accurate radiometry during simulated orbital transient thermal conditions.

5.1. Test facilities' description

For infrared radiometric calibration it is essential that the thermal environment be well controlled and accurately monitored. This was achieved by performing tests in a vacuum chamber with the instrument under test surrounded by temperature-controlled panels, allowing it to operate at temperatures close to those expected from the thermal model at flight conditions. To accomplish this goal, four main thermal zones in the test facility were controlled. These were an "Earth-shine" plate (ESP), which was used to simulate radiation from the Earth; a "Payload Electronics Module (PEM) simulator", which mimicked the interfacing panel of the spacecraft; a "cold box", to provide a uniform space-temperature environment around the instrument; and a "drum baffle", to shield the instrument from stray radiation from the chamber walls. The ESP was used to support the external calibration targets, which could be rotated about the instrument's scan cone, nominally set to be the tank's axis.

ATSR and ATSR-2 were calibrated in a purpose-built facility at the Department of Atmospheric Oceanic and Planetary Physics (DAOPP) at Oxford University (Mason, 1991). Although of the same focal length and aperture, AATSR was a larger instrument than ATSR and ATSR-2, and could not be fitted within the Oxford test facility. The AATSR calibration activities were therefore performed in the larger Space Test Chamber (STC) at the Rutherford Appleton Laboratory, retaining the original design concepts and philosophy developed for the Oxford testing, and a subset of the test equipment. This is described in Smith (1999) and Smith et al. (2001).

5.2. External blackbody sources

The external blackbody sources used for the pre-launch calibration of all three ATSR instruments were designed and built by the UK Meteorological Office to provide a scene radiance corresponding to an accurately known temperature. Using the same sources and procedures in ground testing has provided continuity of calibration between the different sensors. The requirement on these blackbodies is to provide a reference source for the scene radiance known to an uncertainty equivalent to better than 0.1 K, an error that is apportioned as follows:

• The errors caused by the emissivity not being unity must be less than 0.035 K.
• The temperature of the target must be known to better than 0.02 K.
• The temperature differences across the target must be less than 0.02 K.
• The temperature fluctuations over 5 min must be less than 0.01 K.

Table 1. Calculated and measured emissivities of external blackbodies.

          Calculated            Measured              Difference
3.7 μm    0.99899 ± 0.00035     0.99911 ± 0.00055     0.00012
11 μm     0.99847 ± 0.00036     0.99870 ± 0.00040     0.00023
12 μm     0.99871 ± 0.00037     0.99871 ± 0.00032     0.00000

Table 2. Percentage fall-off at 310 K for all ATSR instruments.

           ATSR     ATSR-2     AATSR
10.8 μm    3.0%     4.2%       3.5%
12.0 μm    4.2%     3.9%       7.2%


For ATSR testing, the operating temperatures of each target were measured using six Rosemount E109 100 Ω platinum resistance thermometers (PRTs). Four sensors were mounted in the baseplate in probes that ensured good thermal contact with the blackbody; of the other two sensors, one was mounted halfway up the baffle and the other was positioned near the aperture plate. However, as there were questions about the self-heating in these PRTs, for ATSR-2 they were replaced by 27 Ω rhodium–iron resistance thermometers (RIRTs) supplied by Oxford Instruments; these were demonstrated to be accurate to within ±0.01 K, with self-heating of less than 0.001 K when measured by an AC resistance bridge.

As is the case for the instrument's internal blackbodies, the radiance emanating from the external calibration units is derived using the formulation in Eq. (6). The emissivity of each target was calculated using a geometric model and the spectral emissivity of the paint (Table 1; Mason, 1991). These values were validated by comparison against the ATSR-2 and AATSR on-board blackbodies (see Section 5.4).

5.3. Test procedure

Radiometric calibration and instrument performance verification was performed by measuring the signal channel responses over a range of preset, stable external blackbody target temperatures from 210 K to 310 K.

Fig. 2. Results from the radiometric calibration at the centre of the nadir view for BOL thermal balance conditions for ATSR (top), ATSR-2 (middle) and AATSR (bottom). Each plot shows, as a function of target temperature, the differences between the brightness temperature as measured by the instrument and the actual target brightness temperature as measured by the resistance thermometers.


Tests were performed under different thermal conditions for the instrument, to establish the stability and robustness of its outputs. In practice, only one of the targets was changed during a calibration run (usually the nadir target) whilst the other (the along-track target) was maintained at a fixed temperature for reference (Table 2). The initial detailed calibration runs were undertaken at the centre of the nadir view, with the nadir target temperature being moved at 5 K intervals from 210 K to 315 K. Once the initial capability of the instrument was established, the remaining calibration tests, covering the full range of viewed scan swath angles, were performed using a reduced set of target temperatures at 10 K intervals between 240 K and 310 K. The purpose of these tests was to measure any scan-dependent variations; the calibration was verified at different positions around the scan to cover all nadir and along-track pixels. A subset of these tests was carried out holding both external targets at a constant temperature of 240 K and cycling the Earth-shine plate temperature, to provide a more stringent test that the baffles were contributing negligible stray radiation.

Measurements were also taken with an external target at approximately the same temperature as each of the on-board blackbodies, to verify calibration at points where no correction for non-linearity was needed. For these tests the fixed target was usually set to 280 K, being approximately mid-way between the temperatures of the two on-board blackbodies. To ensure that the overall radiometric uncertainty from the sources was below 0.04 K, measurements were collected only when both target base-plates were drifting at rates below 0.02 K over 5 min and the temperature differences across the targets were below 0.02 K.

To verify the stability of the sensor over its lifetime, as well as in normal daily operation, each radiometric test was executed with ATSR and the test environment in thermal equilibrium at differing thermal conditions: these included the conditions expected at beginning of life (BOL) and end of life (EOL), as well as simulated orbital cycles. The orbital simulations were performed with the nadir target at 240 K, 275 K and 310 K for two orbits each, and with the along-track target maintained at 280 K.

Low-radiance measurements were performed by cooling the along-track target with liquid nitrogen.

Fig. 3. Brightness temperature errors with the external blackbodies set at the same temperatures as the on-board blackbodies for ATSR (left), ATSR-2 (centre) and AATSR (right). Data for the +XBB are shown as red diamonds and the −XBB as blue triangles.


Fig. 4. NEΔT as a function of target temperature for ATSR (top), ATSR-2 (middle) and AATSR (bottom) at BOL thermal conditions.

When the target temperature had reached 96 K, the LN2 supply was disconnected and the blackbody was allowed to warm up slowly, with measurements taken at 1 K intervals. These measurements were used to determine the non-linearity (Table 2) and to provide the coefficients for the corrections.

5.4. Summary of results

The main pre-launch calibration results and analyses are described in Mason (1991), Smith et al. (1993) and Smith (1999). A summary of the results at BOL conditions for each ATSR instrument is shown in Fig. 2. The calibration error is defined as the difference between the brightness temperature of the external targets as measured by ATSR and the actual target brightness temperature as determined from the temperatures measured by the resistance thermometers.

Where the external blackbody temperature is close to that of an on-board blackbody, errors due to non-linearity become insignificant and only residual errors remain. So for each of the calibration runs, measurements were taken with the external blackbody temperature matched to the on-board target temperatures to investigate these residuals, without the need to allow for detector non-linearity. The differences between the brightness temperatures and the actual target temperatures, for all tests at the BOL thermal environment, are shown in Fig. 3.

As well as establishing the radiometric bias errors, the radiometric noise was measured as a function of scene temperature.

The instrument noise is expressed as the noise equivalent brightness temperature difference, NEΔT, and is given by:

NEΔT_λ = A_1,λ ΔC_λ (∂L_λ/∂T)⁻¹,    (7)

where ΔC_λ is the signal channel noise, taken to be the standard deviation of the blackbody pixel counts. Fig. 4 shows the NEΔTs for each instrument plotted as a function of target temperature for the BOL calibrations at the centre-of-nadir view. Formally, the requirement for NEΔT is set for a scene temperature of 270 K, and the measurements compared against the requirement at this temperature are given in Table 3. In all cases, the measured noise was within the requirement and there is good agreement across instruments.
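Eq. (7) can be evaluated as sketched below, reusing the in_band_radiance() helper and the gain from the Section 3 sketch. Here the A_1,λ ΔC_λ term is interpreted as the count noise converted to radiance (ΔC/gain in the Eq. (1) convention), and the numerical values are illustrative.

```python
def nedt(counts_std, gain, temp_k, wav_m, response, dt=0.1):
    """Noise equivalent brightness temperature difference, Eq. (7).

    counts_std : standard deviation of the blackbody pixel counts (Delta C)
    gain       : counts per unit in-band radiance, so counts_std / gain is the
                 radiance noise (playing the role of A_1 * Delta C in Eq. (7))
    """
    # Numerical derivative of the in-band radiance with respect to temperature
    dl_dt = (in_band_radiance(temp_k + dt, wav_m, response)
             - in_band_radiance(temp_k - dt, wav_m, response)) / (2.0 * dt)
    return (counts_std / gain) / dl_dt

# Noise at the 270 K reference scene temperature used for the requirement in Table 3
print(nedt(counts_std=3.2, gain=gain, temp_k=270.0, wav_m=wav, response=resp))
```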

Table 3. NEΔT (K) of IR channels at a target temperature of 270 K.

           Requirement     ATSR      ATSR-2     AATSR
3.7 μm     0.080           0.046     0.051      0.037
10.8 μm    0.050           0.023     0.021      0.025
12.0 μm    0.050           0.031     0.024      0.023


It should be noted that the apparent rise in NEΔT as the target temperature decreases is purely related to ∂L/∂T as a function of scene temperature. For the 11 μm and 12 μm channels the dominant noise source is detector noise, which remains almost constant for all scene temperatures. At 3.7 μm, pre-amplifier noise dominates at low photon fluxes (T_scene < 250 K), whilst at higher photon fluxes the noise becomes dominated by statistical photon signal noise.

6. In-orbit performance and monitoring

To state the obvious, once in orbit there is no longer access to an independent absolute calibration facility external to the instruments. Nevertheless, careful monitoring of the in-orbit trends described in this section contributes towards confidence in the radiometric measurements.

6.1. Cooler performance

During in-flight commissioning, the ATSR-1 Stirling cycle cooler was seen not to cool the FPA as well as had been achieved during ground calibration, i.e. to run at a stable FPA temperature of ~80 K it needed to be run at much higher amplitudes. For ATSR on ERS-1, the Experiment Interface Document with the satellite specified that ATSR had a good view factor to space from its anti-Earth side. In addition, spacecraft panels had been built with extensions so as to prevent direct sunlight reaching this side of the instrument, on which the cooler and FPA are situated. The UK calibration facility was implemented on such a basis, and the Stirling cycle cooler easily cooled the FPA down to 80 K with mechanism amplitude to spare; ATSR was then calibrated. However, neither of the spacecraft interface provisions was met. At spacecraft-level testing the cooler body was getting too hot, and a heat-pipe solution was successfully retro-fitted. However, in flight it was soon apparent that the cooler was running at much higher amplitudes than during instrument calibration and struggling to reach 80 K. To prevent premature ageing of the mechanism and maximise the lifetime of the mission, the decision was taken to modify the operations, and the cooler was operated at fixed amplitude levels rather than with active control. Thus the orbital temperature cycling of the cryogenic elements was not constrained, although the average temperature was maintained by occasional manual updates to the commanded cooler amplitudes and frequency.

Fig. 5. Mission trends of the IR detector temperature (top), commanded gains (middle) and offsets (bottom) for ATSR (left), ATSR-2 (centre) and AATSR (right). The gains and offsets are shown for 12 μm (red), 11 μm (green) and 3.7 μm (blue).


Initially, at the start of the mission, the aim was to keep orbital temperature peaks below 90 K, but after a couple of years this was relaxed to 100 K. Care was taken to minimise excursions above this temperature in order to keep the 12 μm detector in the operating region where its HgCdTe long-wavelength responsivity cut-off did not significantly alter SST retrievals. Later in the mission, even 100 K was not always achievable on a continuous basis and retrievals had to take spectral response changes into account. Optimising cooler operation was the most challenging part of running ATSR in flight, and unavoidably the dataset has variable noise performance and, later in the mission, some variation in spectral response. One factor that helped late in the ATSR mission was that the instrument was de-hibernated at roughly 3-month intervals for just over a day, giving data overlap periods with ATSR-2, with the result that the FPA was always clean and needed minimum cooling.

It is known that the response non-linearity is dependent on the detector temperatures. This was characterised pre-launch at the nominal operating temperature of 80 K and at an elevated temperature of 90 K, with the results then fitted to theory. Although there was a small change in non-linearity, with the calibration algorithm applied the differences in the measured brightness temperatures between 260 K and 300 K were within the ±0.1 K requirement, and it was concluded that the same temperature-to-radiance look-up tables were valid for all anticipated detector temperatures.
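The look-up-table approach mentioned above can be illustrated with a short sketch: tabulate the in-band radiance over the expected temperature range once, then interpolate in either direction. This reuses the in_band_radiance() helper and the illustrative channel from the Section 3 sketch; the grid spacing and temperature range are arbitrary choices.

```python
# Temperature-to-radiance look-up table for one channel, and its inverse by interpolation
temps = np.arange(200.0, 330.0, 0.1)                             # K
lut = np.array([in_band_radiance(t, wav, resp) for t in temps])  # monotonic in T

def bt_to_radiance(temp_k):
    """Brightness temperature to in-band radiance via the LUT."""
    return np.interp(temp_k, temps, lut)

def radiance_to_bt(radiance):
    """Calibrated in-band radiance to brightness temperature by inverse interpolation."""
    return np.interp(radiance, lut, temps)

bt = radiance_to_bt(scene_radiance)   # scene_radiance from the Section 3 sketch
```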


The detectors for the 12 μm channel were specified to have a long-wavelength cut-off so that, over the normal operating range up to 90 K, their change in temperature would not affect the spectral response. The pre-launch measurements of noise performance showed the expected worsening between 80 K and 90 K, over which range the spectral response was taken as fixed. However, eventually an increase in detector temperature will affect the 12 μm spectral response, and in orbit it became apparent that the detectors would have to operate at up to 100 K. So additional post-launch measurements were performed on detectors from the same batch to characterise the spectral response changes that came into play at these elevated temperatures. These measurements were used in the adjustment of instrument products when running at higher temperatures.

To counter these problems, ATSR-2 on ERS-2 had many details altered in its thermal blanket and finish. Crucially, RAL designed a low-emissivity radiation shield around the inside cryogenic part of the FPA assembly such that, even with the now correctly understood spacecraft interfaces (essentially unchanged from ERS-1), the load on its cooler was below the specified 400 mW. Then the cooler, unchanged from ERS-1, successfully maintained the IR FPA at 80 K for the whole mission, conferring stable noise and spectral performance.

Other design changes were made for ATSR-2 to minimise the cooler noise pick-up that was a problem on ATSR-1; the detector signal preamplifier was relocated away from the cooler and plugged directly into the FPA, which also removed the need for a heavily screened lead.

Fig. 6. Mission trends of the NEΔTs for ATSR (left), ATSR-2 (centre) and AATSR (right). Hot BB signals are shown in red diamonds and cold BB signals are shown in blue triangles.


In flight the ATSR-2 cooler behaved extremely well, with no discernible degradation in its behaviour or performance throughout the instrument's 13-year operating lifetime; the ATSR-2 cooler remained perfectly functional even though the instrument's mission was terminated because the scan mirror mechanism failed!

The AATSR cooler is a much bigger mechanism, provided by British Aerospace Bristol (now EADS Astrium UK) and integrated into a sub-system with its drive electronics by RAL. It has performed in an exemplary fashion throughout the mission lifetime to date and is maintaining the infrared focal plane assembly at its target temperature of 80 ± 0.5 K during nominal operations. A cool-down from ambient to 80 K typically takes approximately 6 h and there are no signs of any degradation in performance.

A significant influence on the AATSR cooler performance has been the build-up of water ice contamination on the outer surfaces of the FPA cryogenic components. Some level of water ice contamination was expected because the FPA is vented and carbon fibre structures are used for the instrument and spacecraft. These are known to absorb water vapour whilst on the ground, which then desorbs on orbit and can condense on cold surfaces below 150 K. The effect of the water ice condensation is to increase emissivity and hence increase the thermal loading on the FPA. To maintain the FPA at 80 K, the cooler control software compensates by increasing the cooler drive levels. The cooling performance is recovered after the FPA has been decontaminated by allowing it to warm to ambient for 48 h. As the water vapour surrounding the spacecraft slowly disperses, the rate at which the cooler drive levels and temperatures rise has gradually decreased after each successive cool-down. For ATSR-2 the contamination rate was estimated to be 0.1 μm per day during commissioning, which decreased to 0.01 μm per day after the first year of operation; this meant that it was only necessary to decontaminate at ~6-month intervals. However, the deposition rate for AATSR was an order of magnitude higher than observed at the start of the ATSR-2 mission, which resulted in a more rapid build-up of water ice contamination. This means that regular outgassing at ~4-month intervals is necessary to prevent the layer thickness from reaching excessive levels and having a detrimental effect on the optical and thermal performance. Although contamination does affect the AATSR optics, the overall instrument radiometric performance was not affected, since the on-board sources provide continuous calibration. Also, despite the increased water ice contamination, the AATSR Stirling cycle cooler has been operating well within its operational limits with no sign of degradation, and has not needed any adjustment to its software configuration since launch.

6.2. Throughput gain

In the ATSRs, with their d.c.-coupled signal channels, an IR Auto-Gain-Offset (AGO) loop optimally maintains each detector signal within its 12-bit digitisation range. The loop compares the measured signal channel counts for the hot and cold blackbodies against a nominal characteristic of counts versus temperature. From this it steps towards improved commanded pairs of gain and offset values and maintains the required overall brightness temperature range (illustrated schematically in the sketch below). Therefore the trends of the gain provide an inverse indication of the variation of total ATSR sensitivity over time (Fig. 5).

Some months into operations, the ATSR 3.7 μm channel failed. This has been a one-off failure in the series for which no proven cause has been established. The effect was to temporarily remove a valuable source of night-time SST information (daytime SSTs were unaffected because the 3.7 μm channel is not used during the day, due to the presence of reflected solar energy in the measured signals).
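The stepping logic of the AGO loop is not spelled out here, so the following is only a schematic illustration of the idea, with made-up target counts and unit steps: compare the hot and cold blackbody counts against their target positions in the 12-bit range and nudge the commanded gain and offset accordingly.

```python
def ago_step(counts_cold, counts_hot, target_cold, target_hot, cmd_gain, cmd_offset, step=1):
    """One schematic auto-gain-offset step (illustrative only, not the flight algorithm).

    The spread between the hot and cold blackbody counts drives the commanded gain;
    their midpoint drives the commanded offset, keeping both signals well inside the
    12-bit (0-4095) digitisation range.
    """
    spread, target_spread = counts_hot - counts_cold, target_hot - target_cold
    if spread > target_spread:
        cmd_gain -= step          # signal spread too wide: reduce electronic gain
    elif spread < target_spread:
        cmd_gain += step          # spread too narrow: increase gain
    mid, target_mid = 0.5 * (counts_hot + counts_cold), 0.5 * (target_hot + target_cold)
    if mid > target_mid:
        cmd_offset -= step        # signals sitting too high in the range
    elif mid < target_mid:
        cmd_offset += step
    return cmd_gain, cmd_offset

# Example: blackbody counts drifting high in the range are stepped back towards the targets
new_gain, new_offset = ago_step(900, 3600, target_cold=800, target_hot=3300,
                                cmd_gain=64, cmd_offset=128)
```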

Fig. 7. Fourier transform spectrum of ATSR blackbody noise signals. This analysis was performed over 512 scans and after removal of background drifts due to the temperature variations of the blackbodies. It should be noted that because the cooler frequency is 43 Hz and the scan frequency is 6.7 Hz, the noise signal is strongly aliased.


Table 4. Typical blackbody thermometer readings for AATSR taken on 3 June 2002. The top row shows the average of the five baseplate sensors (PRT1–PRT5). Note that PRT6 is the baffle temperature and is not used in the average. The difference column shows the differences between the individual sensor readings and the average temperature. The final column is a typical reading from the pre-launch calibration in December 1998.

−XBB temperatures (K)
                       Reading     Difference     Pre-launch
Baseplate average      301.522     –              293.527
PRT1                   301.513     −0.009         −0.009
PRT2                   301.518     −0.004         −0.002
PRT3                   301.526     0.004          0.002
PRT4                   301.525     0.003          0.001
PRT5                   301.530     0.008          0.006
PRT6 (baffle sensor)   301.905     0.383          0.391

+XBB temperatures (K)
                       Reading     Difference     Pre-launch
Baseplate average      262.897     –              252.773
PRT1                   262.898     0.001          0.001
PRT2                   262.899     0.002          0.000
PRT3                   262.897     0.000          0.000
PRT4                   262.892     −0.005         −0.001
PRT5                   262.897     0.000          −0.002
PRT6 (baffle sensor)   262.882     −0.015         −0.017

Fig. 8. Mission trends for ATSR (left), ATSR-2 (centre) and AATSR (right) of the daily averages of the + XBB and –XBB baseplate mean temperatures, and the differences from the mean of the individual sensor readings.


For ATSR, the main factor affecting the long-term gain stability was the FPA temperature. As this tended to increase over time, the commanded gains were gradually increased by the AGO loop to compensate for the decrease in the radiometric response. With the FPA/IR detector temperatures having remained stable throughout the ATSR-2 and AATSR missions, we note that the AGO commanded gains after each outgassing have remained stable, which would also indicate that there has been negligible degradation of any instrument component, including the optics.

6.3. Radiometric noise

In flight, the radiometric noise is monitored using the two on-board blackbody signals. The mission trends of the NEΔTs for each instrument are presented in Fig. 6. For ATSR, the NEΔT trend of the 11 and 12 μm channels increases over the mission lifetime. This is due to the noise signal being amplified as the electronic gain was increased to compensate for the loss of radiometric response. When considering the raw detector signals in volts, the noise signals actually remained stable, suggesting that the detectors and preamplifiers have not degraded in performance. Again for ATSR, there is an additional significant noise contribution from magnetic cooler drive pick-up. This signal can be observed by careful inspection of ATSR images over uniform scenes and is clear in Fourier analysis of the blackbody signals (see Fig. 7). Due to the cooler operating frequency, the signal is almost an odd number of half cycles per scan and drifts slowly in relation to the scan cycle. This signal was not present (or at least insignificant) on ATSR-2 or AATSR.

For ATSR-2, we note that the noise performance derived from the on-board blackbodies appears variable over the mission, as it can be subtly affected by scan mirror rotation instability. All ATSRs use two integrators for alternate pixels so as to maximise noise performance. Because the integrators have marginally different characteristics for the same digitally commanded settings, the data from odd and even pixel integrators are calibrated separately. Normally there are exactly 2000 pixels during the 150 ms scan period, and so odd/even pixels keep fixed locations in a scan. However, when a scan jitter occurs it is possible to get 2001 or 1999 pixels per scan, so the locations of odd/even pixels are swapped. As the software used to produce this figure does not track the details of this effect, such a scan mirror jitter gives an apparent increase in the radiometric noise. However, as there is no long-term drift, it can be inferred that the detector and optics performance has remained stable throughout the mission.

The AATSR noise performance has remained very stable throughout the mission. Occasional outliers can be observed in the trends (Fig. 6), but these can be traced to the start and finish of outgassing cycles, when the detectors are not at their nominal operating temperature.

6.4. Blackbody radiances

As described, each blackbody has multiple temperature sensors, each of which has its own precision amplifier before their signals are multiplexed. Therefore consistency within a blackbody's temperatures builds confidence. Table 4 shows such a set of in-orbit readings compared against measurements taken during the pre-launch calibration.

Fig. 9. Blackbody temperatures and IR channel blackbody signals for the AATSR blackbody cross-over test performed on 21 April 2009.

Table 5. Results from the ATSR-2 blackbody cross-over test performed on 21 and 22 April 2004.

           1995 results            2004 results
           +XBB        −XBB        +XBB        −XBB
3.7 μm     −0.0047     0.0024      0.0115      −0.0138
11 μm      −0.0027     0.0010      0.0049      −0.0060
12 μm      −0.0015     0.0009      0.0058      −0.0069

Although the in-flight readings are warmer by about 10 K, the differences between the individual sensor readings and the baseplate averages are well maintained. Fig. 8 suggests that neither the temperature differences across the blackbodies nor the relative calibrations of the PRTs have changed, linking well to the pre-launch temperature baselines.

It is also possible in flight to compare the signals as measured radiometrically from both blackbodies, in what has come to be known as a "blackbody cross-over test", by switching the heaters over between the +XBB and −XBB blackbodies (allowing the temperatures to "cross over" and stabilise, and then reversing the settings back to normal). The basic idea is to compare the radiometric signals in the thermal channels when the two blackbodies are at identical indicated temperatures. Any significant difference would imply a drift in the blackbody thermometer calibration, an around-scan stray, or a change in one emissivity caused by a deterioration of the black surface finish. To draw conclusions from these data one has to take account of some esoteric blackbody data collection timings and rates of change of temperature. The blackbody temperatures and radiometric signals during a typical blackbody cross-over test for AATSR are shown in Fig. 9.

For ATSR-2, the test was performed during the commissioning phase and repeated on 14 and 15 July 2004. For the second test the on-board tape drives of ERS-2 had not been functioning since July 2003, and hence science data were only available when the spacecraft was in range of the Kiruna ground station. To ensure that the cross-over occurred during the 10 min of Kiruna visibility, it was therefore necessary to schedule the commands using timings calculated from the 1995 test results, a challenge that was successfully met. Results from ATSR-2 commissioning showed that there were no significant differences between the two on-board blackbodies. The 2004 results show that, although there has been some drift, the relative radiometric differences between the ATSR-2 blackbodies were less than 0.05 K (Table 5).

For AATSR, the test was performed during commissioning and is now performed roughly annually. The results for AATSR indicate that, relative to each other, the brightness temperature errors from the blackbodies are less than 10 mK at 11 and 12 μm, and below 20 mK at 3.7 μm. Comparing with earlier measurements (Fig. 10), it can be seen that the 11 and 12 μm channels are stable over time, whilst there appears to be a very slow increase in the 3.7 μm channel of approximately 6 mK over 7 years.


Even for the trend in this channel, the one with the lowest blackbody emissivity, the apparent brightness temperature difference is still much smaller than the radiometric noise. It should be noted that the test is a comparison of one blackbody against the other, on the assumption that the reference is stable, and therefore does not provide an absolute calibration of the blackbodies. Although this is not a direct verification of the overall radiometric error, the results, combined with the verified consistency of the blackbody thermometry and signal channel gains and offsets (Fig. 5), suggest that the radiometric performance is being maintained.

The blackbodies were developed for ATSR (Mason et al., 1996). As part of this process extremely demanding experiments were performed directly measuring emissivity. Also, the thermometry was checked for stability over temperature cycling and for extended timelines. With this bedrock it is hardly surprising that ATSR radiometry is good. However, although in flight the cross-over tests exercise much of the overall radiometry function of the instruments, even they are clearly not an independent absolute radiance re-calibration. There are at least two further techniques that have been employed on the ATSRs' IR channels. First, radiances as received at the satellite can be correlated between satellites, especially when they orbit in tandem. Alternatively, the SST/surface radiance product itself can be validated in campaigns involving satellite overpasses. Many campaigns have been performed to support the validation of satellite SSTs that demonstrate that the ATSRs' radiometric performance is achieved. Examples of ATSR validation results can be found in Corlett et al. (2006) and O'Carroll et al. (2006).

Fig. 10. Blackbody temperature uncertainties from the AATSR cross-over tests.

7. Conclusions

By viewing external blackbody sources and operating the instruments in vacuo under flight conditions, including simulated orbits, the pre-launch calibration activities of the ATSR series have clearly demonstrated the infrared radiometric accuracies to be better than 0.05 K, traceable to ITS-90. The accuracy is maintained around the full instrument swath and under different thermal conditions. Such calibration tests also demonstrate that the radiometric noise performance for each instrument was below 0.05 K in all channels at a reference temperature of 270 K.

In-orbit monitoring has verified that the on-board radiometry, including the calibration sources, has remained stable throughout the mission lifetimes. This is confirmed by the blackbody cross-over tests performed for ATSR-2 and AATSR, which imply that any drift of the thermometry and/or emissivity of the two on-board sources was below 0.02 K.

For the early ATSR, the main factor affecting the scientific performance has been the gradual increase in detector temperature during the mission due to the factors discussed.




This aside, all analyses of the data are consistent with the ATSR series having provided state-of-the-art, stable, precision radiometry throughout their missions.

Acknowledgements

The authors would particularly like to thank Jack Abolins, John Wright and Brian Maddison of RAL; Gary Corlett and David Llewellyn-Jones from Leicester University; Francois Bousquillon de Freschville, Serge Paturaud, Hugues Dufort and Jean Noel Berger of ESOC; Phillipe Goryl, Wolfgang Lengert and Pascal Lecomte of ESRIN; Miguel Canella from ESTEC; and many others for their inputs over the years, without whom this work would not have been possible. ATSR and ATSR-2 were originally funded by the Science and Engineering Research Council (SERC), with responsibility being transferred to the Natural Environment Research Council (NERC). AATSR was funded by the UK Department for Environment, Food and Rural Affairs (DEFRA) and now the UK Department of Energy and Climate Change (DECC). ATSR and ATSR-2 data are provided courtesy of the European Space Agency (ESA), NERC, the British National Space Centre (BNSC) and RAL. AATSR data are provided courtesy of the NERC Earth Observation Data Centre (NEODC), the UK Department of Energy and Climate Change (DECC) and ESA.

References

Baker, I. M., Capocci, F. A., Charlton, D. E., & Wotherspoon, J. T. M. (1978). Recombination in cadmium mercury telluride photodetectors. Solid-State Electronics, 21, 1475–1480.

Barnes, W. L., Pagano, T. S., & Salomonson, V. V. (1998). Prelaunch characteristics of the Moderate Resolution Imaging Spectroradiometer (MODIS) on EOS-AM1. IEEE Transactions on Geoscience and Remote Sensing, 36(4), 1088–1100.

Corlett, G. K., Barton, I. J., Donlon, C. J., Edwards, M. C., Good, S. A., Horrocks, L. A., et al. (2006). The accuracy of SST retrievals from AATSR: An initial assessment through geophysical validation against in situ radiometers, buoys and other SST data sets. Advances in Space Research, 37(4), 764–769.

Edwards, T., Browning, R., Delderfield, J., Lee, D. J., Lidiard, K. A., Milborrow, R. S., et al. (1990). The Along Track Scanning Radiometer measurement of sea-surface temperature from ERS-1. Journal of the British Interplanetary Society, 43, 160–180.

Mason, G. (1991). Test and calibration of the Along Track Scanning Radiometer, a satellite-borne infrared radiometer designed to measure sea surface temperature. D.Phil. thesis, University of Oxford.

Mason, I. M., Sheather, P. H., Bowles, J. A., & Davies, G. (1996). Blackbody calibration sources of high accuracy for a spaceborne infrared instrument: The Along-Track Scanning Radiometer. Applied Optics, 35, 629–639.

O'Carroll, A. G., Saunders, R. W., & Watts, J. G. (2006). The measurement of the sea surface temperature by satellites from 1991 to 2005. Journal of Atmospheric and Oceanic Technology, 23(11), 1573–1582.

Smith, D. L. (1999). AATSR Infra-Red Radiometric Calibration Report. AATSR Project Document PO-RL-RAL-AT-0024.

Smith, D. L., Delderfield, J., Drummond, D., Edwards, T., Mutlow, C. T., Read, P. D., et al. (2001). Calibration of the AATSR instrument. Advances in Space Research, 28, 31–39.

Smith, D. L., Watkins, R. E. J., & Corney, D. C. (1993). Test and Calibration of the Along-Track Scanning Radiometer-2: Final Report. ATSR-2 Project Document ER-RP-OXF-AT-2001.

Stricker, N. C. M., Hahne, A., Smith, D. L., Delderfield, J., Oliver, M. B., & Edwards, T. (1995). ATSR-2: The evolution in its design from ERS-1 to ERS-2. ESA Bulletin, 83, 32–37.