A Systematic View of Remote Sensing


CHAPTER 1

OUTLINE

1.1. Introduction
1.2. Platform and Sensor System
    1.2.1. Geostationary Satellites
    1.2.2. Polar-orbiting Satellites
    1.2.3. U.S. Operational Missions
    1.2.4. Sensor Types
    1.2.5. Data Characteristics
        1.2.5.1. Spatial Resolution
        1.2.5.2. Spectral Resolution
        1.2.5.3. Temporal Resolution
        1.2.5.4. Radiometric Resolution
1.3. Data Transmission and Ground Receiving System
1.4. Data Processing System
    1.4.1. Radiometric Calibration
    1.4.2. Geometric Processing
    1.4.3. Image Quality Enhancement
    1.4.4. Atmospheric Correction
    1.4.5. Image Fusion and Product Integration
1.5. Mapping Category Variables
1.6. Estimating Quantitative Variables
    1.6.1. Forward Radiation Modeling
        1.6.1.1. Scene Generation
        1.6.1.2. Surface Radiation Modeling
        1.6.1.3. Atmospheric Radiative Transfer
        1.6.1.4. Sensor Modeling
    1.6.2. Inversion Methods
        1.6.2.1. Statistical Analysis and Machine Learning Techniques
        1.6.2.2. Optimization Algorithms
        1.6.2.3. Look-up Table Algorithms
        1.6.2.4. Data Assimilation Methods
1.7. Production, Archiving, and Distribution of High-level Products
1.8. Product Validation
1.9. Remote Sensing Applications
1.10. Concluding Remarks

Advanced Remote Sensing DOI: 10.1016/B978-0-12-385954-9.00001-0


Copyright © 2012 Elsevier Inc. All rights reserved.


1. A SYSTEMATIC VIEW OF REMOTE SENSING

Abstract

This chapter provides an overview of the remote-sensing system, including the platform and sensor system; the data transmission and ground receiving system; the processing system for the radiometric and geometric properties of the data; the analysis system for mapping category variables and generating high-level products of quantitative variables; the product production, archiving, and distribution system; the product validation system; and remote-sensing applications. It aims to present a complete picture of the state of the art of remote sensing techniques by linking the chapters in the rest of the book and filling in any gaps.

1.1. INTRODUCTION

We are living in a world whose population is rapidly increasing, depleting natural resources, and experiencing the possible consequences of human-induced climate change. Our ability to meet these challenges partially depends on how well we understand the Earth system and use that information to guide our actions. Remote sensing is a tremendous source of information needed by policy makers, resource managers, forecasters, and other users, and it has become increasingly vital for the effective and sustainable future management of the Earth.

A remote sensing system consists of instrumentation, processing, and analysis designed to measure, monitor, and predict the physical, chemical, and biological aspects of the Earth system. Sophisticated new technologies have been developed to gather vast quantities of data, and the mathematical and physical sophistication of the techniques used to process and analyze the observed data has increased considerably.

The first chapter of the book aims to link these diverse components to paint a full picture of a remote sensing system, as illustrated in Figure 1.1. It starts with a brief introduction to the platform and sensor system for acquiring data, and then moves on to the data transmission and ground receiving system, the processing system for handling the geometric and radiometric properties of data, the analysis system for extracting information on both category and numerical variables of the Earth surface environment, the product generation and distribution system, the product validation system, and end-user applications. Applications largely define the data acquisition system, and end-users often need to validate the products to quantify their errors and uncertainties.

FIGURE 1.1 Key components of the remote sensing system.

1.2. PLATFORM AND SENSOR SYSTEM

The data acquisition system mainly consists of the sensor and the platform on which the sensor resides. The platform may be on the surface, in the air, or in space. A surface platform may be a ladder, tower, cherry-picker, crane, tall building, or scaffolding that provides data used primarily for validation. Aerial platforms include aircraft and balloons. Unmanned aerial vehicles (UAVs) are increasingly being used for remote sensing. Spaceborne platforms are mainly satellites and space shuttles. Only satellite sensor systems (both geostationary and polar-orbiting) are considered in the rest of this section.

1.2.1. Geostationary Satellites

A geostationary satellite is in a geostationary orbit, which can only be achieved at an altitude very close to 35,786 km (22,236 miles) and which keeps the satellite fixed over one longitude at the equator. The satellite appears motionless at a fixed position in the sky to ground observers. There are several hundred communication satellites and several meteorological satellites in such an orbit. Figure 1.2 illustrates a few typical meteorological satellites in geostationary orbit relative to the polar-orbiting satellites.

FIGURE 1.2 Illustration of the distribution of a few common geostationary satellites compared to the polar-orbiting satellites.

U.S. operational weather satellites include the Geostationary Operational Environmental Satellites (GOES) for short-range warning and "nowcasting," primarily to support National Weather Service (NWS) requirements. The procurement, design, and manufacturing of GOES are overseen by the National Aeronautics and Space Administration (NASA), while all operations of the satellites once in orbit are handled by the National Oceanic and Atmospheric Administration (NOAA). Before launch, GOES satellites are designated by letters (-A, -B, -C, ...). Once a GOES satellite is launched successfully, it is redesignated with a number (-1, -2, -3, ...). Normally, two GOES satellites are operational. Information on the GOES series is given in Table 1.1. The third generation of GOES, the new GOES-R satellite series planned for launch in 2015, represents a significant improvement in spatial, temporal, and spectral observations (several orders of magnitude) over the capabilities of the currently operational GOES series.

European operational missions are currently operated by the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT). EUMETSAT's geostationary satellite programs include the earlier Meteosat system (up to Meteosat-7) and three Meteosat Second Generation (MSG) satellites (MSG-1, -2, -3, or Meteosat-8, -9, -10). A fourth MSG satellite is being considered. The MSG satellites carry an impressive pair of instruments: the Spinning Enhanced Visible and InfraRed Imager (SEVIRI), which observes the Earth in 12 spectral channels and provides image data every half hour, and the Geostationary Earth Radiation Budget (GERB) instrument supporting climate studies. The Meteosat Third Generation is in the planning stage and is expected to become operational in the 2020s.

The Japanese Geostationary Meteorological Satellite (GMS) series had five satellites from 1977. The Multifunctional Transport Satellites (MTSAT) are the successors to the GMS 1-5 series. The launch of MTSAT-1 failed in 1999, but MTSAT-1R was successfully launched in 2005. After five years of service in space, it was replaced by MTSAT-2 in 2010.
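The geostationary altitude quoted above follows directly from Kepler's third law: a circular orbit whose period matches one sidereal day must have a semi-major axis of about 42,164 km, i.e., roughly 35,786 km above the equator. A quick numerical check (the gravitational parameter and Earth radius are standard textbook values, not taken from this chapter):

```python
import math

MU_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
R_EQUATOR = 6378.137e3      # Earth's equatorial radius, m
T_SIDEREAL = 86164.1        # one sidereal day, s

# Kepler's third law for a circular orbit: T = 2*pi*sqrt(a^3/mu).
# Solve for the semi-major axis a, then subtract Earth's radius.
a = (MU_EARTH * T_SIDEREAL**2 / (4 * math.pi**2)) ** (1 / 3)
altitude_km = (a - R_EQUATOR) / 1000

print(round(altitude_km))  # ~35786 km
```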
China has launched six satellites of the first generation of geostationary meteorological satellites, named Fengyun (FY-2), from FY-2A to FY-2F, since 1997. The second generation of geostationary meteorological satellites, FY-4, is under development.

1.2.2. Polar-orbiting Satellites

Polar-orbiting satellites can provide an observational platform for the entire planet surface, while their geostationary counterparts are limited to approximately 60 degrees of latitude at a fixed point over the Earth. Polar-orbiting satellites are able to circle the globe approximately once every 100 minutes. Most polar-orbiting Earth observation satellites, such as Terra, ENVISAT, and Landsat, have an altitude of about 800 km. They are in sun-synchronous orbits, passing directly over a given spot on the ground at the same local time. A relatively low orbit allows instruments aboard a polar-orbiting satellite to detect and collect data at a higher spatial resolution than from a geostationary satellite.

TABLE 1.1 Information on the GOES Satellite Series

Satellite | Launch day | Status
1 | Oct. 16, 1975 | decommissioned
2 | June 16, 1977 | decommissioned
3 | June 16, 1978 | decommissioned
4 | Sept. 9, 1978 | decommissioned
5 | May 22, 1981 | deactivated on July 18, 1990
6 | April 28, 1983 | decommissioned
G | May 3, 1986 | failed to orbit
7 | April 28, 1987 | used as a communications satellite
8 | April 13, 1994 | decommissioned
9 | May 23, 1995 | decommissioned
10 | April 25, 1997 | decommissioned
11 | May 3, 2000 | in operation as GOES-West
12 | July 23, 2001 | standby, providing coverage for South America
13 | May 24, 2006 | in operation as GOES-East
14 | June 27, 2009 | on-orbit storage; will replace GOES-11
15 | March 4, 2010 | standby
GOES-R | scheduled for launch in 2015 |

NASA has launched a series of polar-orbiting satellite missions with the ability to characterize the current state of the Earth system. The missions fall into three types: exploratory, operational precursor and technology demonstration, and systematic. Exploratory missions are meant to yield new scientific breakthroughs. Each exploratory satellite project is expected to be a one-time mission that delivers conclusive scientific results addressing a focused set of scientific questions. In some cases, an exploratory mission may focus on a single pioneering measurement that opens a new window on the behavior of the Earth system. These missions are managed in the NASA Earth System Science Pathfinder (ESSP) program. Examples include the Gravity Recovery and Climate Experiment (GRACE) and CloudSat. GRACE data can be used for estimating soil moisture and surface/underground water (Section 21.3).

Operational precursor and technology demonstration missions enable major upgrades of existing operational observing systems. NASA is investing in innovative sensor technologies and developing more cost-effective versions of its pioneering scientific instruments that can be used effectively by operational agencies. An example is the NMP EO-1 (New Millennium Program Earth Observing-1) mission launched on November 21, 2000, which includes three advanced land imaging sensors and five revolutionary crosscutting spacecraft technologies. The three sensors led to a new generation of lighter-weight, higher-performance, and lower-cost Landsat-type Earth surface imaging instruments. The hyperspectral sensor Hyperion is the first of its kind to provide images of the land surface in more than 220 spectral bands.

Systematic missions provide systematic measurements of key environmental variables that are essential to specify changes in forcings caused by factors outside the Earth system (e.g., changes in incident solar radiation) and to document the behavior of the major components of the Earth system. An example is the Earth Observing System (EOS) program, the centerpiece of NASA's recent Earth observation effort. It was conceived in the 1980s and began to take shape in the early 1990s. It is composed of a series of satellites and sensors, a science component, and a data system supporting a coordinated series of polar-orbiting and low-inclination satellites for long-term global observations of the land surface, biosphere, solid Earth, atmosphere, and oceans. Completed and still-active EOS satellites are listed in Tables 1.3 and 1.2, respectively.
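The roughly 100-minute orbital period quoted above for polar orbiters follows from the same orbital mechanics as the geostationary case: at about 800 km altitude, Kepler's third law gives a period of just over 100 minutes. A small check (constants are standard textbook values, not from this chapter):

```python
import math

MU_EARTH = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6371.0e3         # mean Earth radius, m
ALTITUDE = 800.0e3         # typical polar-orbiter altitude, m

# Circular-orbit period: T = 2*pi*sqrt(a^3/mu)
a = R_EARTH + ALTITUDE
period_min = 2 * math.pi * math.sqrt(a**3 / MU_EARTH) / 60

print(f"{period_min:.1f} minutes")  # ~100.7 minutes
```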

1.2.3. U.S. Operational Missions NOAA’s operational environmental satellite system is composed of two types of satellites: GOES for national, regional, short-range warning and “now-casting,” and polar-orbiting environmental satellites (POES) for global, long-term forecasting, and environmental monitoring. Both types of satellite are necessary for providing a complete global weather monitoring system. In addition, NOAA operates satellites in the Defense Meteorological Satellite Program (DMSP) that are also polar-orbiting satellites.

TABLE 1.2 Active EOS Satellites as of November 2011

Satellite | Launch day
Landsat-7 | April 15, 1999
Quik Scatterometer (QuikSCAT) | June 19, 1999
Terra | December 18, 1999
Active Cavity Radiometer Irradiance Monitor Satellite (ACRIMSAT) | December 20, 1999
Jason-1 | December 7, 2001
Aqua | May 4, 2002
Solar Radiation and Climate Experiment (SORCE) | January 25, 2003
Aura | July 15, 2004
Ocean Surface Topography Mission (OSTM) | June 20, 2008
Landsat Data Continuity Mission (LDCM) | December 2012

The GOES satellites were introduced in Section 1.2.1. The POES system includes the Advanced Very High Resolution Radiometer (AVHRR) and the Television Infrared Observation Satellite (TIROS) Operational Vertical Sounder (TOVS). The world's first meteorological satellite, TIROS, was launched on April 1, 1960, and demonstrated the advantage of mapping Earth's cloud cover from satellite altitudes. On January 23, 1970, the first of the Improved TIROS Operational Satellites (ITOS) was launched. Between December 11, 1970, and July 29, 1976, five ITOS satellites designated NOAA-1 through NOAA-5 were launched. From October 13, 1978, to July 23, 1981, satellites in the TIROS-N series were launched, where N represents the next generation of operational satellites. NOAA-6 and NOAA-7 were also launched during this time frame. On March 28, 1983, the first of the Advanced TIROS-N (or ATN) satellites, designated NOAA-8, was launched. NOAA continues to operate the ATN series of satellites today with improved instruments.

TABLE 1.3 Completed EOS Satellites as of November 2011

Satellite | Launch day
Upper Atmosphere Research Satellite (UARS) | September 12, 1991
ATLAS | March 24, 1992
TOPEX/Poseidon | August 10, 1992
Spaceborne Imaging Radar-C (SIR-C) | April 19, 1994
Radar Satellite (RADARSAT) | November 4, 1995
Total Ozone Mapping Spectrometer-Earth Probe (TOMS-EP) | July 2, 1996
Advanced Earth Observing Satellite (ADEOS) | August 17, 1996
OrbView-2/SeaWiFS | August 1, 1997
Tomographic Experiment using Radiative Recombinative Ionospheric EUV and Radio Sources (TERRIERS) | May 18, 1999
Shuttle Radar Topography Mission (SRTM) | February 11, 2000
Challenging Mini-Satellite Payload (CHAMP) | July 15, 2000
Stratospheric Aerosol and Gas Experiment (SAGE III) | December 10, 2001
SeaWinds (ADEOS II) | December 14, 2002
Ice, Cloud, and Land Elevation Satellite (ICESat) | January 12, 2003

Complementing the geostationary satellites are two NOAA polar-orbiting satellites, one crossing the equator at 7:30 AM local time, the other at 1:40 PM local time. The latest is NOAA-19, launched on February 6, 2009. NOAA-18 (PM secondary), NOAA-17 (AM backup), NOAA-16 (PM secondary), and NOAA-15 (AM secondary) all continue transmitting data as standby satellites. NOAA-19 is the operational PM primary satellite, and METOP-A, owned and operated by EUMETSAT, is the AM primary satellite.

The first AVHRR sensor was a 4-channel radiometer, first carried on TIROS-N (launched October 1978). This was subsequently improved to a 5-channel instrument (AVHRR/2), initially carried on NOAA-7 (launched June 1981). The latest version is AVHRR/3, with 6 channels, first carried on NOAA-15, launched in May 1998. Multiple global vegetation index datasets have been developed from NOAA-7 onward. In 2011, NOAA started the new Joint Polar Satellite System (JPSS) program, which will consist of platforms based on the NPOESS Preparatory Project (NPP) satellite, launched in October 2011. NOAA and EUMETSAT have also established the Initial Joint Polar-Orbiting Operational Satellite System (IJPS), which comprises two polar-orbiting satellite systems and their respective ground segments from each side. It provides meteorological data for "morning" and "afternoon" orbits, with each side complementing the other's polar satellite global coverage.

1.2.4. Sensor Types

There are two types of sensors: passive and active. Passive sensors detect natural radiation that is emitted by the object being viewed or reflected by the object from a source other than the instrument. Reflected sunlight is the most common external source of radiation sensed by passive sensors. Typical passive sensors include:

• Radiometer: An instrument that quantitatively measures the radiance of electromagnetic radiation in the visible, infrared, or microwave spectral region.
• Imaging radiometer: A radiometer that includes a scanning capability to provide a two-dimensional array of pixels from which an image may be produced; it is often called a scanner. Scanning can be performed mechanically or electronically with an array of detectors. Across-track scanners, which scan from one side of the sensor to the other across the platform flight direction using a rotating mirror, are called whiskbroom scanners (e.g., AVHRR). Along-track scanners, which scan a swath with a linear array of charge-coupled devices (CCDs) arranged perpendicular to the flight direction without a mechanical rotation device, are called pushbroom scanners (e.g., the High Resolution Visible sensor of SPOT and the Advanced Land Imager of EO-1).
• Spectroradiometer: A radiometer that measures radiance in multiple spectral bands, such as the Moderate Resolution Imaging Spectroradiometer (MODIS) and the Multi-angle Imaging SpectroRadiometer (MISR).

Active sensors provide their own electromagnetic radiation to illuminate the scene they observe. They send a pulse of energy from the sensor to the scene and then receive the radiation that is reflected or backscattered from that scene. Typical active sensors include:

• Radar (Radio Detection and Ranging): A sensor that uses a transmitter operating at microwave frequencies to emit electromagnetic radiation and a directional antenna or receiver to measure the time of arrival of reflected or backscattered pulses of radiation from distant objects, thereby determining the distance to the object.
• Synthetic-aperture radar (SAR): A side-looking radar imaging system that uses the relative motion between an antenna and the Earth's surface to synthesize a very long antenna by combining the signals (echoes) received by the radar as it moves along its flight track, thereby obtaining high-spatial-resolution imagery. Multiple SAR systems are in operation; some examples for mapping soil moisture appear in Section 19.3.2.
• Interferometric synthetic aperture radar (InSAR): A technique that compares two or more amplitude and phase images over the same geographic region received during different passes of the SAR platform at different times. InSAR can survey height information of the illuminated scene with cm-scale vertical resolution and 30-m pixel resolution, covering areas of 100 km × 100 km (in standard beam modes). Examples include ERS-1 (1991), JERS-1 (1992), RADARSAT-1 and ERS-2 (1995), and ASAR (2002). While the majority of InSAR missions to date have used C-band sensors, recent missions such as ALOS PALSAR, TerraSAR-X, and COSMO-SkyMed are expanding the available data to the L- and X-bands.
• Scatterometer: A high-frequency microwave radar designed specifically to determine the normalized radar cross section of the surface. Over ocean surfaces, measurements of backscattered radiation in the microwave spectral region can be used to derive maps of surface wind speed and direction. Scatterometers have also been used for mapping surface soil moisture and freeze/thaw states. Examples include the Advanced Microwave Instrument (AMI) on ERS-1 and ERS-2.
• Lidar (Light Detection and Ranging): An active optical sensor that uses a laser in the ultraviolet, visible, or near-infrared spectrum to transmit a light pulse and a receiver with sensitive detectors to measure the backscattered or reflected light. The distance to the object is determined by recording the time between the transmitted and backscattered pulses and using the speed of light to calculate the distance traveled. Details are given in Chapters 14 and 15.
• Laser altimeter: An instrument that uses a lidar to measure the height of the platform above the surface. Given independent knowledge of the platform's height with respect to the mean Earth surface, the topography of the underlying surface can be determined. The Geoscience Laser Altimeter System (GLAS) on ICESat is a typical example of a space-based laser altimeter.
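The lidar ranging principle described above reduces to one line of arithmetic: the distance is half the round-trip travel time multiplied by the speed of light. A minimal sketch (the pulse times below are made-up values for illustration):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def lidar_range(t_transmit_s: float, t_return_s: float) -> float:
    """Distance to the target from a lidar round-trip time.

    The pulse travels to the target and back, so the one-way
    distance is c * dt / 2.
    """
    dt = t_return_s - t_transmit_s
    return C * dt / 2.0

# A pulse returning 4.0 microseconds after transmission
# corresponds to a target about 600 m away.
print(lidar_range(0.0, 4.0e-6))  # ~599.58 m
```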

1.2.5. Data Characteristics

The specifications of the platform and the sensor determine the resolutions of the remotely sensed data: spatial, spectral, temporal, and radiometric.

1.2.5.1. Spatial Resolution

Spatial resolution is a measure of the smallest object that can be resolved by the sensor, the ground area imaged by the instantaneous field of view (IFOV) of the sensor, or the linear dimension on the ground represented by each pixel. Figure 1.3 shows the campus of the University of Maryland at College Park at four different spatial resolutions. Table 1.4 lists the spatial resolution of some common sensors.

FIGURE 1.3 Campus of the University of Maryland at College Park at four spatial resolutions: (a) 1 m; (b) 10 m; (c) 30 m; (d) 250 m.

1.2.5.2. Spectral Resolution

Spectral resolution describes the number and width of spectral bands in a sensor system. Many sensor systems have a panchromatic band, which is a single wide band in the visible spectrum, and multispectral bands in the visible/near-IR or thermal-IR spectrum (see Table 1.4). Hyperspectral systems usually have hundreds of narrow spectral bands; for example, Hyperion on the EO-1 satellite has 220 bands at 30-m spatial resolution.

1.2.5.3. Temporal Resolution

Temporal resolution is a measure of the repeat cycle or frequency with which a sensor revisits the same part of the Earth's surface. The frequency characteristics are determined by the design of the satellite sensor and its orbit pattern. The temporal resolutions of common sensors are also shown in Table 1.4.

1.2.5.4. Radiometric Resolution

Radiometric resolution refers to the dynamic range, or the number of different output values in each band of data, and is determined by the number of bits into which the recorded radiation is divided. In 8-bit data, the digital numbers (DN) can range from 0 to 255 for each pixel (2^8 = 256 possible values). Obviously, more bits result in higher radiometric accuracy of the sensor, as shown in Figure 1.4. The radiometric resolutions of common sensors are shown in Table 1.4.

TABLE 1.4 Basic characteristics of some commonly used sensors

Satellite sensor | Spectral bands | Spatial resolution (m) | Radiometric resolution (bits) | Temporal resolution (days) | Temporal coverage

Coarse resolution (> 1000 m)
POLDER | B1-B9 | 6000 × 7000 | 12 | 4 | POLDER-1: October 1996 to June 1997; POLDER-2: April to October 2003

Medium resolution (100-1000 m)
MODIS | B1-B2 / B3-B7 / B8-B36 | 250 / 500 / 1000 | 12 | daily | 1999-
AVHRR | B1-B5 | 1100 at nadir | 10 | daily | 1978-

Fine resolution (5-100 m)
ALI/EO-1 | Pan / multispectral | 10 / 30 | 12 | 16 | 2000-
ASTER/Terra | B1-B3 / B4-B9 / B10-B14 | 15 / 30 / 90 | 8 / 8 / 12 | 16 | 1999-
ETM+/Landsat-7 | Pan / B1-B5, B7 / B6 | 15 / 30 / 60 | 8 | 16 | 1999-
HRV/SPOT-5 | Pan / B1-B3 / SW-IR | 2.5 or 5 / 10 / 20 | 8 | 26/2.4 | 2002-

Very high resolution (< 5 m)
Ikonos | Pan / B1-B4 | 0.82 at nadir / 3.2 at nadir | 11 | ~3 at 40° latitude | 1999-
QuickBird | Pan / B1-B4 | 0.61 / 2.44 | 11 | 1-3.5 | 2001-
WorldView | Pan | 0.5 at nadir | 11 | 1.7-5.9 | 2007-
GeoEye-1 | Pan / B1-B4 | 0.41 at nadir / 1.65 at nadir | 11 | 2.1-8.3 at 40° latitude | 2008-
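The relationship between bit depth and dynamic range described above is simply levels = 2^bits. A small sketch, with an additional helper showing how an 11-bit DN (e.g., Ikonos) might be linearly rescaled for 8-bit display (function names are illustrative, not from any particular software package):

```python
def dn_levels(bits: int) -> int:
    """Number of distinct digital numbers (DN) for a given bit depth."""
    return 2 ** bits

def rescale_dn(dn: int, src_bits: int, dst_bits: int) -> int:
    """Linearly map a DN from one bit depth to another (e.g., for display)."""
    src_max = dn_levels(src_bits) - 1
    dst_max = dn_levels(dst_bits) - 1
    return round(dn * dst_max / src_max)

print(dn_levels(8))             # 256 levels, DN 0-255
print(dn_levels(11))            # 2048 levels, DN 0-2047
print(rescale_dn(1024, 11, 8))  # an 11-bit mid-range DN maps to 128
```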

1.3. DATA TRANSMISSION AND GROUND RECEIVING SYSTEM

There are three main options for transmitting data acquired by satellite sensors to the surface: (1) the data can be directly transmitted to Earth if a ground receiving station (GRS) is in the line of sight of the satellite; (2) the data can be recorded on board the satellite for transmission to a GRS at a later time; and (3) the data can be relayed to the GRS through the Tracking and Data Relay Satellite System (TDRSS), which consists of a series of communications satellites in geosynchronous orbit. The data are transmitted from one satellite to another until they reach the appropriate GRS.

FIGURE 1.4 An image at four radiometric resolutions: (a) 8 bits (256 levels); (b) 4 bits (16 levels); (c) 2 bits (4 levels); (d) 1 bit (2 levels).

There are two types of GRSs: fixed and mobile. Most GRSs are fixed, and Figure 1.5 shows the locations of all ground stations operated by the U.S. and International Cooperator ground station network for the direct downlink and distribution of Landsat 7 (L7) and/or Landsat 5 (L5) image data. Because coverage of the globe by ground receiving stations is not complete, as seen in Figure 1.5, mobile stations are an attractive way to fill the gaps and an efficient means of acquiring data over a remote location for an extended period when many images are needed for a particular task (for example, cartography of a region). The ground receiving stations acquire, preprocess, archive, and process data. Their typical components and functions may include the data acquisition facility, the data processing facility, the value-added facility, and user support services.

1.4. DATA PROCESSING SYSTEM

Two types of preprocessing are conducted: radiometric processing and geometric processing.

1.4.1. Radiometric Calibration

Radiometric calibration is a process that converts recorded sensor voltages or digital numbers (DN) to an absolute scale of radiance or reflectance. Because outer space is such a harsh environment, the performance of all satellite sensors degrades over time. To achieve consistent and accurate measurements that can be used to detect climatic and environmental change, the DNs need to be transformed into physical quantities.

Calibration measurements can be conducted in three stages. Pre-flight calibration measures a sensor's radiometric properties before the sensor is sent into space. It is performed at the instrument builder's facilities, where the controllable and stable laboratory environment guarantees high calibration accuracy and precision. In-flight calibration is usually performed on a routine basis with onboard calibration systems, and more and more optical sensors carry onboard calibration devices. The AVHRR optical sensor does not have an onboard calibration capability, whereas ETM+ has three onboard calibration devices: the Internal Calibrator, the Partial Aperture Solar Calibrator, and the Full Aperture Solar Calibrator. MODIS also has three dedicated calibration devices for the reflective bands: the Solar Diffuser, the Solar Diffuser Stability Monitor, and the Spectroradiometric Calibration Assembly. In addition, MODIS uses two further calibration techniques: looking at the Moon and at deep space. Post-launch calibration data have to be obtained from vicarious calibration techniques that typically make use of selected natural or artificial sites on the surface of the Earth. Pre-launch and onboard methods are better established, but the use of invariant sites in vicarious calibration is becoming more popular with the changing design and demands of new instruments. Vicarious calibration using pseudo-invariant sites has become increasingly accepted as a fundamental post-launch calibration method for monitoring the long-term performance of satellite reflective solar sensors.

FIGURE 1.5 Landsat ground stations and their coverage. The circles show the approximate area over which each station has the capability for direct reception of Landsat data. The green circles show the components of the L7 ground station network, the red circles show components of the L5 station network, and the dashed circles show stations with dual (L5 and L7) status. The yellow circles show L5 short-term ("campaign") stations that may contribute to the USGS Landsat archive. (http://landsat.usgs.gov/about_ground_stations.php)
There are several commonly desired characteristics of an invariant site: for example, temporal stability, spatial uniformity, little or no vegetation, and relatively high surface reflectivity with approximately Lambertian reflectance. Commonly used sites include stable desert areas of the Sahara and Saudi Arabia, the Sonoran Desert, White Sands, and regions in Bolivia.
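For most optical sensors, the DN-to-physical-quantity conversion described above is a linear transform followed by a solar-geometry normalization to top-of-atmosphere (TOA) reflectance. A sketch of these two standard steps, assuming a simple gain/offset calibration (the gain, offset, and solar-irradiance numbers below are illustrative, not those of any specific sensor):

```python
import math

def dn_to_radiance(dn: float, gain: float, offset: float) -> float:
    """Convert a digital number to at-sensor spectral radiance
    (W m^-2 sr^-1 um^-1) with a linear calibration."""
    return gain * dn + offset

def radiance_to_toa_reflectance(radiance: float, esun: float,
                                sun_elev_deg: float, d_au: float = 1.0) -> float:
    """Convert at-sensor radiance to top-of-atmosphere reflectance.

    esun: mean exoatmospheric solar irradiance for the band (W m^-2 um^-1)
    sun_elev_deg: solar elevation angle in degrees
    d_au: Earth-Sun distance in astronomical units
    """
    sun_zenith = math.radians(90.0 - sun_elev_deg)
    return (math.pi * radiance * d_au**2) / (esun * math.cos(sun_zenith))

# Illustrative values only:
L = dn_to_radiance(dn=128, gain=0.77, offset=-6.2)
rho = radiance_to_toa_reflectance(L, esun=1997.0, sun_elev_deg=45.0)
```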

1.4.2. Geometric Processing

No image acquired by sensors can perfectly represent the true spatial properties of the landscape. Many factors can distort the geometric properties of remote sensing data, such as variations in platform altitude, attitude, and velocity; Earth rotation and curvature; surface relief displacement; and perspective projection. Some of the resulting distortions are systematic and can be corrected through analysis of sensor characteristics and platform ephemeris data, but others are random and have to be corrected by using ground control points (GCPs).

Within the sensor's ground instantaneous field of view (IFOV), surface elements do not contribute to the pixel value equally; rather, the central part contributes most. This spatial effect is usually specified by the sensor point spread function (PSF) in the spatial domain, and the Fourier transform of the PSF is called the modulation transfer function (MTF), a precise measurement of detail and contrast made in the frequency domain. The sensor PSF is often modeled as a Gaussian. Figure 1.6 illustrates the PSF in two and three dimensions, where TGSD is the threshold ground sample distance, the centroid-to-centroid distance between adjacent pixels. The actual response function of the ground IFOV is often not square; for MODIS, for example, it is twice as wide cross-track as in-track because of time integration during scanning. For most whiskbroom scanners, such as AVHRR and MODIS, the actual size of the ground IFOV is a function of the scanning angle (see Figures 1.7 and 1.8).

FIGURE 1.6 Illustrations of the point spread function (PSF) in two- and three-dimensional spaces (Zhang et al., 2006). Reproduced with permission of IEEE.

FIGURE 1.7 Sketch of pixel geometry for the AVHRR to illustrate the variations in pixel size in the along-scan direction (Breaker, 1990).

This topic is significant because level-1 radiance data or level-2 reflectance data should be corrected for geometric distortions before calculating geophysical parameters in order to obtain truly absolute geophysical parameters. The details are discussed in Chapter 2.
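The PSF/MTF relationship above is easy to demonstrate numerically: build a Gaussian PSF on a pixel grid and take its Fourier transform; the magnitude of that transform is the MTF. A minimal sketch using NumPy (the grid size and sigma are arbitrary illustrative choices, not values for any real sensor):

```python
import numpy as np

# 1-D Gaussian PSF sampled on a pixel grid (sigma in pixel units)
n, sigma = 65, 1.5
x = np.arange(n) - n // 2
psf = np.exp(-x**2 / (2 * sigma**2))
psf /= psf.sum()  # normalize so the PSF conserves energy

# The MTF is the magnitude of the Fourier transform of the PSF.
mtf = np.abs(np.fft.fftshift(np.fft.fft(np.fft.ifftshift(psf))))

# A normalized PSF gives MTF = 1 at zero spatial frequency, and the
# MTF of a Gaussian PSF falls off as a Gaussian in frequency:
# a wider PSF (blurrier sensor) means a faster-decaying MTF.
print(mtf[n // 2])  # ~1.0 at zero frequency
```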

1.4.3. Image Quality Enhancement Imperfections or image artifacts are continuously caused by the instrument’s electronics, dead or dying detectors, and downlink errors. Known artifacts include the scan-correlated shift, memory effect, modulation transfer function, and coherent noise. Dropped lines and inoperable detectors also exist as a result of

decommutating errors and detector failure. Potential remnant artifacts include banding and striping. In the past, these effects were ignored or artificially removed using cosmetic algorithms during radiometric preprocessing. For example, dropped lines are usually filled with the values of the previous lines or the averages of the neighboring lines. Strips can be removed by using simple along-line convolution, highpass filtering, and forward and reverse principal component transformations. To assist human visual interpretation, various image enhancement techniques have been incorporated in many remote sensing digital image


FIGURE 1.8 MODIS IFOV dependent on the view zenith angle.

processing systems. These enhancement methods can be broadly divided into spatial domain and frequency domain categories. Spatial domain techniques deal directly with the image pixels, manipulating pixel values to achieve the desired enhancement; Figure 1.9 illustrates the effects of a linear enhancement technique. In frequency domain methods, the image is first transformed to the frequency domain: the Fourier transform of the image is computed, all enhancement operations are performed on that transform, and the inverse Fourier transform is then applied to obtain the resultant image. Note that image enhancement artificially alters the radiometric properties that characterize environmental conditions; most enhancement techniques for assisting visual interpretation should therefore not be applied before quantitatively estimating biophysical variables.
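A minimal sketch of the linear (min–max) stretch illustrated in Figure 1.9, assuming an 8-bit output range — purely to show the spatial-domain pixel manipulation, not a production routine:

```python
import numpy as np

def linear_stretch(img, out_min=0.0, out_max=255.0):
    """Linearly map the image's value range onto [out_min, out_max]."""
    img = img.astype(np.float64)
    lo, hi = img.min(), img.max()
    if hi == lo:                      # flat image: nothing to stretch
        return np.full_like(img, out_min)
    scaled = (img - lo) / (hi - lo)   # normalize to [0, 1]
    return out_min + scaled * (out_max - out_min)

# A low-contrast band occupying only digital numbers 60-120
band = np.array([[60, 80], [100, 120]])
enhanced = linear_stretch(band)
# The stretched values now span the full 0-255 display range,
# widening the histogram exactly as in Figure 1.9.
```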

1.4.4. Atmospheric Correction

Since the observed radiance recorded by a spaceborne or airborne sensor contains both atmospheric and surface information, atmospheric effects must be removed to estimate land surface biogeophysical variables, particularly from reflective and thermal-IR data; microwave signals are much less sensitive to changes in atmospheric conditions. Clouds largely block Earth surface information and make most optical and thermal-IR imagery useless for terrestrial applications. Various cloud and shadow detection algorithms have been developed, and the cloud mask is one of the high-level atmospheric products. However, this is still an active research area, and more effective and reliable algorithms are needed. For coarse-resolution (kilometer-scale) imagery (e.g., AVHRR and MODIS), many cloudy or partly cloudy pixels usually remain after applying the cloud mask. Various solutions have been used to address this issue. One relies on temporal compositing techniques, converting daily observations to weekly or monthly data based on maximum vegetation index or other criteria; others replace the contaminated pixels using smoothing


algorithms. This topic will be discussed in Chapter 3. For optical imagery, both aerosol and water vapor scatter and absorb the radiation reflected from the surface. There are two approaches to atmospheric correction. The first assumes known atmospheric properties, usually the total amounts of aerosol and water vapor in the atmospheric column, which may be estimated from other sensors and/or other sources; many atmospheric radiative transfer codes (e.g., MODTRAN, 6S) can then be used to calculate the quantities required for atmospheric correction. The second relies only on the imagery itself, without any external information. If the atmospheric information can be estimated accurately from other sources, the first approach is preferable, but in reality such information is often unavailable. This topic will be discussed in Chapter 5. For thermal-IR imagery, if atmospheric profile information (mainly temperature and water vapor) is available from sounding data, atmospheric correction is straightforward. When no such profile information is available, the split-window approach based on two thermal-IR bands is often used to estimate land surface temperature without explicit atmospheric correction. The details are given in Section 8.2.2.
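As a concrete instance of the second, image-based approach, dark-object subtraction assumes that the darkest pixels in a scene (e.g., deep water or shadow) should have near-zero surface reflectance, so their residual radiance is attributed to atmospheric path radiance and subtracted from every pixel. A hedged single-band sketch — the dark-pixel fraction is an arbitrary choice for illustration:

```python
import numpy as np

def dark_object_subtraction(radiance, dark_fraction=0.001):
    """Subtract an estimate of atmospheric path radiance, taken as
    the mean of the darkest pixels, from a single-band image."""
    flat = np.sort(radiance.ravel())
    n_dark = max(1, int(dark_fraction * flat.size))
    path_radiance = flat[:n_dark].mean()      # haze estimate
    corrected = radiance - path_radiance
    return np.clip(corrected, 0.0, None)      # no negative radiance

band = np.array([[12.0, 40.0], [55.0, 80.0]])
corrected = dark_object_subtraction(band)
# The darkest pixel is driven toward zero; all brighter pixels
# are reduced by the same path-radiance offset.
```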

1.4.5. Image Fusion and Product Integration

There are many cases where we need to integrate image data through image fusion techniques. Definitions of image fusion in the literature are diverse, but image fusion can be viewed as a process that produces a single image from a set of input images. The fused image contains more complete information and is more useful for estimating land surface variables: it can improve both reliability, by using redundant information, and capability, by using complementary information, as illustrated in Figure 1.10.

In this book, image fusion is not distinguished from image merging or image integration, which at the pixel level may take many forms, for example:

• Multi-temporal images from the same or multiple sensors for change detection (e.g., merging TM images acquired at different times);
• Multi-spatial-resolution images from the same or multiple sensors (e.g., merging ETM+ panchromatic and multispectral images);
• Images of different spectral regions from the same or multiple sensors (e.g., merging SAR with optical imagery, or visible bands with thermal bands);
• Remote sensing images with ancillary data (e.g., a topographic map).

Chapter 3 will present some of these techniques in detail. When we examine high-level satellite products, it is surprising to see that most are generated mainly from a single sensor. For example, the MODIS team produces albedo products mainly from MODIS data, and the same holds for MISR, MERIS, and others. The same product from different satellite sensors may have different characteristics (e.g., spatial and temporal resolutions, accuracy). Instead of asking the user to pick the "best" product, we can generate a blended/integrated product from multiple-sensor products. Chapter 22 is devoted to this topic, using leaf area index (LAI) as the example.
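One simple pixel-level example of the second case (merging a panchromatic band with multispectral bands) is the Brovey transform, sketched below under the simplifying assumption that the bands are already co-registered and resampled to a common grid:

```python
import numpy as np

def brovey_fusion(ms, pan):
    """Brovey-transform pan-sharpening: rescale each multispectral
    band by the ratio of the panchromatic band to the band sum,
    injecting the pan band's spatial detail.

    ms  : array of shape (bands, rows, cols)
    pan : array of shape (rows, cols), co-registered with ms
    """
    total = ms.sum(axis=0)
    total = np.where(total == 0, 1e-12, total)   # avoid divide-by-zero
    return ms * (pan / total)

# Toy example: three flat bands and a brighter pan band
ms = np.ones((3, 2, 2)) * np.array([10.0, 20.0, 30.0]).reshape(3, 1, 1)
pan = np.full((2, 2), 90.0)
fused = brovey_fusion(ms, pan)
# Band sum is 60, so every band is scaled by 90/60 = 1.5
```

The Brovey transform preserves the between-band ratios (spectral shape) while replacing the overall intensity with that of the pan band, which is why it is popular for visual sharpening but can distort radiometry for quantitative use.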

1.5. MAPPING CATEGORY VARIABLES

We are in general interested in two types of land surface variables: category and quantitative. Category variables represent the types of objects on the land surface and are usually mapped through image classification, the purpose of which is to group pixels that have similar properties into a finite set of

FIGURE 1.9 An example of linear enhancement: (a) original image and (b) its histogram; (c) linearly enhanced image and (d) its new histogram.

FIGURE 1.10 Illustration of image fusion.

classes. An example of a classified image is a land cover map; Figure 1.11 shows a global land cover map derived from MODIS data. The key steps in the classification process are as follows:

1. Definition of the classification system (scheme): This depends on the objective and the characteristics of the remote sensing data. The purpose of such a scheme is to provide a framework for organizing and categorizing the information that can be extracted from the data. A number of classification schemes have been developed for mapping regional and global land cover and land use. The IGBP land cover classification system for global mapping using MODIS data is shown in Figure 1.11.


2. Selection of features: Classification is executed on a set of features in a feature space, dividing that space into classes according to a decision rule. Instead of the original bands being used directly, they are often transformed into features that better discriminate between the classes. Examples include various vegetation indices, principal components, Tasseled Cap components, and other spatial, temporal, and angular features. The subset of features is selected to maximally distinguish the classes.

3. Sampling of training data: Training is the process of defining the criteria by which the classes are recognized and is performed with either a supervised or an unsupervised method. Supervised training is closely controlled by the analyst, who selects pixels of each class based on high-resolution imagery, ground truth data, or maps. Unsupervised training is more computer-automated: the user specifies some parameters that the computer uses to uncover statistical patterns that are inherent in the data but do not necessarily correspond to classes in the classification scheme.

FIGURE 1.11 Global land cover classification map from MODIS.


4. Classification: A parametric or nonparametric decision rule, often called a classifier, performs the actual sorting of pixels into distinct class values. There are various classifiers, such as the parallelepiped, minimum distance, maximum likelihood, regression tree, and support vector machine classifiers. Candidate classifiers are compared using the training data so that an appropriate decision rule is selected for classification.

5. Accuracy assessment: The classified results should be checked and verified for accuracy and reliability. The training data are usually divided into two parts, one for training and the other for validation. To evaluate classification errors, a classification error matrix is typically formed, sometimes called a confusion matrix or contingency table.

The details of image classification techniques are not covered by this book, but the basic principles and progress can be found elsewhere (Jensen, 2004; Lu and Weng, 2007; Mather and Magaly, 2010; Richards and Jia, 2005). Some typical techniques for mapping land-use types will be discussed in Chapter 24. At most spatial resolutions, the majority of pixels are mixed. It is considerably more challenging to estimate the percentages of the cover types within a pixel than simply to assign the pixel a single cover-type label. How to estimate fractional vegetation cover within a pixel will be discussed in Chapter 13.
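Steps 3–5 above can be sketched end to end with a minimum-distance classifier. The training data and class means below are invented for illustration; in practice the means would be derived from analyst-selected training pixels:

```python
import numpy as np

def minimum_distance_classify(pixels, class_means):
    """Assign each pixel (row of features) to the class whose mean
    feature vector is nearest in Euclidean distance."""
    # d has shape (n_pixels, n_classes)
    d = np.linalg.norm(pixels[:, None, :] - class_means[None, :, :], axis=2)
    return d.argmin(axis=1)

def confusion_matrix(truth, predicted, n_classes):
    """Error (confusion) matrix: rows = reference, cols = prediction."""
    m = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(truth, predicted):
        m[t, p] += 1
    return m

# Hypothetical 2-band feature space with two classes
means = np.array([[0.1, 0.4],    # class 0: e.g., water
                  [0.4, 0.1]])   # class 1: e.g., bare soil
pixels = np.array([[0.12, 0.38], [0.35, 0.15], [0.09, 0.41]])
pred = minimum_distance_classify(pixels, means)
cm = confusion_matrix([0, 1, 0], pred, 2)
overall_accuracy = np.trace(cm) / cm.sum()
```

The diagonal of the confusion matrix counts correctly classified validation pixels, and overall accuracy is the trace divided by the total count.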

1.6. ESTIMATING QUANTITATIVE VARIABLES

To drive, calibrate, and validate Earth process models and support various applications, high-level products of quantitative variables are highly desirable, and how to generate these products is the main focus of this book. In the early stages of remote sensing, visual interpretation was the technique commonly used for extracting land surface information. Statistical analysis later became a more common method for quantitative estimation. As the following chapters show, various inversion techniques based on physically based surface radiation models have become the subject of mainstream research (Liang, 2007, 2008). It is therefore useful to provide an overview of these techniques. Since many inversion algorithms are based on forward radiation modeling, let us start there.

1.6.1. Forward Radiation Modeling

Forward radiation modeling is the process that links the pixel values of an image with surface characteristics through mathematical models (Liang, 2004). We will mainly present scene generation, surface and atmospheric radiative transfer modeling, and sensor modeling.

1.6.1.1. Scene Generation

Scene generation is a quantitative description of our understanding of the landscape. Strahler et al. (1986) identify two different scene models in remote sensing: H- and L-resolution models. H-resolution models are applicable where the elements of the scene are larger than the pixel size, and L-resolution models apply when the converse is true. H-resolution scenes can be generated using computer graphics techniques, one of the best known being the L-system. L-resolution scenes can be generated using mathematical models or GIS (geographic information system) techniques.

1.6.1.2. Surface Radiation Modeling

Given the landscape composition and its optical properties, we can predict the radiation field. Three types of models characterize the radiation field of the scene and are commonly used in optical remote sensing: geometric optical models, turbid-medium radiative transfer models, and computer simulation models.
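The L-system mentioned above is, at its core, a parallel string-rewriting grammar: every symbol with a production rule is replaced simultaneously at each iteration, and the resulting string is later interpreted graphically to draw a plant. A minimal sketch with a made-up branching rule, not a real plant model:

```python
def lsystem(axiom, rules, iterations):
    """Apply production rules to every symbol in parallel, repeatedly.
    Symbols without a rule are copied unchanged."""
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Toy branching rule: F = grow a segment, [ and ] = push/pop a branch,
# + and - = turn. One iteration turns each segment into five.
rules = {"F": "F[+F]F[-F]F"}
print(lsystem("F", rules, 1))   # F[+F]F[-F]F
# Interpreting the string with turtle graphics yields a
# self-similar, plant-like branching structure.
```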


In geometric optical models, the canopy or soil is assumed to consist of geometric protrusions with prescribed shapes (e.g., cylinder, sphere, cone, ellipsoid, spheroid), dimensions, and optical properties, distributed on a background surface in a defined manner (regularly or randomly). The total pixel value is the weighted average of the sunlit crown, sunlit ground, shadowed crown, and shadowed ground components. Figure 1.12 illustrates a simulated canopy field with ellipsoidal crowns and the calculated sunlit and shadowed components. Turbid-medium radiative transfer models treat surface elements (leaves or soil particles) as small absorbing and scattering particles with


given optical properties, distributed randomly in the scene and oriented in given directions. In one-dimensional canopy models, canopy elements are assumed to be randomly distributed, whereas three-dimensional RT models can take into account the structural information of the landscape, as shown in Figure 1.13. Further development of geometric optical models has incorporated radiative transfer theory in calculating the individual sunlit/shadowed components; the resulting models are often called hybrid models. In computer simulation models, the arrangement and orientation of scene elements are simulated on a computer, and the radiation properties are determined based on the radiosity equations

FIGURE 1.12 Projection principles of the geometric-optical model for a canopy with ellipsoidal crowns, and the simulated canopy field.


FIGURE 1.13 The DART (Discrete Anisotropic Radiative Transfer) model (Gastellu-Etchegorry, 2008).

and/or Monte Carlo ray-tracing methods. Figure 1.14 compares a photograph of a grass field with a field simulated using the Botanical Plant Modelling System (Lewis, 1999) based on the ray-tracing technique.

1.6.1.3. Atmospheric Radiative Transfer

The radiation leaving the Earth's surface is disturbed by the atmosphere before being captured by the sensor, whether within the atmosphere (airborne sensors) or above it (spaceborne sensors).

Atmospheric gases, aerosols, and clouds scatter and absorb the incoming solar radiation and the reflected and/or emitted radiation from the surface. As a result, the atmosphere greatly modulates the spectral dependence and spatial distribution of the surface radiation. The atmospheric radiative transfer theory is quite mature, and many computer software packages (e.g., MODTRAN, 6S) have been developed to enable us to calculate all necessary quantities, such as path radiance and transmittance.


FIGURE 1.14 A photo of an actual canopy (left) and the simulated canopy field (right). Courtesy of Dr. Mathias Disney at University College London.

1.6.1.4. Sensor Modeling

Since common detector materials do not respond across the entire optical spectrum, most sensors have separate focal planes, and hence separate noise mechanisms, for each spectral region. A sensor model describes the effects of an imaging spectrometer on the spectral radiance mean and covariance statistics of a land surface. The input radiance statistics of every spectral channel are modified by electronic gain, radiometric noise sources, and relative calibration error to produce radiance signal statistics that represent the scene as imaged by the sensor. The sensor model includes approximations of the spectral response functions and radiometric noise sources. The spectral response function of each instrument can be measured and provided by the sensor manufacturer. Radiometric noise processes are modeled by adding variance to the diagonal entries of the spectral covariance matrices. Radiometric noise arises from the detector and the electronics. Detector noise includes photon (shot) noise, thermal noise, and multiplexer/readout noise. Since detector parameters are often specified in terms of electrons, the noise terms are summed in

a root-sum-squared sense in that domain before being converted to noise-equivalent spectral radiance. Noise processes originating in the electronics include quantization noise, bit errors (in recording or transmitting the data), and noise arising within the electrical components. Besides the spectral response function and radiometric noise, the sensor model can also include spatial effects through the PSF and MTF.
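The noise budgeting described above — summing detector noise terms root-sum-square in the electron domain, and modeling radiometric noise by adding variance to the diagonal of the spectral covariance matrix — can be sketched as follows. All noise magnitudes here are arbitrary placeholders, not parameters of any real instrument:

```python
import numpy as np

def total_noise_electrons(signal_electrons, read_noise_e, quant_noise_e):
    """Root-sum-square combination of noise terms in electrons.
    Photon (shot) noise variance equals the signal level (Poisson)."""
    shot_var = signal_electrons
    return np.sqrt(shot_var + read_noise_e**2 + quant_noise_e**2)

def add_noise_to_covariance(cov, noise_sigma):
    """Model radiometric noise by adding per-band noise variance
    to the diagonal of the spectral covariance matrix."""
    return cov + np.diag(noise_sigma**2)

signal = np.array([10000.0, 22500.0])          # electrons per band
sigma_e = total_noise_electrons(signal, read_noise_e=50.0, quant_noise_e=10.0)
cov = np.array([[4.0, 1.0], [1.0, 9.0]])       # toy scene covariance
noisy_cov = add_noise_to_covariance(cov, np.array([0.5, 0.3]))
# Only the diagonal (per-band variance) grows; the off-diagonal
# scene correlations are left untouched, as the noise is
# assumed uncorrelated between bands.
```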

1.6.2. Inversion Methods

All inversion algorithms for estimating quantitative variables can be classified into three groups: statistical methods, physical methods, and hybrid methods. Statistical methods are mainly based on a variety of vegetation indices. Physical algorithms rely on inverting surface radiation models. A new trend is to combine statistical and physical methods, which is referred to in this book as a hybrid algorithm.

1.6.2.1. Statistical Analysis and Machine Learning Techniques

Statistical models have proven very useful in various remote sensing applications. They are usually created using ground


measurements. Because it is very expensive to collect extensive ground measurements under various conditions, the major weakness of models based on ground measurements is their limited representativeness. An alternative is to simulate remotely sensed data using a physically based radiation model that has been calibrated and validated with field measurements; the key modeling components were discussed in Section 1.6.1. Different statistical methods can be used to relate the inputs and outputs of the model simulations. Besides conventional multivariate regression analysis, various machine learning methods and other advanced statistical techniques have been used, such as artificial neural network (ANN) methods (Sections 13.3.3 and 15.3), genetic algorithms (Section 13.3.5), and Bayesian networks (Section 13.3.4). An important drawback of most machine learning techniques (e.g., ANN) has been their lack of explanation capability; it is increasingly apparent that, without some form of explanation capability, the full potential of trained ANNs may not be realized. Many studies have therefore focused on mechanisms, procedures, and algorithms for inserting knowledge into ANNs (knowledge initialization), extracting rules from trained ANNs (rule extraction), and using ANNs to refine existing rule bases (rule refinement).

1.6.2.2. Optimization Algorithms

Optimization algorithms estimate the parameters (x) of the surface radiation model by minimizing the cost function J, defined as

J(x) = \frac{1}{2}(x - x_b)^T B^{-1} (x - x_b) + \frac{1}{2}(H(x) - y)^T R^{-1} (H(x) - y) = J_b + J_o    (1.1)

where y is the observation vector, x_b is the background field (or first guess), H is the surface radiation model operator, and R is the

observation-error covariance matrix, and B is the background-error covariance matrix. The first term, J_b, forces the optimal parameters to remain as close as possible to the background field, and the second term, J_o, adjusts the parameters so that the model outputs are as close as possible to the observations. Specifying R and B depends on the relative accuracy of the background information and the remote sensing data products. In the extreme cases, if the errors of the "first-guess" values are very large, the final estimates are determined by fitting the observations, whereas if the observation errors are very large, the final estimates stay close to the "first-guess" values. The high computational demands of optimization approaches have led to the use of more simplified surface reflectance models rather than to efforts to improve optimization algorithm efficiency. One general trend in optical remote sensing is to use simpler empirical or semiempirical models (Section 7.1.3): optimization algorithms estimate the parameters of these simple models, and the parameters are then related to surface properties.

1.6.2.3. Look-up Table Algorithms

Optimization algorithms are computationally expensive and very slow when performing the inversion with the huge volumes of remotely sensed data. The look-up table (LUT) approach has been used extensively to speed up the inversion process: it precomputes the model reflectance for a large range of combinations of parameter values. In this manner, the most computationally expensive work is completed before the inversion is attempted, and the problem reduces to searching the LUT for the modeled reflectance set that most resembles the measured set. In an ordinary LUT approach, the dimensions of the table must be large enough to achieve high accuracy, which makes online searching much slower; moreover, many parameters must be fixed. To reduce the dimensions of the LUTs for rapid searching, empirical functions are used to fit the LUT


values, so that table searching becomes a simple evaluation of local functions, or a simple linear regression is executed instead of table searching for each angular bin of the solar illumination and sensor viewing geometry. Section 11.3.6 presents the LUT method for estimating LAI, and Section 6.3.3 presents its application to estimating incident solar radiation.

1.6.2.4. Data Assimilation Methods

The values of land surface variables estimated from different sources using the methods of the previous sections may not be physically consistent. Most techniques do not take advantage of observations acquired at different times and cannot jointly handle observations with different spatial resolutions. In particular, these techniques estimate only the variables that significantly affect the radiance received by the sensors, whereas in many cases it is desirable to estimate variables not directly related to radiance. Given the ill-posed nature of remote sensing inversion (the number of unknowns is far greater than the number of observations) and the vast expansion in the amount of observational data, data fusion techniques, which register and combine datasets from multiple sources, may be one solution; these techniques are presented in Chapter 4. The data assimilation (DA) method allows use of all available information within a time window to estimate various unknowns of land surface models (Liang, 2007). The information that can be incorporated includes observational data, existing pertinent a priori information, and, importantly, a dynamic model that describes the system of interest and encapsulates theoretical understanding. A DA scheme commonly includes the following components: (1) a forward dynamic model that describes the time evolution of state variables such as surface temperature, soil moisture, and carbon stocks; (2) an observation model that relates the model estimates of state variables


to satellite observations and vice versa; (3) an objective function that combines model estimates and observations along with any associated prior information and error structure; (4) an optimization scheme that adjusts forward model parameters or state variables to minimize the discrepancy between model estimates and satellite observations; and (5) error matrices that specify the uncertainty of the observations, model, and any background information (these are usually included in the objective function). The details are given in Section 11.4.
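For a linear observation operator H, the cost function of Eq. (1.1) has a closed-form minimizer, x_a = x_b + (B^{-1} + H^T R^{-1} H)^{-1} H^T R^{-1} (y - H x_b), which makes a compact numerical sketch possible. The two-variable numbers below are invented for illustration only:

```python
import numpy as np

def cost(x, xb, B, y, H, R):
    """J(x) = 0.5 (x-xb)^T B^-1 (x-xb) + 0.5 (Hx-y)^T R^-1 (Hx-y)."""
    db = x - xb
    do = H @ x - y
    Jb = 0.5 * db @ np.linalg.solve(B, db)
    Jo = 0.5 * do @ np.linalg.solve(R, do)
    return Jb + Jo

def analysis(xb, B, y, H, R):
    """Closed-form minimizer of J for a linear operator H."""
    Binv = np.linalg.inv(B)
    Rinv = np.linalg.inv(R)
    A = Binv + H.T @ Rinv @ H            # Hessian of J
    return xb + np.linalg.solve(A, H.T @ Rinv @ (y - H @ xb))

xb = np.array([1.0, 2.0])                # background (first guess)
B = np.diag([0.5, 0.5])                  # background-error covariance
H = np.array([[1.0, 0.0]])               # observe the first variable only
y = np.array([2.0])                      # observation
R = np.array([[0.5]])                    # observation-error covariance
xa = analysis(xb, B, y, H, R)
# With equal error variances the analysis splits the difference:
# the observed variable moves halfway from 1.0 toward 2.0, and the
# unobserved variable stays at its background value.
```

Note how the relative sizes of B and R play exactly the role described in Section 1.6.2.2: shrinking R pulls the analysis toward the observation, while shrinking B pins it to the background.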

1.7. PRODUCTION, ARCHIVING, AND DISTRIBUTION OF HIGH-LEVEL PRODUCTS

Satellite observations must be converted into high-level biogeophysical products, using the inversion methods outlined above, through a production system. Even given the satellite observations and an inversion algorithm, producing high-level products is not straightforward because of the huge volume of data, so the creation of data information systems is essential. For example, NASA's principal Earth Science information system is the Earth Observing System Data and Information System (EOSDIS), operational since August 1994. EOSDIS acquires, processes, archives, and distributes Earth Science data and information products created from satellite data that arrive at a rate of more than four trillion bytes (4 terabytes) per day. More and more information systems are supported by high-performance computing capabilities. The levels of data used by EOSDIS are defined below. For some instruments there is no Level 1B product distinct from the Level 1A product; in these cases, references to Level 1B data can be taken to refer to Level 1A data. Brief definitions follow:

Level 0: Reconstructed, unprocessed instrument/payload data at full resolution, with any and


all communications artifacts (e.g., synchronization frames, communications headers, duplicate data) removed. In most cases these data are provided by EDOS to a Distributed Active Archive Center (DAAC) as Production Data Sets to produce higher-level products.

Level 1A: Reconstructed, unprocessed instrument data at full resolution, time-referenced, and annotated with ancillary information, including radiometric and geometric calibration coefficients and georeferencing parameters (e.g., platform ephemeris), computed and appended but not applied to the Level 0 data.

Level 1B: Level 1A data that have been processed to sensor units (not all instruments have a Level 1B equivalent).

Level 2: Derived geophysical variables at the same resolution and location as the Level 1 source data.

Level 3: Variables mapped on uniform space-time grids, usually with some completeness and consistency.

Level 4: Model output or results from analyses of lower-level data (e.g., variables derived from multiple measurements).

Archiving and distributing the huge amount of data and products is also challenging. NASA Earth Science information is archived at eight DAACs located across the United States. The DAACs specialize by topic area and make their data available to researchers around the world. More details on data production and distribution systems are presented in Chapter 23.

1.8. PRODUCT VALIDATION

One of the key components of information extraction is validation. Without a known accuracy, a product cannot be used reliably and therefore has limited applicability. With various land products available, users need quantitative information on product uncertainties in order to

select the most suitable product, or combination of products, for their specific needs. As remote sensing observations are generally merged with other sources of information or assimilated within process models, evaluation of product accuracy is required. Making quantified accuracy information available to users ultimately provides developers with the feedback needed to improve the products and can enable methods for fusing them into a consistent long-term series of surface status. Land product validation has to rely on ground measurements, which can be time consuming and very expensive. Because of its importance, product validation must involve the efforts of the entire community: sharing validation methodologies, instruments, measured data, and results paves the way to success and progress. One critical issue is the mismatch between ground "point" measurements and kilometer-scale pixel values over heterogeneous landscapes. Up-scaling "point" measurements using high-resolution remotely sensed data is the key to addressing this issue; more details are discussed in the relevant chapters.

1.9. REMOTE SENSING APPLICATIONS

Remote sensing has generated comprehensive, near-real-time environmental data, information, and analyses. It serves a wide range of users and empowers decision makers to respond more effectively to the many environmental challenges facing modern civilization. Figure 1.15 depicts the linkage and flow of information from remote sensing observations and other in situ data to societal benefits. The data can be used for driving, calibrating, and validating models and decision support tools. The last chapter of this book illustrates how different remote sensing


FIGURE 1.15 Linking Earth observations with societal benefits (CENR/IWGEO, 2005).

data and products can be used for monitoring land cover and land use changes and assessing their environmental impacts. GEOSS identified nine societal benefit areas in which clear benefits could be derived from a coordinated global observation system. As illustrated in Figure 1.16, the nine societal benefit areas are:

• Reducing loss of life and property from natural and human-induced disasters.

• Understanding environmental factors affecting human health and well-being.
• Improving management of energy resources.
• Understanding, assessing, predicting, mitigating, and adapting to climate variability and change.
• Improving water resource management through better understanding of the water cycle.
• Improving weather information, forecasting, and warning.

FIGURE 1.16 GEOSS and applications (GEO, 2005).


• Improving the management and protection of terrestrial, coastal, and marine ecosystems.
• Supporting sustainable agriculture and combating desertification.
• Understanding, monitoring, and conserving biodiversity.

Some of these societal benefit areas are themselves complex clusters of issues with many and varied stakeholders. In each area there are observational needs for many variables, with requirements on accuracy, spatial and temporal resolution, and speed of delivery to the user. The societal benefit areas are at widely varying levels of maturity with respect to establishing user needs, defining observation requirements, and implementing coordinated systems. For example, the weather area is very mature, while the health area is relatively immature in the context of Earth observation.

1.10. CONCLUDING REMARKS

Significant progress in engineering has enabled remarkable improvements in platform and sensor systems, visible in many indicators such as signal-to-noise ratios, resolutions, pointing accuracies, geometric and spectroradiometric integrity, and calibration. The immense amounts of data available from satellite observations present great challenges. Considerable investment has gone into developing physical models to understand surface radiation regimes, and some of these models have been incorporated into useful algorithms for estimating land surface variables from satellite observations; however, realistic yet computationally simple surface radiation models suitable for inversion of land surface variables from satellite data are still urgently required. Inversion of land surface parameters is generally a nonlinear ill-posed problem, and the use of more spatial and temporal constraints, by incorporating a priori knowledge and integrating multiple-source data, deserves further research. Although remote sensing data products have been widely used, significant disconnects between remote sensing development and applications continue to exist. Some products developed by remote sensing scientists have not been widely used, and many variables required by land process models and decision support systems have not been generated. Product accuracy and application requirements are not always consistent. Successful applications are not static but evolve as new sensor, data processing, and network technologies emerge. Improved coupling of remote sensing science and applications would be most advantageous.

References

Breaker, L. C. (1990). Estimating and removing sensor-induced correlation from Advanced Very High Resolution Radiometer satellite data. Journal of Geophysical Research, 95, 9701–9711.

CENR/IWGEO (2005). Strategic Plan for the U.S. Integrated Earth Observation System. Committee on Environment and Natural Resources.

Gastellu-Etchegorry, J. P. (2008). 3D modeling of satellite spectral images, radiation budget and energy budget of urban landscapes. Meteorology and Atmospheric Physics, 102, 187–207.

GEO (2005). GEOSS 10-Year Implementation Plan Reference Document. 209 pp.

Jensen, J. R. (2004). Introductory Digital Image Processing. Prentice Hall.

Lewis, P. (1999). Three-dimensional plant modelling for remote sensing simulation studies using the Botanical Plant Modelling System. Agronomie, 19, 185–210.

Liang, S. (2004). Quantitative Remote Sensing of Land Surfaces. New York: John Wiley & Sons.

Liang, S. (Ed.). (2008). Advances in Land Remote Sensing: System, Modeling, Inversion and Application. New York: Springer.

Liang, S. (2007). Recent developments in estimating land surface biogeophysical variables from optical


remote sensing. Progress in Physical Geography, 31, 501–516.

Lu, D., & Weng, Q. (2007). A survey of image classification methods and techniques for improving classification performance. International Journal of Remote Sensing, 28, 823–870.

Mather, P. M., & Magaly, K. (2010). Computer Processing of Remotely Sensed Images: An Introduction. Hoboken, NJ: John Wiley & Sons.


Richards, J. A., & Jia, X. (2005). Remote Sensing Digital Image Analysis: An Introduction. Springer.

Strahler, A. H., Woodcock, C. E., & Smith, J. A. (1986). On the nature of models in remote sensing. Remote Sensing of Environment, 20, 121–139.

Zhang, P., Li, J., Olson, E., Schmit, T. J., Li, J. L., & Menzel, W. P. (2006). Impact of point spread function on infrared radiances from geostationary satellites. IEEE Transactions on Geoscience and Remote Sensing, 44, 2176–2183.