Ecological Informatics 53 (2019) 100984
An effective approach to selecting the appropriate pan-sharpening method in digital change detection of natural ecosystems
Behzad Rayegani a,⁎, Susan Barati b, Hamid Goshtasb a, Hamid Sarkheil c, Javad Ramezani d

a Department of Natural Environment & Biodiversity, College of Environment, Karaj, Iran
b Department of Natural Resources, Isfahan University of Technology, Isfahan, Iran
c Department of Human Environment, College of Environment, Karaj, Iran
d Department of Marine Environment, College of Environment, Karaj, Iran

⁎ Corresponding author. E-mail addresses: [email protected], [email protected] (B. Rayegani).

https://doi.org/10.1016/j.ecoinf.2019.100984
Received 22 February 2019; Received in revised form 18 July 2019; Accepted 19 July 2019; Available online 26 July 2019
1574-9541/ © 2019 Elsevier B.V. All rights reserved.
ARTICLE INFO

Keywords: Fusion techniques; Fused image quality assessment; Spatial quality; Spectral quality

ABSTRACT
In an image fusion process, the spatial resolution of a multispectral image is improved by a panchromatic band. However, due to the spatial and spectral resolution differences between these two data sets, the enhanced image may suffer from two kinds of distortion, spatial and spectral. Therefore, to evaluate the efficiency of a pan-sharpening method, the status of these two types of distortion is examined. Unfortunately, there is still no well-developed acceptance index that can thoroughly assess the quality of a pansharpened image; moreover, most of the methods proposed for reviewing the quality of output images have been developed with an emphasis on residential areas. Accordingly, to assess the quality of the pansharpened images in this study, we evaluated the commonly used approaches, namely visual examination, quantitative evaluation, and impact analysis on the change detection process of mangrove forests. Finally, we suggest a simple yet efficient approach for such research in natural ecosystems. In the proposed method, based on the nature of the ecosystem, a spectral vegetation index is applied to the pansharpened images; the spectral quality of the images is then evaluated based on two parameters: 1) the area under the curve of the histogram of the spectral vegetation index in the natural ecosystem region, and 2) its centroid. The spatial quality of the pansharpened images is evaluated by placing two perpendicular transects on the images of the spectral vegetation index and creating spatial shifts along them. Together with expert review and visual evaluation of the pansharpened images, the proposed method has clear advantages for assessing the quality of fused images, especially in natural ecosystems. Based on the evaluations of the 11 pansharpening methods (Ehlers Fusion, FuzeGo, Gram-Schmidt, HPF, HCS, PCA, Modified IHS, Brovey Transform, Projective Resolution Merge, Wavelet IHS, and Wavelet PCA), the HPF method, followed by the Brovey Transform and Modified IHS methods, showed the best performance in the digital change detection of mangrove forests.
1. Introduction

Having satellite data with both high spectral and high spatial resolution is a requirement of some of the most crucial remote sensing analyses, such as mapping, road network extraction, and natural resource management (Wu et al., 2015). Unfortunately, physical laws do not allow for high spectral and spatial resolution at the same time (Pohl and van Genderen, 2016). Hence, satellites usually store the spectral and spatial information in a multispectral (MS) image and a panchromatic (PAN) image, respectively. Pansharpening is a process in which the spatial resolution of a multispectral image is enhanced by a panchromatic higher-resolution counterpart (Alparone et al., 2015; Pohl and van Genderen, 2016; Vivone et al., 2014). Improvement of spatial
resolution using the PAN band is a downscaling approach for improving spatial, temporal, and spectral resolution. In principle, resolution enhancement by the PAN image is a subset of fusion methods used as a means of increasing spatial resolution (Pohl and van Genderen, 2016; Vivone et al., 2014). All fusion techniques can be categorized into six classes (Pohl and van Genderen, 2016): 1) component substitution; 2) numerical methods; 3) statistical image fusion; 4) multiresolution approaches; 5) hybrid techniques; and 6) other methods. In the component substitution (CS) method, some original bands of the MS image are converted into a new data space (for example, a new color coordinate system), and the PAN band replaces one of the bands of the converted MS image. In the next step, the fused image is obtained by the reverse transform (Alparone et al., 2015; Du et al., 2007; Pohl and van
Genderen, 2016). The most well-known method of this category is the IHS transform, in which the bands are converted into a sphere-like color coordinate system with three components of intensity, hue, and saturation; the PAN band then replaces the intensity band (Alparone et al., 2015; Du et al., 2007; Pohl and van Genderen, 2016; Strait et al., 2008). Due to the broad application of this method (Al-Wassai et al., 2011; Pohl and van Genderen, 2016; Vijayaraj et al., 2004; Vijayaraj et al., 2006), several models have been proposed for the IHS color space (shaped like a cylinder, cone, bicone, hexcone, etc.) that convert the bands into the three components in different ways; these are known as modified IHS methods (Al-Wassai et al., 2011). The hyperspherical color space (HCS) is an alternative component substitution method in which the color space is defined to be suitable for human vision (Pohl and van Genderen, 2016). Principal component substitution (PCS) is another popular CS technique (Pohl and van Genderen, 2016) in which the MS image is transformed into uncorrelated components by principal component analysis. The first component is then replaced by the PAN image, whose histogram has been matched to that of the first component (using the histogram matching method). Finally, the fused image is obtained by the inverse PCA (Pohl and van Genderen, 2016; Strait et al., 2008; Vijayaraj et al., 2004). The Gram-Schmidt transform, also known as the extended form of the PCS method (Pohl and van Genderen, 2016), is another well-known component substitution method. In this method, the PAN image is first simulated from an analysis of the MS image and joins the MS bands. This new dataset is subsequently transformed into a new space by an orthogonal transform called GS (Du et al., 2013; Laben and Brower, 2000). After that, the PAN band is converted by the histogram matching method to have a histogram similar to that of the first channel of the new dataset; it then replaces the first channel in the dataset. Finally, via the inverse GS transform, the fused image is obtained (Du et al., 2013; Klonus and Ehlers, 2009; Pohl and van Genderen, 2016).

The Brovey transform, which belongs to the second category, is considered one of the most successful arithmetic-based fusion techniques, normalizing the input bands using simple arithmetic operations (Pohl and van Genderen, 2016). In this method, the average of the MS bands (usually three bands) is obtained as a normalization index (I); for the fused image, each band is individually multiplied by the PAN image, and the results are divided by the index I (Du et al., 2007; Jensen, 2016; Pohl and van Genderen, 2016; Vijayaraj et al., 2004). The high-pass filter method (HPF) is one of the first arithmetic resolution merge methods and, more than 30 years after its first use, is still employed for spatial resolution enhancement (Alimuddin et al., 2012; Pohl and van Genderen, 2016). This method has three steps: first, a high-pass filter is applied to the PAN image; then, the output image is weighted according to the standard deviation and added to each MS band; ultimately, the histograms of the new bands are matched to those of the original bands (by the histogram matching method) (Pohl and van Genderen, 2016; Zhang and Mishra, 2012).
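As a concrete illustration of the arithmetic techniques described above, the following minimal NumPy/SciPy sketch shows a Brovey-style band combination and a simplified high-pass-filter fusion; the band arrays, the box-filter size, and the standard-deviation-based weight are illustrative assumptions rather than the exact implementations used in the software packages cited here.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def brovey(ms_bands, pan):
    """Brovey transform: each MS band (already resampled to the PAN grid) is
    multiplied by the PAN image and divided by the normalization index I,
    the per-pixel average of the MS bands."""
    ms = np.stack([b.astype(float) for b in ms_bands])   # (bands, rows, cols)
    intensity = ms.mean(axis=0) + 1e-12                  # normalization index I
    return ms * pan / intensity                          # one fused band per input band

def hpf_fusion(ms_bands, pan, size=5):
    """Simplified HPF fusion: high-frequency PAN detail (PAN minus a box-blurred
    PAN) is weighted and added to each MS band.  Here the weight is the ratio of
    standard deviations; the final histogram-matching step is omitted."""
    detail = pan.astype(float) - uniform_filter(pan.astype(float), size=size)
    return [b + (b.std() / (pan.std() + 1e-12)) * detail for b in ms_bands]
```

Both functions assume the MS bands have already been resampled to the PAN pixel grid.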
The Projective Resolution Merge is a geometric method unique to the ERDAS IMAGINE software, in which models of the PAN and MS images are created by sensor-dependent geometric techniques; these models are ultimately combined in such a way that the geometry of the PAN image is maintained. The method works by filtering the PAN image, similarly to the HPF method but with different filters (Geospatial Hexagon, 2015). FuzeGo, which uses the UNB PanSharp algorithm, is one of the most interesting and successful methods of statistical image fusion (third category) (Pohl and van Genderen, 2016). It is based on the creation of a regression equation via the least-squares method (Snehmani et al., 2016). For each band of the MS image, a linear relationship in the form of a regression is established with the PAN image in the parts where their frequency ranges are similar; by applying this regression, the PAN image is converted into the corresponding band of the fused MS image. In this way, the spectral and spatial quality of the fused image is optimized. Because the method is commercial, the details of its algorithm are not publicly available
(Pohl and van Genderen, 2016; Zhang and Mishra, 2014). Each fusion method has its own limitations, which researchers have tried to overcome by combining methods. The Ehlers Fusion method is a hybrid technique (fifth category) in which the MS bands are first transformed into an IHS space (Pohl and van Genderen, 2016; Snehmani et al., 2016). The intensity component (in the IHS space) and the PAN band are then each converted, via the Fourier transformation, into a Fast Fourier Transform (FFT) representation. In the Fourier space, a low-pass filter is applied to the intensity FFT and a high-pass filter to the PAN image, the modified FFT file is stored, and the inverse Fourier operation is performed in order to obtain the enhanced PAN image (the original intensity image is no longer used). The enhanced PAN image replaces the intensity image, and the inverse IHS transform provides the fused MS image (Ehlers, 2005; Ehlers et al., 2010; Klonus and Ehlers, 2009; Pohl and van Genderen, 2016). The Wavelet-IHS is a hybrid method combining IHS with the Wavelet Transform (WT) (category 4). In this method, the MS image is first converted into an IHS space. Subsequently, the intensity image is decomposed by the Wavelet Transform so as to have the same pixel size as the PAN image. The PAN image is then matched to the intensity image and likewise decomposed by the WT. In the next step, using a weighted combination, the intensity decompositions are combined with the PAN decompositions. In the final stage, the IHS components are recovered via the inverse Wavelet transform, and the fused image is eventually obtained through the inverse IHS transformation (Pohl and van Genderen, 2016; Zhang and Mishra, 2014). The Wavelet PCA method is similar to the Wavelet IHS method, except that PCA is used instead of IHS (Pohl and van Genderen, 2016).

Due to the multiplicity of fusion methods, evaluating the quality of the pansharpened image (the fused MS image) is one of the most controversial aspects of remote sensing (Alimuddin et al., 2012; Amro et al., 2011; Du et al., 2007; Jawak and Luis, 2013; Klonus and Ehlers, 2009; Pohl and van Genderen, 2016; Strait et al., 2008; Vijayaraj et al., 2004; Vijayaraj et al., 2006; Vivone et al., 2014; Zhang et al., 2016; Zhang and Mishra, 2012). According to signal-to-noise ratio (SNR) considerations, higher signal quality in images requires an inverse relationship between spectral and spatial resolution; a PAN image with higher spatial resolution therefore does not have spectral diversity (Pohl and van Genderen, 2016; Vivone et al., 2014). Due to the spatial and spectral resolution differences between these two datasets, the enhanced image may suffer from two types of distortion, spatial and spectral, in the pansharpening process (Johnson, 2014; Pohl and van Genderen, 2016; Snehmani et al., 2016; Strait et al., 2008). Therefore, in evaluating pansharpening approaches, the status of these two types of distortion is analyzed with respect to image quality (Amro et al., 2011; Pohl and van Genderen, 2016; Strait et al., 2008; Zhang et al., 2016). Unfortunately, there is no well-developed acceptance index that can thoroughly examine the quality of the pansharpened image (Pohl and van Genderen, 2016; Zhang and Mishra, 2012), and most methods proposed for reviewing the quality of fused images have been developed for residential areas.
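Although the details of the UNB PanSharp algorithm behind FuzeGo are not public, the general least-squares regression idea used by the statistical fusion methods (third category) can be sketched as follows; fitting each resampled MS band directly against the full PAN image, rather than within matched frequency ranges, is a simplification of this sketch, and the function and variable names are illustrative.

```python
import numpy as np

def regression_fusion(ms_bands, pan):
    """Schematic least-squares fusion: for each MS band (resampled to the PAN
    grid), fit band ~ a*PAN + c by least squares, then apply the fitted
    coefficients to the full-resolution PAN to synthesize the fused band."""
    x = pan.astype(float).ravel()
    design = np.column_stack([x, np.ones_like(x)])       # [PAN, constant] regressors
    fused = []
    for band in ms_bands:
        coeffs, *_ = np.linalg.lstsq(design, band.astype(float).ravel(), rcond=None)
        fused.append(coeffs[0] * pan + coeffs[1])        # apply the regression to PAN
    return fused
```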
There are currently two types of procedures for assessing the quality of the pansharpened image: qualitative methods, based on visual examination of the pansharpened MS image and its comparison with the original MS image (Amro et al., 2011; Zhang and Mishra, 2012), and quantitative approaches, which use quantitative and statistical assessments to evaluate the quality of the fused MS image (Alparone et al., 2015; Amro et al., 2011; Pohl and van Genderen, 2016; Strait et al., 2008). Other logical approaches also exist, such as evaluating the effect on classification and on land cover and land use maps (Bovolo et al., 2010; Pohl and van Genderen, 2016). In this research, changes in mangrove forests as natural ecosystems were considered. Examining the commonly used methods for quantitative and qualitative assessment of pansharpened images shows that, in most cases, the whole image, mainly man-made areas, is considered (Alimuddin et al., 2012; Du et al., 2007; Jawak and Luis, 2013; Pohl and van Genderen, 2016; Strait et al., 2008; Vijayaraj et al., 2004; Vivone et al., 2014). However, little effort has
been made to establish a fused image quality assessment framework for natural ecosystems. On the other hand, in conventional statistical methods, the criterion for the superiority of a pansharpening method is the similarity of each output to the MS image whose pixels have been resampled to the exact pixel size of the PAN image (Pohl and van Genderen, 2016; Strait et al., 2008). Therefore, this research seeks to determine whether, in areas where changes in a homogeneous ecosystem such as mangrove forests are investigated, the conventional indices for spectral and spatial distortion can serve as a proper qualitative and quantitative measure of the fused images. Moreover, is it possible to provide an efficient method for selecting the appropriate pansharpening method in these areas?
2. Materials and methods

2.1. Study area

The study area is the mangrove forests of the Hara International Wetland, one of the UNESCO Biosphere Reserves in Iran. The mangrove forests are located on the southern coast of Iran, particularly on and near the island of Qeshm in the Persian Gulf. These forests are situated in a subtropical climate with an annual rainfall of 100–300 mm, at sea level or at most 6 m above it, and are considered among the best mangrove forests in Iran. Fig. 1 shows the position of the Hara International Wetland relative to the province of Hormozgan and Qeshm island.

2.2. Methodology

2.2.1. Data
In this study, OLI (Landsat 8) and ETM+ (Landsat 7) data were used for the analysis because of their wide range of applications (Estoque and Murayama, 2015; Giri, 2016; Long et al., 2016; Zhu and Woodcock, 2014) and the successful experiences associated with the use of the Landsat panchromatic image (Johnson, 2014). Considering the effect of tidal volume on the estimated areas of the mangrove forests (Carney et al., 2014; Giri et al., 2008; Jahari et al., 2011; Lee and Yeh, 2009), the 2015 and 2001 Landsat images were selected with similar dates and the same tidal state. Considering the temperature, the tidal state, and visual examination of the Landsat images, the most suitable time for comparison was found to be March. Accordingly, we selected the OLI data of 02/24/2015 and the ETM+ data of 03/03/2001.

2.2.2. Image preparation
In this study, because image algebra was used for change detection, which requires absolute radiometric correction (Jensen, 2016), and given the multitemporal analysis, absolute atmospheric correction and extraction of reflectance values were performed using the ATCOR module (Aosier et al., 2005; Campbell and Wynne, 2011; Jensen, 2005; Liang et al., 2012) in ERDAS IMAGINE 2014. This correction was done for the MS and PAN images of the region via the ATCOR2 module owing to the flatness of the study area. For geometric correction, the OLI data were first geometrically corrected using ground control points (Jensen, 2005; Koch and Mather, 2013), and the ETM+ data were then registered to them (Jensen, 2005, 2016). Examination of the PAN image of the ETM+ sensor revealed salt-and-pepper noise (Acton, 2012; Schowengerdt, 2006; Unal et al., 2016). The Fourier Transform (Schowengerdt, 2006), Principal Component Analysis (Schowengerdt, 2006), and the median filter (Unal et al., 2016) were tested to eliminate the noise. At the pansharpening stage, the median-filtered PAN image was used for the ETM+ sensor owing to the suitability of the median filter output.
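For the salt-and-pepper noise found in the ETM+ PAN band, a median filter of the kind selected here can be applied with SciPy as in the minimal sketch below; the 3 x 3 window size is an assumption, since the window actually used is not stated.

```python
import numpy as np
from scipy.ndimage import median_filter

def despeckle_pan(pan, size=3):
    """Suppress salt-and-pepper noise in a PAN band with a moving median window."""
    return median_filter(np.asarray(pan, dtype=float), size=size)
```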
Fig. 1. The study area (the Hara International Wetland).
2.2.3. Spatial enhancement using fusion methods
In this study, 11 different pansharpening methods were applied via the FuzeGo, ENVI, and ERDAS IMAGINE software packages: Ehlers Fusion (Ehlers, 2005; Ehlers et al., 2010; Klonus and Ehlers, 2007, 2009); FuzeGo (Snehmani et al., 2016; Zhang et al., 2016; Zhang and Mishra, 2014); Gram-Schmidt (Aiazzi et al., 2009; Klonus and Ehlers, 2009; Laben and Brower, 2000; Wu et al., 2015); High Pass Filter (HPF) (Alimuddin et al., 2012; Amro et al., 2011; Jawak and Luis, 2013; Zhang and Mishra, 2012; Zhang and Mishra, 2014); Hyperspherical Color Space (Padwick and Deskevich, 2014; Wu et al., 2015); Modified IHS (Al-Wassai et al., 2011; Strait et al., 2008; Vijayaraj et al., 2004; Vijayaraj et al., 2006; Wu et al., 2015); Principal Component (Loncan et al., 2015; Strait et al., 2008; Vijayaraj et al., 2004; Vijayaraj et al., 2006; Wu et al., 2015); Brovey Transform (Du et al., 2007; Ehlers et al., 2010; Vijayaraj et al., 2004; Vijayaraj et al., 2006; Vivone et al., 2014); Projective Resolution Merge (Geospatial Hexagon, 2015; Lindgren and Kilston, 1996); Wavelet IHS; and Wavelet PCA (Strait et al., 2008; Vijayaraj et al., 2006). It should be noted that the resampling method was selected according to the manual of each fusion method; however, according to certain studies, the resampling type does not have a noticeable visual effect on the final pansharpened image (Jawak and Luis, 2013). Moreover, in the case of the ETM+ sensor, the median-filtered PAN image (Unal et al., 2016) was used.
2.2.4. Quality assessment of fusion
To evaluate the quality of the pansharpened image in the present research, visual inspection, quantitative evaluation, and impact analysis were employed in the process of change detection in mangrove forests. In the visual inspection, personal experience was used (Pohl and van Genderen, 2016), and the existence of any anomalies in the output images was examined (Pohl and van Genderen, 2016; Zhang and Mishra, 2012). In the quantitative evaluation, statistical distortion indices, including the correlation coefficient, RMSE, RASE, ERGAS, and Zhou's protocol, were used (Alimuddin et al., 2012; Jawak and Luis, 2013; Pohl and van Genderen, 2016; Strait et al., 2008). To assess the spatial quality of the fused MS image (Table 1), the correlation coefficient of the Laplacian filter outputs was further used (Pohl and van Genderen, 2016). To use these indices, the original MS image was first resampled to the spatial resolution of the PAN image (Pohl and van Genderen, 2016). It should be noted that when an MS image is resampled to the scale of the PAN image its spatial frequency decreases (Stathaki, 2011); however, most statistical image distortion indices need a pixel in the MS image for each pixel in the PAN image (Pohl and van Genderen, 2016), hence the inevitability of resampling. According to Table 1, the equations were applied between the resampled and the fused MS image. It is noteworthy that, because of the salt-and-pepper noise in the PAN image of the ETM+ sensor and its need for correction, these indices were only applied to the OLI sensor.
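As an illustration, the distortion indices of Table 1 below can be computed with a few lines of NumPy/SciPy. The sketch assumes the fused and resampled bands are already co-registered arrays of equal size and follows the formulations of Table 1 (including the band mean used in ERGAS and the 3 x 3 Laplacian kernel); it is not the implementation used in the study.

```python
import numpy as np
from scipy.ndimage import convolve

LAPLACIAN = np.array([[1, 1, 1], [1, -8, 1], [1, 1, 1]], dtype=float)

def cc(fused, ref):
    """Correlation coefficient between the fused and the resampled band."""
    return np.corrcoef(fused.ravel(), ref.ravel())[0, 1]

def rmse(fused, ref):
    return np.sqrt(np.mean((fused - ref) ** 2))

def rase(fused_bands, ref_bands):
    """Relative average spectral error over all bands (percent)."""
    mu = np.mean(ref_bands)                                    # average value of all MS bands
    return 100.0 / mu * np.sqrt(np.mean([rmse(f, r) ** 2
                                         for f, r in zip(fused_bands, ref_bands)]))

def ergas(fused_bands, ref_bands, d_pan, d_ms):
    """ERGAS; d_pan and d_ms are the PAN and MS pixel sizes (e.g. 15 m and 30 m)."""
    terms = [(rmse(f, r) / np.mean(f)) ** 2 for f, r in zip(fused_bands, ref_bands)]
    return 100.0 * d_pan / d_ms * np.sqrt(np.mean(terms))

def zhou_spatial(fused, ref):
    """Zhou's spatial quality: correlation of the Laplacian-filtered images."""
    return cc(convolve(np.asarray(fused, float), LAPLACIAN),
              convolve(np.asarray(ref, float), LAPLACIAN))
```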
Table 1
The indices used to assess the spectral and spatial quality of the fused MS images.

Correlation Coefficient (CC) (spectral quality):
$r_{fr} = (f_1^{*} r_1^{*} + f_2^{*} r_2^{*} + \dots + f_n^{*} r_n^{*})/n$
where $r_{fr}$ is the correlation coefficient, $f_i^{*}$ and $r_i^{*}$ are the normalized(a) pixel values of the fused and the resampled MS image, and $n$ is the total number of pixels.

RMSE (spectral quality):
$\mathrm{RMSE} = \sqrt{\frac{1}{m \times n}\sum_{x=1}^{m}\sum_{y=1}^{n}\left(f(x,y) - r(x,y)\right)^{2}}$
where $f(x,y)$ and $r(x,y)$ are the pixel values of the fused and the resampled MS image, and $m$ and $n$ are the numbers of image lines and columns.

RASE (spectral quality):
$\mathrm{RASE} = \frac{100}{\mu}\sqrt{\frac{1}{b}\sum_{i=1}^{b}\mathrm{RMSE}(f_i, r_i)^{2}}$
(total RMSE of all bands), where $\mu$ is the average value of all MS bands and $b$ is the number of bands.

ERGAS (spectral quality):
$\mathrm{ERGAS} = 100\,\frac{d_h}{d_l}\sqrt{\frac{1}{b}\sum_{i=1}^{b}\left(\frac{\mathrm{RMSE}(f_i, r_i)}{\mu(f_i)}\right)^{2}}$
(the total RMSE ratio of each band to its average value), where $d_h$ is the spatial resolution of the PAN image and $d_l$ the spatial resolution of the MS image.

Zhou's distortion (spectral quality):
$D_i = \frac{1}{m \times n}\sum_{R,C}\left|f_i(R,C) - r_i(R,C)\right|$
the average absolute value of the difference between the pixel values of the resampled and the fused MS image.

Correlation Coefficient of Zhou's protocol (spatial quality):
$r_{fr}^{f} = (f_1^{f} r_1^{f} + f_2^{f} r_2^{f} + \dots + f_n^{f} r_n^{f})/n$
where $f_i^{f}$ and $r_i^{f}$ are the normalized(a) pixel values of the fused and the resampled MS image after applying the Laplacian filter $\begin{bmatrix} 1 & 1 & 1 \\ 1 & -8 & 1 \\ 1 & 1 & 1 \end{bmatrix}$.

(a) Normalization was done via the global mean ($\bar{X}$) and the global standard deviation ($SD_X$) of the image using $X_t^{*} = (X_t - \bar{X})/SD_X$, where $X_t$ is the original pixel value.

2.2.5. Digital change detection and examination of the performance of the fusion techniques

2.2.5.1. Change detection. As mentioned, the effect of the fusion techniques on change detection was evaluated as one of the methods for assessing the performance of the pansharpening method. For this purpose, the pixel values were compared with algebra operations so as to determine the changes (Eastman, 2012, 2015a; Jensen, 2005, 2016). In this approach, the images were enhanced using conventional vegetation spectral indices (Bahrami Nejad et al., 2018; Barati et al., 2011; Giri et al., 2007; Lee and Yeh, 2009; Liu et al., 2008; Lu et al., 2004; Rayegani et al., 2017), among which the NDVI showed the best performance. Accordingly, the NDVI index, which has also proved useful in similar studies (Giri et al., 2007; Lee and Yeh, 2009; Liu et al., 2008; Pettorelli, 2013), was selected. Mangrove forests were separated by a combination of unsupervised classification and thresholds applied to the NDVI index in each of the two epochs. Next, using the Boolean "OR" operator, these two mangrove forest maps were combined to obtain a mask of the mangrove forests. In all of the following steps, this mask was used to prevent systematic error. To determine the changes, the two sensor data sets were first atmospherically corrected via the ATCOR2 module to
obtain the surface reflectance values; the 11 fusion methods were then applied to these images. In the next step, the NDVI spectral index was obtained separately for each fusion method; after that, the NDVI difference between the two sensors was calculated in the mangrove forests (using the mask). The mean and standard deviation of the difference image in the mangrove forests were then estimated. Finally, thresholds of the mean plus and minus twice the standard deviation were used to detect change (Eastman, 2012, 2015a; Eastman, 2015b; Jensen, 2005, 2016): unchanged area ($\mu_{difNDVI} - 2SD_{difNDVI} \le NDVI_{dif} \le \mu_{difNDVI} + 2SD_{difNDVI}$); increased area ($NDVI_{dif} > \mu_{difNDVI} + 2SD_{difNDVI}$); and decreased area ($NDVI_{dif} < \mu_{difNDVI} - 2SD_{difNDVI}$).

2.2.5.2. Accuracy assessment. Each digital change detection method usually has three outputs: unchanged, increased, and decreased regions. We considered three repetitions for each output, so for each fusion method 9 points were randomly selected; accordingly, for the 11 fusion methods employed, 99 points were randomly selected. However, certain locations fell at the edge of the mangrove forest, where water is mixed with vegetation, and were therefore removed from the process. Finally, eight locations were set aside, and 91 points were randomly selected (Congalton and Green, 2008) to obtain the reference data for an error matrix. The error matrix was then formed, and the total accuracy and kappa coefficient for each fusion method were obtained separately (Campbell and Wynne, 2011; Congalton and Green, 2008; Eastman, 2015a; Jensen, 2005, 2016).
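The mean +/- 2 SD thresholding of Section 2.2.5.1 and the error-matrix summary of Section 2.2.5.2 can be expressed compactly as in the sketch below; the array shapes, the mask convention, and the function names are illustrative assumptions.

```python
import numpy as np

def classify_change(ndvi_t1, ndvi_t2, mask):
    """Label change inside the mangrove mask with the mean +/- 2*SD rule:
    +1 = increased, -1 = decreased, 0 = unchanged (or outside the mask)."""
    diff = ndvi_t2 - ndvi_t1
    mu, sd = diff[mask].mean(), diff[mask].std()
    change = np.zeros(diff.shape, dtype=np.int8)
    change[mask & (diff > mu + 2 * sd)] = 1
    change[mask & (diff < mu - 2 * sd)] = -1
    return change

def accuracy_and_kappa(confusion):
    """Overall accuracy and kappa coefficient from an error (confusion) matrix."""
    c = np.asarray(confusion, dtype=float)
    total = c.sum()
    po = np.trace(c) / total                                   # observed agreement
    pe = np.sum(c.sum(axis=0) * c.sum(axis=1)) / total ** 2    # chance agreement
    return po, (po - pe) / (1.0 - pe)
```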
Fig. 2. Main parameters of the spectral quality index.
2.2.6. Proposing a new image quality index appropriate to digital change detection of natural ecosystems
Because the various fusion methods behave differently, it is necessary to normalize the output data of each process in order to compare the fused images. However, normalization based on the mean and standard deviation cannot provide a general assessment of the technique. On the other hand, plants are the most critical part of a natural ecosystem (Chapin et al., 2011), and their assessment in many cases determines the efficiency and health of an ecosystem. Therefore, a spectral vegetation index is used in this proposal (Jawak and Luis, 2013). Spectral vegetation indices such as NDVI normalize the sensor image bands (Eastman, 2012, 2015a) and simultaneously describe vegetation conditions in the landscape (Chapin et al., 2011; Pettorelli, 2013). When the histogram of a spectral vegetation index is plotted, the area under the curve of the histogram can be a good measure of productivity (Ivits et al., 2013), and its average is a standard for ecosystem function (Stoms and Hargrove, 2000). Therefore, in our proposal, two criteria were employed to evaluate the spectral quality of the fused image: the area under the histogram curve of the NDVI and the NDVI value at the centroid of the histogram curve in the mangrove forest (Eqs. (1) and (2)):

$A = \int f(x)\,dx$   (1)

$\bar{x} = \frac{\int x\,dA}{\int dA}$   (2)

where the main parameters are shown in Fig. 2. As shown in Fig. 2, the area under the histogram curve of the NDVI can easily be estimated. For this purpose, to determine the area under the curve, the histogram is divided into trapezoids based on the intervals. To calculate the NDVI value at the centroid of the curve, it is sufficient to multiply the centroid of each trapezoid by its area, according to Eq. (2), and divide their sum by the total area. Because of the salt-and-pepper noise in the PAN band of the ETM+ sensor and its need for correction, the two proposed criteria based on Eqs. (1) and (2) were only applied to the OLI data. Given the importance of productivity and ecosystem functioning, and for a general assessment, these two criteria were converted to relative changes and their mean was obtained (Eq. (3)):

$D_r = \frac{1}{2}\times\left(\frac{\left|\int f(NDVI_f)\,dx - \int f(NDVI_r)\,dx\right|}{\int f(NDVI_r)\,dx} + \frac{\left|C_{fNDVI} - C_{rNDVI}\right|}{C_{rNDVI}}\right)$   (3)

where f represents the fused MS image, r represents the original MS image, and C is the NDVI value at the centroid of the histogram (Eq. (2)).

Concerning the study of a natural ecosystem, the use of Zhou's protocol and the Laplacian filter does not seem to provide a suitable benchmark for evaluating the spatial quality of the fused MS images. Therefore, to evaluate the spatial quality, two perpendicular transects were placed on all NDVI images. The correlation between the original NDVI and the NDVIs of the fused images was then obtained along these transects. In the next step, in order to detect spatial shifts resulting from the fusion method, the horizontal transect (along the x-axis) was shifted, only on the NDVIs of the fused images, by one pixel and then two pixels to the right, and the correlation coefficients were obtained under these conditions. Next, the horizontal transect was shifted, again only on the NDVIs of the fused images, by one pixel and then two pixels to the left, and the correlation coefficients were obtained again. Assuming that a displacement introduced by the fusion method reduces the correlation, we identified the fusion methods for which a correlation coefficient obtained after shifting the pixels was higher than the initial correlation coefficient; the difference from the initial correlation was considered as a criterion for assessing the spatial quality. For the vertical transect, again only on the NDVIs of the fused images, the same shifts were applied as one pixel up, two pixels up, one pixel down, and two pixels down, under the same assumption. Finally, the sum of these two values (the differences between the correlation coefficients) was considered as the spatial quality index of the fused images. Moreover, to have a benchmark for comparison, Zhou's protocol (Pohl and van Genderen, 2016; Vivone et al., 2014) was also employed to examine the spatial quality of the fused images.
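For illustration, the two proposed spectral criteria (Eqs. (1)-(3)) and the transect-shift test can be computed as sketched below; the number of histogram bins, the NDVI range, and the use of a single row and column as the two transects are assumptions of the sketch rather than choices stated in the text.

```python
import numpy as np

def histogram_area_and_centroid(ndvi_values, bins=50, value_range=(-1.0, 1.0)):
    """Area under the NDVI histogram curve (Eq. (1)) and the NDVI value at its
    centroid (Eq. (2)), using the trapezoids formed between bin centres."""
    counts, edges = np.histogram(ndvi_values, bins=bins, range=value_range)
    x = 0.5 * (edges[:-1] + edges[1:])
    f0, f1 = counts[:-1].astype(float), counts[1:].astype(float)
    dx = np.diff(x)
    areas = 0.5 * (f0 + f1) * dx                                    # trapezoid areas
    with np.errstate(divide="ignore", invalid="ignore"):
        xc = x[:-1] + dx * (f0 + 2.0 * f1) / (3.0 * (f0 + f1))      # trapezoid centroids
    keep = areas > 0
    area = areas.sum()
    centroid = np.sum(xc[keep] * areas[keep]) / area
    return area, centroid

def spectral_quality_dr(ndvi_fused, ndvi_original, **kw):
    """Proposed spectral index D_r (Eq. (3)): mean relative change of the
    histogram area and of its centroid, fused versus original NDVI."""
    a_f, c_f = histogram_area_and_centroid(ndvi_fused, **kw)
    a_r, c_r = histogram_area_and_centroid(ndvi_original, **kw)
    return 0.5 * (abs(a_f - a_r) / a_r + abs(c_f - c_r) / c_r)

def transect_shift_index(ndvi_fused, ndvi_original, row, col, max_shift=2):
    """Proposed spatial index: any gain in correlation obtained by shifting a
    horizontal and a vertical transect of the fused NDVI by up to `max_shift`
    pixels, summed over the two transects (lower is better)."""
    def gain(ref_line, fused_line):
        base = np.corrcoef(ref_line, fused_line)[0, 1]
        best = base
        for s in range(1, max_shift + 1):
            best = max(best,
                       np.corrcoef(ref_line[:-s], fused_line[s:])[0, 1],
                       np.corrcoef(ref_line[s:], fused_line[:-s])[0, 1])
        return max(0.0, best - base)
    return (gain(ndvi_original[row, :], ndvi_fused[row, :]) +
            gain(ndvi_original[:, col], ndvi_fused[:, col]))
```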
3. Results & discussion

3.1. Visual inspection

Fig. 3 shows the performance of the fusion methods used in this research for the study area (OLI sensor). According to the results, there appears to be a contrast distortion in the Gram-Schmidt method, although the overall quality of this method is not bad.
Fig. 3. Visual inspection of the fused MS images in the mangrove forests (OLI Sensor).
Anomalies were observed in the Hyperspherical Color Space (HCS) method in the water body adjacent to the mangroves; in some places, the effect of pepper (dark) pixels is evident in the mangrove forests (Fig. 3). In both Wavelet methods, the visual quality was poor,
especially on the forest edges (as shown in Fig. 3). Given the difficulty of judging the fused MS image quality, instead of emphasizing the components of image interpretation such as shape, texture, and color, a comparative visualization of NDVI histograms was used in the mangrove forest area (using the mangrove mask).
Fig. 4. NDVI histograms related to different fusion methods in the mangrove forest and their comparison with NDVI histogram associated with the original MS image (OLI sensor).
In Fig. 4, the NDVI histograms of the fusion methods are compared with the NDVI histogram of the original MS image. As shown in this figure, the Ehlers, FuzeGo, and Projective methods are the least similar to the original MS image. In the case of Gram-Schmidt and PCA, the difference from the original histogram is also evident. However, the Brovey, HCS, and Modified IHS methods most closely resemble the histogram of the original image.
3.2. Comparison of the performance of the fusion methods regarding change detection in mangrove forest

Table 2 shows the accuracy of the change detection for the various fusion methods. As observed, the lowest levels of accuracy belong to the Ehlers and Gram-Schmidt methods, respectively. According to the NDVI histograms in Fig. 4, these two methods differ markedly from the NDVI histogram of the original MS image, which explains this weak performance. Also, Brovey Transform, HCS, and Modified IHS, whose histograms are closest to the original NDVI (Fig. 4), show the same behaviour and accuracy. In the algebra-based change detection used in this study, the difference between the two NDVIs is calculated, so the histogram shape of this difference image differs from the initial ones, and the accuracy of the detected changes depends on the structure of this new histogram (Eastman, 2015a; Jensen, 2016). The more this shape resembles a normal distribution, the more likely an increase in the accuracy of the change detection becomes (Eastman, 2015a). Fig. 5 shows the difference histograms of the NDVI index in the mangrove forests. As shown in Fig. 5, the differential histograms of the HCS, HPF, Brovey Transform, and Modified IHS methods are quite similar to the differential histogram of the original image. This similarity shows why the accuracies of these methods are identical to each other. Even despite the inappropriate single-date histogram of the Projective Resolution Merge method (Fig. 4), its differential histogram (Fig. 5) has a reasonably normal distribution and is close to the differential histogram pattern of the original image, hence the relatively high accuracy of the change detection in this method. Another point is the differential histogram of the FuzeGo method, which shows a symmetric, non-jagged distribution, indicating why this method performs appropriately.
3.3. Conventional spatial and spectral fused image quality indices

Table 3 shows the results of the quantitative and qualitative evaluation of the fused MS images (OLI images). Given that a natural ecosystem was evaluated in the present study, the indices were only calculated for the bands commonly used in spectral vegetation indices, and the overall values were also based on the same bands (NIR, RED, and GREEN). The color of each column in Table 3 was chosen to indicate better values in green and worse values in red. According to this color scheme, the results of the indices were so varied (Table 3) that the performance of the fusion methods cannot practically be discussed in detail. In general, however, more repetitions of green indicate the better performance of a method, and vice versa. Accordingly, the most optimal spectral quality in Table 3 is related to the two wavelet methods. Also, based on Zhou's spatial distortion, the two wavelet methods, followed by the Modified IHS method, had better spatial quality than the other methods. In general, the results of this section show significant differences with previous
findings, particularly in the case of methods with similar NDVI histograms (Fig. 4), and, accordingly, these methods show a similar performance.

Table 2
Accuracy assessment of different methods of fusion.
3.4. Proposed spatial and spectral fused image quality indices

Table 4 shows the outputs of the proposed index for evaluating the spectral quality of the fused image. According to Table 4, HCS, Brovey Transform, Modified IHS, and HPF showed the most optimal performance, a result expected from Fig. 4, whereas FuzeGo and Ehlers, whose NDVI histograms differ from that of the original image (Fig. 4), showed the worst performance.
Fig. 5. NDVI difference histograms related to the different fusion methods in the mangrove forest and their comparison with the NDVI difference histogram of the original MS image.
Table 3
Results of quality indices: the spectral distortion indices (CC, RMSE, RASE, ERGAS, and Zhou's distortion) and the spatial index (the correlation coefficient of Zhou's protocol), computed for the GREEN, RED, and NIR bands and in total for each of the 11 fusion methods.
In order to carry out a more comprehensive evaluation of the suggested spectral quality index, and to recognize the functional similarity of fusion methods, we tested the index in separate populations of the mangrove forest. For this purpose, around 42 separate populations were visually identified based on water movement in the mangrove forests. After that, 21 populations (50%) were randomly selected, on which the proposed index was tested. Fig. 6 shows the position of these 21 selected mangrove populations. Table 5 shows the statistical summary of the proposed spectral quality index in the 21 populations, where the best performance belongs to the two Wavelet methods and HPF, and the worst is associated with the Ehlers and FuzeGo fusion methods. To compare the means of the fusion methods based on the proposed index, it was first necessary to examine the normality of the data (Pena-Rodriguez, 2013). Due to the combination of water and vegetation on the edges of the mangrove forest, as shown in Fig. 4, the histogram of the populations does not follow a normal distribution for the different pansharpening methods. Nevertheless, a one-sample Kolmogorov-Smirnov test was used to test the normality of the distribution (Hinton et al., 2004), and, as expected, the null hypothesis was rejected. Therefore, with the help of IBM SPSS Statistics 25, nonparametric tests (independent samples) were performed to compare the methods (George and Mallery, 2016). The null hypothesis (the distribution of the
proposed index is the same across different methods) was rejected here as well. Even following the removal of the four methods that showed different performance from the others (Ehlers, Gram-Schmidt, FuzeGo, Projective), the null hypothesis was also rejected. Since the histogram of the mangrove forest is not entirely different from a normal distribution (the skewness is about −0.5 and the kurtosis about 0.15), one-way ANOVA was also used to compare the fusion methods, since it is more powerful in data analysis (George and Mallery, 2016). The null hypothesis that the method means are equal was rejected as well, even after removing the four different methods (Ehlers, Gram-Schmidt, FuzeGo, Projective). Table 6 shows the results of the one-way ANOVA based on the Duncan method (Bethea, 2018). Based on this table, Ehlers, FuzeGo, and Projective are significantly different from the other methods; HPF, Wavelet IHS, and Wavelet PCA show similar performance; and Brovey Transform, Modified IHS, HCS, PCA, and Gram-Schmidt have the same means. Table 7 shows the outputs of the proposed index for evaluating the spatial quality of the fused images. Interestingly, there is a significant difference between the main CC and the CC of the transect shifted one pixel to the right in both Wavelet methods, indicating the highest error based on the proposed index. This difference, as noted in the visual inspection results, is also evident in the visual evaluations. Based on this assessment, most fusion methods, except for the Wavelets, have similar spatial performance.
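The statistical comparison described above (a normality check followed by nonparametric and parametric tests across the 21 populations) can be reproduced in outline with SciPy; the Kruskal-Wallis test stands in here for the SPSS independent-samples nonparametric test, and the Duncan post-hoc grouping of Table 6 is not reproduced.

```python
import numpy as np
from scipy import stats

def compare_methods(index_by_method):
    """index_by_method: mapping of fusion-method name -> the 21 values of the
    proposed spectral quality index (illustrative data structure)."""
    samples = [np.asarray(v, dtype=float) for v in index_by_method.values()]
    pooled = np.concatenate(samples)
    # One-sample Kolmogorov-Smirnov test against a fitted normal distribution
    ks = stats.kstest(pooled, "norm", args=(pooled.mean(), pooled.std(ddof=1)))
    # Nonparametric comparison of the methods (Kruskal-Wallis)
    kruskal = stats.kruskal(*samples)
    # Parametric comparison (one-way ANOVA)
    anova = stats.f_oneway(*samples)
    return {"ks_p": ks.pvalue, "kruskal_p": kruskal.pvalue, "anova_p": anova.pvalue}
```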
Table 4
Results of the proposed spectral quality index for each of the 11 fusion methods.
Fig. 6. The position of the 21 selected mangrove populations to test the suggested spectral quality index.
Nevertheless, the best spatial quality is related to methods whose histograms are similar to that of the original image (Fig. 4). Therefore, the extent to which the histogram shape of each fusion method matches the original histogram can also indicate the spatial quality of the images, which seems logical. Table 8 shows the assessment results for the spatial quality of the NDVI images based on Zhou's protocol. In contrast to the method proposed in this study, this assessment showed that HCS has the worst performance, and that the two Wavelet methods behave similarly to most other methods. In the case of HCS, due to the pepper effect and the mechanism of Zhou's protocol (Pohl and van Genderen, 2016), the output is justifiable. However, in the case of the Wavelets, according to the visual
assessment, this analysis does not seem to be correct, and our proposed method for evaluating spatial quality is closer to reality.

4. Conclusions

A correct and reliable assessment of the quality of the fused MS images is a major challenge in remote sensing analysis (Alimuddin et al., 2012; Amro et al., 2011; Jawak and Luis, 2013; Klonus and Ehlers, 2009; Pohl and van Genderen, 2016; Strait et al., 2008; Vijayaraj et al., 2004; Vijayaraj et al., 2006; Vivone et al., 2014). In certain cases, the results of such quantitative evaluations are so misleading and diverse that they discourage researchers from using the quality indices (Zhang and Mishra, 2012; Zhang and Mishra, 2014).
Table 5 Summary of the proposed index in 21 populations.
Because most fusion methods have been developed for commercial satellites with urban applications, many of the methods suggested for assessing the quality of fused images have also been proposed for urban environments (Ehlers, 2005; Jawak and Luis, 2013; Pohl and van Genderen, 2016; Strait et al., 2008). Hence, in the field of natural environmental studies, there are still significant deficiencies. The present proposal focuses on natural ecosystems, so its exact form can change with the type of environment. In other words, because the proposed method is based on the use of spectral vegetation indices (VIs), a suitable VI can be selected according to the type of ecosystem and the method implemented accordingly.
Table 6
Summary of one-way ANOVA of the proposed spectral quality index in 21 populations (Duncan(a) test; subsets for alpha = 0.05).

Method          N    Subset 1   Subset 2   Subset 3   Subset 4   Subset 5
Wavelet IHS     21   0.0573
HPF             21   0.0730
Wavelet PCA     21   0.0761
Brovey          21              0.106
Modified IHS    21              0.106
HCS             21              0.112
PCA             21              0.127
Gram-Schmidt    21              0.136
Projective      21                         0.176
Ehlers          21                                    0.699
FuzeGo          21                                               0.827
Sig.                 0.060      0.213      1.000      1.000      1.000

Means for groups in homogeneous subsets are displayed. (a) Uses Harmonic Mean Sample Size = 21.000.
Because plants are the most crucial living component of a natural ecosystem, in our proposed method the performance of the fusion method was evaluated based on this component. By applying a spectral vegetation index to the output images of the pansharpening methods and observing their histograms, an indicator was obtained to compare the behaviour of each method. For this indicator, two statistical criteria were proposed and used to assess the fused image quality. Image quality evaluation was straightforward in the proposed method and, unlike conventional quantitative methods, its results left less room for doubt. Although the assessment of the spectral quality of the fused images in the proposed scheme can be done quickly by calculating the area under the curve of the vegetation spectral index (splitting it into geometric shapes) and by considering the nature of the centroid, the mean value can be used instead of the centroid. However, as a general suggestion based on the results of this study, the spectral quality evaluation of the fused images could be simplified even further and performed only on the basis of the similarity of the histogram shape of the vegetation spectral index within the natural environment. This comparison can be made visually by observing the fused VI histogram in the ecosystem and comparing it with the original VI histogram. Interestingly, according to the results, this similarity can even partially indicate the spatial quality of the fused images (Table 8 and Fig. 4). The suitability of the proposed method for evaluating the spatial quality of the pansharpened images was further confirmed by visual inspection of the outputs and their comparison with the PAN image (Pohl and van Genderen, 2016). After implementing a transect and extracting the pixel values below it, this method can also be carried out quickly in statistical software. Also, if researchers are interested in a general criterion for assessing the behaviour of a fusion method, the Euclidean distance between the centroids of the areas under the curve of the vegetation index is proposed. In other words, the proposed general criterion for assessing the function of the fusion method is the Euclidean distance.
Table 7 The results of the proposed index for evaluating the spatial quality of the fused image.
Table 8 The results of the Zhou's spatial distortion.
In this case, due to the difference in the scale of the two axes (X and Y), and given that in non-desert ecosystems the value of the NDVI index lies between 0 and 1, it is suggested that the frequencies be fuzzified between 0 and 1 (based on the minimum and maximum) in order to eliminate anomalies in calculating the Euclidean distance (if an index other than NDVI is used, the frequencies should be scaled according to the range of that index). In these cases, it is better to avoid stretching the values of the vegetation spectral index, because stretching and expanding the values makes it impossible to accurately assess the quality of the fused image. Based on the results, the accuracy assessment of the change detection is not a suitable method for comparing fusion methods, owing to its strong dependence on the form of the differential histogram, particularly when the goal is to assess the current conditions of an ecosystem; however, it can be used as a general assessment. Overall, if spectral and spatial criteria and visual inspection are considered simultaneously (Pohl and van Genderen, 2016), HPF is introduced as the best fusion method in this study, because apart from its excellent performance in each proposed index (Tables 4, 5, 6, and 8), the visual quality of its output is far better than that of several similar fusion methods. After that, the best performance is related to the Brovey Transform and Modified IHS methods. It should be noted that the HCS method should be used with caution due to the pepper noise effect. Moreover, the Projective Resolution Merge, as recommended (Geospatial Hexagon, 2015), should only be used for DigitalGlobe and GeoEye images.
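As a final illustration, the suggested general criterion (the Euclidean distance between the centroids of the areas under the fused and original vegetation-index histograms, with frequencies fuzzified to 0-1) can be sketched as follows; the bin count and the NDVI range are assumptions of the sketch.

```python
import numpy as np

def fuzzify(values):
    """Min-max scale an array to the 0-1 range."""
    v = np.asarray(values, dtype=float)
    return (v - v.min()) / (v.max() - v.min() + 1e-12)

def centroid_point(ndvi_values, bins=50, value_range=(0.0, 1.0)):
    """2-D centroid (NDVI, fuzzified frequency) of the area under the NDVI histogram."""
    counts, edges = np.histogram(ndvi_values, bins=bins, range=value_range)
    x = 0.5 * (edges[:-1] + edges[1:])
    y = fuzzify(counts)
    dx = np.diff(x)
    area = np.sum(0.5 * (y[:-1] + y[1:]) * dx)
    cx = np.sum(0.5 * (x[:-1] * y[:-1] + x[1:] * y[1:]) * dx) / area
    cy = np.sum(0.25 * (y[:-1] ** 2 + y[1:] ** 2) * dx) / area
    return np.array([cx, cy])

def centroid_distance(ndvi_fused, ndvi_original, **kw):
    """Euclidean distance between the fused and original histogram centroids."""
    return float(np.linalg.norm(centroid_point(ndvi_fused, **kw) -
                                centroid_point(ndvi_original, **kw)))
```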
References

Acton, Q.A., 2012. Issues in Analysis, Measurement, Monitoring, Imaging, and Remote Sensing Technology: 2011 Edition. ScholarlyEditions. Aiazzi, B., Baronti, S., Lotti, F., Selva, M., 2009. A comparison between global and context-adaptive pansharpening of multispectral images. IEEE Geosci. Remote Sens. Lett. 6, 302–306. Alimuddin, I., Sumantyo, J.T.S., Kuze, H., 2012. Assessment of pan-sharpening methods applied to image fusion of remotely sensed multi-band data. Int. J. Appl. Earth Obs. Geoinf. 18, 165–175. Alparone, L., Aiazzi, B., Baronti, S., Garzelli, A., 2015. Remote Sensing Image Fusion. CRC Press. Al-Wassai, F.A., Kalyankar, N., Al-Zuky, A.A., 2011. The IHS Transformations Based Image Fusion. arXiv preprint arXiv:1107.4396. Amro, I., Mateos, J., Vega, M., Molina, R., Katsaggelos, A.K., 2011. A survey of classical methods and new trends in pansharpening of multispectral images. EURASIP J. Adv. Signal Process. 2011, 79. Aosier, B., Kaneko, M., Takada, M., Saitoh, K., Katoh, K., 2005. Evaluate the accuracy of the atmosphere correction (ATCOR software method) of the ASTER data using ground radiometric measurement data. In: ISPRS, pp. 358–362. Bahrami Nejad, M., Rayegani, B., Jahani, A., Nezami, B., 2018. Proposing an Early-Warning System for Optimal Management of Protected Areas (Case Study: Darmiyan Protected Area, Eastern Iran). Journal for Nature Conservation. Barati, S., Rayegani, B., Saati, M., Sharifi, A., Nasri, M., 2011. Comparison the accuracies of different spectral indices for estimation of vegetation cover fraction in sparse vegetated areas. Egypt. J. Remote Sens. Space Sci. 14, 49–56. Bethea, R.M., 2018. Statistical Methods for Engineers and Scientists. CRC Press. Bovolo, F., Bruzzone, L., Capobianco, L., Garzelli, A., Marchesi, S., Nencini, F., 2010. Analysis of the effects of pansharpening in change detection on VHR images. IEEE Geosci. Remote Sens. Lett. 7, 53–57. Campbell, J.B., Wynne, R.H., 2011. Introduction to Remote Sensing. Guilford Press, New York. Carney, J., Gillespie, T.W., Rosomoff, R., 2014. Assessing forest change in a priority West African mangrove ecosystem: 1986–2010. Geoforum 53, 126–135. Chapin, F.S., Chapin, M.C., Matson, P.A., Vitousek, P., 2011. Principles of Terrestrial Ecosystem Ecology. Springer, New York. Congalton, R.G., Green, K., 2008. Assessing the Accuracy of Remotely Sensed Data: Principles and Practices. CRC Press. Du, Q., Younan, N.H., King, R., Shah, V.P., 2007. On the performance evaluation of pansharpening techniques. IEEE Geosci. Remote Sens. Lett. 4, 518–522. Du, P., Liu, S., Xia, J., Zhao, Y., 2013. Information fusion techniques for change detection from multi-temporal remote sensing images. Inform. Fusion 14, 19–27. Eastman, J., 2012. IDRISI Selva Tutorial. Eastman, J., 2015a. TerrSet Tutorial. Clark Labs, Clark University, Worcester, MA, United States. Eastman, J.R., 2015b. TerrSet Manual. Accessed in TerrSet version. vol. 18. pp. 1–390. Ehlers, M., 2005. Beyond pansharpening: advances in data fusion for very high resolution remote sensing data. In: Proceedings, ISPRS Workshop ‘High-Resolution Earth Imaging for Geospatial Information’, pp. 17–20. Ehlers, M., Klonus, S., Johan Åstrand, P., Rosso, P., 2010. Multi-sensor image fusion for pansharpening in remote sensing. Int. J. Image Data Fusion 1, 25–45. Estoque, R.C., Murayama, Y., 2015. Classification and change detection of built-up lands from Landsat-7 ETM+ and Landsat-8 OLI/TIRS imageries: a comparative assessment of various spectral indices. Ecol. Indic. 56, 205–217. George, D., Mallery, P., 2016. IBM SPSS Statistics 23 Step by Step: A Simple Guide and Reference. Taylor & Francis. Geospatial Hexagon, 2015. ERDAS Imagine Help Guide. Giri, C.P., 2016. Remote Sensing of Land Use and Land Cover: Principles and Applications. CRC Press. Giri, C., Pengra, B., Zhu, Z., Singh, A., Tieszen, L.L., 2007. Monitoring mangrove forest dynamics of the Sundarbans in Bangladesh and India using multi-temporal satellite data from 1973 to 2000. Estuar. Coast. Shelf Sci. 73, 91–100. Giri, C., Zhu, Z., Tieszen, L., Singh, A., Gillette, S., Kelmelis, J., 2008. Mangrove forest distributions and dynamics (1975–2005) of the tsunami-affected region of Asia. J. Biogeogr. 35, 519–528. Hinton, P.R., Brownlow, C., Cozens, B., McMurray, I., 2004. SPSS Explained. Routledge. Ivits, E., Cherlet, M., Mehl, W., Sommer, S., 2013. Ecosystem functional units characterized by satellite observed phenology and productivity gradients: a case study for Europe. Ecol. Indic. 27, 17–28. Jahari, M., Khairunniza-Bejo, S., Shariff, A.R.M., Shafri, H.Z.M., 2011. Change detection studies in Matang mangrove forest area, Perak. Pertanika J. Sci. Technol. 19, 307–327. Jawak, S.D., Luis, A.J., 2013. A comprehensive evaluation of PAN-sharpening algorithms coupled with resampling methods for image synthesis of very high resolution remotely sensed satellite data. Adv. Remote Sensing 2, 332. Jensen, J.R., 2005. Introductory Digital Image Processing: A Remote Sensing Perspective, 3rd ed. Prentice Hall, Upper Saddle River, N.J. Jensen, J.R., 2016. Introductory Digital Image Processing: A Remote Sensing Perspective. Pearson Education, Inc, Glenview, IL. Johnson, B., 2014. Effects of Pansharpening on vegetation indices. ISPRS Int. J. Geo Inf. 3, 507. Klonus, S., Ehlers, M., 2007. Image fusion using the Ehlers spectral characteristics preservation algorithm. GIScience Remote Sensing 44, 93–116. Klonus, S., Ehlers, M., 2009. Performance of evaluation methods in image fusion. In: Information Fusion, 2009. FUSION'09. 12th International Conference on. IEEE, pp. 1409–1416. Koch, M., Mather, P., 2013. Computer Processing of Remotely-Sensed Images: An Introduction. Wiley, Hoboken, N.J. Laben, C.A., Brower, B.V., 2000. Process for Enhancing the Spatial Resolution of Multispectral Imagery Using Pan-Sharpening. Google Patents. Lee, T.-M., Yeh, H.-C., 2009. Applying remote sensing techniques to monitor shifting wetland vegetation: a case study of Danshui River estuary mangrove communities, Taiwan. Ecol. Eng. 35, 487–496. Liang, S., Li, X., Wang, J., 2012. Advanced Remote Sensing: Terrestrial Information Extraction and Applications, 1st ed. Academic Press, Amsterdam; Boston. Lindgren, J.E., Kilston, S., 1996. Projective pan sharpening algorithm. In: Multispectral Imaging for Terrestrial Applications. International Society for Optics and Photonics, pp. 128–138. Liu, K., Li, X., Shi, X., Wang, S., 2008. Monitoring mangrove forest changes using remote sensing and GIS data with decision-tree learning. Wetlands 28, 336–346. Loncan, L., de Almeida, L.B., Bioucas-Dias, J.M., Briottet, X., Chanussot, J., Dobigeon, N., Fabre, S., Liao, W., Licciardi, G.A., Simoes, M., 2015. Hyperspectral pansharpening: a
review. IEEE Geosci. Remote Sensing Magazine 3, 27–46. Long, X., Li, N., Tie, X., Cao, J., Zhao, S., Huang, R., Zhao, M., Li, G., Feng, T., 2016. Urban dust in the Guanzhong Basin of China, part I: a regional distribution of dust sources retrieved using satellite data. Sci. Total Environ. 541, 1603–1613. Lu, D., Mausel, P., Brondizio, E., Moran, E., 2004. Change detection techniques. Int. J. Remote Sens. 25, 2365–2401. Padwick, C., Deskevich, M.P., 2014. Hyperspherical Pan Sharpening. Google Patents. Pena-Rodriguez, M.E., 2013. Statistical Process Control for the FDA-Regulated Industry. ASQ Quality Press. Pettorelli, N., 2013. The Normalized Difference Vegetation Index. OUP Oxford. Pohl, C., van Genderen, J., 2016. Remote Sensing Image Fusion: A Practical Guide. CRC Press. Rayegani, B., Kheirandish, Z., Kermani, F., Mohammdi Miyab, M., Torabinia, A., 2017. Identification of active dust sources using remote sensing data and air flow simulation (Case Study: Alborz Province). Desert Manage. 4, 15–26. Schowengerdt, R.A., 2006. Remote Sensing: Models and Methods for Image Processing. Elsevier Science. Snehmani, A.G., Ganju, A., Kumar, S., Srivastava, P., 2016. A Comparative Analysis of Pansharpening Techniques on QuickBird and WorldView-3 Images. Stathaki, T., 2011. Image Fusion: Algorithms and Applications. Elsevier Science. Stoms, D.M., Hargrove, W.W., 2000. Potential NDVI as a baseline for monitoring ecosystem functioning. Int. J. Remote Sens. 21, 401–407. Strait, M., Rahmani, S., Merkurev, D., 2008. Evaluation of Pan-Sharpening Methods. UCLA Department of Mathematics. Unal, A., Nayak, M., Mishra, D.K., Singh, D., Joshi, A., 2016. Smart Trends in Information
Technology and Computer Communications: First International Conference, SmartCom 2016, Jaipur, India, August 6–7, 2016, Revised Selected Papers. Springer, Singapore. Vijayaraj, V., O'Hara, C.G., Younan, N.H., 2004. Quality analysis of pansharpened images. In: Geoscience and Remote Sensing Symposium, 2004. IGARSS'04. Proceedings. IEEE International, pp. 2004. Vijayaraj, V., Younan, N.H., O'Hara, C.G., 2006. Quantitative analysis of pansharpened images. Opt. Eng. 45, 046202. Vivone, G., Alparone, L., Chanussot, J., Dalla Mura, M., Garzelli, A., Licciardi, G., Restaino, R., Wald, L., 2014. A critical comparison of pansharpening algorithms. In: Geoscience and Remote Sensing Symposium (IGARSS), 2014 IEEE International. IEEE, pp. 191–194. Wu, B., Fu, Q., Sun, L., Wang, X., 2015. Enhanced hyperspherical color space fusion technique preserving spectral and spatial content. J. Appl. Remote. Sens. 9, 097291. Zhang, Y., Mishra, R.K., 2012. A review and comparison of commercially available pansharpening techniques for high resolution satellite image fusion. In: Geoscience and Remote Sensing Symposium (IGARSS), 2012 IEEE International. IEEE, pp. 182–185. Zhang, Y., Mishra, R.K., 2014. From UNB PanSharp to Fuze Go–the success behind the pan-sharpening algorithm. Int. J. Image Data Fusion 5, 39–53. Zhang, Y., Roshan, A., Jabari, S., Khiabani, S.A., Fathollahi, F., Mishra, R.K., 2016. Understanding the quality of Pansharpening–a lab study. Photogramm. Eng. Remote Sens. 82, 747–755. Zhu, Z., Woodcock, C.E., 2014. Continuous change detection and classification of land cover using all available Landsat data. Remote Sens. Environ. 144, 152–171.