Vistas in Astronomy, Vol. 41, No. 3, pp. 351-357, 1997
SIMULTANEOUS IMAGE FUSION AND RECONSTRUCTION USING WAVELETS
Applications to SPOT + LANDSAT images

JORGE NÚÑEZ a,b,1, XAVIER OTAZU a, OCTAVI FORS a,b, ALBERT PRADES c,2

a Departament d'Astronomia i Meteorologia, Universitat de Barcelona, Av. Diagonal 647, E-08028 Barcelona, Spain
b Observatorio Fabra, Barcelona, Spain
c Escola Universitària Politècnica de Barcelona, Universitat Politècnica de Catalunya, Av. Dr. Gregorio Marañón 44-50, E-08028 Barcelona, Spain

1 E-mail: [email protected]
2 E-mail: [email protected]
Abstract — In this paper we present a technique, based on multiresolution wavelet decomposition, for the merging and data fusion of two images: a high-resolution panchromatic image and a low-resolution multispectral image. The method consists of adding the wavelet coefficients of the high-resolution image to the multispectral (low-resolution) data. We applied the method to merge SPOT and LANDSAT (TM) images. Current data fusion methods can be unsatisfactory because they may distort the spectral characteristics of the multispectral data. The technique presented here is clearly better than the standard IHS and LHS mergers at preserving both spectral and spatial information. The fusion of reconstructed images and the application of the technique to astronomical images are also discussed.
1. INTRODUCTION

There are several situations that simultaneously require high spatial and spectral resolution in a single image. This is particularly important in remote sensing. In other cases, such as astronomy, it is very important to have high spatial resolution and a high signal-to-noise ratio at the same time. However, in most cases instruments are not capable of providing such data, either by design or because of observational constraints. For example, in remote sensing the SPOT PAN satellite provides high-resolution (10-m pixel) panchromatic data, while the LANDSAT TM satellite provides
lower-resolution (30-m pixel) multispectral data. In astronomy, space-borne telescopes give high-resolution images, but the photons are expensive, making long-exposure multispectral observations uncommon. From the ground, on the other hand, the resolution is poor but the photons are cheap and the signal-to-noise ratio can be increased; it is also easy to obtain long-exposure (but low-resolution) multispectral data.

One possible answer to this problem could come from the field of data fusion. A number of methods for merging panchromatic and multispectral data have been proposed [2,9]. The most common procedures are the Intensity-Hue-Saturation (IHS) and the Lightness-Hue-Saturation (LHS) methods [1,6]. However, the IHS and LHS methods suffer from spectral degradation. This problem is particularly important in remote sensing if the images to merge are not taken at the same time. Recently, Yocky [10,11] and, independently, Garguet-Duport et al. [3] proposed a new approach to image merging that uses a multiresolution analysis procedure based upon the discrete two-dimensional wavelet transform. This method preserves the spectral characteristics of the multispectral image better than the IHS or LHS methods. The wavelet-based image merging can be done in two ways: (a) by substituting some wavelet coefficients of the multispectral image with the corresponding coefficients of the high-resolution image, or (b) by adding high-resolution coefficients to the multispectral data. Here, we further explore the wavelet transform image merging technique, with special attention to the "additive" merger. To decompose the data into wavelet coefficients, we use the discrete wavelet transform algorithm known as "à trous" ("with holes"). The method is applied to merge SPOT and LANDSAT (TM) images. We also comment on the combination of fusion and reconstruction techniques and on some astronomical applications.
2. WAVELET DECOMPOSITION

The wavelet decomposition is increasingly being used for the processing of images (see, e.g., Refs. [8,5]). The method is based on the decomposition of the image into multiple channels according to their local frequency content. The wavelet transform provides a framework to decompose an image into a number of new images, each of them with a different degree of resolution. While the Fourier transform gives an idea of the frequency content of the image, the wavelet representation is intermediate between the Fourier and the spatial representations, and it can provide good localization in both the frequency and space domains. The wavelet transform of a distribution $f(t)$ can be expressed as

$$ W(f)(a, b) = |a|^{-1/2} \int_{-\infty}^{+\infty} f(t)\, \psi\!\left(\frac{t - b}{a}\right) dt. \qquad (1) $$
In this representation, $a$ and $b$ are scaling and translational parameters, respectively. Each base function $\psi((t - b)/a)$ is a scaled and translated version of a function $\psi$ called the mother wavelet. These base functions are such that $\int \psi((t - b)/a)\, dt = 0$.

Following Starck and Murtagh [7], to decompose the image into wavelet planes we use the discrete wavelet transform algorithm known as "à trous" ("with holes"). An image $p$ is successively transformed as follows:

$$ W(p) = p_1, \quad W(p_1) = p_2, \quad W(p_2) = p_3, \ \ldots $$

We use a scaling function which has a $B_3$ cubic spline profile. Letting $w_l = p_{l-1} - p_l$ $(l = 1, \ldots, n)$, in which $p_0 = p$, we can write
$$ p = p_r + \sum_{l=1}^{n} w_l. \qquad (2) $$

In this representation, the images $p_l$ $(l = 0, \ldots, n)$ are versions of the original image at increasing scales (decreasing resolution levels), the $w_l$ $(l = 1, \ldots, n)$ are the multiresolution wavelet planes, and $p_r$ is a residual image.
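To make the decomposition concrete, the following is a minimal sketch of the "à trous" algorithm described above, written in Python with NumPy/SciPy; it is not the authors' code, and the function and variable names are ours. The $B_3$ spline kernel and its dyadic dilation follow the standard formulation of Starck and Murtagh [7].

```python
# A minimal sketch of the "a trous" wavelet decomposition, assuming a 2-D
# NumPy image. Not the authors' implementation.
import numpy as np
from scipy.ndimage import convolve

# 1-D B3 cubic-spline mask; the 2-D scaling kernel is its outer product.
_B3 = np.array([1.0, 4.0, 6.0, 4.0, 1.0]) / 16.0


def _smooth(image, level):
    """Convolve with the B3 spline kernel dilated by 2**level (the 'holes')."""
    step = 2 ** level
    kernel_1d = np.zeros(4 * step + 1)
    kernel_1d[::step] = _B3
    kernel = np.outer(kernel_1d, kernel_1d)
    return convolve(image, kernel, mode="mirror")


def atrous_decompose(image, n_planes):
    """Return (wavelet_planes, residual) with w_l = p_{l-1} - p_l and p_0 = image."""
    planes = []
    previous = image.astype(float)
    for level in range(n_planes):
        smoothed = _smooth(previous, level)
        planes.append(previous - smoothed)   # wavelet plane w_{level+1}
        previous = smoothed                  # smoothed version p_{level+1}
    return planes, previous                  # residual p_r = p_n


def atrous_reconstruct(planes, residual):
    """Inverse transform: the plain sum of equation (2), p = p_r + sum_l w_l."""
    return residual + np.sum(planes, axis=0)
```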
3. WAVELET IMAGE FUSION METHOD

The wavelet merger method is based on the fact that, in the wavelet decomposition, the images $p_l$ $(l = 0, \ldots, n)$ are successive versions of the original image at increasing scales. Thus, the first wavelet planes of the high-resolution panchromatic image contain spatial information that is not present in the multispectral image. As stated above, the wavelet image merging can be done in two ways.

3.1. Substitution method

In the standard substitution method, some of the wavelet planes of the multispectral image are substituted by the corresponding planes of the panchromatic image. The steps are the following:
(1) Register the low-resolution multispectral image to the same size as the high-resolution image.
(2) Decompose the R, G and B bands of the multispectral image into 3-4 wavelet planes (resolution levels).
(3) Decompose the panchromatic high-resolution image accordingly.
(4) Substitute the first wavelet planes of the R, G and B decompositions by the equivalent planes of the panchromatic decomposition.
(5) Perform the inverse wavelet transform.

3.2. Additive method

Another possibility is to add the wavelet planes of the high-resolution image directly to the multispectral image. The steps of this "additive" method (sketched in code at the end of this section) are the following:
(1) Register the low-resolution multispectral image as before.
(2) Decompose only the panchromatic high-resolution image into 3-4 wavelet planes (resolution levels).
(3) Add the wavelet planes of the panchromatic decomposition to the R, G and B bands of the multispectral image.
(4) Perform the inverse wavelet transform.

In the substitution method, the wavelet planes corresponding to the multispectral image are discarded and replaced by the corresponding planes of the panchromatic image. In the additive method, however, all the spatial information of the multispectral image is preserved. Thus, the main advantage of the additive method is that the detailed information from both sensors is used. The advantages of using the wavelet image merging technique over the standard IHS or LHS methods are:
(1) The spectral quality of the color image is preserved to a high degree.
Fig. 1. Detail of the LANDSAT image in R (left), G (center) and B (right) bands.

Fig. 2. Detail of the SPOT image.
(2) The resolution of the panchromatic image is introduced into the solution.
(3) The total energy content of a wavelet plane is zero; thus, the total energy of the multispectral image is preserved.
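As a concrete illustration of the two mergers (again a sketch, not the authors' implementation), the following reuses the atrous_decompose/atrous_reconstruct helpers from Section 2 and assumes the multispectral bands have already been registered and resampled to the panchromatic grid; the function names are ours.

```python
# Sketch of the substitution and additive mergers, per band, assuming the
# helpers atrous_decompose / atrous_reconstruct defined in Section 2.
import numpy as np


def additive_merge(band, panchromatic, n_planes=3):
    """Additive method: add the first wavelet planes of the panchromatic image
    to one multispectral band (the inverse a trous transform is a plain sum)."""
    pan_planes, _ = atrous_decompose(panchromatic, n_planes)
    return band + np.sum(pan_planes, axis=0)


def substitution_merge(band, panchromatic, n_planes=3):
    """Substitution method: replace the band's first wavelet planes by the
    corresponding panchromatic planes, then invert the transform."""
    _, band_residual = atrous_decompose(band, n_planes)
    pan_planes, _ = atrous_decompose(panchromatic, n_planes)
    return atrous_reconstruct(pan_planes, band_residual)


# Typical use: merge each of the R, G and B bands with the same
# (histogram-matched) panchromatic image.
# merged_rgb = [additive_merge(b, pan, n_planes=3) for b in (red, green, blue)]
```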
4. RESULTS

We applied the above methodology to merge a SPOT and a LANDSAT image. The original SPOT image has 10-m pixels while the original LANDSAT image has 30-m pixels. The original images were registered, and the SPOT image was photometrically corrected to present a histogram similar to that of the lightness (L) component of the LANDSAT image. Then we applied the additive wavelet image fusion technique and compared the results with the standard methods.

Fig. 1 shows a detail of the LANDSAT image. Owing to technical constraints, we cannot reproduce colors in this paper; we therefore present the three bands of each image separately, with the red band at the left, the green band at the center and the blue band at the right. We chose this detail of the image because it contains a green field (which appears very clearly in the G band) at the middle right. This field can be used to test the spectral behavior of the methods.

Fig. 2 shows the same area of the SPOT image. The resolution is clearly better than that of the LANDSAT image. It is also easy to see that the SPOT and LANDSAT images were taken at different epochs. Note, for example, the green field at the middle right, whose aspect in the SPOT image is clearly different from that in the LANDSAT image displayed in Fig. 1: when the SPOT image was taken, the bottom part of the field was of a different color. There are also several structures that were not present at the time the LANDSAT image was taken.
Fig. 3. Detail of the fusion by the LHS method in R (left), G (center) and B (right) bands.
Fig. 4. Detail of the fusion by the additive wavelets method in R (left), G (center) and B (right) bands.

Fig. 3 shows the result of the fusion of the LANDSAT and SPOT images by the standard LHS method. The increase in resolution with respect to the original LANDSAT image is evident: most of the resolution of the SPOT image has been incorporated into the result. However, as stated above, there is spectral degradation, an intensity dependence of the color, and a strong correlation between the merged image and the panchromatic intensity. This can be seen qualitatively in the green field at the middle right in comparison with the same area in Fig. 1; note especially the different aspect in the G band.

Fig. 4 shows the result of the fusion by the additive wavelets method. In this example three wavelet planes were added. As in the LHS solution, most of the resolution of the SPOT image was incorporated into the merged image. However, in this case the spectral characteristics of the LANDSAT image are preserved better than in the LHS merger. Note the similar tonalities of Fig. 4 with respect to Fig. 1; in particular, the green field at the middle right is now rendered with colors that are much closer to the original LANDSAT colors than those of the LHS solution.

We can quantify the behavior of the LHS and additive wavelet fusion methods. Table 1 shows the correlations between the LHS and additive wavelets (AW) solutions and the original LANDSAT and SPOT images. The first line of the table shows the correlation between the R, G and B bands of the LANDSAT image and the SPOT image. The second line shows the correlation between the R, G and B bands of the LHS solution and the SPOT image, while the third line shows the same for the additive wavelets solution. Note that the correlations of the LHS solution are higher than those of the wavelets solution; the LHS solution is therefore closer to the SPOT image than the wavelets one. As stated above, there is a strong correlation between the LHS merged image and the panchromatic intensity; this correlation is lower for the additive wavelets solution. On the other hand, the fourth and fifth lines of Table 1 give the correlations of the LHS and wavelet solutions with respect to the original LANDSAT image.
J. Nu’ffezet al.
356
Table 1. Correlation between the LHS and additive wavelets (AW) solutions and the original LANDSAT and SPOT images.

Images            Red    Green  Blue
LANDSAT/SPOT      0.757  0.804  0.674
LHS/SPOT          0.814  0.850  0.719
AW/SPOT           0.770  0.839  0.695
LHS/LANDSAT       0.857  0.848  0.847
AW/LANDSAT        0.918  0.889  0.871
Note that now the correlations of the additive wavelet solution are higher than those of the LHS one. This means that, as stated above in qualitative form, the additive wavelet solution preserves the spectral characteristics of the multispectral (LANDSAT) image to a higher degree than the LHS solution. Thus, the additive wavelet method behaves better than the standard methods.
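For reference, each entry of Table 1 is an ordinary band-wise correlation coefficient. A minimal sketch, assuming the two images are equally sized NumPy arrays and using the hypothetical names from the earlier sketches:

```python
# Band-wise Pearson correlation between a merged band and a reference image.
import numpy as np


def band_correlation(merged_band, reference):
    """Correlation coefficient between two images of equal size."""
    return np.corrcoef(merged_band.ravel(), reference.ravel())[0, 1]


# e.g. band_correlation(additive_merge(green, pan), landsat_green)
```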
5. IMAGE RECONSTRUCTION AND ASTRONOMICAL APPLICATIONS
We studied the behavior of the method with reconstructed images. One option is to reconstruct both the multispectral and the panchromatic image and then perform the fusion. Another possibility is to perform the reconstruction using the wavelet decomposition in the manner of Núñez and Otazu [4], which allows a simultaneous fusion-reconstruction process by controlling the level of reconstruction of the different wavelet planes that are added. We used the additive wavelets method to merge reconstructed LANDSAT and SPOT images (using 10 iterations of the Richardson-Lucy method). The results are even better than those displayed in Fig. 4. The correlations are similar to those of Table 1, indicating that the reconstruction does not change the spectral characteristics of the images.

With regard to astronomical applications, two possibilities are (a) the fusion of multispectral images taken from the ground with images taken from space, and (b) the fusion of long-exposure images (ground) with short-exposure images from space. We performed a simulation study of the second case with encouraging results.
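As an illustration of the reconstruct-then-fuse variant only (not the simultaneous wavelet-based scheme of [4]), the following sketch applies a textbook Richardson-Lucy iteration with an assumed, known point spread function before calling the additive merger defined earlier; all names and the PSF are assumptions.

```python
# Sketch: Richardson-Lucy reconstruction of both inputs, then additive fusion.
import numpy as np
from scipy.signal import fftconvolve


def richardson_lucy(image, psf, n_iter=10):
    """Plain Richardson-Lucy deconvolution with a known PSF."""
    estimate = np.full_like(image, image.mean(), dtype=float)
    psf_mirror = psf[::-1, ::-1]
    for _ in range(n_iter):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = image / np.maximum(blurred, 1e-12)   # avoid division by zero
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
    return estimate


# Reconstruct both inputs (10 iterations, as in the text) and then merge;
# psf_pan and psf_tm are hypothetical point spread functions.
# restored_pan = richardson_lucy(pan, psf_pan, n_iter=10)
# restored_band = richardson_lucy(green, psf_tm, n_iter=10)
# fused = additive_merge(restored_band, restored_pan, n_planes=3)
```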
6. CONCLUSIONS

The additive wavelets image fusion method combines a high-resolution panchromatic image and a low-resolution multispectral image by adding some of the wavelet planes of the high-resolution image to the low-resolution image. The method enhances the spatial quality of the multispectral image while preserving its spectral characteristics much better than the standard methods. The method can also be applied successfully in astronomy.
ACKNOWLEDGEMENTS

This work was supported in part by the DGICYT Ministerio de Educación y Ciencia (Spain) under grant no. BP951031-A. Partial support was also obtained from the Institut Cartogràfic de Catalunya, the D.G.U. Generalitat de Catalunya and from the Gaspar de Portolà Catalan Studies Program of the University of California and the Generalitat de Catalunya.
References
[1] J.W. Carper, T.M. Lillesand, R.W. Kiefer, The use of intensity-hue-saturation transformations for merging SPOT panchromatic and multispectral image data, Photogrammetric Engineering and Remote Sensing 56 (1990) 459.
[2] P.S. Chavez, S.C. Sides, J.A. Anderson, Comparison of three different methods to merge multi-resolution and multispectral data: Landsat TM and SPOT panchromatic, Photogrammetric Engineering and Remote Sensing 57 (1991) 295.
[3] B. Garguet-Duport, J. Girel, J.M. Chassery, G. Pautou, The use of multiresolution analysis and wavelets transform for merging SPOT panchromatic and multispectral image data, Photogrammetric Engineering and Remote Sensing 62 (1996) 1057.
[4] J. Núñez, X. Otazu, Multiresolution image reconstruction using wavelets, Vistas in Astronomy 40 (1996) 555.
[5] F. Rué, A. Bijaoui, A multiscale vision model applied to astronomical images, Vistas in Astronomy 40 (1996) 495.
[6] V.K. Shettigara, A generalized component substitution technique for spatial enhancement of multispectral images using a higher resolution data set, Photogrammetric Engineering and Remote Sensing 58 (1992) 561.
[7] J.L. Starck, F. Murtagh, Image restoration with noise suppression using the wavelet transform, Astron. Astrophys. 288 (1994) 342.
[8] J.L. Starck, E. Pantin, Multiscale maximum entropy image restoration, Vistas in Astronomy 40 (1996) 563.
[9] J.C. Tilton (ed.), Multisource data integration in remote sensing, NASA Conf. Publ. 3099 (1991).
[10] D.A. Yocky, Image merging and data fusion by means of the discrete two-dimensional wavelet transform, J. Opt. Soc. Am. A 12 (1995) 1834.
[11] D.A. Yocky, Multiresolution wavelet decomposition image merger of LANDSAT Thematic Mapper and SPOT panchromatic data, Photogrammetric Engineering and Remote Sensing 62 (1996) 1067.