Optik 125 (2014) 4751–4757
Iterative block level principal component averaging medical image fusion

R. Vijayarajan a,*, S. Muttan b
a Anna University, Chennai, India
b Department of ECE, Anna University, Chennai, India
Article history:
Received 13 September 2013
Accepted 15 April 2014

Keywords:
Image fusion
Principal components
Mutual information
Median filter
Mean structural similarity index
Abstract
Image fusion is a method of integrating all relevant and complementary information from images of the same source or of various sources into a single composite image without degradation. In this paper, a novel pixel level fusion method called Iterative block level principal component averaging (IBLPCA) fusion is proposed. The source images are divided into smaller blocks and principal components are calculated for each pair of corresponding blocks; the average of the principal components over all blocks provides the weights for the fusion rule, so that every block of the source images contributes to the result. Iteration over the block size yields the fusion result with maximum average mutual information. The algorithm is tested on the fusion of noise free medical images and on noise filtered versions of the same images. In both cases the experimental results show that the proposed algorithm performs well in terms of average mutual information and mean structural similarity index.
© 2014 Published by Elsevier GmbH.
1. Introduction

The main objective of medical image fusion is the faithful integration of visual information observed from various input images into a single image without degradation or loss of information [1]. Three major fusion approaches have been dealt with in the image fusion literature: pixel level, feature level and decision level. Each approach has its own performance variability for various inputs. Pixel level fusion is carried out either in the spatial domain or in a transform domain. In the spatial domain, the relevant pixel values of the source images contribute directly to the pixel values of the fused image, but spatial domain methods often lose spectral information and introduce spatial distortions. Pixel level methods such as averaging and weighted averaging introduce degradation in the form of poor perceptual quality. In the transform domain, multiresolution approaches have been proposed in the literature to overcome this problem, and algorithms based on pyramids and wavelets are quite successful in this scenario. Various pyramid fusion methods [2-6] produce blocking effects [7] in regions of the image that differ significantly. The contrast pyramid method loses much source information, the ratio pyramid leads to inaccurate fusion, and the morphological pyramid ends with more artifacts
[1]. Wavelet transform based fusion is an alternative to these drawbacks, but it often suffers from shift sensitivity, poor directionality and the absence of phase information [8-10]. The dual tree complex wavelet transform (DTCWT) offers better directionality and shift invariance than ordinary wavelet transforms [11] and is therefore suitable for wavelet based image fusion, but DTCWT is computationally costly and requires large memory [12]. Principal component analysis (PCA) is an efficient method for feature extraction, dimensionality reduction and data representation. Turk and Pentland [13] proposed PCA based recognition built on the KL transform. Many fusion algorithms based on PCA have been proposed because of its simple computation and realization. PCA is a well known decorrelation method in the statistical sense; it preserves only the most significant principal components and thus leads to reasonably good image fusion. Principal components are derived from the covariance matrix and its diagonalization, i.e., by finding its eigenvectors and eigenvalues. The largest principal components are a linear representation of most of the image details present in the source images [14]. These largest principal components, corresponding to the largest eigenvalues of the covariance matrix, are the weights of the input images in the fusion rule [15]. In this paper, a novel pixel level fusion algorithm called Iterative block level principal component averaging (IBLPCA) is proposed for the fusion of noise free and noise filtered MR brain images. The algorithm evaluates principal components by splitting the images into image blocks. The size of the image blocks is decided based on the average mutual information (AMI) between the fused image and the source
images. The average of the principal components of all blocks provides the weights for the fusion rule.

Digital images, particularly medical images, play a vital role in modern diagnosis and treatment planning, and are subjected to various degradations during image acquisition, storage, processing, transmission and reproduction, which results in degraded visual quality [16]. Various noises contribute different levels of degradation to medical images; one such noise is impulse noise, which can occur at any stage of image handling. Impulse noise may be fixed valued or random valued. Many impulse removal algorithms are available in the literature, but there is always a tradeoff between the quality of the restored image and the amount of degradation introduced by the filter itself. The adaptive center weighted median filter (ACWMF) adaptively adjusts its threshold values to detect noisy pixels [17] and retains uncorrupted values. In this paper, MR brain images are subjected to various impulse noise densities and filtered by the ACWMF. The performance of the proposed fusion algorithm is analyzed by average mutual information and the mean structural similarity index (MSSIM). A comparative analysis is carried out with PCA fusion and with multiresolution algorithms, namely discrete wavelet transform (DWT) fusion and dual tree complex wavelet transform (DTCWT) fusion.

The remaining sections of this paper are organized as follows. Section 2 elaborates the proposed method with its block diagram. Section 3 describes impulse noise and the ACWMF. Section 4 gives experimental results and performance analysis. This is followed by the conclusion.

2. Proposed algorithm

2.1. Principal component analysis fusion
PCA is an effective de-correlation and dimensionality reduction method which has found wide application in pattern recognition, compression, fusion and noise reduction. PCA [15] evaluates eigenvectors and eigenvalues globally and concentrates the energy of the pixels in a small subset of the PCA data. Let xi and xj be the two images to be fused, expressed as column vectors

xi = [x1, x2, ..., xN]^T and xj = [x1, x2, ..., xN]^T, where N is the number of pixels   (1)

The covariance between the two source images is given by

Cov(xi, xj) = E[(xi − μi)(xj − μj)]   (2)

where the means over all pixels are

μi = (1/N) Σ xi and μj = (1/N) Σ xj   (3)

The diagonal matrix D of eigenvalues and the full matrix V, whose columns are the corresponding eigenvectors, are computed. Eigenvalues and eigenvectors are arranged in descending order and the first 2 × 2 values of the V and D matrices are taken for fusion. The normalized components m1 and m2 are computed from V based on the following conditions and are each less than one. If D(1, 1) > D(2, 2),

m1 = V(1, 1) / (V(1, 1) + V(2, 1)),  m2 = V(2, 1) / (V(1, 1) + V(2, 1))   (4)

else

m1 = V(1, 2) / (V(1, 2) + V(2, 2)),  m2 = V(2, 2) / (V(1, 2) + V(2, 2))   (5)

m1 and m2 are the weights of the input images in the fusion rule, and the rule for PCA fusion is given by

y = m1 × xi + m2 × xj   (6)

The weights m1 and m2 decide the amount of information fused from each source image.
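As an illustration of the weight computation in Eqs. (2)-(6), a minimal NumPy sketch is given below. This is not the authors' code (the experiments reported later were run in MATLAB); the function names and the use of np.cov/np.linalg.eig are our own choices.

    import numpy as np

    def pca_weights(img_a, img_b):
        """Return the PCA fusion weights (m1, m2) for two equally sized images."""
        # Treat each image as a column vector of pixel intensities, Eq. (1).
        data = np.stack([img_a.ravel(), img_b.ravel()]).astype(np.float64)
        cov = np.cov(data)                         # 2 x 2 covariance matrix, Eq. (2)
        eig_val, eig_vec = np.linalg.eig(cov)      # columns of eig_vec are eigenvectors
        v = eig_vec[:, int(np.argmax(eig_val))]    # eigenvector of the largest eigenvalue
        m1 = v[0] / (v[0] + v[1])                  # normalized components, Eqs. (4)-(5)
        m2 = v[1] / (v[0] + v[1])
        return m1, m2

    def pca_fuse(img_a, img_b):
        """Plain PCA fusion rule of Eq. (6)."""
        m1, m2 = pca_weights(img_a, img_b)
        return m1 * img_a.astype(np.float64) + m2 * img_b.astype(np.float64)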
2.2. Block processing for the proposed algorithm [18]

Dividing an image into small blocks is a subjective choice based on the gray level profile of the images to be fused. In this work, the block size is not generalized for all images; the block size that gives maximum AMI is found experimentally.

Size of blocks = (M / 2^n) × (M / 2^n)   (7)

where M is the number of rows and columns in the source images and n = 1, 2, 3, 4, ...

Number of blocks = K = 2^n × 2^n = 2^(2n)   (8)

2.3. Iterative block level principal component averaging fusion

The source images are split into K blocks as shown in Fig. 1. PCA is applied to the corresponding blocks of the source images and principal components are evaluated for all the blocks as specified in Section 2.1. The principal components of the K blocks are denoted m1^K and m2^K. The averages of all the principal components are given by

m1(av) = (1 / 2^(2n)) Σ (K = 1 to 2^(2n)) m1^K,   m2(av) = (1 / 2^(2n)) Σ (K = 1 to 2^(2n)) m2^K,   n = 1, 2, 3, 4   (9)

n is subjective for different fusion input images. m1(av) and m2(av) constitute the weights for the fusion rule, and the fused image is given by

y(IBLPCA) = m1(av) × xi + m2(av) × xj   (10)

Algorithm.
(1) Divide the input images into K blocks, where K = 2^(2n); for n = 1, K = 4.
(2) Evaluate the principal components m1^K and m2^K for the K blocks.
(3) Compute m1(av) and m2(av) as given in Eq. (9).
(4) Obtain the fused image as given in Eq. (10).
(5) Evaluate the AMI between the fused image and the source images as given in Section 4.2.1.
(6) Repeat steps (1)-(5) for n = 2, 3, 4; the value of n giving maximum AMI yields the best fusion result.
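A compact sketch of the iteration above, reusing the pca_weights helper from the previous sketch. It assumes square M × M inputs with M divisible by 2^n (true for the 256 × 256 images used later) and an ami() helper implementing Eq. (18), sketched in Section 4.2.1; again this is an illustration, not the authors' MATLAB implementation.

    import numpy as np

    def iblpca_fuse(img_a, img_b, exponents=(1, 2, 3, 4)):
        """Iterative block level principal component averaging fusion, Eqs. (7)-(10)."""
        M = img_a.shape[0]
        best_score, best_fused = None, None
        for n in exponents:
            size = M // (2 ** n)                   # block size M/2^n, Eq. (7); K = 2^(2n) blocks
            w1, w2 = [], []
            for r in range(0, M, size):
                for c in range(0, M, size):
                    m1, m2 = pca_weights(img_a[r:r + size, c:c + size],
                                         img_b[r:r + size, c:c + size])
                    w1.append(m1)
                    w2.append(m2)
            m1_av, m2_av = np.mean(w1), np.mean(w2)          # block averages, Eq. (9)
            fused = m1_av * img_a + m2_av * img_b            # fusion rule, Eq. (10)
            score = ami(img_a, img_b, fused)                 # Eq. (18), assumed helper
            if best_score is None or score > best_score:
                best_score, best_fused = score, fused
        return best_fused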
3. Noise model

3.1. Fixed value impulse noise (FVIN)

FVIN [19] is commonly modeled as

y(i, j) = { 0 or 255,  with probability p
          { x(i, j),   with probability 1 − p      (11)

where p is the noise density and x(i, j), y(i, j) represent the intensity values of the original and corrupted images at location (i, j).
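A small sketch of this corruption model follows (the paper generated noise with MATLAB; the helper below is only illustrative):

    import numpy as np

    def add_fvin(img, p, rng=None):
        """Corrupt an image with fixed valued impulse noise of density p, Eq. (11)."""
        rng = np.random.default_rng() if rng is None else rng
        noisy = img.copy()
        hit = rng.random(img.shape) < p            # each pixel corrupted with probability p
        salt = rng.random(img.shape) < 0.5         # corrupted pixels become 0 or 255
        noisy[hit & salt] = 255
        noisy[hit & ~salt] = 0
        return noisy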
Fig. 1. Block diagram of Iterative block level principal component averaging fusion.
3.2. Non-linear statistical filters

Among the non-linear statistical filters, the standard median filter and its modifications provide a balanced performance in suppressing impulse noise [20,21]. These filters adapt to the local properties and structures of the image, but despite their effectiveness in smoothing noise they remove fine details [22-24]. To preserve fine details while restoring the image, other median filtering techniques such as adaptive median, hybrid median and relaxed median filters have been used. The standard median filter (SMF) is an order statistics filter which provides reasonable noise removal but removes thin lines and blurs fine details even at low noise densities [17]. At higher noise densities, the SMF blurs the image for larger window sizes and suppresses noise ineffectively for smaller window sizes [20,25]. Impulse noise filters are designed to yield effective noise reduction without compromising the high frequency content of images [20,26]; however, most filters process both noisy and noise free pixels. The adaptive median filter performs better at lower noise densities because only a few corrupted pixels are replaced by median values; at higher noise densities this replacement increases considerably with the adaptive window size, and the replaced median values are less correlated with the original pixel values [27]. The hybrid median filter (HMF) uses diagonal neighborhood evaluation for de-noising but does not perform well at low noise densities. The adaptive center weighted median filter (ACWMF) adaptively adjusts its threshold values to detect noisy pixels [17] and retains uncorrupted values. State of the art methods such as classic kernel regression and steering kernel regression provide better results but are time consuming. In this paper the ACWMF is used for impulse noise removal because of its fast and competitive performance.
3.3. Adaptive center weighted median filter (ACWMF) [17,27,28]

Let Xij and Yij be the input and output of the ACWMF at the current pixel location (i, j). Consider a window symmetrically surrounding the current pixel, W = {(s, t) | −m ≤ s ≤ m, −m ≤ t ≤ m}. The output of the center weighted median filter can be described as

YijW = median(XijW)   (12)

where a weight w is applied to the origin (center) pixel of the sample set XijW = {Xi−s,j−t | (s, t) ∈ W}, i.e., the center sample is repeated w times, and the window size is 2L + 1 with L > 0. For the current pixel Xij, the differences are defined as

dk = |YijW − Xij| = |Yij(2k+1) − Xij|   (13)

where k = 0, 1, ..., L − 1 and dk ≤ dk−1 for k ≥ 1 [17]. Information about the likely presence of impulse noise at the current pixel can be derived from the differences dk. If the absolute difference is large, the current pixel is likely to be the smallest or the largest sample within the window; otherwise the current pixel is likely to be noise free and is left unaltered. Impulse detection is implemented using predefined thresholds Tk, where k = 0, 1, ..., L − 1 and Tk−1 > Tk. The impulse detector can be realized as

xij = { Yij,  if dk > Tk for some k
      { Xij,  otherwise               (14)

where xij denotes the final estimate of the current pixel Xij. The threshold values Tk are based on the median of the absolute deviations from the median,

MD = median{ |Xi−u,j−v − Yij| : (u, v) ∈ W }   (15)

which gives a robust estimate of dispersion. The thresholds are defined as Tk = s · MD + δk. The parameter s (≥ 0) varies for different images degraded with different noise densities and performs well for 0 ≤ s ≤ 0.6; for the simulations, s is taken as 0.1. The offsets δk (k = 0, 1, ..., L − 1) take values between 0 and 255.
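A condensed sketch of the detection and replacement steps above, for a 3 × 3 window with center weights 2k + 1 (k = 0, ..., 3) and the threshold offsets used later in Section 4.1. It is an illustrative NumPy rendering of the filter in [17], not the authors' implementation, and favors clarity over speed.

    import numpy as np

    def center_weighted_median(window, weight):
        """Median of a flattened 3 x 3 window with its center sample repeated `weight` times."""
        center = window[window.size // 2]
        return np.median(np.concatenate([window, np.repeat(center, weight - 1)]))

    def acwmf(img, s=0.1, deltas=(55, 40, 25, 15)):
        """Adaptive center weighted median filter, Eqs. (12)-(15) (sketch)."""
        out = img.astype(np.float64).copy()
        padded = np.pad(img.astype(np.float64), 1, mode='reflect')
        for i in range(img.shape[0]):
            for j in range(img.shape[1]):
                win = padded[i:i + 3, j:j + 3].ravel()
                x = win[4]                                    # current pixel X_ij
                med = np.median(win)                          # standard median Y_ij
                md = np.median(np.abs(win - med))             # MD, Eq. (15)
                for k, delta in enumerate(deltas):
                    d_k = abs(center_weighted_median(win, 2 * k + 1) - x)   # Eq. (13)
                    if d_k > s * md + delta:                  # T_k = s*MD + delta_k, Eq. (14)
                        out[i, j] = med                       # impulse detected: replace pixel
                        break
        return out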
4. Experiments and results

4.1. Configuration for experiments
For the performance analysis of the proposed algorithm, two evaluation metrics are adopted and the experimental results are presented in Tables 1 and 2. Fusion can be implemented for two types of medical images: fusion of images from different modalities and fusion of images from the same modality. In this work, experiments are carried out for the latter case. Five sets of perfectly registered MR images of size 256 × 256 are taken for the experiments and shown in Fig. 2. To prove the effectiveness of the proposed method, three existing methods, PCA, DWT and DTCWT fusion, are used for comparison.
Table 1
Performance of IBLPCA fusion of noise free images.

Methods     MR & MR0    MR1 & MR2   MR3 & MR4   MR5 & MR6   MR7 & MR8

Average mutual information
PCA         2.533853    2.22172     1.952471    2.182939    2.144691
IBLPCA      6.087183    2.368309    2.166826    2.190437    2.254261
DWT         1.570518    1.585696    1.456032    1.453635    2.092653
DTCWT       1.615786    1.616904    1.591223    1.602065    1.656314

Mean structural similarity index
PCA         0.89534     0.950355    0.854088    0.930046    0.923117
IBLPCA      0.997192    0.989364    0.98074     0.930185    0.984083
DWT         0.920122    0.958164    0.903017    0.735265    0.846212
DTCWT       0.918625    0.940512    0.888479    0.833426    0.921052
The experiments are carried out for two cases of source images. First, the fusion methods are applied to the noise free source images; second, the same procedure is repeated for the noise filtered source images. For the comparative analysis, DWT fusion is carried out with the DB2 wavelet and four decomposition levels [29], and DTCWT fusion [30] is carried out with seven decomposition levels.
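For reference, a minimal DWT fusion sketch using the PyWavelets package is shown below. It is only an illustrative stand-in for the fusion toolbox [29] actually used here; the rules chosen (average the approximation band, keep the maximum-absolute detail coefficients) are common defaults, not necessarily those of the toolbox.

    import numpy as np
    import pywt

    def dwt_fuse(img_a, img_b, wavelet='db2', level=4):
        """Fuse two registered images in the wavelet domain (sketch)."""
        ca = pywt.wavedec2(img_a.astype(np.float64), wavelet, level=level)
        cb = pywt.wavedec2(img_b.astype(np.float64), wavelet, level=level)
        fused = [(ca[0] + cb[0]) / 2.0]            # approximation band: simple averaging
        for (ha, va, da), (hb, vb, db) in zip(ca[1:], cb[1:]):
            fused.append(tuple(np.where(np.abs(a) >= np.abs(b), a, b)   # max-abs detail rule
                               for a, b in ((ha, hb), (va, vb), (da, db))))
        return pywt.waverec2(fused, wavelet)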
Table 2
Performance of fusion of noise filtered images for various impulse noise densities (0.01–0.09).

Metric  Method   0.01       0.02       0.03       0.04       0.05       0.06       0.07       0.08       0.09

MR & MR0
AMI     PCA      2.527079   2.636614   2.620884   2.63022    2.608814   2.611591   2.60364    2.628578   2.591361
        IBLPCA   2.670696   2.737139   2.58437    2.670959   2.654389   2.724395   2.651903   2.602439   2.583222
        DWT      1.571042   1.573668   2.304879   2.017287   2.336468   2.040566   2.214488   1.560006   2.252512
        DTCWT    1.666794   1.674946   1.610984   1.662242   1.632343   1.663628   1.63348    1.608584   1.652729
MSSIM   PCA      0.895386   0.895315   0.895365   0.895326   0.895201   0.8956     0.895618   0.894832   0.895071
        IBLPCA   0.997863   0.998407   0.998668   0.997441   0.99748    0.997999   0.997888   0.99591    0.995093
        DWT      0.921112   0.920709   0.919593   0.920505   0.919894   0.921203   0.920852   0.91935    0.919887
        DTCWT    0.918779   0.918312   0.918391   0.91855    0.919311   0.918746   0.918941   0.917638   0.918082

MR1 and MR2
AMI     PCA      2.217596   2.217534   2.215205   2.208261   2.207318   2.204783   2.199573   2.199134   2.191259
        IBLPCA   2.287639   2.278942   2.350791   2.346816   2.452887   2.383509   2.267673   2.343336   2.322452
        DWT      1.585493   1.58825    2.282865   1.575719   1.579137   1.575521   1.57139    1.571063   1.562262
        DTCWT    1.531681   1.549464   1.601545   1.544603   1.568169   1.602183   1.593258   1.559849   1.591846
MSSIM   PCA      0.950462   0.950511   0.950289   0.950549   0.95046    0.950231   0.950483   0.950331   0.950234
        IBLPCA   0.989278   0.989318   0.993279   0.992946   0.996611   0.994782   0.989271   0.992953   0.992343
        DWT      0.957708   0.958087   0.957318   0.957496   0.957671   0.957201   0.957766   0.957643   0.956099
        DTCWT    0.940297   0.940409   0.939865   0.940192   0.940032   0.939643   0.939971   0.940061   0.938939

MR3 and MR4
AMI     PCA      1.947869   1.947049   1.946734   1.941554   1.940385   1.93995    1.936102   1.936544   1.930235
        IBLPCA   2.193346   2.134041   2.157239   2.159903   2.164409   2.137016   2.140583   2.157284   2.546314
        DWT      2.041172   1.451983   2.016015   2.034719   1.995466   1.473411   1.903795   1.446044   1.434472
        DTCWT    1.599541   1.583082   1.581976   1.569355   1.616286   1.606411   1.584222   1.576246   1.587955
MSSIM   PCA      0.854088   0.854029   0.853272   0.853795   0.853588   0.853428   0.852806   0.85359    0.851515
        IBLPCA   0.986631   0.9812     0.983503   0.984377   0.985131   0.982235   0.983207   0.9849     0.999127
        DWT      0.902601   0.902193   0.901912   0.901208   0.901364   0.901162   0.899631   0.901189   0.898769
        DTCWT    0.888078   0.888229   0.887109   0.887275   0.887337   0.888366   0.886613   0.887336   0.884895

MR5 and MR6
AMI     PCA      2.189028   2.189697   2.186861   2.178777   2.181374   2.180908   2.181126   2.179552   2.17547
        IBLPCA   2.18754    2.176061   2.173884   2.171414   2.169903   2.169329   2.16671    2.172381   2.164443
        DWT      1.452032   1.446411   1.451698   2.025901   1.451582   1.452902   2.025326   1.443466   1.882116
        DTCWT    1.612595   1.596498   1.595892   1.592311   1.612191   1.58539    1.609754   1.588062   1.609198
MSSIM   PCA      0.870968   0.871011   0.870604   0.870714   0.870649   0.870045   0.870182   0.869979   0.869884
        IBLPCA   0.973014   0.971911   0.971817   0.97184    0.971829   0.97171    0.971701   0.969894   0.971545
        DWT      0.898771   0.89839    0.898514   0.898831   0.897845   0.89807    0.897289   0.896619   0.897185
        DTCWT    0.891055   0.891168   0.890829   0.890788   0.890795   0.889521   0.889823   0.889244   0.890358

MR7 and MR8
AMI     PCA      2.145824   2.14537    2.144724   2.144183   2.140246   2.144764   2.139384   2.138589   2.136047
        IBLPCA   2.100035   2.32422    2.128603   2.14829    2.023715   2.126365   2.268745   2.10018    2.129159
        DWT      1.518926   1.520168   1.519591   2.177707   2.18257    1.536871   1.516681   1.512642   1.515002
        DTCWT    1.647933   1.780058   1.756121   1.71036    1.649333   1.706192   1.727869   1.748186   1.624268
MSSIM   PCA      0.930389   0.930634   0.930748   0.930609   0.930706   0.930822   0.930767   0.930402   0.930409
        IBLPCA   0.923184   0.824242   0.941672   0.892278   0.927503   0.957452   0.974016   0.921835   0.946303
        DWT      0.905026   0.905225   0.908812   0.905568   0.90513    0.90804    0.903877   0.903571   0.904376
        DTCWT    0.899389   0.899536   0.899776   0.899486   0.899638   0.899811   0.899795   0.899203   0.899935
To obtain the noise filtered images, impulse noise removal is carried out with the competitively performing ACWMF. All source images are subjected to uniform impulse noise with densities ranging from 0.01 to 0.09, and the ACWMF uses a window size of 3 × 3. With the threshold offsets [δ0, δ1, δ2, δ3] = [55, 40, 25, 15], the ACWMF consistently performs well in removing uniform valued impulses for all noise densities [17]; values of [δ0, δ1, δ2, δ3] = [40, 25, 10, 5] give better noise removal at very low noise densities. The parameter s (≥ 0) varies for different images degraded with different noise densities and performs well for 0 ≤ s ≤ 0.6; for the experiments, s is taken as 0.1.
Fig. 2. Fusion inputs (a) MR and MR0 (b) MR1 and MR2 (c) MR3 and MR4 (d) MR5 and MR6 (e) MR7 and MR8.
4.2. Performance analysis

To analyze the performance of fusion on noise free and noise filtered images, two metrics, AMI and MSSIM, are evaluated. A higher AMI represents greater information transfer from the source images to the fused image. SSIM estimates the correlation between images by comparing luminance, contrast and structural characteristics; an MSSIM value close to one denotes better similarity between the source images and the fused image. Two experiments are conducted to prove the effectiveness of the proposed algorithm.
4.2.1. Average mutual information (AMI)

Mutual information quantifies the amount of information transferred from a source image to the fused image. For the AMI, the mutual information [31] between each source image and the fused image is evaluated and the average is taken. The mutual information between the source image xi and the fused image y is given by

I(xi, y) = Σ over (xi, y) of Pxi,y(xi, y) · log[ Pxi,y(xi, y) / (Pxi(xi) · Py(y)) ]   (16)

and the mutual information between the source image xj and the fused image y is

I(xj, y) = Σ over (xj, y) of Pxj,y(xj, y) · log[ Pxj,y(xj, y) / (Pxj(xj) · Py(y)) ]   (17)

AMI = [ I(xi, y) + I(xj, y) ] / 2   (18)

where Pxi,y is the joint normalized histogram of xi and y, Pxj,y is the joint normalized histogram of xj and y, and Pxi, Pxj and Py are the normalized histograms of xi, xj and y, respectively.
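A sketch of this metric for 8-bit images using 256-bin normalized histograms follows (an illustrative estimator; the logarithm base only scales the values):

    import numpy as np

    def mutual_information(a, b, bins=256):
        """Mutual information between two images, Eqs. (16)-(17)."""
        joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins,
                                     range=[[0, 256], [0, 256]])
        p_ab = joint / joint.sum()                    # joint normalized histogram
        p_a = p_ab.sum(axis=1, keepdims=True)         # marginal histogram of a
        p_b = p_ab.sum(axis=0, keepdims=True)         # marginal histogram of b
        nz = p_ab > 0                                 # skip empty bins to avoid log(0)
        return float(np.sum(p_ab[nz] * np.log2(p_ab[nz] / (p_a @ p_b)[nz])))

    def ami(src1, src2, fused):
        """Average mutual information, Eq. (18)."""
        return 0.5 * (mutual_information(src1, fused) + mutual_information(src2, fused))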
4.2.2. Mean structural similarity index (MSSIM)

The mean SSIM index [16] is evaluated between one of the source images and the fused image and shows good performance in image quality assessment [32]. If one of the images being compared is of perfect quality, MSSIM can be considered as a quality measure of the other image; if img1 = img2, then MSSIM = 1. MSSIM values exhibit much better consistency with the qualitative visual appearance.

MSSIM(xi, y) = (1 / L) Σ (s = 1 to L) SSIM(xs, ys)   (19)

where xi and y are one of the input images and the fused image, respectively, xs and ys are the image contents at the s-th local window, and L is the number of local windows in the image.
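MSSIM can be computed, for example, with scikit-image's structural_similarity, which averages local SSIM values over sliding windows, i.e., an implementation of Eq. (19); the Gaussian window setting below is an assumption, not necessarily the configuration used in the paper.

    from skimage.metrics import structural_similarity

    def mssim(src, fused):
        """Mean SSIM between one source image and the fused image, Eq. (19)."""
        return structural_similarity(src.astype(float), fused.astype(float),
                                     data_range=255, gaussian_weights=True)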
4.2.2.1. Experiment 1. IBLPCA is applied to all the noise free source images. By iteratively simulating the proposed algorithm, the block size that gives maximum AMI is obtained. The fused result of the proposed algorithm is compared with the other methods, and MSSIM is also evaluated to analyze the qualitative visual appearance.

4.2.2.2. Experiment 2. The source images are subjected to uniform impulse noise using the MATLAB noise implementation, and the ACWMF is applied to all the noise corrupted images. Kernel regression methods are not used for de-noising because of their complexity and time consumption; the ACWMF gives competitive results compared with kernel regression methods. The proposed algorithm is then applied to the noise filtered source images, and its performance is again analyzed in terms of AMI and MSSIM and given in Table 2.
Fig. 3. Fusion results of noise free images (a) PCA (b) IBLPCA (c) DWT and (d) DTCWT.
4.2.3. Analysis of fusion of noise free images

The AMI and MSSIM values for the fusion of noise free images are given in Table 1 and clearly demonstrate that the proposed algorithm outperforms the other algorithms for MRI images. Because of the gray level profile of MR5-MR6, IBLPCA gives only slightly better results than the PCA method for that pair. DWT and DTCWT perform poorly on these images because of contrast enhancement; DWT in particular produces a background contrast enhancement that can be clearly observed in the fusion results. Subjective analysis of the images given in Fig. 3 also reveals that the proposed method is better than the other methods.

4.2.4. Analysis of fusion of noise filtered images

The experimental results for the fusion of noise filtered images are given in Table 2. Analysis of the AMI shows that the proposed method performs better for all but a few noise densities. For the fusion of MR5-MR6 and MR7-MR8, the proposed method provides results competitive with PCA for some of the noise densities. Analysis of the MSSIM shows that the proposed method achieves better similarity than the other methods for all noise densities. Subjective analysis of the fused results, given in Fig. 4, reveals the same. On the whole, the proposed method provides superior results in terms of AMI for most of the noise filtered source images and performs well in terms of MSSIM compared with the other methods. DWT introduces a background contrast enhancement like degradation, which is evident in the subjective analysis of the fusion results.

Fig. 4. Fusion results of noise filtered images (a) PCA (b) IBLPCA (c) DWT and (d) DTCWT.

5. Conclusion

In this work, a novel algorithm called IBLPCA, which evaluates the principal components of blocks of the source images, is presented. Principal components exposing local variance are derived to decide the weights for the fusion rule. By varying the number of blocks in the source images, the algorithm is applied iteratively to obtain the fusion result with maximum average mutual information. Analysis of quantitative and qualitative metrics, namely average mutual information and the mean structural similarity index, clearly demonstrates that the proposed algorithm is superior to the other algorithms for the fusion of noise free and noise filtered MR images. On the whole, IBLPCA provides better fusion of image details for both noiseless and noise filtered MR images.

Acknowledgement

The authors would like to acknowledge Keith A. Johnson and J. Alex Becker of 'The Whole Brain Atlas' for providing access to the MRI images used for the experiments and analysis.

References
[1] Z. Xu, Medical image fusion using multi-level extrema, Inf. Fusion 19 (2014) 38–48.
[2] P. Burt, E. Adelson, The Laplacian pyramid as a compact image code, IEEE Trans. Commun. 31 (4) (1983) 532–540.
[3] P.J. Burt, R.J. Kolczynski, Enhanced image capture through fusion, in: Proceedings of the 4th IEEE International Conference on Computer Vision (ICCV'93), 1993, pp. 173–182.
[4] A. Toet, J.J. Van Ruyven, J.M. Valeton, Merging thermal and visual images by a contrast pyramid, Opt. Eng. 28 (7) (1989) 789–792.
[5] A. Toet, Image fusion by a ratio of low-pass pyramid, Pattern Recognit. Lett. 9 (4) (1989) 245–253.
[6] G.K. Matsopoulos, S. Marshall, Application of morphological pyramids: fusion of MR and CT phantoms, J. Vis. Commun. Image Represent. 6 (2) (1995) 196–207.
[7] H. Li, B.S. Manjunath, S.K. Mitra, Multisensor image fusion using the wavelet transform, Graph. Models Image Process. 57 (3) (1995) 235–245.
[8] A. Khare, U.S. Tiwary, W. Pedrycz, M. Jeon, Multilevel adaptive thresholding and shrinkage technique for denoising using Daubechies complex wavelet transform, Imaging Sci. J. 58 (6) (2010) 340–358.
[9] A. Khare, M. Khare, Y.Y. Jeong, H. Kim, M. Jeon, Despeckling of medical ultrasound images using complex wavelet transform, Signal Process. 90 (2) (2010) 428–439.
[10] A. Khare, U.S. Tiwary, M. Jeon, Daubechies complex wavelet transform based multilevel shrinkage for deblurring of medical images in presence of noise, Int. J. Wavelets Multi. Inf. Process. 7 (5) (2009) 587–604.
[11] I.W. Selesnick, R.G. Baraniuk, N.G. Kingsbury, The dual-tree complex wavelet transform, IEEE Signal Process. Mag. 22 (6) (2005) 123–151.
[12] R. Singh, A. Khare, Fusion of multimodal medical images using Daubechies complex wavelet transform – a multiresolution approach, Inf. Fusion 19 (2014) 49–60.
[13] M.A. Turk, A.P. Pentland, Eigenfaces for recognition, J. Cogn. Neurosci. 3 (1) (1991) 71–86.
[14] S. Senthil Kumar, K. Mahesh Bharath, S. Muttan, Implementation of max principle with PCA in image fusion for surveillance and navigation application, Electron. Lett. Comput. Vis. Image Anal. 10 (1) (2011) 1–10.
[15] V.P.S. Naidu, J.R. Raol, Pixel-level image fusion using wavelets and principal component analysis, Def. Sci. J. 58 (3) (2008) 338–352.
[16] Z. Wang, A.C. Bovik, H.R. Sheikh, E.P. Simoncelli, Image quality assessment: from error visibility to structural similarity, IEEE Trans. Image Process. 13 (4) (2004) 600–612.
[17] T.-C. Lin, P.-T. Yu, A new adaptive center weighted median filter for suppressing impulsive noise in images, Inf. Sci. 177 (1) (2007) 1073–1087.
[18] R. Vijayarajan, S. Muttan, Local principal component averaging image fusion, Int. J. Imaging Robotics 13 (2) (2014) 94–103.
[19] H. Hosseini, F. Marvasti, Fast restoration of natural images corrupted by high-density impulse noise, EURASIP J. Image Video Process. 2013 (2013) 15.
[20] R.H. Chan, C.-W. Ho, M. Nikolova, Salt-and-pepper noise removal by median-type noise detectors and detail-preserving regularization, IEEE Trans. Image Process. 14 (10) (2005) 1479–1485.
[21] Y. Dong, R.H. Chan, S. Xu, A detection statistic for random-valued impulse noise, IEEE Trans. Image Process. 16 (4) (2007) 1112–1120.
[22] H. Ibrahim, N.S.P. Kong, T.F. Ng, Simple adaptive median filter for the removal of impulse noise from highly corrupted images, IEEE Trans. Consum. Electron. 54 (4) (2008) 1920–1927.
[23] C.A. Pomalaza-Racz, C.D. Macgillem, An adaptive non linear edge preserving filter, IEEE Trans. Acoust. Speech Signal Process. ASSP-32 (1984) 571–576.
[24] T. Sun, Y. Neuvo, Detail-preserving median based filters in image processing, Pattern Recognit. Lett. 15 (1994) 341–347.
[25] M.-S. Pan, J.T. Tang, X.-L. Yang, An adaptive median filter algorithm based on B-spline function, Int. J. Autom. Comput. 8 (1) (2011) 92–99.
[26] A. Toprak, I. Guler, Suppression of impulse noise in medical images with the use of fuzzy adaptive median filter, J. Med. Syst. 30 (6) (2006) 465–471.
[27] T. Chen, H.R. Wu, Adaptive impulse detection using center-weighted median filters, IEEE Signal Process. Lett. 8 (1) (2001) 1–3.
[28] S. Yazdi, F. Homayouni, Modified adaptive center weighted median filter for suppressing impulsive noise in images, Int. J. Res. Rev. Appl. Sci. 1 (3) (2009) 218–227.
[29] O. Rockinger, Fusion Tool Box, 1999.
[30] http://taco.poly.edu/WaveletSoftware
[31] G.H. Qu, D.L. Zhang, P.F. Yan, Information measure for performance of image fusion, Electron. Lett. 38 (7) (2002) 313–315.
[32] Y. Han, Y. Cai, Y. Cao, X. Xu, A new image fusion performance metric based on visual information fidelity, Inf. Fusion 14 (2013) 127–135.