Journal Pre-proof Natural-based underwater image color enhancement through fusion of swarm-intelligence algorithm Kamil Zakwan Mohd Azmi, Ahmad Shahrizan Abdul Ghani, Zulkifli Md Yusof, Zuwairie Ibrahim
PII: DOI: Reference:
S1568-4946(19)30591-5 https://doi.org/10.1016/j.asoc.2019.105810 ASOC 105810
To appear in:
Applied Soft Computing Journal
Received date : 10 May 2019 Revised date : 4 September 2019 Accepted date : 23 September 2019 Please cite this article as: K.Z.M. Azmi, A.S. Abdul Ghani, Z.M. Yusof et al., Natural-based underwater image color enhancement through fusion of swarm-intelligence algorithm, Applied Soft Computing Journal (2019), doi: https://doi.org/10.1016/j.asoc.2019.105810. This is a PDF file of an article that has undergone enhancements after acceptance, such as the addition of a cover page and metadata, and formatting for readability, but it is not yet the definitive version of record. This version will undergo additional copyediting, typesetting and review before it is published in its final form, but we are providing this version to give early visibility of the article. Please note that, during the production process, errors may be discovered which could affect the content, and all legal disclaimers that apply to the journal pertain.
© 2019 Elsevier B.V. All rights reserved.
*Highlights (for review)
Journal Pre-proof Highlights
Jo
urn a
lP
re-
pro of
1. The proposed natural-based underwater image color enhancement enhances contrast and color 2. Proposed NUCE method superimposes color cast neutralization, dual image fusion, and swarm equalization steps 3. The output images produce significant contrast and color while outstandingly addressing blue-green color cast 4. Both qualitative and quantitative evaluations show better improvement of the addressed problems
Journal Pre-proof Graphical abstract (for review)
RESULTANT IMAGE
HISTOGRAM CHANNEL
Superior based underwater color cast neutralization
Superior color channel
Inferior color channels
Original histogram Average point
Max value
lP
Min value
Mean value
re-
Median value
Dual-intensity images fusion based on average of mean and median values
pro of
START
urn a
Lower stretched-region
Upper stretched-region
PSO
Swarm-intelligence based mean equalization
Jo
Unsharp masking
END
Natural-based Underwater Image Color Enhancement through fusion of Swarm-Intelligence Algorithm
Journal Pre-proof *Manuscript Click here to view linked References
Natural-based Underwater Image Color Enhancement through fusion of Swarm-Intelligence Algorithm Kamil Zakwan Mohd Azmi, Ahmad Shahrizan Abdul Ghani*, Zulkifli Md Yusof, Zuwairie Ibrahim.
pro of
Faculty of Mechanical & Manufacturing Engineering, Universiti Malaysia Pahang, 26600 Pekan, Pahang, Malaysia (*corresponding author:
[email protected])
Abstract
Underwater imagery suffers from severe effects due to selective attenuation and scattering effects when light travels through water medium. Such damages limit the ability of vision tasks and reduce image quality. There are a lot of enhancement methods to improve the quality of underwater image.
re-
However, most of the methods produce distortion effects in the output images. The proposed natural-based underwater image color enhancement (NUCE) method consists of four steps. The first step introduces a new approach to neutralize underwater color cast. The inferior color channels are enhanced based on gain factors, which are calculated by considering the differences between
lP
the superior and inferior color channels. In the second step, the dual-intensity images fusion based on average of mean and median values is proposed to produce lower-stretched and upper-stretched histograms. The composition between these histograms improves the image contrast significantly. Next, the swarm-intelligence based mean equalization is proposed to improve the naturalness of the
urn a
output image. Through the fusion of swarm intelligence algorithm, the mean values of inferior color channels are adjusted to be closed to the mean value of superior color channel. Lastly, the unsharp masking technique is applied to sharpen the overall image. Experiments on underwater images that are captured under various conditions indicate that the proposed NUCE method produces better output image quality, while significantly overcoming other state-of-the-art methods.
Jo
Keywords: color correction; underwater color cast; contrast stretching; swarm-intelligence algorithm;
1. Introduction
The development of marine research is increasingly dependent on underwater images that are mostly captured by autonomous underwater vehicles and remotely operated underwater
1
Journal Pre-proof
vehicles [1]. However, underwater images are difficult to analyze due to the light absorption and scattering effects which cause underwater images to degenerate, such as low contrast and misleading colors.
pro of
A few major factors which control the attenuation rate have been identified such as water temperature and salinity, as well as the type and quantity of floating particles present in the water [2]. In addition, object colors are very important for underwater tasks and investigations. Serious deterioration leads to difficulty of information in an image to be recovered. Therefore, discovering a solution to restore the true color of underwater image is very challenging task. It is reported [3] that underwater image enhancement techniques have been proposed to
re-
address these problems. However, in some cases, the existing methods not succeed to deal with those problems effectively especially when the images are captured deep on the seafloor
the aforementioned problems.
lP
[4][5][6]. Therefore, it is imperative for researchers to introduce an effective method to address
In this paper, a new method namely natural-based underwater image color enhancement
urn a
(NUCE) for improving underwater images is presented. Following are the main contributions to be emphasized in this work. i.
The proposed NUCE method reduces the underwater color cast by enhancing the inferior color channels based on gain factors. These gain factors are calculated by considering the differences between superior and inferior color channels. The proposed method applies dual-intensity images fusion based on average of
Jo
ii.
mean and median values to improve the image contrast. The average point is determined and selected as the point of separation to produce lower-stretched and upper-stretched histograms.
2
Journal Pre-proof
iii.
In the third step, the swarm-intelligence based mean histogram equalization is proposed to improve the naturalness of the output image. Through the fusion of the swarm intelligence algorithm, the mean values of inferior color channels are
pro of
adjusted to be closed to the mean value of superior color channel. In explaining the proposed NUCE method, this paper is organized as follows: related work is presented in Section 2. In Section 3, the motivation behind this research is described. The detail of the proposed underwater image enhancement method is presented in Section 4. Before conclusion, qualitative and quantitative assessments of the proposed technique in comparison with other state-of-the-art techniques are presented. In addition, the capability of the
re-
proposed method in addressing common computer vision problem, namely keypoint matching is
2. Related works
lP
demonstrated as well to support the finding of this study.
Underwater image enhancement technique is a simple method but has a great effect on improving the quality of underwater imagery. It improves image quality by manipulating red-
urn a
green-blue (RGB) color channels intensity values according to certain rules. Chambah et al. [7] proposed a color correction method based on automatic color equalization (ACE) model. Their method is designed to solve strong and non-uniform color cast of underwater images. However, this method does not significantly improve the contrast of underwater image as some output
Jo
images produce low image contrast and color. Iqbal et al. [4] [8] used slide stretching method to improve underwater image quality. This approach is easy to apply and successfully improves underwater image quality. Due to its effectiveness, Abdul Ghani et al. [9] introduced a new method to improve the quality of underwater images based on Iqbal et al.’s method. They applied contrast stretching technique with respect to Rayleigh distribution to further improve the image contrast. This technique 3
Journal Pre-proof
succeeded in exhibiting better results compared to Iqbal et al. However, the problem regarding blue-green color cast is not well-addressed. In the other report [10], they have proposed a new approach that integrates three major steps; homomorphic filtering, recursive-overlapped CLAHS,
pro of
and dual-image wavelet fusion. This method improves the clarity of underwater images, but insignificantly eliminates bluish or greenish tone in underwater images as the blue-green color cast remains in the output images.
In 2012, Naim and Isa [11] proposed a method called pixel distribution shifting color correction (PDSCC). This method corrects the white reference point of the image and ensures that the white reference point is achromatic. However, this method is less effective in reducing
re-
the blue-green color cast, while the image contrast is not significantly improved. Abu Hassan et al. [12] improved under-exposed underwater images via enhanced homomorphic filtering and
lP
mean histogram matching which is used for object tracking. Contrast improvement can be seen in the output image when using this technique. Furthermore, this technique successfully enhanced dark areas in the image. However, in some cases, the dark areas remain in the output
urn a
images and the objects are not clearly differentiated with the background. On the other hand, underwater image restoration method focuses on reducing the effects caused by the underwater environment [13]. Most of these techniques refer to the Jaffe McGlamery model [14][15] which is a sophisticated underwater optical and required many physical parameters. Dark channel prior (DCP) [16], is one of the methods categorized under this
Jo
technique. DCP is a very effective method in eliminating haze. This method makes an assumption that information in the darkest channel of a hazy image can be attributed to the haze. Since the DCP has been effectively proven, many researchers present new underwater restoration techniques based on DCP model. For example, a red channel method [17] that can be interpreted as a DCP variant that has been proposed to restore the lost contrast.
4
Journal Pre-proof
Besides that, some existing methods which are used to offset the color of the underwater image for contrast improvement have been investigated for further analysis. For instance, unsupervised color correction (UCM) technique [4] involved Von Kries hypothesis to correct the
pro of
two lowest color channels. However, deeper investigations dealing with extremely deteriorated underwater scenes reveal that this method inadequately produces satisfactory results as the bluegreen color cast retains in the output images.
Figure 1 shows an example of output images by state-of-the-art methods. Figure 1(b) and Figure 2(b) show the resultant images produced by UCM method [4]. For turbid image, this method causes the blue channel to be excessively enhanced and leads to unnatural look as the
re-
output image become more deteriorate with bluish color cast. For deep underwater image, the red
Method
lP
channel has been stretched inappropriately, and thus generates a reddish output image.
Images
Color Histograms
Mean of RGB intensity Mean of R intensity = 34.11 Mean of G intensity = 180.09 Mean of B intensity = 12.91
Original image (turbid scene)
b
Von Kries applied in UCM
Mean of R intensity = 177.49 Mean of G intensity = 180.09 Mean of B intensity = 150.50
c
Modified Von Kries applied in DIRS
Mean of R intensity = 34.15 Mean of G intensity = 34.12 Mean of B intensity = 34.22
d
Modified Von Kries applied in IISR
Mean of R intensity = 177.55 Mean of G intensity = 253.53 Mean of B intensity = 67.74
Jo
urn a
a
5
Journal Pre-proof
Figure 1: Resultant images and their respective histograms based on existing of Von Kries methods for turbid scene.
Method
Color Histograms
Original image (deep underwater scene)
Mean of R intensity = 7.89 Mean of G intensity = 180.16 Mean of B intensity = 202.20
Mean of R intensity = 131.24 Mean of G intensity = 201.91 Mean of B intensity = 201.96
Von Kries
Modified Von Kries applied in IISR
lP
d
urn a
Modified Von Kries applied in DIRS
re-
b applied in UCM
c
Mean of RGB intensity
pro of
a
Images
Mean of R intensity = 123.27 Mean of G intensity = 180.15 Mean of B intensity = 180.26
Mean of R intensity = 8.83 Mean of G intensity = 201.92 Mean of B intensity = 226.40
Figure 2: Resultant images and their respective histograms based on existing of Von Kries methods for deep underwater scene.
Jo
Besides that, Abdul Ghani and Mat Isa [5] modified Von Kries hypothesis in their proposed method called composition of dual-intensity images and Rayleigh-stretching (DIRS) using a mean intensity color channel as a reference channel. The purpose of this modification is to ensure the gain factor is not too large or too small. However, for turbid image, this method deteriorates the green channel, which produces darker image as compared to the original image
6
Journal Pre-proof
as shown in Figure 1(c). For deep underwater image, the red channel has been stretched inappropriately as shown in Figure 2(c),which generates a reddish output image. In another report, Abdul Ghani et al. [6] proposed another variation of modified Von
pro of
Kries hypothesis in their proposed method called integrated-intensity stretched-Rayleigh histograms (IISR). In this method, the mean value is chosen to ensure an appropriate value of gain factor for color correction. However, for turbid water, the green channel has been exceedingly enhanced, which produces a greenish output image as shown in Figure 1(d). For deep underwater image, there are no or less significant improvements that can be seen as the bluish color cast retains in the output image as shown in Figure 2(d). The image histograms also
re-
remain similar to the input histograms. Based on these findings, it is crucial to introduce a technique that is capable to reduce the underwater color cast effectively.
lP
3. Motivation
Underwater image has very different features compared to a normal image captured above the ground where the blue-green color cast severely affects the image quality. Underwater
urn a
color cast occurs due to the absorption of the light spectrum in the water medium where the red color is first largely absorbed followed by the other color spectrums [13]. Therefore, the colors must be corrected prior to contrast enhancement to overcome this color disparity. For this reason, this newly proposed NUCE method is motivated based on observation of
Jo
natural landscape images as these images are consistent with human perception. Two natural landscape images with their respective histograms are shown in Figure 3. According to histogram perspective, it can be noticed that three RGB color channels are closely related, and the color distribution is more uniform in these images. Meanwhile, the mean values of all color channels are not significantly differ.
7
Journal Pre-proof
Images
Color Histograms
Mean of RGB intensity Mean of R intensity = 82.15 Mean of G intensity = 92.57 Mean of B intensity = 82.30
pro of
Natural image 1
Mean of R intensity = 89.42 Mean of G intensity = 97.12 Mean of B intensity = 86.29
Natural image 2
re-
Figure 3: Natural landscape images with their respective histograms.
Based on the above analysis, the following assumptions can be made to solve the problem regarding underwater color cast and poor contrast of underwater image. In turbid scene or in places with high concentration of plankton, the green channel
lP
i.
is relatively well-preserved, compared to red and blue channels [18]. The blue channel may be significantly attenuated due to absorption by organic matter,
urn a
while the red channel which has the longest wavelength is definitely lost first when traveling in water medium. Therefore, it is suggested that the green channel is selected as a reference channel to compensate for the attenuation of red and blue channels. This can be done by adding a fraction of the green channel to the red and blue channels.
In deep underwater scene, the blue channel is relatively well-preserved, compared
Jo
ii.
to red and green channels [18]. This is because the red and green colors produce longer wavelength than blue color, which cause them to be absorbed first. Therefore, in this case, it is suggested that the blue channel is selected as a reference channel to compensate for the attenuation of red and green channels. 8
Journal Pre-proof
This can be done by adding a fraction of the blue channel to the red and green channels. iii.
Despite white balancing (the first step) is vital to reduce the underwater color cast,
pro of
employing this step alone is not adequate to address the poor contrast problem since the details and edges of the scene have been deteriorated by the scattering effect. Therefore, this proposed method also integrates dual-intensity images fusion to improve overall contrast. iv.
In addition, the compensation should consider the mean values of all color histograms. According to natural landscape image, the mean values of all color
re-
channels are not significantly differ. This is also supported by the well-known GW [19] assumption which stated that all color channels have the same mean
lP
values before attenuation. Therefore, through the incorporation of swarm intelligence algorithm, the mean values of inferior color channels are adjusted to be closed to the mean value of superior color channel. This is expected to increase
urn a
the naturalness of the deteriorated underwater image while improving the image contrast.
4. Methodology: Natural-based underwater image color enhancement (NUCE) As illustrated in Figure 4, the proposed natural-based underwater image color
Jo
enhancement (NUCE) technique adopts four steps strategy; i) superior based underwater color cast neutralization, ii) dual-intensity images fusion based on average of mean and median values, iii) swarm-intelligence based mean equalization, and iv) unsharp masking. The first step intends to compensate for the color cast affected by the selective absorption of colors. Meanwhile, the second step is introduced to mitigate the loss of contrast resulting from back-scattering. The third step is implemented to improve the naturalness of the output image by equalizing the mean 9
Journal Pre-proof
values of all color channels. Finally, the unsharp masking technique is applied at the last stage to sharpen the output image.
Channels decomposition
pro of
Input image
Classification of superior and inferior color channels based on total pixel intensity values
Superior based underwater color cast neutralization
Determination of minimum, maximum, mean, and median values of image histogram
Computation of average point between mean and median values of image histogram
re-
Computation of gain factors J and K
Stretching both regions over the entire dynamic range
Separation into two regions at average point
Composition of lower- and upper-stretched histograms
Classification of superior and inferior color channels based on mean values
Fitness evaluation of each particle
Initialization of population and variables
urn a
lP
Dual-intensity images fusion based on average of mean and median values
Enhancement of inferior color channels
Swarm-intelligence based mean equalization
Jo
Update personal best and global best
Output image
Update velocity and position
No
Stopping condition met? Yes
Sharpen image using unsharp masking
Mean equalization
Figure 4: Flowchart of the proposed NUCE method.
10
Journal Pre-proof
4.1 Superior based underwater color cast neutralization This step aims to reduce undesirable color cast by recovering the inferior color channels which have been reduced due to selective attenuation. The image in RGB color model is first
pro of
decomposed into its respective channels (red, green, and blue). As discussed in Section 3, in turbid scene or in places with high concentration of plankton, the green channel is relatively well-preserved, compared to red and blue channels. Therefore, in this case, the green channel is regarded as superior color channel and selected as a reference channel to compensate for the attenuation of inferior color channels (red and blue channels). In contrast, for deep underwater scene, blue channel is relatively well-preserved, compared to the red and green channels.
re-
Therefore, in this case, the blue channel is regarded as superior color channel and selected as a reference channel to compensate for the attenuation of inferior color channels (red and green
lP
channels).
Mathematically, to take into account the above considerations, the superior and inferior color channels are classified based on the total pixel intensity value. Firstly, the total pixel
urn a
intensity values of red, green, and blue channels are calculated as follows:
(1)
Jo
(2)
(3)
where M and N represent the number of rows and columns of the image, respectively. , and
,
are the pixels values of red, green and blue channels, respectively at 11
Journal Pre-proof
position
. The color channel which has highest value is regarded as the superior color
channel,
. On the other hand, the other color channels which have intermediate and
minimum values are considered as inferior color channels,
and
, respectively.
pro of
Next, to compensate for the attenuation of inferior color channels, the gain factors of J and K are computed. The gain factor of J carries information of the difference between the maximum color channel and intermediate color channel in terms of total pixel intensity value, while the gain factor of K holds information regarding the difference between the maximum color channel and minimum color channel. This information is crucial to determine the appropriate amount of pixel intensity value that should be added to improve the inferior color
re-
channels. The more significant the differences between the superior color channel and inferior color channels, the higher pixel intensity values will be added to improve the inferior color
where and
urn a
lP
channels and vice versa. Mathematically, the gain factors of J and K are calculated as follows:
(4)
(5)
is the total pixel intensity value of superior color channel, while
are total pixel intensity values of other inferior color channels.
Jo
After the computation of gain factors of J and K, the inferior color channels are compensated at every pixel location (x) as follows:
(6)
12
Journal Pre-proof
(7) Figure 5 shows the illustration of underwater color cast neutralization steps. The
pro of
illustration shows an example of underwater image before and after neutralization step along with the respective histograms. For turbid scene, the green channel is superior to red and blue channels as shown in the image histograms. In this case, the proposed step will compensate for the attenuation of inferior color channels (red and blue channels). Through this step, the unwanted greenish color cast is significantly reduced.
Despite this step demonstrates excellent performance in reducing the underwater color
re-
cast, however, using this step alone is not sufficient to solve the poor contrast problem since the edges and details of the scene have been affected by the scattering problem. Therefore, in the
lP
next section, the proposed NUCE method also integrates dual-intensity images fusion to improve overall image contrast. The next subsection explains the integration in details.
After neutralization
urn a
Before neutralization
Superior color channel
Jo
Inferior color channels
Figure 5: Resultant images and their respective histograms based on neutralization step.
13
Journal Pre-proof
4.2 Dual-intensity images fusion based on average of mean and median values This step aims to enhance the edges and details of underwater image, in order to compensate the loss of contrast resulting from back-scattering effect. The step begins with the
minimum intensity value,
pro of
determination of minimum, maximum, mean, and median values of each image histogram. The refers to the lowest intensity value in the image histogram,
whereas the maximum intensity value,
refers to the highest intensity value in the image
histogram. On the other hand, the mean intensity value,
refers to the mean value of
image histogram. This is found by adding all the numbers in the data set and then dividing by the refers to the median value of
re-
number of pixels in the set. The median intensity value
image histogram. This value is the middle value when a data set is ordered from least to the greatest.
lP
It is reported that Abdul Ghani et al. [5] have used the mean value as the point of separation to divide the histogram into two regions. When the sample size is large and does not include outliers, the mean value usually provides a better measure of central tendency. However,
urn a
the median value is considered better than mean value if the data set is known to have some extreme values (high or low). In such cases, median value gives a more realistic estimate of central value. Considering this fact, the proposed method takes into account both situations by computing the average point between these values. The average point of mean value and median
Jo
value of each image histogram is computed through Equation 8.
(8)
14
Journal Pre-proof
After determine the average point between mean and median values, each image histogram is divided into two regions, namely the lower stretched-region and upper stretched-
where
and
pro of
region, as shown in Figure 6. Then, each region is stretched according to Equation 9 [4].
are the input and output of pixel intensity values, respectively.
(9)
and
indicate the minimum and maximum of intensity values of the input image, respectively, while and
are the minimum and maximum intensity values of the output image,
re-
respectively.
Each region is implemented with full stretching over the entire dynamic range. For each color channel, the division at the average point and stretching processes will generate two
lP
histograms, namely lower-stretched histograms and upper-stretched histograms. All lowerstretched histograms are composed to produce a new image. The same procedure is executed for all upper-stretched histograms. After that, these dual-intensity images are then integrated by
urn a
means of average points. Figure 7 illustrates the integration process of over-enhanced and underenhanced images to produce an enhanced-contrast output image. This step demonstrates excellent performance in improving image contrast. In the next section, an additional step is introduced to improve the naturalness of underwater image by
Jo
equalizing the mean values of all color channels.
15
Journal Pre-proof
Original histogram Average point Mean value
Median value
Max value
pro of
Min value
Upper stretched-region
re-
Lower stretched-region
lP
Figure 6: Illustration of histogram division at average point and stretching of the original histogram to produce lower-stretched and upper-stretched regions.
Jo
urn a
Over-enhanced image
Enhanced-contrast output image
Input image
Under-enhanced image
Figure 7: Integration process of under-enhanced and over-enhanced images to produce enhanced-contrast output image. 16
Journal Pre-proof
4.3 Swarm-intelligence based mean equalization As discussed in section 3, the mean values of all color channels are not significantly differ for natural landscape images. This is supported by the well-known GW [19] assumption
pro of
which stated that all color channels have the same mean values before attenuation. Thus, for the proposed NUCE method, the superior and inferior color channels are first classified based on mean value. The color channel which has the highest value of mean is regarded as the superior color channel,
. On the other hand, the other color channels which have intermediate and
minimum values of mean are considered as inferior color channels,
and
, respectively.
In order to improve the mean values of inferior color channels, the gamma correction
where
and
lP
re-
method as defined by the following power-law expression [20] is used.
(10)
are the input and output of pixel intensity values, respectively. The
application dependent constant. The appropriate value of
plays a significant role in enhancing
will make the image darker, while
urn a
the image’s naturalness.
brighter. In this proposed step, the value of
is an
will make the image
will be determined automatically through
employment of swarm-intelligence algorithm namely particle swarm optimization (PSO) [21]. PSO is employed to shift the mean values of inferior color channels to be closed to the
Jo
mean value of superior color channel. In this work, the PSO is selected because it has excellent robustness and can converge to the optimization value quickly [22]. Secondly, this optimization algorithm has been proven to be effective in many applications [23]. Moreover, it is uncomplicated to hybridize PSO with other algorithms and can be used in various application environments with a little modification [22]. To our knowledge, this is the first attempt that the optimization algorithm is being employed to perform mean equalization to the three RGB color 17
Journal Pre-proof
channels for the improvement of underwater images. Therefore, it is highly hoped that this proposed method can be easily used by other researchers for further improvement or real application in underwater imaging.
pro of
Figure 8 shows the original flowchart of the PSO algorithm. In the earlier stage of PSO algorithm, some parameters are initialized. The PSO parameter values include the number of particles, maximum inertia weight, wmax, minimum inertia weight, wmin, cognitive component, c1, and social component, c2. In addition, the initial position of particles is randomly located in a
re-
search space.
START
lP
Initialization of population and variables
Fitness evaluation of each particle
urn a
Update personal best and global best
Update velocity and position
No
Jo
Stopping criteria met? Yes END
Figure 8: Flowchart of original PSO algorithm.
18
Journal Pre-proof
After the initialization stage is done, the objective function is formulated in order to shift the mean value of inferior color channels to be closed to the mean value of superior color
where
,
pro of
channel as shown in Equation (11).
, and
(11)
indicate the minimum, intermediate and
maximum color channels, respectively, which are based on mean values.
The personal best (pbest) is the best solution found by each particle while global best (gbest) is defined as the best pbest. Both pbest and gbest are updated at every iteration. The
re-
velocities of each particle are updated based on the summation of parameters as shown in
where
and
lP
Equation 12.
(12)
are the position and velocity of particle i at iteration t, respectively. rand is and
denote the cognitive and social coefficients,
urn a
random numbers between 0 and 1. respectively, while
refers to the inertia weight at iteration t. In this proposed algorithm, the
linear dynamic inertia weight is used and calculated based on Equation (13).
and
and
denote the maximum and minimum values of inertia weight, respectively,
Jo
where
(13)
is the maximum iteration. Then, the particle’s new velocity,
is used to update the
particle’s position using Equation (14)
(14)
19
Journal Pre-proof
Finally, the process of searching for optimum solution continues until the maximum number of iteration is obtained. In addition, another stopping condition applied in this method is whenever the difference between the mean value of inferior color channels and the mean value
pro of
of superior color channel is found to be less than 0.5 in order to avoid a large computational time when applying this method.
Table 1 Parameter values of PSO algorithm. Parameter
Value
Number of particles
50
Maximum inertia weight, wmax
0.9
Minimum inertia weight, wmin
0.4
Social component, c2 Maximum iteration, tmax
2
re-
Cognitive component, c1
2
100
lP
Table 1 shows the parameter values of PSO used in this work. The chosen values of these parameters are explained as follows:
Number of particles (50): In the analysis done by Bratton and Kennedy [24], it is
urn a
reported that under testing, no swarm size between 20-100 particles produced results that were clearly superior or inferior to any other value for a majority of the tested problems. To confirm this argument, the swarm sizes between 20-100 particles are tested in this work. 300 underwater images were employed for testing. The results are similar to what they found. The findings show that the swarm size between 20-100 particles manages to equalize the mean values of all color channels for all tested underwater images. In spite of this, Bratton and Kennedy [24] suggested that 50
Jo
numbers of particles are the best to be used as swarms of this size performed better by a very slight margin when averaged across the entire range of test problems. Because of their analysis involves a two dimensional benchmark function [24], which is similar to this work, therefore, it is suggested 50 number of particles are selected to perform mean equalization.
20
Journal Pre-proof
Maximum and minimum inertia weight (0.9~0.4): This inertia weight is utilized to obtain a better balance between global exploration and local exploitation of PSO algorithm. A larger value of inertia weight promotes a global exploration (searching new areas) while a smaller value encourages a local exploitation [25]. Shi and Eberhart [25] suggested that the inertia weight is linearly decreased by the iterative
pro of
generations, where it decreases from 0.9 to 0.4. To validate this suggestion, these linearly decreased values are tested in this work. The result shows that PSO succeeds to equalize the mean values of all color channels for all tested underwater images.
Cognitive component, c1 and social component, c2 (2): Kennedy and Eberhart [21] offered a fixed value of 2, and this configuration has been implemented by many researchers [26]. In other report, Eberhart and Shi [27] set both value to 2.05. These values (2 and 2.05) have been tested in this work. However, there is no significant
re-
difference in performance when PSO is set to these values. The PSO succeeds to equalize the mean values of all color channels for all tested underwater images. Nevertheless, a fixed value of 2 is suggested as this value has been used by many
lP
other researchers [26] and proven to be effective in this work. Maximum iteration (100): This is highly problem-dependent. 300 underwater images have been tested to determine the appropriate maximum number of iteration for PSO to perform mean equalization. It is found that for each image, PSO requires
urn a
different number of executions in order to achieve this objective (mean equalization). For some images, PSO requires a higher number of executions (up to 49 executions) to achieve the objective. To account for this, it is suggested the maximum number of iteration is set to 100. However, for some images, the PSO requires only a single execution to achieve this objective. Therefore, another stopping condition applied in this method is whenever the difference between the mean value of inferior color
Jo
channels and the mean value of superior color channel is found to be less than 0.5 in order to avoid a large computational time when applying this method.
Figure 9 shows the resultant image after applying this step. Prior to enhancement, the mean value of red channel, green channel, and blue channel are 110.59, 117.70, and 123.41, respectively. After the enhancement, the proposed step successfully shifts the mean values of
21
Journal Pre-proof
inferior color channels (red and green channels) nearly closed to the mean value of superior color channel (blue channel). As the result, the mean values of red and green channels are enhanced to 123.15 and 123.72, respectively. This is expected to escalate the naturalness of underwater
pro of
image. In addition, this is consistent with the previous assumption as discussed in section 3. According to natural landscape image, the mean values of all color channels are not significantly differ. This is also supported by the well-known GW assumption which stated that all color channels have the same mean values before attenuation. Therefore, through the incorporation of swarm intelligence algorithm, the mean values of inferior color channels are adjusted to be
re-
closed to the mean value of superior color channel. In the next section, an unsharp masking
lP
technique is applied to sharpen the overall output image.
urn a
Before mean equalization
Mean of R intensity = 123.15 Mean of G intensity = 123.72 Mean of B intensity = 123.41
Jo
Mean of R intensity = 110.59 Mean of G intensity = 117.70 Mean of B intensity = 123.41
After mean equalization
Figure 9: Resultant image and its respective histograms after applying the proposed step.
22
Journal Pre-proof
4.3 Unsharp masking Unsharp masking is ideal to sharpen an image and applicable for underwater images [28]. The basic concept of this technique is to blur the original image first, then subtract the blurry
pro of
image from the original image [29]. Then, the difference is added to the original image. This basic concept is described in Equation 15.
(15)
There are three important settings that control the effectiveness of this unsharp masking: 1. Amount: It controls the strength of the desired sharpening effect. High value leads to a
re-
high contrast of output image. Normally, the value of this parameter is within the range of 0 and 2. A large value of this parameter may deteriorate image quality.
lP
2. Radius: Standard deviation of the Gaussian lowpass filter. This value determines the size of the area around the edge pixels that are affected by sharpening. A small value sharpens narrower regions around edges, while a large value sharpens broader regions around the
urn a
edges.
3. Threshold: Minimum contrast required for a pixel to be considered as an edge pixel. A higher value allows sharpening in high contrast regions only, while lower value allows sharpening in relatively smoother regions of the image. The parameter values used for unsharp masking implementation are shown in Table 2.
Jo
This is based on the default values used in [30]. Table 2 Parameter values of unsharp masking. Parameter
Value
Amount
0.8
Radius
1
Threshold
0
23
Journal Pre-proof
5. Results and discussion The effectiveness of the proposed NUCE method is compared with several state-of-theart methods namely gray world (GW) [19], integrated color model (ICM) [8], unsupervised color
pro of
correction method (UCM) [4], dual-intensity images and Rayleigh-stretching (DIRS) [5], integrated-intensity stretched-Rayleigh (IISR) [6], recursive adaptive histogram modification (RAHIM) [31], relative global histogram stretching (RGHS) [32] and underwater light attenuation prior (ULAP) [33]. GW and ICM methods are selected for comparison because these methods are famous and normally used for comparing enhanced images. On the other hand, UCM, DIRS and IISR are chosen for the comparison because these methods apply white-
re-
balancing technique to reduce underwater color cast which is also applied in the proposed NUCE method. Meanwhile, RAHIM, RGHS and ULAP are the latest color correction techniques which
lP
are also proposed to enhance underwater image.
The image quality produced by each method is evaluated qualitatively and quantitatively. In qualitative evaluation, the main criteria in determining the quality of resultant image are color
urn a
cast, contrast, under-enhanced and over-enhanced effects. On the contrary, quantitative measurement is used to support qualitative perceptions. In quantitative evaluation, the performance of each method is assessed based on several evaluation metrics such as entropy [34], average gradient [35], Sobel edge detection [4] [36], patch-based contrast quality index (PCQI) [37], underwater image quality metric (UIQM) [38], and natural image quality evaluator
Jo
(NIQE) [39]. Entropy is used to evaluate the image details. High entropy value is preferred as it indicates that the image contains more information. Meanwhile, the average gradient shows the contrast quality of an image. A high average gradient value indicates that the image has a higher intensity level and thus better contrast. Besides, Sobel edge detection is also employed in this experiment to evaluate the quality of the resultant image by detecting the object’s boundary
24
Journal Pre-proof
within the image. Sobel edge detector which is provided in MATLAB is used in this experiment. The high Sobel edge detection value indicates that the resultant image is better as the boundaries of the object are significantly detected. On the other hand, PCQI is a general-purpose image
pro of
contrast metric, which is employed to assess the contrast quality. UIQM is specifically designed to evaluate underwater imagery, where it takes into account three important criteria; colorful, sharpness and contrast. Larger PCQI and UIQM values show better image quality. The last metric used for evaluation is NIQE, where it compares the resultant image to a default model computed from images of natural scenes. Small NIQE value shows better image quality. In this experiment, 300 underwater images are used for testing and comparison. These
re-
underwater images are highly affected by underwater color cast and suffered from poor contrast. Most of the objects in the images are hardly differentiated from the background and thus
lP
possessed extremely low visibility level. Four images are selected for deeper discussion, namely diver, turtle, fishes, and coral branch. The resultant images produced by all methods are shown in Figures 10 to 13. In addition, the histograms of all color channels as well as the mean values
urn a
of respective histograms are included for extensive discussion. To support qualitative evaluations, Table 3 shows the quantitative results of the sample images shown in those figures. Additional sample images are attached in Appendix A and the quantitative evaluation of these
Jo
images is featured in Appendix B.
25
Journal Pre-proof
The effect of greenish color cast dominates the original image of diver as shown in Figure 10(a). The objects are hardly seen due to poor contrast. Through comparison, GW tends to produce a reddish image. Despite this method succeeds to equalize the mean values of all
pro of
color channels, however, the enhancement of these color channels are not well-balanced. It can be seen that the histograms of green and blue channels are insufficiently stretched as compared to red channel. ICM inadequately improves the overall image quality as the bluish color cast retains at the background as shown in the circle of Figure 10(c). UCM produces a yellowish output image as the histogram of blue channel is insufficiently stretched. Besides that, the mean value of this channel is also very low as compared to red and green channels. DIRS is able to
re-
reduce the greenish color cast and obtains the highest score in UIQM (1.525) and NIQE (3.519). However, the over-enhanced effect can be detected at the background. IISR deteriorates the
lP
original image as the overall image become too bright. This phenomenon results in reducing the image detail. According to histograms, the mean value of green channel is extremely increased compared to red and blue channels. Moreover, the high value of NIQE (4.026) verifies that the
urn a
resultant image produced by this method is not positively enhanced. RAHIM and RGHS insufficiently improve the image quality as the greenish color cast retains at the background. The entire output image produced by ULAP is overshadowed by the greenish color cast as the red and blue channels are poorly improved. On the other hand, the proposed NUCE method significantly enhances the image as the greenish color cast is adequately reduced and the objects
Jo
can be seen clearly. The image contrast is also well-improved. According to color histograms, all mean values are nearly equivalent, which is consistent with GW assumption and natural landscape image. This superior performance is also supported by the quantitative evaluation reported in Table 3(a) as the proposed method achieves the best rank for entropy, average gradient, Sobel count and PCQI with the values of 7.920, 6.813, 9037, and 1.267, respectively.
26
Journal Pre-proof
The original image of turtle in Figure 11(a) has been covered by strong greenish color cast. According to histograms, the green channel dominates the other two color channels by having the largest mean value (180.07). GW worsens the original image, as the image becomes
pro of
too dark. The output images produced by ICM and UCM indicate contrast improvement as the objects are distinguishable from the background. However, the greenish color cast is not significantly reduced. In addition, the UCM oversaturates the original image as some regions become too bright. DIRS is able to reduce the greenish color cast as the color of turtle looks real and natural. However, the image contrast is not well-improved as the histograms of all color channels are insufficiently stretched. Similar to previous tested image, IISR deteriorates further
re-
the original image, as the greenish color cast extremely overshadows the resultant image. It can be noticed that the green channel has been excessively enhanced as the mean value of this
lP
channel is too large (251.98). Meanwhile, RAHIM obtains the highest rank in UIQM metric with the value of 1.542. However, visual observation indicates that the objects in the image are dimmed as the greenish color cast retains in the output image. The output image also exhibits
urn a
limited contrast enhancement as the histograms of red and blue channels are not properly stretched. RGHS is able to reduce the greenish color cast to some extent. However, the histogram of blue channel is not appropriately stretched that causes the mean intensity value of this channel to be very low (58.41) compared to the other channels. There is no significant improvement can be seen in the output image produced by ULAP as the output image retains the greenish color
Jo
cast. In addition, this inadequacy is supported by quantitative evaluation as this method yields a very low UIQM value (1.051). On the other hand, the proposed NUCE method produces the best image quality, as the image exhibits significantly improved contrast and color. The greenish color cast is significantly reduced as the objects can be easily distinguished from the background. In addition, there is no under-enhanced and over-enhanced effects produced in the output image.
27
Journal Pre-proof
The color histograms produced by the proposed NUCE method also display the best distribution compared to the other methods. This finding is supported by the quantitative evaluation reported in Table 3(b) as the proposed NUCE method achieves the best rank for entropy, average
pro of
gradient, Sobel count and PCQI, with the values of 7.719, 9.473, 13130, and 1.441, respectively. The original image of fishes in Figure 12(a) has poor contrast and the real color of fishes is overshadowed by the bluish color cast. The comparisons show that the output image produced by GW appears reddish especially at the foreground as the colors of fishes and stones turns red. ICM over-enhances the original image as the background color becomes too bright. Similar to GW, the output images produced by UCM and DIRS turn reddish especially the color of stones
re-
located at the foreground. There is no significant improvement can be seen in the output image produced by IISR and RAHIM as the output images retain the bluish color cast. Similar to ICM,
lP
RGHS over-enhances the original image as the background color becomes too bright. ULAP has successfully reduced the impact of bluish color cast in the foreground as the fishes are clearly visible. However, the bluish color cast in the background is insufficiently reduced. Furthermore,
urn a
this method obtains a very high value of NIQE (4.024) which indicates poor naturalness quality. On the other hand, the proposed NUCE method shows the best output image as the visibility of the image is significantly improved at the background and foreground areas. The image contrast is also well-improved as the fishes and stones can be easily distinguished from the background. This excellent performance is also supported by the quantitative analysis reported in Table 3(c).
Jo
The proposed NUCE method achieves the best ranks for average gradient, Sobel count, and NIQE metrics, with the value of 6.398, 8194, 3.355, respectively.
28
Journal Pre-proof
As shown in Figure 13(a), the original image of coral branch is severely deteriorated by bluish color cast. Besides, the image contrast is also very poor as the objects in the image are hardly seen. The output image produced by GW is deteriorated further as the whole image turns
pro of
reddish. The histograms of green and blue channels show that these color channels are poorly enhanced. This visual observation is supported by the quantitative evaluation as the entropy value produced by GW (5.622) is lower than in original image (6.524). ICM is capable to reduce the bluish color cast to some extent, however the image contrast is not significantly improved as the histogram of red channel is not properly stretched. Similar to GW, UCM and DIRS tend to produce reddish output images especially the colors of objects as highlighted in circle. There is
re-
no significant improvement can be seen in the output images processed by IISR, RAHIM and RGHS methods as the output images retain the bluish color cast. In addition, the histograms of
lP
red channels for these images are inadequately stretched. The performance of ULAP in improving this image is poor as the output image and the original image do not significantly differ. On the other hand, the proposed NUCE method produces the best image quality as the
urn a
bluish color cast is significantly reduced. Moreover, the resultant image exhibits significantly improved contrast as the objects can be easily distinguished from the background. This notable performance is also supported by the quantitative analysis reported in Table 3(d) as the proposed NUCE method produces the highest entropy and PCQI, with the values of 7.811 and 1.346,
Jo
respectively.
29
Journal Pre-proof
Method
Resultant images
Histograms
Mean of R intensity = 49.28 Mean of G intensity = 145.30 Mean of B intensity = 88.13
pro of
(a) Original image
Mean of R intensity = 94.24 Mean of G intensity = 94.24 Mean of B intensity = 94.24
(b) GW
Mean of R intensity = 85.57 Mean of G intensity = 101.29 Mean of B intensity = 99.04
(c) ICM
re-
(d) UCM
lP
(e) DIRS
(g) ULAP
Mean of R intensity = 107.41 Mean of G intensity = 107.23 Mean of B intensity = 68.83 Mean of R intensity = 90.93 Mean of G intensity = 99.08 Mean of B intensity = 98.07 Mean of R intensity = 79.23 Mean of G intensity = 162.85 Mean of B intensity = 93.00 Mean of R intensity = 76.40 Mean of G intensity = 108.82 Mean of B intensity = 102.69 Mean of R intensity = 84.89 Mean of G intensity = 106.46 Mean of B intensity = 97.88
Jo
(g) RAHIM
urn a
(f) IISR
(h) RGHS
Mean values
Mean of R intensity = 69.51 Mean of G intensity = 136.15 Mean of B intensity = 73.99 Mean of R intensity = 120.61 Mean of G intensity = 121.08 Mean of B intensity = 121.66
(h) NUCE
30
Journal Pre-proof
Figure 10: Images of diver tested with different method. Method
Resultant images
Histograms
Mean values Mean of R intensity = 34.11 Mean of G intensity = 180.09 Mean of B intensity = 12.91
(b) GW
Mean of R intensity = 75.71 Mean of G intensity = 75.70 Mean of B intensity = 75.00
pro of
(a) Original image
Mean of R intensity = 102.27 Mean of G intensity = 108.42 Mean of B intensity = 68.62
(c) ICM
re-
(d) UCM
lP
(e) DIRS
(h) RGHS
(g) ULAP
Mean of R intensity = 104.46 Mean of G intensity = 106.51 Mean of B intensity = 92.24 Mean of R intensity = 136.69 Mean of G intensity = 251.98 Mean of B intensity = 65.20 Mean of R intensity = 76.07 Mean of G intensity = 115.24 Mean of B intensity = 62.04 Mean of R intensity = 111.17 Mean of G intensity = 114.43 Mean of B intensity = 58.41
Jo
(g) RAHIM
urn a
(f) IISR
Mean of R intensity = 128.97 Mean of G intensity = 121.58 Mean of B intensity = 56.02
Mean of R intensity = 35.65 Mean of G intensity = 175.32 Mean of B intensity = 11.93 Mean of R intensity = 122.53 Mean of G intensity = 123.17 Mean of B intensity = 123.53
(h) NUCE
31
Journal Pre-proof
Figure 11: Images of turtle tested with different methods. Method
Resultant images
Histograms
Mean values Mean of R intensity = 55.89 Mean of G intensity = 148.07 Mean of B intensity = 183.96
(b) GW
Mean of R intensity = 114.83 Mean of G intensity = 129.31 Mean of B intensity = 129.31
pro of
(a) Original image
Mean of R intensity = 76.63 Mean of G intensity = 192.50 Mean of B intensity = 195.86
(c) ICM
re-
(d) UCM
lP
(e) DIRS
(h) RGHS
(g) ULAP
Mean of R intensity = 101.17 Mean of G intensity = 116.47 Mean of B intensity = 120.59 Mean of R intensity = 65.52 Mean of G intensity = 162.19 Mean of B intensity = 196.69 Mean of R intensity = 78.32 Mean of G intensity = 107.27 Mean of B intensity = 118.69 Mean of R intensity = 66.89 Mean of G intensity = 167.29 Mean of B intensity = 204.14
Jo
(g) RAHIM
urn a
(f) IISR
Mean of R intensity = 131.49 Mean of G intensity = 158.78 Mean of B intensity = 157.98
Mean of R intensity = 94.96 Mean of G intensity = 137.22 Mean of B intensity = 173.88 Mean of R intensity = 144.77 Mean of G intensity = 144.61 Mean of B intensity = 144.85
(h) NUCE
32
Journal Pre-proof
Figure 12: Images of fishes tested with different methods. Method
Resultant images
Histograms
Mean values Mean of R intensity = 7.89 Mean of G intensity = 180.16 Mean of B intensity = 202.20
(b) GW
Mean of R intensity = 101.86 Mean of G intensity = 130.08 Mean of B intensity = 130.04
pro of
(a) Original image
Mean of R intensity = 43.06 Mean of G intensity = 123.79 Mean of B intensity = 120.35
(c) ICM
re-
(d) UCM
lP
(e) DIRS
(h) RGHS
(g) ULAP
Mean of R intensity = 116.14 Mean of G intensity = 107.51 Mean of B intensity = 106.45 Mean of R intensity = 41.11 Mean of G intensity = 101.84 Mean of B intensity = 111.05 Mean of R intensity = 38.26 Mean of G intensity = 96.90 Mean of B intensity = 101.83 Mean of R intensity = 22.24 Mean of G intensity = 117.05 Mean of B intensity = 114.57
Jo
(g) RAHIM
urn a
(f) IISR
Mean of R intensity = 133.81 Mean of G intensity = 115.81 Mean of B intensity = 104.61
Mean of R intensity = 23.68 Mean of G intensity = 163.65 Mean of B intensity = 193.51 Mean of R intensity = 126.06 Mean of G intensity = 126.07 Mean of B intensity = 125.97
(h) NUCE
33
Journal Pre-proof
Jo
urn a
lP
re-
pro of
Figure 13: Images of coral branch tested with different methods.
34
Journal Pre-proof
Table 3 Underwater image quality comparisons based on quantitative evaluation.
c)
d)
Turtle
Original image GW ICM UCM DIRS IISR RAHIM RGHS ULAP NUCE
6.713 6.075 7.555 7.301 6.842 4.856 7.445 7.681 6.930 7.719
Fishes
Original image GW ICM UCM DIRS IISR RAHIM RGHS ULAP NUCE Original image GW ICM UCM DIRS IISR RAHIM RGHS ULAP NUCE
Coral branch
pro of
Diver
7.556 7.030 7.804 7.665 7.584 7.113 7.681 7.797 7.711 7.920
UIQM
NIQE
1.027 1.136 1.505 1.469 1.525 1.503 1.481 1.470 1.272 1.491
3.822 3.769 3.788 3.849 3.519 4.026 3.797 3.807 3.826 3.840
1.575 4.260 7.769 7.886 7.968 5.804 6.188 6.764 2.345 9.473
470 1759 9969 11132 11012 4639 8449 8930 1609 13130
1.000 0.992 1.429 1.209 1.347 0.973 1.393 1.410 1.145 1.441
0.807 1.189 1.537 1.387 1.529 1.207 1.542 1.520 1.051 1.519
4.996 4.344 4.812 6.947 4.911 4.615 4.505 4.861 4.608 4.727
7.813 7.103 7.273 7.801 7.474 7.001 7.653 7.582 7.820 7.620
2.946 3.288 4.617 5.309 5.908 4.062 5.576 4.009 4.316 6.398
3122 3776 6115 7398 7652 5102 7304 5293 5583 8194
1.000 0.987 1.048 1.130 1.161 1.035 1.174 1.048 1.115 1.164
1.075 1.172 1.290 1.387 1.560 1.238 1.491 1.252 1.281 1.392
4.016 3.782 3.802 3.719 3.616 3.956 3.571 3.814 4.024 3.355
6.524 5.622 7.411 7.310 7.229 7.219 7.494 7.415 7.295 7.811
0.955 4.149 4.661 7.717 7.269 4.171 3.789 3.434 1.775 6.202
197 4868 5007 11102 10087 4377 3633 3603 794 8006
1.000 1.075 1.278 1.226 1.187 1.197 1.167 1.174 1.125 1.346
0.599 1.311 1.498 1.611 1.641 1.460 1.486 1.427 0.841 1.474
7.601 5.635 5.690 5.740 5.921 5.644 5.572 5.838 7.214 5.919
lP
urn a
b)
Entropy
Original image GW ICM UCM DIRS IISR RAHIM RGHS ULAP NUCE
Jo
a)
Method
re-
Image
Quantitative Analysis Average Sobel PCQI Gradient Count 2.459 1390 1.000 2.944 1531 0.943 5.441 6159 1.209 5.147 6514 1.196 5.320 6076 1.165 5.491 6541 1.107 5.091 5654 1.232 4.679 5046 1.185 3.659 3178 1.134 6.813 9037 1.267
Note: The values in bold typeface represent the best result obtained in the comparison.
35
Journal Pre-proof
Table 4 Average quantitative result of 300 tested underwater images.
Entropy
Original image GW ICM UCM DIRS IISR RAHIM RGHS ULAP NUCE
7.064 6.607 7.567 7.571 7.448 7.258 7.568 7.615 7.248 7.782
Average Gradient 3.283 4.418 7.506 8.227 8.401 7.112 7.200 6.381 5.023 9.795
UIQM
NIQE
1.116 1.231 1.548 1.539 1.622 1.600 1.578 1.531 1.377 1.577
4.244 4.801 3.977 4.615 4.925 3.959 3.709 3.751 3.811 3.825
pro of
Method
Quantitative Analysis Sobel PCQI Count 11012 1.000 14788 0.976 27905 1.204 31286 1.194 30441 1.179 25763 1.148 28158 1.222 24550 1.146 19401 1.073 35673 1.276
re-
Note: The values in bold typeface represent the best result obtained in the comparison.
Dataset in Table 4 shows the average quantitative results of 300 tested underwater images. Based on these results, the proposed NUCE method produces the highest average score
lP
for entropy, average gradient, Sobel count, and PCQI with the values of 7.782, 9.795, 35673, and 1.276, respectively. For UIQM and NIQE performance metrics, the proposed NUCE method is in fourth place, with the values of 1.577 and 3.825, respectively. However, this does not
urn a
necessarily indicate that the proposed NUCE is inferior compared to the other methods. The quantitative evaluation metrics that are being used are subjective and thus have difficulties in quantifying accurately the improvements done by an image enhancement method [40]. In some cases, some performance metrics ineffectively accomplish a result that is in agreement with the
Jo
human perception of image quality.
For example, based on Figure 11 (image of turtle), RAHIM obtains a better score for UIQM metric (1.542) compared to the proposed NUCE (1.519). However, visual observation indicates that the objects in the image are dimmed as the greenish color cast retains in the output image. The output image also exhibits limited contrast enhancement as the histograms of red and blue channels are not properly stretched. On the other hand, the proposed NUCE method 36
Journal Pre-proof
produces the best image quality, as the image exhibits significantly improved contrast and color. The greenish color cast is significantly reduced as the objects can be easily distinguished from the background.
pro of
Another example, based on the same image, GW obtains a better score for NIQE metric (4.344) compared to the proposed NUCE (4.727). However, according to visual inspection, GW worsens the original image, as the image becomes too dark. On the other hand, the proposed NUCE method produces the best image quality, as the greenish color cast is significantly reduced. The color histograms produced by the proposed NUCE also display a better distribution compared to the GW.
re-
Hence, in terms of image quality comparison, visual qualitative evaluation by human visual system which is made through observation is taken as the first priority for overall image
lP
quality evaluation [31]. The quantitative performance metrics are used to support the visual evaluation. Nevertheless, when a disagreement occurred between these two types of evaluations, the visual observation is taken as a more convincing result.
urn a
Furthermore, a statistical test is performed on the results reported in Table 4 to validate further whether the performance of the proposed NUCE method exhibits a significant improvement as compared to the other methods. To analyze the performance of these methods statistically, Wilcoxon signed rank test is chosen [41]. The test begins with finding the differences between the performances of the two methods in each quantitative metrics. Then, the
Jo
differences are ranked based on their absolute value. Next, the sum of ranks where the first method outperforms the second method, R+, and the sum of ranks where the second method outperforms the first method, R-, are computed. The smallest value between R+ and R- is selected as the test statistic, T. The null hypothesis is rejected if the test statistic value is less than critical value, T0. The rejection of the null hypothesis indicates that the performance of these two
37
Journal Pre-proof
methods significantly differs. Table 5 shows the R+, R-, T, and T0 for all the pairwise comparisons concerning the proposed NUCE. The significance level, α, is set to 0.1 [41]. The statistic value, T, of Wilcoxon signed rank test obtained is compared with T0 from table of
pro of
critical values of T0 in Wilcoxon signed rank test [42]. If T is found to be less than T0, therefore, significance difference exists. As the table states, NUCE shows a significant improvement over GW, ICM, UCM, DIRS, IISR and ULAP.
Table 5 Wilcoxon signed ranks test results. R0 0 0 0 1 1 4 2 1
T 0 0 0 0 1 1 4 2 1
re-
R+ 21 21 21 21 20 20 17 19 20
lP
Comparison NUCE versus Original image NUCE versus GW NUCE versus ICM NUCE versus UCM NUCE versus DIRS NUCE versus IISR NUCE versus RAHIM NUCE versus RGHS NUCE versus ULAP
T0 (α = 0.1) 2 2 2 2 2 2 2 2 2
Significant Yes Yes Yes Yes Yes Yes No No Yes
urn a
Underwater color cast and poor contrast in underwater images increase complexity to various computer vision algorithms such as detection and localization. For example one of the main limitations of the existing underwater SLAM (Simultaneous Localization and Mapping) methods is due to the lack of robust, stable and local feature points that can be matched reliably
Jo
in numerous and challenging underwater scenes [43]. Therefore, to verify the effectiveness of the proposed method for the task of matching images, a keypoint matching test [44] is used. Two challenging underwater images pairs (turbid and deep underwater scenes) are selected for evaluation. Figure 14(a) shows the keypoint matching applied on original images pairs. Due to poor quality of these images, there is insignificant number of keypoint matching than can be detected. 38
Journal Pre-proof
Meanwhile, after the original images have been improved by the proposed NUCE method, the resultant images exhibit significantly improved contrast as the objects can be easily distinguished from the background. The bluish color cast is also significantly reduced. Therefore,
pro of
by applying the same matching procedures to these resultant images, the proposed NUCE method is able to obtain significant additional number of valid matches compared to original images pairs. This test proves that the output images processed by the proposed method have more details. This application test provides additional evidence on the effectiveness of the proposed NUCE method.
b
Proposed NUCE method
lP
Original image
Deep underwater images
urn a
a
Turbid images
re-
Method
Figure 14: Keypoint matching test.
6. Conclusion
Jo
In this paper, the proposed NUCE method is successfully implemented to enhance the quality of underwater images. Motivated from natural landscape image, the proposed method integrates four major steps which involves swarm-intelligence algorithm to equalize mean values of all color channel histograms. As shown in the results, the proposed method is able to enhance various underwater scenes with high visibility, and being able to reduce underwater color cast
39
Journal Pre-proof
significantly. In addition, the proposed method outperforms other state-of-the-art techniques qualitatively and quantitatively.
Acknowledgments
pro of
We would like to thank all reviewers for the comments and suggestions to improve this paper. This study is supported by Universiti Malaysia Pahang (UMP) through Research Grant (RDU1803131) entitled “Development of multi-vision guided obstacle avoidance system for ground vehicle”.
References
M. Bryson, M. Johnson-Roberson, O. Pizarro, and S. B. Williams, “True color correction
re-
[1]
of autonomous underwater vehicle imagery,” J. F. Robot., vol. 33, no. 6, pp. 853–874, 2015. [2]
C. Li, J. Guo, and C. Guo, “Emerging from water: Underwater image color correction
323–327, 2018. [3]
lP
based on weakly supervised color transfer,” IEEE Signal Process. Lett., vol. 25, no. 3, pp.
P. Sahu, “A survey on underwater image enhancement techniques,” Int. J. Comput. Appl.,
[4]
urn a
vol. 87, no. 13, pp. 19–23, 2014.
K. Iqbal, M. Odetayo, A. James, R. A. Salam, and A. Z. H. Talib, “Enhancing the low quality images using unsupervised colour correction method,” in Proceedings of the IEEE International Conference on Systems, Man and Cybernetics, 2010, pp. 1703–1709.
[5]
A. S. Abdul Ghani and N. A. Mat Isa, “Underwater image quality enhancement through composition of dual-intensity images and Rayleigh-stretching.,” Springerplus, vol. 3, no.
[6]
Jo
1, pp. 1–14, 2014.
A. S. Abdul Ghani, R. S. N. A. Raja Aris, and M. L. Muhd Zain, “Unsupervised contrast correction for underwater image quality enhancement through integrated-intensity stretched-Rayleigh histograms,” J. Telecommun. Electron. Comput. Eng., vol. 8, no. 3, pp. 1–7, 2016.
[7]
M. Chambah, D. Semani, A. Renouf, P. Coutellemont, and A. Rizzi, “Underwater color 40
Journal Pre-proof
constancy : enhancement of automatic live fish recognition,” in Proceeding of the 16th Annual symposium on electronic imaging, 2004, pp. 157–168.
[8]
K. Iqbal, R. A. Salam, A. Osman, and A. Z. Talib, “Underwater image enhancement using
[9]
pro of
an integrated colour model,” Int. J. Comput. Sci., vol. 34, no. 2, pp. 239–244, 2007. A. S. Abdul Ghani and N. A. Mat Isa, “Underwater image quality enhancement through integrated color model with Rayleigh distribution,” Appl. Soft Comput., vol. 27, pp. 219– 230, 2015.
[10] A. S. Abdul Ghani, “Image contrast enhancement using an integration of recursiveoverlapped contrast limited adaptive histogram specification and dual-image wavelet fusion for the high visibility of deep underwater image,” Ocean Eng., vol. 162, pp. 224–
re-
238, Aug. 2018.
[11] M. J. N. Mohd Naim and N. A. Mat Isa, “Pixel distribution shifting color correction for digital color images,” Appl. Soft Comput., vol. 12, no. 9, pp. 2948–2962, 2012.
lP
[12] M. F. Abu Hassan, A. S. Abdul Ghani, D. Ramachandram, A. Radman, and S. A. Suandi, “Enhancement of under-exposed image for object tracking algorithm through homomorphic filtering and mean histogram matching,” Adv. Sci. Lett., vol. 23, no. 11, pp.
urn a
11257–11261, 2017.
[13] R. Schettini and S. Corchs, “Underwater image processing: State of the art of restoration and image enhancement methods,” EURASIP J. Adv. Signal Process., vol. 2010, pp. 1–14, 2010.
[14] J. S. Jaffe, “Computer modeling and the design of optimal underwater imaging systems,” IEEE J. Ocean. Eng., vol. 15, no. 2, pp. 101–111, 1990.
Jo
[15] R. Garcia, T. Nicosevici, and X. Cufi, “On the way to solve lighting problems in underwater imaging,” in Proceeding of the Oceans ’02 MTS/IEEE, 2002, pp. 1018–1024. [16] K. He, J. Sun, and X. Tang, “Single image haze removal using dark channel prior,” IEEE Trans. Pattern Anal. Mach. Intell., vol. 33, no. 12, pp. 2341–2353, 2011. [17] A. Galdran, D. Pardo, A. Picon, and A. Alvarez-Gila, “Automatic Red-Channel underwater image restoration,” J. Vis. Commun. Image Represent., vol. 26, pp. 132–145, 41
Journal Pre-proof
2015.
[18] C. O. Ancuti, C. Ancuti, C. De Vleeschouwer, and P. Bekaert, “Color balance and fusion
pro of
for underwater image enhancement,” IEEE Trans. Image Process., vol. 27, no. 1, pp. 379– 393, Jan. 2018.
[19] G. Buchsbaum, “A spatial processor model for object colour perception,” J. Franklin Inst., vol. 310, no. 1, pp. 1–26, 1980.
[20] R. C. Gonzalez and R. E. Woods, Digital image processing, Second Edi. New Jersey: Prentice Hall, 2002.
re-
[21] J. Kennedy and R. Eberhart, “Particle swarm optimization,” in Proceedings of the IEEE International Conference on Neural Networks, 1995, vol. 4, pp. 1942–1948. [22] D. Wang, D. Tan, and L. Liu, “Particle swarm optimization algorithm: an overview,” Soft
lP
Comput., vol. 22, no. 2, pp. 387–408, 2018.
[23] R. Poli, “Analysis of the publications on the applications of particle swarm optimisation,” J. Artif. Evol. Appl., no. 2, pp. 1–10, 2008.
[24] D. Bratton and J. Kennedy, “Defining a standard for particle swarm optimization,” in
urn a
Proceedings of the 2007 IEEE Swarm Intelligence Symposium, 2007, pp. 120–127. [25] Y. Shi and R. C. Eberhart, “Empirical study of particle swarm optimization,” in Proceedings of the 1999 Congress on Evolutionary Computation, 1999, vol. 3, pp. 1945– 1950.
[26] Z. Beheshti and S. M. H. Shamsuddin, “A review of population-based meta-heuristic
Jo
algorithm,” Int. J. Adv. Soft Comput. its Appl., vol. 5, no. 1, pp. 1–35, 2013. [27] R. C. Eberhart and Y. Shi, “Comparing inertia weights and constriction factors in particle swarm optimization,” in Proceedings of the 2000 Congress on Evolutionary Computation, 2000, vol. 1, pp. 84–88. [28] A. Zaafouri, M. Sayadi, and F. Fnaiech, “A developed unsharp masking method for images contrast enhancement,” in International Multi-Conference on Systems, Signals and Devices, 2011, pp. 1–6. 42
Journal Pre-proof
[29] J. N. Archana and P. Aishwarya, “A review on the image sharpening algorithms using unsharp masking,” Int. J. Eng. Sci. Comput., vol. 6, no. 7, pp. 8729–8733, 2016. [30] “MATLAB Image Processing Toolbox 2016b.” . [31] A. S. Abdul Ghani and N. A. Mat Isa, “Automatic system for improving underwater
pro of
image contrast and color through recursive adaptive histogram modification,” Comput. Electron. Agric., vol. 141, pp. 181–195, 2017.
[32] D. Huang, Y. Wang, W. Song, J. Sequeira, and S. Mavromatis, “Shallow-water image enhancement using relative global histogram stretching based on adaptive parameter acquisition,” in Schoeffmann K. et al. (eds) MultiMedia Modeling. MMM 2018. Lecture Notes in Computer Science, vol. 10704, Springer, Cham, 2018, pp. 453–465. [33] W. Song, Y. Wang, D. Huang, and D. Tjondronegoro, “A rapid scene depth estimation
re-
model based on underwater light attenuation prior for underwater image restoration,” in Advances in Multimedia Information Processing, 2018, pp. 678–688. [34] Z. Ye, “Objective assessment of nonlinear segmentation approaches to gray level
lP
underwater images,” ICGST J. Graph. Vis. Image Process., vol. 9, no. II, pp. 39–46, 2009. [35] J. Wu, H. Huang, Y. Qiu, H. Wu, J. Tian, and J. Liu, “Remote sensing image fusion based on average gradient of wavelet transform,” in IEEE International Conference on
urn a
Mechatronics and Automation, 2005, vol. 4, pp. 1817–1822. [36] C. Munteanu and A. Rosa, “Gray-scale image enhancement as an automatic process driven by evolution,” IEEE Trans. Syst. Man, Cybern. Part B Cybern., vol. 34, no. 2, pp. 1292–1298, 2004.
[37] S. Wang, K. Ma, H. Yeganeh, Z. Wang, and W. Lin, “A patch-structure representation method for quality assessment of contrast changed images,” IEEE Signal Process. Lett.,
Jo
vol. 22, no. 12, pp. 2387–2390, 2015. [38] K. Panetta, C. Gao, and S. Agaian, “Human-visual-system-inspired underwater image quality measures,” IEEE J. Ocean. Eng., vol. 41, no. 3, pp. 541–551, 2016. [39] A. Mittal, R. Soundararajan, and A. C. Bovik, “Making a ‘completely blind’ image quality analyzer,” IEEE Signal Process. Lett., vol. 20, no. 3, pp. 209–212, 2013. [40] S. P. Rao, R. Rajendran, K. Panetta, and S. S. Agaian, “Combined transform and spatial 43
Journal Pre-proof
domain based ‘ no reference ’ measure for underwater images,” in Proceedings of the IEEE International Symposium on Technologies for Homeland Security (HST), 2017, pp. 1–7. [41] J. Derrac, S. García, D. Molina, and F. Herrera, “A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm
pro of
intelligence algorithms,” Swarm Evol. Comput., vol. 1, no. 1, pp. 3–18, 2011. [42] N. A. Ab. Aziz, M. Mubin, Z. Ibrahim, and S. W. Nawawi, “Statistical analysis for swarm intelligence — simplified,” Int. J. Futur. Comput. Commun., vol. 4, no. 3, pp. 193–197, 2015.
[43] A. Kim and R. M. Eustice, “Real-time visual SLAM for autonomous underwater hull inspection using visual saliency,” IEEE Trans. Robot., vol. 29, no. 3, pp. 719–733, 2013.
re-
[44] H. Bay, A. Ess, T. Tuytelaars, and L. Van Gool, “Speeded-up robust features (SURF),”
Jo
urn a
lP
Comput. Vis. Image Underst., vol. 110, no. 3, pp. 346–359, Jun. 2008.
44
Journal Pre-proof
Appendix A Original
GW
ICM
UCM
DIRS
IISR
RAHIM RGHS
pro of
1
2
3
re-
4
5
lP
6
7
10
Jo
9
urn a
8
45
ULAP
NUCE
Journal Pre-proof
Appendix B Image
Method
Quantitative Analysis Sobel PCQI Count 5240 1.000 4443 0.891 9593 1.171 9939 1.162 8815 1.103 9486 1.133 10712 1.161 8851 1.146 8992 1.103 13701 1.192
7.432 7.240 7.769 7.656 7.611 7.303 7.736 7.735 7.451 7.891
Image 2
Original image GW ICM UCM DIRS IISR RAHIM RGHS ULAP NUCE
7.036 6.625 7.570 7.511 7.517 7.474 7.647 7.600 7.327 7.576
5.114 5.500 11.003 10.842 10.723 10.547 12.224 10.130 8.974 15.291
7921 8861 13306 13405 13270 13154 13364 12840 12369 14574
1.000 1.025 1.222 1.209 1.233 1.239 1.217 1.199 1.197 1.200
1.369 1.384 1.543 1.530 1.654 1.562 1.569 1.501 1.583 1.719
3.201 3.183 3.291 3.247 3.550 3.561 3.116 3.304 3.064 3.357
Image 3
Original image GW ICM UCM DIRS IISR RAHIM RGHS ULAP NUCE
7.266 6.639 7.722 7.391 7.515 4.779 7.286 7.419 7.629 7.859
1.099 1.750 3.821 3.556 3.903 2.903 2.831 2.441 1.672 3.973
360 484 3512 3809 4001 1533 2469 2376 1258 3689
1.000 0.846 1.190 1.131 1.116 0.756 1.087 1.090 1.070 1.221
0.599 0.757 1.306 1.264 1.392 0.867 1.253 1.169 0.805 1.264
7.767 6.310 4.622 4.696 3.865 4.619 4.389 5.763 6.151 4.234
Image 4
Original image GW ICM UCM DIRS IISR RAHIM RGHS ULAP NUCE
1.150 1.466 2.801 2.573 2.680 2.144 3.147 2.318 1.718 3.309
466 408 2648 2638 2331 667 2935 2306 1545 3195
1.000 0.858 1.143 1.101 1.077 0.698 1.138 1.098 1.046 1.165
0.652 0.738 1.137 1.130 1.150 0.793 1.249 1.096 0.923 1.178
7.112 6.578 5.019 4.828 4.614 4.725 4.432 5.009 5.771 4.018
Jo
urn a
re-
lP
Entropy
pro of
Image 1
Original image GW ICM UCM DIRS IISR RAHIM RGHS ULAP NUCE
Average Gradient 4.130 4.395 6.739 6.357 6.681 6.604 6.928 6.023 5.725 8.429
7.600 6.987 7.886 7.762 7.565 5.431 7.533 7.788 7.591 7.939
46
UIQM
NIQE
1.380 1.400 1.506 1.486 1.565 1.631 1.550 1.505 1.568 1.498
4.645 4.624 4.549 4.086 4.419 4.202 4.424 4.528 4.117 4.480
Journal Pre-proof
Image
Method
Quantitative Analysis Sobel PCQI Count 2569 1.000 2270 0.922 7782 1.242 7798 1.228 6973 1.165 7672 1.141 9501 1.214 6266 1.188 6697 1.188 9576 1.291
7.512 7.289 7.842 7.830 7.728 7.372 7.653 7.851 7.755 7.864
Image 6
Original image GW ICM UCM DIRS IISR RAHIM RGHS ULAP NUCE
7.120 6.405 7.693 7.763 7.408 7.438 7.592 7.879 7.474 7.819
2.156 3.031 7.284 7.482 6.982 6.789 7.308 6.615 4.202 9.979
1860 3464 9072 9412 8031 7895 9157 8405 5510 11458
1.000 1.094 1.310 1.298 1.240 1.237 1.315 1.284 1.197 1.354
0.878 1.053 1.460 1.476 1.505 1.493 1.537 1.434 1.201 1.578
5.122 4.728 4.301 4.201 4.221 4.258 4.168 4.330 4.401 4.026
Image 7
Original image GW ICM UCM DIRS IISR RAHIM RGHS ULAP NUCE
7.674 7.033 7.925 7.863 7.659 6.796 7.707 7.821 7.775 7.949
1.411 1.733 3.225 3.202 3.142 3.821 2.773 2.728 1.982 3.956
513 615 2364 2545 2388 4203 1811 1919 1075 3186
1.000 0.940 1.159 1.155 1.106 1.033 1.128 1.084 1.074 1.196
0.710 0.793 1.339 1.331 1.368 1.308 1.258 1.289 0.901 1.424
5.999 5.279 4.780 4.711 4.996 4.943 4.623 5.386 5.515 4.470
Image 8
Original image GW ICM UCM DIRS IISR RAHIM RGHS ULAP NUCE
2.643 3.136 5.149 5.284 5.199 5.001 6.203 4.460 4.208 6.768
2492 3097 6386 6769 6558 6217 7646 5503 5004 8508
1.000 1.016 1.177 1.179 1.119 1.122 1.157 1.120 1.133 1.218
0.989 1.071 1.407 1.404 1.493 1.481 1.488 1.382 1.253 1.489
4.843 4.477 4.203 4.317 3.798 4.123 3.724 4.351 4.391 4.105
Jo
urn a
re-
lP
Entropy
pro of
Image 5
Original image GW ICM UCM DIRS IISR RAHIM RGHS ULAP NUCE
Average Gradient 2.857 2.975 5.718 5.501 5.316 5.843 6.708 4.770 5.035 6.848
7.680 7.030 7.828 7.847 7.625 7.643 7.681 7.837 7.759 7.842
47
UIQM
NIQE
1.105 1.119 1.544 1.508 1.527 1.500 1.608 1.473 1.485 1.583
4.140 4.052 4.250 4.132 4.150 3.993 4.274 4.329 4.011 4.216
Journal Pre-proof
Image
Method
7.691 6.958 7.715 7.733 7.555 7.323 7.673 7.712 7.842 7.901
Image 10
Original image GW ICM UCM DIRS IISR RAHIM RGHS ULAP NUCE
7.490 7.206 7.851 7.634 7.644 7.591 7.733 7.882 7.377 7.865
2.711 2.860 5.435 5.319 5.401 5.139 5.662 5.015 3.508 6.784
1694 1923 5165 5382 5292 4857 5673 4923 2898 7812
re-
Jo
urn a
lP
Entropy
Quantitative Analysis Sobel PCQI Count 3600 1.000 5627 1.074 9302 1.214 10077 1.208 9496 1.191 8383 1.137 9025 1.182 8543 1.181 5369 1.093 11549 1.236
pro of
Image 9
Original image GW ICM UCM DIRS IISR RAHIM RGHS ULAP NUCE
Average Gradient 2.931 3.972 6.569 7.205 6.640 6.193 6.178 5.817 3.841 8.582
48
1.000 0.999 1.232 1.205 1.174 1.167 1.192 1.222 1.080 1.295
UIQM
NIQE
1.099 1.212 1.503 1.505 1.572 1.523 1.544 1.456 1.262 1.549
3.247 2.813 2.943 2.944 3.016 3.052 3.024 3.242 3.127 3.474
1.025 1.079 1.497 1.476 1.478 1.450 1.519 1.478 1.193 1.598
6.068 5.870 5.828 5.643 5.856 5.756 5.967 5.945 6.227 6.059