Ocean Engineering 162 (2018) 224–238
Image contrast enhancement using an integration of recursive-overlapped contrast limited adaptive histogram specification and dual-image wavelet fusion for the high visibility of deep underwater image
Ahmad Shahrizan Abdul Ghani Innovative Manufacturing, Mechatronics, and Sports Engineering (IMAMS), Intelligent Robotics and Vehicles Laboratory (IRoV), Faculty of Manufacturing Engineering, 26600, Pekan, Pahang, Malaysia
Keywords: Contrast enhancement; Adaptive histogram specification; Recursive-overlapped; Dual-image wavelet fusion; Deep underwater image

Abstract
Deep underwater images suffer from several problems, such as low contrast and visibility, which reduce the rate at which valuable information can be extracted from the image. In this paper, we propose an approach that integrates three main steps, namely homomorphic filtering, recursive-overlapped contrast limited adaptive histogram specification (CLAHS) and dual-image wavelet fusion, to increase the visibility of deep underwater images and the amount of important data that can be extracted. Homomorphic filtering first homogenises the illumination of the entire image for the subsequent processes. Recursive-overlapped CLAHS refers to the recursive processing of overlapped tiles of each divided image channel, where half of each tile overlaps its adjacent tile and is therefore processed twice. In dual-image wavelet fusion, two images obtained from the integration of upper- and lower-stretched histograms are transformed with the discrete wavelet transformation before the main wavelet fusion process is implemented. Qualitative and quantitative results reveal that the proposed method outperforms current state-of-the-art methods; the highest average quantitative scores in terms of entropy, average gradient, measure of enhancement (EME) and EME by entropy are 7.835, 12.802, 8.343 and 27.616, respectively, for the proposed method.
1. Introduction

Most previous research works (Hitam et al., 2013; Dwivedi et al., 2015; Li et al., 2015) focused on shallow images, where the improvement of underwater image quality is concentrated on images captured near the water surface. Shallow images exhibit a considerably higher percentage of the red colour channel than deep underwater images. Deep underwater images consequently exhibit low visibility, as objects in the image are hardly seen. These phenomena decrease the amount of data that can be extracted and sometimes render the images useless. A deep underwater image normally suffers from a high concentration of blue-green illumination. This phenomenon occurs because the light spectrum travelling in the water medium is absorbed according to its wavelength (Hitam et al., 2013). The shortest wavelength, that is, blue, is absorbed last, causing deep underwater images to appear bluish. Some applications, such as unmanned underwater robots and ocean floor mapping, are required to capture and analyse deep underwater images. Nevertheless, these applications suffer from extremely low visibility, as the captured images are dark and affected by blue-green illumination.
E-mail address: [email protected]
https://doi.org/10.1016/j.oceaneng.2018.05.027
Received 21 May 2016; Received in revised form 23 April 2018; Accepted 14 May 2018
0029-8018/ © 2018 Elsevier Ltd. All rights reserved.
In the image processing area, no clear border exists to differentiate between shallow and deep underwater images, and no clear criterion is available for determining which is which. In this paper, deep and shallow underwater images are distinguished according to the percentages of the colour channels. In the red-green-blue (RGB) colour model, red is the first colour channel attenuated and lost during the propagation of light in the water medium. The loss in the percentage of the red colour channel increases the percentages of the blue and green colour channels. Deep underwater images refer to images with < 10% red colour percentage compared with that of the green and blue colour channels. Normally, a deep image contains a high percentage of blue-green illumination, as the image turns bluish or greenish. In a deep underwater image, the objects are hardly seen, and no clear border exists between objects and background. This paper explains in detail a proposed method that enhances the visibility of deep underwater images and increases overall image quality. It integrates the two main methods of recursive-overlapped contrast limited adaptive histogram specification (RO-CLAHS) and dual-image wavelet fusion (DIWF). RO-CLAHS involves tile histogram
processing, clip-limit process, Rayleigh distribution mapping and grey-level mapping. DIWF implements the discrete wavelet transform of the resultant images produced by RO-CLAHS. After wavelet fusion, the resultant image is applied with the inverse discrete wavelet transform to produce the final output image. The paper is organised as follows: Section 2 describes literature related to the proposed method. Section 3 explains the proposed method in detail. Results and discussion are elaborated in Section 4. The paper ends with the conclusion in Section 5.

2. Related works

Substantial research has been conducted on the enhancement of underwater images. Most studies focus on images captured near the water surface, where objects are visible. These images are normally clearer than images captured in deep water, which are nearly invisible because of the high concentration of blue-green illumination and limited light sources. Chen et al. (2017) proposed entropy-preserving mapping for contrast enhancement, which focuses on producing fine textures in an image to indicate the improvement of image contrast. Yelmanova and Romanyshyn (2017) proposed adaptive enhancement of monochrome images with low-contrast objects; as they focus on monochrome images, their methodology is not directly applicable to coloured images. For security purposes, Erat et al. (2017) addressed the colour cast problem in underwater images through an implementation in border security. Eliminating this problem is the first priority of their proposed steps before contrast enhancement is implemented; the method proceeds with contrast stretching and white balance processing to improve the overall quality of the captured image. Dwivedi et al. (2015) proposed an enhancement method based on distance factor estimation.

In this method, an underwater image is assumed to be nearly identical to a hazy image in free space, where the image can be modelled as a combination of direct attenuation and scattering parts. As a consequence of motion in the water medium, a noise term due to this motion is included in the underwater image equation. Rayleigh scattering, the dominant scattering effect, leads to noise that is non-additive in nature and can be minimised by statistics-based filtering techniques. Nevertheless, this technique requires numerous parameters to be determined before running the method. The output images suffer from the effect of blue-green illumination and produce over-enhanced areas where images become extremely bright. These areas reduce image detail, and the output image loses important information. Underwater image enhancement using inherent optical properties was proposed by Li et al. (2015) to enhance shallow ocean optical images and videos through fast dark channel prior descattering. The method begins with the estimation of the depth map through dark channels before considering the positions of the lighting lamp, camera and imaging plane. Through weighted guided median filtering, the method can remove the scattering effect. Finally, colour correction is applied through spectral properties. The method reduces the effect of blue-green illumination, but the visibility of objects is inadequately increased, and background objects are still hardly seen. Similar results were obtained with the adaptive dehazing framework proposed by Qing et al. (2015), which also inadequately increases the visibility of the background areas. In 2012, pixel distribution shifting colour correction (PDSCC) was proposed by Naim and Isa. The method corrects the white reference point of the image and ensures that the white reference point is achromatic through a shifting process applied to the pixel distribution of a colour image. Although the method can reduce the effect of blue-green illumination, the overall image contrast remains low. Hitam et al. (2013) proposed mixture contrast limited adaptive histogram equalization (CLAHE-Mix) to increase image visibility and reduce noise and artefacts. The output image becomes clear by integrating the CLAHE-RGB and CLAHE-hue-saturation-value (HSV) images using the Euclidean norm. However, the images become increasingly green in some cases and exhibit a high noise level with a low peak signal-to-noise ratio. Two famous methods normally used for comparing underwater images are the integrated colour model (ICM) and the unsupervised colour correction method (UCM), proposed by Iqbal et al. (2007, 2010). ICM can improve image contrast but produces over- and under-enhancement effects, as the output images contain dark and extremely bright areas. UCM sometimes produces a brownish image that deviates from the nature of the original image. Abu Hassan et al. (2017) enhanced under-exposed underwater images through enhanced homomorphic filtering and mean histogram matching for object tracking. The method successfully increased the contrast and enhanced dark areas in the image; consequently, the overall visibility of images increased and non-uniform illumination improved. The next section discusses the proposed method in detail.
3. Methodology: Integration of RO-CLAHS and DIWF

3.1. Motivation

Most underwater images are captured without constant illumination; images often have bright areas in the foreground and dark areas in the background. Specifically, the areas near the light source are brighter than the areas located far away from it. This nature of the image must be determined before further processing. Combining bright and dark areas into one tile produces an enhanced output distribution because a balanced improvement between both areas is considered in the enhancement process. To determine the nature of the captured image, we utilised entropy as a measure of image detail. High detail in an area indicates that the objects in it are well distinguished from the background, that is, they have high visibility, whereas low visibility of objects indicates that the area is low in detail. This is reflected in the entropy value: a high entropy value corresponds to a high-visibility (bright) area, whereas a low entropy value indicates a low-visibility (dark) area. Therefore, as the first stage of pre-processing, the entropy at the borders of the image is determined. Entropy measures the richness of image detail and represents the abundance of image information: the higher the value, the more information is present in the image (Wu et al., 2005). Entropy is given by

H(X) = −∑_{x=1}^{k} p(x) log2 p(x)    (1)
where p(x) is the probability distribution function (PDF) of the image. In some cases, the conventional CLAHE process produces improper pixel distribution, especially when an object lies on the border between tile regions. When two tiles containing objects are independently
Fig. 1. Flowchart of the proposed integrated RO-CLAHS and DIWF method.
processed, they may cause improper intensity distribution, as identical objects should exhibit identical colour and contrast. CLAHE also presents several disadvantages; for example, it reduces object colour and the low-frequency colour that is part of the image, so a water 'veil' remains in the resultant image. The next sections focus on solving these problems of CLAHE and present the proposed technique. Fig. 1 illustrates the flowchart of the proposed integrated RO-CLAHS and DIWF method. The method begins with the decomposition of the input image, which is normally in the RGB colour model, into its respective red, green and blue channels. The enhancement starts with enhanced homomorphic filtering through high-pass filtering. The image is subsequently divided into small tiles, in this case 8 × 8 tiles by default, as recommended by Eustice et al. (2002). RO-CLAHS is then applied, which involves three main steps as shown in Fig. 1: i) tile histogram generation and matching, ii) clip-limit process and iii) Rayleigh distribution and grey-level mapping. Before dividing the image into tiles, the nature of the image is determined according to the entropy value. This entropy-based image division compares the entropy values at the sides of the image and thus decides the flow of entropy across the image. These entropy-based tile division and RO-CLAHS steps are the main contributions and improvements of the proposed method. Additionally, entropy-based division RO-CLAHS consists of three sub-steps, in which the tile histograms of the divided image regions are processed recursively. In addition to the tile division, Rayleigh-based histogram distribution and grey-level mapping are implemented.

Bilinear interpolation is applied after the RO-CLAHS step to combine the image tiles into a single image and reduce the artificial effects at the tile borders that result from the combination. Subsequently, the image is applied with DIWF before the modification of image saturation and brightness, which is implemented in the HSV colour model. Finally, the image is converted back into the RGB colour model. The detail of the proposed method is explained in the next subsections.
3.2. Channel decomposition and enhanced homomorphic high-pass filter

First, an image in the RGB colour model is decomposed into its respective red, green and blue channels. The general steps of the enhanced homomorphic high-pass filter are shown in Fig. 2. The homomorphic filter is applied by converting the image into the natural log domain to separate the luminance L(x, y) and reflectance R(x, y) components of the image I(x, y).
I(x, y) = L(x, y) · R(x, y)    (2)

ln(I(x, y)) = ln(L(x, y)) + ln(R(x, y))    (3)
A high-pass filter, which requires the image to be transformed with the fast Fourier transformation (FFT), is then applied to each channel in the log domain, as the image is processed through frequency-domain filtering. In the proposed method, a high-pass Butterworth filter is implemented, as it shows improved quality performance by producing a maximally flat magnitude response in the pass band and demonstrates minimal phase shift over the pass band (Soni et al., 2015). The high-pass filter maintains the high frequencies in the image while discarding low-frequency signals. For an underwater image, the blue-green illumination of the water is regarded as a low-frequency signal that is not required for further analysis. Based on the illumination-reflectance model, an image f(x, y) is considered the product of the illumination i(x, y) and reflectance r(x, y) components, as shown in Equations (2) and (3) (Bazeille et al., 2006).

Fig. 2. Enhanced high-pass filter.
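The filtering chain described above (log transform, FFT, scaled Butterworth high-pass, inverse FFT, exponential) can be sketched as follows. This is a minimal sketch, not the paper's implementation: the cutoff `d0` and order `n` are illustrative assumptions, while `alpha = 0.5` and `beta = 1.5` follow the offset and scaling values quoted in the text.

```python
import numpy as np

def homomorphic_butterworth(channel, d0=10.0, n=2, alpha=0.5, beta=1.5):
    """Enhanced homomorphic high-pass filtering of one colour channel.

    Sketch of Section 3.2: log -> FFT -> scaled Butterworth high-pass
    (H_new = alpha + beta * H) -> inverse FFT -> exp. The cutoff d0 and
    order n are illustrative values, not taken from the paper.
    """
    img = channel.astype(np.float64) + 1.0            # avoid log(0)
    log_img = np.log(img)
    F = np.fft.fftshift(np.fft.fft2(log_img))         # centre the spectrum

    rows, cols = channel.shape
    u = np.arange(rows) - rows / 2
    v = np.arange(cols) - cols / 2
    V, U = np.meshgrid(v, u)
    D2 = U**2 + V**2 + 1e-12                          # squared distance from centre
    H = 1.0 / (1.0 + (d0**2 / D2) ** n)               # Butterworth high-pass
    H_new = alpha + beta * H                          # offset and scaling

    filtered = np.real(np.fft.ifft2(np.fft.ifftshift(F * H_new)))
    out = np.exp(filtered) - 1.0                      # back from log domain
    return np.clip(out, 0, 255)
```

Because `alpha < 1` and `beta > 1`, high frequencies are amplified relative to the low-frequency blue-green background, in line with the enhancement argument made later for Equation (6).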
Table 1
Decision of histogram specification processing method as determined by the nature of the image.

Lower entropy region    Higher entropy region
                        Top    Bottom    Left    Right
Top                     -      C         R       R
Bottom                  C      -         R       R
Left                    C      C         -       R
Right                   C      C         R       -

C = column-wise; R = row-wise.
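The entropy computation of Equation (1) and the border-entropy comparison behind Table 1 can be sketched as follows. This is an illustrative sketch, not the paper's code: the tile geometry and the tie-breaking rule when both entropy gradients are equal are assumptions.

```python
import numpy as np

def shannon_entropy(tile, bins=256):
    """Shannon entropy H(X) = -sum p(x) log2 p(x) of one grey-level tile."""
    hist, _ = np.histogram(tile, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]                                   # 0 log 0 is taken as 0
    return -np.sum(p * np.log2(p))

def division_direction(channel, t=8):
    """Decide row-wise or column-wise tile division from border entropies.

    The mean entropy of the t border tiles on each side is compared with
    the opposite side. Following Table 1, a top/bottom imbalance suggests
    column-wise division and a left/right imbalance suggests row-wise
    division; picking the larger gradient is an assumption of this sketch.
    """
    h, w = channel.shape
    th, tw = h // t, w // t
    top = np.mean([shannon_entropy(channel[:th, i*tw:(i+1)*tw]) for i in range(t)])
    bottom = np.mean([shannon_entropy(channel[-th:, i*tw:(i+1)*tw]) for i in range(t)])
    left = np.mean([shannon_entropy(channel[i*th:(i+1)*th, :tw]) for i in range(t)])
    right = np.mean([shannon_entropy(channel[i*th:(i+1)*th, -tw:]) for i in range(t)])
    if abs(top - bottom) >= abs(left - right):
        return 'column-wise'
    return 'row-wise'
```

For example, an image whose left half is uniformly dark and whose right half is detailed yields a strong left/right entropy gradient, so the sketch returns 'row-wise': each horizontal tile then spans both the dark and the bright areas, as the motivation in Section 3.1 requires.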
Considering this model, we assumed that the illumination factor changes slowly through the field of view and thus represents low frequencies in the Fourier transform of the image. On the other hand, the reflectance component is associated with high-frequency components. By multiplying these components with a high-pass filter, we can suppress low-frequency signals and thus eliminate the non-uniform blue-green illumination. By removing the blue-green illumination, the visibility of objects in the image is improved and object detail is increased. The high-pass filter also increases the homogeneity of illumination over the entire image. The Butterworth high-pass filter has been proposed by Zhang and Zhao (2011):

H(u, v) = 1 / (1 + [D0^2 / (u^2 + v^2)]^n)    (4)

The Butterworth high-pass filter maintains frequencies outside the radius D0 and discards values inside the distance D0. It exhibits a gradual transition from 0 to 1 to reduce ringing artefacts. Equation (5) shows that the high-pass filter provides an enhancement method by adding an offset value β and a scaling parameter α as follows:

H_new(u, v) = α + βH(u, v)    (5)

H_new(u, v) = α + β / (1 + [D0^2 / (u^2 + v^2)]^n)    (6)

This equation shows that the high-frequency signal is amplified to levels higher than the low-frequency signal if α < 1 and β > 1. As described by Eddins (2013), the best values for α and β are 0.5 and 1.5, respectively. These values are implemented in the proposed method and improve image contrast and sharpness. In most cases, these tested offset and scaling values produce improved results for underwater images. Inverse FFT and inverse log are applied to the image after the Butterworth high-pass filter.

3.3. Image tile division

Although the implementation of homomorphic filtering reduces the effect of non-uniform illumination, the bright and dark areas in the image are still visible. The next enhancement process is performed according to this nature of the image. Thus, dividing the image, that is, column-wise or row-wise, according to the nature of the input image is important. As shown in Fig. 3, the image borders are divided into t tiles with 8 columns and 8 rows. The entropy of each tile is calculated. The mean entropy values for all four sides of the image are then calculated and compared with those of the opposite side: the right side is compared with the left side, and the top side with the bottom side. Equation (7) is used to determine the average entropy Hmean of an image side, where HX refers to the entropy value of each tile at the image border, 1 ≤ X ≤ 8, and t is the total number of tiles, in this case 8.

Fig. 3. Division of tiles at the border of the image for entropy calculation.

Mean entropy, Hmean = (1/t) ∑_{X=1}^{t} HX = (H1 + H2 + H3 + … + H8) / t    (7)

After determining the nature of the input image, the image is divided accordingly. The entropy values at the border are compared, and Table 1 summarises the resulting column-wise or row-wise division of the image for the next process. After the division method is determined, the image is further divided into tiles. The default number of rows or columns is 8, which is the best number of divisions for the tested images (Eustice et al., 2002), because the recursive process then takes 2n − 1 passes to process the entire image; that is, an image is recursively processed 15 times. The number of recursive processes r is determined using Equation (8), where n is the number of columns or rows, with a default of 8.

r = 2n − 1    (8)

An image will only be divided into rows or columns. If the same number of tiles were applied in conventional CLAHE, 64 tiles (8 × 8) would be produced, indicating that histogram equalization would be applied 64 times to an image. Fig. 4 shows examples of row-wise and column-wise divisions of an image. The next step explains the recursive-overlapped CLAHS process for each tile after the tile division step.

3.4. RO-CLAHS

3.4.1. Tile histogram processing

After determining whether the division method is row- or column-wise, the image is divided into tiles. The example in Fig. 5 shows an image divided and processed into row-wise tiles. Starting with the first processed tile, which exhibits a purple border, a histogram with 256 bins is created. The second processed tile, with an orange border, is then extracted from the image. This second processed tile is
Fig. 4. Example of division of image into tiles; (a) row-wise and (b) column-wise.
extracted from half of the first tile and half of the second tile; thus, half of the first tile is processed twice. Half of the second tile (orange border) is then combined with the third tile (blue border). This process is applied recursively to the whole image. The numbering at the left side of the image in Fig. 5 and the coloured borders show the order of the recursive tile processing. For each extracted tile, a histogram is generated and applied with the clip limit, which is explained in the next subsection.

Fig. 5. Row-wise division of images and recursive order of row-wise local histogram specification process.

3.4.2. Applying clip limit

The clip limit is applied to the histogram to restrict the maximum number of pixels for a certain grey-level value. The excess pixels are distributed uniformly over all grey levels, thereby increasing the number of pixels at every grey level. Fig. 6 illustrates the process of cutting off the excess pixels at the clip-limit value and distributing them to all grey-level values. The spikes of the histogram at certain grey levels are cut, and the excess pixels above the clip limit are distributed uniformly to all grey-level values. Because of this distribution, some grey-level regions may again exceed the clip limit; the redistribution is considered complete when the remaining excess is small enough to be ignored. The clip limit improves image contrast by hindering the concentration of pixels at a certain grey level. For the proposed method, the clip limit is set to 0.03 (normalised to 1), based on testing optimal clip-limit values on a number of underwater images. The clip-limit process visibly reduces the problems of over- and under-enhanced areas in an image.
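The clip-limit redistribution can be sketched as an iterative cut-and-spread loop over a normalised histogram. This is a minimal sketch; the iteration cap and the tolerance for "small enough to be ignored" are assumptions, not values from the paper.

```python
import numpy as np

def clip_histogram(hist, clip_limit=0.03, max_iter=100):
    """Clip a normalised tile histogram and redistribute the excess.

    hist: bin counts normalised to sum to 1. Mass above clip_limit is cut
    off and spread uniformly over all bins; because this can push other
    bins over the limit, the loop repeats until the remaining excess is
    negligible. Total mass is conserved at every step.
    """
    hist = hist.astype(np.float64).copy()
    for _ in range(max_iter):
        excess = np.clip(hist - clip_limit, 0, None).sum()
        if excess < 1e-6:                      # excess small enough to ignore
            break
        hist = np.minimum(hist, clip_limit)    # cut the spikes at the clip limit
        hist += excess / hist.size             # uniform redistribution
    return hist
```

For a histogram with a single spike (all mass in one bin), the first pass caps the spike at 0.03 and lifts every other bin; subsequent passes shave the small overshoot created by the lift.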
3.4.3. Rayleigh distribution and grey-level mapping

In addition to the clip-limit process, the histogram of the extracted tile is matched with the bell-shaped Rayleigh distribution, which is the most suitable pixel distribution for underwater images (Eustice et al., 2002). The Rayleigh distribution concentrates most of the pixels in the middle of the dynamic grey-level range, while both ends of the intensity range, that is, the upper and lower dynamic ranges, contain fewer pixels. This type of distribution reduces the concentration of image pixels at the lower and
Fig. 6. Distribution of exceeding pixels to all grey levels by CLAHE.
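The Rayleigh-based grey-level mapping of Section 3.4.3 can be sketched as histogram specification against the Rayleigh CDF, CDF(t) = 1 − exp(−t² / (2α²)), with α = 0.4 as used in the paper. This is an illustrative sketch: the normalisation of intensities to [0, 1] and the final rescaling of the mapped levels are assumptions.

```python
import numpy as np

def rayleigh_mapping(tile, alpha=0.4):
    """Map a tile's grey levels towards a Rayleigh distribution.

    Sketch of histogram specification: the tile's empirical CDF is matched
    to the Rayleigh CDF by inverting it, t = alpha * sqrt(-2 ln(1 - CDF)),
    then the target levels are rescaled to the 8-bit range.
    """
    hist, _ = np.histogram(tile, bins=256, range=(0, 256))
    cdf = np.cumsum(hist) / tile.size                     # empirical CDF per grey level
    cdf = np.clip(cdf, 0, 1 - 1e-9)                       # keep the inverse finite
    mapped = alpha * np.sqrt(-2.0 * np.log(1.0 - cdf))    # inverse Rayleigh CDF
    mapped = mapped / mapped.max()                        # rescale to [0, 1]
    lut = np.round(mapped * 255).astype(np.uint8)         # grey-level mapping table
    return lut[tile.astype(np.uint8)]
```

Because the lookup table is built from a cumulative sum, the mapping is monotone: pixel ordering is preserved while the resulting histogram is pushed towards the bell-shaped Rayleigh profile.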
upper regions, which contributes to the production of dark and extremely bright areas. The PDF and cumulative distribution function (CDF) of the Rayleigh distribution are expressed in the following equations (Bibalan and Amindavar, 2015):

PDF(t) = (t / α^2) e^(−t^2 / (2α^2))    (9)

CDF(t) = 1 − e^(−t^2 / (2α^2))    (10)

where t indicates the input pixels, and α refers to the Rayleigh distribution parameter, which is set to 0.4. This value is based on the finding of Eustice et al. (2002) that a distribution parameter of 0.4 gives the best result for almost all underwater images; the underwater images tested in the present study agree with this finding. The value α = 0.4 is therefore applied for all types of images regardless of the initial level of image contrast. As shown in Fig. 1, these three main steps are recursively applied to all tiles in an image. These steps clearly improve the contrast and visibility of the underwater image. Although the resultant image may not look natural to some human eyes, it shows significant improvement in terms of image visibility.

3.5. Bilinear interpolation

The combined tiles of the output image produce artificial borders between the tiles. To remove the artificially induced tile borders, we applied a mapping, namely bilinear interpolation (Hurtik and Madrid, 2015), to interpolate between the tile mappings. Through bilinear interpolation, the intensity of an unknown point is obtained from distance-weighted calculations, as illustrated in Fig. 7. Consider an unknown intensity value at point (x, y) and known intensity values at points (x1, y1), (x2, y1), (x1, y2) and (x2, y2). The intensity value at (x, y) is calculated by means of the distances between (x, y) and the four known points. The calculation begins with the intensity values at the points (x, y1) and (x, y2). The intensity values I(x, y1) and I(x, y2) are respectively expressed by Equations (11) and (12), where the values 180, 240, 50 and 40 represent the intensity values of the pixels at the respective known points.

I(x, y1) = ((x2 − x) / (x2 − x1)) · 180 + ((x − x1) / (x2 − x1)) · 240 = a    (11)

I(x, y2) = ((x2 − x) / (x2 − x1)) · 50 + ((x − x1) / (x2 − x1)) · 40 = b    (12)

The intensity value at the point (x, y) is then determined by Equation (13).

I(x, y) = ((y − y2) / (y1 − y2)) · a + ((y1 − y) / (y1 − y2)) · b    (13)

Fig. 7. Illustration of the implementation of bilinear interpolation.

The output image is then processed with DIWF, which is explained in the next subsection.

3.6. DIWF

As reported by Abdul Ghani and Mat Isa (2014), the integration of dual images, produced as over- and under-enhanced versions of the input, results in significant improvement in image colour and contrast. The histogram of each image channel is divided at its midpoint to generate upper and lower regions. Each region is individually stretched to produce two independent image histograms, namely the lower- and upper-stretched histograms. One output image is produced by combining all upper-stretched histograms and another by combining all lower-stretched histograms; these combinations are under- and over-enhanced, respectively. Both output images are then applied with the discrete wavelet transformation (DWT), and the fusion process averages the intensity values of these two extreme output images. The resultant image is then applied with the inverse DWT (IDWT) to produce an enhanced-contrast image with high visibility. Fig. 8 illustrates the proposed method of integrated RO-CLAHS and DIWF. The results show that the combination of over- and under-enhanced images produces an enhanced resultant image in which detail is adequately improved. The dark areas and the high concentration of blue-green illumination are significantly reduced, producing a pleasant view from the perspective of the human visual system.

3.7. Saturation and brightness modification

After the wavelet fusion step, the image is further processed to improve saturation and brightness. To do so, we converted the image from the RGB colour model into the HSV colour model. As reported by Hitam et al. (2013), enhancing an underwater image in the HSV colour model can increase image contrast and reduce noise. In the HSV colour model, the S and V components are stretched to their entire dynamic ranges. The stretching process increases image saturation and brightness as the full dynamic range of the image becomes occupied.
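The DWT, fusion and IDWT steps of Section 3.6 can be sketched with a hand-rolled single-level Haar transform. This is a minimal sketch under assumptions: the paper does not specify the wavelet family (Haar is chosen here for simplicity), and the fusion rule is plain subband averaging.

```python
import numpy as np

def haar_dwt2(img):
    """Single-level 2D Haar DWT: approximation and detail subbands."""
    a = (img[0::2, :] + img[1::2, :]) / 2.0          # row averages
    d = (img[0::2, :] - img[1::2, :]) / 2.0          # row differences
    ll, lh = (a[:, 0::2] + a[:, 1::2]) / 2.0, (a[:, 0::2] - a[:, 1::2]) / 2.0
    hl, hh = (d[:, 0::2] + d[:, 1::2]) / 2.0, (d[:, 0::2] - d[:, 1::2]) / 2.0
    return ll, lh, hl, hh

def haar_idwt2(ll, lh, hl, hh):
    """Exact inverse of haar_dwt2."""
    rows, cols = ll.shape
    a = np.empty((rows, 2 * cols)); d = np.empty((rows, 2 * cols))
    a[:, 0::2], a[:, 1::2] = ll + lh, ll - lh
    d[:, 0::2], d[:, 1::2] = hl + hh, hl - hh
    out = np.empty((2 * rows, 2 * cols))
    out[0::2, :], out[1::2, :] = a + d, a - d
    return out

def wavelet_fuse(under, over):
    """Fuse under- and over-enhanced images by averaging Haar subbands."""
    bands_u, bands_o = haar_dwt2(under), haar_dwt2(over)
    fused = [(bu + bo) / 2.0 for bu, bo in zip(bands_u, bands_o)]
    return haar_idwt2(*fused)
```

One design note: because both the Haar transform and the averaging rule are linear, this particular fusion reduces exactly to pixel-wise averaging; the wavelet domain becomes essential only with nonlinear rules, such as keeping the larger-magnitude detail coefficient per subband.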
4. Results and discussion

Three hundred underwater images were captured at three different islands in Malaysia, namely Perhentian, Tioman and Payar Islands. These underwater images suffered from low contrast and were highly affected by blue-green illumination. These test images can be found in Google
Fig. 8. Illustration of the integrated RO-CLAHS method and detailed steps of DIWF.
drive through the following link (https://drive.google.com/drive/folders/1cPSDsoyrMV1nRW7umB0Ds-lTioj1QM1n?usp=sharing). Most of the objects in the images were hardly differentiated from the background and thus possessed an extremely low visibility level. We used these underwater images to evaluate the proposed method and compare it with other state-of-the-art methods, namely CLAHE-Mix (Hitam et al., 2013), PDSCC (Naim and Isa, 2012), CLAHE, UCM (Iqbal et al., 2010), ICM (Iqbal et al., 2007) and histogram equalization (HE). These methods fall within the same field of image enhancement. To evaluate the effectiveness of the proposed method, we assessed the output images of the compared methods in terms of contrast and colour, visibility, the effect of blue-green illumination and the production of under- and over-enhanced areas. The compared methods all belong to the same enhancement group, referred to as the histogram-based image enhancement group; they improve image contrast and colour through histogram modification. CLAHE-Mix (Hitam et al., 2013) and PDSCC (Naim and Isa, 2012) are the most recent of these methods, developed 5-6 years before the present study, whereas CLAHE, UCM (Iqbal et al., 2010), ICM (Iqbal et al., 2007) and HE are well-known methods normally used for comparing enhanced images. For evaluation, we employed an additional method, the dual-image Rayleigh-stretched CLAHS (DIRS-CLAHS) (Abdul Ghani and Mat Isa, 2015). DIRS-CLAHS consists of two main stages that contribute toward the improvement of underwater image contrast, that is, contrast correction and colour correction; it combines global and local corrections of image intensity and produces an enhanced-contrast output image. Apart from qualitative measurement, the proposed method was evaluated in terms of entropy (Wu et al., 2005; Padmavathi et al., 2010), which describes image information content, and average gradient (Wu et al., 2005), which reflects the fine contrast, texture characteristics and clarity of the image. The mean structural similarity (MSSIM) (Wang et al., 2004), generally used as a quality metric based on structural information, is another quantitative metric for image quality evaluation. MSSIM is not entirely appropriate here because it compares the output image with the original image, which suffers from the problems mentioned in the Introduction; however, it is included for evaluation in line with previous research (Jaya and Gopikakumari, 2013; Mittal et al., 2013; Wang et al., 2004). In addition to these quantitative measurements, we included the measure of enhancement (EME) (Agaian et al., 2000, 2001, 2007) and EME by entropy (EMEE) (Agaian et al., 2007) in the evaluation to test the enhancement of the output images. We added another quantitative evaluation parameter, namely contrast, to strengthen the evaluation of image contrast. In this comparison, 4 of the 300 underwater images were selected as samples. Figs. 9-12 show the samples of underwater images and the resultant images produced by the state-of-the-art methods with their respective 3D RGB colour models. Subjective results were added for evaluation purposes. These additional images are included in Appendix 1, and the quantitative evaluation of these images is featured in Appendix 2. The samples and the outputs of the 300 underwater images can be obtained through Google drive from the same link.

Fig. 9. Image of red-fish. (a) Original image and the rest are images produced from (b) PDSCC, (c) CLAHE, (d) UCM, (e) ICM, (f) HE, (g) CLAHE-Mix, (h) DIRS-CLAHS and (i) the proposed integrated RO-CLAHS and DIWF. (For interpretation of the references to colour in this figure legend, the reader is referred to the Web version of this article.)

Table 2 shows the quantitative data for the images in Figs. 9-12 in terms of entropy, average gradient, MSSIM, EMEE and EME. In Table 3,
Ocean Engineering 162 (2018) 224–238
A.S. Abdul Ghani
Fig. 10. Image of stone 1. (a) Original image and the rest are images produced from (b) PDSCC, (c) CLAHE, (d) UCM, (e) ICM, (f) HE, (g) CLAHE-Mix, (h) DIRS-CLAHS and (i) the proposed integrated RO-CLAHS and DIWF method.
compared with the original image. Low values of EME and EMEE compared to the original image indicate that the resultant image is not positively enhanced. CLAHE improves the image insignificantly as the blue-green illumination is retained in the output image. The background areas are hardly seen. UCM produces brownish image, whereas ICM improves only the foreground areas of the image and darkens the background areas. HE oversaturates the image as the foreground areas become too bright. This phenomenon results in reducing the image detail. CLAHE-
the average value of quantitative dataset for 300 underwater images is shown. The image red-fish in Fig. 9 shows that PDSCC inadequately improves the overall image quality as the blue-green illumination retains in the image and the background objects are hardly seen. The output image is almost identical to the original image. This qualitative observation is consistent with the quantitative result as the output image of PDSCC exhibits the highest value of MSSIM (0.978), which indicates high similarity of the resultant image 232
Ocean Engineering 162 (2018) 224–238
A.S. Abdul Ghani
Fig. 11. Image of stone 2. (a) Original image and the rest are images produced from (b) PDSCC, (c) CLAHE, (d) UCM, (e) ICM, (f) HE, (g) CLAHE-Mix, (h) DIRS-CLAHS and (i) the proposed integrated RO-CLAHS and DIWF method.
(27.807), which shows that the proposed method outperforms the other state-of-the-art methods. Generally, the proposed method produces the best result in terms of image visibility for all sample images. Quantitative evaluation supports the visual observation of the resultant images, which indicates that the output image of the proposed method produces the highest quality image. The identical effects of the output images could be observed in the other samples, such as Stone 1, Stone 2 and pillar. Quantitative
mix, however, enhances only a small amount of image contrast as the output image is almost identical to the original image. The proposed integrated RO-CLAHS and DIWF method significantly enhances the image as the objects in the background can be seen clearly and the blue-green illumination is adequately reduced. The overall image contrast and colour are highly improved. Quantitative evaluation indicates that the proposed method produces the highest value of entropy (7.892), average gradient (12.399), EMEE (8.867) and EME 233
Ocean Engineering 162 (2018) 224–238
A.S. Abdul Ghani
Fig. 12. Image of pillar. (a) Original image and the rest are images produced from (b) PDSCC, (c) CLAHE, (d) UCM, (e) ICM, (f) HE, (g) CLAHE-Mix, (h) DIRS-CLAHS and (i) the proposed integrated RO-CLAHS and DIWF method.
successfully implemented for underwater image. As shown in the results, the proposed method outperforms other state-of-the-art methods especially in terms of contrast, detail and visibility. Qualitative and quantitative results show that the proposed method produces output images with excellent quality. The image detail, contrast, clarity and level of enhancement are adequately improved. These qualitative evaluations are highly consistent with the quantitative measurements as the proposed method produces the highest average value for four out of five of these parameters.
evaluation using contrast does not agree with the visual evaluation. As shown in Tables 2 and 3, the output images from PDSCC, ICM and HE obtained the highest values of contrast, which shows that the output image contrast from these methods is the best. However, these values do not represent the actual contrast of the output images. 5. Conclusion The proposed integrated RO-CLAHS and DIWF method is 234
Table 2
Quantitative evaluation of the resultant images for the respective state-of-the-art methods.

Image     Method           Entropy  Average gradient  MSSIM  EMEE    EME     Contrast
Red-fish  Original         7.255    2.371             –      0.298   5.401   42.228
          PDSCC            7.010    2.439             0.978  0.241   4.600   43.272
          CLAHE            7.552    6.679             0.722  1.539   14.896  41.232
          UCM              7.687    5.025             0.701  1.164   10.384  39.727
          ICM              7.739    5.094             0.682  2.720   17.732  39.649
          HE               5.909    6.577             0.673  3.076   16.448  42.042
          CLAHE-Mix        6.922    3.607             0.872  0.652   9.563   39.215
          DIRS-CLAHS       7.609    5.139             0.936  2.514   16.156  39.494
          RO-CLAHS + DIWF  7.892    12.399            0.415  8.867   27.807  41.891
Stone 1   Original         6.368    1.004             –      0.159   3.416   39.521
          PDSCC            5.884    1.065             0.988  0.127   2.792   40.673
          CLAHE            6.734    3.470             0.803  0.588   9.554   40.027
          UCM              7.744    6.170             0.425  1.086   9.319   42.768
          ICM              7.715    7.460             0.449  1.407   11.837  42.824
          HE               5.634    8.161             0.381  1.986   16.879  42.047
          CLAHE-Mix        7.394    3.647             0.797  1.005   12.093  41.063
          DIRS-CLAHS       7.435    5.638             0.946  0.950   12.494  39.751
          RO-CLAHS + DIWF  7.742    12.129            0.282  3.879   22.977  41.706
Stone 2   Original         6.935    2.760             –      0.333   5.884   41.615
          PDSCC            6.790    2.930             0.994  0.310   5.615   42.280
          CLAHE            7.472    7.776             0.717  1.524   15.689  40.885
          UCM              7.536    6.665             0.669  2.279   13.637  38.874
          ICM              7.597    7.120             0.672  4.154   21.768  38.867
          HE               5.932    9.533             0.591  4.729   20.410  42.031
          CLAHE-Mix        6.844    4.333             0.831  1.153   13.822  37.617
          DIRS-CLAHS       7.608    6.648             0.958  3.899   21.500  39.162
          RO-CLAHS + DIWF  7.913    14.565            0.412  10.321  30.682  41.939
Pillar    Original         7.278    1.438             –      0.190   3.782   41.041
          PDSCC            7.113    1.546             0.990  0.177   3.572   41.911
          CLAHE            7.260    4.160             0.803  0.721   10.325  40.527
          UCM              7.634    2.618             0.741  0.958   9.062   38.703
          ICM              7.662    2.858             0.739  2.174   15.497  38.585
          HE               5.965    3.512             0.765  2.312   15.161  42.052
          CLAHE-Mix        6.802    2.421             0.864  0.727   9.956   37.192
          DIRS-CLAHS       7.373    3.464             0.914  3.159   14.875  37.796
          RO-CLAHS + DIWF  7.654    8.760             0.462  5.626   24.583  41.677
Note: The values in bold typeface represent the best result obtained in the comparison.
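The entropy, average gradient, EME and (M)SSIM figures reported in Tables 2 and 3 can be reproduced in outline with the following sketch. This is an illustrative pure-NumPy implementation, not the paper's code: the block counts `k1`, `k2`, the `eps` guard and the gradient normalisation are assumptions, since the excerpt does not specify them.

```python
import numpy as np

def entropy(img):
    """Shannon entropy (bits/pixel) of an 8-bit grayscale image."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    p = p[p > 0]                              # ignore empty bins
    return float(-np.sum(p * np.log2(p)))

def average_gradient(img):
    """Mean magnitude of horizontal/vertical finite differences."""
    f = img.astype(float)
    gx = np.diff(f, axis=1)[:-1, :]           # trim both to a common shape
    gy = np.diff(f, axis=0)[:, :-1]
    return float(np.mean(np.sqrt((gx ** 2 + gy ** 2) / 2.0)))

def eme(img, k1=4, k2=4, eps=1e-4):
    """Measure of enhancement (Agaian et al.): average over k1 x k2 blocks
    of 20 * log(I_max / I_min); eps guards zero intensities."""
    f = img.astype(float)
    vals = []
    for rows in np.array_split(np.arange(f.shape[0]), k1):
        for cols in np.array_split(np.arange(f.shape[1]), k2):
            block = f[np.ix_(rows, cols)]
            vals.append(20.0 * np.log(max(block.max(), eps) /
                                      max(block.min(), eps)))
    return float(np.mean(vals))

def ssim_global(x, y, L=255.0):
    """Single-window SSIM (Wang et al., 2004). MSSIM, as reported in the
    tables, averages this statistic over sliding local windows instead."""
    x, y = x.astype(float), y.astype(float)
    c1, c2 = (0.01 * L) ** 2, (0.03 * L) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cxy = float(((x - mx) * (y - my)).mean())
    return ((2 * mx * my + c1) * (2 * cxy + c2)) / \
           ((mx * mx + my * my + c1) * (vx + vy + c2))
```

For a horizontal ramp image covering all 256 grey levels, `entropy` returns exactly 8 bits/pixel, and `ssim_global` of any image with itself is 1, which makes these functions easy to sanity-check before applying them to real underwater images.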
Table 3
Average quantitative values of 300 underwater images in comparison with state-of-the-art methods.

Method           Entropy  Average gradient  MSSIM  EMEE   EME     Contrast
Original         7.101    3.563             –      0.534  7.400   40.274
PDSCC            6.995    4.105             0.914  0.410  6.771   41.588
CLAHE            7.614    9.298             0.686  2.826  19.538  40.349
UCM              7.583    8.190             0.617  2.141  14.147  40.118
ICM              7.588    7.588             0.667  3.318  18.927  39.878
HE               5.878    9.744             0.579  4.064  20.018  41.988
CLAHE-Mix        7.125    4.960             0.884  1.116  13.168  40.595
DIRS-CLAHS       7.577    7.115             0.932  8.263  24.303  39.254
RO-CLAHS + DIWF  7.835    12.802            0.407  8.343  27.616  41.633

Note: The values in bold typeface represent the best result obtained in the comparison.

Acknowledgement

The author would like to thank the anonymous reviewers for their valuable comments and suggestions that have significantly improved this paper. This work is supported by the Universiti Malaysia Pahang internal research grant, RDU170392, entitled "Dual Image Fusion Technique for Enhancement of Underwater Image Contrast", with the support of the Innovative Manufacturing, Mechatronics, and Sports Engineering (IMAMS) Lab and the Intelligent Robotics and Vehicles Laboratory (IRoV) Cluster, Faculty of Manufacturing Engineering, Universiti Malaysia Pahang.
APPENDIX 1
APPENDIX 2
Image
Evaluation
Original image PDSCC
CLAHE UCM
ICM
HE
CLAHE-Mix DIRS-CLAHS RO-CLAHS + DIWF
Image 1
Entropy Average gradient MSSIM EMEE EME Contrast Entropy Average gradient MSSIM EMEE EME Contrast Entropy Average gradient MSSIM EMEE EME Contrast Entropy Average gradient MSSIM EMEE EME Contrast
7.175 4.122 – 4.995 11.112 37.438 7.423 3.767 – 6.212 20.120 38.798 7.356 6.122 – 7.986 17.617 37.773 6.866 3.049 – 2.307 7.130 38.766
7.215 8.528 0.786 4.851 31.149 39.421 7.479 7.751 0.811 3.794 26.141 40.198 7.394 11.025 0.823 5.194 31.060 39.675 6.856 6.585 0.815 3.590 25.666 38.996
7.462 9.360 0.726 4.529 10.653 38.769 7.676 8.295 0.760 6.405 19.881 39.616 7.499 11.181 0.807 8.796 15.117 38.121 7.091 7.822 0.706 1.886 6.462 40.515
5.836 11.979 0.564 3.754 27.755 42.139 5.939 10.331 0.629 6.475 30.556 42.053 5.861 13.525 0.674 3.320 24.805 42.124 5.638 8.962 0.620 0.908 12.486 42.474
7.460 5.181 0.937 5.045 29.561 39.429 7.419 4.853 0.953 1.665 17.280 40.231 7.458 6.651 0.935 5.580 28.386 39.595 7.252 4.117 0.962 4.526 26.482 39.468
Image 2
Image 3
Image 4
7.176 5.588 0.903 0.485 4.685 38.728 7.277 4.809 0.932 0.789 9.109 40.254 7.350 7.246 0.952 1.805 11.306 38.475 6.769 6.539 0.698 0.241 3.072 39.901
7.439 11.243 0.588 2.273 7.843 39.556 7.602 8.809 0.668 4.910 16.924 40.357 7.401 12.551 0.691 6.628 11.994 38.513 6.977 10.146 0.575 0.378 2.343 40.650
7.667 9.250 0.983 11.040 38.443 38.572 7.746 8.339 0.982 11.907 33.142 39.466 7.670 11.210 0.984 10.226 36.000 38.651 7.538 7.704 0.941 7.511 31.472 37.968
7.918 13.469 0.540 11.781 42.998 40.928 7.971 12.553 0.554 12.238 37.901 41.574 7.952 16.780 0.573 16.733 44.425 41.229 7.783 12.721 0.491 6.714 32.089 40.273
Image 5   Entropy, Average gradient, MSSIM, EMEE, EME, Contrast
Image 6   Entropy, Average gradient, MSSIM, EMEE, EME, Contrast
Image 7   Entropy, Average gradient, MSSIM, EMEE, EME, Contrast
Image 8   Entropy, Average gradient, MSSIM, EMEE, EME, Contrast
Image 9   Entropy, Average gradient, MSSIM, EMEE, EME, Contrast
Image 10  Entropy, Average gradient, MSSIM, EMEE, EME, Contrast
6.853 1.885 – 2.894 8.475 39.736 7.074 3.074 – 5.293 27.882 41.146 6.083 0.845 – 1.580 16.417 40.179 7.151 1.638 – 1.883 13.800 40.339 6.356 0.998 – 1.590 16.255 40.458 6.023 1.256 – 4.967 29.013 40.795
7.080 5.518 0.696 11.292 40.860 41.330 6.785 3.560 0.948 0.725 10.797 43.029 6.078 1.033 0.952 0.168 3.332 42.760 7.053 2.333 0.933 0.291 4.983 41.488 6.425 1.185 0.962 0.238 4.240 42.600 1.623 6.097 0.931 0.584 6.126 43.366
6.863 4.402 0.849 2.151 19.776 39.742 7.259 6.808 0.773 2.769 22.398 40.826 6.714 2.085 0.916 0.874 11.955 40.185 7.249 3.561 0.854 1.391 15.583 39.833 6.883 2.397 0.901 0.959 12.512 40.404 6.940 3.080 0.822 1.679 18.075 40.268
7.199 12.075 0.427 2.387 5.259 41.625 7.663 10.769 0.521 7.915 26.895 41.288 7.635 5.800 0.461 2.042 14.665 42.306 7.295 4.567 0.606 2.370 12.384 38.091 7.618 5.585 0.498 2.928 17.302 40.801 7.386 8.524 0.301 9.941 34.020 0.355
6.691 7.163 0.594 4.517 8.611 41.824 7.772 11.157 0.540 8.988 29.074 40.690 7.597 5.222 0.572 3.014 19.032 41.584 7.394 4.120 0.614 2.272 15.166 36.676 7.730 4.758 0.627 4.054 21.079 40.112 7.508 8.102 0.355 8.572 32.202 39.017
5.674 6.594 0.648 1.169 14.343 42.307 5.950 12.910 0.470 10.585 31.987 42.037 5.674 6.815 0.426 3.739 18.144 42.054 5.849 5.772 0.643 1.939 15.512 42.064 5.638 6.869 0.444 6.788 23.333 42.033 5.575 11.171 0.286 17.642 37.733 42.057
7.337 3.103 0.952 3.513 23.984 40.649 7.329 4.913 0.945 2.670 18.180 40.756 6.783 2.058 0.914 1.167 13.283 42.781 7.311 2.686 0.972 1.774 15.646 40.306 6.961 2.150 0.938 1.261 13.806 42.276 6.901 2.661 0.919 2.451 19.288 42.182
7.376 6.246 0.869 3.954 24.007 37.689 7.555 9.253 0.980 6.922 30.920 38.911 7.499 4.342 0.945 2.585 18.708 38.976 7.302 4.342 0.938 1.856 17.082 36.380 7.327 4.171 0.938 3.172 21.663 37.710 7.069 5.555 0.948 3.649 25.003 36.233
7.789 14.134 0.353 5.307 26.918 40.675 7.983 15.628 0.407 24.203 43.778 42.033 7.712 9.248 0.390 2.984 20.999 41.463 7.838 10.191 0.429 9.784 29.691 41.077 7.788 10.449 0.340 6.841 28.060 41.667 7.841 13.749 0.237 10.669 38.630 41.499
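The dual-image wavelet fusion (DIWF) stage named throughout the comparison can be illustrated with a short sketch. The excerpt does not restate the wavelet basis or the fusion rule, so the one-level Haar transform and the rule used here (average the approximation sub-band, keep the larger-magnitude detail coefficients) are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def haar2d(x):
    """One-level 2-D Haar DWT -> (LL, (LH, HL, HH)); even dimensions assumed."""
    a = (x[:, 0::2] + x[:, 1::2]) / 2.0       # column averages
    d = (x[:, 0::2] - x[:, 1::2]) / 2.0       # column differences
    ll = (a[0::2] + a[1::2]) / 2.0
    lh = (a[0::2] - a[1::2]) / 2.0
    hl = (d[0::2] + d[1::2]) / 2.0
    hh = (d[0::2] - d[1::2]) / 2.0
    return ll, (lh, hl, hh)

def ihaar2d(ll, bands):
    """Exact inverse of haar2d."""
    lh, hl, hh = bands
    a = np.empty((ll.shape[0] * 2, ll.shape[1]), float)
    a[0::2], a[1::2] = ll + lh, ll - lh
    d = np.empty_like(a)
    d[0::2], d[1::2] = hl + hh, hl - hh
    x = np.empty((a.shape[0], a.shape[1] * 2), float)
    x[:, 0::2], x[:, 1::2] = a + d, a - d
    return x

def fuse(img1, img2):
    """Fuse two registered images of equal size: average the LL
    (approximation) band, keep the max-magnitude detail coefficients."""
    ll1, b1 = haar2d(img1.astype(float))
    ll2, b2 = haar2d(img2.astype(float))
    ll = (ll1 + ll2) / 2.0
    bands = tuple(np.where(np.abs(x) >= np.abs(y), x, y)
                  for x, y in zip(b1, b2))
    return ihaar2d(ll, bands)
```

Because the forward and inverse transforms are exact, fusing an image with itself returns the image unchanged, which is a convenient correctness check; in the paper's pipeline the two inputs would instead be the upper- and lower-stretched versions of the same underwater image.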
References

Abdul Ghani, A.S., Mat Isa, N.A., 2014. Underwater image quality enhancement through Rayleigh-stretching and averaging image planes. Int. J. Naval Architect. Ocean Eng. 6, 840–866. https://doi.org/10.2478/IJNAOE-2013-0217.
Abdul Ghani, A.S., Mat Isa, N.A., 2015. Enhancement of low quality underwater image through integrated global and local contrast correction. Appl. Soft Comput. 37, 332–344. https://doi.org/10.1016/j.asoc.2015.08.033.
Abu Hassan, M.F., Abdul Ghani, A.S., Dhanesh, R., Radman, A., Suandi, S.A., 2017. Enhancement of under-exposed image for object tracking algorithm through homomorphic filtering and mean histogram matching. Adv. Sci. Lett. 23 (11), 11257–11261.
Agaian, S.S., Lentz, K.P., Grigoryan, A.M., 2000. A new measure of image enhancement. In: Proceedings of the International Conference of Signal Processing and Communication (IASTED), pp. 19–22.
Agaian, S.S., Panetta, K., Grigoryan, A.M., 2001. Transform based image enhancement with performance measure. IEEE Trans. Image Process. 10 (3), 367–381.
Agaian, S.S., Silver, B., Panetta, K.A., 2007. Transform coefficient histogram-based image enhancement algorithms using contrast entropy. IEEE Trans. Image Process. 16 (3), 741–758.
Bazeille, S., Quidu, I., Jaulin, L., Malkasse, J.P., 2006. Automatic underwater image pre-processing. In: SEA TECH WEEK, Caractérisation du Milieu Marin, Brest, France.
Bibalan, M., Amindavar, H., 2015. A new alpha and gamma based mixture approximation for heavy-tailed Rayleigh distribution. In: IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), South Brisbane, QLD, 19–24 April 2015, pp. 3711–3715. https://doi.org/10.1109/ICASSP.2015.7178664.
Chen, B.H., Wu, Y.L., Shi, L.F., 2017. A fast image contrast enhancement algorithm using entropy-preserving mapping prior. IEEE Trans. Circ. Syst. Video Technol. 1–12. https://doi.org/10.1109/TCSVT.2017.2773461.
Dwivedi, P.K., Paul, B., Ghoshal, D., 2015. Underwater image enhancement using distance factor estimation. In: IEEE International Conference on Electrical, Electronics, Signals, Communication and Optimization (EESCO), Visakhapatnam, 24–25 January 2015, pp. 1–5. https://doi.org/10.1109/EESCO.2015.7254047.
Eddins, S., 2013. Homomorphic filtering. MathWorks blog. http://blogs.mathworks.com/steve/2013/06/25/homomorphic-filtering-part-1/ (accessed 1 March 2016).
Erat, O., Panetta, K., Agaian, S., 2017. Contrast enhancement for underwater images in maritime border protection. In: IEEE International Symposium on Technologies for Homeland Security (HST), Waltham, MA, USA, 25–26 April 2017.
Eustice, R., Pizarro, O., Singh, H., Howland, J., 2002. UWIT: underwater image toolbox for optical image processing and mosaicking in MATLAB. In: Proceedings of the 2002 International Symposium on Underwater Technology, Tokyo, Japan, April 2002, pp. 141–145.
Hitam, M.S., Yussof, W.N.J.W., Awalludin, E.A., Bachok, Z., 2013. Mixture contrast limited adaptive histogram equalization for underwater image enhancement. In: IEEE International Conference on Computer Applications Technology (ICCAT), Sousse, 20–22 January 2013, pp. 1–5.
Hurtik, P., Madrid, N., 2015. Bilinear interpolation over fuzzified images: enlargement. In: IEEE International Conference on Fuzzy Systems (FUZZ-IEEE), Istanbul, 2–5 August 2015, pp. 1–8. https://doi.org/10.1109/FUZZ-IEEE.2015.7338082.
Iqbal, K., Salam, R.A., Osman, A., Talib, A.Z., 2007. Underwater image enhancement using integrated color model. IAENG Int. J. Comput. Sci. 34 (2), IJCS 34-2-12.
Iqbal, K., Odetayo, M., James, A., Salam, R.A., Talib, A.Z.H., 2010. Enhancing the low quality images using unsupervised color correction method. In: International Conference on System Man and Cybernetics (SMC), Istanbul, 10–13 October 2010, pp. 1703–1709.
Jaya, V.L., Gopikakumari, R., 2013. IEM: a new image enhancement metric for contrast and sharpness measurements. Int. J. Comput. Appl. 79 (9), 1–9.
Li, Y., Lu, H., Li, J., Li, X., Serikawa, S., 2015. Underwater image enhancement using inherent optical properties. In: Proceedings of the 2015 IEEE International Conference on Information and Automation, Lijiang, China, August 2015, pp. 419–422. https://doi.org/10.1109/ICInfA.2015.7279324.
Mittal, A., Soundararajan, R., Bovik, A.C., 2013. Making a completely blind image quality analyzer. IEEE Signal Process. Lett. 22 (3), 209–212.
Naim, M.J.N.M., Isa, N.A.M., 2012. Pixel distribution shifting color correction for digital color images. Appl. Soft Comput. 12 (9), 2948–2962.
Padmavathi, G., Muthukumar, M., Thakur, S.K., 2010. Non linear image segmentation using fuzzy C-means clustering method with thresholding for underwater images. Int. J. Comput. Sci. Issues (IJCSI) 9 (3), 35–40.
Qing, C., Huang, W., Zhu, S., Xu, X., 2015. Underwater image enhancement with an adaptive dehazing framework. In: IEEE International Conference on Digital Signal Processing (DSP), Singapore, 21–24 July 2015, pp. 338–342. https://doi.org/10.1109/ICDSP.2015.7251888.
Soni, K., Mehra, M., Kumar, R., 2015. Design, performance and cost analysis of various band pass IIR filters for myriametre band applications. In: IEEE 3rd International Conference on MOOCs, Innovation and Technology in Education (MITE), Amritsar, India, October 2015, pp. 382–387. https://doi.org/10.1109/MITE.2015.7375349.
Wang, Z., Bovik, A.C., Sheikh, H.R., Simoncelli, E.P., 2004. Image quality assessment: from error visibility to structural similarity. IEEE Trans. Image Process. 13 (4), 600–612.
Wu, J., Huang, H., Qiu, Y., Wu, H., Tian, J., Liu, J., 2005. Remote sensing image fusion based on average gradient of wavelet transform. In: Proceedings of the IEEE International Conference on Mechatronics & Automation, Niagara Falls, Canada, 29 July – 1 August 2005, vol. 4, pp. 1817–1821.
Yelmanova, E.S., Romanyshyn, Y.M., 2017. Adaptive enhancement of monochrome images with low-contrast objects. In: CSIT 2017, Lviv, Ukraine, 5–8 September 2017.
Zhang, Z., Zhao, 2011. Butterworth filter and Sobel edge detection to image. In: International Conference on Multimedia Technology (ICMT), Hangzhou, 26–28 July 2011, pp. 254–256. https://doi.org/10.1109/ICMT.2011.6002091.