Automatic inspection system of surface defects on optical IR-CUT filter based on machine vision

Optics and Lasers in Engineering 55 (2014) 243–257
Automatic inspection system of surface defects on optical IR-CUT filter based on machine vision. Yan Liu, Feihong Yu*. Optical Engineering Department, Zhejiang University, Hangzhou 310027, China

Article info

Abstract

Article history: Received 22 September 2013 Received in revised form 18 October 2013 Accepted 22 November 2013 Available online 20 December 2013

The paper presents an automatic surface defects inspection system for the optical Infrared Cut-off (IR-CUT) filter, which is applied in all kinds of color cameras and video devices. The system comprises an illumination and imaging module, a moving module, a flipping module and a machine vision algorithm. To highlight all the defective regions, an improved dark-field illumination technique is utilized in the imaging module. In order to accurately localize the region of the optical IR-CUT filter in the captured image, the stationary wavelet transform (SWT) is introduced into the template matching algorithm. The introduction of SWT provides a more accurate estimate of the variances in the image and further facilitates the identification of the defective regions. The defects extraction method in this paper avoids the use of a complicated learning process over a set of samples. Convexity theory is implemented in the classification algorithm for edge crack defects. Experimental results on a variety of optical IR-CUT filter samples, including non-defective samples and samples with stain, scratch and edge crack defects, have shown the efficiency (1.05 s per sample) and accuracy (96.44%) of the proposed system. Moreover, the defects extraction performance of different filters is compared in this paper. The research and application of the system will greatly relieve the human workforce and inspire ideas for detecting defects of other small optical elements. Crown Copyright & 2013 Published by Elsevier Ltd. All rights reserved.

Keywords: Optical IR-CUT filter Stationary wavelet transform Template matching Surface defects

1. Introduction

Imaging devices such as Charge Coupled Devices (CCD) and Complementary Metal Oxide Semiconductors (CMOS) are responsive to light radiation in a wavelength band that is wider than the band of visible radiation detectable by the human eye (the visible spectrum) [1]. The differences between the human eye and CCD/CMOS sensors are especially significant in their sensitivity to the infrared spectrum, because many light sources, including the sun, emit infrared light. An optical IR-CUT filter is designed to reflect or block mid-infrared wavelengths while passing visible light. It is applied in all kinds of color cameras and video devices to prevent infrared radiation from reaching the imaging sensor, in an attempt to capture images as close as possible to those perceived by the human eye. However, an optical IR-CUT filter with defects can spoil the image through chromatic aberration and redundant objects in the image. Hence, using a color camera to obtain a true color image in a white-light environment requires a non-defective optical IR-CUT filter to eliminate the influence of infrared light. Unfortunately, current optical IR-CUT filter surface defects inspection methods are mainly

* Corresponding author. E-mail address: [email protected] (F. Yu).

done manually by skilled human inspectors. Traditional manual methods generally have many disadvantages, such as low speed, high subjectivity and low accuracy. Automated optical inspection (AOI) systems have been widely used in industrial quality assurance procedures, and multi-task inspection in high-speed AOI systems is becoming a significant design problem [2]. Therefore, a reliable and efficient AOI system is needed to replace human inspectors in the optical IR-CUT filter production line. Machine vision is a typical interdisciplinary subject, which involves artificial intelligence, neurobiology, psychophysics, computer science, image processing, pattern recognition and many other areas [3]. Machine vision is mainly aimed at processing the information extracted from images and videos to reproduce certain intelligent behavior associated with human vision. It has been widely used in many industrial online inspection tasks such as printed circuit boards [4], fabric [5], electric components [6], wire bonding for Integrated Circuits (IC) [7], solar wafers [8] and fruit [9]. A variety of machine vision inspection techniques for surface defects have been discussed in the literature. Advances in the surface inspection field using computer vision and image processing techniques, particularly those based on texture analysis methods, are reviewed in [10]. Many other surface defects inspection techniques such as ultrasonic, laser, magnetic, ultraviolet and X-ray have been developed, but those techniques are either costly or slow, or have some limitations in environment or

0143-8166/$ - see front matter Crown Copyright & 2013 Published by Elsevier Ltd. All rights reserved. http://dx.doi.org/10.1016/j.optlaseng.2013.11.013


setup [11]. In the past decade, various machine vision methods for defect inspection of liquid crystal displays (LCD) have been developed [12]. D.-M. Tsai and H.-Y. Tsai [13] introduced an optical flow method to detect very low-contrast mura defects, and many advanced machine learning methods such as the support vector machine [14] have been proposed to classify surface defects of LCDs. As the most popular flat panel display, the LCD is much bigger in area than the optical IR-CUT filter: the diameter of the optical IR-CUT filter used in a cellphone is only 4 mm, as shown in Fig. 1. Hence, an inspection system for LCDs does not require an algorithm to localize the Region of Interest (ROI). Kuo et al. [15] applied image processing technology and a neural network to detect surface defects of color filters, which are a critical part of an LCD panel. The structure of the system proposed by Kuo is similar to an LCD inspection system, and the proposed algorithm cannot be employed in the IR-CUT filter defects inspection system because it relies on global image processing operations. In recent years, several techniques in the transform domain have been proposed to inspect various kinds of defects. Li [16] introduces the wavelet transform (WT) to automatically detect defect-induced partial fringe patterns. Rajshekhar et al. [17] proposed a two-dimensional space–frequency distribution based method to directly obtain the unwrapped estimate of the phase derivative. An algorithm to identify defects from fringe patterns using the two-dimensional Pseudo-Wigner-Ville distribution is also proposed by Rajshekhar et al. in [18]. Qian [19] chooses the two-dimensional windowed Fourier transform to determine the phase and phase derivatives. A windowed Fourier transform (WFT) based algorithm is described and theoretically analyzed for fault detection in [20]. In this paper, an automatic optical IR-CUT filter surface defects inspection system based on machine vision is proposed.
The key to the system is the result returned by the inspection algorithm. One difficulty for the optical IR-CUT filter defects inspection algorithm is the accurate localization of the ROI in the captured image. The most common approach to object localization is to slide a window across the image, which is called template matching. An improved template matching algorithm is performed in this paper to localize the region of the optical IR-CUT filter. The introduction of the Stationary Wavelet Transform (SWT) [21] into the template matching method can provide a more accurate estimate of the location of the

optical IR-CUT filter and further facilitate the extraction of defects. The proposed defects extraction method is very effective for inspecting various subtle defects and avoids the use of a complicated learning process. The research in this paper will inspire ideas for detecting defects of other small optical elements such as highly reflective mirrors, anti-reflective coatings and wafer-level optical filters. Furthermore, the localization algorithm in this paper also applies to finding other objects, which are not limited to optics. The defects extraction method might also be a good choice for other systems such as those proposed in [22–24] because it is effective and robust. The aim of defects classification is to guarantee that the machine inspection standard is consistent with the human inspection standard, and the theory is suitable for the classification of other kinds of defects which are determined by size and distance.

2. Mechanical and optical configuration

The architecture of the proposed system is shown in Fig. 2(a). First, the optical IR-CUT filters to be inspected are placed on tray A. Second, the optical IR-CUT filters are transferred one by one to dial B by a horizontally moving suction cup and inspected by the camera at dial B. Because almost no infrared light is refracted into the optical IR-CUT filter, both sides of it cannot be imaged in one photo. Hence, the flipping module, consisting of two rotating suction cups, is needed, as shown in part C of Fig. 2(a). Third, the optical IR-CUT filter is flipped from dial B to dial D for the defects inspection of the other side. Finally, according to the result returned by the inspection algorithm, the optical IR-CUT filter is sorted by a suction cup into the corresponding tray of part E. The flipping module is shown in detail in Fig. 2(b). Considering the behavior of the optical IR-CUT filter when transmitting light of different wavelengths, the performance of the illumination module is essential to the whole optical IR-CUT filter defects inspection system. Choosing the right lighting strategy remains a difficult problem because there is no specific guideline for integrating it into machine vision applications [25]. A suitable light source should ensure sufficient brightness and produce some differences between the non-inspection region and the ROI. There is an

Fig. 1. Optical IR-CUT filter.

Fig. 2. Mechanical structure of optical IR-CUT filter defects inspection system, (a) the whole structure and (b) the flipping module.


infrared light reflector and absorber in the optical IR-CUT filter, which has a transmittance of 2% in the wavelength band of 720–1100 nm. We choose a ring-form infrared Light Emitting Diode (LED) with an 850 nm lighting wavelength as the illuminant of the system. In this case, almost no infrared beam passes through the optical IR-CUT filter, so defects on the tray covered by the optical IR-CUT filter are excluded from the image. For this reason, the AOI system for the optical IR-CUT filter must contain a flipping module to inspect the other side. Dark-field illumination and bright-field illumination are two standard methods in surface defects inspection [26], which are also widely used in microscopy to describe illumination techniques that enhance the contrast in unstained samples. In our system, a modified dark-field illumination module is used, which excludes the un-scattered beam from the image. It is a specialized illumination technique that capitalizes on oblique illumination. Light scattering has always been a significant and fundamental problem in the field of optics [27]. Hence, it is crucial to comprehend the scattering characteristics of surface defects. There are three kinds of defects that can exist on the surface of the optical IR-CUT filter: stain, scratch and edge crack. Stains are caused by dirt in the air or dirty objects coming into contact with the filter. The cleaning and testing process of the optical IR-CUT filter may leave scratches on the surface through contact with sharp objects such as tweezers. Cracks are generated in the process of segmentation after coating. As shown in Fig. 3(a), the infrared light progresses from the top of the figure downward and approaches the optical IR-CUT surface at a 30° angle. It is supposed that one deep scratch is on the


optical IR-CUT filter surface. According to Snell's law [28], when defects exist, the incident ray B1O3 will be reflected along O3B1′, which is almost perpendicular to the imaging sensor surface. Similarly, the incident ray C1O2 is also reflected by the defect and the reflected ray O2C1′ will reach the imaging sensor. However, if no defect exists on the surface, the incident ray A1O1 will be reflected at point O1, and the reflected ray O1A1′ will not be received by the imaging sensor. If the scratch is not so deep, the incident rays B1O3 and C1O2 will instead be reflected at points O6 and O5 and travel along O6E1′ and O5F1′ respectively, which will not participate in the imaging procedure. Hence, another ring-form infrared light source with a 60° angle is used in this system to highlight shallow scratches, as shown in Fig. 3(b). The incident light B2O2′ will be reflected at point O2′ along O2′B2′, which will be received by the imaging sensor. In the same way, the ray O3′C2′ will enter the camera to light the shallow scratch. Reflection examples for a stain, assumed to be a raised nubbin on the surface, are also displayed in Fig. 3(a and b). Because edge crack defects are similar to scratches in some sense, their reflection model is not analyzed in this part. In conclusion, the combined use of these two ring-form infrared light sources meets the requirements of optical IR-CUT filter defects inspection, and an image with bright defects and a black background is obtained. In addition, because light is first incident on the reflector in the practical application of the optical IR-CUT filter, the discussion in this part is specific to the reflector side. The basic models of the improved dark-field illumination and the traditional dark-field illumination are illustrated in Fig. 4(a and b) respectively. The design of two ring-form infrared LEDs with different incident angles ensures the recognition rate of the defects.
Indeed, even though the LED light source has a fixed incident angle, some light with different angles will be reflected by the non-defective region and enter the camera as stray light. Hence, there is no black region with grayscale value zero in the image. CCD cameras are widely used in surface defects inspection systems. In this system, a monochromatic CCD camera is used to capture images of the optical IR-CUT filter. As shown in Fig. 5, four example images denote stain, scratch, edge crack and non-defective optical IR-CUT samples respectively. Defects appear as non-uniform brightness in the gray-level image. The outer circle in Fig. 5 is the border of the tray used to place the optical IR-CUT filter. Bright regions exist not only in the optical IR-CUT filter, but also on the tray. Hence, the first step of the algorithm is to localize the region of the optical IR-CUT filter in the captured image.

3. Machine vision algorithm Fig. 3. Optical model of defects inspection. (a) Deep scratch and stain and (b) shallow scratch and stain.

After obtaining the image of optical IR-CUT filter, the next step is to extract useful information from it. Machine vision algorithm

Fig. 4. (a) Improved dark-field illumination model. (b) Traditional dark-field illumination model.


Fig. 5. Sample images of optical IR-CUT filter with defects and a non-defective one. (a) Stain, (b) scratch, (c) edge crack and (d) non-defective sample.

performed in this system includes ROI localization, defects finding and defects classification.

3.1. ROI localization

The first step of the automatic optical IR-CUT filter defects inspection algorithm is finding the location of the optical IR-CUT filter in the image. The most common approach to object localization is to slide a window across the image, which is called template matching. A typical situation of template matching is searching for a specific pattern within a larger image possibly containing it. The problem of template detection fits within the statistical fields of detection and estimation theory, which in turn may be considered as particular cases of game theory. Another popular approach is to extract local interest points from the image, and then to classify each of the regions around these points [29]. A weakness of this approach is that it can fail when the template image information is insufficient. Because there are few corner feature points in the region covered by the optical IR-CUT filter, an improved template matching approach is proposed to localize the optical IR-CUT filter region. A major issue in template matching is the stability of similarity scores with respect to noise, including un-modeled phenomena [30]. To handle this issue, the improved template matching algorithm is composed of three stages. In the first stage, an image de-noising method based on the stationary wavelet transform (SWT) is performed on the image. In the second stage, the Normalized Cross Correlation (NCC) between the template and the target image, both after the SWT de-noising operation, is computed. NCC overcomes the influence of lighting variance by normalizing the image and template vectors to unit length. In the final stage, the most similar region in the image is detected by comparing the correlation coefficients.

3.1.1. Stationary wavelet transform

SWT is a redundant transform that doubles the number of samples at each iteration, which can provide a more accurate estimation of the variances and facilitate the identification of

Fig. 6. Stationary wavelet transform decomposing procedures and filter computation procedures.

salient features in a signal, especially for recognizing noise or signal rupture [31]. SWT is a kind of non-orthogonal wavelet transform. In the orthogonal wavelet transform, the high-frequency content removed by the subsampling process of the low-pass filter yields a descending image resolution, and only by doubling the subsampling scale factor can the reversibility of the wavelet transform (WT) be ensured [32]. SWT is shift-invariant because its subsampling operation is canceled. Hence, the low-pass and high-pass filters need an insertion of zeros between the coefficients of every


level filter, which is called the à trous algorithm [33]. The decomposition procedures of SWT are shown in Fig. 6; in addition, the relationship between the filters Lj and Hj can also be visualized there. We choose the diagonal coefficients for de-noising and reconstruct the image with the approximation coefficients. Let g be the operator that alternates a given sequence with zeros; thus, for all integers p,

    g[n] = 0      if n = 2p
    g[n] = g[p]   if n = 2p + 1                                  (1)

Define the filters Lj and Hj to have weights gLj and gHj respectively. The SWT filter pair (Lj, Hj) no longer satisfies the orthogonality conditions, while the sampled pair (gLj, gHj) still does. We assume that the operator (g0Lj, g0Hj) produces the even-indexed coefficients and the operator (g1Lj, g1Hj) generates the odd-indexed coefficients, and apply R0[j] and R1[j] to perform the corresponding inverse transforms. Hence, the reconstruction operator of SWT can be written as (R0[j] + R1[j])/2, so the inverse stationary wavelet transform (ISWT) of the original data is

    aj = (1/2) (R0[j] + R1[j]) (aj+1, dj+1)                      (2)

There are three main steps in the SWT threshold de-noising method. First, decompose the image at level N using a wavelet basis in SWT to obtain approximation coefficients and detail coefficients. Second, choose a shrinkage function and a threshold to shrink the coefficients at each level. Finally, use the shrunk coefficients to reconstruct the de-noised image. Factors that determine the defects extraction effect include the decomposition level N, the wavelet basis and the threshold method. In general, the decomposition level N is related to the image size. Wavelet bases include haar, db, sym, etc. The Haar wavelet basis is the simplest one in the wavelet basis family, with the shortest compact support. Db wavelets are not symmetrical. Sym wavelets are a general term for a series of nearly symmetrical, compactly supported wavelets. The choice of threshold method includes the shrinkage function and the threshold selection rules.
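The zero-insertion operator of Eq. (1) and the shift-invariance that motivates SWT can be illustrated with a small NumPy sketch. This is a toy 1-D example with Haar filters and circular convolution, not the paper's implementation; the function names are illustrative:

```python
import numpy as np

def atrous_upsample(g):
    # À trous zero-insertion (Eq. 1): zeros at even indices, taps at odd indices
    out = np.zeros(2 * len(g))
    out[1::2] = g
    return out

def swt_level(x, lo, hi):
    # One level of the stationary (undecimated) transform: circular
    # convolution with the analysis filters, no subsampling
    n = len(x)
    a = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(lo, n)))
    d = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(hi, n)))
    return a, d

lo = np.array([1.0, 1.0]) / np.sqrt(2)   # Haar low-pass filter
hi = np.array([1.0, -1.0]) / np.sqrt(2)  # Haar high-pass filter
```

Because no subsampling takes place, shifting the input simply shifts the coefficient sequences, which is the shift-invariance property the text describes.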

• Shrinkage function

Donoho and Johnstone proposed two classical threshold functions: the hard threshold function [34] and the soft threshold function [35]. The hard threshold estimator dm(x) is implemented as

    dm(x) = x   if |x| > T
    dm(x) = 0   if |x| ≤ T                                       (3)

where T is the threshold value. The soft threshold function dm(x) is the solution that minimizes a distance to the data, which is written as

    dm(x) = x − T   if x ≥ T
    dm(x) = 0       if |x| ≤ T
    dm(x) = x + T   if x ≤ −T                                    (4)

• Threshold selection rules

Many approaches have been proposed for calculating the threshold value, such as the fixed form threshold, multi-scale SURE, minimax and the penalization rule. We choose Rigorous SURE as the threshold rule in the de-noising procedures. Define σ as the standard deviation of the image data. Donoho proposed a good estimator of σ for wavelet de-noising, given as

    σ = median(|dN−1,k|) / 0.6745,   k = 0, 1, …, 2^(N−1) − 1     (5)

Rigorous SURE is an unbiased estimate of the mean-squared error of a given estimator. First, a candidate value of the threshold T is given; then the risk of T is computed, and the final T is obtained by minimizing it. Let X be a vector whose elements are the squares of the SWT coefficients sorted in ascending order, X = [x1, x2, …, xn], x1 ≤ x2 ≤ … ≤ xn, where n denotes the number of SWT coefficients. The risk ri of this rule is

    ri = [n − 2i + (n − i)xi + Σk=1..i xk] / n                    (6)

where i = 1, 2, …, n. Supposing that rm is the smallest value in the risk set, the corresponding wavelet coefficient gives the threshold

    T = sqrt(xm)                                                  (7)
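The noise estimator of Eq. (5) and the Rigorous SURE threshold of Eqs. (6)–(7) can be sketched in a few lines of NumPy. This is a minimal illustration with hypothetical function names, operating on a flat array of detail coefficients:

```python
import numpy as np

def noise_sigma(detail):
    # Eq. (5): robust noise estimate from finest-level detail coefficients
    return np.median(np.abs(detail)) / 0.6745

def rigorous_sure_threshold(detail):
    # Eqs. (6)-(7): sort squared coefficients ascending, evaluate the
    # unbiased risk estimate for each candidate, take T = sqrt(x_m)
    x = np.sort(np.asarray(detail, float) ** 2)
    n = len(x)
    i = np.arange(1, n + 1)
    risk = (n - 2 * i + (n - i) * x + np.cumsum(x)) / n
    return np.sqrt(x[np.argmin(risk)])
```

For pure Gaussian noise of standard deviation σ, `noise_sigma` returns a value close to σ, which is the property Eq. (5) relies on.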

3.1.2. Template matching

The template image, matched against each video sequence, is acquired from a non-defective sample image because the radius and shape of the optical IR-CUT filter can be obtained from the manufacturer. To identify the region of the optical IR-CUT filter, we compare the template image against the target image by moving the template patch one pixel at a time. At each location, a parameter is computed to express the match rate of this region. To eliminate the influence of uneven illumination, NCC is calculated as the parameter, which is defined as

    R(x, y) = Σx′,y′ (D′T(x′, y′) · D′S(x + x′, y + y′)) / sqrt(Σx′,y′ D′T(x′, y′)² · Σx′,y′ D′S(x + x′, y + y′)²)    (8)

where

    D′T(x′, y′) = DT(x′, y′) − (1 / (wT · hT)) Σx″,y″ DT(x″, y″)                                                      (9)

    D′S(x + x′, y + y′) = DS(x + x′, y + y′) − (1 / (wS · hS)) Σx″,y″ DS(x + x″, y + y″)                              (10)

In Eq. (9), DT(x′, y′) denotes the template image after the SWT de-noising operation. wT and hT represent the width and height of the template image, with x′ = 0…wT − 1, y′ = 0…hT − 1, x″ = 0…wT − 1 and y″ = 0…hT − 1. In addition, DS(x, y) in Eq. (10) is the target source image data after the SWT de-noising operation. The point (x, y) corresponds to the sliding start point of the template region in the target image. Similarly, wS and hS are the width and height of the target source image. The value of R(x, y) represents how similar the template is to that particular region of the target source image; a greater value of R(x, y) indicates higher similarity. In order to display the two-dimensional matrix in the form of an image, the image matrix R′ is calculated by Eq. (11). The procedures of template matching are shown in Fig. 7, which includes an example image of R′. The brightest location of R′ indicates the best match; we can locate the region of the optical IR-CUT filter by this brightest point, which stores the start-point coordinate of the best match region. A locating result is shown at the end of Fig. 7, in which the region covered by the optical IR-CUT filter is marked by a pink circle.

    R′ = R × 255                                                  (11)
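The NCC of Eqs. (8)–(10) can be sketched directly in NumPy. This is a straightforward (unoptimized) sliding-window implementation with an illustrative function name; production systems typically use FFT-based or integral-image accelerations:

```python
import numpy as np

def ncc_map(source, template):
    # Normalized cross correlation (Eqs. 8-10): each patch and the template
    # are mean-subtracted and normalized, so offset and gain changes cancel
    th, tw = template.shape
    t = template.astype(float) - template.mean()
    tnorm = np.sqrt((t ** 2).sum())
    H, W = source.shape
    R = np.zeros((H - th + 1, W - tw + 1))
    for y in range(R.shape[0]):
        for x in range(R.shape[1]):
            p = source[y:y + th, x:x + tw].astype(float)
            p = p - p.mean()
            denom = tnorm * np.sqrt((p ** 2).sum())
            R[y, x] = (t * p).sum() / denom if denom > 0 else 0.0
    return R
```

Because both vectors are normalized to unit length, the score at the true location is 1 regardless of a global brightness offset or gain, which is the lighting invariance the text attributes to NCC.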

3.2. Defects finding

Since we have localized the region of the optical IR-CUT filter, we just need to fill the region with the color RGB (255, 255, 255) to produce a mask, which is shown in Fig. 8(a). Then, an AND operation is performed between the mask image and the SWT de-noised source image. The final ROI sample image is illustrated in Fig. 8(b).
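The mask-and-AND step can be sketched as follows. The helper names and the explicit circle parameters are illustrative; in the real system the circle center and radius come from the template matching result:

```python
import numpy as np

def circular_mask(shape, center, radius):
    # Boolean disc over the localized filter region, False elsewhere
    yy, xx = np.mgrid[:shape[0], :shape[1]]
    return (yy - center[0]) ** 2 + (xx - center[1]) ** 2 <= radius ** 2

def apply_mask(img, mask):
    # AND operation: keep pixels inside the ROI, zero everything outside
    return np.where(mask, img, 0)
```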


Defects in the optical IR-CUT filter image appear as a set of bright regions. The binary large object (BLOB) algorithm [36] can be employed on many occasions: with a suitable threshold, the image can be segmented to find the regions where the gray-scale values change abruptly. There are several motivations for using the BLOB detection algorithm; one main reason is that it provides complementary information about regions, including size, shape, area and location. The BLOB algorithm proposed in this paper can be summarized in the following steps: (1) Convert the source image to a binary one. (2) Find the contours of the connected components. (3) Estimate the shape of the blobs, and calculate the area and the perimeter of the blobs.
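The three BLOB steps above can be sketched with a minimal connected-component pass. Note that the paper's step (2) uses border following and Freeman chains; this toy uses a simple 4-connected BFS labeling as a stand-in, and the function name is illustrative:

```python
from collections import deque
import numpy as np

def find_blobs(binary):
    # Label 4-connected components by BFS; report area and bounding box
    H, W = binary.shape
    seen = np.zeros((H, W), bool)
    blobs = []
    for sy in range(H):
        for sx in range(W):
            if binary[sy, sx] and not seen[sy, sx]:
                q = deque([(sy, sx)])
                seen[sy, sx] = True
                pts = []
                while q:
                    y, x = q.popleft()
                    pts.append((y, x))
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < H and 0 <= nx < W and binary[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            q.append((ny, nx))
                ys = [p[0] for p in pts]
                xs = [p[1] for p in pts]
                blobs.append({'area': len(pts),
                              'bbox': (min(ys), min(xs), max(ys), max(xs))})
    return blobs
```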

Fig. 7. Process of template matching.

Given that the ROI image has not yet been binarized, we must start the algorithm from step (1). It is necessary to find an optimal threshold method to convert the gray-scale mask into a binary image that highlights all the defect regions. Fig. 9(a) demonstrates a mask image containing some hardly visible stain defects. A basic threshold method is unable to adapt to the imaging properties of the optical IR-CUT filter in the infrared band, so a threshold method using the Sobel operator is performed in this system. The Sobel operator gives an approximation of the image gradient by differencing pixels in the horizontal and vertical directions. Because the gradient is a two-dimensional vector, it has a norm and a direction. Assuming that DM is the data of the mask image, the Euclidean norm, also called the L2 norm, is computed by Eq. (12). However, we choose to compute the norm as the sum of absolute values, as shown in Eq. (13); this L1 norm gives values close to the Euclidean norm at a much lower computational cost. Eq. (14) shows the direction angle of the gradient. The mask image processed by the Sobel operator according to the L1 norm and Eq. (15), sobelfinal(x, y), is shown in Fig. 9(b) and indicates that the defect regions are distinguishable from the other regions. Then, a basic threshold method as in Eq. (16) is performed. The choice of the threshold Th is a tradeoff between the defect regions and the background. Since the grayscale values of the majority of pixels in the non-defect region are 255, Th can be computed as the average grayscale value of the Sobel image shown in Fig. 9(b); in particular, only pixels whose grayscale value is under 255 contribute to the computation of Th. The binary result in Fig. 9(c) demonstrates that all pixels in the defect regions, including some hardly visible defects, are set to 255.
The morphological operator of opening is further performed on the binary image, because the opening filter removes the small blobs introduced by uneven illumination. The opening filter is defined as the dilation of the erosion of an image. The binary image after the opening filter is shown in Fig. 9(d), in which some points that are too small to contain the structuring element of the opening operator are eliminated. It should be noted that the opening operation should only be performed when the manufacturer requires only defects wider than 3 pixels to be detected.

    L2 norm = |grad(DM)| = sqrt((∂DM/∂x)² + (∂DM/∂y)²)            (12)

    L1 norm = |grad(DM)| = |∂DM/∂x| + |∂DM/∂y|                    (13)

    ∠grad(DM) = arctan(−(∂DM/∂y) / (∂DM/∂x))                      (14)

    sobelfinal(x, y) = −255 · sobel(x, y) / max(sobel) + 255      (15)

Fig. 8. (a) Mask and (b) ROI image.
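The Sobel-based thresholding of Eqs. (13), (15) and (16) can be sketched in NumPy. This is an illustrative sketch with hypothetical helper names; the convolution is written out explicitly to stay self-contained:

```python
import numpy as np

KX = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)  # Sobel horizontal kernel
KY = KX.T                                                   # Sobel vertical kernel

def conv3(img, k):
    # 3x3 correlation with edge padding; sign conventions are irrelevant
    # here because only absolute values enter the L1 norm
    pad = np.pad(img.astype(float), 1, mode='edge')
    out = np.zeros(img.shape, float)
    for i in range(3):
        for j in range(3):
            out += k[i, j] * pad[i:i + img.shape[0], j:j + img.shape[1]]
    return out

def sobel_binary(mask_img):
    # L1 gradient norm (Eq. 13), inverted sobelfinal map (Eq. 15),
    # Th as the mean of sub-255 pixels, then binarization (Eq. 16)
    l1 = np.abs(conv3(mask_img, KX)) + np.abs(conv3(mask_img, KY))
    if l1.max() == 0:
        return np.zeros(mask_img.shape, np.uint8)  # flat image: nothing to mark
    sf = -255.0 * l1 / l1.max() + 255.0
    th = sf[sf < 255].mean()
    return np.where(sf < th, 255, 0).astype(np.uint8)
```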


Fig. 9. (a) Mask image, (b) sobel image, (c) binary image and (d) opening image.
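The morphological opening used on the binary image (erosion followed by dilation) can be sketched with a fixed 3×3 structuring element. The element size and function names are illustrative choices, not the paper's exact parameters:

```python
import numpy as np

def erode(b):
    # 3x3 erosion: a pixel survives only if its whole neighborhood is set
    p = np.pad(b.astype(bool), 1, mode='constant')
    out = np.ones(b.shape, bool)
    for dy in range(3):
        for dx in range(3):
            out &= p[dy:dy + b.shape[0], dx:dx + b.shape[1]]
    return out

def dilate(b):
    # 3x3 dilation: a pixel is set if any neighbor is set
    p = np.pad(b.astype(bool), 1, mode='constant')
    out = np.zeros(b.shape, bool)
    for dy in range(3):
        for dx in range(3):
            out |= p[dy:dy + b.shape[0], dx:dx + b.shape[1]]
    return out

def opening(b):
    # Opening = dilation of the erosion; removes blobs thinner than 3 px
    return dilate(erode(b))
```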

    binary_img(x, y) = 255   if sobelfinal(x, y) < Th
    binary_img(x, y) = 0     if sobelfinal(x, y) > Th             (16)

The second step of the BLOB algorithm is finding the contours of the defects. Although algorithms like the Canny edge detector [37] can be used to find the edge pixels that separate different segments of an image, they do not tell you anything about those edges as entities in themselves [38]. A contour is a list of points that represent a curve in an image. The border following technique has been one of the fundamental techniques for finding contours and has a large variety of applications. Border following techniques and their improvements are described in [39], and we use this technique to find the contour of the optical IR-CUT filter. The detected contours are sequences of points representing some kind of curve in the image. In this case, the detected contours are stored as Freeman chains [40], in which a polygon is represented as a sequence of steps in one of eight directions and each step is designated by an integer from 0 to 7; the scheme is illustrated in Fig. 10. The contour of the hexagon can be represented by the series of numbers 0–1–3–4–5–7. In the final step, the shapes of the blobs are estimated, and the perimeter and area of each blob are computed. It is common to approximate a contour representing a polygon with another contour having fewer vertices to facilitate shape analysis. Although there are many different ways to do polygon approximation, the Douglas–Peucker (DP) [41] algorithm is widely used to approximate a series of points. Hence, the DP algorithm is performed in the final step of the BLOB algorithm.

3.3. Defects classification

Three kinds of defects may exist in the optical IR-CUT filter: scratch, stain and edge crack. Since many blobs in the binary image are similar to ellipses, an ellipse fitting method, performed on all blobs inside the optical IR-CUT filter, is used to distinguish stains from scratches.
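Freeman chain encoding of a contour can be sketched as a direction lookup per unit step. The direction numbering below uses an x-right/y-up convention with 0 = east, counting counter-clockwise; this is one common convention and may differ from the numbering in Fig. 10:

```python
# Map each unit step (dx, dy) between successive contour points to a
# Freeman direction 0-7 (assumed convention: 0 = east, counter-clockwise)
DIRS = {(1, 0): 0, (1, 1): 1, (0, 1): 2, (-1, 1): 3,
        (-1, 0): 4, (-1, -1): 5, (0, -1): 6, (1, -1): 7}

def freeman_chain(points):
    # Encode a polyline of unit steps as a list of direction codes
    return [DIRS[(x1 - x0, y1 - y0)]
            for (x0, y0), (x1, y1) in zip(points, points[1:])]
```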

Fig. 10. Freeman chain moves are numbered 0–7.

Assuming that variable a is the major axis of the fitted ellipse and variable b is the minor axis, if a > 3b, the blob is considered a scratch; otherwise, the blob is regarded as a stain. Edge crack is a kind of defect that may exist at the edge of the optical IR-CUT filter, which is caused by surface strain. In our system, edge crack is described by computing the convex hull and convexity defects [42] of a contour, a technique also applied in the field of gesture recognition and tracking. The computation of the convex hull of a finite set of points is one of the first and most important concepts studied in computational geometry [43]. The convex hull has several different definitions and several construction methods [44]. The Sklansky '82 algorithm [45] is a sequential linear-time algorithm used to compute the convex hull of a set of points. For a set of N points {d0, d1, d2, …, dN} ∈ D, where D denotes the contour of the optical IR-CUT filter, the convex hull of D is defined as the convex polygon of smallest area that can enclose D. A hull P can be created with M points from the set of N to form a minimum-area convex polygon. Referring to Fig. 11, α is the interior angle of the hull; if α > π, then d9 ∉ P, so the final convex hull P is {d1, d2, d4, d5, d6, d8}. A convexity defect of a hull is the space between the hull and the actual object, such as the triangular region d1d8d9 in Fig. 11, which
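The convexity-defect measure can be sketched as the area difference between a contour and its convex hull. The paper uses the Sklansky '82 algorithm; this sketch substitutes Andrew's monotone chain (a standard alternative) and the shoelace formula, with illustrative function names:

```python
def convex_hull(points):
    # Andrew's monotone chain: build lower and upper hull halves
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def half(seq):
        out = []
        for p in seq:
            while len(out) >= 2 and \
                  (out[-1][0] - out[-2][0]) * (p[1] - out[-2][1]) - \
                  (out[-1][1] - out[-2][1]) * (p[0] - out[-2][0]) <= 0:
                out.pop()
            out.append(p)
        return out
    lower, upper = half(pts), half(pts[::-1])
    return lower[:-1] + upper[:-1]

def shoelace_area(poly):
    # Polygon area from ordered vertices
    n = len(poly)
    return abs(sum(poly[i][0] * poly[(i + 1) % n][1] -
                   poly[(i + 1) % n][0] * poly[i][1] for i in range(n))) / 2.0

def convexity_defect_area(contour):
    # Edge-crack measure: area between the hull and the actual contour
    return shoelace_area(convex_hull(contour)) - shoelace_area(contour)
```

A contour with no notch yields zero defect area; a notch cut into an edge produces a positive area that can be compared against the permissible range.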


Y. Liu, F. Yu / Optics and Lasers in Engineering 55 (2014) 243–257

Fig. 11. Convex hull logic.

Fig. 12. Flow chart of the proposed optical IR-CUT filter defects inspection algorithm.

can be regarded as an edge crack region in the optical IR-CUT filter image. If the optical IR-CUT filter has no edge crack defects, the convex hull overlaps the inner contour of the optical IR-CUT filter and no convexity region is produced. On the contrary, if the optical IR-CUT filter has edge crack defects, the

convexity regions will be generated. If the area of such a region exceeds the permissible range, the inspected optical IR-CUT filter is considered defective. To better understand the entire inspection process, a flow chart of the algorithm is illustrated in Fig. 12, including
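The edge-crack test above (hull area minus contour area) can be sketched as follows. This illustration uses Andrew's monotone chain rather than the Sklansky'82 routine cited in the text (the resulting hull is the same polygon); the contour format and the area threshold are our assumptions.

```python
def convex_hull(points):
    """Andrew's monotone chain; returns hull vertices in counterclockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def polygon_area(poly):
    """Shoelace formula for the area of a simple polygon."""
    n = len(poly)
    s = sum(poly[i][0] * poly[(i + 1) % n][1] - poly[(i + 1) % n][0] * poly[i][1]
            for i in range(n))
    return abs(s) / 2.0

def convexity_defect_area(contour):
    """Total area between the hull and the contour: nonzero signals a notch."""
    return polygon_area(convex_hull(contour)) - polygon_area(contour)

def has_edge_crack(contour, max_area):
    """Flag the filter as defective when the convexity region exceeds max_area."""
    return convexity_defect_area(contour) > max_area
```

For a square contour with a triangular notch cut into one side, `convexity_defect_area` returns exactly the notch area, and it returns 0 for a convex contour.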


Table 1
Machine defects inspection results.

Results name      Nondefection   Scratch   Stain   Edge crack   Total samples   Efficiency (time/number), s
Proposed method   2858           31        387     24           3300            1.05
Skilled worker    2873           39        366     22           3300            2.36
NO SWT            2456           28        792     24           3300            1.03

Table 2
Accuracy comparison between proposed method and NO SWT.

Method                Nondefection   Scratch   Stain   Edge crack   General accuracy
Proposed method (%)   99.51          94.87     95.90   95.45        96.44
NO SWT (%)            88.97          82.05     12.0    90.91        68.49
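The per-class accuracies in Table 2 follow the evaluation criterion later defined as Eq. (17); a minimal numerical check (the edge-crack counts 22 vs. 24 are taken from Table 1):

```python
def accuracy(n_human, n_machine):
    """Eq. (17): accuracy = (1 - |N_human - N_machine| / N_human) * 100%."""
    return (1.0 - abs(n_human - n_machine) / n_human) * 100.0

# Edge-crack counts from Table 1: 22 (skilled worker) vs. 24 (machine).
print(round(accuracy(22, 24), 2))   # 90.91
```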

ROI localization, defects finding and defects classification. Every step of the inspection algorithm is described in detail in the flow chart.

4. Experimental results

4.1. Inspection accuracy and efficiency

The proposed machine vision algorithm was implemented on a computer in the C++ language. To evaluate the effectiveness of the proposed system, a total of 3300 optical IR-CUT filter samples were inspected by the machine; they had already been labeled as non-defection, scratch, stain and edge crack by a skilled worker. This manual inspection result is used as the judging standard for the accuracy of the machine. Machine inspection results for the samples are shown in Table 1. To estimate the efficiency of the system, the inspection time of all 3300 samples by the machine and by the human worker was recorded. The efficiency of the system is displayed in Table 1, which shows that the machine inspection system requires less time. In addition, the inspection results of the algorithm without the SWT procedure are also displayed in Table 1. Owing to the influence of noise, the algorithm without SWT can produce false white regions in the binary image, which in turn causes false results. To evaluate the accuracy of the system in a more intuitive manner, we define the evaluation criterion as

accuracy = (1 − |N_human − N_machine| / N_human) × 100%   (17)

where N_human and N_machine denote the total numbers of the corresponding classification results by human and machine, respectively. The inspection accuracies of the two methods for non-defection, scratch, stain and edge crack are displayed in Table 2. As can be seen from Table 2, the final accuracy of the algorithm with SWT is 96.44%, computed by averaging the four per-class accuracies. In contrast, the final accuracy of the algorithm without SWT is only 68.49%, mainly because of misjudgments of stain defects.

4.2. Localization accuracy

Localizing the region of the optical IR-CUT filter accurately is crucial to the success of the inspection algorithm. An improved template

Fig. 13. Comparison of localization accuracy between the proposed method and traditional template matching method. (a) NCC and (b) center distance.

Fig. 14. Localization method of real center.

matching algorithm is utilized to localize the optical IR-CUT filter. The NCC R(x, y) is computed by Eq. (8) and represents the similarity between the template and a region in the target image. The maximum value of the matrix R indicates the region in the target image with the best match to the template. In particular, a test set A of 100 optical IR-CUT filter samples was picked randomly from the 3300 samples, and the matrix R of each sample in set A was calculated. Assume that matrix R1 is the match coefficient of the improved template matching method and R2 is the match coefficient of the traditional template matching method. The


maximum values of R1 and R2 for each sample are compared in Fig. 13(a). To make the chart easier to read, the samples in set A are ranked by the maximum value of NCC in decreasing order. Fig. 13(a) shows that the proposed method greatly improves the similarity between the template and the target image, which is beneficial for finding the precise location of the optical IR-CUT filter. To further illustrate the accuracy of the proposed method, 10 samples were chosen randomly as test set B. Two right triangles are picked from the circle of the optical IR-CUT filter to find the point O manually as the real center, as shown in Fig. 14. Assuming that the centers localized by the traditional template matching method and the proposed method are O1 and O2 respectively, the center distances OO1 and OO2 are calculated and compared in Fig. 13(b). This demonstrates that the proposed method achieves better precision. In addition, although the Hough circle transform [46] is a simple way to detect a circle, it is not robust for complicated and varied images. The Hough circle detection results in Fig. 15(a) reveal that the Hough method cannot localize the region of the optical IR-CUT filter accurately. Moreover, suitable settings of the Hough transform parameters are hard to choose, so some incorrect circles are detected. Fig. 15(b) is an example localization result of the traditional method, which causes an inaccurate estimate of the edge crack size because of a small shift of the center. The localization result of the proposed method in Fig. 15(c) indicates that the circle in this image is closer to the real location of the optical IR-CUT filter than that of the traditional method.
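The NCC of Eq. (8) can be sketched with a brute-force slide over the target image. This is an illustrative implementation with our own function names, not the paper's accelerated routine:

```python
import numpy as np

def ncc(template, window):
    """Zero-mean normalized cross-correlation of two equal-sized patches."""
    t = template - template.mean()
    w = window - window.mean()
    denom = np.sqrt((t * t).sum() * (w * w).sum())
    return float((t * w).sum() / denom) if denom else 0.0

def match(image, template):
    """Slide the template over the image; return (max NCC, top-left location)."""
    H, W = image.shape
    h, w = template.shape
    best, loc = -1.0, (0, 0)
    for y in range(H - h + 1):
        for x in range(W - w + 1):
            r = ncc(template, image[y:y + h, x:x + w])
            if r > best:
                best, loc = r, (y, x)
    return best, loc
```

When the template is embedded verbatim in the image, the maximum of the correlation matrix is 1 at the embedding location, which is the property the localization step relies on.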

4.3. Defects extraction accuracy

In this paper, there are three steps to distinguish defective regions from the background surface in each optical IR-CUT filter. Table 3 summarizes the extraction performance for three kinds of defects under six image processing filters: median filter, Gaussian filter, WT with haar and with sym5 wavelet bases, and SWT with haar and with sym5 wavelet bases. The original defect images, not processed by any filter, are displayed in the first row of Table 3. To show the variation of the defect regions in a more intuitive form, the gray values of the images are displayed in 3D perspective. The binary images in Table 3 are obtained from the L1 norm and Eq. (16). The images shown in the opening operation row of Table 3 are the final images used to extract defects. It can be seen from the binary images of the original defect images in Table 3 that the noise caused by uneven illumination and the low spectral response of the CCD can seriously affect the extraction of defects. Although the median and Gaussian filters reduce the noise to a certain degree, they also turn some fine regions in the image into spurious defects. WT, which has high time–frequency and multi-resolution characteristics, overcomes the disadvantages of the Fourier transform (FT) and performs better in image and signal processing. However, small shifts in the input signal can cause major variations in the distribution of energy among coefficients at different levels and may cause errors in the WT reconstruction [47]. The haar wavelet basis has a very short support, so choosing WT with the haar wavelet basis as the image processing filter is likely to miss some defects, as shown in Table 3. Combining WT with the sym5 wavelet basis compensates for this drawback to some extent, but it still cannot meet the requirements for scratch extraction. SWT completely solves the shift-variance problem; however, SWT with the sym5 wavelet basis can enlarge the defect areas. Hence, the combination of SWT and the haar wavelet basis as pre-processing for defects extraction is a robust method. Experimental results show that the introduction of SWT provides a more accurate estimate of the variances in the image and further facilitates the identification of the defective regions.
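The à trous structure that makes SWT shift-invariant (filtering without downsampling) is easy to write down for one haar level. This is our own sketch with circular boundary handling, together with the soft-threshold denoising of [35]; it is an illustration of the transform, not the paper's multi-level implementation:

```python
import numpy as np

def swt_haar_level1(x):
    """One level of the undecimated (a trous) haar transform: no downsampling,
    so the coefficient arrays keep the length of the input."""
    s = 1.0 / np.sqrt(2.0)
    xm = np.roll(x, 1)          # circular extension
    cA = s * (x + xm)           # approximation coefficients
    cD = s * (x - xm)           # detail coefficients
    return cA, cD

def iswt_haar_level1(cA, cD):
    """Inverse: average the two shifted synthesis branches (shift-invariant)."""
    s = 1.0 / np.sqrt(2.0)
    x1 = s * (cA + cD)                   # branch aligned with x[n]
    x2 = np.roll(s * (cA - cD), -1)      # branch aligned with x[n-1]
    return 0.5 * (x1 + x2)

def denoise(x, thr):
    """Soft-threshold the detail coefficients [35], then reconstruct."""
    cA, cD = swt_haar_level1(x)
    cD = np.sign(cD) * np.maximum(np.abs(cD) - thr, 0.0)
    return iswt_haar_level1(cA, cD)
```

With `thr = 0` the reconstruction is exact, which is a quick sanity check that the synthesis branches are aligned correctly.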

Fig. 15. Localization results of different methods. (a) Hough circle transform, (b) traditional method and (c) proposed method.
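The binarize-then-open pipeline compared in Table 3 can be sketched with numpy alone. The 3x3 structuring element is our simplification (the kernel size in the real system is a tuning choice), and the helpers are illustrative, not the paper's code:

```python
import numpy as np

def erode(b):
    """3x3 binary erosion with zero padding: a pixel survives only if its
    whole 3x3 neighborhood is set."""
    p = np.pad(b, 1)
    out = np.ones_like(b)
    for dy in (0, 1, 2):
        for dx in (0, 1, 2):
            out &= p[dy:dy + b.shape[0], dx:dx + b.shape[1]]
    return out

def dilate(b):
    """3x3 binary dilation with zero padding."""
    p = np.pad(b, 1)
    out = np.zeros_like(b)
    for dy in (0, 1, 2):
        for dx in (0, 1, 2):
            out |= p[dy:dy + b.shape[0], dx:dx + b.shape[1]]
    return out

def opening(b):
    """Morphological opening: erosion then dilation removes isolated specks
    smaller than the kernel while preserving larger blobs."""
    return dilate(erode(b))
```

An isolated noise pixel disappears after opening, while a 3x3 defect blob survives unchanged, which is exactly the cleanup applied to the binary images before BLOB analysis.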

4.4. Defects classification

Ellipse fitting, the DP algorithm and convexity theory are used to classify defects, all of which are easy to implement for real-time applications in manufacturing. In order to present the classification results

Table 3
Binary results of three kinds of defects by different image pre-processing filters.
(Image table: for each pre-processing filter (original image, median filter, Gaussian filter, WT sym5, WT haar, SWT sym5, SWT haar), the stain, scratch and edge crack defect images are shown together with their 3D gray-value plots, the resulting binary images and the images after the opening operation.)

of the algorithm, we employed different colors to draw the contours of the corresponding defects, as shown in Fig. 16. The yellow contour denotes the inner contour of the optical IR-CUT filter, which is used to compute the edge crack defects, as shown in Fig. 16(a). The defect inspection results for stain and scratch are illustrated in Fig. 16(b) and (c), respectively. The contour of a scratch is drawn in green, and the pink contour in the figure is the contour of the optical IR-CUT filter. However, there is no separate pink contour in Fig. 16(c) and (d), because no edge crack defect exists in them; instead, a purple contour is produced by the blend of pink and yellow.
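The a > 3b rule of Section 3.3 reduces to an aspect-ratio test on the fitted ellipse. In this sketch the axes are estimated from the second-order moments of the blob pixels rather than a least-squares ellipse fit (a surrogate we introduce for illustration; the 3:1 threshold is the paper's):

```python
import numpy as np

def classify_blob(points):
    """Scratch vs. stain via the a > 3b aspect-ratio rule, with the axis
    lengths estimated from the eigenvalues of the pixel covariance."""
    pts = np.asarray(points, dtype=float)
    cov = np.cov(pts.T)
    lam = np.sort(np.linalg.eigvalsh(cov))[::-1]   # lam[0] >= lam[1] >= 0
    a, b = np.sqrt(lam[0]), np.sqrt(lam[1])        # proportional to semi-axes
    return "scratch" if a > 3.0 * b else "stain"
```

A long thin blob (e.g. a 100x2 strip of pixels) is classified as a scratch, while a roughly round 10x10 blob is classified as a stain.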

5. Conclusion

An automatic optical IR-CUT filter defects inspection system based on machine vision is proposed in this paper, comprising an illumination and imaging module, a moving module, a flipping module and the machine vision inspection algorithm. An improved dark-field illumination technique is adopted in the imaging module. We choose two infrared light sources with different incident angles as

illuminants, exploiting the wavelength-selective character of the optical IR-CUT filter and its defects. Meanwhile, SWT and template matching are combined to localize the region of the optical IR-CUT filter. The proposed defect extraction scheme avoids a complicated learning process on a set of defective and non-defective training samples; it is easy to perform and computationally fast enough for a real-time inspection system. In addition, convexity theory is applied in the algorithm to classify edge crack defects. Experimental results on defect region images processed by different filters have shown that SWT with the haar wavelet basis is very effective for inspecting various subtle defects. We compared its performance with the algorithm without SWT and with the classification results of a skilled human worker. The experimental results show that the proposed method is a reasonable and practical solution for the optical IR-CUT filter inspection machine in terms of computation time and accuracy. The inspection of defects on the optical IR-CUT filter surface is a very important process to guarantee the image quality of all color imaging devices. The research and application of the system will greatly liberate the human workforce and inspire ideas for detecting defects of other small optical elements.


Fig. 16. Inspection results. (a) Crack defects, (b) scratch defects, (c) stain defects and (d) good optical IR-CUT filter. (For interpretation of the references to color in this figure, the reader is referred to the web version of this article.)

References

[1] Saitoh H, Ohnishi M. IR CUT FILTER. US Patent 2013/0094075; 2013.
[2] Gao H, Ding C, Song C, Mei J. Automated inspection of E-shaped magnetic core elements using K-tSL-center clustering and active shape models; 2013.
[3] Liu B, Wu S, Zou S. Automatic detection technology of surface defects on plastic products based on machine vision. In: 2010 International conference on mechanic automation and control engineering (MACE). IEEE; 2010. p. 2213–6.
[4] Xie Y, Ye Y, Zhang J, Liu L, Liu L. A physics-based defects model and inspection algorithm for automatic visual inspection. Opt Lasers Eng 2014;52:218–23.
[5] Ngan HYY, Pang GKH, Yung NHC. Automated fabric defect detection – a review. Image Vision Comput 2011;29:442–58.
[6] Lahajnar F, Bernard R, Pernuš F, Kovačič S. Machine vision system for inspecting electric plates. Comput Ind 2002;47:113–22.
[7] Wang M-JJ, Wu W-Y, Hsu C-C. Automated post bonding inspection by using machine vision techniques. Int J Prod Res 2002;40:2835–48.
[8] Ko S-S, Liu C-S, Lin Y-C. Optical inspection system with tunable exposure unit for micro-crack detection in solar wafers. Optik – Int J Light Electron Opt 2013.
[9] Cubero S, Aleixos N, Moltó E, Gómez-Sanchis J, Blasco J. Advances in machine vision applications for automatic inspection and quality evaluation of fruits and vegetables. Food Bioprocess Technol 2011;4:487–504.
[10] Xie X. A review of recent advances in surface defect detection using texture analysis techniques. Electron Lett Comput Vision Image Anal 2008;7:1–22.
[11] Sun T-H, Tseng C-C, Chen M-S. Electric contacts inspection using machine vision. Image Vision Comput 2010;28:890–901.
[12] Gan Y, Zhao Q. An effective defect inspection method for LCD using active contour model. IEEE Trans Instrum Meas 2013;62:2438–45.
[13] Tsai D-M, Tsai H-Y. Low-contrast surface inspection of mura defects in liquid crystal displays using optical flow-based motion analysis. Mach Vision Appl 2011;22:629–49.
[14] Yang M-D, Su T-C, Pan N-F, Liu P. Feature extraction of sewer pipe defects using wavelet transform and co-occurrence matrix. Int J Wavelets Multiresolut Inf Process 2011;9:211–25.
[15] Kuo C-F, Hsu C-TM, Fang C-H, Chao S-M, Lin Y-D. Automatic defect inspection system of colour filters using Taguchi-based neural network. Int J Prod Res 2013;51:1464–76.
[16] Li X. Wavelet transform for detection of partial fringe patterns induced by defects in nondestructive testing of holographic interferometry and electronic speckle pattern interferometry. Opt Eng 2000;39:2821–7.
[17] Rajshekhar G, Gorthi SS, Rastogi P. Estimation of displacement derivatives in digital holographic interferometry using a two-dimensional space-frequency distribution. Opt Express 2010;18:18041–6.
[18] Rajshekhar G, Gorthi SS, Rastogi P. Detection of defects from fringe patterns using a pseudo-Wigner–Ville distribution based method. Opt Lasers Eng 2012;50:1059–62.
[19] Kemao Q. Two-dimensional windowed Fourier transform for fringe pattern analysis: principles, applications and implementations. Opt Lasers Eng 2007;45:304–17.
[20] Qian K, Seah HS, Asundi A. Fault detection by interferometric fringe pattern analysis using windowed Fourier transform. Meas Sci Technol 2005;16:1582.
[21] Nason GP, Silverman BW. The stationary wavelet transform and some statistical applications. In: Wavelets and statistics. Springer; 1995. p. 281–99.
[22] Nazaryan N, Campana C, Moslehpour S, Shetty D. Application of a He–Ne infrared laser source for detection of geometrical dimensions of cracks and scratches on finished surfaces of metals. Opt Lasers Eng 2013.
[23] Valle M, Gallina P, Gasparetto A. Mirror synthesis in a mechatronic system for superficial defect detection. IEEE/ASME Trans Mechatron 2003;8:309–17.
[24] Rosati G, Boschetti G, Biondi A, Rossi A. Real-time defect detection on highly reflective curved surfaces. Opt Lasers Eng 2009;47:379–84.
[25] Jahr I. Lighting in machine vision. In: Handbook of machine vision. p. 73–203.
[26] Tian GY, Lu RS, Gledhill D. Surface measurement using active vision and light scattering. Opt Lasers Eng 2007;45:131–9.
[27] Rynne B, Sleeman B. The interior transmission problem and inverse scattering from inhomogeneous media. SIAM J Math Anal 1991;22:1755–62.
[28] Wolf KB, Krotzsch G. Geometry and dynamics in refracting systems. Eur J Phys 1995;16:14.
[29] Murphy K, Torralba A, Eaton D, Freeman W. Object detection and localization using local and global features. In: Toward category-level object recognition. Springer; 2006. p. 382–400.
[30] Brunelli R. Template matching techniques in computer vision: theory and practice. Wiley; 2009.
[31] Zhong S, Oyadiji SO. Crack detection in simply supported beams using stationary wavelet transform of modal data. Struct Control Health Monit 2011;18:169–90.
[32] Liu Y, Liu P, Yu F. Denoising of spectral signal in miniature spectrometer based on stationary wavelet transform. In: Symposium on photonics and optoelectronics (SOPO). IEEE; 2012. p. 1–4.
[33] Shensa MJ. The discrete wavelet transform: wedding the à trous and Mallat algorithms. IEEE Trans Signal Process 1992;40:2464–82.
[34] Donoho DL, Johnstone JM. Ideal spatial adaptation by wavelet shrinkage. Biometrika 1994;81:425–55.
[35] Donoho DL. De-noising by soft-thresholding. IEEE Trans Inf Theory 1995;41:613–27.
[36] Geng R-f, Cao L-l, Geng S, Lin P-y. Design and realization of robot-vision system. Electron Instrum Cust 2007;1:022.
[37] Ding L, Goshtasby A. On the Canny edge detector. Pattern Recognit 2001;34:721–5.
[38] Bradski G, Kaehler A. Learning OpenCV: computer vision with the OpenCV library. O'Reilly; 2008.
[39] Zhang Z. Parameter estimation techniques: a tutorial with application to conic fitting. Image Vision Comput 1997;15:59–76.
[40] Freeman H. On the classification of line-drawing data. In: Models for the perception of speech and visual form. Cambridge, MA: MIT Press; 1967. p. 408–12.
[41] Douglas DH, Peucker TK. Algorithms for the reduction of the number of points required to represent a digitized line or its caricature. Cartogr: Int J Geogr Inf Geovis 1973;10:112–22.
[42] Homma K, Takenaka E-i. An image processing method for feature extraction of space-occupying lesions. J Nucl Med 1985;26:1472–7.
[43] Brun C, Dufourd J-F, Magaud N. Designing and proving correct a convex hull algorithm with hypermaps in Coq. Comput Geom 2012;45:436–57.
[44] De Berg M, Cheong O, van Kreveld M, Overmars M. Computational geometry. Springer; 2008.
[45] Sklansky J, Gonzalez V. Fast polygonal approximation of digitized curves. Pattern Recognit 1980;12:327–31.
[46] Pedersen SJK. Circular Hough transform. Aalborg University, Vision, Graphics, and Interactive Systems; 2007.
[47] Mortazavi S, Shahrtash S. Comparing denoising performance of DWT, WPT, SWT and DT-CWT for partial discharge signals. In: Proceedings of the 43rd international universities power engineering conference (UPEC). IEEE; 2008. p. 1–6.