Optics and Lasers in Engineering 105 (2018) 141–149
Laser vision seam tracking system based on image processing and continuous convolution operator tracker

Yanbiao Zou*, Tao Chen

Institute of Mechanical and Automotive Engineering, South China University of Technology, Guangzhou, Guangdong 510640, China
Keywords: Seam tracking; Morphological image processing; CCOT object tracking algorithm; Line laser sensor; Welding robot; Weld feature point
Abstract: To address the problem of low welding precision caused by the poor real-time tracking performance of common welding robots, a novel seam tracking system with excellent real-time performance and high accuracy is designed based on the morphological image processing method and the continuous convolution operator tracker (CCOT) object tracking algorithm. The system consists of a six-axis welding robot, a line laser sensor, and an industrial computer. This work also studies the measurement principle of the designed system. Through the CCOT algorithm, the weld feature points are determined in real time from noisy images during the welding process, and the 3D coordinates of these points are obtained according to the measurement principle to control the movement of the robot and the torch in real time. Experimental results show that the sensor has a sampling frequency of 50 Hz and that the welding torch runs smoothly under strong arc light and splash interference. The tracking error is within ±0.2 mm, and the distance between the laser stripe and the welding molten pool can be reduced to 15 mm, which fully satisfies actual welding requirements.

© 2018 Elsevier Ltd. All rights reserved.
1. Introduction

With the development of automation, welding robots have been extensively used in industry and have become the main welding automation equipment. The development of automatic seam tracking technology to replace the traditional teach-and-playback working mode has accordingly become an inevitable trend in welding automation and intelligence, and a significant approach to improving welding production efficiency and quality by capturing welding signals and information to directly emulate a human welder [1,2]. The traditional way to improve welding quality is quality assessment or the addition of monitoring devices [3,4]. Seam tracking based on laser vision, however, combines the advantages of computer vision and laser 3D measurement and is more flexible and convenient than the traditional methods. It captures abundant information, yields evident weld characteristics, and possesses strong anti-interference capability. Therefore, this seam tracking technology is gradually being favored by practitioners.

In recent years, considerable research and product development have been conducted in laser vision seam tracking. The Beijing Sai Cheng Industrial Corporation developed a laser vision seam tracking system for welding spiral pipes with a real-time precision of ±0.5 mm. The British Meta Vision System Corporation developed the
Laser Probe Series laser vision seam tracker, which is suitable for a variety of weld types and exhibits an accuracy of up to ±0.2 mm.

During the welding process, the laser stripes and the molten pool maintain a certain distance between them because the laser vision sensor is ahead of the welding torch (Fig. 1). The shorter the distance, the higher the tracking accuracy; the distance is typically less than 30 mm. However, a short distance exposes the visual detection system to strong arc light and splash during image extraction, thereby reducing measurement accuracy and generating numerous erroneous data, particularly when the current exceeds 300 A [5]. The above seam tracking systems cannot work well under such conditions, so identifying the weld feature points in an image with strong noise interference is significant for rapidly obtaining an accurate location in real-time seam tracking. A schematic diagram of the seam tracking process is shown in Fig. 1.

Kawahara et al. designed a laser vision seam tracking system for V-type weld seams that eliminates noise through continuous operations on multiple images and then fits straight lines and their intersection to obtain weld seam points [6]. Jae et al. proposed a method that used texture analysis to improve the robustness of a structured light visual tracking system [7]: the original data of the laser stripes were extracted and processed using the texture analysis technique to detect feature points. These methods can resist only a limited degree of arc and splash interference. Wu and Smith designed a vision sensor using structured light as illumination; they designed a high-performance Transputer Image Processing
System (TIPS) to accelerate image processing speed [8]. Bae et al. used vision sensing technology to measure the groove gap; they then applied welding knowledge to establish fuzzy logic for the real-time control of welding speed [9]. Although accuracy was improved, these methods still cannot cope with the aforementioned harsh conditions. Yang et al. proposed a method that combined a CCD camera with weld detection and a process control method based on the adaptive Hough transform; their method could extract feature points from laser stripes in real time [10]. However, this method requires neural networks to obtain the corresponding welding parameters, making the process cumbersome. Moreover, it only analyzes straight-line weld seams, and the sensor measurement frequency is relatively low. Xu proposed a real-time tracking and control technique for 3D weld seams during robotic gas tungsten arc welding based on a vision sensor and an arc sensor [11]. The tracking error for all types of weld seam can be controlled within ±0.4 mm, but the sampling frequency of the system is only 2 Hz, which is too low for welding processes with high real-time requirements.

In this paper, a seam tracking system based on the morphological image processing method and the continuous convolution operator tracker (CCOT) object tracking algorithm [12] is proposed. This algorithm can track targets and feature points in image sequences or videos with high accuracy and robustness. Classical target tracking algorithms, such as the optical flow (Lucas-Kanade) method, use the correlation between adjacent frames to find the optical flow pattern between them [13]. Dalal et al. first proposed the histograms of oriented gradients (HOG) method to extract image characteristics [14]. Experiments show that the HOG descriptor performs well and can extract considerable information; combined with a support vector machine, it was used to realize pedestrian detection. Zhang et al. proposed a compressive tracking (CT) algorithm, a real-time target tracking algorithm based on linear stochastic measurement [15]. Kalal et al. proposed the tracking-learning-detection (TLD) algorithm, which combines detection and tracking with online learning [16]. This algorithm consists of a tracker, a detector, and a learning module that treats the tracking problem as an online learning classification problem. However, these algorithms cannot resist the strong interference in our tracking process, as we will show later through a comparative experiment. Henriques et al. proposed the kernelized correlation filter (KCF) algorithm, which applied filter methods from signal processing to object tracking; this approach converts loss functions into the Fourier domain to significantly increase calculation speed [17]. To improve tracking accuracy, Galoogahi et al. proposed a multi-channel method for filter algorithms [18]. The CCOT algorithm used in this study is also a filter algorithm, but its efficiency, accuracy, and robustness are considerably improved compared with those of traditional filter algorithms. Considering its superior ability to resist interference while maintaining high accuracy compared with other algorithms, we designed this seam tracking system combining the morphological image processing method and the continuous convolution operator tracker (CCOT). We use HOG to extract multi-resolution features instead of CNN features because of the real-time requirements of our system.

Fig. 1. The schematic diagram of the weld seam tracking (camera, torch, laser, welding direction, welding pool, laser beam, and stand-off distance d).

2. Experimental platform

The hardware of the platform consists of an execution mechanism with six degrees of freedom, welding equipment, a structured light vision sensor, and an industrial control computer. The complete seam tracking system is shown in Fig. 2. The three-stripe laser vision sensor consists of a three-stripe laser generator and a camera mounted in front of the torch for obtaining weld information. During the welding process, an image with weld seam profile information is captured by the camera and transmitted to the computer via Ethernet. The computer image processing module is based on the Visual Studio 2015 platform and the HALCON machine vision library. Accurate weld feature points are obtained by the image processing module, and the control signal is then determined for real-time control of the torch movement, which achieves accurate welding seam tracking.

Bao et al. carried out a detailed analysis of various arc spectra [19]. The arc spectrum of pulsed metal inert gas welding shows that the arc intensity is relatively weak in the neighborhood of 450 nm and in the 610-700 nm and 850-1000 nm bands. In practice, a visible laser in the 400-760 nm range is usually taken as the light source for easy mounting and debugging. Therefore, a sensor that uses a 660 nm laser generator and an optical filter with a central wavelength of 660 nm helps reduce the interference of arc light. The optical specifications of the laser used in this study are listed in Table 1. The camera resolution is 1024 × 1280, and the sampling rate is 50 fps. The motion execution mechanism is a YASKAWA welding robot (model MA1440), which has six degrees of freedom and a repeat positioning accuracy of ±0.08 mm. The welding equipment adopts the MOTOWELD-RD350 welding system, with a maximum welding current of 350 A. The image processing module is an IPC-510 embedded industrial computer with a 3.4 GHz Intel i7-3770 quad-core processor and 12 GB RAM.
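For illustration, a minimal image-acquisition loop is sketched below in Python with OpenCV. This is an assumption for demonstration only: the actual system grabs frames through the HALCON library over Ethernet, and the capture device index here is hypothetical.

```python
import cv2

# Hypothetical OpenCV-visible source; the real system acquires via HALCON.
cap = cv2.VideoCapture(0)
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1280)   # sensor resolution is 1024 x 1280
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 1024)
cap.set(cv2.CAP_PROP_FPS, 50)             # 50 fps sampling rate, as in the paper

while True:
    ok, frame = cap.read()                # one weld-profile image per cycle
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # ... feature point extraction and control-signal computation go here ...
cap.release()
```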
3. Principle of the structured light vision system

3.1. Basic measurement principle

The 3D measurement principle of the structured light vision system is shown in Fig. 3. A dot laser beam generated by a semiconductor laser is formatted by a Powell prism into three highly linear, stable, and uniform-density laser lines, called line structured light. The light is projected onto the welding workpiece, forming three laser stripes whose shapes change with the seam on the surface of the workpiece. The middle stripe is utilized as the measurement stripe to facilitate feature point extraction. A fixed angle exists between the CMOS camera and the laser plane; thus, a point of the laser stripe on the image contains not only the position in the plane but also its depth.

3.2. Mathematical model of the 3D measurement of the structured light vision

The perspective imaging process of the laser stripe is shown in Fig. 4. Planes Π1 and Π2 represent the measurement and imaging planes, respectively. The 3D coordinates OC XC YC ZC denote the camera coordinate system, the 2D coordinates OI XI YI denote the imaging coordinate system, and OP XP YP denotes the pixel coordinate system. The straight line through OC and OI is the optic axis of the camera, and f is the equivalent focal length. A laser line is formed by the intersection of the laser and measurement planes. Let P be a point on this laser line with coordinates (x, y, z) in the camera coordinate system. P′ is the image of P, and its coordinates in the OI XI YI coordinate system are (u, v).
Fig. 2. Seam tracking schematic (welding robot carrying the CMOS camera and laser for image acquisition and welding; welding power supply; industrial computer for image processing, connected via Ethernet; robot cabinet with Beckhoff modules carrying the control signal).

Table 1. Laser parameters.
Diode power: 100 mW; Wavelength: 660 nm; Intensity: uniform lengthwise; Distribution: Gaussian widthwise; Fan angle: 30°; Line uniformity: ±5%; Bore sighting: <3 mrad.

Fig. 3. Schematic diagram of measurement (camera, laser, Powell prism, optical filter, workpiece, seam, and laser stripes).
On the basis of the perspective projection principle, the transformation between (x, y, z) and (u, v) is established via homogeneous coordinates as follows:

$$ s \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1/f & 0 \end{bmatrix} \begin{bmatrix} x \\ y \\ z \\ 1 \end{bmatrix} \tag{1} $$

where s is a nonzero scale factor and the 3 × 4 matrix is the perspective transformation matrix.

Fig. 4. Perspective projection model.

Geometric distortions occur during the transformation from imaging coordinates to pixel coordinates. We correct the image using the Lenz distortion model (proposed by Lenz and Fritsch in 1990) [20]; the relation between the real imaging coordinates (ud, vd) and the theoretical coordinates (u, v) is given as follows:

$$ \begin{bmatrix} u_d \\ v_d \end{bmatrix} = \frac{2}{1 + \sqrt{1 - 4K(u^2 + v^2)}} \begin{bmatrix} u \\ v \end{bmatrix} \tag{2} $$

where K is the distortion coefficient. Imaging uses pixel coordinates (c, r), which represent the column and row numbers of pixel points in the CMOS array, unlike imaging coordinates, which represent the positions of points. The transformation between these two coordinate systems is established via homogeneous coordinates as follows:
$$ \begin{bmatrix} c \\ r \\ 1 \end{bmatrix} = \begin{bmatrix} 1/S_x & 0 & C_x \\ 0 & 1/S_y & C_y \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} u_d \\ v_d \\ 1 \end{bmatrix} \tag{3} $$

where Sx and Sy represent the distances between two adjacent photosensitive elements in the horizontal and vertical directions, respectively, and (Cx, Cy) represents the pixel coordinates of the intersection of the optical axis and the photosensitive chip. A constraint equation, the laser plane of Eq. (4), is added to establish the exact mapping relation between the 2D pixel coordinates (c, r) and the 3D camera coordinates (x, y, z):

$$ Ax + By + Cz - D = 0 \tag{4} $$

The following equations can be derived from Eqs. (1)-(4):

$$ \begin{cases} x = \dfrac{D\,(S_x c - C_x S_x)}{A(S_x c - C_x S_x) + B(S_y r - C_y S_y) + C f \xi} \\[2ex] y = \dfrac{D\,(S_y r - C_y S_y)}{A(S_x c - C_x S_x) + B(S_y r - C_y S_y) + C f \xi} \\[2ex] z = \dfrac{D f \xi}{A(S_x c - C_x S_x) + B(S_y r - C_y S_y) + C f \xi} \end{cases} \tag{5} $$

where $\xi = K C_x^2 S_x^2 - 2K C_x S_x^2 c + K C_y^2 S_y^2 - 2K C_y S_y^2 r + K S_x^2 c^2 + K S_y^2 r^2 + 1$. The parameters (Sx, Sy, f, K, Cx, Cy) are collectively referred to as the internal parameters of the camera, and (A, B, C, D) as the parameters of the laser plane. Therefore, the transformation between the 2D pixel coordinates and the 3D camera coordinates is achieved. In this study, the internal parameters of the camera are determined using the calibration method based on the HALCON machine vision library. The laser plane is fitted by orthogonal regression of the geometric distance on m non-coincident laser stripes to obtain the laser plane parameters. The mapping between the coordinates of the feature points in the world coordinate system and those in the camera coordinate system is then given by Eq. (6) through eye-in-hand calibration. P1 is obtained directly from the axis encoders, whereas P2 is the matrix obtained via eye-in-hand calibration. Therefore, the actual coordinates of the weld feature points can be determined from the acquired timing images.

$$ \begin{bmatrix} x_w \\ y_w \\ z_w \\ 1 \end{bmatrix} = \mathbf{P}_1 \mathbf{P}_2 \begin{bmatrix} x \\ y \\ z \\ 1 \end{bmatrix} \tag{6} $$
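For concreteness, the following is a minimal Python sketch of the pixel-to-3D mapping of Eq. (5) and the world-coordinate transformation of Eq. (6). All numerical values (intrinsics, laser-plane parameters, robot poses) are illustrative placeholders, not the calibrated values of the actual system.

```python
import numpy as np

def pixel_to_camera(c, r, Sx, Sy, f, K, Cx, Cy, plane):
    """Map a pixel (c, r) on the laser stripe to camera coordinates via Eq. (5)."""
    A, B, C, D = plane                    # laser-plane parameters, Eq. (4)
    ud = Sx * c - Cx * Sx                 # real imaging coordinates, inverse of Eq. (3)
    vd = Sy * r - Cy * Sy
    xi = K * (ud**2 + vd**2) + 1.0        # distortion factor, cf. Eq. (2)
    denom = A * ud + B * vd + C * f * xi
    return np.array([D * ud, D * vd, D * f * xi]) / denom

# Placeholder intrinsics (mm and mm/pixel) and laser plane:
p_cam = pixel_to_camera(c=640, r=512, Sx=5.3e-3, Sy=5.3e-3, f=12.0,
                        K=-1e-4, Cx=640.0, Cy=512.0, plane=(0.0, 0.5, 0.87, 200.0))

# Eq. (6): with P1 (robot pose from the encoders) and P2 (eye-in-hand
# calibration), both 4x4 homogeneous matrices, the world coordinates are
P1 = np.eye(4); P2 = np.eye(4)            # identity placeholders
p_world = P1 @ P2 @ np.append(p_cam, 1.0)
```

With calibrated values for (Sx, Sy, f, K, Cx, Cy) and (A, B, C, D), this is the complete chain from an extracted feature pixel to the coordinates that drive the robot.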
4. Seam feature point detection system

The traditional seam feature point extraction method is usually based on morphological image processing. Each image captured by the camera is processed through the following steps: median filtering, contrast enhancement, threshold segmentation, morphological reconstruction, region-of-interest extraction, centerline extraction, and feature point extraction. This method can correctly find the feature point when the spatter in the image is minimal. However, the welding image can be seriously polluted when the distance between the welding pool and the laser beam is reduced to 15 mm. In such a situation, the correct seam feature point and laser stripe cannot be extracted using the aforementioned traditional method, as shown in Fig. 7(a). To resolve this issue, a seam feature point detection algorithm combining the morphological image processing method and the CCOT algorithm is adopted.
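For reference, below is a minimal sketch of the traditional morphological pipeline listed above, written with OpenCV. The kernel sizes, threshold choice, and the kink-based feature point rule are illustrative assumptions rather than the tuned parameters of the original system.

```python
import cv2
import numpy as np

def traditional_feature_point(gray):
    """Locate a seam feature point with the classical pipeline (gray: 8-bit image)."""
    img = cv2.medianBlur(gray, 5)                      # median filtering
    img = cv2.equalizeHist(img)                        # contrast enhancement
    _, mask = cv2.threshold(img, 0, 255,               # threshold segmentation
                            cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    k = cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, k)   # clean small spatter blobs
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, k)  # bridge gaps in the stripe

    # Centerline extraction: mean stripe row per column (assumes a roughly
    # horizontal stripe).
    xs, ys = [], []
    for col in range(mask.shape[1]):
        rows = np.flatnonzero(mask[:, col])
        if rows.size:
            xs.append(col)
            ys.append(rows.mean())
    xs, ys = np.asarray(xs, float), np.asarray(ys, float)
    if xs.size < 2:
        return None

    # Feature point: largest deviation of the centerline from the chord
    # joining its endpoints (the "kink" where the stripe crosses the seam).
    dx, dy = xs[-1] - xs[0], ys[-1] - ys[0]
    dist = np.abs(dy * (xs - xs[0]) - dx * (ys - ys[0])) / np.hypot(dx, dy)
    i = int(np.argmax(dist))
    return xs[i], ys[i]
```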
4.1. Algorithm principle

A traditional morphological image processing method comprising the above steps is used to determine the location of the weld feature point in the first, noise-free frame. A rectangle whose center is the feature point is drawn to represent the tracking target in this frame. Because there is no noise interference, the seam feature point can be located easily and accurately, as shown in the first image of Fig. 7(a). However, when welding begins, strong arc light and splash appear in the images acquired by the real-time tracking system, which reduces measurement accuracy and produces a large amount of erroneous data. In this case, the morphological method may still process an individual image; however, automatically analyzing continuously acquired welding images with different noise distributions, without manual intervention, is frequently difficult, and splash is easily mistaken for the laser stripe. Thus, the anti-interference capability is weak, which results in low accuracy. Therefore, the CCOT algorithm is introduced. The effect of noise on the weld location is weakened to a large extent during the implementation of the algorithm; the anti-interference capability is strong, and the accuracy is significantly improved compared with that of the traditional method.

The CCOT algorithm is derived from traditional discriminative correlation filters (DCFs). However, CCOT applies an implicit interpolation model to pose the learning problem in a continuous spatial domain and obtain a set of continuous filters. The advantages of the CCOT algorithm over the DCF algorithm are as follows. (1) The CCOT algorithm allows natural integration of multi-resolution feature maps, which is particularly desirable for object tracking and detection; in this study, multi-resolution HOG features are used to ensure that the sampling rate is sufficiently high for the seam tracking system. (2) The continuous-domain learning formulation enables accurate sub-pixel localization, achieved by labeling the training samples with sub-pixel-precise continuous confidence maps. Therefore, the CCOT algorithm is suitable for accurate feature point tracking and, consequently, for the designed seam tracking system. In [12], convolutional neural networks (CNNs) are used to extract features, so the algorithm runs relatively slowly. Here, we use HOG features to maintain real-time performance while keeping the accuracy.

Fig. 5. The flowchart of CCOT.

The working principle of the CCOT algorithm is shown in Fig. 5: CCOT is similar to the DCF algorithm, but it utilizes multi-resolution feature maps and is formulated in the continuous domain. The objective is to train continuous convolution filters based on training samples xj, which consist of feature maps extracted from image patches using HOG. Each sample xj contains D feature channels x_j^1, ..., x_j^D extracted from the same image patch. Unlike conventional DCF formulations, CCOT allows the feature channels to have different spatial resolutions, i.e., different numbers of spatial sample points. Let Nd denote the number of spatial samples in x_j^d. The sample space can then be expressed as χ = ℝ^{N_1} × ... × ℝ^{N_D}. To pose the learning problem in the continuous spatial domain, an implicit interpolation model for the training samples is used:

$$ J_d\{x^d\}(t) = \sum_{n=0}^{N_d - 1} x^d[n] \, b_d\!\left(t - \frac{T}{N_d} n\right) \tag{7} $$

where T denotes the length of the continuous interval [0, T) ⊂ ℝ that is the spatial support of the feature map. For each channel, Eq. (7) performs the transformation from the discrete space ℝ^{N_d} to the continuous space L²(T), and b_d ∈ L²(T) denotes the interpolation function. To use all the feature channels, the objective is to learn a linear convolution operator S_f : χ → L²(T) that maps a sample to a target confidence function s(t) = S_f{x}(t), where s(t) ∈ ℝ, t ∈ [0, T). The target is then localized by maximizing the confidence scores in an image region. The operator is parameterized by a set of convolution filters f = (f^1, ..., f^D) ∈ L²(T)^D, where f^d ∈ L²(T) is the continuous filter for feature channel d. We define the convolution operator as

$$ S_f\{x\} = \sum_{d=1}^{D} f^d * J_d\{x^d\}, \quad x \in \chi \tag{8} $$

Unlike the conventional DCF method, the samples xj ∈ χ are labeled by confidence functions yj ∈ L²(T) defined in the continuous spatial domain. Given a set of m training sample pairs {(xj, yj)}₁^m ⊂ χ × L²(T), the filter is trained by minimizing the following function:

$$ E(f) = \sum_{j=1}^{m} \alpha_j \left\| S_f\{x_j\} - y_j \right\|^2 + \sum_{d=1}^{D} \left\| w f^d \right\|^2 \tag{9} $$

where the weights αj ≥ 0 control the impact of each training sample. The second term is a spatial regularization term determined by the penalty function w [21]. This term enables the filter to be learned on arbitrarily large image regions by controlling the spatial extent of the filter: spatial regions that typically correspond to background features are assigned large penalties in w, whereas the target region has small penalty values.

To train the filter, Eq. (9) is minimized in the Fourier domain. By the results of Fourier analysis, the Fourier coefficients of the interpolated feature map are given by $\widehat{J_d\{x^d\}}[k] = X^d[k]\, \hat{b}_d[k]$, where $X^d[k] = \sum_{n=0}^{N_d-1} x^d[n]\, e^{-i \frac{2\pi}{N_d} n k}$, k ∈ ℤ, is the discrete Fourier transform of x^d. The Fourier coefficients of the output confidence function are derived using the linearity and convolution properties as follows:

$$ \hat{S}_f\{x\}[k] = \sum_{d=1}^{D} \hat{f}^d[k]\, X^d[k]\, \hat{b}_d[k], \quad k \in \mathbb{Z} \tag{10} $$

We obtain the following using Parseval's formula:

$$ E(f) = \sum_{j=1}^{m} \alpha_j \left\| \sum_{d=1}^{D} \hat{f}^d X_j^d \hat{b}_d - \hat{y}_j \right\|_{\ell^2}^2 + \sum_{d=1}^{D} \left\| \hat{w} * \hat{f}^d \right\|_{\ell^2}^2 \tag{11} $$

The function can be equivalently minimized with respect to the Fourier coefficients of each filter. For practical purposes, the filter should be represented by a finite set of parameters. CCOT therefore minimizes Eq. (11) over the finite-dimensional subspace $V = \mathrm{span}\{e_k\}_{-K_1}^{K_1} \times \dots \times \mathrm{span}\{e_k\}_{-K_D}^{K_D} \subset L^2(T)^D$. Wd is further defined as the (2Kd + 2L + 1) × (2Kd + 1) Toeplitz matrix corresponding to the convolution operation, $W_d \hat{\mathbf{f}}^d = \mathrm{vec}(\hat{w} * \hat{f}^d)$. Finally, let W be the block-diagonal matrix $W = W_1 \oplus \dots \oplus W_D$. The minimization of Eq. (11) subject to f ∈ V is then equivalent to the following least squares problem:

$$ E_V(\hat{\mathbf{f}}) = \sum_{j=1}^{m} \alpha_j \left\| A_j \hat{\mathbf{f}} - \hat{\mathbf{y}}_j \right\|_2^2 + \left\| W \hat{\mathbf{f}} \right\|_2^2 \tag{12} $$

where the matrix $A_j = [A_j^1 \dots A_j^D]$. The sample matrix $A = [A_1^T \dots A_m^T]^T$, the diagonal weight matrix $\Gamma = \alpha_1 I \oplus \dots \oplus \alpha_m I$, and the label vector $\hat{\mathbf{y}} = [\hat{\mathbf{y}}_1^T \dots \hat{\mathbf{y}}_m^T]^T$ are then defined. The minimizer of Eq. (12) is found by solving the normal equations:

$$ \left( A^H \Gamma A + W^H W \right) \hat{\mathbf{f}} = A^H \Gamma \hat{\mathbf{y}} \tag{13} $$

where H denotes the conjugate transpose of a matrix. The derivation is described in the 1D case for simplicity; for tracking applications, the 2D case is considered: the space L²(T) is generalized to L²(T1, T2), and the separable interpolation function b(t1, t2) = b(t1)b(t2) is applied.

Fig. 6. The flowchart of the seam tracking system based on CCOT.
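As an illustration of Eq. (13), the sketch below builds a small synthetic system and solves the normal equations with the conjugate gradient method, which is how the filter is trained in Section 4.2. The problem dimensions and the diagonal stand-in for W are assumptions; in CCOT the left-hand side is never formed explicitly, and the matrix-vector products are evaluated in the Fourier domain.

```python
import numpy as np
from scipy.sparse.linalg import cg

rng = np.random.default_rng(0)
n, m = 64, 10                                                # filter coefficients, samples
A = rng.normal(size=(m, n)) + 1j * rng.normal(size=(m, n))   # sample matrix (toy)
Gamma = np.diag(np.full(m, 1.0 / m))                         # sample weights, summing to 1
W = np.diag(rng.uniform(0.1, 1.0, n))                        # diagonal stand-in regularizer
y_hat = rng.normal(size=m) + 1j * rng.normal(size=m)         # label Fourier coefficients

# Normal equations (13): (A^H Gamma A + W^H W) f_hat = A^H Gamma y_hat.
# The left-hand side is Hermitian positive definite, so CG applies.
lhs = A.conj().T @ Gamma @ A + W.conj().T @ W
rhs = A.conj().T @ Gamma @ y_hat
f_hat, info = cg(lhs, rhs, maxiter=100)
assert info == 0, "CG did not converge"
```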
Fig. 7. The comparison of three tracking algorithms: (a) Traditional method (b) TLD (c) CCOT.
4.2. Extraction of weld seam points during welding
In our experiment, a three-line laser is used to provide rich information for determining the weld seam points. The three-line laser maintains robustness when the HOG features of the stripes are interrupted by arc and splash, which is valuable for accurately extracting the position of the weld feature point. In the first frame, the seam feature point is located through the traditional morphological image processing method. When welding starts, the CCOT tracking algorithm is used to track the target and determine the weld feature points in images with heavy noise, such as arc and splash. The process of the presented weld feature extraction algorithm is shown in Fig. 6; the seam feature points can be located even in a harsh welding environment.

The continuous convolution filter is trained by iteratively solving Eq. (13). The conjugate gradient (CG) method is adopted because of its computational efficiency. In the first frame, 100 iterations are used to find an initial estimate of the filter coefficients because of the lack of samples. Subsequently, 5 iterations per frame are sufficient when CG is initialized with the current filter. During tracking, the trained filter is used to localize the target in the new frame, and a new sample is added to update the filter coefficients. The corresponding importance weight is set using the learning parameter λ = 0.0075, with the new weight αj = αj−1/(1 − λ); the weights are then normalized such that Σj αj = 1. A maximum of 400 samples is stored; when the limit is reached, the sample with the smallest weight is replaced.

To further illustrate the anti-interference ability of the CCOT weld extraction algorithm, noisy images taken from the experiment were tested using the morphological image processing algorithm, the TLD tracking algorithm, and the CCOT algorithm. The test results are presented in Fig. 7, where the rectangle represents the target tracked by the tracking algorithms and the red intersecting line segments in the middle of the rectangle denote the extracted feature point. The morphological image processing algorithm can locate the weld feature point accurately only in a noiseless image. The TLD target tracking algorithm is less robust and can only deal with images with small noise; the target cannot be found when arc and splash noise is heavy. The CCOT target tracking algorithm, however, can still accurately locate the target. Therefore, the problem of seam tracking under considerable noise pollution can be solved by combining the morphological processing algorithm and the CCOT object tracking algorithm.
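A minimal sketch of the sample-memory bookkeeping described above follows, with the stated learning parameter λ = 0.0075 and the 400-sample cap. The sample object itself is left abstract, and the update rule is our reading of the weight recursion.

```python
LAM = 0.0075          # learning parameter from the paper
MAX_SAMPLES = 400     # maximum number of stored training samples

samples: list = []    # feature-map samples x_j (left abstract here)
weights: list = []    # importance weights alpha_j

def add_training_sample(x):
    """Add a new sample, decay the old weights, and keep at most 400 samples."""
    global weights
    # New weight alpha_j = alpha_{j-1} / (1 - lambda), so newer samples count
    # more; the very first sample starts at weight 1.
    w_new = weights[-1] / (1.0 - LAM) if weights else 1.0
    if len(samples) >= MAX_SAMPLES:
        k = min(range(len(weights)), key=weights.__getitem__)
        samples.pop(k)                    # replace the smallest-weight sample
        weights.pop(k)
    samples.append(x)
    weights.append(w_new)
    total = sum(weights)
    weights = [w / total for w in weights]   # normalize so the weights sum to 1
```

In the full tracker, each call would be followed by a few CG iterations (100 in the first frame, 5 per subsequent frame) to refresh the filter coefficients.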
5. Experimental research and analysis

To verify the performance of the designed seam tracking system, an experimental platform was built; its physical map is provided in Fig. 10, and the welding parameters are listed in Table 2. The experiments were performed on two types of weld seam, lap weld seams and butt weld seams, as shown in Fig. 8. The lap weld seam trajectories are an arc and a curve, as shown in Fig. 9(a) and (b), and the butt weld seam trajectory is an arc, as shown in Fig. 9(c); these represent industrial working situations well.
Table 2. Welding parameters.
Welding method: pulse MIG welding; Welding voltage: 24.0 V; Welding current: 130 A; Welding speed: 6 mm/s; Wire diameter: 1.0 mm; Welding material: Q235.
Fig. 8. Feature points of two types of seam: (a) Lap weld seam (b) Butt weld seam.
Fig. 9. Welding workpieces.
Fig. 10. Physical diagram of experimental platform.
A teaching trajectory is established before the experiment, such that the robot can move along the path without tracking. The workpiece is then manually moved by a certain angle and distance, and the trajectory is recorded; this standard position data denotes the theoretical trajectory. In the experiment, the real-time tracking position is compared with the teaching trajectory to obtain a deviation value. This value is multiplied by a proportional factor to obtain the required real-time correction, and the torch is then controlled to move to the measurement point. The robot control platform is based on the trajectory correction function for common sensors in the YASKAWA DX100 controller. The intelligent sensor module of the IPC control cabinet is bridged by an EK1100 coupler (Beckhoff, Germany), such that the control output is converted into analog signals (i.e., voltages) within a certain interval to drive the robot movement in real time. The experimental analysis of the arc and curve lap welds is shown in Figs. 11 and 12.
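The correction step described above amounts to a proportional controller whose output is written to the robot as an analog voltage. A minimal sketch follows; the gain, the voltage range, and the write_voltage output function are illustrative assumptions, since the actual factor and Beckhoff I/O mapping are not specified here.

```python
KP = 0.8            # proportional factor (illustrative)
V_MAX = 10.0        # analog output limit of the coupler, in volts (assumed)

def correction_voltage(tracked_mm: float, taught_mm: float) -> float:
    """Convert the tracked-vs-taught deviation into a clamped drive voltage."""
    deviation = tracked_mm - taught_mm    # real-time deviation, in mm
    return max(-V_MAX, min(V_MAX, KP * deviation))

# Per frame: write_voltage(correction_voltage(y_tracked, y_taught))  # hypothetical I/O
```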
Fig. 11. Coordinates of the torch on the X-O-Y plane (workpiece a): (a) Trajectory (b) residual error.
Fig. 12. Coordinates of the torch on the X-O-Y plane (workpiece b): (a) Trajectory (b) residual error.
Fig. 13. Coordinates of the torch on the X-O-Y plane (workpiece c): (a) Trajectory (b) residual error.
The average absolute tracking errors of the lap weld experiments are 0.0400 mm and 0.1286 mm, and the error ranges are 0.2509 mm and 0.4859 mm, which satisfy actual welding demands. The average absolute tracking error for the arc butt weld is 0.1884 mm with an error range of 0.603 mm; the experimental analysis is shown in Fig. 13. The welding results of the experiments are shown in Fig. 14. The welding position is accurate and the welding process is good, thereby demonstrating the superior performance of the proposed weld tracking system.

Fig. 14. Seam tracking result.

6. Conclusions
The following conclusions can be drawn from this study.

(1) A six-axis robot arm welding seam tracking experimental platform based on the HALCON machine vision library is designed to resolve the seam tracking issue. The control system includes a real-time motion control module, a control cabinet, and an upper computer software module based on Microsoft Visual Studio 2015. The control cabinet mainly controls the motion of the six-axis robot arm platform.
The upper computer software module mainly performs image acquisition, feature point extraction, 3D coordinate calculation, and human–computer interaction.

(2) A structured light sensor system is designed to fulfill the requirements of the vision and detection components. The conversion from the 2D coordinates in the image coordinate system to the 3D coordinates in the base coordinate system is derived. The experimental results show that the measurement accuracy of the sensor reaches ±0.2 mm at a frequency of 50 Hz, which fulfills the requirements of welding seam detection.

(3) Feature point extraction is investigated. A seam tracking system based on the morphological image processing method and the CCOT algorithm is proposed to detect seam feature points and resolve the issue of strong arc and splash interference. Comparative experiments with the morphology-based method and the TLD object tracking algorithm show that the CCOT algorithm is highly robust and works stably in an environment with strong arc and spatter interference.
Acknowledgement

The authors would like to thank the anonymous reviewers for their constructive comments. This paper is supported by the National Science and Technology Major Project (grant No. 2015ZX04005006-03).

References

[1] Li X, Wu C, Li W. Study on the progress of welding science and technology in China. J Mech Eng 2012;48(6):19–31.
[2] Wang Q, Chen Z, Yu Q. Current development of welding robot in China. Mod Compon 2013;3:77–8.
[3] Stavridis J, Papacharalampopoulos A, Stavropoulos P. Quality assessment in laser welding: a critical review. Int J Adv Manuf Technol 2017:1–23.
[4] Leo M, Del Coco M, Carcagnì P, Spagnolo P, Mazzeo PL, Distante C, Zecca R. Automatic visual monitoring of welding procedure in stainless steel kegs. Opt Lasers Eng 2017.
[5] Yang L. Study on seam laser tracking system based on image recognition. Jinan: Shandong University; 2006.
[6] Kawahara M. Tracking control system using image sensor for arc welding. Automatica 1983(4):22–6.
[7] Jae SK, Young TS, et al. A robust method for vision-based seam tracking in robotic arc welding. In: Proceedings of the tenth international symposium on intelligent control; 1995. p. 363–8.
[8] Wu J, Smith JS, Lucas J. Weld bead placement system for multipass welding [using transputer-based laser triangulation vision system]. IEE Proc Sci Meas Technol 1996;143(2):85–90.
[9] Bae KY, Lee TH, Ahn KC. An optical sensing system for seam tracking and weld pool control in gas metal arc welding of steel pipe. J Mater Process Technol 2002;120(1–3):458–65.
[10] Yang S-M, Cho M-H, Lee H-Y, Cho T-D. Weld line detection and process control for welding automation. Meas Sci Technol 2007;18(3):819–26.
[11] Xu YL. Research on real-time tracking and control technology of three-dimension welding seam during welding robot GTAW process based on vision sensor and arc sensor. Shanghai: Shanghai Jiao Tong University; 2013.
[12] Danelljan M, Robinson A, Shahbaz Khan F, Felsberg M. Beyond correlation filters: learning continuous convolution operators for visual tracking. In: Computer vision – ECCV 2016. Springer; 2016. p. 472–88.
[13] Lucas BD, Kanade T. An iterative image registration technique with an application to stereo vision. In: Proceedings of the seventh international joint conference on artificial intelligence; 1981. p. 674–9.
[14] Dalal N, Triggs B. Histograms of oriented gradients for human detection. In: IEEE computer society conference on computer vision and pattern recognition, vol. 1. IEEE; 2005. p. 886–93.
[15] Zhang K, Zhang L, Yang M-H. Real-time compressive tracking. In: Proceedings of the European conference on computer vision. Florence, Italy: Springer; 2012. p. 866–79.
[16] Kalal Z, Mikolajczyk K, Matas J. Tracking-learning-detection. IEEE Trans Pattern Anal Mach Intell 2012;34(7):1409–22.
[17] Henriques JF, Caseiro R, Martins P, Batista J. High-speed tracking with kernelized correlation filters. IEEE Trans Pattern Anal Mach Intell 2015;37(3):583–96.
[18] Galoogahi HK, Sim T, Lucey S. Multi-channel correlation filters. In: IEEE international conference on computer vision (ICCV); 2013.
[19] Bao S-d, Zhang K, Wu Y-x. A detailed analysis of welding arc spectrum distribution characteristics to choose light sources of laser sensors. J Optoelectron Laser 2009;04:504–8.
[20] Lenz R, Fritsch D. Accuracy of videometry with CCD sensors. ISPRS J Photogramm Remote Sens 1990;45:90–110.
[21] Danelljan M, Häger G, Khan FS, Felsberg M. Learning spatially regularized correlation filters for visual tracking. In: IEEE international conference on computer vision. IEEE Computer Society; 2015. p. 4310–18.