Measurement 149 (2020) 106987
Real-time spatial intersecting seam tracking based on laser vision stereo sensor

Zhenwei Jia a,b, Tianqi Wang a,b,*, Junjie He a,b, Liangyu Li a,b, Kaixuan Wu a,b

a School of Mechanical Engineering, Tianjin Polytechnic University, Tianjin 300387, China
b Tianjin Key Laboratory of Advanced Mechatronics Equipment Technology, Tianjin 300387, China

* Corresponding author. E-mail address: [email protected] (T. Wang).
Article history: Received 13 June 2019; Received in revised form 13 August 2019; Accepted 23 August 2019; Available online 30 August 2019.

Keywords: Intersecting seam; Real-time tracking; Laser vision sensor; Spatio-Temporal Context; Kalman filter; Trajectory planning
Abstract: A real-time robotic weld tracking system based on a laser vision sensor is designed for intersecting seams. First, a traditional image processing method is used to determine the weld feature point in the first frame, when there is no arc noise. Then, an extending and adopting Kalman Filter with Spatio-Temporal Context algorithm is proposed to extract the weld feature point when the laser stripe is blocked by heavy arc and splash noise during welding. Next, in order to control the torch automatically, the torch frame is established through a novel three-point principle and an expanding circle and arc length method. In addition, a segmentation weld method and an intermediate proportional interpolation method for step-size control of the mobile torch are proposed to guarantee weld accuracy. Finally, experiments and analyses show that the tracking system performs well in real-time capability, accuracy, stability and flexibility, and meets the welding requirements. © 2019 Elsevier Ltd. All rights reserved.
1. Introduction

Robotic seam tracking systems have developed rapidly in the past decade and have been applied widely to improve effectiveness, precision and quality in industrial welding fields such as car, ship, jack-up oil rig and aircraft manufacturing. At present, however, robotic seam tracking is only mature for specimens whose surfaces are planar or relatively flat [1-5]. Meanwhile, it is difficult to weld a specimen whose seam is a spatial curve, such as an intersecting seam, because of deviations caused by production, extrusion during transport and storage, thermal cutting and other factors, for example branch pipe coordinate system translation deviation, main pipe straightness deviation and main pipe ovality deviation [6]. As technology gradually improves, robotic seam tracking systems, especially vision-based ones, can now track spatial curves with the advantages of low cost, non-contact measurement, strong anti-interference ability and high precision, achieving fast, efficient, flexible and intelligent welding while improving welding quality [7].

For automatic welding of intersecting curves based on visual sensors, some difficulties still exist at present. For the
extraction of the weld feature point: although it can be extracted in a scene without arc, extraction fails under arc noise. For the position and posture planning of the weld torch there are two difficulties: first, the robot arm cannot always reach positions around the intersecting curve without colliding with the intersecting pipes; second, the robot and the intersecting pipes are two independent bodies whose relative positional relationship is difficult to obtain in complicated conditions. These motivations drive the design of a new robotic weld tracking system aimed at deploying automatic intersecting welding based on visual sensors.

Recently, domestic and foreign researchers have conducted studies to explore these difficulties. Paper [8] introduced a general kinematic description of an automatic welding scheme for intersecting pipes in 2014: the geometric model of the intersecting pipes was established, and parameter expressions and the welding torch's posture based on an independent rotation variable were given. Paper [9] designed a control system of an automatic welding robot for spatial intersections in 2016: a mathematical model of the saddle-shaped trajectory of the intersection line was established, and a multi-dimension, multi-degree-of-freedom drive system was designed. However, the above welding schemes are based on ideal intersecting curve models, and no experiments were executed to confirm their correctness. Paper [10] studied the automatic welding of complex intersecting welds of large cross-pipes in scaffold structures in 2017; it proposed methods
of scanning two profiles to locate the workpiece coordinate system of the intersecting pipeline and of modifying initial welds by a refining procedure, and carried out an arc-free simulation experiment. Paper [11] used laser pre-scanning to collect welding seam images, used the lowest point and the slope to extract the approximate intersecting points of the weld seam images, fitted them with a NURBS curve and, combined with positioner matching, finally realized the welding of the intersecting joint in 2018. Although the above researches were verified by tests, they lack the adaptability for more scenarios because of the variety of deviations of intersecting pipes. Paper [6] introduced a novel method to quantify and compensate the path deviation of the theoretical intersecting curve so as to find the key point positions on the actual intersecting curve in 2019. The deviation sources were analyzed by comparing the theoretical intersecting curve with the non-ideal actual one, building the coupling connections of the deviation sources, and quantifying the main ones to compensate the theoretical intersecting curve toward the non-ideal actual curve, with the goal of accurate welding of the intersecting curve. However, all the above models can only approximate the actual intersecting curve; location errors still exist, and there is little flexibility to adjust when the position and shape of the intersecting curve change because of thermal deformation during welding.

In this paper, a flexible real-time robotic weld seam tracking system based on a laser vision sensor is designed for intersecting joints. A traditional image processing method (TIPM) is used to extract the weld characteristics and determine the weld feature point in the first frame before welding. For the situation of heavy interference by arc and splash noise during welding, an extending and adopting Kalman Filter with Spatio-Temporal Context algorithm (EAKF-STC) is proposed to extract the weld feature point when the TIPM and STC are invalid. In addition, in order to control the torch automatically, the welding plan must be analyzed: the torch coordinate frame is established, and an expanding circle and arc length method (ECAL) is proposed to determine it. Besides, a segmentation weld method and an intermediate proportional interpolation (IPI)
method for step-size control of the mobile torch are proposed to guarantee the safety and accuracy of the weld. Finally, software with a graphical user interface (GUI) is developed, which lets the tracking system track the weld simply and flexibly in real-time.

2. Tracking system setup

The laser vision based robotic welding system for intersecting joints is shown in Fig. 1. The whole system can be divided into two distinct parts: a Server-Terminal (ST) and a Robot-Terminal (RT). The ST is an industrial personal computer (IPC); the RT consists of a six-degree-of-freedom industrial robot, a laser vision stereo sensor, an automatic weld control machine and a platform that can rotate around one of its own axes. The laser vision stereo sensor is made of one industrial CCD, one red line laser generator and one filter glass. In the system, the visual sensor works by the principle of the optical pinhole perspective projection model [12]; its relative positional relationship is shown in the enlarged view of Fig. 1. During real-time welding, the visual sensor captures the images. Because the arc is colorful, a narrow band-pass filter glass is installed in front of the CCD to filter out large amounts of arc noise. After capturing an image, self-developed multithreaded GUI software written in C++ not only extracts and locates the weld feature point, processes the related data and calculates the posture data of the welding torch, but also transfers the posture data to the robot control system module in the IPC of the ST. Next, the torch, following the robot end-effector, moves to the feature point while the automatic weld machine controls the welding. Therefore, by continually capturing new images of the feature point, the whole system can weld continuously in real-time.

3. Intersecting weld feature points extraction algorithm

The weld precision depends on the accuracy of weld seam extraction from the captured images, which is the most important
Fig. 1. System composition.
step in automatic welding based on visual sensors. In this Section, a TIPM is proposed to extract the laser stripe characteristics and locate the weld feature point when there is no arc and splash noise before welding. For the situation with arc and splash noise during welding, an EAKF-STC is proposed to solve the drift problem and guarantee robustness.

3.1. TIPM extracts weld feature points in no arc noise

The laser stripes are not contaminated before welding, so it is easy to detect the weld seam feature points by the traditional image processing method (TIPM). As shown in Fig. 2, the welding characteristics are obtained by recognizing the red laser stripe in the original image, thresholding to transform the three channels into a single channel, connected-area filtering to remove small-area noise outside the laser stripe, morphological dilation to remove breakpoints and internal hole noise of the laser stripe, thinning to extract the centerline of the laser stripe, locating the weld feature point, and fusing the processed image with the original image. The processing results are shown in Fig. 3.
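As an illustration, the following minimal OpenCV sketch mirrors the TIPM pipeline described above; the thresholds, kernel size and the feature-point rule are illustrative assumptions, not the authors' exact parameters:

```python
import cv2
import numpy as np

def tipm_feature_point(bgr):
    """Sketch of the TIPM pipeline: red-stripe recognition, thresholding,
    connected-area filtering, dilation, thinning, feature-point location."""
    # Red color recognition: red channel dominant over blue/green (assumed rule).
    b, g, r = cv2.split(bgr.astype(np.int16))
    red_mask = ((r - np.maximum(b, g)) > 50).astype(np.uint8) * 255

    # Connected-area filtering: drop small blobs outside the laser stripe.
    n, labels, stats, _ = cv2.connectedComponentsWithStats(red_mask)
    clean = np.zeros_like(red_mask)
    for i in range(1, n):
        if stats[i, cv2.CC_STAT_AREA] >= 200:   # area threshold is an assumption
            clean[labels == i] = 255

    # Morphological dilation: close breakpoints and internal holes.
    clean = cv2.dilate(clean, np.ones((5, 5), np.uint8))

    # Thinning to the stripe centerline (needs opencv-contrib's ximgproc).
    skeleton = cv2.ximgproc.thinning(clean)

    # Feature-point location: the lowest centerline point is used here as an
    # illustrative heuristic for the stripe's break point.
    ys, xs = np.nonzero(skeleton)
    if len(xs) == 0:
        return None
    k = np.argmax(ys)
    return int(xs[k]), int(ys[k])
```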
3.2. STC algorithm to extract weld feature points in arc noise

The Spatio-Temporal Context (STC) algorithm is a hybrid model that trades off between generative and discriminative appearance models, and it makes full use of the information surrounding the target region while constructing an effective appearance model. STC includes the spatial context and the temporal context. The spatial context is the set of gray values of the target and its region, including their relative positional relationship, within one frame. The temporal context is the time sequence of the target and its region over the current, previous and next frames [13]. Fig. 4 shows a frame of the laser stripe under heavy arc and splash noise: because it is close to the welding torch, the bottom region of the image is contaminated by heavy noise, while the top shows slight occlusion by splash noise.

Fig. 4. Laser stripe image.
During the real-time welding of the intersecting joint, the laser stripes contaminated by arc noise, together with their surrounding region, constitute the spatial context, and the change of the laser stripes between two adjacent images is very small, so the successive weld seam image sequence can serve as the temporal context. The core idea of the STC algorithm is to construct a confidence map from the probability in (1) [13]. The laser stripe feature point and its spatial context have such a stable geometric relationship that arc noise has little influence on the confidence interval; together with updating the temporal context model, this can be used to estimate the feature point position in real-time [14].
$$c(x) = P(x \mid o) = b\,e^{-\left|\frac{x - x^*}{\alpha}\right|^{\beta}} \qquad (1)$$
where x ∈ R² is a laser stripe location, o denotes the laser stripe present in the scene, and x* is the location maximizing c(x), i.e. the weld feature point, as shown in Fig. 5. The parameter b is a normalization constant, and α and β denote the scale parameter and the shape parameter, respectively. In the spatial context, according to the Bayesian theorem, the confidence map c(x) is computed by estimating the location likelihood of the weld feature point of the laser stripe [13]:
$$c(x) = P(x \mid o) = \sum_{g(z) \in X^c} P(x, g(z) \mid o) = \sum_{g(z) \in X^c} P(x \mid g(z), o)\,P(g(z) \mid o) \qquad (2)$$

Fig. 2. The TIPM algorithm flowchart: (a) original image; (b) red color recognition; (c) channel conversion; (d) connected area filtering; (e) morphological dilation; (f) thinning laser stripe; (g) weld feature point extraction; (h) image fusion.
Fig. 3. The processed results of the TIPM algorithm ((a)-(h): stages corresponding to Fig. 2; the weld feature point is marked).
where X^c = {g(z) = (I(z), z) | z ∈ Ωc(x*)} is the context feature set, I(z) denotes the image intensity (gray value) at pixel z, and Ωc(x*) is the neighborhood of the feature point x*, twice the size of the target region. Moreover, the spatial context model is defined as [13]:

$$P(x, g(z) \mid o) = h^{sc}(x - z) \qquad (3)$$

where h^sc(x − z) is a non-radially symmetric function of the relative distance and direction between the weld feature point x and its surrounding location z; it models the spatial relationship between the weld feature point and its surrounding context and helps to resolve ambiguities effectively when degraded laser stripe measurements allow different interpretations. P(g(z)|o) is a context prior probability that models the appearance of the local region Ωc(x*); it is defined as [13]:

$$P(g(z) \mid o) = I(z)\,\omega_{\sigma}(z - x^*) \qquad (4)$$

where ω_σ(·) is a Gaussian weighting function defined by [13]:

$$\omega_{\sigma}(z - x^*) = a\,e^{-|z - x^*|^2/\sigma^2} \qquad (5)$$

wherein a is a normalization constant and σ is a scale parameter. Putting (1)-(5) together, (2) can be rewritten as (6) in the spatial domain and transformed into (7) in the frequency domain:

$$c(x) = b\,e^{-\left|\frac{x - x^*}{\alpha}\right|^{\beta}} = \sum_{g(z) \in X^c} h^{sc}(x - z)\,I(z)\,\omega_{\sigma}(z - x^*) = h^{sc}(x) \otimes \big(I(x)\,\omega_{\sigma}(x - x^*)\big) \qquad (6)$$

$$\mathcal{F}\Big(b\,e^{-\left|\frac{x - x^*}{\alpha}\right|^{\beta}}\Big) = \mathcal{F}\big(h^{sc}(x)\big) \odot \mathcal{F}\big(I(x)\,\omega_{\sigma}(x - x^*)\big) \qquad (7)$$

where ⊗ denotes the convolution operator, 𝓕 the FFT function and ⊙ the element-wise product. Therefore, the spatial context model can be learned fast:

$$h^{sc}(x) = \mathcal{F}^{-1}\!\left(\frac{\mathcal{F}\Big(b\,e^{-\left|\frac{x - x^*}{\alpha}\right|^{\beta}}\Big)}{\mathcal{F}\big(I(x)\,\omega_{\sigma}(x - x^*)\big)}\right) \qquad (8)$$

where 𝓕⁻¹ denotes the IFFT function. In the temporal context, the laser stripe has a close relationship between the t-th and the (t+1)-th frame because the robot moves slowly during actual real-time welding, so the temporal context model of the (t+1)-th frame is updated by (9) to reduce the influence of arc noise on the appearance variations of the target region:

$$H^{stc}_{t+1}(x) = (1 - \rho)\,H^{stc}_{t}(x) + \rho\,h^{sc}_{t}(x), \qquad H^{stc}_{1}(x) = h^{sc}_{1}(x) \qquad (9)$$

where ρ is a learning parameter and h^sc_t is the spatial context model computed by (8) at the t-th frame. The confidence map of the (t+1)-th frame can be described as [13]:

$$c_{t+1}(x) = H^{stc}_{t+1}(x) \otimes \big(I_{t+1}(x)\,\omega_{\sigma}(x - x^*_t)\big) \qquad (10)$$

and the weld feature point x* is determined by maximizing the new confidence map c_{t+1}(x):

$$x^*_{t+1} = \arg\max_{x \in \Omega_c(x^*_t)} c_{t+1}(x) \qquad (11)$$

Fig. 5. Weld feature point and target region.
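For concreteness, a minimal numpy sketch of this STC learn-and-track cycle (Eqs. (6)-(11)) follows; the parameter values and the regularization are illustrative assumptions rather than the paper's exact implementation:

```python
import numpy as np

def gaussian_weight(shape, center, sigma):
    """w_sigma: Gaussian weighting centered on the current feature point (Eq. 5)."""
    ys, xs = np.mgrid[0:shape[0], 0:shape[1]]
    return np.exp(-((xs - center[0])**2 + (ys - center[1])**2) / sigma**2)

def confidence_prior(shape, center, alpha=2.25, beta=1.0):
    """Target confidence b*exp(-|(x-x*)/alpha|^beta) (Eq. 1); b folded into scale."""
    ys, xs = np.mgrid[0:shape[0], 0:shape[1]]
    dist = np.sqrt((xs - center[0])**2 + (ys - center[1])**2)
    return np.exp(-(dist / alpha)**beta)

def learn_spatial_context(patch, center, sigma):
    """h_sc via frequency-domain division (Eq. 8). patch: gray image region."""
    c = confidence_prior(patch.shape, center)
    ctx = patch * gaussian_weight(patch.shape, center, sigma)
    eps = 1e-6  # assumed regularizer to avoid division by zero
    return np.real(np.fft.ifft2(np.fft.fft2(c) / (np.fft.fft2(ctx) + eps)))

def track_step(H_stc, h_sc, patch, center, sigma, rho=0.075):
    """One STC update: temporal model update (Eq. 9), confidence map (Eq. 10),
    and feature-point localization by its maximum (Eq. 11)."""
    H_stc = (1 - rho) * H_stc + rho * h_sc
    ctx = patch * gaussian_weight(patch.shape, center, sigma)
    conf = np.real(np.fft.ifft2(np.fft.fft2(H_stc) * np.fft.fft2(ctx)))
    y, x = np.unravel_index(np.argmax(conf), conf.shape)
    return H_stc, (x, y), conf
```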
3.3. Extending and adopting KF-STC algorithm to fix weld feature point tracking

Although the STC tracker is a practical approach in challenging scenarios, it has some limitations [15]. As shown in Fig. 6, actual tests of feature point extraction for the intersecting weld show that the target is lost when tracking the weld feature point across laser stripe images that pass from no arc noise to heavy contamination (green denotes the failed tracking by STC; yellow indicates what we want to track). The extraction fails because the target location update in all frames relies on a scale adaptation scheme based solely on the confidence map calculation [15]: the confidence map takes nearly equal values in a frame contaminated by heavy arc noise, which results in a drift problem in the spatial context. To address this problem, an Extending and Adopting Kalman Filter combined with the STC algorithm (EAKF-STC) has been developed.

In the actual welding process, owing to the characteristics of the visual sensor, the system dynamic noise and the sensor measurement noise are colored rather than white [16]. Following [16] and [17], extending the state eigenvector and adopting an equivalent observation equation were chosen to whiten them into Gaussian white noise so that the Kalman Filter algorithm can be applied. In the actual welding of the intersecting joint, the robot's weld speed is set constant, so the speed of the weld feature point x* is variable in both the u-axis and v-axis directions because the intersecting weld seam is a curve. Therefore, as shown in Fig. 7, the system dynamic equation along the u-axis of the pixel coordinate system of the weld images can be expressed as:

$$u^*(k+1) = u^*(k) + \dot{u}^*(k)\,t + \tfrac{1}{2}\,\omega_u(k)\,t^2 \qquad (12)$$

where ω_u(k) is the system dynamic noise, denoting a random acceleration, and t is the sampling interval of the CCD camera between two adjacent images during real-time welding. According to [16,18,19], the process model was established as follows:

$$X_u(k) = [\,u_1(k)\;\; u_2(k)\;\; u_3(k)\,]^T \qquad (13)$$

where u₁(k) = u*(k), u₂(k) = u̇*(k) and u₃(k) = u*(k−1). Here u₁(k) is the measured value, and u₃(k) allows the change of the weld feature point along the u-axis from time k−1 to k to be expressed. The weld feature point state equation and measurement equation can then be written as (14) and (15), respectively:

$$X_u(k+1) = \begin{bmatrix} 1 & t & 0 \\ 0 & 1 & 0 \\ 1 & 0 & 0 \end{bmatrix} \begin{bmatrix} u_1(k) \\ u_2(k) \\ u_3(k) \end{bmatrix} + \begin{bmatrix} 0.5t^2 \\ t \\ 0 \end{bmatrix} \omega_u(k) \qquad (14)$$

$$Z_u(k) = \begin{bmatrix} 1 & 0 & 0 \\ 1 & 0 & -1 \end{bmatrix} \begin{bmatrix} u_1(k) \\ u_2(k) \\ u_3(k) \end{bmatrix} + M_u(k) \qquad (15)$$
Fig. 6. The process from no arc noise to contamination by heavy noise.
where M_u(k) = [m_u(k)  m_Δu(k)]^T is the measurement noise of the visual sensor. However, the system dynamic noise ω_u(k) and the sensor measurement noise M_u(k) are colored rather than white, so they cannot be applied directly in the traditional KF algorithm. To solve this problem, a shaping filter was chosen to whiten them; the model can be represented as follows [16,17]:

$$\omega_u(k) = \lambda\,\omega_u(k-1) + \xi_u(k) \qquad (16)$$

$$M_u(k) = \begin{bmatrix} m_u(k) \\ m_{\Delta u}(k) \end{bmatrix} = \underbrace{\begin{bmatrix} a & 0 \\ 0 & b \end{bmatrix}}_{\psi} \begin{bmatrix} m_u(k-1) \\ m_{\Delta u}(k-1) \end{bmatrix} + \underbrace{\begin{bmatrix} \eta_{u1}(k-1) \\ \eta_{u2}(k-1) \end{bmatrix}}_{\eta_u(k-1)} \qquad (17)$$

where ξ_u(k) and η_u(k) are Gaussian white noise sequences with zero mean and variances Q_u(k) and R_u(k), respectively, statistically uncorrelated with each other, and λ, a, b are constant coefficients. By extending the state eigenvector, the state vector (13) can be updated as follows [16]:

$$\bar{X}_u(k) = [\,X_u(k)^T\;\; \omega_u(k)\,]^T = [\,u^*(k)\;\; \dot{u}^*(k)\;\; u^*(k-1)\;\; \omega_u(k)\,]^T \qquad (18)$$

The system state equation (14) and the weld feature point position measurement equation (15) along the u-axis can then be updated as follows:

$$\bar{X}_u(k+1) = \underbrace{\begin{bmatrix} 1 & t & 0 & 0.5t^2 \\ 0 & 1 & 0 & t \\ 1 & 0 & 0 & 0 \\ 0 & 0 & 0 & \lambda \end{bmatrix}}_{\Phi_u} \bar{X}_u(k) + \underbrace{\begin{bmatrix} 0 \\ 0 \\ 0 \\ 1 \end{bmatrix}}_{\Gamma_u} \xi_u(k) \qquad (19)$$

$$Z_u(k) = \underbrace{\begin{bmatrix} 1 & 0 & 0 & 0 \\ 1 & 0 & -1 & 0 \end{bmatrix}}_{H} \bar{X}_u(k) + M_u(k) \qquad (20)$$

By adopting an equivalent observation equation, the colored noise M_u(k) can be whitened as follows [16]:

$$\bar{Z}_u(k) \equiv Z_u(k+1) - \psi\,Z_u(k) = \underbrace{[\,H\Phi_u - \psi H\,]}_{\bar{H}}\,\bar{X}_u(k) + \bar{\eta}_u(k) \qquad (21)$$

so the weld feature point position measurement equation (20) along the u-axis can be replaced by:

$$\begin{bmatrix} z_u(k+1) - a\,z_u(k) \\ z_{\Delta u}(k+1) - b\,z_{\Delta u}(k) \end{bmatrix} = \begin{bmatrix} 1-a & t & 0 & 0.5t^2 \\ -b & t & b & 0.5t^2 \end{bmatrix} \begin{bmatrix} u^*(k) \\ \dot{u}^*(k) \\ u^*(k-1) \\ \omega_u(k) \end{bmatrix} + \begin{bmatrix} \eta_{u1}(k) \\ \eta_{u2}(k) \end{bmatrix} \qquad (22)$$

From Eqs. (19) and (22), the system dynamic noise ω_u(k) and the measurement noise M_u(k) have been transformed into Gaussian white noise, satisfying the conditions of the KF algorithm. Because the conditions on the v-axis are similar to those on the u-axis, the integrated system dynamic equation and measurement equation for the u-axis and v-axis of the pixel coordinate system can be written as follows:

$$\begin{bmatrix} \bar{X}_u(k+1) \\ \bar{X}_v(k+1) \end{bmatrix} = \underbrace{\begin{bmatrix} \Phi_u & 0 \\ 0 & \Phi_v \end{bmatrix}}_{\Phi'} \begin{bmatrix} \bar{X}_u(k) \\ \bar{X}_v(k) \end{bmatrix} + \underbrace{\begin{bmatrix} \Gamma_u \\ \Gamma_v \end{bmatrix}}_{\Gamma'} \xi(k) \qquad (23)$$

Fig. 7. Pixel coordinate system.

Fig. 8. Noise and confidence map. (a) Strong noise image. (b) Slight noise image. (c) Strong noise confidence map (multiple peaks; STC fails while EAKF-STC succeeds). (d) Slight noise confidence map (single peak; both EAKF-STC and STC track well).
$$\begin{bmatrix} \bar{Z}_u(k) \\ \bar{Z}_v(k) \end{bmatrix} = \underbrace{\begin{bmatrix} \bar{H} & 0 \\ 0 & \bar{H} \end{bmatrix}}_{H'} \begin{bmatrix} \bar{X}_u(k) \\ \bar{X}_v(k) \end{bmatrix} + \begin{bmatrix} \bar{\eta}_u(k) \\ \bar{\eta}_v(k) \end{bmatrix} \qquad (24)$$
The drift of the weld feature point during the arcing process can then be fixed through the following prediction and state update equations based on the KF algorithm [20].

Phase 1. The prediction equations:

$$\hat{X}'(k|k-1) = \Phi'\,\hat{X}'(k-1|k-1), \qquad P(k|k-1) = \Phi'\,P(k-1|k-1)\,\Phi'^T + \Gamma' Q\,\Gamma'^T \qquad (25)$$

Phase 2. The state update equations:

$$K(k) = P(k|k-1)\,H'^T\big[H'\,P(k|k-1)\,H'^T + R\big]^{-1}$$
$$\hat{X}'(k|k) = \hat{X}'(k|k-1) + K(k)\big[Z'(k) - H'\,\hat{X}'(k|k-1)\big]$$
$$P(k|k) = \big[I - K(k)H'\big]\,P(k|k-1) \qquad (26)$$

As shown in Fig. 8a, strong arc noise contaminates the laser stripe at the moment of arc ignition. As welding goes on, the noise tends to stabilize and the arc noise becomes slight; the laser stripe is less contaminated by arc noise in Fig. 8b. Correspondingly, the confidence map has multiple peaks when the arc noise is strong: tracking with the STC method fails, while the EAKF-STC algorithm tracks successfully (Fig. 8c). When the noise is slight, both STC and EAKF-STC track well (Fig. 8d).
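A minimal numpy sketch of the per-axis EAKF predict/update cycle (Eqs. (18)-(26)) is given below; the coefficients λ, a, b and the noise covariances Q (1x1) and R (2x2) are illustrative assumptions:

```python
import numpy as np

def eakf_matrices(t, lam, a, b):
    """Extended state model on one pixel axis (Eqs. (19)-(21))."""
    Phi = np.array([[1, t, 0, 0.5*t**2],
                    [0, 1, 0, t],
                    [1, 0, 0, 0],
                    [0, 0, 0, lam]], float)     # extended transition (Eq. 19)
    H = np.array([[1, 0, 0, 0],
                  [1, 0, -1, 0]], float)        # raw measurement matrix (Eq. 20)
    Psi = np.diag([a, b]).astype(float)         # colored-noise transition (Eq. 17)
    H_bar = H @ Phi - Psi @ H                   # equivalent observation (Eq. 21)
    Gamma = np.array([[0.0], [0.0], [0.0], [1.0]])
    return Phi, Gamma, Psi, H_bar

def eakf_step(x, P, z_next, z_curr, Phi, Gamma, Psi, H_bar, Q, R):
    """One Kalman cycle on the whitened measurement Z(k+1) - Psi*Z(k)."""
    # Phase 1: prediction (Eq. 25).
    x_pred = Phi @ x
    P_pred = Phi @ P @ Phi.T + Gamma @ Q @ Gamma.T
    # Equivalent (whitened) observation (Eqs. (21)-(22)).
    z_bar = z_next - Psi @ z_curr
    # Phase 2: state update (Eq. 26).
    S = H_bar @ P_pred @ H_bar.T + R
    K = P_pred @ H_bar.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z_bar - H_bar @ x_pred)
    P_new = (np.eye(len(x)) - K @ H_bar) @ P_pred
    return x_new, P_new
```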
4. Controlling of torch for intersecting weld in real-time

In this Section, the weld control scheme for real-time welding of the intersecting joint is considered. First, the posture planning by the proposed ECAL method is given to establish the welding torch coordinate frame. Next, a segmentation method for the intersecting curve is discussed to determine the welding scheme. Finally, step-size control is chosen to optimize the weld trajectory and keep the posture of the welding torch continuous.

4.1. Establishing the coordinate system of torch

Through the visual sensor, all the points on the laser stripe can be extracted and located. In the robotic base coordinate frame {B}, assume that p_t is the welding point and q_1t and q_2t are two points on the laser stripe at time t. As shown in Fig. 9, a plane X can be obtained by fitting the three points p_t, q_1t, q_2t. Assuming the equation of plane X is a_m x + b_m y + c_m z + 1 = 0, the normal vector of plane X is p_xt, which can be written as:
$$p_{xt} = (a_m, b_m, c_m) \qquad (27)$$

Fig. 9. The model of welding point coordinate frame.
Then, the angle bisector vector of ∠v_1t p_t v_2t can also be obtained through the Angle Bisector Theorem on the plane X:
$$p_{zt} = (a_x, a_y, a_z) = \left(\begin{vmatrix} B_1 & B_2 \\ C_1 & C_2 \end{vmatrix},\; \begin{vmatrix} C_1 & C_2 \\ A_1 & A_2 \end{vmatrix},\; \begin{vmatrix} A_1 & A_2 \\ B_1 & B_2 \end{vmatrix}\right) \qquad (28)$$

where $\overrightarrow{p_t q_{1t}} = (l_1, m_1, n_1)$, $\overrightarrow{p_t q_{2t}} = (l_2, m_2, n_2)$, and

$$\begin{cases} A_1 = N l_1 - M l_2 \\ B_1 = N m_1 - M m_2 \\ C_1 = N n_1 - M n_2 \end{cases} \qquad \begin{cases} A_2 = a_m \\ B_2 = b_m \\ C_2 = c_m \end{cases} \qquad \begin{cases} M = \sqrt{l_1^2 + m_1^2 + n_1^2} \\ N = \sqrt{l_2^2 + m_2^2 + n_2^2} \end{cases}$$
Fig. 10. The model of torch coordinate frame.
Therefore, the welding point coordinate frame {P_t} can be established by the right-hand rule. However, there are some differences between the welding point coordinate frame {P_t} and the torch coordinate frame {W_t} [21]. As shown in Fig. 10, the directions of w_yt and w_zt are opposite to
the directions of p_yt and p_zt, respectively. So the torch coordinate frame {W_t} can be established as follows:

$$\{W_t\} = (w_{xt}, w_{yt}, w_{zt}) = (p_{xt}, -p_{yt}, -p_{zt}) \qquad (29)$$

Fig. 11. Schematic diagram of expanding circle and arc length method.
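As a worked sketch of this three-point construction (Eqs. (27)-(29)), the snippet below builds the torch frame from p_t, q_1t, q_2t; it is an illustration under the stated conventions, not the authors' exact code:

```python
import numpy as np

def torch_frame(p, q1, q2):
    """Build the torch coordinate frame {Wt} from the welding point p and
    two laser-stripe points q1, q2 (all 3-vectors in the robot base frame)."""
    v1, v2 = q1 - p, q2 - p
    # x-axis: normal of the plane through the three points (Eq. 27).
    x = np.cross(v1, v2)
    # In-plane angle bisector of the two stripe directions; Eq. (28) is the
    # determinant (cross-product) form of this same construction.
    bis = v1 / np.linalg.norm(v1) + v2 / np.linalg.norm(v2)
    # Torch z-axis: opposite to the welding-point bisector direction (Eq. 29).
    z = -bis
    # y-axis completes a right-handed frame.
    y = np.cross(z, x)
    R = np.column_stack([x / np.linalg.norm(x),
                         y / np.linalg.norm(y),
                         z / np.linalg.norm(z)])
    return R  # 3x3 rotation whose columns are wxt, wyt, wzt
```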
4.2. Expanding circle and arc length method to extract points p_1t and p_2t

As can be seen from Section 4.1, in order to establish the coordinate system of the torch, the positions of q_1t and q_2t must be
Fig. 12. Diagram of segmentation scheme.

Fig. 13. Torch frame system in certain segmentation part.

Fig. 14. Schematic diagram of intermediate proportional interpolation method.

Fig. 15. Experiment platform.

Fig. 16. The self-developed GUI of intersecting weld system.

Fig. 17. Flowchart of communication scheme.

Fig. 18. The welding processes of all segmented parts.
obtained. However, during welding, because of arc noise contamination and the lack of a stable geometric relationship on the laser stripe outside the weld feature point area, it is difficult to extract the points p_1t and p_2t, which correspond to q_1t and q_2t, with the TIPM and EAKF-STC algorithms. Therefore, an expanding circle and arc length method (ECAL) is proposed to extract the points p_1t and p_2t in the image. As shown in Fig. 11, around the weld feature point x* extracted by the EAKF-STC method, the yellow circle expands with x* as its center and radius R. Fig. 11a is the preprocessed image obtained by thresholding and connected-area filtering, and p_ij (i = 1,2,3,4; j = 1,2) are the points where the circle intersects the boundary of the white area. As shown in Fig. 11b, each arc C_i, corresponding to the angle θ_i, can be determined by taking the point p_ij as the midpoint of the arc C_i, where θ_i1 = θ_i2 and θ_i1 + θ_i2 = θ_i. As can be seen from Fig. 11a and b, the areas of θ_11, θ_12, θ_21, θ_22, θ_31 and θ_42 are black on the preprocessed image, but θ_32 and θ_41 are white. Therefore, the points p_31 and p_41 can be removed by detecting the different values of θ_i1 and θ_i2. Then p_1t and p_2t can be obtained by calculating the midpoints of p_11, p_12 and of p_21, p_22, respectively. Fig. 11c shows the final result of the extraction.
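The following sketch illustrates the ECAL idea on a binary preprocessed image: the midpoint of each white run crossed by the circle plays the role of the p_1t/p_2t candidates (midpoints of p_i1, p_i2). The sampling density is an assumption, and the θ_i1/θ_i2 color test for rejecting spurious crossings is only indicated in a comment:

```python
import numpy as np

def ecal_candidates(binary, center, radius, samples=720):
    """ECAL sketch: sample a circle of radius R around the feature point x*,
    find the white runs where it crosses the stripe, return run midpoints."""
    cx, cy = center
    ang = np.linspace(0.0, 2*np.pi, samples, endpoint=False)
    xs = np.clip(np.round(cx + radius*np.cos(ang)).astype(int), 0, binary.shape[1]-1)
    ys = np.clip(np.round(cy + radius*np.sin(ang)).astype(int), 0, binary.shape[0]-1)
    on = binary[ys, xs] > 0

    # Rotate the scan so it starts on a black sample, then collect white runs;
    # each run is bounded by two boundary crossings p_i1, p_i2.
    shift = int(np.argmin(on))
    on_r = np.roll(on, -shift)
    runs, start = [], None
    for i, v in enumerate(on_r):
        if v and start is None:
            start = i
        elif not v and start is not None:
            runs.append((start, i - 1))
            start = None
    if start is not None:
        runs.append((start, samples - 1))

    # Midpoint of each run approximates the stripe center on the circle;
    # spurious crossings would be removed by the theta_i1/theta_i2 color
    # test of Fig. 11b (omitted here).
    pts = []
    for s, e in runs:
        a = ang[(shift + (s + e) // 2) % samples]
        pts.append((cx + radius*np.cos(a), cy + radius*np.sin(a)))
    return pts
```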
Fig. 19. Time consumed in weld image processing.
Fig. 21. Some extraction results of weld feature points.
4.3. Segmentation welding scheme of intersecting curve in real-time

Because the intersecting curve is a closed 3D annulus, it is difficult to weld it without collision with the intersecting joint. Besides, it is currently difficult or even unrealistic to obtain the relationship between the robotic base frame {B} and the intersecting joint coordinate frame in real-time. In this paper, an intersecting curve segmentation weld method is developed to avoid collision and achieve real-time welding of the intersecting curve. To address the collision problem, the segmentation scheme of the spatial intersecting curve seam is considered under the condition of no positioner machine [22,23]. With more segments, the continuity of the weld quality degrades and more
Fig. 20. Extracting errors of feature point on x-axis and y-axis of welding image. (a) First segmental part. (b) Second segmental part. (c) Third segmental part.
time is wasted; with fewer segments, less of the laser stripe can be captured by the CCD. Therefore, as shown in Fig. 12, viewed from the top of the intersecting joint frame {T}, the intersecting curve is divided into six parts because of its radial symmetry. During real-time welding, since the weld seam can be tracked and the deviation compensated automatically, the welding order is (a) → (b) → (c) → (d) → (e) → (f) → (a) and can start from any of them. As can be seen from Fig. 13, within a given segmentation interval the tracking system controls the torch to weld from P1 to Pn while the intersecting joint specimen is stationary; after the former segment has been welded, the intersecting joint is rotated clockwise by a certain angle around the Z-axis of frame {T} while the torch returns to its former position. Thus, after five rotations, the intersecting seam is welded automatically.
4.4. Step-size control of mobile torch

From the enlarged view of Fig. 1 in Section 2, the step-size of the mobile torch depends on the parameter d. When d is larger, the step-size is longer and the precision worse, and vice versa; however, noise contamination increases if the step-size is smaller. To solve this problem, an intermediate proportional interpolation (IPI) method of step-size control is proposed in this paper. As shown in Fig. 14, the points cp, np and ip denote the current point, the location point and the interpolation point, respectively. Because the tracking approximates curves with straight lines, the interpolation can be achieved using the point-direction linear equation:
$$\begin{cases} ip_x = cp_x + l_{nx}\,t \\ ip_y = cp_y + l_{ny}\,t \\ ip_z = cp_z + l_{nz}\,t \end{cases} \qquad (30)$$
where (cp_x, cp_y, cp_z), (ip_x, ip_y, ip_z) and (l_nx, l_ny, l_nz) denote the three-dimensional coordinates of cp and ip and the unit vector of l, and the parameter t is a constant. In this way, the proper step-size d′ can be decided by choosing an appropriate value of t.
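A minimal sketch of the IPI interpolation in Eq. (30); choosing t equal to the desired step-size along the unit direction is an assumed rule for illustration:

```python
import numpy as np

def ipi_point(cp, np_point, step):
    """Intermediate proportional interpolation (Eq. 30): move from the current
    point cp toward the located point np_point by the step-size d'."""
    direction = np_point - cp
    l = direction / np.linalg.norm(direction)  # unit vector of the line
    t = step                                   # choose t so that |ip - cp| = d'
    return cp + l * t

# Usage: next torch target with an assumed 2 mm step.
ip = ipi_point(np.array([0.0, 0.0, 0.0]), np.array([10.0, 0.0, 0.0]), 2.0)
```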
5. Experiment and analysis

The equipment used in the experiment platform is shown in Fig. 15. It mainly includes an ABB IRB1410 six-axis industrial robot, a dual network port industrial personal computer (IPC) with an i7 CPU, a Fronius TPS 4000 CMT automatic welding machine, an Allied Vision Manta G-301 CCD camera, a red one-line laser and Ethernet transmission cables. The CCD resolution was 656 × 492, and the wavelength of the laser generator was chosen as 650 nm, the same as the filter glass [24]. The specimen is an orthogonal T-type steel intersecting joint with outer diameters of 203 mm and 152 mm for the main pipe and the branch pipe, respectively.

5.1. Weld experiment of tracking in real-time

As can be seen from Fig. 16, the self-developed graphical user interface (GUI) of the intersecting weld system consists of eight modules: communication between the IPC and the CCD, welding mode selection, welding parameter input, communication between the IPC and the robot, weld feature point detection, weld tracking, welding monitoring, and a welding messages module for operation and communication. All modules cooperate with multithreading support in Windows.
Fig. 22. The results of relative location errors of intersecting weld seam. (a) First segmental part. (b) Second segmental part. (c) Third segmental part.
Before welding, the distance between the laser stripe and the welding tip was set to 20 mm, and the step-size parameter was set to 2 mm. Then the whole tracking system of visual sensor and robot was calibrated, mainly including camera calibration, hand-eye calibration, laser plane calibration and tool center point calibration of the torch. Next, the robot carrying the visual sensor moves toward the arbitrarily placed intersecting joint until the intersecting weld seam can be captured by the CCD. Finally, the intersecting weld system tracks in real-time automatically. During welding, the weld feature point and the points p_1t and p_2t are extracted by the EAKF-STC and ECAL algorithms and, combined with the calibration results, their three-dimensional world coordinates are calculated [25]. Further, the transformation matrix and the corresponding quaternion are obtained by means of Eqs. (27)-(30). Then, as shown in Fig. 17, according to the socket communication principle of the TCP/IP protocol suite and a self-developed communication scheme, the objective three-dimensional coordinates of the weld feature point and the quaternion posture are sent to the robot, controlling the robotic end-effector and torch to move and weld. In the tracking weld experiments, the CMT welding process was adopted; the welding voltage was 13.6 V, the welding current was 122 A, the wire feeding speed was 3.3 m/min, and the welding speed was 8 mm/s. The welding processes of all segmented parts are shown in Fig. 18.
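To illustrate the communication scheme of Fig. 17, here is a minimal TCP client sketch that streams pose packets to the robot controller; the address, port and the plain-text message layout are invented for illustration (the actual ABB protocol and packet format are not specified in the paper):

```python
import socket

def send_pose(sock, xyz, quat):
    """Send one torch target: position (mm) and orientation quaternion.
    The 'x,y,z,q1,q2,q3,q4\\n' layout is a hypothetical format."""
    msg = ",".join(f"{v:.3f}" for v in (*xyz, *quat)) + "\n"
    sock.sendall(msg.encode("ascii"))

# Hypothetical endpoint of the robot control module on the IPC.
with socket.create_connection(("192.168.1.10", 5000), timeout=2.0) as s:
    send_pose(s, (512.4, -103.7, 288.9), (0.707, 0.0, 0.707, 0.0))
```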
5.2. Result analysis

To verify the real-time capability, accuracy, stability and flexibility of the tracking system, a series of experiments was performed. For real-time capability, the elapsed time is an important factor, and the time spent extracting the weld feature point is the longest in the whole multithreaded tracking. Therefore, 1200 welding images captured in several experiments were processed by the EAKF-STC algorithm; as shown in Fig. 19, the time for weld seam extraction is about 120-210 ms, which is sufficient for the tracking system. Owing to the symmetry of the intersecting weld seam, the torch trajectory is similar between parts (a)(b)(c) and (d)(e)(f) in Fig. 18; therefore, only cases (a), (b) and (c) are analyzed here.
For accuracy, the weld seam location error was calculated, and the extraction errors of the first three segmented parts are drawn in Fig. 20: the maximum errors were 6 pixels and 5 pixels on the u-axis and v-axis, respectively. Some extraction results of weld feature points are shown in Fig. 21. However, the actual positioning errors of the intersecting seam
Fig. 23. The actual weld results.
Fig. 24. Quaternion of torch posture. (a) Quaternion of first segmental partial torch posture. (b) Quaternion of second segmental partial torch posture. (c) Quaternion of third segmental partial torch posture.
were difficult to obtain because the theoretical values of the arbitrarily placed intersecting curve were unknown. So the relative location errors were calculated by fitting the teaching points obtained by touching the intersecting seam with the calibrated torch tip; the results are shown in Fig. 22. As can be seen from Fig. 22, the positioning errors were within 0.3 mm, 0.7 mm and 0.2 mm on the x-axis, y-axis and z-axis, respectively. Furthermore, the actual forming results in Fig. 23 show that the accuracy is sufficient for welding.

For stability, the quaternions of the torch frame for the first three segmented parts were calculated. As shown in Fig. 24, the curves corresponding to q1, q2, q3 and q4 are smooth, which indicates that the movement of the robot end-effector, and hence of the torch, is stable.

For flexibility, because the tracking principle is the same, the tracking system can weld not only an arbitrarily placed intersecting joint without a groove, but also specimens where the pipeline intersects planar or spherical parts, and so on.

5.3. System superiority

From the above research and analyses of the real-time tracking system for intersecting seams, the superiority of this system over current similar commercial products is as follows. First, the main advantage is that the tracking system achieves real-time automatic welding of spatial intersecting seams, which saves a great deal of time and improves efficiency in engineering applications; until now, there have been few similar products on the market. Second, the tracking system is highly flexible: whether there is a distortion deviation of the intersecting pipeline, a change of the clamping position of the intersecting joint, thermal deformation of the intersecting seam during welding, or welding of specimens where the pipeline intersects planar or spherical material, the system can work without adjustment. This flexibility is more conducive to industrial automated welding in mass production and makes automatic welding of intersecting seams easy.

6. Conclusion

In this paper, a flexible real-time tracking system based on a laser vision stereo sensor was developed for intersecting welds. Aiming at the problems encountered in its development, methods were proposed for feature point extraction and for the torch control scheme when the arc noise is heavy during real-time welding.

(1) The TIPM was proposed to extract the weld characteristics and determine the weld feature point in the first frame when there was no arc noise. However, it becomes invalid when the laser stripe is contaminated by arc and splash noise during real-time welding, so the EAKF-STC was proposed to solve the drift problem in the transition from no arc noise to heavy noise and to guarantee the system's robustness.

(2) To control the mobile torch trajectory, the welding torch coordinate frame was established by the new three-point principle, and the ECAL method was proposed to help determine it. Then, a segmental weld method was considered to avoid collision between the welding torch and
the intersecting joint when no positioner machine is available. Besides, an IPI method of step-size control was proposed to improve precision, stability and smoothness.

(3) Experiments and analysis show that the tracking system performs well in agility, accuracy, stability and flexibility. The maximum extraction error was 6 pixels and the maximum positioning error was 0.7 mm at a front measurement distance of the laser stripe of 20 mm, which met the welding requirements.
Declaration of Competing Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgements

This work is financially supported by the National Natural Science Foundation of China (grants No. U1733125 and No. 51875408) and the Natural Science Foundation of Tianjin (grants No. 18JCYBJC18700 and No. 18JCYBJC19100).

References

[1] Y. Huang, Y. Xiao, P. Wang, A seam-tracking laser welding platform with 3D and 2D visual information fusion vision sensor system, Int. J. Adv. Manuf. Technol. 67 (1-4) (2013) 415-426.
[2] Y. Zou, Y. Wang, W. Zhou, et al., Real-time seam tracking control system based on line laser visions, Opt. Laser Technol. 103 (2018) 182-192.
[3] Y. Xu, N. Lv, G. Fang, et al., Welding seam tracking in robotic gas metal arc welding, J. Mater. Process. Technol. 248 (2017) 18-30.
[4] W. Shao, Y. Huang, Y. Zhang, A novel weld seam detection method for space weld seam of narrow butt joint in laser welding, Opt. Laser Technol. 99 (2018) 39-51.
[5] Y. Ding, W. Huang, R. Kovacevic, An on-line shape-matching weld seam tracking system, Rob. Comput. Integr. Manuf. 42 (2016) 103-112.
[6] Y. Liu, J. Liu, X. Tian, An approach to the path planning of intersecting pipes weld seam with the welding robot based on non-ideal models, Rob. Comput. Integr. Manuf. 55 (2019) 96-108.
[7] Y. Zou, W. Zhou, Y. Wang, Laser vision seam automatic tracking based on probability continuous model, J. Mech. Eng. 10 (2017) 70-78.
[8] L. Shi, X. Tian, Automation of main pipe-rotating welding scheme for intersecting pipes, Int. J. Adv. Manuf. Technol. 77 (5-8) (2015) 955-964.
[9] K. Wang, M. Liu, B. Long, et al., Research on control system of automatic welding robot for space intersection, Int. Conf. Sensors (2016).
[10] X. Chen, A.G. Dharmawan, S. Foong, G.S. Soh, Seam tracking of large pipe structures for an agile robotic welding system mounted on scaffold structures, Rob. Comput. Integr. Manuf. 50 (2018) 242-255.
[11] K. Yan, Research on intersecting curve welding robot system based on laser vision, Tianjin, 2018.
[12] Y. Zou, T. Chen, Laser vision seam tracking system based on image processing and continuous convolution operator tracker, Opt. Lasers Eng. 105 (2018) 141-149.
[13] K. Zhang, L. Zhang, Q. Liu, et al., Fast visual tracking via dense spatio-temporal context learning, in: Eur. Conf. Comput. Vision, Springer, 2014, pp. 127-141.
[14] Y. Zou, X. Chen, G. Gong, et al., A seam tracking system based on a laser vision sensor, Measurement 127 (2018) 489-500.
[15] S.M. Marvasti-Zadeh, H. Ghanei-Yakhdan, S. Kasaei, Adaptive spatio-temporal context learning for visual target tracking, in: The 10th Iranian Conference on Machine Vision and Image Processing, 2017, pp. 22-23.
[16] X. Gao, X. Zhong, D. You, et al., Kalman filtering compensated by radial basis function neural network for seam tracking of laser welding, IEEE Trans. Control Syst. Technol. 21 (5) (2013) 1916-1923.
[17] S. Xiong, Z. Zhou, Neural filtering of colored noise based on Kalman filter structure, IEEE Trans. Instrum. Meas. 52 (3) (2003) 742-747.
[18] X. Gao, D. You, S. Katayama, Seam tracking monitoring based on adaptive Kalman filter embedded Elman neural network during high-power fiber laser welding, IEEE Trans. Ind. Electron. 59 (11) (2012) 4315-4325.
[19] X. Gao, S.J. Na, Seam tracking based on estimation of weld position using Kalman filtering, Sci. Technol. Weld. Joining 10 (1) (2005) 103-109.
[20] T. Basar, A new approach to linear filtering and prediction problems, in: Control Theory: Twenty-Five Seminal Papers, Wiley-IEEE Press, 2009, pp. 1932-1981.
[21] L. Shi, X. Tian, C. Zhang, Automatic programming for industrial robot to weld intersecting pipes, Int. J. Adv. Manuf. Technol. 81 (9-12) (2015) 2099-2107.
[22] Y. Zhang, X. Lv, L. Xu, et al., A segmentation planning method based on the change rate of cross-sectional area of single V-groove for robotic multi-pass welding in intersecting pipe-pipe joint, Int. J. Adv. Manuf. Technol. 101 (2019) 23-38.
[23] H. Fang, S. Ong, A. Nee, Robot path planning optimization for welding complex joints, Int. J. Adv. Manuf. Technol. 90 (2017) 3829-3839.
[24] Y. Xu, G. Fang, S. Chen, J. Zou, Z. Ye, Real-time image processing for vision-based weld seam tracking in robotic GMAW, Int. J. Adv. Manuf. Technol. 73 (9-12) (2014) 1413-1425.
[25] R. Du, Y. Xu, Z. Hou, J. Shu, S. Chen, Strong noise image processing for vision-based seam tracking in robotic gas metal arc welding, Int. J. Adv. Manuf. Technol. 101 (2019) 2135-2149.