Robotics and Computer-Integrated Manufacturing 30 (2014) 160–171
Measurement error analysis and accuracy enhancement of 2D vision system for robotic drilling

Weidong Zhu*, Biao Mei, Guorui Yan, Yinglin Ke

Department of Mechanical Engineering, Zhejiang University, Hangzhou 310027, China
Article history: Received 5 June 2013; received in revised form 12 September 2013; accepted 24 September 2013; available online 18 October 2013.
Robotic drilling for aircraft structures demands higher accuracy from industrial robots than their traditional applications do. Positioning error measurement and compensation based on a 2D vision system is a cost-effective way to improve the positioning accuracy in robotic drilling. In this paper, we first discuss the principle of error measurement and compensation with a 2D vision system for robotic drilling, and the determination of the tool center point of the vision system such that Abbe errors are eliminated in the measurement process. Measurement errors due to nonideal measurement conditions, i.e. nonperpendicularity of the camera optical axis to the workpiece surface and incorrect object distance, are mathematically modeled and experimentally verified. A method utilizing four laser displacement sensors is proposed to ensure perpendicularity of the camera optical axis to the workpiece surface and a correct object distance in the measurement process, and hence to achieve high accuracy in 2D vision-based measurement. Experiments performed on a robotic drilling system show that the 2D vision system can achieve an accuracy of approximately 0.1 mm with the proposed method. © 2013 Elsevier Ltd. All rights reserved.
Keywords: Robotic drilling; 2D vision system; Error measurement; Measurement accuracy; Perpendicularity; Object distance
1. Introduction

Drilling fastener holes in modern aircraft structures is a difficult task due to the large number of fastener holes and the extensive use of hard-to-machine materials (titanium, CFRP, etc.) in these structures. Manual drilling is labor-intensive and time-consuming, may cause health problems for the workers, and cannot guarantee the quality of the drilled holes [1]. Owing to their relatively low investment, high flexibility and satisfactory drilling quality, robotic drilling systems have been developed and studied by researchers in both academia and the aerospace industry. Olsson [2] developed an effective robotic drilling prototype system, demonstrating the great potential of high-precision industrial-robot-based drilling systems in the field of aircraft assembly. Electroimpact Inc. developed the ONCE robotic drilling system and used it to drill, countersink and inspect fastener holes on the wing trailing edge flaps of Boeing's F/A-18E/F Super Hornet fighter aircraft [3].

In the robotic drilling process, Computer-Aided Design (CAD) models of the workpiece are used as the basis for the generation of robot programs. However, the CAD model does not accurately coincide with reality regarding the shapes, positions and orientations of the workpieces and jigs, leading to position errors of the
* Corresponding author. Tel./fax: +86 571 87953929. E-mail address: [email protected] (W. Zhu).
0736-5845/$ - see front matter © 2013 Elsevier Ltd. All rights reserved. http://dx.doi.org/10.1016/j.rcim.2013.09.014
fastener holes created by the robotic drilling system. In order to improve the position accuracy of drilled holes, robot programs should be created in accordance with the actual assembly status (shape, position and orientation) of the workpiece. Furthermore, the positioning errors of industrial robots should be compensated for, since robots have only marginal positioning accuracy compared with the high position accuracy of fastener holes required by the aerospace industry [2,4,5].

Owing to their non-contact nature and high precision, vision systems are widely applied in industrial activities such as robotic assembly [6], robotic welding [7], robot calibration [8], localization [9], inspection of surface quality [10], evaluation of surface roughness [11] and automated micromanipulation [12]. Compared with general-purpose high-accuracy measurement systems (e.g. portable CMMs), vision systems with a 2D camera have a lower cost and are technically feasible for the measurement requirements of robotic drilling. By scanning the workpiece with a vision system, the ONCE system can determine the workpiece position and correct the drilling position by comparing the theoretical and actual positions of the workpiece. Liang [13] developed a drill end-effector (DrEE) for robotic drilling, which integrated a vision system for the measurement of workpiece location and welding seams. A robotic drilling system developed at Zhejiang University also integrates a vision system, which is used to measure reference holes; refer to Fig. 1. The high quality standard of the aerospace industry requires that the position accuracy of fastener holes be kept within ±0.2 mm
Fig. 1. Robotic drilling system developed at Zhejiang University.
[14], which in turn places a stringent requirement on the measurement accuracy of the vision system integrated into the robotic drilling system. Since image processing and camera calibration are well-developed technologies [15], the major factors influencing the measurement accuracy of a 2D vision system are nonideal measurement conditions in the measurement process rather than defects of the vision system itself. In this paper, we discuss the measurement errors of the vision system due to nonperpendicularity between the camera optical axis and the workpiece surface and due to an incorrect object distance between the camera lens and the workpiece surface. We also introduce a method to ensure perpendicularity and correct object distance in the vision-based measurement process, which takes advantage of the laser displacement sensors originally used for normality adjustment of the drill axis relative to the workpiece surface in the robotic drilling process.

The paper is structured as follows. In Section 2, we present the theoretical foundations for the measurement of relative errors between the drill and the workpiece with a 2D vision system, and determine the most appropriate choice of the tool center point (TCP) of the camera such that the Abbe errors are eliminated in the measurement process. In Section 3, a mathematical model of measurement errors caused by nonperpendicularity is deduced. In Section 4, a mathematical model of measurement errors caused by incorrect object distance is deduced. In Section 5, a method to ensure perpendicularity of the camera optical axis to the workpiece surface and correct object distance is proposed, in which four laser displacement sensors provide the feedback information for adjusting the pose of the camera with respect to the workpiece. In Section 6, extensive experimental results are provided. Results of experiments performed on a coordinate measuring machine (CMM) verify the correctness of the proposed mathematical models of measurement errors, and experiments performed on a robotic drilling system show that the 2D vision system can achieve an accuracy of approximately 0.1 mm by eliminating the error sources with the proposed methods.
2. Relative error measurement in robotic drilling

2.1. Introduction to the robotic drilling end-effector

The robotic drilling end-effector consists of a robot flange interface, a 2D vision system, laser displacement sensors, a pressure foot, a spindle, linear guideways, linear scales and so on, as shown in Fig. 2. The drilling end-effector is connected to the industrial robot through the robot flange. The 2D vision system is used to locate reference holes
Fig. 2. Robotic drilling end-effector.
Fig. 3. Deviation between the drill TCP and the hole frame.
to correct the drilling positions of fastener holes. The laser displacement sensors provide feedback for the adjustment of the end-effector orientation. The pressure foot clamps the workpiece to stabilize the robotic drilling system during the drilling cycle, and the spindle moves along the linear guideways to provide the feed motion in the drilling process.

2.2. Error analysis for robotic drilling

In robotic drilling, fastener holes are drilled using programs generated in the off-line programming system based on the nominal CAD model of the robotic drilling workcell. Ideally, as shown in Fig. 3 (large aircraft structures can be approximated locally by a plane [15]), the drill TCP and the hole frame should be coincident in the drilling process and the drilled hole has no position error, i.e.

$$T_{\mathrm{Drill\,TCP\,Nominal}} = T_{\mathrm{Hole\,Frame\,Nominal}} \tag{1}$$

where $T_{\mathrm{Drill\,TCP\,Nominal}}$ and $T_{\mathrm{Hole\,Frame\,Nominal}}$ are $4\times 4$ homogeneous transformation matrices. However, due to the positioning errors of the robot and the installation errors of the workpiece, the actual drill TCP and the actual hole frame
are different from their nominal counterparts, and can be represented using homogeneous transformation matrices and the small-angle approximation as Eqs. (2) and (3), respectively [16]:

$$T_{\mathrm{Drill\,TCP\,Actual}} = T_{\mathrm{Drill\,TCP\,Nominal}} \begin{bmatrix} 1 & -\varepsilon_z(\theta) & \varepsilon_y(\theta) & \delta_x(\theta) \\ \varepsilon_z(\theta) & 1 & -\varepsilon_x(\theta) & \delta_y(\theta) \\ -\varepsilon_y(\theta) & \varepsilon_x(\theta) & 1 & \delta_z(\theta) \\ 0 & 0 & 0 & 1 \end{bmatrix} \tag{2}$$

where $\delta_x(\theta)$, $\delta_y(\theta)$, $\delta_z(\theta)$ are the translational errors of the actual drill TCP relative to the nominal drill TCP, $\varepsilon_x(\theta)$, $\varepsilon_y(\theta)$, $\varepsilon_z(\theta)$ are the rotational errors of the actual drill TCP relative to the nominal drill TCP, and $\theta$ is the vector of joint angles of the robot.

$$T_{\mathrm{Hole\,Frame\,Actual}} = T_{\mathrm{Hole\,Frame\,Nominal}} \begin{bmatrix} 1 & -\varepsilon_z(i) & \varepsilon_y(i) & \delta_x(i) \\ \varepsilon_z(i) & 1 & -\varepsilon_x(i) & \delta_y(i) \\ -\varepsilon_y(i) & \varepsilon_x(i) & 1 & \delta_z(i) \\ 0 & 0 & 0 & 1 \end{bmatrix} \tag{3}$$
Fig. 4. Aircraft panel structure on which fastener holes are to be drilled.
where $\delta_x(i)$, $\delta_y(i)$, $\delta_z(i)$ are the translational errors of the actual hole frame relative to the nominal hole frame, $\varepsilon_x(i)$, $\varepsilon_y(i)$, $\varepsilon_z(i)$ are the rotational errors of the actual hole frame relative to the nominal hole frame, and $i$ is the index of the hole. Thus, there are relative errors between the actual drill TCP and the actual hole frame (where the hole should be drilled), which can be described by

$$^{\mathrm{Drill\,TCP\,Actual}}T_{\mathrm{Hole\,Frame\,Actual}} = T_{\mathrm{Drill\,TCP\,Actual}}^{-1}\,T_{\mathrm{Hole\,Frame\,Actual}} = \begin{bmatrix} 1 & \varepsilon_z(\theta)-\varepsilon_z(i) & \varepsilon_y(i)-\varepsilon_y(\theta) & \delta_x(i)-\delta_x(\theta) \\ \varepsilon_z(i)-\varepsilon_z(\theta) & 1 & \varepsilon_x(\theta)-\varepsilon_x(i) & \delta_y(i)-\delta_y(\theta) \\ \varepsilon_y(\theta)-\varepsilon_y(i) & \varepsilon_x(i)-\varepsilon_x(\theta) & 1 & \delta_z(i)-\delta_z(\theta) \\ 0 & 0 & 0 & 1 \end{bmatrix} \tag{4}$$
The rotational part of Eq. (4) is denoted by the rotation matrix

$$\mathrm{RotErr} = \begin{bmatrix} 1 & \varepsilon_z(\theta)-\varepsilon_z(i) & \varepsilon_y(i)-\varepsilon_y(\theta) \\ \varepsilon_z(i)-\varepsilon_z(\theta) & 1 & \varepsilon_x(\theta)-\varepsilon_x(i) \\ \varepsilon_y(\theta)-\varepsilon_y(i) & \varepsilon_x(i)-\varepsilon_x(\theta) & 1 \end{bmatrix} \tag{5}$$
which causes a perpendicularity error of the drilled hole axis with respect to the workpiece surface. The rotational errors can be measured and compensated for using methods such as those proposed in [17], as well as the method outlined in Section 5. The translational errors of the actual drill TCP relative to the actual hole frame can be represented as

$$\mathrm{TrErr} = \begin{bmatrix} \delta_x(i)-\delta_x(\theta) \\ \delta_y(i)-\delta_y(\theta) \\ \delta_z(i)-\delta_z(\theta) \\ 1 \end{bmatrix} \tag{6}$$

Since the drill is perpendicular to the workpiece surface in the drilling process, the translational error $\delta_z(i)-\delta_z(\theta)$ along the Z-axis of the drill TCP is irrelevant to the position accuracy of the drilled hole. The translational errors $\delta_x(i)-\delta_x(\theta)$ and $\delta_y(i)-\delta_y(\theta)$ of the drill TCP relative to the hole frame directly affect the position accuracy of drilled holes and should be measured and compensated for in the drilling process.
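The relative-error chain of Eqs. (1)–(6) can be sketched numerically. The following is an illustrative sketch, not the authors' implementation: the error values are hypothetical, the nominal drill TCP and hole frame are taken as the identity, and the signs follow the usual small-angle convention.

```python
import numpy as np

def error_transform(eps, delta):
    """Small-angle homogeneous error transform as in Eqs. (2) and (3):
    eps = (ex, ey, ez) rotational errors [rad], delta = (dx, dy, dz) translations [mm]."""
    ex, ey, ez = eps
    dx, dy, dz = delta
    return np.array([
        [1.0, -ez,  ey,  dx],
        [ ez, 1.0, -ex,  dy],
        [-ey,  ex, 1.0,  dz],
        [0.0, 0.0, 0.0, 1.0],
    ])

# Hypothetical robot errors (theta) and hole installation errors (i);
# nominal frames are assumed coincident, cf. Eq. (1).
T_drill_actual = error_transform((1e-3, -2e-3, 0.5e-3), (0.12, -0.08, 0.05))
T_hole_actual  = error_transform((2e-4,  4e-4, -1e-4), (0.03,  0.06, -0.02))

# Relative pose of the actual hole frame w.r.t. the actual drill TCP, Eq. (4).
T_rel = np.linalg.inv(T_drill_actual) @ T_hole_actual

# In-plane translational errors, Eq. (6); the Z component does not affect the
# hole position because the drill is normal to the surface while drilling.
dx_err, dy_err = T_rel[0, 3], T_rel[1, 3]
print(dx_err, dy_err)  # close to (0.03 - 0.12, 0.06 + 0.08) = (-0.09, 0.14)
```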
Fig. 5. Reference hole measurement with 2D vision system.
2.3. Relative position error measurement with 2D vision system

For a typical aircraft panel structure such as that shown in Fig. 4, there are thousands of fastener holes to be drilled in the assembly process. In order to improve the position accuracy of drilled holes, the panel structure is divided into several zones, each of which is surrounded by several reference holes. The position errors of the reference holes are measured, and compensation values for each fastener hole can then be determined using an interpolation method [18]. By moving the vision system to the nominal position of a reference hole with the robot, the actual reference hole pre-drilled on the panel structure can be measured by the vision system, and the relative errors $\delta_x(i)-\delta_x(\theta)$ and $\delta_y(i)-\delta_y(\theta)$ between the camera TCP and the actual reference hole can be determined. Of course, the camera TCP must first be defined. For the 2D vision system, a natural coordinate system is the camera coordinate system (Camera CS), refer to Fig. 5, where the well-known pin-hole model of the camera is adopted [19]. However, if the Camera CS is used as the camera TCP in measurement, large Abbe errors will occur in the measured results.

Here, we first analyze the case in which the Camera CS is defined as the camera TCP. From Fig. 5, it can be deduced that

$$^{\mathrm{Camera\,CS}}T_{\mathrm{Camera\,TCP}} = I \tag{7}$$

$$^{\mathrm{Camera\,TCP\,Nominal}}T_{\mathrm{Hole\,Frame\,Nominal}} = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & z \\ 0 & 0 & 0 & 1 \end{bmatrix} \tag{8}$$

where $z$ is the in-focus object distance of the 2D vision system.
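The pin-hole model referred to above maps a Camera CS point $(X, Y, Z)$ to the image plane via $u = f_x X/Z + c_x$, $v = f_y Y/Z + c_y$. A minimal sketch with assumed, purely hypothetical intrinsics (real values come from camera calibration):

```python
# Hypothetical intrinsics; real values are obtained from camera calibration.
fx, fy = 8000.0, 8000.0   # focal lengths in pixels (assumed)
cx, cy = 640.0, 512.0     # principal point in pixels (assumed)

def project(P):
    """Pin-hole projection of a Camera-CS point P = (X, Y, Z) [mm] to pixels."""
    X, Y, Z = P
    return fx * X / Z + cx, fy * Y / Z + cy

# A point 1 mm off-axis at the in-focus object distance z = 214 mm
u, v = project((1.0, -0.5, 214.0))
print(u, v)
```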
The actual position of the camera TCP is

$$T_{\mathrm{Camera\,TCP\,Actual}} = T_{\mathrm{Camera\,TCP\,Nominal}} \begin{bmatrix} 1 & -\varepsilon_z(\theta) & \varepsilon_y(\theta) & \delta_x(\theta) \\ \varepsilon_z(\theta) & 1 & -\varepsilon_x(\theta) & \delta_y(\theta) \\ -\varepsilon_y(\theta) & \varepsilon_x(\theta) & 1 & \delta_z(\theta) \\ 0 & 0 & 0 & 1 \end{bmatrix} \tag{9}$$

and the actual hole frame is

$$T_{\mathrm{Hole\,Frame\,Actual}} = T_{\mathrm{Hole\,Frame\,Nominal}} \begin{bmatrix} 1 & -\varepsilon_z(i) & \varepsilon_y(i) & \delta_x(i) \\ \varepsilon_z(i) & 1 & -\varepsilon_x(i) & \delta_y(i) \\ -\varepsilon_y(i) & \varepsilon_x(i) & 1 & \delta_z(i) \\ 0 & 0 & 0 & 1 \end{bmatrix} \tag{10}$$

Thus, the pose deviation of the actual hole frame with respect to the actual camera TCP (equal to the Camera CS) is

$$^{\mathrm{Camera\,CS}}T_{\mathrm{Hole\,Frame\,Actual}} = {}^{\mathrm{Camera\,TCP\,Actual}}T_{\mathrm{Hole\,Frame\,Actual}} = T_{\mathrm{Camera\,TCP\,Actual}}^{-1}\,T_{\mathrm{Hole\,Frame\,Actual}}$$
$$= \begin{bmatrix} 1 & \varepsilon_z(\theta)-\varepsilon_z(i) & \varepsilon_y(i)-\varepsilon_y(\theta) & \delta_x(i)-\delta_x(\theta)+\varepsilon_y(\theta)z \\ \varepsilon_z(i)-\varepsilon_z(\theta) & 1 & \varepsilon_x(\theta)-\varepsilon_x(i) & \delta_y(i)-\delta_y(\theta)-\varepsilon_x(\theta)z \\ \varepsilon_y(\theta)-\varepsilon_y(i) & \varepsilon_x(i)-\varepsilon_x(\theta) & 1 & \delta_z(i)-\delta_z(\theta)+z \\ 0 & 0 & 0 & 1 \end{bmatrix} \tag{11}$$

As a result, the Abbe errors $\varepsilon_y(\theta)z$ and $\varepsilon_x(\theta)z$ due to the inaccuracy of the robot are contained in the measurement results.

In this paper, we therefore propose to define the camera TCP as an offset frame of the Camera CS:

$$^{\mathrm{Camera\,CS}}T_{\mathrm{Camera\,TCP}} = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & z \\ 0 & 0 & 0 & 1 \end{bmatrix} \tag{12}$$

$$^{\mathrm{Camera\,TCP\,Nominal}}T_{\mathrm{Hole\,Frame\,Nominal}} = I \tag{13}$$

In this case, the actual hole frame with respect to the actual camera TCP is

$$^{\mathrm{Camera\,TCP\,Actual}}T_{\mathrm{Hole\,Frame\,Actual}} = T_{\mathrm{Camera\,TCP\,Actual}}^{-1}\,T_{\mathrm{Hole\,Frame\,Actual}} = \begin{bmatrix} 1 & \varepsilon_z(\theta)-\varepsilon_z(i) & \varepsilon_y(i)-\varepsilon_y(\theta) & \delta_x(i)-\delta_x(\theta) \\ \varepsilon_z(i)-\varepsilon_z(\theta) & 1 & \varepsilon_x(\theta)-\varepsilon_x(i) & \delta_y(i)-\delta_y(\theta) \\ \varepsilon_y(\theta)-\varepsilon_y(i) & \varepsilon_x(i)-\varepsilon_x(\theta) & 1 & \delta_z(i)-\delta_z(\theta) \\ 0 & 0 & 0 & 1 \end{bmatrix} \tag{14}$$

and the actual hole frame with respect to the actual Camera CS is

$$^{\mathrm{Camera\,CS}}T_{\mathrm{Hole\,Frame\,Actual}} = {}^{\mathrm{Camera\,CS}}T_{\mathrm{Camera\,TCP}}\;{}^{\mathrm{Camera\,TCP\,Actual}}T_{\mathrm{Hole\,Frame\,Actual}} = \begin{bmatrix} 1 & \varepsilon_z(\theta)-\varepsilon_z(i) & \varepsilon_y(i)-\varepsilon_y(\theta) & \delta_x(i)-\delta_x(\theta) \\ \varepsilon_z(i)-\varepsilon_z(\theta) & 1 & \varepsilon_x(\theta)-\varepsilon_x(i) & \delta_y(i)-\delta_y(\theta) \\ \varepsilon_y(\theta)-\varepsilon_y(i) & \varepsilon_x(i)-\varepsilon_x(\theta) & 1 & \delta_z(i)-\delta_z(\theta)+z \\ 0 & 0 & 0 & 1 \end{bmatrix} \tag{15}$$

The Abbe errors are thus eliminated from the translational errors of the reference hole frame with respect to the Camera CS along the X- and Y-axes.

Strictly speaking, the relative errors between the reference hole frame and the drill TCP, rather than the camera TCP, should be used in error compensation. In this paper, the relative errors measured by the vision system can nevertheless be used effectively for error compensation, because the offset between the camera TCP and the drill TCP is small and industrial robots maintain high accuracy (close to their repeatability) when the changes of joint angles are small [19]. Suppose the robot joint angles are $(\theta_1, \theta_2, \ldots, \theta_6)$ when the camera TCP is positioned at the measurement position, while the joint angles are $(\theta_1+\Delta\theta_1, \theta_2+\Delta\theta_2, \ldots, \theta_6+\Delta\theta_6)$ when the drill TCP is positioned at the same position, where $(\Delta\theta_1, \Delta\theta_2, \ldots, \Delta\theta_6)$ are small values since the offset between the camera TCP and the drill TCP is small; the robot positioning errors are then similar at these two positions. For example, the translational error of the robot drill TCP along its X-axis is

$$\delta_x(\theta_1+\Delta\theta_1, \ldots, \theta_6+\Delta\theta_6) = \delta_x(\theta_1, \ldots, \theta_6) + \frac{\partial\delta_x}{\partial\theta_1}\Delta\theta_1 + \frac{\partial\delta_x}{\partial\theta_2}\Delta\theta_2 + \cdots + \frac{\partial\delta_x}{\partial\theta_6}\Delta\theta_6 \tag{16}$$

As long as the positioning errors of the robot change smoothly with respect to its joint angles, the partial derivatives $(\partial\delta_x/\partial\theta_1, \ldots, \partial\delta_x/\partial\theta_6)$ take small values, and thus $\delta_x(\theta) \approx \delta_x(\theta+\Delta\theta)$.
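The effect of the camera-TCP choice on the Abbe errors can be checked numerically. This is an illustrative sketch with hypothetical robot and hole errors (the signs of the coupling terms depend on the chosen rotation convention); it shows that defining the camera TCP as the Camera CS offset by the object distance z removes the ε·z terms from the measured X/Y errors:

```python
import numpy as np

def error_transform(eps, delta):
    # Small-angle homogeneous error transform, cf. Eqs. (2) and (3)
    ex, ey, ez = eps
    dx, dy, dz = delta
    return np.array([[1.0, -ez, ey, dx],
                     [ez, 1.0, -ex, dy],
                     [-ey, ex, 1.0, dz],
                     [0.0, 0.0, 0.0, 1.0]])

z = 214.0                      # in-focus object distance, mm
Tz = np.eye(4)
Tz[2, 3] = z                   # offset along the optical axis

E_robot = error_transform((5e-4, 1e-3, 0.0), (0.05, -0.03, 0.02))  # hypothetical
E_hole  = error_transform((0.0, 0.0, 0.0), (0.10, 0.08, 0.0))      # hypothetical

# Camera TCP = Camera CS (Eq. (11)): hole frame nominally lies z ahead of the TCP.
rel_cs = np.linalg.inv(E_robot) @ Tz @ E_hole

# Camera TCP = Camera CS shifted by z (Eqs. (12)-(14)): nominal relative pose is I.
rel_tcp = np.linalg.inv(Tz @ E_robot) @ Tz @ E_hole

# The X errors differ by an Abbe term of magnitude |eps_y(theta)|*z = 0.214 mm here.
print(rel_cs[0, 3] - rel_tcp[0, 3], rel_tcp[0, 3])
```

With the TCP offset, the measured X error stays close to the true relative error (0.05 mm here) instead of being swamped by the 0.214 mm Abbe term.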
3. Nonperpendicularity-induced measurement errors

The influence of nonperpendicularity on the measurement accuracy of the 2D vision system is analyzed in this section using the conventional pin-hole imaging model. The imaging principle is shown in Fig. 6(a) with a 2D illustration. Suppose $O$ is the optical center of the lens, $OA$ is the optical axis of the camera, $A$ is the point of intersection between the optical axis and the workpiece surface, $I_1I_2$ is the image plane of the camera, $B_1B_2$ is a workpiece surface perpendicular to the optical axis, and $E_1E_2$ is a workpiece surface rotated by an angle $\theta$ with respect to $B_1B_2$. Ideally, the image points of object points $B_1$ and $B_2$ are $I_1$ and $I_2$, respectively. When the plane $B_1B_2$ is rotated to the plane $E_1E_2$, the image points move to $H_1$ and $H_2$.

Fig. 6. Measurement errors caused by nonperpendicularity. (a) 2D illustration and (b) surface model of measurement errors.

In the measurement process, the distances $AB_1$ and $AB_2$ between points on the workpiece surface are the quantities to be measured by the vision system; they are calculated from the corresponding positions on the image plane and the camera parameters (calibrated and fixed). Thus an incorrect position of an image point causes errors in the quantities measured by the 2D vision system. The two points $B_1$ and $B_2$ lie on the two sides of the optical axis with $AB_1 = AB_2$; suppose the length of the two line segments is $s$. When the normal of the workpiece surface rotates from its ideal position by an angle $\theta$, points $B_1$ and $B_2$ on the workpiece surface change to $E_1$ and $E_2$, while the lengths remain constant, i.e. $AE_1 = AE_2 = s$. Also suppose the object distance is $OA = z$ when measuring the workpiece. According to Fig. 6(a), $\triangle OAC_1 \sim \triangle E_1D_1C_1$, so

$$\frac{AC_1}{OA} = \frac{C_1D_1}{E_1D_1} = \frac{AE_1\cos\theta - AC_1}{AE_1\sin\theta}, \qquad AC_1 = \frac{zs\cos\theta}{z + s\sin\theta} \tag{17}$$

hence

$$\Delta s = AC_1 - AB_1 = \frac{zs\cos\theta}{z + s\sin\theta} - s \quad (s > 0,\ \theta > 0) \tag{18}$$

Since $\triangle OFE_2 \sim \triangle OAC_2$,

$$\frac{OF}{OA} = \frac{E_2F}{AC_2}, \qquad AC_2 = \frac{OA\cdot AE_2\cos\theta}{OA - AE_2\sin\theta} = \frac{zs\cos\theta}{z - s\sin\theta} \tag{19}$$

and

$$\Delta s = AC_2 - AB_2 = \frac{zs\cos\theta}{z - s\sin\theta} - s \quad (s > 0,\ \theta > 0) \tag{20}$$

Defining $\theta = \angle OAE_1 - 90°$ ($\theta > 0$) for Eq. (18) and $\theta = \angle OAE_2 - 90°$ ($\theta < 0$) for Eq. (20), the two equations can be combined into a single expression:

$$\Delta s = \frac{zs\cos\theta}{z + s\sin\theta} - s \tag{21}$$

Fig. 7. Measurement errors caused by incorrect object distance. (a) 2D illustration and (b) surface model of measurement errors.

Fig. 8. Procedures for position error measurement with 2D vision system.

Fig. 6(b) shows the nonperpendicularity-induced measurement errors of the 2D vision system under typical measurement conditions of robotic drilling, where the object distance is $z = 214$ mm, the degree of nonperpendicularity is $-5° \le \theta \le 5°$, and the distance between the geometric feature and the optical axis is $0 \le s \le 10$ mm. It can be observed that for $-5° \le \theta \le 5°$ and $0 \le s \le 10$ mm the maximum measurement error is 0.0785 mm, which contributes significantly to the total measurement error of the vision system. In order to achieve the highest measurement accuracy of the vision system in robotic drilling, perpendicularity between the optical axis and the workpiece surface should be strictly controlled.
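Eq. (21) can be evaluated directly to reproduce the maximum error quoted for Fig. 6(b); a minimal sketch:

```python
import numpy as np

def nonperp_error(z, s, theta):
    """Measurement error from Eq. (21): zs*cos(theta)/(z + s*sin(theta)) - s."""
    return z * s * np.cos(theta) / (z + s * np.sin(theta)) - s

z = 214.0                                        # object distance, mm
theta = np.deg2rad(np.linspace(-5.0, 5.0, 201))  # nonperpendicularity, rad
s = np.linspace(0.0, 10.0, 201)                  # offset from the optical axis, mm
S, TH = np.meshgrid(s, theta)
max_err = float(np.abs(nonperp_error(z, S, TH)).max())
print(max_err)  # ~0.0785 mm, the value quoted for Fig. 6(b)
```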
4. Measurement error induced by incorrect object distance

The analysis of the influence of incorrect object distance on the measurement accuracy of the 2D vision system is also based on the conventional pin-hole imaging model of the camera. The measurement error caused by incorrect object distance is illustrated in Fig. 7(a). Suppose $O$ is the optical center of the lens, $OA$ is the optical axis of the camera, $GH$ is the image plane of the camera, and $AB$ is the workpiece surface perpendicular to the optical axis. In the ideal state, the image point of object point $B$ is $H$; when the object distance increases to $OA_1$, the corresponding image point moves to $H_1$, and when the object distance decreases to $OA_2$, the corresponding image point moves to $H_2$. As a result, measurement errors occur when

Fig. 9. Laser displacement sensors on the end-effector.
Table 1. Statistical data of the measurement errors (mm).

Experimental condition | CMM axis | Upper limit | Lower limit | Average | Standard deviation
212 mm–90° | X | 0.065159 | −0.075520 | 0.002990 | 0.041040
212 mm–90° | Z | 0.067161 | −0.069480 | 0.001452 | 0.040438
214 mm–90° | X | 0.010771 | −0.010350 | 0.000570 | 0.005294
214 mm–90° | Z | 0.006334 | −0.014300 | 0.003360 | 0.004723
216 mm–90° | X | 0.069048 | −0.069630 | 0.000620 | 0.039266
216 mm–90° | Z | 0.071178 | −0.073910 | 0.004070 | 0.041110
212 mm–89° | X | 0.048848 | −0.045540 | 0.001810 | 0.025508
212 mm–89° | Z | 0.055095 | −0.040110 | 0.005506 | 0.027345
214 mm–89° | X | 0.033078 | −0.031550 | 0.001600 | 0.016257
214 mm–89° | Z | 0.026865 | −0.034130 | 6.6E−05 | 0.014695
216 mm–89° | X | 0.096166 | −0.088660 | 0.002481 | 0.053837
216 mm–89° | Z | 0.088290 | −0.097950 | 0.000220 | 0.053433
212 mm–87° | X | 0.044349 | −0.016530 | 0.006530 | 0.014714
212 mm–87° | Z | 0.055269 | −0.040460 | 0.000831 | 0.022920
214 mm–87° | X | 0.057359 | −0.043380 | 0.003996 | 0.026997
214 mm–87° | Z | 0.033499 | −0.055980 | 0.002200 | 0.021409
216 mm–87° | X | 0.119293 | −0.100450 | 0.003950 | 0.065451
216 mm–87° | Z | 0.093867 | −0.117250 | 0.000790 | 0.059548
212 mm–85° | X | 0.033557 | −0.013660 | 0.005352 | 0.010532
212 mm–85° | Z | 0.045272 | −0.041060 | 0.000280 | 0.016620
214 mm–85° | X | 0.094429 | −0.060240 | 0.005194 | 0.043216
214 mm–85° | Z | 0.059099 | −0.061664 | 0.000500 | 0.026897
216 mm–85° | X | 0.183430 | −0.455760 | 0.082790 | 0.147610
216 mm–85° | Z | 0.097201 | −0.332920 | 0.024000 | 0.064998

Fig. 10. Error measurement with laser displacement sensors.

Fig. 11. The experimental platform for the verification of measurement error models.
the geometric dimension of the object is calculated from an image acquired at an incorrect object distance. Let the object distance be $OA = z$, and suppose the object distance changes by $\Delta z$; then $OA_1 = z + \Delta z$ and $OA_2 = z - \Delta z$. For clarity, define $AB = A_1E = A_2F = s$. From the similarity $\triangle OAC \sim \triangle OA_1E$, it follows that

$$\tan\theta_1 = \frac{A_1E}{OA_1} = \frac{AC}{OA}, \qquad AC = \frac{OA\cdot A_1E}{OA_1} = \frac{zs}{z + \Delta z} \tag{22}$$

and the measurement error is

$$\Delta s = AC - AB = \frac{zs}{z + \Delta z} - s \quad (s > 0) \tag{23}$$

Similarly, from $\triangle OAD \sim \triangle OA_2F$, we get

$$\tan\theta_2 = \frac{A_2F}{OA_2} = \frac{AD}{OA}, \qquad AD = \frac{OA\cdot A_2F}{OA_2} = \frac{zs}{z - \Delta z} \tag{24}$$

hence the measurement error is

$$\Delta s = AD - AB = \frac{zs}{z - \Delta z} - s \quad (s > 0) \tag{25}$$

Taking $\Delta z$ to be positive when the object distance increases (and vice versa), Eqs. (23) and (25) can be combined into a unified model of measurement error:

$$\Delta s = \frac{zs}{z + \Delta z} - s \tag{26}$$

Fig. 7(b) shows measurement errors induced by incorrect object distance under typical measurement conditions of robotic drilling, where the nominal object distance is $z = 214$ mm, the variation of object distance is $-2\ \mathrm{mm} \le \Delta z \le 2\ \mathrm{mm}$, and the distance between the geometric feature and the optical axis is $0 \le s \le 10$ mm. It can be observed that for $-2\ \mathrm{mm} \le \Delta z \le 2\ \mathrm{mm}$ (the accuracy of industrial robots and the accuracy of the assembly of aircraft panel structures are usually at the mm level) and $0 \le s \le 10$ mm, the maximum measurement error is 0.0943 mm, which is significant for a 2D vision system in robotic drilling. Therefore the object distance should also be controlled effectively.

5. Measurement accuracy enhancement by perpendicularity and object distance control in the measurement process

When reference holes are measured with the 2D vision system, the robot is driven to a position defined by the nominal hole frame, and relative errors exist between the camera TCP and the actual hole frame, which can be represented using the homogeneous transformation matrix

$$\begin{bmatrix} 1 & -\varepsilon_z & \varepsilon_y & \delta_x \\ \varepsilon_z & 1 & -\varepsilon_x & \delta_y \\ -\varepsilon_y & \varepsilon_x & 1 & \delta_z \\ 0 & 0 & 0 & 1 \end{bmatrix} \tag{27}$$
where the translational errors $\delta_x$ and $\delta_y$ are to be measured by the 2D vision system. When the camera TCP is positioned by the robot using the nominal frame of the reference hole, the rotational errors $\varepsilon_x$ around the X-axis and $\varepsilon_y$ around the Y-axis cause a perpendicularity error between the optical axis of the camera and the workpiece surface, leading to measurement errors of the 2D vision system. The translational error $\delta_z$ along the Z-axis causes an inaccurate object distance and also introduces measurement errors. Therefore, we propose a method for enhancing the measurement accuracy of the 2D vision system, in which the rotational errors $\varepsilon_x$ and $\varepsilon_y$ as well as the
Fig. 12. Measurement errors under the condition 214 mm–901.
translational error $\delta_z$ are measured and compensated for before the 2D vision system starts to measure the reference hole. The procedure of the method is shown in Fig. 8.

In the proposed method, the rotational errors $\varepsilon_x$, $\varepsilon_y$ and the translational error $\delta_z$ are measured by four laser displacement sensors, which are primarily used to measure the perpendicularity error between the drill axis and the workpiece surface in the drilling process; refer to Fig. 9. The optical axis of the camera is adjusted to be parallel to the drill axis during installation, and each axis of the drill TCP is chosen to be parallel to the corresponding axis of the camera TCP. When the drill TCP and the camera TCP are positioned to the same hole frame, the corresponding robot positions in joint space are close to each other. Therefore, the rotational errors $\varepsilon_x$, $\varepsilon_y$ and the translational error $\delta_z$ measured with the drill TCP can be used effectively by the 2D vision system in its measurement process.

A laser displacement sensor can be mathematically modeled by a point $(x_i, y_i, z_i)$ and a vector $(u_i, v_i, w_i)$, i.e. the virtual reference point on the laser beam where the sensor reading is zero, and the orientation of the laser beam. The point and vector with respect to the robot-mounting flange (tool0) can be determined by sensor calibration [19,20], and can then be converted into the drill TCP frame; refer to Fig. 10. Thus each point measured by a laser sensor can be represented in the drill TCP frame as

$$x = x_i + u_i L_i, \quad y = y_i + v_i L_i, \quad z = z_i + w_i L_i \tag{28}$$

where $L_i$ is the reading of the laser sensor.

With the readings of the four laser sensors, four points in the coordinate frame of the drill TCP are acquired as $(x_i + u_i L_i,\ y_i + v_i L_i,\ z_i + w_i L_i)$, $i = 1, 2, 3, 4$. Suppose the best-fit plane to these points is $ax + by + cz + d = 0$; then the rotational errors $\varepsilon_x$, $\varepsilon_y$ around the X- and Y-axes of the drill TCP can be calculated by

$$\varepsilon_x = -b/\sqrt{a^2 + b^2 + c^2}, \qquad \varepsilon_y = a/\sqrt{a^2 + b^2 + c^2} \tag{29}$$

After the drill TCP is adjusted to be perpendicular to the workpiece surface by the robot, the translational error $\delta_z$ along the Z-axis of the drill TCP can be acquired by

$$\delta_z = \frac{1}{4}\sum_{i=1}^{4} w_i (L_i - L'_i) \tag{30}$$

where the $L'_i$ are the readings of the laser sensors when measuring a plane coincident with the XY-plane of the drill TCP, determined in the calibration stage. The rotational errors $\varepsilon_x$, $\varepsilon_y$ and the translational error $\delta_z$ measured by the laser sensors are then used to generate a modified hole frame

$$T_{\mathrm{Modified\,Hole\,Frame}} = T_{\mathrm{Hole\,Frame}} \begin{bmatrix} 1 & 0 & \varepsilon_y & 0 \\ 0 & 1 & -\varepsilon_x & 0 \\ -\varepsilon_y & \varepsilon_x & 1 & \delta_z \\ 0 & 0 & 0 & 1 \end{bmatrix} \tag{31}$$

Finally, by driving the robot to the modified hole frame, the translational errors $\delta_x$ and $\delta_y$ can be accurately measured by the vision system.
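The plane-fit step of Eqs. (28)–(30) can be sketched as follows. The sensor layout, calibration data and readings below are hypothetical, and the least-squares plane is fitted in the form z = p·x + q·y + r, which is equivalent to a·x + b·y + c·z + d = 0 with (a, b, c) = (−p, −q, 1) for a near-horizontal plane; the sign convention for the tilts may differ from the paper's.

```python
import numpy as np

# Hypothetical calibration data: beam zero-points (x_i, y_i, z_i) and unit
# directions (u_i, v_i, w_i) in the drill-TCP frame; beams parallel to the drill axis.
P0 = np.array([[40.0, 0.0, 0.0], [0.0, 40.0, 0.0], [-40.0, 0.0, 0.0], [0.0, -40.0, 0.0]])
D = np.tile([0.0, 0.0, 1.0], (4, 1))
L = np.array([100.2, 100.0, 99.8, 100.0])   # sensor readings, mm (hypothetical)
L0 = np.full(4, 100.0)                      # calibration readings L'_i (hypothetical)

pts = P0 + D * L[:, None]                   # measured surface points, Eq. (28)

# Least-squares plane z = p*x + q*y + r
A = np.c_[pts[:, 0], pts[:, 1], np.ones(4)]
(p, q, r), *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
n = np.array([-p, -q, 1.0])
a, b, c = n / np.linalg.norm(n)             # unit normal of the fitted plane

eps_x = -b                                  # small-angle tilts, cf. Eq. (29)
eps_y = a
delta_z = np.mean(D[:, 2] * (L - L0))       # Eq. (30)
print(eps_x, eps_y, delta_z)
```

With these readings the surface is tilted about the Y-axis by about 0.005 rad, and the mean offset along the drill axis is zero.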
Fig. 13. Measurement errors under the condition 216 mm–901.
6. Experiments

6.1. Verification of the measurement error models of the 2D vision system

Experiments were conducted to verify the measurement error models of the 2D vision system established in this paper. The experimental platform is shown in Fig. 11; it includes a coordinate measuring machine (CMM), an industrial camera, a clamping unit, an annular light, a test piece with a 5 mm diameter reference hole, and a computer installed with the vision-based measurement software. In the experiments, measurement points are generated by moving the CMM, which has an accuracy of 5 μm. The camera was clamped in the clamping unit after the lens' focal length was fixed; the test piece with the reference hole was fixed on the Z-axis of the CMM; the camera was connected to the computer with an Ethernet cable. The position of the reference hole in the imaging plane is adjusted by moving the CMM along its X-axis and/or Z-axis, while the object distance is adjusted by moving the CMM along its Y-axis. The measurement accuracy of the 2D vision system can thus be evaluated by comparing the position of the reference hole measured by the vision system with the CMM readings.

To analyze the influence that incorrect object distance and nonperpendicularity exert on the measurement accuracy of the 2D vision system, experimental conditions are defined by "object distance–perpendicularity" combinations as follows: (1) 214 mm–90°; (2) 212 mm–90°; (3) 216 mm–90°; (4) 214 mm–89°; (5) 212 mm–89°; (6) 216 mm–89°; (7) 214 mm–87°; (8) 212 mm–87°; (9) 216 mm–87°; (10) 214 mm–85°; (11) 212 mm–85°; and (12) 216 mm–85°. In condition (1), the test piece being photographed is in focus and has no perpendicularity error; hence there is no measurement error due to incorrect object distance or nonperpendicularity.

The procedures of the experiments are as follows:

(1) Distribution of measurement points for vision-based measurement. With the object distance at 214 mm (the in-focus object distance) and the perpendicularity at 90°, adjust the CMM until the image of the reference hole is at the center of the imaging plane (the adjustment error can be kept within 0.01 mm). After the adjustment, the location of the reference hole in the XZ plane of the CMM is regarded as the origin of that plane, on which an 11 × 11 planar virtual circle array is distributed.

(2) Calibration of camera internal parameters. Move the CMM along its X direction from the left end to the right end of the planar virtual circle array. The corresponding pixel distance in the image can be obtained with the 2D vision system. The ratio of the two distances serves as the internal parameter in the X direction of the CMM; the internal parameter in the Z direction is obtained similarly. To suppress random errors, the internal parameters are defined by averaging multiple calibration results.

(3) Computation of measurement errors. Move the CMM to bring the reference hole of the test piece to the measurement points distributed in step (1), following "left-to-right" and "top-to-bottom" patterns. Measure the reference hole's location using the
Fig. 14. Measurement errors under the condition 214 mm–851.
2D vision system and compute the deviation of the measurement from the CMM readings. The measurement errors under the other measurement conditions can be obtained similarly.

By analyzing the measurement errors of the experiments, we obtain the statistical data shown in Table 1. The surface models and distribution histograms of the measurement errors under typical experimental conditions are shown in Figs. 12–14. As shown in Fig. 12, under the experimental condition 214 mm–90° the measurement errors are random. Hence the error induced by lens distortion can reasonably be ignored, and it is suitable to analyze the measurement errors of the 2D vision system with the conventional pin-hole model. However, when there is a perpendicularity error or the object distance is incorrect, systematic errors are obvious, as shown in Figs. 13 and 14.

When z = 214 mm, the perpendicularity is 90° and Δz = −2 mm or Δz = 2 mm, the differences between the theoretical measurement errors and the measured ones are shown in Fig. 15. The maximum deviations are 0.0155 mm and 0.0106 mm in Fig. 15(a) and (b), respectively. When z = 214 mm and the degree of nonperpendicularity is 5°, the differences between the theoretical errors and the measured ones are shown in Fig. 16, in which the maximum deviation is 0.0418 mm. These results verify the soundness of the theoretical derivations in Sections 3 and 4.

The experiments and analysis above show that, for the 2D vision system, the measurement error induced by lens distortion can be ignored, while the errors induced by nonperpendicularity or incorrect object distance are significant. Thus the measurement accuracy of the 2D vision system can be enhanced through control of the object distance and perpendicularity.
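The theoretical errors used in such comparisons follow directly from the error models; for example, evaluating the object-distance model of Eq. (26) over the conditions analyzed in Section 4 reproduces the quoted 0.0943 mm maximum (a minimal sketch):

```python
import numpy as np

def objdist_error(z, s, dz):
    """Measurement error from Eq. (26): zs/(z + dz) - s; dz > 0 means a larger object distance."""
    return z * s / (z + dz) - s

z = 214.0                          # nominal (in-focus) object distance, mm
dz = np.linspace(-2.0, 2.0, 201)   # object distance variation, mm
s = np.linspace(0.0, 10.0, 201)    # offset from the optical axis, mm
S, DZ = np.meshgrid(s, dz)
max_err = float(np.abs(objdist_error(z, S, DZ)).max())
print(max_err)  # ~0.0943 mm, the maximum quoted in Section 4
```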
6.2. Reference hole measurement with 2D vision system

To verify the validity of the method for controlling the object distance and ensuring perpendicularity in an engineering application, reference hole measurement experiments were conducted on a robotic drilling platform. The equipment of the platform, shown in Fig. 17, includes a KUKA industrial robot, a drilling end-effector, a Leica laser tracker, an industrial camera, and an experimental workpiece, among other components. The detailed experimental procedure is as follows:

(1) Calibrate the camera internal parameters and the hand–eye relationship between the drill TCP and the camera TCP, and record the calibration results.

(2) Locate the reference hole on the experimental workpiece with the 2D vision system using the calibration results obtained above, and record the locating result.

(3) Using the calibration and locating results obtained in steps (1) and (2), move the robot to align the axis of the ball-shaped tool with the axis of the reference hole of the experimental workpiece (see Fig. 17).

(4) Measure the center of the optical target mounted in the reference hole with the laser tracker, and fit a line to the ball-shaped tool axis using data points obtained with the laser tracker.

(5) Compute the distance between the target's center point and the tool-axis line obtained in step (4); this distance is the measurement error of the 2D vision system.

The coordinates of the targets' center points, the line equations of the ball-shaped tool axes in the laser tracker coordinate system, and the resulting distances are shown in Table 2. For comparison, experiments on hole 3 and hole 4 without adjusting the perpendicularity and object distance were also conducted; the results are shown in the last two rows of Table 2. The improvement in measurement accuracy obtained by adjusting the perpendicularity and object distance is evident.

Additional errors may be introduced by the fabrication and installation of the calibration device and by the image processing algorithms. Even so, a measurement accuracy of 0.1 mm is achieved through the control of perpendicularity and object distance, which verifies the effectiveness of the proposed method. The measurement accuracy of the vision system developed in this paper is higher than that of vision systems for robotic drilling reported in the literature [3,15]. For example, hole locations are placed within a specification of ±0.06 in. in the ONCE robotic drilling system, and the measurement accuracy of the
Fig. 16. Deviations between theoretical and experimental errors caused by nonperpendicularity.
Fig. 15. Deviations between theoretical and experimental errors caused by incorrect object distance; (a) and (b) correspond to Δz = ±2 mm.
Fig. 17. Experimental platform of reference hole measurement with a vision system.
Table 2
Experimental data of reference hole measurement.

Hole ID | Perpendicularity and distance control | Center point of optical target (mm) | Parameters of drill axis (direction vector / point on axis) | Distance (mm)
Hole 1 | Yes | (2366.422, 1413.671, 971.788) | (-12.699, 23.868, 0.039) / (2377.472, 1393.026, 971.694) | 0.084
Hole 2 | Yes | (2432.467, 1446.628, 1038.065) | (-14.085, 26.484, 0.034) / (2448.287, 1417.029, 1038.061) | 0.077
Hole 3 | Yes | (2299.542, 1379.751, 970.621) | (-12.694, 23.855, 0.047) / (2308.587, 1362.864, 970.648) | 0.079
Hole 4 | Yes | (2365.810, 1412.536, 1036.981) | (-12.691, 23.858, 0.025) / (2376.840, 1391.779, 1037.067) | 0.108
Hole 5 | Yes | (2433.221, 1447.462, 973.104) | (-12.703, 23.884, 0.036) / (2446.717, 1422.011, 972.957) | 0.115
Hole 6 | Yes | (2499.462, 1480.494, 1039.459) | (-12.679, 23.819, 0.040) / (2515.722, 1449.945, 1039.548) | 0.141
Hole 3 | No | (2299.512, 1379.835, 970.570) | (-14.110, 26.517, 0.044) / (2310.317, 1359.141, 970.556) | 0.184
Hole 4 | No | (2365.618, 1412.818, 1036.897) | (-14.089, 26.487, 0.030) / (2381.528, 1382.501, 1037.867) | 0.191
vision system developed at Beihang University is within 0.4 mm. Overall, the developed vision system provides a low-cost and high-accuracy solution for measurement tasks in robotic drilling.
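Steps (4) and (5) of the procedure in Section 6.2 amount to fitting a 3D line and computing a point-to-line distance. The sketch below (numpy, with the line fitted by SVD of the centered points; function names are illustrative) recomputes the hole 1 row of Table 2. The x-component of the drill-axis direction is taken as negative, which is the sign consistent with the tabulated distance.

```python
import numpy as np

def fit_line(points):
    """Least-squares 3D line through a set of points (e.g. laser tracker
    samples along the ball-shaped tool axis): returns (point, unit direction)."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # principal direction = first right singular vector of the centered points
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[0]

def point_line_distance(p, line_pt, direction):
    """Perpendicular distance from point p to the line (line_pt, direction)."""
    d = np.asarray(direction, dtype=float)
    v = np.asarray(p, dtype=float) - np.asarray(line_pt, dtype=float)
    return float(np.linalg.norm(np.cross(v, d)) / np.linalg.norm(d))

# Hole 1 row of Table 2 (laser tracker coordinates, mm)
target = [2366.422, 1413.671, 971.788]                # optical target center
axis_pt = [2377.472, 1393.026, 971.694]               # point on drill axis
axis_dir = [-12.699, 23.868, 0.039]                   # direction (x taken negative)
err = point_line_distance(target, axis_pt, axis_dir)  # ≈ 0.084 mm
```

The same two functions cover every row of Table 2; only the measured tracker coordinates change.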
7. Conclusions

In this paper, the relative errors between the drill axis and the workpiece in robotic drilling are modeled with homogeneous transformation matrices and the small-angle approximation, and the principle of relative error measurement with a 2D vision system is described. The camera TCP of the vision system is determined so as to eliminate Abbe errors from the measurement results. Measurement error models with respect to nonperpendicularity and incorrect object distance are derived and experimentally verified. Measurement accuracy is enhanced by controlling the perpendicularity and object distance during measurement with four laser displacement sensors. Reference hole measurement experiments have been conducted on a robotic drilling system. With the proposed method, the vision system achieves a measurement accuracy of approximately 0.1 mm, which meets the accuracy requirement of robotic drilling.
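The error-modeling approach summarized above, homogeneous transforms with the small-angle approximation, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name and the example angle and translation values are assumptions.

```python
import numpy as np

def small_angle_transform(rx, ry, rz, t):
    """Homogeneous transform using the small-angle approximation
    R ≈ I + [theta]_x, valid when the orientation errors are small (rad)."""
    T = np.eye(4)
    T[:3, :3] = np.array([[1.0, -rz,  ry],
                          [ rz, 1.0, -rx],
                          [-ry,  rx, 1.0]])
    T[:3, 3] = t
    return T

# Illustrative relative error between a nominal and a perturbed drill pose
T_nom = small_angle_transform(0.0, 0.0, 0.0, [0.0, 0.0, 0.0])
T_act = small_angle_transform(0.001, -0.002, 0.0015, [0.05, -0.03, 0.08])
T_err = np.linalg.inv(T_nom) @ T_act
```

For orientation errors of a few milliradians, the approximation differs from the exact rotation matrix only at second order in the angles.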
Acknowledgments This research was supported by the National Natural Science Foundation of China (Project no. 51205352).
References

[1] Bi S, Liang J. Robotic drilling system for titanium structures. International Journal of Advanced Manufacturing Technology 2011;54:767–74.
[2] Olsson T, Haage M, Kihlman H, Johansson R, Nilsson K, Robertsson A, et al. Cost-efficient drilling using industrial robots with high-bandwidth force feedback. Robotics and Computer-Integrated Manufacturing 2010;26:24–38.
[3] DeVlieg R, Sitton K, Feikert E, Inman J. ONCE (ONe-sided Cell End effector) robotic drilling system. In: Proceedings of the SAE 2002 Automated Fastening Conference & Exposition. Chester, England; 2002. SAE Technical Paper 2002-01-2626.
[4] Nubiola A, Bonev IA. Absolute calibration of an ABB IRB 1600 robot using a laser tracker. Robotics and Computer-Integrated Manufacturing 2013;29:236–45.
[5] Jayaweera N, Webb P. Metrology-assisted robotic processing of aerospace applications. International Journal of Computer-Integrated Manufacturing 2010;23(3):283–96.
[6] Bone GM, Capson D. Vision-guided fixtureless assembly of automotive components. Robotics and Computer-Integrated Manufacturing 2003;19:79–87.
[7] Ma H, Wei S, Sheng Z, Lin T, Chen S. Robot welding seam tracking method based on passive vision for thin plate closed-gap butt welding. International Journal of Advanced Manufacturing Technology 2010;48:945–53.
[8] Motta JMST, de Carvalho GC, McMaster RS. Robot calibration using a 3D vision-based measurement system with a single camera. Robotics and Computer-Integrated Manufacturing 2001;17:487–97.
[9] Royer E, Lhuillier M, Dhome M, Lavest J-M. Monocular vision for mobile robot localization and autonomous navigation. International Journal of Computer Vision 2007;74:237–60.
[10] Li G, Shi J, Luo H, Tang M. A computational model of vision attention for inspection of surface quality in production line. Machine Vision and Applications 2013;24:835–44.
[11] Lee B, Tarng Y. Surface roughness inspection by computer vision in turning operations. International Journal of Machine Tools and Manufacture 2001;41:1251–63.
[12] Bilen H, Hocaoglu MA, Unel M, Sabanovic A. Developing robust vision modules for microsystems applications. Machine Vision and Applications 2012;23:25–42.
[13] Liang J, Bi S. Design and experimental study of an end effector for robotic drilling. International Journal of Advanced Manufacturing Technology 2010;50:399–407.
[14] Summers M. Robot capability test and development of industrial robot positioning system for the aerospace industry. SAE Transactions 2005;114:1108–18.
[15] Zhan Q, Wang X. Hand–eye calibration and positioning for a robot drilling system. International Journal of Advanced Manufacturing Technology 2012;61:691–701.
[16] Slocum A. Precision machine design. Englewood Cliffs, NJ: Prentice-Hall; 1992.
[17] Tian W, Zhou W, Zhou W, Liao W, Zeng Y. Auto-normalization algorithm for robotic precision drilling system in aircraft component assembly. Chinese Journal of Aeronautics 2013;26:495–500.
[18] Zhu W, Qu W, Cao L, Yang D, Ke Y. An off-line programming system for robotic drilling in aerospace manufacturing. International Journal of Advanced Manufacturing Technology 2013;68:2535–45.
[19] Gan Z, Tang Q. Visual sensing and its applications: integration of laser sensors to industrial robots. Hangzhou, China: Zhejiang University Press; 2011.
[20] Zhu Z, Tang Q, Li J, Gan Z. Calibration of laser displacement sensor used by industrial robots. Optical Engineering 2004;43:12–3.