
Copyright © 1996 IFAC. 13th Triennial World Congress, San Francisco, USA


ROBOT VISUAL SERVOING FOR CURVE TRACKING

Jiang Ping, Chen Hui-tang, Wang Yue-juan, Lin Jing

Tongji University, Dept. of E. E., Shanghai (200092), P.R. China

Abstract: Aiming at high speed visual tracking by integration of vision and control, this paper presents a new control scheme for robot visual tracking of a drawing curve (eye-in-hand configuration). To track a curve, the control is decomposed into two subfunctions. One is velocity control, which drives the center of the CCD camera along the tangential direction of the curve. The other is position control, which drives the center of the CCD camera in the normal direction towards the curve. As a result, the center of the CCD camera always moves along the curve. The unit tangential vector and the normal tracking deviation can be measured from the image within the servo sampling time, hence the visual tracking control is realized on the basis of the servo cycle (2 ms). For the integration of vision and control, a Transputer-based parallel controller is developed. The tracking speed can reach 25 cm/s with a small field of view (2.64 cm x 2.64 cm), which is about 10 times higher than the conventional method.

Keywords: Direct drive robots; Robot vision; Tracking systems; Servo; Decomposition methods

1. INTRODUCTION

In most vision-guided robot control problems, the traditional approach has become known as "look and move". The images are used at higher decision levels and the motion is slow. In order to increase a robot's ability to operate in an uncertain environment, a recent trend is to use vision information in the dynamic feedback loop. Such systems can be classified into two categories: the first uses a static camera and the second uses a camera mounted on the end of the robot forearm, which is referred to as the eye-in-hand configuration. Because of the different image areas, the first is suited to fast tracking and the second to high accuracy tracking. Wijesoma, et al. (1993) and Cao, et al. (1990) employed an overhead static camera and detected the position of the end-effector in the image plane, which was directly fed to the joint servo controller. Allen, et al. (1993) used two static cameras to follow the motion of a toy train. The literature (Papanikolopoulos, et al., 1993;

Espiau, et al., 1992; Hashimoto, et al., 1993) worked in the eye-in-hand manner. Papanikolopoulos, et al. (1993) introduced SSD optical flow and studied LQG, PI and pole-assignment regulators for object tracking. The experiments showed that the average tracking speed was between 2.5 and 3.0 cm/s. Espiau, et al. (1992) studied visual servoing from a more general point of view and introduced the concepts of task function and interaction screw for image-based visual servoing. Hashimoto, et al. (1993) proposed a nonlinear control scheme for fast tracking, which compensated the nonlinearity of the visual feedback system, but only simulations were carried out. However, increasing the tracking speed is mainly limited by the delay time of image acquisition and processing; in all of these works the loop is closed at a 25 Hz frame rate or lower. Another typical application of vision in robot motion control is curve tracking, for example visually guided robot arc-welding (Morgan, et al., 1983), drawing curve tracing (Jiang, et al., 1994), etc. A vision-guided curve


tracking system for the drawing input is described in (Jiang, et al., 1994). It worked in the sequence of curve searching from the image, trajectory generation, and trajectory tracking by the robot servo controller. The vision information was only used to generate the trajectory and was not included in the servo loop; therefore the tracking is very slow (1 cm/s). In order to improve the tracking performance, this paper presents a visual servoing strategy for curve tracking at the servo rate (500 Hz). For the purpose of curve tracking, it is ideal that the center of the camera follows the curve observed by the camera with a constant velocity and without deviation from it. Hence our visual servoing control is designed in the curve coordinates and is decomposed into tangential velocity control and normal position control. The velocity control guarantees uniform tracking and the position control guarantees high accuracy. In the trajectory generating scheme, it is difficult to find, in a new image frame, the point corresponding to the trajectory end point of the last frame so as to guarantee the continuity of the trajectory. Our scheme avoids this problem by tracking velocity control, which is smoother than position control and does not need to generate a position trajectory. In addition, the normal tracking error is easily obtained from the image,

such that every image frame (40 ms/frame) can be used in 20 servo control cycles (2 ms) and the mismatch between the lower video rate and the higher servo rate is resolved. Section 2 gives the theoretical derivation of the curve tracking algorithm based on the tangential and normal decomposition; the algorithm is proved to be asymptotically stable. Section 3 discusses the problem of measuring the tracking state from images. Section 4 introduces a Transputer based parallel controller for our scheme and presents the experimental results. In Section 5, concluding remarks are made.

2. TANGENTIAL AND NORMAL DIRECTION RESOLVED CONTROL FOR CURVE TRACKING

Usually, a vision-guided curve tracking system acquires information through image sequences. The trajectory is then transferred to the robot position controller. Because the manipulator is still moving during the image processing period, it is difficult to ensure continuity between the trajectory segments generated from different images. The discontinuity is related to the image processing time, the tracking speed and the transient process of the servo control, and an improper trajectory may cause the tracking to stop. In fact, it is not necessary to acquire the whole position data of the trajectory for curve tracking: we only need the center of the camera not to deviate from the curve while moving with a constant tracking velocity. Hence we develop our control strategy based upon a tangential and normal direction decomposition.

Fig. 1. Curve frame

Suppose that no cusps or discontinuity points exist in the curve and that the second order derivative of the curve with respect to the arc length exists. The drawing curve is shown in Fig. 1, where x-y is the world coordinate frame. Let r be the vector to the center of the camera; then the normal direction deviation from the curve is e and the corresponding intersection point with the curve is O'. We establish the curve frame i-j, where i is the tangential unit vector of the curve and j is the normal unit vector of the curve. Then

r = r_{O'} + e\, j.

Because point O' lies on the curve, the following equation holds:

\dot{r}_{O'} = \frac{d r_{O'}}{ds}\,\frac{ds}{dt} = \dot{s}\, i,

where s is the arc length of the curve. Notice that \dot{i} = \omega \times i and \dot{j} = \omega \times j, where \omega is the angular velocity of the i-j frame; its magnitude is related to the motion along the curve by \dot{\alpha} = K\dot{s}, where \alpha is the tangential angle and K is the curvature of the curve. We have

\dot{i} = K\dot{s}\, j,  \qquad  \dot{j} = -K\dot{s}\, i,

and

\dot{r} = \dot{s}\, i + \dot{e}\, j - eK\dot{s}\, i = (1 - Ke)\dot{s}\, i + \dot{e}\, j.     (1)

The manipulator velocity in the i direction is

\dot{r}_i = (1 - Ke)\dot{s},

and the curve tracking velocity can be written as

\dot{s} = \frac{1}{1 - Ke}\, \dot{r}_i.     (2)

Equation (2) expresses that the curve tracking velocity is proportional to the manipulator velocity in the i direction and is also related to the curvature K of the curve and the normal direction error e.
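To make the decomposition concrete, the following minimal sketch (our illustration, not code from the paper; the function and variable names are assumptions) projects a measured camera-center velocity onto the curve frame and recovers the tracking velocity of eq. (2), assuming the unit vectors i, j, the curvature K and the normal error e are already available.

```python
import numpy as np

def tracking_velocity(r_dot, i_hat, j_hat, K, e):
    """Decompose the camera-center velocity r_dot (world frame, 2-vector)
    into the curve frame and recover the arc-length rate s_dot via eq. (2)."""
    r_dot_i = float(np.dot(r_dot, i_hat))   # tangential component, equal to (1 - K*e)*s_dot by eq. (1)
    e_dot   = float(np.dot(r_dot, j_hat))   # normal component, the rate of the deviation e
    s_dot   = r_dot_i / (1.0 - K * e)       # eq. (2); valid while K*e != 1
    return s_dot, e_dot

# Example: camera moving at 250 mm/s along the tangent, 0.5 mm outside a curve
# of radius 200 mm (K = 1/200 mm^-1); s_dot comes out slightly above 250 mm/s.
s_dot, e_dot = tracking_velocity(np.array([250.0, 0.0]),
                                 np.array([1.0, 0.0]), np.array([0.0, 1.0]),
                                 K=1.0 / 200.0, e=0.5)
```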

Let the dynamic equation of the robot manipulator in the world coordinates be represented as

M(r)\ddot{r} + C(r,\dot{r})\dot{r} + G(r) = F.     (3)

Define the velocity error vector e_v = u i - \dot{r}, where u is the desired tracking velocity (constant), and the normal error vector e_p = e j. The control objective can be expressed as

\lim_{t\to\infty} \dot{r} = u i,  \qquad  \lim_{t\to\infty} e = 0,     (4)

i.e. the manipulator follows the curve with the desired velocity and without normal deviation. If the desired velocity is \dot{r}_d = u i, then \dot{r}_i = u and the corresponding centrifugal acceleration is

\ddot{r}_d = u\,\dot{i} = u\dot{s}K\, j = \frac{u^2 K}{1 - Ke}\, j.

We propose the control law

F = M(r)(u\dot{s}K\, j) + C(r,\dot{r})(u i) + G(r) + k_v e_v - k_p e j,     (5)

where k_v is any positive definite matrix and k_p > 0.

The stability of the system can be analyzed as follows. Define a Lyapunov function candidate

V = \frac{1}{2}\{ e_v^T M(r) e_v + k_p [e j]^T [e j] \} = \frac{1}{2} e_v^T M(r) e_v + \frac{1}{2} k_p e^2,

then

\dot{V} = e_v^T M \dot{e}_v + \frac{1}{2} e_v^T \dot{M} e_v + k_p [e j]^T [\dot{e} j + e \dot{j}].

From (1), \dot{e}\, j = \dot{r} - (1 - Ke)\dot{s}\, i; substituting it and \dot{j} = -K\dot{s}\, i into the above expression gives

\dot{V} = e_v^T M \dot{e}_v + \frac{1}{2} e_v^T \dot{M} e_v + k_p [e j]^T [\dot{r} - \dot{s}\, i]
        = e_v^T M \dot{e}_v + \frac{1}{2} e_v^T \dot{M} e_v + k_p [e j]^T [(\dot{r} - u i) + (u - \dot{s}) i].

Because i is perpendicular to j,

\dot{V} = e_v^T M \dot{e}_v + \frac{1}{2} e_v^T \dot{M} e_v + k_p [e j]^T [\dot{r} - u i]
        = e_v^T [ M \dot{e}_v + \frac{1}{2} \dot{M} e_v - k_p e j ].

Substituting the dynamic equation (3) together with the control law (5) gives M\dot{e}_v = M(u\dot{s}K\, j) - M\ddot{r} = -C(r,\dot{r}) e_v - k_v e_v + k_p e j, so that

\dot{V} = e_v^T [\frac{1}{2}\dot{M} - C(r,\dot{r})] e_v - e_v^T k_v e_v = - e_v^T k_v e_v \le 0

by the skew symmetry of \dot{M} - 2C (Slotine and Li, 1987). Hence \lim_{t\to\infty} e_v = 0, i.e. \lim_{t\to\infty} \dot{r} = u i: the manipulator follows the curve with the desired velocity.

In order to prove that the normal error e also converges to zero, substitute the control law (5) into (3):

M(r)(\ddot{r} - u\dot{s}K\, j) + C(r,\dot{r})(\dot{r} - u i) = k_v e_v - k_p e j.     (6)

From \lim_{t\to\infty} e_v = 0 it follows that \lim_{t\to\infty} \dot{r} = u i and \lim_{t\to\infty} \ddot{r} = u\,\dot{i} = u\dot{s}K\, j; substituted into (6) we have \lim_{t\to\infty} k_p e j = 0, and therefore \lim_{t\to\infty} e = 0.

Finally, (5) is transformed to the actuators' torque

\tau = J^T [ M(r)(u\dot{s}K\, j) + C(r,\dot{r})(u i) + G(r) + k_v (u i - \dot{r}) - k_p e j ],     (7)

where J is the Jacobian transformation from the joint coordinates to the world coordinates.

The control law consists of a nonlinear compensator and a linear feedback controller. The schematic diagram is shown in Fig. 2. We can see that the feedback control is decomposed into the tangential and normal directions: the normal error is obtained from the image and position feedback is conducted in the normal direction, while velocity feedback is conducted in the tangential direction. A dynamic compensation is added to guarantee asymptotic stability of the system, so that high speed tracking performance can be achieved.

Fig. 2. Schematic diagram of visual servoing for curve tracking
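As an illustration of how (5) and (7) could be evaluated once the image measurements are available, the sketch below is our own, with hypothetical names, and not the authors' implementation; it assumes the dynamic model terms M, C, G and the Jacobian J are supplied by the robot model. The gains kv and kp play the roles of k_v and k_p in (5).

```python
import numpy as np

def curve_tracking_torque(r, r_dot, i_hat, j_hat, e, Ks_hat, u,
                          M, C, G, J, kv, kp):
    """Resolved tangential/normal control law, eqs. (5) and (7).

    r, r_dot     : camera-center position and velocity in world coordinates (2-vectors)
    i_hat, j_hat : unit tangential and normal vectors of the curve
    e            : normal deviation of the camera center from the curve
    Ks_hat       : estimate of K*s_dot (curvature times arc-length rate)
    u            : desired (constant) tangential tracking speed
    M, C, G      : callables giving the inertia matrix, Coriolis/centrifugal matrix
                   and gravity vector expressed in world coordinates
    J            : manipulator Jacobian (world velocities = J @ joint velocities)
    kv, kp       : velocity feedback gain matrix and scalar normal position gain
    """
    e_v = u * i_hat - r_dot                      # tangential velocity error e_v = u*i - r_dot
    feedforward = (M(r) @ (u * Ks_hat * j_hat)   # desired centripetal acceleration term
                   + C(r, r_dot) @ (u * i_hat)   # Coriolis/centrifugal compensation
                   + G(r))                       # gravity compensation
    F = feedforward + kv @ e_v - kp * e * j_hat  # eq. (5): force in world coordinates
    tau = J.T @ F                                # eq. (7): joint torques
    return tau
```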

3. TRACKING STATE EXTRACTION FROM IMAGES

For the purpose of fast tracking by the foregoing visual servoing algorithm, the curve features K, i, j and the normal direction deviation e should be extracted from the images in real time, within the 2 ms servo period. Fast computation therefore becomes very important.


After sampling the mth image I_m, the drawing curve appears in the image as a band of black pixels. Influenced by sampling noise and the varying width of the drawing curve, this black band is irregular; assume that w_min is the minimum width of the curve in image coordinates. In order to measure the curve features in the neighborhood of the camera center (x_0, y_0), a window W (2d x 2d, with d < 1/K) centered around (x_0, y_0) is established. Count the number N of continuous black pixels with E(x, y) < Th along the boundary of the window, where E(x, y) is the brightness of the image at point (x, y) and Th is the threshold. If N >= w_min, these pixels lie on the curve, and their center corresponds to one intersection point (x_1, y_1) of the curve with the window. Repeating the procedure, another intersection point (x_2, y_2) is obtained. Under the assumption d < 1/K, only two intersection points exist. Thus, the tangential angle at the jth servo time is

\alpha_j = \arctan\frac{y_2 - y_1}{x_2 - x_1} + \theta_{cm} + n\pi,  \qquad n \in \{0, 1\},     (8)

where \theta_{cm} is the rotation angle of the camera coordinates with respect to the world coordinates, determined by the manipulator configuration at the time the mth frame was sampled, and n selects the tracking direction; it is determined by \alpha_{j-1} so that |\alpha_j - \alpha_{j-1}| < \pi/2.
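A rough sketch of this measurement step follows, assuming a grayscale window has already been extracted around the camera center. The helper names and the simple run-detection logic are our assumptions, not the authors' code.

```python
import numpy as np

def boundary_intersections(window, th, w_min):
    """Find the points where the dark curve band crosses the window boundary.

    window : 2-D array of brightness values E(x, y), indexed as window[y, x]
    th     : brightness threshold; pixels with E < th are treated as black
    w_min  : minimum curve width in pixels (shortest run accepted as the curve)
    """
    h, w = window.shape
    # Walk the boundary as one closed loop of (x, y) pixel coordinates.
    top    = [(x, 0) for x in range(w)]
    right  = [(w - 1, y) for y in range(1, h)]
    bottom = [(x, h - 1) for x in range(w - 2, -1, -1)]
    left   = [(0, y) for y in range(h - 2, 0, -1)]
    loop = top + right + bottom + left

    dark = [window[y, x] < th for (x, y) in loop]
    points, run = [], []
    for k, is_dark in enumerate(dark + [False]):     # sentinel closes the last run
        if is_dark:
            run.append(loop[k])
        elif run:
            if len(run) >= w_min:                    # accept only runs at least w_min long
                points.append(np.mean(run, axis=0))  # run center = intersection point
            run = []
    return points                                    # expect two points when d < 1/K

def tangential_angle(p1, p2, theta_cm, alpha_prev):
    """Eq. (8): angle of the chord p1->p2 rotated into world coordinates,
    with the n*pi ambiguity resolved so that |alpha_j - alpha_prev| < pi/2."""
    alpha = np.arctan2(p2[1] - p1[1], p2[0] - p1[0]) + theta_cm
    while alpha - alpha_prev > np.pi / 2:
        alpha -= np.pi
    while alpha - alpha_prev < -np.pi / 2:
        alpha += np.pi
    return alpha
```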

The tangential unit vector and the normal unit vector can then be expressed as

i = [\cos\alpha_j, \sin\alpha_j]^T,  \qquad  j = [-\sin\alpha_j, \cos\alpha_j]^T.     (9)

The curve segment in the neighborhood of the camera center is approximately described by the straight line

\sin(\alpha_j)\, x - \cos(\alpha_j)\, y = \sin(\alpha_j)\, x_1 - \cos(\alpha_j)\, y_1.

Then, the normal deviation of the camera center from the curve is

e = \sin(\alpha_j)\, x_0 - \cos(\alpha_j)\, y_0 - \sin(\alpha_j)\, x_1 + \cos(\alpha_j)\, y_1.     (10)

From e, i and j the feedback terms in the control law can be completed. In order to compensate the dynamics, the curvature K is also required. Noting that \ddot{r}_d = uK\dot{s}\, j and K\dot{s} = d\alpha/dt, we adopt a filter equation on the measured tangential angle to estimate K\dot{s} (11), where \widehat{K\dot{s}}(j) is the estimate of K\dot{s} at the jth servo time and T = 2 ms is the servo period.
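The small sketch below implements eqs. (9) and (10) and a possible estimator for K\dot{s}. Since the paper's filter equation (11) is not reproduced here, the first-difference-with-low-pass form below (and the smoothing factor lam) is purely our assumption.

```python
import numpy as np

def curve_frame(alpha):
    """Eq. (9): unit tangential and normal vectors from the tangential angle."""
    i_hat = np.array([np.cos(alpha), np.sin(alpha)])
    j_hat = np.array([-np.sin(alpha), np.cos(alpha)])
    return i_hat, j_hat

def normal_deviation(alpha, p0, p1):
    """Eq. (10): signed normal distance of the camera center p0 = (x0, y0)
    from the line through the curve point p1 = (x1, y1) with direction alpha."""
    s, c = np.sin(alpha), np.cos(alpha)
    return s * p0[0] - c * p0[1] - s * p1[0] + c * p1[1]

def update_ks_estimate(ks_prev, alpha, alpha_prev, T=0.002, lam=0.5):
    """Assumed stand-in for the paper's filter (11): low-pass filtered
    difference quotient of the tangential angle, since K*s_dot = d(alpha)/dt."""
    return (1.0 - lam) * ks_prev + lam * (alpha - alpha_prev) / T
```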

All measurements needed in the control law have now been obtained. Considering that the video rate (25 Hz) is slow compared with the servo rate (500 Hz), the control law should be computed 20 times for every frame. In fact, every frame of the image contains the curve information needed for several servo tracking steps. So, we can map the center of the camera into the mth image plane every servo period by the transformation (in homogeneous coordinates)

[x_0, y_0, 1]^T = {}^{c}T_{w}(\theta_m)\, \phi(\theta),     (12)

where \phi is the kinematic equation of the manipulator and {}^{c}T_{w} is the homogeneous transformation matrix from world coordinates to camera coordinates, determined by the joint angles \theta_m of the manipulator at the time the mth frame was sampled. Therefore, the vision-guided tracking control can work at the servo rate, as described by Fig. 3. From the point of view of the video rate, the control is based on the image sequence, but within each image sampling interval (40 ms) the static image is used. In the foregoing procedure, we attach importance to fast computation so that it can be completed within the servo period (2 ms). If the control processors were fast enough, an optimal line or spline (least squares error) could be fitted to the black pixels in the window, which would further increase the accuracy of the measurements.
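To illustrate how a single 40 ms frame can serve 20 consecutive 2 ms servo cycles, here is a hedged sketch of the mapping (12); the forward-kinematics helper, the 3x3 homogeneous transforms and the pixel-scaling convention are all assumptions made for illustration, with only the 26.4 mm / 256 pixel field of view taken from the experiments.

```python
import numpy as np

def camera_center_in_frame_m(theta_now, T_cam_from_world_at_m, fk_xy,
                             mm_per_pixel=26.4 / 256.0, image_size=256):
    """Map the current camera center into the image plane of the m-th frame (eq. (12)).

    theta_now             : current joint angles, re-read every 2 ms servo cycle
    T_cam_from_world_at_m : 3x3 homogeneous transform, world -> camera, stored
                            when the m-th frame was sampled (joint angles theta_m)
    fk_xy                 : forward kinematics returning the camera-center (x, y)
                            in world coordinates (assumed helper)
    """
    xw, yw = fk_xy(theta_now)                              # phi(theta): world position now
    pc = T_cam_from_world_at_m @ np.array([xw, yw, 1.0])   # into the frozen camera frame
    # Convert metric camera coordinates to pixel indices of the stored image
    # (assumed image-centered convention).
    u = image_size / 2 + pc[0] / mm_per_pixel
    v = image_size / 2 + pc[1] / mm_per_pixel
    return u, v
```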


4. EXPERIMENT DESIGN AND RESULTS

The aim of this research is the drawing contour input of a glass-cutting direct-drive robot developed in our laboratory (Chen, et al., 1993). For the purpose of real time visual servoing, a Transputer based parallel controller with four T800 chips is implemented. The configuration of the parallel processing system is sketched in Fig. 4. The image input card is designed for the Transputer system; the visual signals are sampled by a 4-bit A/D and can be distributed to any one of the T800 chips, selected by the switch, through an INMOS link. In this system, we connect the image input card with root and slot2 through link3. The serial link rate is 20 Mb/s. In order to meet the real-time requirement, root and slot2 sample the image one after the other every 40 ms. This means that the image stored in root/slot2 is used by the controller while slot2/root is sampling the next image. Only the odd field is sampled, with a spatial resolution of 256 x 256 pixels corresponding to the 26.4 mm x 26.4 mm field of view. The control interface connects the robot with slot3 to input the position measurements and output the digital control signals every 2 ms.

Communication of slot1 with root and slot2 forms an image information flow, and slot1 with slot3 forms the servo control flow. The working procedure is as follows:

1. slot3 samples the joint positions and transfers them to slot1.
2. slot1 computes the current position of the camera center in the image plane and transfers it to the corresponding processor (root or slot2).
3. root or slot2 finds the curve segment and transfers the approximate line equation to slot1.
4. e, i, j and \widehat{K\dot{s}}(j) are computed in slot1 and transferred to slot3.
5. The torques of each actuator are computed in slot3.

Fig. 3. Vision-guided control in servo rate

With the aid of this parallel controller, we realize visual servo control within 2 ms and the tracking velocity reaches 250 mm/s with a small field of view (26.4 mm x 26.4 mm). The tracking process is fast and also smooth. The superiority of this algorithm over the position tracking control of Jiang, et al. (1994) is obvious.

As an example, we draw a circle with a radius of 197.5 mm on white paper and let the manipulator track it clockwise. The dynamic parameters are the same as in Chen, et al. (1993) and the control parameters are selected as k_p = 40.0, k_v = 1.0, KPO = 0.2, KeG = 43.1. Let the desired tracking velocity be u = 150 mm/s. One circle takes 8.64 s to track, with 216 frames of image sampled; the average velocity is 143.6 mm/s. The normal deviation of the visual tracking is depicted in Fig. 5. The maximum error is less than 0.6 mm except when starting the tracking, where it reaches 1 mm, caused by the desired velocity jumping from 0 to 150 mm/s.

Changing the velocity u has little influence on performances such as smoothness. In the trajectory generating scheme, by contrast, the tracking velocity is bounded by the image processing time, so that smooth tracking can be implemented only in a proper velocity range.
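A quick arithmetic cross-check of the circle experiment reported above (our own back-of-envelope, not from the paper): 216 frames at 40 ms per frame give exactly the 8.64 s tracking time, and one circumference covered in that time matches the reported 143.6 mm/s average velocity.

```python
import math

frames, frame_period = 216, 0.040          # 216 frames at 40 ms/frame
radius_mm = 197.5
t_total = frames * frame_period            # 8.64 s, matching the reported tracking time
v_avg = 2 * math.pi * radius_mm / t_total  # ~143.6 mm/s, matching the reported average
print(t_total, round(v_avg, 1))
```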

Fig. 4. Hardware configuration

Fig. 5. Normal deviation (u = 150 mm/s)

Besides the feedback control, the control law contains a dynamics compensation term, which is necessary for the stability analysis. In order to test its effect in practice, we carried out comparative experiments between including the dynamics compensation and neglecting it, increasing the desired tracking velocity to u = 250 mm/s to emphasize the effect. The normal direction error without dynamics compensation is shown in Fig. 6, with a 3.3 mm maximum starting error, a 1.6 mm maximum tracking error, a 254.3 mm/s average velocity, and 122 frames of image sampled. The error with dynamics compensation is shown in Fig. 7, with a 3.8 mm maximum starting error, a 1.1 mm maximum tracking error, a 242.3 mm/s average velocity, and 128 frames of image sampled. So, the tracking accuracy is increased. Comparing Fig. 6 with Fig. 7, we can also find that the average error in Fig. 6 lies on the positive side, whereas in Fig. 7 it is around zero. This can be explained by the fact that the neglected centripetal force of the motion has to be compensated by a corresponding force generated by the normal position feedback control.

Fig. 6. Normal deviation without dynamics compensation (u = 250 mm/s)

Fig. 7. Normal deviation with dynamics compensation (u = 250 mm/s)

5. CONCLUSIONS

A visual servoing control scheme for drawing curve tracking has been proposed. The control is decomposed into the tangential and normal directions of the curve and a different control objective is adopted in each direction. The tangential velocity control ensures high speed smooth tracking and the normal direction position control ensures tracking accuracy. A dynamic compensation is added to guarantee asymptotic stability of the system. In order to implement the control law at the servo rate, a Transputer-based parallel controller is established. Experiments are carried out on a SCARA type direct-drive manipulator and exhibit excellent tracking performance. Under a small field of view (26.4 mm x 26.4 mm), 250 mm/s fast tracking is realized, which is about 10 times faster than the former approach. The method is applicable to visually guided robot arc-welding, drawing curve tracing, etc.

REFERENCES

Allen, P.K., A. Timcenko, B. Yoshimi and P. Michelman (1993). Automated tracking and grasping of a moving object with a robotic hand-eye system. IEEE Trans. on Robotics and Automation, 9, 152-165.
Cao, B.L., X. Wang and H.T. Chen (1990). A real-time visual servoing control scheme for robot manipulator. Proc. Int. Conf. on Automation, Robotics and Computer Vision, 936-939.
Chen, H.T., P. Jiang, W.H. Zhu and Y.J. Wang (1993). Comparative study of D.D. robot control algorithms for trajectory tracking. Preprints of the 12th IFAC World Congress, 157-160.
Espiau, B., F. Chaumette and P. Rives (1992). A new approach to visual servoing in robotics. IEEE Trans. on Robotics and Automation, 8, 313-326.
Hashimoto, K., T. Ebine, K. Sakamoto and H. Kimura (1993). Full 3D visual tracking with nonlinear model-based control. Proc. American Control Conf., 3180-3184.
Jiang, P., H.T. Chen and Y.J. Wang (1994). A vision-guided curve tracking and input system for glass cutting robot. Proc. 2nd Asian Conf. Robotics and Applications, 665-669.
Morgan, C.G., et al. (1983). Visual guidance techniques for robot arc welding. Proc. 3rd Int. Conf. Robot Vision and Sensory Controls, 615-624.
Papanikolopoulos, N.P., P.K. Khosla and T. Kanade (1993). Visual tracking of a moving target by a camera mounted on a robot: a combination of control and vision. IEEE Trans. on Robotics and Automation, 9, 14-35.
Slotine, J.J.E. and W. Li (1987). Adaptive strategies in constrained manipulation. Proc. IEEE Conf. Robotics and Automation, 595-601.
Wijesoma, S.W., D.F.H. Wolfe and R.J. Richards (1993). Eye-to-hand coordination for vision-guided robot control applications. Int. J. of Robotics Research, 12, 64-78.
