
Copyright © IFAC Microcomputer Application in Process Control, Istanbul, Turkey, 1986

EXTERNAL CONTROL AND GRAPHICS SIMULATION OF A TRACKING ROBOT

M. B. Zaremba and W. J. Bock

Université du Québec à Hull, Hull, Canada

Abstract. In advanced robotic systems operating in an unstructured environment, hierarchical, real-time control is needed to handle large volumes of sensory data and to execute the required operation. The paper concentrates on methods for visual guidance of tracking robots. The first part of the paper discusses optical sensors for robot vision and an industrial implementation of a robot guidance system using a photogrammetric 3-D vision sensor. Next, the paper describes a graphics system for the simulation of robot motion in an unstructured environment. The purpose of the system is to facilitate the design and analysis of higher-level tracking algorithms that will make obstacle avoidance possible and minimize the possibility of losing the target. The system takes the robot dynamics into account and simulates the use of various vision sensors.

Keywords. Robotics; sensors; vision systems; target tracking; graphical simulation; hierarchical control; microprocessor control.

INTRODUCTION

The evolution of robot control has led from simple point-to-point control, through supervisory continuous path control, up to multi-robot, flexible systems using AI techniques. Two areas can be distinguished as essential for further expansion of robotics in complex manufacturing systems: sensory capabilities and effective control techniques applied on the level of task and action planning.

The sensory information constitutes the perception part of the robotic system. The role of the control part is to connect perception to action. The control tasks are executed on different levels (Fig. 1). The Task Planning level supervises the operations to be performed by the robotic system. The Action Planning level plans a path to a specified goal, using sensory information that provides a model of the work environment. The Path Control requires a model that reflects the state of the environment as the plan is being executed. The problems of task and action planning are of particular importance, and subject to intensive research in the area of autonomous vehicle guidance (Crowley, 1985; Keirsey, 1985). Proper mission task control and vehicle navigation should result in getting a vehicle from one place to another in a reasonable time without any collisions with the obstacles. Similar problems also occur in flexible manufacturing systems (Freund, 1984; Shin, 1985).

The three above levels are executed externally for commercially available robots, whereas Robot Position Control is usually executed by the robot controller unit, which generates signals to joint servos. Figure 1 identifies a Tracking Control block. In the case of robots tracking an object, when the robot positions are determined by the trajectory of the target, the control aspects of the design are stressed rather than path planning. The main requirements focus on real-time response and stable, reliable operation. The tracking problem is often encountered in industrial practice, particularly in the domain of visual guidance of arc-welding robots (Baudot, 1983; Masaki, 1981). An analysis of the control system for a tracking robot equipped with a 3-D vision sensor is given later in the paper.

The use of sensed information in determining proper motion strategies leads to greatly enhanced robot performance capabilities, particularly for robots working in dynamically changing environments. Although artificial vision plays the main role, all available information will be employed to make robot technology as effective as possible, including information obtained from acceleration sensors, acoustic sensors, magnetic sensors, and force sensors. The initial part of the paper discusses a 3-D vision sensor designed at the National Research Council of Canada and optical range sensors used in machine vision applications.

The operation of the tracking robot in an unstructured environment renders it necessary to acquire information on the environment and subsequently elaborate obstacle avoidance strategies while not losing the target. Artificial intelligence techniques have recently been introduced to control robot motion in a dynamically changing environment. The final part of the paper presents an outline of a graphics system for the simulation of a tracking robot in such an environment. The system, currently under development at the University of Quebec in Hull, also makes it possible to simulate range sensors and the 3-D vision sensor described earlier in the paper.

Fig. 1. Hierarchical structure of robot control.

OPTICAL SENSORS FOR ROBOT VISION

Primary information to be sought with machine vision systems is always spatial information, including position, orientation and shape measurements. Research effort in this area is mainly focused on one problem: how to obtain more useful information in less time. This means simply how to overcome two major limiting factors, situated at the level of physical detecting and at that of post-detection processing. Therefore, new physical sensors are being invented to reduce the measuring time, and new measuring techniques that reduce post-processing (e.g. optical computing) are gradually being introduced. Range measurement appears to be the most important to acquire in each machine vision system. Basically, four generic range measurement techniques are known and described in the literature (Strand, 1985).

The most common and the largest class uses a triangulation concept based on geometric measurements. Most triangulation techniques are active. There are passive methods that use ambient illumination, but they are rarely used because they require a great deal of post-detection data processing. The simplest active method consists of projecting a single point onto the object and subsequently imaging this point on a detector. This point is then scanned across the object to provide the range data over the object field. An example of such a method, using a scanned laser beam and a single-element sensing detector, is described in (Kanade, 1984). The idea can be easily extended by projecting a line and using a 2-D detector, or even further by projecting a grid pattern that codes each position.

A second class of range-measuring techniques is based on the time-of-flight measurement. A signal carrier (acoustic or optical waves of known velocity) is sent from the measuring system, reflected off the object and detected by the observer after the measured period of time. Due to timing limitations, however, the range accuracy of the sensors is low (about 2 cm). The method requires fast detectors with a high SNR (signal-to-noise ratio), because the pulses of light are short and therefore carry very little energy.

Interferometric techniques are based on the comparison of the phase difference between the measuring beam, sent to the object and reflected back, and the reference beam. Phase difference between the beams results in an interference pattern, and the final measurement variable is light intensity. Commercial interferometric systems are available for numerous applications and cover the distance scale from nanometers to kilometers with extreme accuracy (1 part in 10). The fourth group consists of diffraction methods. If a spot on an object is coherently illuminated, the scattered light forms a speckle pattern on a screen. The correlation length of this pattern depends on the distance. These methods provide high data acquisition speed, with MHz data rates achievable.

When considering the application of any particular method, it is important to see how efficient the measurement process is. This can be described in terms of the useful bits of range data obtained per bit of measured data; if one has to take many measurements to obtain a single bit of range data, the method would obviously be inefficient and slow. To develop efficient implementations, careful analysis and computer simulation of the real measurement conditions are usually needed to keep the number of measurements and the execution time to a minimum.
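The single-spot triangulation method discussed above reduces to a one-line range formula. The sketch below is a minimal illustration under a pinhole-camera assumption; the parameter names (baseline, focal length in pixels, spot disparity) are ours, not the paper's:

```python
def triangulation_range(baseline_m, focal_px, disparity_px):
    """Range to a projected spot from the lateral shift of its image.

    baseline_m   -- projector-to-camera baseline in metres (assumed known)
    focal_px     -- camera focal length expressed in pixels
    disparity_px -- displacement of the imaged spot on the detector
    """
    if disparity_px <= 0:
        raise ValueError("spot must be displaced for a range solution")
    # Similar triangles: range / baseline = focal / disparity
    return baseline_m * focal_px / disparity_px
```

With a 0.1 m baseline and a 500 px focal length, a 50 px disparity corresponds to a 1 m range; halving the disparity doubles the range, which is why triangulation accuracy degrades for distant objects.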

A 3-D vision sensor based on photogrammetry technology was developed at the National Research Council of Canada (Pinkney, 1978). The sensor provides precise information on both position and orientation of a special target with reference to the optical centre of the camera. The sensor is therefore particularly well suited to situations where the information required relates to the location rather than to the shape of an object. The operation of the sensor is fast enough for real-time robot control (30-60 Hz).


CONTROL SYSTEM OF A TRACKING ROBOT

The vast majority of vision systems applied for robot guidance have been based on image processing. From the point of view of real-time control, image interpretation is in many cases computationally expensive and may introduce considerable noise into the control system.

This section discusses an application of the NRCC vision sensor mentioned above for the control of a tracking robot. A dotted target is mounted in a fixed position with reference to the tracked object. Given the offsets, the location of the object with respect to the robot end effector can be determined and kept at the desired value by the control system.

A set of targeted mechanical parts is located on a carrier suspended on an overhead conveyor. The carrier swings freely in a 6-degrees-of-freedom motion. Consequently, the robot is required to track and stationkeep on the selected target before an attempt to remove or load a part is made.

The task of the robot is to unload a part from a swinging carrier and deposit it in a container. The various phases of the operation include target selection and acquisition, subsequent part stationkeeping and possible target reacquisition, through to the final materials handling sequence.
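The offset bookkeeping described above is a composition of homogeneous transforms. A minimal sketch, assuming pure translations and invented placeholder values (the frame names and numbers are hypothetical, not the paper's calibration):

```python
def mat_mul(a, b):
    # 4x4 homogeneous transform product
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(x, y, z):
    return [[1, 0, 0, x],
            [0, 1, 0, y],
            [0, 0, 1, z],
            [0, 0, 0, 1]]

# Target pose reported by the vision sensor, in the camera frame.
camera_T_target = translation(0.05, 0.00, 0.80)
# Fixed, known offset from the dotted target to the tracked object.
target_T_object = translation(0.00, 0.10, 0.00)

# Object location in the camera frame: compose the two transforms.
camera_T_object = mat_mul(camera_T_target, target_T_object)
```

Because the target is rigidly mounted on the object, `target_T_object` is measured once; only `camera_T_target` changes at the vision sampling rate.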

The robot used in the development is an industrial Cincinnati Milacron T3 robot, equipped with a camera mounted on its wrist. The gripper action is monitored through a force/torque sensor. The robot controller and the external computer are connected by a serial DEPC (Direct External Path Control) communication link.

The diagram of the control system (Zaremba, 1986a) is shown in Fig. 2.

Fig. 2. Robot guidance system. (Block diagram: operator's console, graphics display and data logging computer attached to an external multiprocessor controller containing the vision system, graphics, and robot guidance processors; RS-170 video from the camera to the vision processor; RS-232 DEPC communication link to the robot controller driving the Cincinnati Milacron T3 robot arm with gripper and force/torque sensor; targeted parts on a carrier under an overhead conveyor, with part and carrier detection sensors.)


Fig. 3. Control software organization. (Block diagram: Task Planning level; Action coordinator; vision system solution and error handling algorithms; external control modules driving the robot manipulator and sensory system: robot controller and robot arm, video camera, sensor for operation progress, robot gripper.)

The structure of the control software is shown in Fig. 3. The position and orientation of the target with respect to the camera is obtained directly from the vision system. The solution is in the form of 6 coordinates: 3 linear (x, y, z) and 3 angular: pitch (θ), yaw (ψ), and roll (φ).

The external control is executed by three Intel 8086 processors and an 8087 mathematical co-processor. The first processor serves to obtain the vision system solution. The second is dedicated to graphics display. The control functions are handled by the third processor. During the initial development, an additional data logging computer was used to store and analyze the data.


The angular solution defines the homogeneous rotation matrix

$$
R(\theta,\psi,\phi)=
\begin{bmatrix}
\cos\psi\cos\theta & \cos\psi\sin\theta\sin\phi-\sin\psi\cos\phi & \cos\psi\sin\theta\cos\phi+\sin\psi\sin\phi & 0\\
\sin\psi\cos\theta & \sin\psi\sin\theta\sin\phi+\cos\psi\cos\phi & \sin\psi\sin\theta\cos\phi-\cos\psi\sin\phi & 0\\
-\sin\theta & \cos\theta\sin\phi & \cos\theta\cos\phi & 0\\
0 & 0 & 0 & 1
\end{bmatrix}
\qquad (1)
$$
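Equation (1) can be cross-checked numerically. The sketch below builds the rotation part of the standard roll-pitch-yaw composition Rz(ψ)·Ry(θ)·Rx(φ); this angle ordering is our assumption, since the matrix in the scanned original is only partially legible:

```python
from math import cos, sin

def rpy_rotation(theta, psi, phi):
    """3x3 rotation for pitch theta, yaw psi, roll phi: Rz(psi)Ry(theta)Rx(phi)."""
    ct, st = cos(theta), sin(theta)
    cy, sy = cos(psi), sin(psi)
    cr, sr = cos(phi), sin(phi)
    return [
        [cy * ct, cy * st * sr - sy * cr, cy * st * cr + sy * sr],
        [sy * ct, sy * st * sr + cy * cr, sy * st * cr - cy * sr],
        [-st,     ct * sr,                ct * cr],
    ]
```

At zero angles the matrix reduces to the identity, and every row keeps unit length for arbitrary angles, which is a quick sanity check on any transcription of (1).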

The tracking algorithm calculates the predicted position and orientation of the robot arm. The prediction consists of smoothing and extrapolating the desired values calculated up to the latest sensory solution. Therefore, for the vision sampling instants t_v, the transformation matrix describing the predicted position and orientation of the end effector (tool) with respect to the robot world coordinate system is: (2)
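The paper does not reproduce the smoothing formula behind its prediction step. As one plausible sketch, an alpha-beta filter smooths each tracked coordinate and extrapolates it one vision-sampling interval ahead; the gain values here are illustrative, not the paper's:

```python
def alpha_beta_predict(samples, dt, alpha=0.5, beta=0.2):
    """Smooth one scalar track coordinate and extrapolate one sampling
    interval ahead (illustrative alpha-beta filter, assumed scheme)."""
    x, v = samples[0], 0.0
    for z in samples[1:]:
        x_pred = x + v * dt          # extrapolate to the new sample time
        r = z - x_pred               # innovation: measurement minus prediction
        x = x_pred + alpha * r       # smoothed position
        v = v + (beta / dt) * r      # smoothed velocity
    return x + v * dt                # predicted position at the next sample
```

Run on a constant-velocity track, the prediction converges toward the true next position as more vision solutions arrive; the same filter would be applied independently to each of the six coordinates.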

The actual desired robot position and orientation R_D(t_v) is derived from the kinematic chain, as in Fig. 4, and calculated according to the formula:

R_D(t_v) = R(t_v) · E · S(t_v) · P · F⁻¹   (3)

Fig. 4. Kinematic chain for the target tracking system.

where all the elements are homogeneous 4-by-4 transformation matrices: E - from the vision sensor frame to the end effector frame, P - from the tracked part frame to the target frame, F - from the tracked object frame to the end effector frame.

Tracking, although the most important, is only one of a few operations supervised by the Action Coordinator. The coordinator, based on the sensory information and on operator commands, selects the phase of the robot cycle to be executed. The major situations are (Zaremba, 1986b):
- Part selection,
- Target acquisition,
- Normal tracking with the target within the camera field of view (FOV),
- Target reacquisition in the situations when the target is partially outside the FOV,
- Approaching and grasping the mechanical part,
- Materials handling sequence,
- Termination of the cycle.

The robot guidance system has been implemented, showing the advantages of the photogrammetric vision system, the hierarchical structure of the
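Formula (3) is a chain of homogeneous transforms. The sketch below evaluates it with pure translations and invented placeholder offsets, purely to show the composition order; the real E, S, P, and F would carry calibrated rotations as well:

```python
def mat_mul(a, b):
    # 4x4 homogeneous transform product
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def trans(x, y, z):
    return [[1, 0, 0, x], [0, 1, 0, y], [0, 0, 1, z], [0, 0, 0, 1]]

def trans_inv(t):
    # Inverse of a pure translation; a full rigid-body transform would
    # need the general inverse [R^T, -R^T p].
    return trans(-t[0][3], -t[1][3], -t[2][3])

# Placeholder offsets in metres; not the paper's calibration data.
R = trans(1.00, 0.00, 0.50)   # end effector in robot world coordinates
E = trans(0.00, 0.00, 0.10)   # vision sensor frame to end effector frame
S = trans(0.00, 0.20, 0.60)   # vision solution: target w.r.t. the sensor
P = trans(0.00, 0.05, 0.00)   # tracked part frame to target frame
F = trans(0.00, 0.00, 0.20)   # tracked object frame to end effector frame

# R_D(t_v) = R(t_v) . E . S(t_v) . P . F^-1
R_D = mat_mul(mat_mul(mat_mul(mat_mul(R, E), S), P), trans_inv(F))
```

With pure translations the chain simply sums the offsets, with F entering with a negative sign through its inverse, which makes the composition order easy to verify by hand.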


system controller, and the prediction-based tracking. Excluding the target acquisition phase and tracking phase, the system operates in a structured environment. There are no obstacles intersecting the robot path. A further step to enhance the system capabilities is to extend its operation to an environment which is not fully structured.

GRAPHICS SIMULATION

The design and verification of the algorithms for the control of a tracking robot in an unstructured environment with obstacles requires the application of simulation tools. A program for graphics simulation of robot motion in such an environment is currently under development at the University of Quebec in Hull. The simulation is performed on a Jupiter 7+ graphics terminal connected via a DMA link with a VAX-11/750 minicomputer under the ULTRIX-32 operating system. The purpose of the program is to analyze algorithms on the task and action planning level, taking into account the inertia of the robot. The robot dynamics are accounted for by the generation of a "forbidden volume" (Ramirez, 1984) around the robot arm. The dimensions of the volume are functions of the velocity and inertia of the robot links.

The program organization is shown in Fig. 5. The main module is the trajectory planner, activated when an intersection between the forbidden volume and an obstacle is detected. The planner works out a strategy that will allow the robot arm to avoid the obstacles and reacquire the target if the target is lost from the camera FOV during the process. The module makes use of the information in the environment data base, the models of the variable parts of the environment obtained from the sensory data, and the knowledge base. The knowledge base can contain the priority rules for actions to handle dangerous situations, dynamic parameters of the robot, collision avoidance algorithms, the tolerances of the environment model allowed, etc.

CONCLUSIONS
Sensory capabilities, as well as effective control strategies and techniques, determine the applicability of robotic systems for industrial processes involving tracking tasks in an environment that is only partly structured. The analysis and implementation of the sensory-based robot guidance development presented in the paper demonstrated that the NRCC photogrammetric vision system, combined with the hierarchical structure of the system controller and prediction-based tracking, provides a real-time, industrially applicable solution for a flexible system with a certain degree of intelligence. Operation in an environment that is fully unstructured imposes more severe requirements on the higher-level control algorithms. In order to provide a tool for the design and analysis of such algorithms, a graphics simulation system is being developed at the University of Quebec in Hull.


ACKNOWLEDGEMENTS

The work reported in this paper was supported by the University of Quebec in Hull, grant SIR-52, and by the National Research Council of Canada, PILP program office.

REFERENCES

Baudot, W., and others (1983). Visually guided arc-welding robot with self-training features. Proc. Robots 7 Conference.
Brady, M., and Paul, R. (Eds.) (1984). Robotics Research: The First International Symposium. The MIT Press, Cambridge, Mass.
Crowley, J.L. (1985). Navigation for an intelligent mobile robot. IEEE J. Robotics and Automation, RA-1(1), 31-41.
Freund, E., and Hoyer, H. (1984). Collision avoidance for industrial robots with arbitrary motion. J. Robotic Systems, 1(4), 317-329.
Kanade, T., and Asada, H. (1981). In B.R. Altschuler (Ed.), 3D Machine Perception, SPIE, vol. 283.
Keirsey, D., and others (1985). Autonomous vehicle control using AI techniques. IEEE Trans. Software Engineering, SE-11(9), 986-992.
Masaki, I., and others (1981). Arc welding robot with vision. Proc. 11th ISIR.
Paul, R. (1981). Robot Manipulators: Mathematics, Programming, and Control. The MIT Press, Cambridge, Mass.
Pinkney, H.F.L. (1978). Theory and development of an on-line 30 Hz video photogrammetry system for real-time three-dimensional control. Proc. ISP Symp. Photogrammetry for Industry.
Ramirez, C.A. (1984). Robotic intelligent safety system. Proc. Robots 8 Conference, 50-62.
Shin, K.G., and Malin, S.T. (1985). A structured framework for the control of industrial manipulators. IEEE Trans. Systems, Man, Cybern., SMC-15(1), 78-90.
Strand, T.C. (1985). Optical three-dimensional sensing for machine vision. Optical Engineering, 24(1), 33-43.
Zaremba, M.B., Poissant, M.A., and Hageniers, O.L. (1986a). Robot target tracking system. Proc. British Robot Association 9 Conference.
Zaremba, M.B., Poissant, M.A., and Hageniers, O.L. (1986b). Sensory information for supervisory control of a tracking robot. Proc. RAI/IPAR '86 Conference.


Fig. 5. Organization of the program for graphics simulation of robot motion. (Flowchart blocks: calculation of actual robot position (inverse kinematic solution, dynamics models, limitations); target position; determination of the "forbidden volume"; generation of the environment model (sensory information, environment data base); collision detection test; determination of the collision zone; trajectory planner (obstacle avoidance algorithms, prediction of target position, target reacquisition); tracking algorithm; generation of graphics display.)
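The collision test at the centre of Fig. 5 can be sketched with a simple velocity- and inertia-dependent envelope. The linear scaling law and all numbers below are our assumptions, since the paper gives no formula for the forbidden volume:

```python
import math

def forbidden_radius(r_static, speed, inertia, k=0.05):
    # Safety envelope around a link, growing with link speed and inertia
    # (assumed linear law; the paper only states the dependence exists).
    return r_static + k * speed * inertia

def collision_detected(link_pos, obstacle_pos, obstacle_r,
                       r_static, speed, inertia):
    # Spherical approximation of both the link envelope and the obstacle.
    d = math.dist(link_pos, obstacle_pos)
    return d < forbidden_radius(r_static, speed, inertia) + obstacle_r

# The same obstacle is safe at low speed but triggers the trajectory
# planner when the arm moves fast enough to inflate its envelope.
slow = collision_detected((0, 0, 0), (0.6, 0, 0), 0.1, 0.2, 0.1, 1.0)
fast = collision_detected((0, 0, 0), (0.6, 0, 0), 0.1, 0.2, 8.0, 1.0)
```

In the simulator this predicate would gate the branch between the plain tracking algorithm and the trajectory planner shown in Fig. 5.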