An Approach to Model-Based Robot Software for Industrial Applications

Toshio Sata, Fumihiko Kimura, Hiroyuki Hiraoka and Masayuki Enomoto *
Department of Precision Machinery Engineering, Faculty of Engineering, University of Tokyo, Hongo, Bunkyo-ku, Tokyo 113, Japan
This paper describes a robot programming system for industrial applications which utilizes models of the robot and its environment. The system generates robot motion information from a task description written in a high-level language, and verifies the generated motion by model-based simulation. An interface to sensory-interactive robots is also discussed.
Keywords: Off-line Robot Programming, Robot Language, Solid Modelling System, Model of the Environment, Sensory Interactive Robotics
Toshio Sata graduated from the University of Tokyo, Faculty of Engineering, Department of Precision Machinery Engineering in 1948. After his graduation, he joined the Institute of Physical and Chemical Research as a research member and worked on various subjects such as wear, plastic forming, metal cutting, grinding and precision measurement. He received his doctorate in Engineering Science in 1959 from the University of Tokyo. Dr. Sata joined MIT, Department of Metallurgy, as a research associate in 1961 and then the Carnegie Institute of Technology, Department of Mechanical Engineering, as a Senior Research Engineer. In 1965 he was appointed a Professor at the University of Tokyo, Faculty of Engineering, Department of Precision Machinery Engineering. Since then, he has been working on the subjects of metal cutting technology, computer control of machine tools, flexible manufacturing systems, computer aided design and computer aided manufacturing. Internationally he is a Past President of CIRP, a council member of the EXAPT Verein, and a member of the technical committees of several international federations such as IFIP, IFAC and IMEKO.
* Now in the Production Engineering Laboratory, NEC Corporation.

North-Holland
Computers in Industry 7 (1986) 211-225
Fumihiko Kimura is an Associate Professor in the Department of Precision Machinery Engineering, University of Tokyo, and was a Research Associate at the Electrotechnical Laboratory, Ministry of International Trade and Industry, from 1974 to 1979. For the last 10 years, he has been active in the fields of geometric modelling (GEOMAP) and sculptured surface synthesis. His research interests now also include product modelling and engineering databases, CAD/CAM system architecture, man-machine interaction and computer graphics, robotics, and the application of artificial intelligence techniques to CAD/CAM. He is involved in the graphics standardization activities of ISO/TC97/SC5/WG2, and is a member of IFIP WG5.2/5.3. Fumihiko Kimura received a Dr. Eng. Sci. degree in aeronautics from the University of Tokyo in 1974. He is a member of ACM and the Eurographics Association.
Hiroyuki Hiraoka is a Research Associate in the Department of Precision Machinery Engineering, University of Tokyo. He works on sensory interactive robotics and the utilization of models in robotics, and is involved in the development of the geometric modelling system GEOMAP-III. His research interests also include off-line robot programming, product modelling, computer aided design and computer aided manufacturing. Dr. Hiraoka received a Dr. Eng. Sci. degree in mechanical engineering from the University of Tokyo in 1983.

Masayuki Enomoto is a member of the Production Engineering Laboratory, NEC Corporation, Kawasaki, Japan. He received the B.E. and M.E. degrees in precision machinery engineering from the University of Tokyo in 1982 and 1984 respectively. His M.E. thesis research was on model-based robot programming systems. Since he joined NEC Corporation in 1984, he has been engaged in research and development of total advanced production systems. His research interests include computer integrated manufacturing systems, CAD/CAM systems and robotics.
1. Introduction

There is a strong tendency these days to introduce industrial robots into production systems to cope with increased requirements for flexibility. But these robots are usually operated by the teaching/play-back method to repeatedly perform simple predefined operations, which is quite insufficient for flexibility requirements. It is not easy for teaching/play-back robots to realize complicated operations like assembly, to change their operations flexibly according to changes of predefined conditions, or to adapt to variations of the working environment with the use of sensors. Nowadays we can buy various types of good robot hardware, but it is becoming almost impossible to teach them how to do their jobs efficiently. Therefore, it is strongly desirable to develop powerful programming systems that let industrial robots perform practical operations.

A great deal of work has already been done on robot programming systems. However, robot programming systems are not yet generally considered practical tools. The major reason seems to be as follows: conventional robot programming languages are rather low-level, so a human programmer must specify all the necessary robot motion information precisely. This is a cumbersome task, and it is extremely hard to write a program that can actually perform complicated operations. In order to realize a high-level robot programming language, it is probably necessary to give robots general basic knowledge about their own characteristics (models of robots), their working environments, the objects to be manipulated, the specification of the jobs to be performed, etc. There are many research activities in the fields of artificial intelligence and intelligent robotics, but they are rather scientific in nature, and not practical enough to be used in factory environments. Here we propose several basic principles which may be useful for constructing practical high-level robot software.

(i) In order to cope with the complexity of practical jobs, the programming system should be interactive. The system should be furnished with basic knowledge about robot manipulation, such as models of robots and environments, and should offer a good programming environment for human programmers. Human programmers can then make intelligent decisions about difficult problems, such as work planning. By combining the power of both man and computer, it becomes possible for robots to perform complicated practical jobs.
Based on similar considerations, we have already proposed a general framework for integrated CAD/CAM systems [1,2]. Interactive robot software should be considered as a part of such an integrated CAD/CAM system. Then, during robot programming, we can utilize all the necessary information that was developed during product and manufacturing design.

(ii) In order to achieve high efficiency of robot operations, it is better to split the robot software into two parts: off-line programming and run-time control. In the off-line part, nominal commands for robot operations are generated from high-level task specifications. This is usually complicated work, and requires a rather powerful large-scale computer. The generated robot commands are then down-loaded into the run-time control systems. With this concept, it is not necessary for each robot to be equipped with an expensive computer, and it is still possible to achieve complicated operations. Difficult problems remain concerning the types of information that are transferred between the off-line and run-time systems. We are now elaborating the idea based on the general framework shown in Fig. 1 (off-line part) and Fig. 2 (run-time part).

(iii) In order to cope with uncertainty about the working environment, it is essential to use various kinds of sensors during robot operations. There are many research works on computer vision, tactile sensing, etc. for robot applications. But it is still not clear how to maintain the information about the working environment based on sensor information; in other words, how to keep the correspondence between the real world and the simulated world in robot programming systems.

In this paper, we concentrate on the first topic, that is, an off-line robot programming system which gives an easy-to-use programming tool. By investigating this system, we will show the effectiveness of the model-based approach to robot programming, and suggest further work in this field. As to Fig. 1, we will discuss only the motion planner, the kinematic simulator, and the part model. We have also done some related work, for example on kinematic simulation [3], dynamic simulation [4], and scene analysis based on models [5], but many topics remain untouched. A minimal sketch of the information that crosses the off-line/run-time boundary is given below.
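To make point (ii) concrete, the following sketch shows one possible shape for the package that the off-line system down-loads to a run-time controller: a list of nominal intermediate codes plus the simplified environment model described later in Section 2.5. The Python types and names (IntermediateCode, DownloadPackage) are hypothetical illustrations, not part of the original system.

```python
# A minimal sketch, assuming hypothetical names, of the information package
# down-loaded from the off-line system to the run-time controller.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class IntermediateCode:
    opcode: str                     # e.g. "speed", "move", "dmove", "close", "open", "end"
    operand: Optional[str] = None   # symbolic name, resolved against the run-time model

@dataclass
class DownloadPackage:
    codes: list = field(default_factory=list)            # nominal motion commands (IntermediateCode items)
    run_time_model: dict = field(default_factory=dict)   # simplified environment model for sensory correction
```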
Fig. 1. Conceptual framework for a model-based robot programming system. (The operator works through a user interface with language and display handlers; a planning library feeds the process, action, motion and sensing planners and an intermediate-code generator with a run-time state manager; geometric, kinematic, dynamic and sensor simulators operate on part, robot, sensor and geometric models carrying mass properties, assembly information and control/sensory performance data.)
2. Model-Based Programming System for Machine Assembly

2.1. Introduction

Following the considerations in the previous section, we have developed a prototype model-based programming system for simple machine assembly [6,7].

Here we roughly classify robot languages into two categories: task description languages and motion description languages. In practical situations, it is cumbersome and error-prone to use motion description languages. It is therefore desirable to use task specifications, such as "place an object" or "insert an object". The problem is how to translate task specifications into motion commands. For this purpose, it is effective to use models of the robots and of the parts to be handled. The achievable levels of task description languages
Fig. 2. Hardware configuration of a sensor-based industrial robot: an off-line computer (VAX 11/780), a run-time computer, and a vision computer (PDP 11/34) with an image processing unit, a camera and the robot hand.
depend heavily on the ability of these models. Conventional programming systems in many cases utilize only coordinate systems as models, so high-level operations are not possible. Here we use more general and powerful models based on solid models, which lets us provide a user-friendly programming interface. We use our own solid modeller GEOMAP-III [8] for this purpose. The system structure of GEOMAP-III is shown in Fig. 3. The model data structure is handled by the data structure management system (DSMS) of GEOMAP-III, and is represented as a so-called boundary representation; see Fig. 4.

As stated in the previous section, the run-time control system also needs to manipulate the environment information in order to cope with sensory activities. We propose to give the run-time system a simplified model of the environment, called the "run-time model", as a database for keeping the current environment information. The run-time system modifies the robot motion to correspond to the real environment state, whose information is measured by the sensors and stored in the run-time model.

2.2. Environment Models
For robot programming, we need several kinds of models: of the robots, of the parts to be manipulated, of fixtures and jigs, of other obstacles, etc. There is no fundamental difference among these models. We proposed a general representation framework, called a product model, in [2]; all the above models can be treated as specializations of a product model.

Fig. 3. Geometric modelling package GEOMAP-III. (The figure shows primitive generation, movement and set operations, model definition and manipulation, shape modification operations, display utilities and free-form curve and surface synthesis built on the data structure management system.)

Fig. 4. Internal representation of a solid in GEOMAP-III. (Boundary representation: SOLID, SHELL, FACE, FACELOOP, EDGE and VERTEX topology linked 1:n to SURFACE — plane, quadrics, torus, free-form — CURVE — straight, arc, free-form — and POINT geometry.)

Fig. 5. Internal representation of machine parts for robot manipulation. (ENVIRONMENT, ASSEMBLY, PART, POSITION, GRIP, PAIR and CONSTRAINT entities.)

We concentrate here on the model of the machine parts to be handled. For the application of assembly planning and operations, it is necessary to represent various technological information in addition to the geometry of the parts. We use GEOMAP-III for representing the basic geometric information, as well as the other additional information, with the aid of the DSMS of GEOMAP-III (Fig. 5). In this figure, "ENVIRONMENT" consists of several
BLY'"s, and "ASSEMBLY" consists of several "PART'"s, and so on. More general and powerful approach is briefly described in [2]. We can divide the information contained in the part models into two categories: the one that is characteristic to machine parts, and the other that has some relations with assembling operations. The former information is normally generated during product design phase, and can be used without redefining, if complete integration of CAD and robot software is established. The latter information must be supplied at the assembling planning phase. Important information in parts models is as follows. (1) Shape Information In addition to the usual geometric elements used in solid modelling systems, machine-oriented characteristic features are necessary, such as holes, shafts, grooves, etc. These features are used for specifying positional relations among parts. (2) Technological Information This includes weight and center of gravity of parts, surface roughness, dimensions and tolerances etc. This information is used for various decisions and verifications of robot motion. (3) Connection Information This exhibits connection relation among parts. Currently we use plane pair, shaft-hole pair, fixed pair, slide pair etc. In our system, users give the specification of the initial and final connection information as a part of task specification. In
some cases, this information can be derived from the CAD database or from the results of other task specifications. According to the user's program, the system automatically manages the transition from the initial connectivity to the final connectivity.
(4) Position information. This specifies the position and orientation of the initial and final states of the parts. The current position information is automatically managed by the system during robot operations.
(5) Grasping information. This is the specification of standard default grasping data, such as gripping position, orientation, grip width, etc. If a programmer does not specify a special grasping position, the system uses this default data.

Much of the above information can be derived from the CAD database; the rest must be specified at the robot programming phase.
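The following sketch summarizes the five kinds of information above as a single data structure. It is hypothetical Python, not GEOMAP-III code; the field names and types are our own illustration.

```python
# A minimal sketch, in hypothetical Python, of the part-model information
# enumerated in items (1)-(5); GEOMAP-III itself stores this in its DSMS.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class PartModel:
    name: str
    # (1) shape: characteristic features on top of the solid model
    features: dict = field(default_factory=dict)      # e.g. {"hole1": "hole", "shaft2": "shaft"}
    # (2) technological information
    mass: float = 0.0                                 # kg
    center_of_gravity: tuple = (0.0, 0.0, 0.0)
    # (3) connection information: (pair_type, own_feature, mating_feature)
    connections: list = field(default_factory=list)   # e.g. [("shaft-hole", "shaft2", "hole7")]
    # (4) initial and final position/orientation
    initial_frame: tuple = (0.0, 0.0, 0.0)
    final_frame: tuple = (0.0, 0.0, 0.0)
    # (5) default grasping data, used when no GRASP qualifier is given
    grip_faces: Optional[tuple] = None                # e.g. ("face2", "face3")
    grip_width: float = 0.0
```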
2.3. Assembly Description Language

Table 1 shows the keywords of the assembly description language. We omit their detailed definitions, but many of them are self-explanatory.

(1) Commands for assembling: there are four assembling commands; all the other entries in Table 1, from (2) to (5), are qualifiers to these commands. If no qualifiers are specified, the necessary information is extracted from the part model of the final state.

(2) Qualifier for temporal position: FINAL_POSITION. This qualifier is used when it is necessary to specify positions other than the final position.

(3) Qualifier for approach direction: APPROACH. This specifies the approach direction of the robot motion.

(4) Qualifier for intermediate points: VIA. Intermediate points are defined when they should be passed before the target position.

(5) Qualifier for grasping: GRASP. If a programmer wants to specify a special grasping condition, he can use this qualifier.

Some examples of the use of these assembling commands are shown in Fig. 6, and a representation sketch follows Table 1.
Table 1. Assembly Description Language

1. Commands for assembling:
   PLACE 'object
   SLIDE 'object
   ROTATE 'object
   INSERT 'object
2. Qualifier for temporal position: FINAL_POSITION
   ex. PLACE 'object FINAL_POSITION(AT 0,0,100)
   CONTACT 'face1, 'face2
   ALIGN 'hole, 'pin
   FIT 'face1, 'face2
   FROM_PRESENT 'face, 'dis
   FROM_FINAL 'face, 'dis
   AROUND 'hole, 'theta
   ON 'object
   AT 'vector
   TO 'vector
3. Qualifier for approach direction: APPROACH
   ex. PLACE 'object APPROACH(PARALLEL 'face)
   PARALLEL 'face
   ANTIPARALLEL 'face
   FROM_ABOVE
   FROM_BELOW
4. Qualifier for intermediate point: VIA
   ex. PLACE 'object VIA(100,100,100)
5. Qualifier for grasping: GRASP
   ex. PLACE 'object GRASP 'face1,'face2 (FROM_PARALLEL 'face3)
   FROM_PARALLEL 'face
   FROM_ANTIPARALLEL 'face
   FROM_ALIGNED 'face
   FROM_ABOVE
   FROM_BELOW
   DEPTH_HAND_DEEP
   DEPTH_HAND_MEDIUM
   DEPTH_HAND_THIN
   DEPTH_HAND 'depth
   DEPTH 'face, 'depth
   APART_FROM 'face
   APART_FROM 'face IN_FINAL
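As an illustration of how such a command and its qualifiers might be held before expansion, the sketch below encodes one PLACE statement built from the keywords of Table 1 as plain data. The representation is our own assumption, not the system's internal format.

```python
# A minimal sketch, assuming a hypothetical internal representation, of one
# assembling command with FINAL_POSITION, APPROACH and VIA qualifiers.
from dataclasses import dataclass, field

@dataclass
class AssemblyCommand:
    verb: str            # PLACE, SLIDE, ROTATE or INSERT
    obj: str
    qualifiers: dict = field(default_factory=dict)

# PLACE 'object FINAL_POSITION(CONTACT 'face1,'face2; ALIGN 'hole,'pin)
#       APPROACH(FROM_ABOVE) VIA(100,100,100)
cmd = AssemblyCommand(
    verb="PLACE",
    obj="object",
    qualifiers={
        "FINAL_POSITION": [("CONTACT", "face1", "face2"), ("ALIGN", "hole", "pin")],
        "APPROACH": [("FROM_ABOVE",)],
        "VIA": [(100, 100, 100)],
    },
)
```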
2.4. Generation of Robot Motion

All the assembly description commands stated in the previous section can be translated into sets of simple motion commands in a straightforward manner. One example is shown in Fig. 7. In order to generate the motion commands, we have to obtain the joint angles of the robot from the Cartesian coordinates of the trajectory, that is, to solve the so-called "inverse kinematics" problem. As there are many difficulties in providing a general solution to this problem, we take the approach of providing a dedicated kinematic subroutine for each configuration of the robot.
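As an illustration of such a dedicated subroutine, the sketch below gives the standard closed-form solution for a two-link planar arm; each real robot configuration would get its own analogous routine, and this code is our example rather than the paper's.

```python
# A minimal sketch of a dedicated inverse-kinematics subroutine for a
# two-link planar arm with link lengths l1 and l2 (our example geometry).
import math

def planar_2link_ik(x: float, y: float, l1: float, l2: float):
    """Return joint angles (theta1, theta2) that reach Cartesian point (x, y)."""
    # law of cosines for the elbow angle
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target point out of reach")
    theta2 = math.acos(c2)                    # elbow-down branch
    k1 = l1 + l2 * math.cos(theta2)
    k2 = l2 * math.sin(theta2)
    theta1 = math.atan2(y, x) - math.atan2(k2, k1)
    return theta1, theta2
```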
Fig. 6. Examples of robot assembling operations. (The figure shows parts part1-part4 annotated with PLACE, SLIDE and INSERT commands and their APPROACH, FINAL_POSITION and GRASP qualifiers.)
Fig. 7. Expansion of the assembling command PLACE into a set of motion commands.
Table 2. Intermediate Codes

1: speed: 0100
2: move: 5:VIA1
3: move: 6:OBJ1_POS1
4: speed: 0030
5: dmove: 7:OBJ1_REL1
6: close: 20.0
7: speed: 0050
8: move: 6:OBJ1_POS1
9: speed: 0070
10: move: 8:VIA2
11: move: 9:OBJ2_POS1
12: speed: 0030
13: dmove: 10:OBJ2_REL1
14: open:
15: speed: 0050
16: move: 9:OBJ2_POS1
17: end:
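The sketch below shows how the pick phase of such an expansion could be produced for a PLACE command, mirroring the structure of Table 2 (travel speed, VIA points, a slow relative dmove, gripper close, retract). The helper and its arguments are hypothetical.

```python
# A minimal sketch, with hypothetical arguments, of expanding the pick phase
# of a PLACE command into intermediate codes shaped like those of Table 2.
def expand_place_pick(obj_pos: str, obj_rel: str, via_points: list,
                      grip_width: float) -> list:
    codes = [("speed", "0100")]
    for p in via_points:                   # visit VIA points at travel speed
        codes.append(("move", p))
    codes.append(("move", obj_pos))        # nominal position near the part
    codes.append(("speed", "0030"))
    codes.append(("dmove", obj_rel))       # slow relative approach to the part
    codes.append(("close", grip_width))    # grasp
    codes.append(("speed", "0050"))
    codes.append(("move", obj_pos))        # retract with the part
    return codes

# expand_place_pick("6:OBJ1_POS1", "7:OBJ1_REL1", ["5:VIA1"], 20.0)
# reproduces lines 1-8 of Table 2.
```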
For trajectories generated from this kinematic consideration, we still have to check their feasibility in other respects. Collisions between the robot and the obstacles in the environment are detected at several points of a trajectory segment by utilizing the intersection operation of the geometric modelling system; a sketch of this sampling scheme is given below. When collisions are found, the user can modify the trajectory interactively by indicating extra intermediate points.

Not only geometric feasibility but also physical feasibility should be checked. As one of the physical feasibility checks, a grasping simulation program has been developed to determine the necessary gripping force [7]. Fig. 8 shows an example of the results of the grasping simulation.

Currently we do not use complicated planning for fine motion generation. For practical applications this is satisfactory in many cases, but of course more elaborate methods will be needed for further applications.
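A sketch of the sampling scheme: interpolate postures along a segment and ask the modeller for an intersection at each one. The collision predicate is a caller-supplied stand-in for the GEOMAP-III intersection operation, not the original code.

```python
# A minimal sketch of collision checking along a trajectory segment.
# intersects_obstacle(q) is assumed to place the robot's solid model at
# posture q and run the modeller's intersection test against the obstacles.
from typing import Callable, Sequence

def segment_collides(q_start: Sequence[float], q_end: Sequence[float],
                     intersects_obstacle: Callable[[list], bool],
                     samples: int = 8) -> bool:
    for i in range(samples + 1):
        t = i / samples
        q = [a + t * (b - a) for a, b in zip(q_start, q_end)]  # joint-space interpolation
        if intersects_obstacle(q):
            return True                    # user then adds extra intermediate points
    return False
```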
Fig. 8. Simulation result of grasping force. Part: OBJ3 (mass = 48 × 10⁻³ kg). Simulation: 1.56 × 10⁻² N. Experiment: 1.53 × 10⁻² N.
2.5. Preparation for Sensory Interactive Activities

The robot programming system generates a sequence of robot motion commands which is applicable to an existing industrial robot. However, a different type of information must be generated when the target run-time system is a robot with sensory interactive functions. We have developed a prototype run-time control system which has such sensory interactive functions [9,10]. These functions are also realized through the model-based approach: the system has a simplified model of the environment in order to maintain the information of its changeable state. As the information in the model is modified according to the data acquired from the sensors, the system can control the robot so as to accommodate its motion to the current state of the environment by referring to the information in the model.

This simplified model for the run-time control system, which we call the "run-time model", contains information for sensory activities, information of state, and information for motion. The run-time control system keeps the correspondence between the real world and the predicted one using the information in the run-time model, as follows (Fig. 9). The state underlying the data acquired from the sensors is easily identified, because the predicted sensory data are classified, in the information for sensory activities, in relation to the corresponding features. In our system, we have developed a table of features of the
visual image as an example of the information for sensory activities. The state of the sensory data is interpreted into the state of the environment by use of the information of state. As the deviation of the real environment from the predicted one is calculated, the information for motion, which is represented in terms of a network of coordinate frames [11], is modified. The run-time control system thus obtains the current data required for motion, such as grasping points or approach points, by referring to this information for motion; a sketch of this resolution step follows below.

All the initial information for the run-time model is generated through the simulations executed in the off-line programming system. Coordinate frames for an object are generated from geometric data of the environment based on CAD information. For example, when we consider the task of grasping a mechanical part according to the information from visual sensors, the preparatory process in the off-line system is as follows. First, all the stable states of the part are predicted by calculation from its center of gravity. For every stable state, the visual images are predicted by simulation and classified according to their features. Lastly, the relations between the classified visual features and the states of the part are arranged in the run-time model.

As we have a run-time model in the run-time control system, the intermediate motion codes which are transferred from the off-line system to the run-time system do not refer to specific points with numerical values but to names registered in the model. Thus a single sequence of motion codes is valid for many situations, as the system can accommodate the model to the real environment using its sensors.
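The sketch below walks through this resolution step with illustrative numbers in the spirit of Fig. 9: an observed feature identifies the state, the stored deviation corrects the object's origin frame, and a symbolic target such as approach_A is re-derived from the corrected frame. The dictionary layout is our assumption.

```python
# A minimal sketch, with an assumed dictionary layout, of resolving a
# symbolic motion target against the run-time model (cf. Fig. 9).
def resolve_target(model: dict, observed_feature) -> tuple:
    state = model["feature_to_state"][observed_feature]                # identify the state
    deviation = model["state_deviation"][state]                        # real minus predicted
    origin = tuple(o + d for o, d in zip(model["origin"], deviation))  # corrected object frame
    return tuple(p + a for p, a in zip(origin, model["approach_offset"]))

model = {
    "feature_to_state": {("line", 40): "state_1"},   # classified visual feature
    "state_deviation": {"state_1": (-2, 4, 0)},      # measured by the sensors
    "origin": (100, 110, 20),                        # nominal object frame
    "approach_offset": (15, 8, 0),                   # stored relative to the origin
}
print(resolve_target(model, ("line", 40)))           # -> (113, 122, 20)
```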
2.6. Examples

We have developed prototype software and tried several assembly examples. Suppose we want to assemble the parts shown in Fig. 10. A human programmer might give the assembling commands shown in Fig. 11; this set of commands has the effect shown in Fig. 12. These commands are expanded automatically by the command processor into the sets of motion commands shown in Fig. 15.
Fig. 9. Generation of motion which corresponds to the current state of the environment by use of the run-time model. (An observed visual feature of object A selects a state via the information for sensory activities; the information of state gives the deviation of the real environment from the predicted one; the information for motion then corrects the object origin and the approach point stored relative to it, so that the intermediate code "move approach_A" is issued to the arm controller as "move (113,122,20)".)
(The run-time model here contains only the information for motion, as no sensors are used in this example.)

We have also developed a prototype run-time control system and tried simple examples of sensory-interactive robot motion. An algorithm which utilizes the high-intensity regions of metal parts [5] is employed for visual recognition; a thresholding sketch of this idea is given below. The sensory-interactive motion of grasping a mechanical part is executed for several states of the part, using a single sequence of intermediate codes and a run-time model. An image acquired from the visual sensor is shown in Fig. 14, and the extracted highlight regions are shown in Fig. 13. Fig. 17 shows the resulting grasping motion of the robot, accommodated by this information.
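The sketch below illustrates the high-intensity idea with simple thresholding and 4-connected region growing; it is a stand-in for the recognition algorithm of [5], whose actual implementation is not reproduced here.

```python
# A minimal sketch of extracting high-intensity ("highlight") regions from a
# grey-level image: threshold, then grow 4-connected regions by flood fill.
def highlight_regions(image, threshold=200):
    h, w = len(image), len(image[0])
    seen = set()
    regions = []
    for y in range(h):
        for x in range(w):
            if image[y][x] >= threshold and (y, x) not in seen:
                region, stack = [], [(y, x)]
                while stack:                       # flood-fill one bright region
                    cy, cx = stack.pop()
                    if not (0 <= cy < h and 0 <= cx < w) or (cy, cx) in seen:
                        continue
                    if image[cy][cx] < threshold:
                        continue
                    seen.add((cy, cx))
                    region.append((cy, cx))
                    stack += [(cy + 1, cx), (cy - 1, cx), (cy, cx + 1), (cy, cx - 1)]
                regions.append(region)
    return regions
```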
Fig. 10. Example task: parts to be assembled.

PLACE OBJ2 FINAL(CONTACT F287,F228; FIT F276,F243; FIT F271,F242);
GRASP F271,F277;
INSERT OBJ3 FINAL(ON OBJ2);
GRASP F265,F269 (APART FROM F277 IN FINAL);
SLIDE OBJ2 GRASP F279,F281;
PLACE OBJ4 GRASP F318,F316 (APART FROM F271 IN FINAL);
END

Fig. 11. Assembling commands for the parts in Fig. 10.

Fig. 12. Assembling operations specified by the commands in Fig. 11.
Fig. 13. Highlight regions extracted from the image in Fig. 14.
Fig. 14. Example of sensory interactive grasping motion: acquired image from visual sensor.
Fig. 15. Motion commands which correspond to the assembling commands in Fig. 11, together with the run-time model. (The listing expands the program of Fig. 11 into 55 intermediate-code lines of speed, move, dmove, open and close operations referring to symbolic names such as OBJ2_POS1, OBJ2_REL1 and OBJ4_POS1, alongside the table of positions registered in the run-time model.)
Fig. 17. Robot motion modified by the sensory data of Fig. 16.
3. Conclusion
We have proposed a general framework for an advanced robot software system which consists of off-line and run-time parts, and we have briefly explained a model-based robot assembly programming system as a part of the off-line modules. This simple experiment shows the usefulness of the model-based approach to robot software. Our future work will include:
(1) a clearer definition of the interface between the off-line and run-time systems;
(2) the use of sensor information at higher feedback rates for sensory interactive motion.

References

[1] F. Kimura, T. Sata and M. Hosaka: Integration of Design and Manufacturing Activities Based on Object Modelling. Advances in CAD/CAM (North-Holland, 1983) 375-385.
[2] F. Kimura, S. Kawabe, T. Sata and M. Hosaka: A Study on Product Modelling for Integration of CAD/CAM. Computers in Industry, Vol. 5, No. 3 (1984) 239-252.
[3] T. Sata, F. Kimura and A. Amano: Robot Simulation as a Task Programming Tool. Proc. of 11th ISIR (1981) 595-602.
[4] Y. Mizugaki: Dr.Eng. Thesis (in Japanese). Dept. of Prec. Mach. Eng., University of Tokyo (1984).
[5] K. Takai, F. Kimura and T. Sata: A Fast Visual Recognition System of Mechanical Parts by Use of Three Dimensional Model. Proc. of COMPINT (1985).
[6] M. Enomoto: Ms.Eng. Thesis (in Japanese). Dept. of Prec. Mach. Eng., University of Tokyo (1984).
[7] Y. Taguchi: Ms.Eng. Thesis (in Japanese). Dept. of Prec. Mach. Eng., University of Tokyo (1985).
[8] F. Kimura: GEOMAP-III: Designing Solids with Free-form Surfaces. IEEE CG&A, Vol. 4, No. 6 (1985) 58-72.
[9] K. Shimada: Ms.Eng. Thesis (in Japanese). Dept. of Prec. Mach. Eng., University of Tokyo (1985).
[10] H. Hiraoka, K. Shimada, Y. Taguchi, K. Kondo, S. Yoshida, T. Yokoyama, F. Kimura and T. Sata: Utilization of Environment Models for a Sensor Equipped Industrial Robot. Proc. of 15th ISIR (1985) 113-120.
[11] R. Paul: Modelling, Trajectory Calculation and Servoing of a Computer Controlled Arm. Ph.D. Thesis, Dept. of Comp. Sc., Stanford University (1978).