Copyright to IFAC Intelligent Assembly and Disassembly, Bled, Slovenia, 1998
SENSING AND ACTING IN INTELLIGENT ASSEMBLY SYSTEMS
Dragica Noe
University of Ljubljana, Faculty of Mechanical Engineering, Laboratory for Handling, Assembly and Pneumatics, Aškerčeva 6, 1000 Ljubljana, SLOVENIA
E-mail: [email protected]
Abstract: The development and use of intelligent assembly systems to automate batch production is the most promising future direction in this field. Most of the governments of the world's developed countries are, through their scientists and production engineers, putting huge efforts into developing strategies, systems and components to realise the concept of intelligent behaviour in automated manufacturing. Sensors are one of the necessary components in the development of intelligent assembly systems. Basic sensing components, like robot vision, tactile sensors and force-torque sensors, are indispensable in these systems. A survey of the state of the art in sensor techniques and plans for future development of the assembly process will be discussed in the present paper. Copyright © 1998 IFAC

Keywords: Assembly, Sensors, Flexible Manufacturing Systems, Intelligent Systems
1. INTRODUCTION
The phrase "intelligent assembly system" (IAS) is used to describe robotic assembly cells and lines with human-like flexibility, capable of assembling different products in small batches and of managing quick changeovers to new products. These systems are also characterised by easy programmability and reprogrammability, minimal product-specific tooling, and resilience to manufacturing tolerances. They are both compliant and skilful, and have the capability to detect errors and recover from them (Lee and Rowland, 1995). This set of requirements represents a complex problem for assembly system designers as well as production engineers.

Different institutes supported by national authorities and international sources have done impressive work towards developing the new flexible assembly systems, the so-called intelligent assembly systems. Unfortunately these systems can hardly find an application in actual batch production (Storm and Meijer, 1994; Onori et al., 1993; Wyns et al., 1996). Some efforts can be observed in surface mounting and other electronic circuit assembly, but assembly in small batch production remains a manual job. The development and use of flexible assembly systems are strongly connected with their sensors, namely their availability, quality and stage of development. The sensor system as a significant part of robotics has been researched for a long time, but knowledge of its use in assembly systems does not have nearly as long a history. The present applications of intelligent sensors in flexible assembly systems are mostly dependent on the robot manufacturer. The possibility of exchanging parts and using sensor systems available on the market is rather small, or at the least time-consuming if one were to attempt adaptation.

The present paper intends to point out some efforts made in the past to understand the role of sensing components in intelligent assembly systems and to connect the sensing system with control, programming and acting units (Noe et al., 1995). The most far-reaching fundamental work on this subject appeared three years ago (Lee and Rowland, 1995). Some recently published papers will furnish the background for this overview.
2. ASSEMBLY AND REQUIREMENTS IN SENSORS

The assembly process can be defined as an ordered sequence of assembly operations - handling and mating, control and other tasks - in which discrete components, subcomponents and material are brought together and joined to form a specified configuration. It is surprising that the assembly process, despite its importance in the production process, especially in small batch production, is still performed mainly by hand. In comparison, the automation of mass production has a long, successful history. As is known, mass production in the automotive, household appliance and electromechanical industries accounts for no more than 10% of total production. The rest is produced in small and medium batches and in a variety of different ways. Production today is also characterised by a permanent state of product redesign, caused by short product life cycles, short product development phases, reduction of production costs and the requirement for high availability of production systems. High product quality is self-evident.

In a robotized assembly cell, the environment, in comparison with other robot applications, is an essential part of the system. The great number of assembly units (feeders, grippers, pallets) and assembled components present offers an enormous range of possibilities with unlimited complexity, as well as possibilities for error. In dealing with such complex assembly systems and fulfilling their given requirements, two current approaches exist: prediction and planning, a classical AI approach, and a newer, behaviour-based approach. The latter rejects all planning and deals with event reactivity through direct coupling between sensors and actuators (Rowland and Nicholls, 1995). An essential aspect of robotics applications and research, well known to researchers and designers, is the problem of interfacing incompatible pieces of equipment in order to build an assembly system. Commercial robots and other units are extremely manufacturer-dependent. Usually a great deal of time is spent simply in acquiring and building the assembly cells, including configuration and writing software, before any useful work on applications can begin.

Also worth mentioning are the following characteristics required of assembly systems today:
• Ability to make quick product changes - adaptability
• Easy programmability and reprogrammability - trainable
• Minimal product-specific tooling - dextrous
• Resilience to manufacturing tolerances - compliant and skilful
• Error detection and recovery - observant and meticulous

Assembly systems with the above-mentioned characteristics have to be supported with computer intelligence to empower them. A decisive turning point is the involvement of machine intelligence (knowledge, sensing and monitoring, new system architectures and control principles) in assembly systems (Noe et al., 1995).

3. SENSORS IN FLEXIBLE ASSEMBLY

"Robotics is the intelligent connection of perception to action" (Brady, 1985). It is evident that when discussing intelligent assembly systems (IAS) as an essential application area for robots, the above definition could apply equally well. Sensors play an important role in the development and application of assembly systems.

It is known that for robots to operate effectively and intelligently within assembly systems, they must be equipped with an array of sensors. Leaving aside the internal sensors necessary for the robot's correct behaviour, several external sensors are needed in the assembly cell for observing the environment: the incoming parts and subcomponents; the supervision of feeding, storage and other equipment; and the control and supervision of the assembly process.

There are three different tasks that must be observed. Identification of the parts in the robot space - shape and dimension, presence or absence, orientation - is treated as a global task. The phrase local sensor system task describes the relationship between the robot arm or gripper and the part that has to be assembled. The sensing needed to observe the process and control the robot can be described as a real-time task. The requirements are strongly connected with the sensor type and sensing strategy. In the very first stage of development of an IAS, some questions appear which require answers: What sensory information is useful and relevant for this application? What information is necessary to achieve the task? How should one integrate the sensors into the IAS? Is there already a generic solution?
A single task requirement for sensing may be solved using one of several different sensing modalities, but some of these modalities are better developed than others. Although the first robotic eye was developed in laboratories in the early 1960s, it wasn't until the late 1970s that the first industrial applications appeared, in which it was used for part recognition, placement and inspection. A three-dimensional tactile head that could identify parts and their contours by binary image processing appeared (Wright and Bourne, 1988), and in 1979 "smart" sensors were developed for the determination of proximity, force and torque, tactile, and slip parameters used for the control of automated robots. The "artificial skin" sensor that appeared in 1980 was capable of identifying an object by measuring the pressure in the matrix's cells. These developments were followed, in 1982, by tactile sensors with a resolution rivalling that of a human and, in 1984, by a microcomputer-based robotic SONAR imaging system capable of performing contour plotting, part location and sizing, and simple part recognition.
Fig. 1. Sensors used in robotics (Lee, 1989). [The figure classifies internal sensors (position, velocity, acceleration, forces, torques, distortion) and external sensors: surface/tactile sensing (contact points, surface arrays) and proximal, medial and distal sensing, including proximity, sonics, rangefinders, structured optics, vision and radio.]

Generally, sensors are intended to provide information about the position and/or orientation of the component to be assembled; to correct positional and/or orientation errors of robot grippers and the handled component or assembly tool; to aid in detecting faults and failures of the assembled products by visual inspection; to provide continuous information about the assembly process (for example, inserting, screwing and handling, by measuring forces and torques); and to provide other information about the environment, for instance to avoid collisions between robots, as well as information on environmental status.
While work on sensor development and low-level processing goes on, most of the problems of robotic sensing seem to involve the understanding and proper use of information provided by sensors.
3.1 Computer vision systems in assembly
Paradoxically, the enormous amount of academic work on robot vision in past years has not been reflected in its practical use in the manufacturing process. While academic researchers were struggling to find solutions which approximated the human eye, capable of processing a scene with many objects and shadows and using AI techniques to understand the images, production engineers were pushing the investigation in the direction of special-purpose solutions (hardware), binary images, active lighting and partial object recognition. For that reason, simple devices for feeding and pre-positioning of an observed object (an assembly component), or some significant marks on the object, were used for identification in assembly systems. Grasping and handling of components stored randomly in a magazine or in open space remains an unsolved problem.
Different types of sensors measure the above-mentioned parameters: for example, proximal tactile sensors detect real-world parameters through contact with an object, vision sensors through the use of a camera, and range finders through a triangulation system (Fig. 1). A taxonomy for sensing to aid the selection process or application in assembly systems, developed by Lee (1989), identified five levels of refinement leading to sensor selection:
• Assembly task requirements - localisation, slippage detection, size confirmation, inspection, testing
• Choice of modality - vision, force, tactile
• Specification of sensor attributes - output complexity, discrete or continuous variable, imaging or non-imaging, local or global
• Specification of operational parameters - size, accuracy, cost
• Selection of mechanism - switching devices, inductive sensors, CCD vision imaging
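Lee's five levels can be read as a progressive filtering of candidate sensors. The following sketch illustrates the idea; the small candidate table, attribute names and thresholds are invented for the example and are not taken from Lee (1989):

```python
# Illustrative sketch of Lee's five-level sensor-selection taxonomy as a
# level-by-level filter. The candidate table below is hypothetical.

CANDIDATES = [
    {"mechanism": "microswitch", "modality": "tactile", "output": "discrete",
     "scope": "local", "cost": 5},
    {"mechanism": "inductive sensor", "modality": "proximity", "output": "discrete",
     "scope": "local", "cost": 40},
    {"mechanism": "CCD camera", "modality": "vision", "output": "imaging",
     "scope": "global", "cost": 900},
]

def select_sensor(modality, output, scope, max_cost):
    """Filter candidates: modality -> sensor attributes -> operational parameters."""
    chosen = [c for c in CANDIDATES if c["modality"] == modality]   # choice of modality
    chosen = [c for c in chosen
              if c["output"] == output and c["scope"] == scope]     # sensor attributes
    chosen = [c for c in chosen if c["cost"] <= max_cost]           # operational parameters
    return [c["mechanism"] for c in chosen]                         # selected mechanisms

print(select_sensor("vision", "imaging", "global", max_cost=1000))  # ['CCD camera']
```

The assembly task requirements (the first level) fix the arguments passed to the filter; each later level then narrows the candidate set.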
Both local and global requirements exist for the application of computer vision in the assembly process. The orientation and position of the component (or only a part of the assembled component) with respect to the robot wrist or gripper fingers, and verification of the component's or tool's presence in the gripper, are treated as local requirements. Location of the component, or only part of it, in the robot working space prior to the assembly process is a global requirement. Robot vision is used mainly for global requirements. There has been, however, some research published showing the use of robot vision for position determination of an assembled part during the assembly process (Kim et al., 1996).
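For the global localisation task, a classical binary-image technique (generic, not specific to any system cited here) recovers a part's position and orientation from image moments: the centroid from first-order moments and the principal-axis angle from central second-order moments. A minimal sketch:

```python
import math

def centroid_and_orientation(image):
    """Centroid and principal-axis angle (radians) of a binary image,
    computed from zeroth-, first- and second-order moments."""
    m00 = m10 = m01 = 0.0
    for y, row in enumerate(image):
        for x, v in enumerate(row):
            if v:
                m00 += 1; m10 += x; m01 += y
    cx, cy = m10 / m00, m01 / m00            # centroid = position of the part
    mu20 = mu02 = mu11 = 0.0
    for y, row in enumerate(image):
        for x, v in enumerate(row):
            if v:
                mu20 += (x - cx) ** 2
                mu02 += (y - cy) ** 2
                mu11 += (x - cx) * (y - cy)
    theta = 0.5 * math.atan2(2 * mu11, mu20 - mu02)  # orientation of principal axis
    return (cx, cy), theta

# A horizontal 1x4 bar of set pixels: centroid (1.5, 0.0), angle 0.0.
centre, angle = centroid_and_orientation([[1, 1, 1, 1]])
print(centre, angle)
```

The resulting pose can then be transformed into robot coordinates via the camera calibration, which is omitted here.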
In terms of its share of assembly tasks (handling, feeding, assembling, etc.), robot vision accounts for approximately 30%, and in automatic inspection for more than 50% (Wright and Bourne, 1988). In flexible systems, essential maturing of the use of robot vision is expected, especially in monitoring and control of the insertion process and in supporting flexible orienting equipment.

3.2 Tactile sensing in assembly

In their simplest form, tactile sensors detect the presence or absence of contact between sensor and stimulus; this is the primary definition of tactile sensors given by Harmon (1984). On-off switch sensors are often used for detecting the presence of components in the gripper, etc. In addition, tactile sensors have been used for measuring and controlling force, for three-dimensional shape recognition, for control of slippage in handling the components, and for detection of the thermal conductivity of a contacting stimulus.

Tactile sensing has received an increasing amount of attention since the early '80s, with a number of new sensors and tactile data processing approaches. Compared with, for example, developments in machine vision, tactile sensors are used primarily for local tasks; their share in the assembly process is greater than that of vision systems, but in inspection their role is subordinate. One reason for the lack of sophisticated tactile sensing in robot applications is the scarcity of robust, reliable, accurate and high-resolution sensors. Technology advances in tactile sensing are now concentrated on developing robust sensors, on processing the data and on its integration into a suitable control system.

Early work on tactile sensing followed that of machine vision by concentrating on processing and analysis of single images, perhaps to achieve localisation or recognition of the object. More recent research has focused on treating tactile sensing as an active sense. The results of processing a tactile image were used for selecting the next action, and data collected from many images were used to construct a three-dimensional representation of a touched object (Nicholls, 1995).

Recognition of objects through the analysis of a single tactile image is a static mode, analogous to human fingertip sensing. The tactile sensor provides only a local view of a contacting stimulus. To obtain more information about an object - for example, to localise it in the robot's world or to recognise it by touch - active sensing is used. Two main issues are usually involved: strategy and control. A strategy is needed for the exploration of an object by moving a robot finger and/or hand around the touched object and gathering data (shape, surface texture, material properties). The data are used for the next move and may serve as source data for object representation; a suitable algorithm is needed to support this (Brady, 1989). Such active sensing has been called dynamic exploration (Berger and Khosla, 1991).

Tactile sensors are also applied for recognition or discrimination of a grasped object, for texture monitoring, and for checking of joins and damage detection. One advantage of mounting tactile sensors on a robot gripper is that inspection and verification can be carried out during manipulation. Tactile sensors on a gripper can provide manipulator-object relationship information as well as the more conventional object feature characteristics. Grasp monitoring concerns watching for changes; it can confirm object grasp or release, and also involves location and orientation verification. Calculated positional and rotational errors can be passed to the robot control system and the appropriate trajectory adjustment made. This type of function is very important for autonomous systems, since it enables automatic recovery from certain types of task error.

Advanced robots often have end-effectors with dedicated multiple-jointed fingers for grasping and dextrous manipulation, such as rolling and twirling of objects in an unstructured environment (Bicchi and Sorrentino, 1995).

Tactile sensors do not have difficulties with reflections from the object, nor do they suffer from problems with shadows, as robot vision does. Measurement with tactile sensors is direct and independent of position; measurement with robot vision is not. Tactile sensing has its own limitations, such as providing only local views and potentially distorting the sensed object; machine vision, in contrast, provides global views.

Active tactile sensors will eventually be integrated into factory-based robotic systems. Research in the tactile sensor domain has recently been concentrated on systems rather than on individual sensors. The close integration of the sensor and actuator subsystems as part of a coherent exploratory strategy is common to several promising systems in the development of active sensing. More research is needed to establish how and in what context active tactile sensing should be employed.
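The dynamic-exploration idea, moving a finger around an object and accumulating contact points into a shape estimate, can be caricatured as a sense-decide-move loop. The probe model below (a radial search against a simulated contact function) is invented purely for illustration:

```python
import math

def explore(contact_fn, radius_guess, steps=36):
    """Active tactile exploration sketch: along each of `steps` rays, move the
    finger inward from a start circle until contact, collecting boundary points."""
    points = []
    for k in range(steps):
        angle = 2 * math.pi * k / steps
        r = radius_guess
        # Move the finger inward in small increments until the sensor reports touch.
        while r > 0 and not contact_fn(r * math.cos(angle), r * math.sin(angle)):
            r -= 0.01
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points

# Hypothetical stimulus: a disc of radius 1 centred at the origin.
disc = lambda x, y: x * x + y * y <= 1.0
boundary = explore(disc, radius_guess=2.0)
mean_r = sum(math.hypot(x, y) for x, y in boundary) / len(boundary)
print(round(mean_r, 2))   # close to the true radius of 1.0
```

A real strategy would choose the next probe from the data gathered so far rather than scanning fixed rays, which is exactly the "strategy and control" problem discussed above.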
3.3 Force and Torque Sensors

To avoid the use of high-precision robots in simple insertion assembly operations and to compensate for positioning errors, a passive adaptive unit referred to as a Remote Centre of Compliance (RCC) can be used. The RCC is used in the insertion of a peg or a screw into a chamfered hole, which may be threaded. RCCs are rather inexpensive units for accommodating positioning errors, but they have a number of disadvantages: for example, the size range of operation is limited, insertion is effective only if the parts have a proper chamfer, and the unit does not allow for dynamic effects and high-speed operations.

An active sensing system includes sensors for force and torque measurement: the so-called wrist force sensor (Devjak and Noe, 1997). Six-axis sensors developed in different laboratories are now produced and sold as standard components. In the past, some authors presented results in different branches; an active sensing system was also used in analysing quality assurance in assembly (Lindström and Söderquist, 1996), where three aspects were analysed: quality of process, quality of components and quality of supervision.

3.4 Other types of sensors in intelligent assembly

In assembly systems and other robotic applications, other types of sensors are presently in use, namely proximity and range sensors.

Most proximity sensors indicate the presence or absence of an object within their workspace; some research has also been done on the classification of objects (Caselli et al., 1996). Few proximity sensors can give information regarding the distance between the sensor and the object to be sensed. There is a variety of such sensors: optical, magnetic-field and acoustic proximity sensors. In assembly systems, proximity sensors can be used for insertion operation identification, collision avoidance, part presence or absence detection, etc.

A range sensor is a device that can provide a precise measurement of the distance between the sensor itself and the object. Such sensors typically measure the time of flight of a laser or sonar beam from the source to the object and back, or operate on the principle of beam triangulation.

4. THE AUTONOMOUS SENSING CONCEPT

One of the new approaches in the development of IAS is a task-oriented approach, in which the coupling of sensors and actuators is becoming more and more important. In this context the virtual sensor concept was proposed and developed within the Aberystwyth Framework For Industrial Robot Monitoring, AFFIRM (Rowland and Nicholls, 1995). The concept is similar to others, such as logical sensors (Henderson and Shilcrat, 1984) or the autonomous sensing unit (Noe et al., 1995).

Any flexible assembly cell will contain diverse sensor types, ranging from binary sensors, such as microswitches or proximity sensors, to analogue sensors for parameters such as force and displacement. End-effector coordinates, probably derived from manipulator control systems, as well as time, may constitute further sensory inputs. Tactile pads or fingers, along with conventional vision systems, may be used for inspection, determining part orientation, task monitoring and verification, and possibly also part recognition and classification. All of these sensors differ in electrical characteristics and in their interface and computational requirements.

Most workers in robotic assembly have used application-specific rather than generic sensor integration methods. The concept of virtual sensors, based on Henderson's work and that of the Aberystwyth laboratory, offers a solution with some characteristics of a generic system, and the concept could be appropriate for the development of IAS.

An essential feature of the virtual sensor concept is that the interface to all sensors, whether real or virtual, should offer the supervisor a similar command and response structure for controlling and accessing the sensors, analogous to the concept of a "virtual machine" in computer operating systems. This is clearly an advantage in seeking the principles of a "generic supervisor" capable of operating with different physical work cells.

The concept of virtual sensing is a versatile basis for the design of sensor integration systems for flexible assembly. It provides the ability to integrate information from multiple sensors and to define a uniform information interface for different sensors. The concept can provide sensing functions which do not correspond directly to physical sensors; an ability to incorporate expectations and respond according to whether or not these expectations are met; an ability to respond to sequences of sensor events or conditions; an ability to recruit different sensing functions during a task; and an ability to provide virtual sensor functions which reveal trends in workcell parameters over successive assemblies and over combinations of sensing devices.
The virtual sensing system concept is part of a strategic approach to the new architecture in building robotized assembly cells. Mindful of the generic approach, the new architecture concept follows goals such as simplicity in the design, planning and manufacture of assembly systems; reliability in practice; capability of supervision in use; and simplification in the rebuilding of assembly systems.
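The uniform command and response structure described above can be sketched as a common sensor interface: every sensor, physical or derived, answers the same call, and a virtual sensor can combine other sensors with an expectation. Class and method names below are illustrative only and do not reflect AFFIRM's actual interface:

```python
# Sketch of a uniform command/response interface for real and virtual sensors.
# All names are hypothetical; the AFFIRM system's real interface differs.

class VirtualSensor:
    """Common command/response structure seen by the supervisor."""
    def read(self):
        raise NotImplementedError

class Microswitch(VirtualSensor):
    def __init__(self, state=False):
        self.state = state
    def read(self):
        return {"value": self.state}          # discrete physical sensor

class ForceSensor(VirtualSensor):
    def __init__(self, newtons=0.0):
        self.newtons = newtons
    def read(self):
        return {"value": self.newtons}        # analogue physical sensor

class InsertionComplete(VirtualSensor):
    """Derived (virtual) sensor with no physical counterpart: it combines
    two real sensors with an expectation about the insertion force."""
    def __init__(self, switch, force, max_force=10.0):
        self.switch, self.force, self.max_force = switch, force, max_force
    def read(self):
        seated = self.switch.read()["value"]
        overload = self.force.read()["value"] > self.max_force
        return {"value": seated and not overload}

sw, f = Microswitch(state=True), ForceSensor(newtons=4.2)
print(InsertionComplete(sw, f).read())   # {'value': True}
```

Because the supervisor only ever calls `read()`, physical sensors can be exchanged or combined without changing the supervisory logic, which is the point of the "generic supervisor" argument above.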
5. CONCLUSION

It is evident that the development of intelligent assembly systems as well as flexible assembly systems for batch production, and their exploitation on the shop floor with high availability, is strongly connected to the state of the art and future development of the so-called external sensors. A survey of sensor use in assembly systems shows a discrepancy between academic research and application. While scientific research has gone in the direction of developing ever more sophisticated sensors and systems, closer to human senses, on the production side engineers are working in the direction of simpler and cheaper solutions, completely task-oriented systems and reliable sensors. We now have well-developed sensing units, but their use is mostly dependent on the robot manufacturer.

It must be pointed out that published examples of task-oriented solutions are very interesting, but direct re-use in new assembly systems is rather difficult as well as time-consuming. It can be expected that future research will be oriented in the direction of generic solutions (Pinkava, 1995) and will involve greater knowledge about the role of the sensing task in assembly systems, as well as about mechanisms concerning both robot and environment, and the behaviour of the mechanisms and controller.

REFERENCES

Berger, A.D. and P.K. Khosla (1991). Using Tactile Data for Real-Time Feedback. Int. J. Robotics Research, 10(2), pages 88-102.
Bicchi, A. and R. Sorrentino (1995). Dexterous Manipulation Through Rolling. IEEE Conf. on Robotics and Automation, pages 452-457.
Brady, M. (1985). Artificial Intelligence and Robotics. Artificial Intelligence, 26, pages 79-121.
Brady, M. (1989). Introduction: The Problems of Robotics. Robotics Science (Brady, M. (ed.)), chap. 1, pages 1-35. MIT Press, Cambridge.
Caselli, S., I. Silitoe, A. Visioli and F. Zanichelli (1996). Object Classification for Robot Manipulation Tasks Based on Learning of Ultrasonic Echoes. Proc. IROS, pages 260-267.
Devjak, L. and D. Noe (1997). The Peg Hole Insertion in SCARA Robotized Assembly Cell. Proceedings of RAAD '97, pages 523-528. Cassino.
Harmon, L.D. (1984). Tactile Sensing for Robots. Robotics and Artificial Intelligence, pages 109-158. Springer Verlag, New York.
Henderson, T.C. and E. Shilcrat (1984). Logical Sensor Systems. J. Robotic Systems, 1(2), pages 169-193.
Kim, W.S., H.S. Cho and S. Kim (1996). A New Omnidirectional Image Sensing System for Assembly. Proc. IROS, pages 611-617. Osaka.
Lee, M.H. (1989). Intelligent Robotics. Chapman & Hall, London.
Lee, M.H. and J.J. Rowland (1995). Intelligent Assembly Systems. World Scientific Publishing Co Pte Ltd, Singapore.
Lindström, C. and B.A.T. Söderquist (1996). On Using Sensors for Quality Assurance in Assembly. Proc. of the 28th CIRP International Seminar on Manufacturing Systems, pages 66-69. Johannesburg.
Nicholls, H.R. (1995). Tactile Sensing for Automation. Intelligent Assembly Systems, pages 131-164. World Scientific Publishing Co Pte Ltd, Singapore.
Noe, D., L. Devjak and T. Perme (1995). Human or Industrial Robots. Human-Oriented Design of Advanced Robotics Systems, DARS '95, 2, pages 115-118. IHRT, TU of Vienna.
Onori, M., A. Arnström and A. Ericsson (1993). Application of a New Programming Environment within the MARK III FAA System. Proc. of the 24th Int. Symposium on Industrial Robots (ISIR), Tokyo.
Pinkava, J.J. (1995). Towards a Theory of Sensory Robotics. Intelligent Assembly Systems, pages 209-234. World Scientific Publishing Co Pte Ltd, Singapore.
Rowland, J.J. and H.R. Nicholls (1995). Sensing and Actuation for Flexible Assembly. Intelligent Assembly Systems, pages 165-180. World Scientific Publishing Co Pte Ltd, Singapore.
Storm, T. and B.R. Meijer (1994). Delft Intelligent Assembly Cell. http://www-op.wbmt.tudelft.nl/pt/Projects/DIAC.html
Wright, P.K. and D.A. Bourne (1988). Manufacturing Intelligence. Addison-Wesley Publishing Company, Inc., Reading.
Wyns, J., H. Van Brussel, P. Valckenaers and L. Bongaerts (1996). Workstation Architecture in Holonic Manufacturing Systems. Advances in Manufacturing Technology - Focus on Assembly Systems, Proceedings of the 28th CIRP International Seminar on Manufacturing Systems, Johannesburg.