Robotics in the Factory of the Future
A Flexible Inspection System for Gauging Precision Industrial Parts

D.R. Sollberger, M.P. Thint, and P.P. Wang

Machine Intelligence and Robotics Research, Electrical Engineering Department, Duke University, Durham, NC 27706, USA

This paper presents an overview of an inspection system called the Automated Robotic Visual Inspection System (ARVIS). The ARVIS system concept, hardware components, work-cell, and calibration and inspection procedures are described. Application of ARVIS is currently focused on inspecting precision industrial parts, specifically, traveling wave tube (TWT) components for avionic radar systems. The variability of the parts which comprise a TWT requires a general-purpose flexible inspection system that is both adaptable and programmable. ARVIS integrates robotics and computer technology together with vision processing software to achieve a flexible system.
Robotics and Autonomous Systems 5 (1989) 165-171, North-Holland
0921-8830/89/$3.50 © 1989, Elsevier Science Publishers B.V. (North-Holland)

David R. Sollberger received a BS degree in Computer and Information Sciences Engineering from the University of Florida, Gainesville (1985), and an MS degree in Electrical Engineering from Duke University (1986). He is interested in advanced industrial automation and intelligent instrumentation. Sollberger is a member of several honor societies: Tau Beta Pi, Upsilon Pi Epsilon, and Sigma Xi. He currently works for the National Aeronautics and Space Administration (NASA), John F. Kennedy Space Center, Florida, in the Systems Engineering and Experiments Division of Shuttle Payloads.

Marcus P. Thint received a BS degree in Electrical Engineering from North Carolina State University, Raleigh (1983), an MS degree in Electrical Engineering from Duke University (1984), and is currently pursuing the Ph.D. degree at Duke University. From 1985 to 1986, he worked for Northern Telecom Inc., Research Triangle Park, North Carolina, where he was involved in automated test equipment software development and industrial automation/robotics projects. His professional interests include applied research in machine intelligence and robotics, and neural network approaches to artificial intelligence. Thint is a member of Eta Kappa Nu.

Paul P. Wang received a BS degree in Electrical Engineering from the National Taiwan University (1958), an MS degree in engineering from the University of New Brunswick (1963), and a Ph.D. degree from Ohio State University (1965). From 1965 to 1968, he was a member of the technical staff at Bell Laboratories, Inc. He joined Duke University in 1968 as an Assistant Professor and is currently a Professor of Electrical Engineering. His primary interest for the past 20 years has been in control theory and pattern recognition. He has published articles and chapters of books, and edited two books. Wang has been an associate editor for several journals and is currently the editor-in-chief of the Information Sciences Journal.

1. Introduction

Inspection is an operation in which a product is compared with an accepted standard to ensure compliance with customer expectations of quality and performance. Expectations differ widely: sometimes the inspection involves only cursory viewing if the customer's standards are not high, but at other times sophisticated equipment is used to inspect a product in great detail. Automated visual inspection (AVI) has become an umbrella term for the area of study in which image processing concepts are being developed for industry. AVI systems are designed with three goals: (1) to perform tedious and repetitive inspection tasks faster than a human worker, (2) to perform these tasks with higher reliability than a human worker, and (3) to operate independently without human intervention.
D.R. Sollberger et al. / Flexible Inspection System for Gauging
This paper presents an overview of an AVI system called the Automated Robotic Visual Inspection System (ARVIS). Application of ARVIS is currently focused on inspecting traveling wave tube (TWT) components for avionic radar systems, but ARVIS concepts are, in general, applicable to the inspection of other industrial parts. This overview discusses the ARVIS system concept, hardware components, work-cell, and calibration and inspection procedures. Today, most manufacturing environments inspect only a sample of components, because the cost of manually gauging all components is prohibitive. If, however, the product being manufactured is technologically complex, requiring precision components with tight tolerances, then inspection of each individual component is mandatory. Such is the case with TWTs. The variability of the parts which comprise a TWT requires a general-purpose flexible inspection system that is both adaptable and programmable. ARVIS integrates robotics and computer technology together with vision processing software to achieve a flexible system.
2. Typical Inspection Scenario

The ARVIS inspection procedure is divided into four processes: (a) system calibration, (b) part placement, (c) analysis and inspection, and (d) part removal and sorting. The interrelationship of these four processes can best be described by a typical inspection scenario: (1) the system is calibrated using a "golden part" (cf. Section 4.1); (2) the robot picks a part from the part-feeder and places it in the camera's field of view; (3) the vision system photographs and analyzes the part and signals the robot whether the part is defective or acceptable; (4) the robot removes the part from the camera's field of view and places it in the appropriate pass or fail container. Steps (2) through (4) are repeated for each additional part being inspected.
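The four-step scenario above can be sketched as a simple control loop. This is an illustrative reconstruction only; the function names (`calibrate`, `place`, `inspect`, `sort`) are hypothetical stand-ins for the robot and vision-system commands, not ARVIS's actual interface:

```python
def run_inspection(parts, calibrate, place, inspect, sort):
    """Top-level loop mirroring the four ARVIS inspection processes."""
    golden_reference = calibrate()                    # (1) calibrate with the golden part
    for part in parts:
        place(part)                                   # (2) robot places part in the camera's view
        acceptable = inspect(part, golden_reference)  # (3) vision analysis -> pass/fail verdict
        sort(part, "pass" if acceptable else "fail")  # (4) robot bins the part accordingly
```

In use, the four callables would wrap the trigger signals exchanged between the Optomation and the robot; here they can be any functions with the matching signatures.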
Fig. 1. Hardware configuration and communication interfaces. (Block diagram: CID camera; Optomation vision system with video-processing, image-processing, and decision-processing sections; input/output Multibus controller carrying the part-in-place, picture-taken, and pass/fail external triggers and robot positional-encoding lines; Optomation control console and system monitor; IBM PC for uploading and downloading VPL (Visual Programming Language) programs; MV/20000 data link monitor; Data General MV/20000 for uploading and downloading robot programs, report generation, and the inspection database.)
3. Hardware Configuration The hardware configuration of ARVIS consists of five parts: (1) a vision system, (2) the control console and IBM-PC computer, (3) a robot, (4) the data link monitor and MV/20000 computer, and (5) the lighting system. A block diagram of the system is presented in Fig. 1, and the robotic work-cell is shown in Fig. 2.
3.1. Vision System

The vision system is the General Electric Optomation II. It is programmed using the Visual Programming Language (VPL), which is both the programming language and the operating system. Input to the Optomation is provided by a solid-state charge-injection device (CID) camera with 244 × 246 resolution. The Optomation hardware is composed of three processing sections: video-, image-, and decision-processing. The video-processing section provides the camera interface, image memory storage, user monitor image display, and global image thresholding. The image-processing section performs corner point encoding and builds the feature extraction database. The decision-processing section provides the user interface, VPL interpreter, feature extraction database queries, program control, and input/output to the Intel Multibus controller. The Multibus controller provides both RS-232 ports and binary trigger signals.

3.2. Control Console and IBM PC

The user's interaction with the Optomation is through the control console. This console is used to enter, edit, and execute VPL programs. The control console is also used to monitor the Optomation's status during operation. The Optomation does not contain an internal mass storage device; thus an IBM PC is interfaced to the control console. The IBM PC allows VPL programs to be uploaded and downloaded to the Optomation via an RS-232 link, and also serves as a means to archive programs on 5.25 inch diskettes.

3.3. Data Link Monitor and MV/20000 Computer

The data link monitor is used to oversee the integrity and status of the RS-232 communication
Fig. 2. Robotic work-cell and X-motion platform. (Work-cell elements: CID camera, shutter/reflector, part-positioner rack, X-motion platform with back-, diffuse-, and structured-lighting sections, part feeder, part-positioner, laser, robot, and pass/fail bins.)
link between the Optomation and the MV/20000 computer. The MV/20000 is a general-purpose minicomputer produced by Data General. All data collected during an inspection is transferred to the MV/20000 for formatting, analysis, and storage. The MV/20000 serves three purposes: (1) uploading, downloading, and storage of robot programs, (2) generation of inspection reports, and (3) generation of database interface files. The inspection report contains an itemized listing of measurements for each part inspected, and a final summary giving the pass/fail status of each part and of the full inspection run. The database interface files are used as input to a database which monitors the history of component quality and the respective manufacturers. It should be noted that the MV/20000 is a significantly more powerful computer than ARVIS requires; it is currently used for ease of access and familiarity. In an industrial environment, these same functions could be performed by the IBM PC (with reduced computation speed) to realize a more compact, cost-effective system.
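The itemized report described above can be sketched as follows. The record layout (part id, feature name, measured value, blueprint tolerance band) and the output format are hypothetical; the paper does not specify the MV/20000 report layout:

```python
def format_report(inspections):
    """inspections: list of (part_id, features), where features is a list of
    (name, value, low, high) tuples in blueprint units."""
    lines = []
    run_pass = True
    for part_id, features in inspections:
        # a part passes only if every measured feature is within its tolerance band
        part_pass = all(low <= value <= high for _, value, low, high in features)
        run_pass = run_pass and part_pass
        for name, value, low, high in features:
            lines.append(f"{part_id} {name}: {value:.3f} mm (spec {low}-{high})")
        lines.append(f"{part_id}: {'PASS' if part_pass else 'FAIL'}")
    lines.append(f"RUN: {'PASS' if run_pass else 'FAIL'}")   # summary for the full run
    return "\n".join(lines)
```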
3.4. Robot

The prototype laboratory version of ARVIS uses a Microbot Alpha robot to pick and place parts during the inspection process. The Optomation sends signals to and receives signals from the Microbot, but the Microbot is not directly controlled by the Optomation. The Microbot has its own controller and programming language; it is programmed separately from the Optomation and communicates with the Optomation only via binary trigger signals. The Microbot controller does not have a mass storage unit, so the robot programs are uploaded to and downloaded from the MV/20000 via an RS-232 communication link.
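The trigger-based handshake between the vision system and the robot (the part-in-place, picture-taken, and pass/fail lines named in Fig. 1) can be mimicked with a toy model. The class below illustrates the signalling pattern only; it is not the Microbot's or the Optomation's actual interface:

```python
class TriggerBus:
    """Toy model of the binary trigger lines shared by the vision system and robot."""
    def __init__(self):
        self.lines = {"part_in_place": False, "picture_taken": False, "pass_fail": False}

    def pulse(self, name):
        # raise a trigger line
        self.lines[name] = True

    def read_and_clear(self, name):
        # sample a line, then reset it
        value = self.lines[name]
        self.lines[name] = False
        return value

def inspect_handshake(bus, part_is_acceptable):
    """One cycle of the signalling pattern, with both sides collapsed inline."""
    bus.pulse("part_in_place")            # robot -> vision: part is in the camera's view
    # ... vision system photographs and analyzes the part here ...
    if part_is_acceptable:
        bus.pulse("pass_fail")            # vision -> robot: high line means the part passed
    bus.pulse("picture_taken")            # vision -> robot: safe to remove the part
    if bus.read_and_clear("picture_taken"):
        return "pass" if bus.read_and_clear("pass_fail") else "fail"
```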
3.5. Lighting System

Lighting techniques are of crucial importance for AVI. In an industrial environment where speed and accuracy are of the essence, poor image quality due to improper lighting should be eliminated so that simple and fast image processing algorithms can be employed. Uncontrolled lighting can result in low-contrast images, shadows, and/or specular reflections. Several techniques used for illuminating parts in an AVI system are diffuse-, back-, structured-, and directional-lighting [1]. The variability of the parts which comprise a TWT requires that three lighting techniques be utilized: diffuse-, back-, and structured-lighting. The ARVIS lighting system, shown in Fig. 3, consists of five parts: (1) source/diffuser box, (2) X-motion platform, (3) shutter/reflector, (4) part-positioner, and (5) laser scanner.

Fig. 3. Lighting system (diffuse configuration). (Elements: CID camera, shutter/reflector, source/diffuser box, part-positioner, and X-motion platform.)

3.5.1. Source/Diffuser Box

Several light sources have been proposed for AVI applications [2]; however, these sources can produce specular reflections on highly polished or machined surfaces. The highly reflective metallic surfaces of many TWT components required an alternative source to be designed. The source/diffuser box, shown in Fig. 3, houses eight incandescent light bulbs which are equally spaced, with two bulbs on each side wall. The combination of the source/diffuser box, the X-motion platform, and the shutter/reflector provides highly diffuse illumination.

3.5.2. X-Motion Platform

The X-motion platform, shown in Fig. 2, is a movable lighting system platform. The X-motion platform is divided into sections which correspond to the three lighting techniques (i.e. back-, diffuse-, and structured-lighting). Each of the platform sections is briefly described:
1. Back-lighting uses a translucent surface made of ground glass (i.e. acid-etched/frosted glass). The shutter/reflector (cf. Section 3.5.3) has the flat-black slats turned toward the X-motion platform so that illumination originates only from below the part.
2. Diffuse-lighting uses a translucent ground glass surface with a center hole to accommodate a part-positioner (cf. Section 3.5.4). The shutter/reflector has the reflective slats turned toward the X-motion platform. The highly diffuse illumination produced by the source/diffuser box and the glass surface is projected onto the part-positioner by the shutter/reflector.
3. Structured-lighting uses an opaque surface with a center hole to accommodate a part-positioner. The structured-lighting technique uses a laser source to produce a plane of light (cf. Section 3.5.5) rather than diffuse lighting.
The geometry of the part being inspected determines which lighting type(s) are used; a part may require more than one lighting technique to completely gauge/measure its dimensions. The X-motion platform, shutter/reflector, and laser scanner are controlled by an Intel 8749A micro-controller. The 8749A receives commands from the Optomation via the auxiliary port of the Multibus controller.

3.5.3. Shutter/Reflector

The shutter/reflector, shown in Fig. 2 and Fig. 3, is composed of a slatted surface which can be rotated to achieve back- and diffuse-lighting. The slats are reflective on one side and flat-black on the other. The reflective sides are turned toward the glass diffuser when front-lighting is required; the flat-black sides are used for back-lighting and structured-lighting. The shutter/reflector receives command signals from the 8749A micro-controller.

3.5.4. Part-Positioner

The part-positioner is a platform on which parts are placed to be photographed during inspection. The design of the part-positioner depends on the geometry of the part. For example, flat (2-dimensional) parts may require only one photograph to completely measure/gauge them; however, 3-dimensional parts may require multiple photographs of different orientations. The variability of the parts which comprise a TWT requires several part-positioners to inspect all components. The part-positioners are stored in a rack which is accessed by the robot. The location of the part-positioner in the rack is passed to the robot over the positional encoding lines from the Multibus controller. The base of the part-positioner, shown in Fig. 3, is beveled to aid in placement and centering on the X-motion platform, and to hold the part-positioner in place during the inspection process.

3.5.5. Laser Scanner

Considerable success has been achieved in obtaining depth and range information using structured lighting [3]. Duke University's current research is directed towards three-dimensional part analysis using multiple image frames. ARVIS uses planar laser projection. The laser source is a Uniphase 3 mW He-Ne laser, and the light plane is produced by passing the collimated beam through a cylindrical lens. A series of parallel scans is then obtained using a scanning mirror. The scanning mirror is controlled by the 8749A micro-controller.
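The paper does not give ARVIS's depth-recovery equations, but a common single-plane triangulation model illustrates the principle behind structured lighting: if the light plane meets the viewing direction at a known angle, a surface raised by height h shifts the imaged laser line laterally in proportion to h. A sketch under that assumption (the angle and pixel-scale values below are hypothetical, not ARVIS calibration data):

```python
import math

def height_from_line_shift(pixel_shift, pixels_per_mm, plane_angle_deg):
    """Surface height from the lateral shift of a projected laser line.

    Assumes the light plane is inclined at plane_angle_deg to the viewing
    direction, so a point raised by h shifts the imaged line by h * tan(angle).
    """
    shift_mm = pixel_shift / pixels_per_mm     # convert image shift to millimetres
    return shift_mm / math.tan(math.radians(plane_angle_deg))
```

Sweeping the scanning mirror yields one such height profile per scan line, from which a depth map of the part can be assembled.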
4. Calibration and Inspection Procedures

From an operations viewpoint, ARVIS is designed with a user-friendly interface; the system is fully menu driven. The operator selects the part type from the menu, and ARVIS then guides the operator through a step-by-step check-list to prepare the system for inspection. At this juncture, the ARVIS prototype cannot automatically adjust the camera aperture and focus; these adjustments are performed manually with interactive programs that assist the operator (the camera adjustments could be automated using an auto-focus and motorized zoom lens system). After the camera adjustment is completed, ARVIS automatically performs the calibration and inspection procedures without human intervention.

4.1. Calibration Procedure

The part type being inspected determines which lighting technique and part-positioner are utilized. Depending on the part type, the Optomation sends control signals to move the X-motion platform to the appropriate section, and commands the robot to mount the part-positioner.
ARVIS is calibrated using a "golden part," a part which meets all blueprint specifications for physical dimensions and surface finish. The robot places the golden part on the part-positioner and moves out of the camera's field of view, where it signals the vision system using the part-in-place trigger. The vision system then photographs the part and performs image preprocessing calibration. The controlled lighting environment of the ARVIS work-cell greatly simplifies image preprocessing because only thresholding is used. Thresholding differentiates parts from the background: all pixels above the threshold level are set to "1" and those at or below it are set to "0". Changes in the threshold level cause the apparent size of part features (i.e. hole and body dimensions) to vary. In general, increasing the threshold level causes hole sizes to increase and the body size to decrease. Thus, hole and body sizes vary in inverse relation, and there is only one threshold that yields correctly proportioned parts and holes. Consequently, generating the proper threshold automatically is a major difficulty for the vision system, and one such automatic threshold calculation algorithm has been developed at Duke University [4]. All analysis and computations performed on the image use pixels. The pixels-per-unit (PPU) conversion is a factor which relates pixel count to a measurement unit. Pixel counts from the golden part are used to calculate the PPU. After the golden part is removed from the part-positioner and stored, system calibration is complete and ARVIS begins inspecting parts.

4.2. Inspection Procedure
The inspection cycle proceeds as described in Section 2. After the robot has placed a part in the camera's field of view, multiple pictures are taken and analyses are performed to ensure the repeatability of the data to a 99% confidence level for a plus-or-minus one pixel variance [5]. The inspection data produced by ARVIS vary in quantity and format because the size, geometry, and allowable tolerances for each part type are unique. One salient feature of the ARVIS software (for flat 2-dimensional parts) is that the inspection programs are orientation independent (i.e. the parts are recognized and analyzed regardless of the
orientation in which they are placed on the part-positioner). The relationships between the part's key features (i.e. holes, principal axis, etc.) are analyzed to determine the part's orientation. Once the orientation of the part has been determined, each feature can be measured and compared to the known blueprint specifications. The data collected during the inspection are transferred to the MV/20000, where they are formatted and stored in a file. This file contains all the inspection data in human-readable form with proper units and a "pass" or "fail" indicator referenced to the blueprint specifications. After the analysis is completed, the Optomation signals the robot to remove the part using the picture-taken external trigger. The quality of the part is relayed to the robot using the pass/fail external trigger. The robot removes the part from the part-positioner and places it in either the pass or fail bin.
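The measurement and orientation steps above can be sketched as follows. This is an illustrative reconstruction, not the VPL code ARVIS runs: `gauge_feature` averages repeated pixel counts and converts them with the calibration PPU (reading the ±1 pixel repeatability requirement simply as a bound on the spread of the samples), and `principal_axis_angle` recovers a flat part's orientation from the second central moments of its binary silhouette:

```python
import math
import statistics

def gauge_feature(pixel_samples, pixels_per_unit, max_spread_px=2.0):
    """Average repeated pixel measurements of one feature and convert them to
    blueprint units; reject the reading if the samples spread by more than
    +/- one pixel about each other."""
    if max(pixel_samples) - min(pixel_samples) > max_spread_px:
        raise ValueError("measurement not repeatable to +/- one pixel")
    return statistics.mean(pixel_samples) / pixels_per_unit

def principal_axis_angle(pixels):
    """Orientation of a flat part from the second central moments of its
    binary silhouette, given as (x, y) coordinates of the '1' pixels."""
    n = len(pixels)
    cx = sum(x for x, _ in pixels) / n          # centroid
    cy = sum(y for _, y in pixels) / n
    mu20 = sum((x - cx) ** 2 for x, _ in pixels)
    mu02 = sum((y - cy) ** 2 for _, y in pixels)
    mu11 = sum((x - cx) * (y - cy) for x, y in pixels)
    # standard principal-axis formula from second central moments
    return 0.5 * math.atan2(2.0 * mu11, mu20 - mu02)
```

Once the angle is known, each measured feature can be rotated into the blueprint frame and compared against its tolerance band regardless of how the part was placed.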
5. Summary

This paper presented an overview of the ARVIS system concept, hardware components, work-cell, and calibration and inspection procedures. ARVIS integrates robotics and computer technology together with vision processing software to achieve a flexible inspection system that is both adaptable and programmable. The system is capable of reliable inspection of complex two-dimensional parts, and research is directed towards three-dimensional part analysis. Consistent with the goals of AVI, ARVIS seeks to demonstrate a system that can perform tedious and repetitive tasks faster, with improved quality and reliability.
Acknowledgment This research was funded in part by a contract from the U.S. Navy through the Microwave Laboratories, Inc., Raleigh, North Carolina.
References

[1] R.C. Gonzalez and R. Safabakhsh, Computer vision techniques for industrial applications and robot control, Computer (December 1982) 17-29.
[2] B.G. Batchelor, D.A. Hill and D.C. Hodgson, Automated Visual Inspection (IFS Publications Ltd., UK, 1985).
[3] R.A. Jarvis, A perspective on range finding techniques for computer vision, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 5, No. 2 (March 1983) 122-139.
[4] D.R. Sollberger and P.P. Wang, A THRESH: A Global Thresholding Calculation Algorithm for Automated Inspection Gauging, Technical Report Robotics EE-86-26, Duke University, Electrical Engineering Department (December 1986).
[5] P.P. Wang, D.R. Sollberger and M.P. Thint, Robotic Techniques for Traveling Wave Tube Manufacturing: Duke Automated Robotic Visual Inspection System (ARVIS), Technical Report Robotics EE-86-25, Duke University, Electrical Engineering Department (October 1986).