
Sensors and Actuators A, 25-27 (1991) 453-458

A Three-dimensional Laser Range Camera for Sensing the Environment of a Mobile Robot

C. FRÖHLICH, F. FREYBERGER and G. SCHMIDT

Lehrstuhl für Steuerungs- und Regelungstechnik (LSR), Technische Universität München, P.O.B. 202420, D-8000 Munich 2 (F.R.G.)

Abstract

Motion planning and control of the full-scale autonomous mobile robot MACROBE is based on a multilayered information processing system cooperating with a multisensory system. A primary component within this multisensory system is an eye-safe three-dimensional (3-D) laser range camera. This complex sensor has been developed for high-speed acquisition of 3-D geometric information in indoor environments. The paper reports the optical, electronic and mechanical design of the camera as well as its static and dynamic performance.

1. Introduction

The full-scale autonomous mobile robot MACROBE (Fig. 1), developed by the Robotics Group of the Laboratory for Automatic Control Engineering (LSR) of the Technical University of Munich, plans and executes its indoor motion through a three-layered motion planning and control system (MPCS). Operation of the robot is supported by an advanced multisensory system (MSS). To execute autonomous manoeuvres, including guarded motion, traversing narrow passages, reorientation, following or evading moving objects and docking, the robot requires real-time information about the geometry of its actual environment. For this purpose an eye-safe 3-D long-range and wide-angle laser range camera was developed and integrated into the MSS as a primary geometry sensor. Other physical sensors


within the MSS are a multisonar system at the vehicle's front end, several side-looking ultrasonic range finders and two shaft encoders mounted on measuring wheels for odometric purposes. This paper focuses on the 3-D laser range camera and its application within sensor-assisted navigation based on typical indoor landmarks such as walls, doors or corridors. The performance requirements of MACROBE's sensory system and the camera's measurement principle are outlined in Section 2. The sensor hardware design, including the laser head, receiver electronics, 3-D deflection system and a digital signal processing unit, is discussed in Section 3. Static and dynamic performance data from a prototype laser range camera are given in Section 4, while Section 5 concludes the paper with an outlook on future work in the mobile robot project MACROBE.

2. Basic Design Aspects of the LSR Range Camera

2.1. Sensor Performance Requirements
Due to MACROBE's navigation speed of up to 2 m/s, sensor-assisted navigation requires a sufficiently large area in front of the mobile robot to be monitored. Thus, a field of view with a spatial angle of about 60° × 60°, a high spatial resolution within a range of up to 10 m and a frame rate of several 3-D range images per second are required. Only non-tactile sensors [1, 2] are suited to covering these demands.



Fig. 1. MACROBE system and its information processing.

Fig. 2. The two-frequency phase-shift method.

Non-tactile range measurement techniques may be classified either as active techniques, directing visible or infrared (IR) light, ultrasonic [3] or radar pulses [4] at the surface to be measured, or as passive techniques based on vision [5]. To meet the demand for high spatial resolution, only an active technique emitting collimated laser light or passive stereo vision is suitable. However, due to changing illumination conditions and complex calibration procedures, stereo vision is in general not well suited to MACROBE's operational requirements. Furthermore, by providing range data with minimal computation time, active techniques are more attractive for real-time navigation. For these reasons we selected an active range measurement technique: a collimated laser beam is directed at the object to be measured and the back-scattered light is sensed. Full-scene range data are obtained by scanning the transmitted laser beam with two synchronized mirrors. In contrast to high-power laser range systems developed for mobile robots in outdoor applications [6], the LSR range camera is designed for eye-safe indoor operation.

2.2. The Two-frequency Phase-shift Method
Because of the requirement for high-precision range measurement within a range of up to 10 m, evaluating the phase shift between a reference laser beam and the back-scattered laser light (Fig. 2) is more suitable than measuring the light's extremely short time-of-flight.

The amplitude P_E of the emitted continuous-wave laser signal is simultaneously modulated at two frequencies, ω₁ and ω₂. Laser light back-scattered from a target is collected by an ultrasensitive photodetector; the received amplitudes P_D are fairly small. Due to the time-of-flight, the received sine-shaped signals are phase-shifted relative to their references in the transmitted signal (Fig. 2). The phase shifts φ₁,₂ are proportional to the range D and to the modulation frequencies. Since phase shifts are unique only modulo 2π, the modulation frequencies ω₁ and ω₂ are selected to provide both a sufficient measurement range and an appropriate range resolution. A low-frequency signal (LFS: ω₁ = 10 MHz) guarantees a coarse but absolute measurement over the range s₁ (s₁ = 15 m), whereas the high-frequency component (HFS: ω₂ = 80 MHz) provides fine (s₂ = 1.875 m) but ambiguous range information over s₁. A correct combination of the phase shifts φ₁ and φ₂ of both frequencies provides absolute and accurate range measurements within the specified range. Due to the high resolution of the HFS, inevitable errors in the phase measurements (shot noise) cause only minor range errors. Errors Δφ₁ occurring in the LFS phase φ₁ do not cause range errors as long as the corresponding coarse-range error remains below half the fine interval s₂, so that the correct HFS ambiguity interval is still selected.
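The quoted intervals follow directly from the round-trip geometry. The short sketch below (our illustration; the variable names are not from the paper) reproduces s₁ and s₂ from the two modulation frequencies:

```python
# Minimal check of the unambiguous intervals implied by the
# two modulation frequencies of the phase-shift method.
C = 299_792_458.0  # speed of light, m/s

def unambiguous_range(f_mod_hz: float) -> float:
    # One full 2*pi phase cycle covers a round trip, so the one-way
    # unambiguous interval is c / (2 * f_mod).
    return C / (2.0 * f_mod_hz)

s1 = unambiguous_range(10e6)  # LFS -> coarse, absolute interval
s2 = unambiguous_range(80e6)  # HFS -> fine, ambiguous interval
print(f"s1 = {s1:.2f} m, s2 = {s2:.4f} m")  # s1 = 14.99 m, s2 = 1.8737 m
```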
3. The LSR Laser Range Camera

The camera hardware [7] consists of four major components: a laser head, transmitter/receiver electronics and a 3-D beam-deflection system.

Fig. 3. Modules of the 3-D laser range camera.

Fig. 4. Mechanical and optical design of the 3-D laser range camera.

High-speed data transfer between the laser camera and external devices, as well as real-time sensor data preprocessing, is achieved by a digital signal processing (DSP) unit (Fig. 3).

3.1. Laser Head
The laser head emits a collimated IR laser beam (λ = 810 nm) and detects the laser light back-scattered from a target. It consists of four micromechanical and electrooptical components: an objective, an APD (avalanche photodiode) unit, a transmitter module and a precision body. The laser cone emitted by a GaAlAs semiconductor laser diode is formed by a set of microlenses into a coaxial path of rays. The laser beam is split into a reference beam and a range-measuring beam by means of a mirror of 4 mm diameter, perforated at an angle of 45°. An aspheric receiver lens, an IR filter for the elimination of spectral noise in the laser light, and the APD unit are used to evaluate the reflected laser light (Fig. 4). Coaxial ray tracing of the emitted and back-scattered laser light is achieved by mounting the transmitter module and the objective rigidly to the laser head's body. To reduce noise as well as electromagnetic interference, the high-frequency components of the laser head are carefully shielded. The power of the emitted laser beam is continuously controlled with respect to a reference value which guarantees eye-safe operation [8].

Any deviation from the eye-safe operation level causes the laser beam to be shut off automatically.

3.2. Receiver Electronics
The demodulation electronics consist of two frequency-selective quadrature receivers, four filters and four high-speed flash A/D converters. The quadrature receivers evaluate the respective phase-shift signals of the LFS and HFS and generate sine (pᵢ) and cosine (qᵢ) signals for each frequency; together, both signals form a complex measurement vector. After amplification and Bessel filtering in the analog unit, followed by sample-and-hold and digitization by means of 12-bit flash A/D converters, the signals pᵢ and qᵢ of the LFS and HFS are processed further in the digital unit. By means of a Pythagoras processor, the digitized signals are transformed into a magnitude, representing the signal quality Qᵢ, and the phase φᵢ of the complex vector.
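In software terms, the Pythagoras step amounts to a rectangular-to-polar conversion per modulation frequency. The following sketch assumes the digitized pᵢ, qᵢ samples are available as floats; it illustrates the computation, not the hardware signal path:

```python
import math

def quadrature_to_polar(p: float, q: float) -> tuple[float, float]:
    # Magnitude of the complex measurement vector: the signal quality Q.
    quality = math.hypot(p, q)
    # Phase of the complex vector, normalized to [0, 2*pi).
    phi = math.atan2(p, q) % (2.0 * math.pi)
    return quality, phi

# Example: a clean signal whose phase sits a quarter of the way
# through the ambiguity interval.
Q, phi = quadrature_to_polar(1.0, 0.0)
print(Q, phi)  # 1.0, pi/2
```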

3.3. 3-D Deflection System
Since the apertures of electrooptical and acoustooptical light-deflection systems are inadequate for deflecting the received light cone onto the 55 mm diameter aspheric receiver lens, a 3-D deflection system was developed. Two synchronized electromechanical mirrors perform fast line-by-line scanning of the specified 3-D scenery (Fig. 4). A rotating six-sided polygonal mirror provides rapid azimuth deflection, while a tilt mirror performs the comparatively slow elevation deflection. A non-linear gear transforms the rotational movement of the d.c. motor into a saw-tooth movement of the tilt mirror.


The rotation of the polygonal mirror is speed controlled, and the rotation speed of the tilt mirror is held at a fixed ratio to that of the polygonal mirror. Both mirrors are used to deflect the emitted laser beam and to collect the back-scattered light cone onto the APD unit. The tilt rate and the rotational speed of the polygonal mirror are selected such that a measurement rate of three range images per second, each consisting of 41 scan lines with 161 range pixels, is guaranteed (Table 1).
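These figures fix the scan timing. The back-of-the-envelope calculation below is ours; it assumes one scan line per polygon facet, which the paper does not state explicitly:

```python
FRAMES_PER_S = 3
LINES_PER_FRAME = 41
PIXELS_PER_LINE = 161
POLYGON_FACETS = 6          # six-sided polygonal mirror

pixel_rate = FRAMES_PER_S * LINES_PER_FRAME * PIXELS_PER_LINE
line_rate = FRAMES_PER_S * LINES_PER_FRAME
polygon_rps = line_rate / POLYGON_FACETS  # assuming one line per facet

print(f"{pixel_rate} range measurements per second")  # 19803
print(f"polygon mirror: {polygon_rps:.1f} rev/s "
      f"({polygon_rps * 60:.0f} rpm)")                # 20.5 rev/s (1230 rpm)
```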

3.4. Digital Signal Processing (DSP) Unit
The DSP unit operates at two processing levels. Based on the 3-D range values computed at the first level, the second level performs pixel-oriented range data preprocessing.

Level 1: Range value D
The phase shifts φ₁,₂ of both frequencies are combined into an absolute and accurate range value. Due to the HFS's range ambiguity within the measurement interval of the camera,

N = ω₂/ω₁                                              (1)

Dⱼ(φ₂) = [c/(2ω₂)](φ₂ + j·2π),  j = 0, …, N − 1        (2)

D(φ₁) = [c/(2ω₁)]·φ₁                                   (3)

the final range value D is obtained by selecting the Dⱼ(φ₂) value closest to the coarse range value D(φ₁) given by the LFS.
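A direct transcription of this selection rule could look as follows (a sketch assuming phase shifts in radians; not the DSP firmware itself):

```python
import math

C = 299_792_458.0            # speed of light, m/s
W1 = 2 * math.pi * 10e6      # LFS angular frequency (10 MHz)
W2 = 2 * math.pi * 80e6      # HFS angular frequency (80 MHz)

def combine_range(phi1: float, phi2: float) -> float:
    # Number of HFS ambiguity intervals within s1, eqn (1).
    n = round(W2 / W1)
    # Coarse but absolute range from the LFS, eqn (3).
    d_coarse = C / (2 * W1) * phi1
    # All candidate fine ranges from the HFS, eqn (2).
    candidates = [C / (2 * W2) * (phi2 + j * 2 * math.pi) for j in range(n)]
    # Final range D: the candidate closest to the coarse value.
    return min(candidates, key=lambda d: abs(d - d_coarse))

print(f"{combine_range(2.0956, 4.199):.3f} m")  # target at ~5 m
```

For a target at 5 m, for instance, φ₁ ≈ 2.10 rad yields a coarse range of 5.0 m, and the HFS candidate for j = 2 (≈ 5.00 m) is selected.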

Level 2: Pixel-oriented range image preprocessing
For real-time sensor data processing, special functions requiring high computational effort are performed in a pixel-oriented way, concurrently with the measurement: pixel Pᵢ passes the data preprocessing steps during the measurement of pixel Pᵢ₊₁. The following range data preprocessing steps are performed by means of a DSP board based on two TMS320C25 processors.

Table 1. Characteristics of the 3-D laser range camera

Laser head
  IR semiconductor laser (810 nm)
  Hybrid APD photosensor
  Emitted laser power: 1 mW

Receiver electronics
  Two-frequency phase-shift method
  Modulation frequencies: 10/80 MHz
  Resolution: up to 7 mm
  Working distance: 0.2-15 m

3-D deflection system
  Beam deflection through a set of two mirrors
  Field of view: 60° × 60°
  Spatial resolution: 161 × 41 range pixels
  Frame rate: three 3-D range images per second

3.5. Filtering Software

Due to the properties of the two-frequency phase-shift method, the raw data may be disturbed. Filtering software based on a detailed sensor model eliminates fixed-distance spikes and edge overshooting.

3.6. Coordinate Transformation
The position of an object point in the vehicle's coordinate system is derived from the measured range D and the positions of both mirrors, i.e., their angles of deflection (azimuth and elevation). Both Cartesian and cylindrical coordinates (Fig. 1) are used.
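As an illustration only: if both deflection angles are referred to a common origin, the Cartesian mapping is the usual spherical one. The real camera geometry must account for the offsets between the two mirrors, which the paper does not detail:

```python
import math

def to_cartesian(d: float, azimuth: float,
                 elevation: float) -> tuple[float, float, float]:
    # Idealized single-origin model: range plus the two deflection
    # angles (radians) mapped to vehicle-frame Cartesian coordinates.
    x = d * math.cos(elevation) * math.cos(azimuth)   # forward
    y = d * math.cos(elevation) * math.sin(azimuth)   # lateral
    z = d * math.sin(elevation)                       # vertical
    return x, y, z

print(to_cartesian(5.0, 0.0, 0.0))  # straight ahead: (5.0, 0.0, 0.0)
```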


3.7. I/O Interface
Different steps of data processing can be performed without detailed knowledge of the complete Cartesian range image of the 3-D environment. To limit data transfer and subsequent range data processing to a specified region of the image, the I/O interface of the DSP unit is on-line programmable through the sensor computer. Range pixels are transferred between the camera and the sensor computer via direct memory access. Because the range data sampling intervals are uniform in the horizontal and vertical directions, the resulting image can be represented as a 2-D indexed matrix. Each element of this matrix contains the Cartesian and cylindrical coordinates of the object's surface point together with the signal quality Q. This highly reliable matrix is the basis for subsequent data processing steps.

4. Experimental Results

4.1. Performance Data
The 3-D laser range camera was validated in structured test scenes by means of special 3-D test objects of differing reflectance. Static performance data result from evaluating test scenes with a fixed position of the deflection mirrors. Range measurements at various distances and under various angles of incidence demonstrate that the range data are nearly independent of external conditions and surface materials. Evaluation of the range data and the corresponding signal quality Q shows that range errors and drop-out frequency are correlated with Q (Fig. 5); the signal quality Q can therefore be used directly as a performance index representing range data reliability. Dynamic performance data are derived by deflecting the laser beam in the horizontal direction only; both the distance data and the widths of the objects are evaluated. Due to the scanning mechanism, with a horizontal resolution of 161 points over 60°, only objects with a minimum width of about 1 cm per metre of distance can be reliably detected.
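The 1 cm/m figure is consistent with the horizontal sampling, as the following estimate (our arithmetic, not from the paper) suggests:

```python
import math

FOV_DEG = 60.0
PIXELS_PER_LINE = 161

# Angular spacing between adjacent range pixels along a scan line.
step_rad = math.radians(FOV_DEG / (PIXELS_PER_LINE - 1))
# Small-angle approximation: lateral extent of one pixel per metre of range.
width_per_m = step_rad * 1.0

print(f"{math.degrees(step_rad):.3f} deg/pixel "
      f"-> ~{width_per_m * 1000:.1f} mm per metre of distance")
# ~0.375 deg/pixel -> ~6.5 mm/m; reliable detection plausibly needs
# more than a single hit per object, hence the ~1 cm/m figure above.
```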

Fig. 5. Static performance data: (a) measurement variance; (b) measurement average; (c) drop-out frequency.

4.2. Sensor-assisted Navigation
Figure 6 shows a 3-D laser range image and a video image from a sensor-assisted navigation experiment on MACROBE's operation in corridors. To achieve real-time capability, the 3-D range image data are reduced to a 2-D description representing the free motion space [9]. This description of the scene serves as the basis for further range data processing. Experiments such as guarded motion, obstacle evasion, object following and navigation in corridors require different sensor-assistance strategies, even involving different physical sensors. For this purpose, feature extraction is performed by means of so-called virtual sensors.

Fig. 6. Operation in a corridor: (a) video image; (b) perspective animation of 3-D range image.


Virtual sensors represent basic algorithmic structures which process one or more input data sets into a single output data set. Input data for a virtual sensor may come from physical sensors and/or from other virtual sensors. Depending on the vehicle's motion task, the MSS configures and activates a corresponding network of virtual sensors to extract natural landmarks from the range data. The output data can be used immediately by the MPCS as motion-relevant features, enabling sensor-assisted navigation of MACROBE.
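Read as a software architecture, a virtual sensor is a composable processing node. The sketch below is our illustrative reading of this description, not the MSS implementation; all class, function and sensor names are hypothetical:

```python
from typing import Callable, List

class VirtualSensor:
    """A node that processes one or more input data sets into a
    single output data set; inputs may be physical or virtual."""
    def __init__(self, name: str, inputs: List["VirtualSensor"],
                 process: Callable[[list], object]):
        self.name = name
        self.inputs = inputs      # sensors feeding this node
        self.process = process    # algorithm mapping inputs to one output

    def read(self) -> object:
        # Pull data from all upstream nodes, then apply this node's algorithm.
        return self.process([s.read() for s in self.inputs])

# A physical sensor is a leaf node wrapping a device (dummy data here).
range_camera = VirtualSensor("range_camera", [],
                             process=lambda _: [[0.2, 0.2], [4.8, 5.1]])
# A landmark extractor is a virtual sensor fed by the camera.
wall_detector = VirtualSensor("wall_detector", [range_camera],
                              process=lambda data: ("free_space", len(data[0])))
print(wall_detector.read())
```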

5. Conclusions

Experiments on sensor-assisted navigation, even in complex and unstructured indoor environments [9, 10], demonstrate the real-time capability of the 3-D laser range camera within the MSS. Feature extraction by means of virtual sensors leads to a clear and expandable software structure which is capable of multiprocessing operation. The current virtual sensors process data only from the camera's instantaneous 3-D range map. To achieve perception of moving objects or long-term observation of situations during navigation, feature extraction must be based on a sequence of images. The next steps of our research will therefore focus on recognizing more complex landmarks such as doors or crossings of corridors. Furthermore, the robustness of the implemented range data processing algorithms in the presence of disturbances will be an essential aspect of future research. Finally, the integration of a CCD camera into the multisensory system is planned.

References

1 P. Besl, 3-D range imaging sensors, General Motors Research Publications, GMR-6090, MI, U.S.A., 1988.
2 H. R. Everett, Survey of collision avoidance and ranging sensors for mobile robots, Robotics Autonomous Syst., 5 (1989) 5-67.
3 A. Elfes, Sonar based real-world mapping and navigation, IEEE J. Robotics Automation, RA-3 (1987) 1151-1156.
4 M. Lange and J. Detlefsen, 94 GHz 3-D imaging radar for sensor based locomotion, IEEE Int. Microwave Symp., Long Beach, CA, U.S.A., June 13-15, 1989, pp. 1091-1094.



5 E. D. Dickmanns and Th. Christians, Relative 3D-state estimation for autonomous visual guidance of road vehicles, Int. Conf. Intelligent Autonomous Systems, Amsterdam, The Netherlands, Dec. 11-14, 1989, pp. 683-693.
6 J. T. Savage, A 3-D scanned laser range-finder system for a cross country autonomous vehicle, Proc. Int. Advanced Robotics Program, First Workshop on Manipulators, Sensors and Steps towards Mobility, Karlsruhe, F.R.G., 1987, pp. 159-171.
7 G. Karl and G. Schmidt, Die Umwelt dreidimensional erfassen, Elektronik, 19 (1989) 78-88.
8 VDE-Verlag, Strahlungssicherheit von Laser-Einrichtungen, Deutsche Norm VDE 0837, IEC 825 (1986).
9 G. Schmidt and G. Karl, A 3-D laser range camera for mobile robot motion control, IEEE Int. Workshop Intelligent Robots and Systems, Tokyo, Japan, Oct. 31-Nov. 2, 1988, pp. 605-610.
10 Ch. Fröhlich, G. Karl and G. Schmidt, Sensor assisted navigation of a mobile robot in building environments, IEEE Int. Workshop Sensorial Integration for Industrial Robots: Architectures and Applications, Zaragoza, Spain, Nov. 22-24, 1989, pp. 397-399.