World modeling for an autonomous mobile robot using heterogenous sensor information


Robotics and Autonomous Systems 14 (1995) 159-170


Klaus-Werner Jörg *
Computer Science Department, Robotics Research Group, University of Kaiserslautern, P.O. Box 3049, W-67653 Kaiserslautern, Germany

Abstract

An Autonomous Mobile Robot (AMR) has to show both goal-oriented behavior and reflexive behavior in order to be considered fully autonomous. In a classical, hierarchical control architecture these behaviors are realized by using several abstraction levels, while their individual informational needs are satisfied by associated world models. The focus of this paper is to describe an approach which utilizes heterogenous information provided by a laser-radar and a set of sonar sensors in order to achieve reliable and complete world models for both real-time collision avoidance and local path planning. The approach was tested using MOBOT-IV, which serves as a test platform within the scope of a research project on autonomous mobile robots for indoor applications. Thus, the experimental results presented here are based on real data.

Keywords: Laser radar; Sonar sensing; Sensor fusion; Collision avoidance; World modeling

1. Introduction

An Autonomous Mobile Robot (AMR) is a system capable of interpreting, planning and executing a given task without any external support. In order to be considered fully autonomous, an AMR must show two fundamental behaviors: goal-oriented behavior and reflexive behavior. Achieving goal-oriented behavior requires the capability to plan the execution of a given task. In a classical, hierarchical control architecture (Fig. 1) planning is usually performed at two levels of

* Tel. +49 631 205 2621; Fax: +49 631 205 2803; E-mail: [email protected]

abstraction: the strategical level and the tactical level. At the strategical level ('Planner') global path planning is performed, which requires information about the topology of the AMR's overall environment. This plan is used at the tactical level ('Navigator') to perform local path planning, which requires geometrical information about the vehicle's local environment. The plan generated at the tactical level describes a path which is collision-free with respect to the environmental information available while generating the plan. Unfortunately, in a nondeterministic, unpredictable, dynamic world objects can change their locations such that the planned path may not be collision-free at execution time. Thus, reflexive behavior is needed to efficiently avoid collisions. Collision

0921-8890/95/$09.50 © 1995 Elsevier Science B.V. All rights reserved. SSDI 0921-8890(94)00027-1



avoidance is performed at the operational abstraction level ('Pilot') and requires information about the relative position of objects in the close vicinity of the AMR with respect to its actual position. Concerning the real-time conditions an autonomous mobile robot has to cope with, it may be stated that the lower a component is located in the hierarchical control structure, the more it has to take real-time aspects into account. This means that the associated information-producing components have to perform their computations in real time, too. The informational needs of the individual abstraction levels are satisfied by associated world models which are updated periodically using problem-oriented sensor signal interpretation. The major difficulty during robot perception is to interpret the sensor signals in real time such that each world model at any time represents an optimal estimate of the real world's state. A crucial problem in sensing is that of coping with uncertainty. The synergistic use of multiple heterogenous sensing devices may help to overcome the shortcomings of each individual sensing device, since information can be provided concerning environmental features that cannot be perceived using individual stand-alone sensing devices. Also, the fusion of redundant information from multiple sensors can reduce uncertainty, especially if an individual sensor fails. It may be said that synergism in sensing is the key to obtaining more accurate world models and consequently more reliable autonomous mobile robots. This paper describes an approach for the integration of heterogenous sensor information, which is provided by both a laser-radar and a set of 24 sonar sensors. The approach is basically suitable for performing world modeling at both the tactical and the operational abstraction level and has been tested using the mobile platform MOBOT-IV (Fig. 2). Section 2 presents a brief survey of related work in sonar world modeling. Section 3 describes the approach in detail. Finally, Section 4 presents experimental results.

Fig. 1. Hierarchy of abstraction levels.

2. Related work in sonar world modeling

Beckerman and Oblow [1] have developed a rule-based approach which deals with the treatment of systematic errors. Each cell of their grid is either labeled as empty, occupied, unknown, or conflict. Conflicts may occur whenever an object is being observed at different times from different locations. The approach has disadvantages concerning dynamic obstacles, and moreover is time-consuming since they use a two-dimensional sensor model. The probabilistic approach of Elfes [2,3] generates an occupancy grid which explicitly represents free and occupied regions in space. Elfes' sensor model is two-dimensional, as well. Since he takes into account the ultrasound beam as a whole, a large number of grid cells corresponding to the projection of the beam has to be updated. This approach also requires quite a lot of computing time. Borenstein and Koren [4,5] call their method for grid-based mapping the vector field histogram. In order to achieve short


computing times, they use a one-dimensional sensor model which updates only the grid cells corresponding to the beam's acoustic axis. Thus, a high firing rate of the sensors is achieved and the shortcomings of the trivial sensor model are largely compensated. These approaches were developed in order to overcome three major problems in sonar sensing: specularity, beam width, and frequent misreadings due to either external ultrasound sources or crosstalk. If specular reflections can efficiently be eliminated and frequent misreadings can be suppressed, each valid sonar range reading tells two things: first, there is an obstacle somewhere at the measured distance, and second, there is no obstacle in between (free-space information). Under real-time aspects it would be desirable to operate all available sonars in parallel. Unfortunately, a crucial problem in sonar sensing with multiple sensors is that of mutual influence: a transducer's sound wave must not influence other sensors (cross-talk). Thus, sonar sensing with multiple sensors always requires a firing strategy which is a compromise between the physical constraints of sound on the one hand and real-time

requirements on the other. The approach presented here operates groups of sonars sequentially while the individual sensors of each group are operated in parallel (scheduled firing). Firing strategies are not further discussed here; a very sophisticated firing strategy developed by Borenstein can be found in [6].
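The scheduled-firing idea described above can be sketched as follows. The paper does not specify how MOBOT-IV's 24 sonars are partitioned, so the group size and the assignment rule below are assumptions: sensors whose ring positions are maximally separated fire together, so that their sound cones are unlikely to interfere, while the groups themselves fire one after another.

```python
# Sketch of a "scheduled firing" strategy (hypothetical grouping; the paper
# does not give MOBOT-IV's actual schedule). Members of one group are spaced
# evenly around the ring and fire in parallel; groups fire sequentially.

NUM_SONARS = 24
GROUP_SIZE = 4          # assumed: 4 sensors fire in parallel -> 6 groups


def build_schedule(num_sonars=NUM_SONARS, group_size=GROUP_SIZE):
    """Partition the sonar ring into groups of mutually distant sensors."""
    num_groups = num_sonars // group_size
    # sensor i goes to group (i mod num_groups), so the members of one
    # group sit num_groups positions apart around the ring
    return [[g + k * num_groups for k in range(group_size)]
            for g in range(num_groups)]


def firing_sequence(schedule):
    """Yield the groups in cyclic order, one group per firing slot."""
    while True:
        for group in schedule:
            yield group


schedule = build_schedule()
# e.g. schedule[0] == [0, 6, 12, 18]: four transducers 90 degrees apart
```

With 24 sensors and an assumed group size of 4, a full panoramic update takes six firing slots, which is the compromise between parallelism and cross-talk suppression that the text describes.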

3. Integrating heterogenous multisensory information

As already mentioned above, within the MOBOT-project world modeling is based on two sensory sources: a laser-radar which is mounted on top of the robot and a set of 24 sonar sensors which are arranged as a ring around the vehicle (Fig. 3). The laser-radar delivers two panoramic scans per revolution with 720 range readings per scan. These are used to perform two different tasks: statistical interpretation and geometrical abstraction. Statistical interpretation provides information about both the orientation and the position of the vehicle in its indoor environment. Geometrical

Fig. 2. Mobile platform MOBOT-IV.

Fig. 3. Sonar sensor configuration (to scale). The 36° beam width shown results from an intensively absorbing target.


abstraction uses the range readings to compute a line description of the AMR's actual environment which is called Radar Map (RM) [7]. In order to become invariant with respect to the AMR's motion, the sonar range readings are first transformed into global Cartesian coordinates and then used to update a representation structure which is called Segment Map (SMM). The approach presented here is based on both of these maps.

3.1. Extracting complementary sonar information

The first step of the approach uses the Radar Map to verify corresponding sonar-based range information provided by the Segment Map. This step's objective is the extraction of sonar range readings which are complementary to corresponding laser-based range information in the sense that they provide additional environmental information. The step is motivated by the assumption that, although the laser-radar is not able to detect all relevant environmental features, the distance information provided by the Radar Map is valid in any case. This is justified by both the laser-radar's high accuracy and its excellent angular resolution.

Extraction of complementary sonar information is accomplished by first applying an ideal sonar sensor model to the Radar Map in order to compute a hypothetical sonar scan. The ideal sensor model treats the beam of an ultrasonic sensor as a sector of a circle with an angle α = 30°. The area covered by this idealized beam is considered to be empty, while an obstacle exists somewhere along the circle's segment given by α and the measured distance. A hypothetical sonar scan specifies the range readings one would expect to obtain from the real sonar sensors under the assumptions (a) that sonar sensing does not suffer from specular reflections, crosstalk, other sonar sources, etc., and (b) that the laser-radar is able to provide information about all really existing obstacles. The superposition of a hypothetical sonar scan and its corresponding real sonar scan allows those real sonar range readings to be identified which come along with additional environmental information. Consequently, redundant and false (specular reflections) distance information is almost completely eliminated. By contrast, this effect cannot be achieved for erroneous range readings caused by crosstalk and/or other sonar sources, because this type of range reading is characterized by the fact that the measured distance is smaller than the real one.

Fig. 4 illustrates this. Please note that, since the example is intended to demonstrate the basic idea, it is not based on real data. However, the example resembles reality to a great extent. Fig. 4a shows the AMR's actual environment. The room has a closed glass door and is completely empty except for one round obstacle, which is assumed to be invisible to the laser-radar. Thus, neither the round obstacle nor the glass door is represented in the Radar Map (Fig. 4b). Consequently, both obstacles cannot be represented in the corresponding hypothetical sonar scan, which is shown in Fig. 4c. Since the hypothesis is computed by abstracting from all shortcomings of a real sonar sensor, specular reflections can also not be found in the hypothetical sonar scan. Fig. 4d shows the (presumed) real sonar scan. Both the glass door and the round obstacle have been detected. Additionally, two specular reflections have occurred. The superposition of hypothesis and real sonar scan is shown in Fig. 4e. For the sake of clarity, the individual hypothetical range readings are shown as one continuous area. Real sonar range readings providing additional environmental information are depicted in grey. These sonar range readings represent environmental information which could not be provided by the laser-radar. All the other real sonar range readings are either redundant or false and are thus eliminated. Please note that specular reflections are efficiently eliminated by this mechanism. However, since a real Radar Map normally has gaps (i.e. no corresponding laser range information is available), possibly not all specular reflections can be eliminated.
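The superposition step can be sketched as a simple per-sensor comparison. The hypothetical scan is assumed to be given; the function name and the matching tolerance are assumptions, not taken from the paper. A real reading clearly shorter than the hypothesis reports an obstacle the laser-radar did not see; a reading close to the hypothesis is redundant; a reading longer than the hypothesis is treated as a specular reflection and discarded.

```python
# Sketch of the superposition step of Section 3.1 (names and the tolerance
# value are assumptions). Note that, as the paper points out, crosstalk
# readings are also shorter than the hypothesis and therefore survive this
# filter; they are handled later by the grid-based fusion step.

TOLERANCE = 0.10  # metres; assumed matching threshold


def extract_complementary(real_scan, hypothetical_scan, tol=TOLERANCE):
    """Return indices of real sonar readings that carry new information."""
    complementary = []
    for i, (real, hypo) in enumerate(zip(real_scan, hypothetical_scan)):
        if real is None:          # no echo received
            continue
        if real < hypo - tol:     # obstacle invisible to the laser-radar
            complementary.append(i)
        # real close to hypo: redundant; real > hypo + tol: specular, dropped
    return complementary


# Example: sensor 1 sees a glass door at 1.2 m that the laser missed,
# sensor 2 returns a specular over-range reading.
real = [2.0, 1.2, 3.5]
hypo = [2.0, 2.2, 2.2]
assert extract_complementary(real, hypo) == [1]
```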

3.2. Stabilizing the complementary sonar information using grid-based fusion

Similar to the approaches presented in Section 2, the second step of the approach utilizes a


Fig. 4. Example for the extraction of complementary sonar information. (a) environment; (b) (presumed) Radar Map; (c) hypothetical sonar scan; (d) (presumed) real sonar scan; (e) valid sonar range readings.


grid-based mechanism to locally and spatially stabilize the extracted, valid sonar range information over time. This step's purpose is to reduce uncertainty caused both by a sonar's poor directionality and by disturbing effects (cross-talk, other sonar sources, etc.). Additionally, the grid helps to weaken the impact of specular reflections which could not be eliminated by the step previously described. To accomplish this, the complementary sonar range readings are interpreted using a trivial sonar sensor model and gradually fused with the accumulated environmental knowledge. Since a multitude of range readings regarding certain environmental features can only be obtained from multiple sonar sensors at various positions of the vehicle, motion is an essential prerequisite of this step. As a consequence of the sonar's inherent uncertainty and inaccuracy, some of the extracted sonar range readings are redundant with respect to the corresponding accumulated environmental knowledge while others are contradictory. The fusion of redundant information leads to an increased belief in its correctness, while contradictory information causes just the opposite: gradual degradation of the correctness of accumulated environmental knowledge can finally cause the knowledge to fall into oblivion. To sum up, this second step's objective is to incrementally build up a grid-based representation of the environment which most accurately reflects the real obstacle configuration despite the inherent limitations that come along with sonar sensing.

The trivial sonar sensor model defines that only one cell per range reading is incremented (Fig. 5). This cell corresponds to the projection of the beam's acoustic axis at the measured distance. In contrast to the ideal sensor model used in the previous step, this sensor model explicitly takes into account a sonar's poor directionality: the maximum uncertainty is half the beam width. So far, the sensor model is similar to the one used by Borenstein [6]. Differently from Borenstein, the sensor model presented here does not take into account the free-space information associated with a sonar range reading in order to revise already occupied grid cells. If this were


Fig. 5. Trivial sonar sensor model.
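The trivial sensor model of Fig. 5 can be sketched in a few lines. The robot-centred geometry and the cell size are assumptions (the paper does not state the AGM's cell size); the essential property is that exactly one grid cell, the one hit by the beam's acoustic axis at the measured distance, is incremented per reading.

```python
# Minimal sketch of the trivial sonar sensor model (Fig. 5): only the single
# grid cell on the beam's acoustic axis at the measured distance is
# incremented. Cell size and the coordinate convention are assumptions.

import math

CELL_SIZE = 0.10  # metres per grid cell (assumed)


def cell_of_reading(sensor_x, sensor_y, axis_angle, distance,
                    cell_size=CELL_SIZE):
    """Map one range reading to the grid cell on the beam's acoustic axis."""
    x = sensor_x + distance * math.cos(axis_angle)
    y = sensor_y + distance * math.sin(axis_angle)
    return (math.floor(x / cell_size), math.floor(y / cell_size))


def apply_reading(grid, sensor_pose, distance):
    """Increment exactly one cell per valid range reading."""
    x0, y0, angle = sensor_pose
    cell = cell_of_reading(x0, y0, angle, distance)
    grid[cell] = grid.get(cell, 0) + 1


grid = {}
apply_reading(grid, (0.0, 0.0, 0.0), 1.0)   # obstacle 1 m straight ahead
assert grid == {(10, 0): 1}
```

Because a single dictionary update replaces the per-beam region update of the two-dimensional models, the cost per reading is constant, which is exactly what permits the higher firing rate discussed in the text.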

done, specular reflections could cause correctly occupied grid cells to be attributed as being empty. Since only one cell is incremented per range reading, the sensor model requires less computational effort to update the grid map. Shorter update periods allow a higher firing rate of the sensors and thus the acquisition of more environmental information per time interval. This additional information comes along with a partial compensation of the sensor model's directional uncertainty.

In contrast to the approaches mentioned above, the approach presented here utilizes a local grid map. This map is defined relative to the AMR and consists of a fixed number (100 × 100) of quadratic cells. Since all valid, individual sonar range information is accumulated over time, this intermediate representation structure is called Accumulative Grid Map (AGM). The AGM follows all translational motions of the robot, but not the rotational ones (Fig. 6). Although the AGM is a local map which is defined relative to the AMR, it represents the distance information in global Cartesian coordinates. The AGM is updated at discrete time intervals, while the individual length of each interval depends on the robot's actual velocity. Between two consecutive update positions the global coordinates of the grid cells remain unchanged. Thus, the robot moves across the AGM. At the very moment of the AGM's update the


Fig. 6. Accumulative Grid Map (principle) at two consecutive update positions.

map follows the robot's movement. Please note that only cells having a weight greater than zero are affected by the coordinate transformation. This, in combination with the limited size of the AGM, makes the coordinate transformation very fast. The AGM can be seen as a window following the AMR's movements, while, at any time, only the visible part of an imaginary, global grid map is relevant.

The fusion model which is used to update the AGM can be seen as a cyclic process consisting of two phases, called accumulation and propagation. During the accumulation phase, new range information is integrated into the AGM using the sensor model described previously. Subsequently, the propagation phase causes an improvement of the AGM's contrast in the sense of a spatial stabilization of the sonar-based environmental knowledge. Thus, propagation helps to compensate the sensor model's directional uncertainty, too. Besides, the fusion model reliably suppresses disturbances caused by cross-talk and/or other sonar sources.

Accumulation

Each cell C_i,j has an associated weight G(C_i,j) and a frequency measure H(C_i,j). The weight expresses the degree of belief that C_i,j is actually occupied by an obstacle. The frequency measure counts the number of update cycles that have passed since the cell's weight was last incremented: whenever a cell is not affected during an update cycle, H(C_i,j) is incremented. By contrast, if a cell is affected during an update cycle, H(C_i,j) is reset to zero. If H(C_i,j) exceeds a certain threshold H_max, then G(C_i,j) is set to zero, i.e. the cell is considered to be empty (revision mechanism). Both G(C_i,j) and H(C_i,j) are initially zero.

Propagation

The directional uncertainty of the trivial sensor model leads to a blurred, and thus unsatisfactory, approximation of the robot's environment in the grid map. To overcome this inaccuracy, i.e. to enhance the AGM's contrast, the individual weights of the actual cell's neighbors within a certain distance t are taken into account in order to compute the cell's propagated weight G_p(C_i,j). This step is based on the assumption that regions of cells with high individual weights correspond to real obstacles, while regions with low weights normally result from erroneous range readings. The total weight G_u(C_i,j) of the actual cell's neighbors is given by:

    G_u(C_i,j) = Σ (p,q = -t ... +t) G(C_i+p,j+q),   with (p,q) ≠ (0,0).

Subsequently, the propagated weight G_p(C_i,j) is computed taking into account a threshold G_T. Formally, G_p(C_i,j) is given by:

    G_p(C_i,j) := G(C_i,j) + G_u(C_i,j),       if G_u(C_i,j) ≥ G_T,
    G_p(C_i,j) := G(C_i,j) + k · G_u(C_i,j),   if 0 < G_u(C_i,j) < G_T,
    G_p(C_i,j) := 0,                           otherwise,

with 0 ≤ k ≤ 1.
Please note that G_T defines the signal-to-noise ratio of the approach. Obviously, this mechanism causes very high propagated weights for those cells that have neighbors with high individual weights. By contrast, if the weights of a cell's neighbors are generally low, the propagated weight will be low, too.
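The accumulation/propagation cycle can be sketched as follows. The concrete values of t, G_T, k and H_max are assumptions (the paper does not state them), and the neighbor condition is read here as excluding only the cell itself, which is the natural interpretation of "the actual cell's neighbors".

```python
# Sketch of the accumulation / propagation cycle of Section 3.2. G and H
# are the per-cell weight and frequency measure; the parameter values below
# are assumptions, not taken from the paper.

T = 1          # neighbourhood radius t (assumed)
G_T = 3        # propagation threshold (assumed)
K = 0.5        # damping factor k, 0 <= k <= 1 (assumed)
H_MAX = 10     # revision threshold H_max (assumed)


def accumulate(G, H, hit_cells):
    """One accumulation phase: increment hit cells, age all other cells."""
    for cell in list(G):
        if cell in hit_cells:
            G[cell] += 1
            H[cell] = 0
        else:
            H[cell] += 1
            if H[cell] > H_MAX:          # revision mechanism
                G[cell] = 0
    for cell in hit_cells:
        if cell not in G:                # first observation of this cell
            G[cell], H[cell] = 1, 0


def propagated_weight(G, i, j, t=T):
    """G_p(C_i,j), computed from the neighbour sum G_u(C_i,j)."""
    gu = sum(G.get((i + p, j + q), 0)
             for p in range(-t, t + 1) for q in range(-t, t + 1)
             if not (p == 0 and q == 0))
    if gu >= G_T:
        return G.get((i, j), 0) + gu
    if gu > 0:
        return G.get((i, j), 0) + K * gu
    return 0


G, H = {}, {}
for _ in range(3):                       # the same obstacle echoes 3 times
    accumulate(G, H, {(5, 5), (5, 6)})
assert propagated_weight(G, 5, 5) == 6   # strong neighbour -> high G_p
```

The example shows the intended contrast enhancement: a cell backed by a strongly weighted neighbor receives a propagated weight above its own, while an isolated cell keeps a low one.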

3.3. Perceiving moving obstacles

The assumption that cell regions with low individual weights result from erroneous range readings cannot generally be maintained in a dynamic environment. There, moving obstacles occupy a special position, since the time-discrete sonar firing comes along with the effect that a moving obstacle gradually increments the weights of cells that correspond to the obstacle's location at the time of the particular measurement. Thus, moving obstacles leave traces in the grid map in the sense that all cells belonging to such a trace share the common characteristic of having low individual weights. Additionally, the sensor model's directional uncertainty leads to a certain haziness of the trace which depends on the distance between the obstacle and the robot: the larger the distance, the greater the haziness. To sum up, it may be stated that the perception of a moving obstacle depends on the threshold G_T: the lower the threshold, the higher the probability that a moving obstacle will be perceived. Please note that a low threshold leads to an insufficient signal-to-noise ratio, i.e. an increased susceptibility to disturbances caused by specular reflections, crosstalk, and other sonar sources. Besides, the robot's capability to perceive a moving obstacle is also determined by the dimensions of the quadratic cell cluster which is taken into account while computing G_u(C_i,j). The probability of perceiving a moving obstacle increases with the number of cells belonging to the cluster. However, this leads to an increased computational effort, with disadvantages concerning the robot's reaction time, which is defined at the operational abstraction level. Finally, the robot's capability to perceive a moving obstacle also depends on both the obstacle's physical dimensions and its velocity.


3.4. Fusing Radar Map and Accumulative Grid Map

Radar Map and Accumulative Grid Map are complementary to each other in the sense that the RM represents information which is not represented in the AGM and vice versa. Thus, both maps reflect the complementary physical properties of the laser-radar and the sonar sensors, respectively. What remains to be done at this point is the fusion of the information provided by both maps in order to achieve a consistent representation of the environment. In the actual implementation RM and AGM serve for world modeling at both the tactical and the operational abstraction level. In order to reliably perform collision avoidance, the perception of moving obstacles at the operational abstraction level is of vital concern for an AMR. As can easily be seen, this is not desired at the tactical abstraction level. Thus, the use of individual AGMs at both the operational and the tactical abstraction level is necessary in order to fulfill their different qualitative requirements.

Operational abstraction level

The world model which is used at the operational abstraction level is based on a polar coordinate system and is called Sector Map (SCM). The SCM is defined relative to the AMR and consists of 48 sectors, each representing the distance between the AMR's boundary and an obstacle. Each sector's corresponding laser-based range value is compared with the associated sonar-based range value, and the smaller value is used to update the sector. Sectors representing a range value are called occupied sectors. As a result of the trivial sonar sensor model, the maximum position error of an obstacle is half the beam width. For the sake of safety, the sonar range value of an occupied sector is copied into both adjacent sectors. An adjacent sector which is already occupied by sonar range data remains unchanged because of the very high probability that the range data of this sector is correct.

Fig. 7. Test lab and experimental setup.
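The Sector Map update can be sketched as below. Function and variable names are assumptions; where the paper leaves the interaction of the safety copy with an existing laser value open, the sketch conservatively keeps the smaller range.

```python
# Sketch of the Sector Map update (operational level, Section 3.4). Each of
# the 48 polar sectors takes the smaller of its laser- and sonar-based range;
# each occupied sonar sector is then spread into both neighbours as a safety
# margin, without overwriting neighbours that already hold sonar data.
# Names and the min() tie-breaking are assumptions.

NUM_SECTORS = 48


def update_sector_map(laser, sonar):
    """laser, sonar: lists of NUM_SECTORS ranges (None = no reading)."""
    scm = []
    for l, s in zip(laser, sonar):
        ranges = [r for r in (l, s) if r is not None]
        scm.append(min(ranges) if ranges else None)
    # safety margin: copy each sonar reading into the adjacent sectors
    out = list(scm)
    for i, s in enumerate(sonar):
        if s is None:
            continue
        for j in ((i - 1) % NUM_SECTORS, (i + 1) % NUM_SECTORS):
            if sonar[j] is None:          # sectors with own sonar data stay
                out[j] = s if out[j] is None else min(out[j], s)
    return out


laser = [None] * NUM_SECTORS
sonar = [None] * NUM_SECTORS
sonar[0] = 1.5                 # one sonar obstacle straight ahead
scm = update_sector_map(laser, sonar)
assert scm[0] == scm[1] == scm[47] == 1.5
```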

Fig. 8. (a) Radar Map. (b) Hypothesis. (c) Real sonar scan. (d) Complementary range readings. (e) Accumulative Grid Map. (f) Current Sensor Map. (g) Example of a Sector Map (with corresponding RM and AGM).



Tactical abstraction level

Within the MOBOT-project, the long-term storage of environmental information is performed at the tactical abstraction level using a line-based approach. Thus, the environmental information provided by the AGM has to be converted into lines first. This conversion process takes place after the propagation phase previously described. Both the lines computed here and the lines provided by the Radar Map are used to update a representation structure called Current Sensor Map (CSM), which serves as the basis for the subsequent long-term storage of the geometrical environmental knowledge [7]. Potential candidates for the line generation are all grid cells with a propagated weight exceeding a certain threshold. Among these candidates, only the cells with the shortest distance to the robot are finally considered. This is achieved by first computing the polar coordinates of all relevant cells with respect to the robot's local Cartesian coordinate system. Depending on their individual angles, the cells are then arranged in ascending order. Subsequently, the cells with the shortest distance to the robot are determined. The last step applies an iterative end-point fit to these cells in order to compute the lines.
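The iterative end-point fit mentioned above is a standard split-and-merge recursion; the sketch below shows the split step on an ordered point sequence. The tolerance value is an assumption.

```python
# Sketch of an iterative end-point fit for the line-generation step
# (a standard recursion; the tolerance `eps` is an assumed parameter).
# Points are split at the point farthest from the chord between the first
# and last point until every point lies within eps of its segment.

import math


def point_line_distance(p, a, b):
    """Perpendicular distance from point p to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    norm = math.hypot(dx, dy)
    if norm == 0.0:
        return math.hypot(px - ax, py - ay)
    return abs(dy * (px - ax) - dx * (py - ay)) / norm


def endpoint_fit(points, eps=0.05):
    """Return line segments (start, end) approximating the ordered points."""
    if len(points) < 2:
        return []
    dists = [point_line_distance(p, points[0], points[-1]) for p in points]
    k = max(range(len(points)), key=dists.__getitem__)
    if dists[k] > eps:                       # split at the worst outlier
        return (endpoint_fit(points[:k + 1], eps) +
                endpoint_fit(points[k:], eps))
    return [(points[0], points[-1])]


# Two walls meeting in a corner: the fit should yield two line segments.
pts = ([(x / 10, 0.0) for x in range(11)] +
       [(1.0, y / 10) for y in range(1, 11)])
assert len(endpoint_fit(pts)) == 2
```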

4. Experimental results

Fig. 8 shows screen dumps of a test run which was performed using the real robot. Our test lab has metal walls and was additionally equipped with a metal fence, a thin plastic pipe and a black cardboard box, as shown in Fig. 7. MOBOT-IV was moving along the dotted track. Fig. 8a shows the Radar Map corresponding to the vehicle's actual position. Please note that the metal fence is not detected by the laser-radar. Fig. 8b shows the hypothetical sonar scan which was computed based on this Radar Map. In the actual implementation, the maximum range of a hypothetical sonar range reading is limited to 2.20 m. Fig. 8c shows the corresponding real sonar scan. As can be seen, several specular reflections have occurred. Fig. 8d shows the complementary sonar range readings which are extracted by superimposing hypothesis and real


sonar scan. The actual Accumulative Grid Map is shown in Fig. 8e and the corresponding Current Sensor Map in Fig. 8f. Fig. 8g gives an example of a Sector Map.

References

[1] M. Beckerman and E.M. Oblow, Treatment of systematic errors in the processing of wide-angle sonar sensor data for robotic navigation, IEEE Transactions on Robotics and Automation 6 (2) (1990).
[2] A. Elfes, Using occupancy grids for mobile robot perception and navigation, IEEE Computer 22 (6) (1989).
[3] A. Elfes, Dynamic control of robot perception using stochastic spatial models, Int. Workshop on Information Processing in Autonomous Mobile Robots, Munich, Germany (1991).
[4] J. Borenstein and Y. Koren, The vector field histogram - fast obstacle avoidance for mobile robots, IEEE Transactions on Robotics and Automation 7 (3) (1991).
[5] J. Borenstein and Y. Koren, Histogramic in-motion mapping for mobile robot obstacle avoidance, IEEE Transactions on Robotics and Automation 7 (4) (1991).
[6] J. Borenstein and Y. Koren, Noise rejection for ultrasonic sensors in mobile robot applications, Proc. 1992 IEEE Int. Conf. on Robotics and Automation, Nice, France (1992).
[7] P. Hoppen, T. Knieriemen and E. von Puttkamer, Laser-radar based mapping and navigation for an autonomous mobile robot, Proc. IEEE Int. Conf. on Robotics and Automation, Cincinnati, OH (1990).
[8] T. Knieriemen, E. von Puttkamer and J. Roth, Extracting lines, circular segments and clusters from radar pictures in real time for an autonomous mobile robot, Euromicro '91, Workshop on Real-Time Systems, Paris-Orsay (1991).
[9] M. Buchberger, K.-W. Jörg and E. von Puttkamer, Laser radar and sonar based world modeling and motion control for fast obstacle avoidance of the autonomous mobile robot MOBOT-IV, Proc. of the IEEE Int. Conf. on Robotics and Automation, Atlanta, GA (1993).

Klaus-Werner Jörg received the Dipl.-Inform. and the Dr.-Ing. degrees from the Department of Computer Science of the University of Kaiserslautern, Germany, in 1988 and 1993, respectively. He is currently a lecturer in computer science and robotics at the University of Kaiserslautern. His research interests include autonomous mobile robots, robot sensing and sensor data fusion. Dr. Jörg is a member of the IEEE.