
Copyright © IFAC Information Control in Manufacturing, Nancy - Metz, France, 1998

DESIGN A GENERIC SIMULATION PLATFORM FOR DRS EXPERIMENTATION AND DEVELOPMENT

Jing Wang

Bourns College of Engineering, University of California at Riverside, Riverside, CA 92521, USA - e-mail: [email protected]

Abstract: The high cost of acquiring and operating a collection of advanced autonomous mobile robots for distributed robotic system (DRS) experiments stimulates interest in developing a generic, real-time, virtual reality based simulation platform. The specification and design principles for such a system are presented. New concepts introduced in the paper include levels of users for the simulation platform; abstract sensors/actuators; the bandwidth of sensing/actuating devices, the bandwidth of a robot, and that of the simulation platform; integrating hard systems with computer simulation; and simulated concurrency. A multiprocessor based DRS simulation platform, currently being developed at the Robotics Laboratory, UC Riverside, is used as an example. Copyright © 1998 IFAC

Keywords: Robotics, Distributed Simulation, Distributed Control, Distributed Artificial Intelligence, Virtual Reality

1. INTRODUCTION

1.1 Distributed Robotic Systems (DRS)

A mobile robot based distributed robotic system (DRS) consists of multiple autonomous mobile robots under decentralized control. No centralized mechanism, such as a master robot/CPU, a centralized and shared memory, or a synchronized clock, is present in the system. Each robot has to operate autonomously, reacting to its local environment, while the robots must cooperate and coordinate through inter-robot communication to accomplish any system-wide task.

Research on DRS has in recent years become very active (Asama, et al., 1994); such systems are expected to play an important role in application areas such as space exploration, hazardous material manipulation, military and law enforcement, nuclear power station management, and flexible manufacturing, where high reliability, redundancy and parallelism are required.

1.2 DRS Development and Experiments

Experiments are certainly an important part of DRS research. A fully distributed multi-agent algorithm must be verified through experiments, during which its functionality and performance are evaluated and, more often than not, flaws are found. On the other hand, developing and operating such a system is prohibitively time-consuming and expensive. Many DRS algorithms designed for a large collection of robots are forced to be exercised on a very small number (2-3) of robotic units, often yielding meaningless results.

1.3 Motivation

Computer simulation has the advantage of allowing sophisticated robotic experiments to be conducted without actually building and operating real robots. It may also be used to shield technical difficulties imposed by the limits of current technology.

Many DRS algorithms make assumptions about sensors and actuators whose functionality and/or performance are well beyond today's technology. These algorithms, though not realizable with today's robots, are still considered valuable for understanding the behavior of a distributed robotic system. It is therefore necessary to verify these algorithms with the technical difficulties shielded.

Historically, computer simulations written for DRS algorithms have not been considered valid and convincing substitutes for real robotic experiments because:
• Most simulations were written for a specific algorithm and are not generic.
• Many simulations are not even valid, in the sense that they inadvertently introduce either a synchronized clock or a serialization of all instantaneous/non-instantaneous events, which a realistic multi-agent system does not enjoy; a DRS algorithm that takes advantage of these mechanisms fails once ported to real robots.
• Some simulators make unrealistic assumptions in exchange for the simplicity of a theoretical algorithm; for instance, robots operating on a discrete grid using error-free sensors and actuators. These assumptions are considered too ideal by most roboticists.


1.4 Current Technology and Practices

Rapid technology development in VR and graphical animation can make today's robot simulations look very real, so much so that even computer vision researchers are now accepting synthesized images as their system inputs, which in the past were obtained only through analog video sources. On the other hand, existing animation systems (such as those used in film making) and simulation platforms (such as pilot training systems) are not suitable for simulating a DRS because:
• Many of these systems (such as a pilot training system) are meant to provide a view to a single subjective agent (the operator).
• If more than one active agent is involved (such as in a video game), the concern of most systems is limited to satisfying the human naked eye.
• Many asynchronous, inherently concurrent events are in fact serialized.

The goal is to port robot task programs running on computer simulation to real robot experiments, which, when operating, do not enjoy system characteristics such as event serialization and synchronization. Neither theory nor technical help is available at this time for implementing a real-time computer graphical simulation involving multiple robots, in which events are highly concurrent and objects are highly dynamic in a shared operating space.

It is therefore urgent for the DRS community to specify, design and implement a valid, generic simulation/development platform supporting both DRS experiments and development.

1.5 Paper Organization

Section 2 addresses the requirements of a DRS simulation platform. Several new concepts are introduced in Section 3. In Section 4, the software architecture of the DRS simulation platform developed at UCR is presented.

2. DESIGN PRINCIPLES

2.1 Generality

An ideal DRS simulation platform must allow its users to model their robots and the operating environment. It must support virtually arbitrary task experiments. Pre-defined or dynamically generated "scenes" may be invoked, which includes placing objects (movable and/or fixed obstacles and robots) in the field of operation as part of the initialization process. It must allow, at run time, dynamic and/or interactive introduction of external "events" and objects (obstacles and robots).

As the design and implementation of a good 3-D real-time graphics based simulation platform is a non-trivial software project, its development cost can only be recovered through wide acceptance by the DRS research community. To fulfill this goal, the simulation platform must relieve its users of many technical difficulties, such as real-time 3-D graphics, concurrency, and development tools, so that they may concentrate on robot design, task programming and integrated experiments.

2.2 Validity

One major design goal for the DRS simulation platform is to create a valid operating environment fully consistent with the principles of the DRS model, ensuring that all algorithms developed on it can be ported to experiments and operations employing real robots. In particular, the platform must not introduce a synchronized clock or a serialization of inherently concurrent events.

2.3 Real-time 3-D Graphics

The simulation must support 3-D color graphics animation of all movable objects with kinematic considerations. Physical laws concerning timing, gravity, friction and robot dynamics, if relevant to the experiment, must be implemented. The graphics display must run in real time, or in scaled real time. A recording and playback facility must be furnished to allow reviewing of robot experiments.

2.4 Task Program Development

The simulation platform must serve DRS algorithm development by supporting robot task program debugging, testing and evaluation in an integrated environment. Mechanisms must be provided to allow time stamping, event logging, breaking and tracing in a distributed environment for debugging and performance evaluation. The execution of a DRS algorithm on a particular robot may be monitored through a "peep-hole" window. This mechanism should be generalized to allow a user to interactively operate a robot while observing the overall system behavior at the same time.

2.5 Robot Development

A generic DRS simulation platform must support the technical specification, design, verification and testing of the mobile robots to be used in a DRS experiment.

3. SOME BASIC CONCEPTS

3.1 Levels of Users

It is realized that a DRS simulation platform will ultimately support four (4) levels of users:
• Robot task programmers: These users give the technical specification for robots, and then design and implement robot task programs based on it. The implementation of robots satisfying this specification is not their concern.
• Robot system developers: These are the people who ultimately must design and build robots according to the technical specification given by the task programmers. While preparing for a simulation based robot experiment, it is their responsibility to establish mathematical models for the sensors and actuators to be employed by the robots. They work according to a manual provided by the simulation, which describes the basic but often powerful sensing and actuating mechanisms supported by the simulation kernel. A simple interface allows robot developers to integrate their sensors and actuators into the simulation.
• Simulation system administrator: The software models of sensors and actuators generated by the robot developers need to be approved, compiled and bundled into the simulation kernel by a simulation system administrator, who manages a library of robot models and operating scenes.
• Simulation system developers: These are in fact not real users, but system developers. They will eventually be phased out once the simulation platform is well established.

3.2 Bandwidth

It has been realized that many robot cooperation and coordination algorithms are highly dependent on the response time and spatial accuracy of sensing/actuating devices, and on those of the robots operating these devices (Wang, 1997). We characterize this as the bandwidth intrinsic to the physical properties of devices and/or robots.

Bandwidth of Sensing/Actuating Devices

The bandwidth of a sensing device is inversely proportional to the response time from the moment a sensor is called upon to the moment the information is returned. For instance, a computer vision based perception system or an ultrasound based proximity sensor may have a bandwidth of 10-50 Hz, whereas a hardware photo sensor could operate at 1 MHz. Since most of today's robots are sampled systems, the usable hardware bandwidth is limited by the sampling rate of the robot operating that device.

The bandwidth of an actuator is inversely proportional to the time interval from the moment an actuation command is issued to the moment its effect is completed. It depends not only on the actuator's logical response to the command, but also on the dynamics of the actuating device. For instance, an LED can be turned on or off in a matter of nanoseconds with no residual effect, whereas a mobile vehicle cannot stop until a significant amount of time after the brake is applied. Here the LED certainly has a much higher bandwidth than the brake on a mobile vehicle.
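As a worked illustration of this relationship, the short Python sketch below (not part of the platform; the function names and the 100 Hz sampling rate are assumptions) computes a device's intrinsic bandwidth as the reciprocal of its response time and caps it at the sampling rate of the robot operating the device.

def device_bandwidth_hz(response_time_s):
    # Intrinsic bandwidth of a sensing or actuating device: 1 / response time.
    return 1.0 / response_time_s

def usable_bandwidth_hz(response_time_s, robot_sampling_hz):
    # A sampled robot cannot exploit a device faster than its own sampling rate.
    return min(device_bandwidth_hz(response_time_s), robot_sampling_hz)

# A vision based sensor with a 0.05 s response time versus a hardware photo
# sensor at 1 MHz, on a robot sampled (hypothetically) at 100 Hz:
print(usable_bandwidth_hz(0.05, 100.0))   # 20.0  Hz, limited by the sensor itself
print(usable_bandwidth_hz(1e-6, 100.0))   # 100.0 Hz, limited by the robot's sampling rate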

Bandwidth of a Robot

The bandwidth of a robot is a measure of its responsiveness to external events. It should be judged through a standard, well accepted evaluation process. It has been recognized that natural creatures usually have a balanced bandwidth among their sensors and actuators; for instance, human perception (in the range of 0.1 second) is well balanced with human decision making and actuating activities. Current robot design practice unfortunately does not guarantee a balanced bandwidth specification; for instance, many fast actuating mechanisms are mismatched with slow vision systems.

Bandwidth of the Simulation Platform

The required bandwidth of the simulation platform is statistically proportional to the number of robots in the system, as well as to the bandwidth of the individual robots. To simulate an effect such as the collision of two rigid objects, an infinitely small response time, i.e. an infinitely wide bandwidth, would be required. In Fig. 1, the two vehicles would in fact have collided in the real world, but the collision may not be shown correctly in simulation if a fixed, finite bandwidth is employed. We define the time bandwidth of a simulation platform to be inversely proportional to the smallest time interval by which the system can advance its (logical) simulation clock. Notice that if real-time simulation is not required, the logical simulation clock can be advanced in virtually infinitely small time intervals, achieving an infinitely wide bandwidth. On the other hand, allowing a virtually infinite bandwidth does not imply that the system is infinitely slow: a high volume of events does not arrive at the system constantly, demanding indefinite attention, and near real-time performance can still be achieved if high frequency events occupy only a small portion of the simulation time.
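The effect shown in Fig. 1 can be reproduced with the following minimal Python sketch (assumed one-dimensional motion and made-up numbers): with a coarse simulation step the two objects pass through each other between snapshots, whereas a finer step, i.e. a wider time bandwidth, detects the collision.

def collided(dt, duration=1.0):
    x_a, v_a = 0.0, 4.0      # object A moving to the right
    x_b, v_b = 1.0, -4.0     # object B moving to the left
    radius = 0.1
    t = 0.0
    while t < duration:
        x_a += v_a * dt
        x_b += v_b * dt
        t += dt
        if abs(x_a - x_b) < 2 * radius:
            return True      # overlap observed at this snapshot
    return False

print(collided(dt=0.25))     # False: the collision falls between two snapshots
print(collided(dt=0.01))     # True:  the finer step (wider bandwidth) catches it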

Fig. 1 Collision not detected due to the finite bandwidth of the simulation platform: (a) snapshot n; (b) snapshot n+1


We define the spatial bandwidth of a simulation platform to be the reciprocal of the system's spatial resolution. A collision of two objects over an area or volume larger than the spatial resolution is guaranteed to be detected.

3.3 Concurrency

It is important in a DRS simulation not to serialize concurrent events, such as Event 1 and Event 2 shown in Fig. 2. Nevertheless, since a processor is inherently a serial computation device with only finite bandwidth, it can handle only one event at a time. To resolve this dilemma, a non-instantaneous event must be decomposed into a sequence of instantaneous ones, which can then be serialized. (In an asynchronous system, no two events happen at exactly the same time.)
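A minimal sketch of this decomposition is given below (the event representation and the use of a sorted list are assumptions of this sketch, not the platform's actual mechanism): each non-instantaneous event is split into instantaneous "start" and "end" events stamped with a logical time, and the instantaneous events are then serialized by that stamp.

from dataclasses import dataclass

@dataclass
class Interval:
    name: str
    start: float
    end: float

def serialize(intervals):
    # Decompose each non-instantaneous event into two instantaneous ones,
    # then serialize all instantaneous events along the logical time axis.
    instants = []
    for iv in intervals:
        instants.append((iv.start, "start", iv.name))
        instants.append((iv.end, "end", iv.name))
    return sorted(instants)

# Two overlapping (concurrent) events, as in Fig. 2:
for t, kind, name in serialize([Interval("Event 1", 0.0, 0.75),
                                Interval("Event 2", 0.3, 1.10)]):
    print("t=%.2f  %-5s %s" % (t, kind, name))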



Fig. 2 Concurrent, non-instantaneous events (start Event 1, start Event 2, end Event 1, end Event 2) are decomposed into instantaneous ones

The resolution (the time interval between two consecutive events) depends on the bandwidth of the simulation platform. If real time is not required, this interval can be made arbitrarily small.

3.4 Abstract Sensors and Actuators

A wide variety of sensors and actuators have been assumed by DRS algorithm developers. Some of them can be implemented with today's technology; others cannot. An assumed sensor is sometimes so magical that it is intrinsically unimplementable. Moreover, the requirements (technical specification) for a device may change faster than anyone can ever work on it.

This scenario stimulates the concept of abstract sensors and abstract actuators used in a DRS simulation. An abstract sensor or actuator is the mathematical model of its respective physical counterpart, whether or not that counterpart is ever built. The following are two examples:

• a passive proximity sensor that measures the distance and direction of the object closest to the robot (precision 1%, effective range 1-10 m, response time 0.1 s);
• a "sign-board" device on which a robot can post a message; the state of the sign-board is considered changed instantaneously, some random time after a post command.

Just as in a real robotic system, an abstract sensor can be hierarchical, as information obtained from various sensors is fused to derive higher level information. Some abstract sensors may be shared, such as S3 in Fig. 3. Others may even include actuators (A1 in Fig. 3).

Fig. 3 Abstract sensors may be hierarchical

Rather than extensively modeling the operating field and its response to various sensors and actuators, it is usually sufficient to add a pre-specified, statistical error to the sensing and/or actuating effects, while observing the system's ability to overcome or compensate for these mechatronic deficiencies.
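To make the idea concrete, here is a minimal Python sketch of the two example devices above; the class names, the Gaussian noise model for the 1% precision, and the delay model for the sign-board are assumptions of this sketch, not part of the platform.

import random

class AbstractProximitySensor:
    # Distance and direction of the closest object: precision ~1%,
    # effective range 1-10 m, response time 0.1 s (returned so that the
    # kernel can delay the reply accordingly).
    RESPONSE_TIME_S = 0.1

    def read(self, true_distance_m, true_bearing_rad):
        if not 1.0 <= true_distance_m <= 10.0:
            return None                              # outside the effective range
        noisy = true_distance_m * (1.0 + random.gauss(0.0, 0.01))
        return noisy, true_bearing_rad, self.RESPONSE_TIME_S

class SignBoard:
    # A posted message becomes readable some random time after the post command.
    def __init__(self):
        self._pending = []                           # (visible_at, message)
        self._state = ""

    def post(self, message, now):
        self._pending.append((now + random.uniform(0.0, 0.05), message))

    def read(self, now):
        for item in list(self._pending):
            if now >= item[0]:
                self._state = item[1]
                self._pending.remove(item)
        return self._state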

3.5 Integrating Hard Systems with Simulation

Some abstract sensors and actuators will eventually be built, after extensive testing of a DRS algorithm. During their development, a device, or part of a device, can be integrated with the simulation platform for the evaluation of its functionality and/or performance.

In Fig. 4, a device is first mathematically modeled and then "plugged" into the computer simulation. The real device, however, presents slightly different characteristics than the theoretical model. An empirical model, established from experiments involving the hard system, is then used in the simulation for more accurate results.

Fig. 4 Establishing an empirical model for a DUT: the specification (mathematical model) drives the simulation, while measurements taken from experiments with the real hard system yield an empirical model that is fed back into the simulation

The concept of integrating real hard systems with computer simulation is not new to control engineers, where the terminology is "hardware in the loop". In Fig. 5, a real robot (KAMRO of the University of Karlsruhe) operates side by side with many virtual robots in a virtual field. All sensors (in this case, 16 ultrasound sensors) are replaced by virtual, abstract sensors interfaced with the main processor on board the robot. The abstract sensors, driven by the real robot, operate in the virtual scene. All motion commands from the robot are intercepted and delivered to the simulation platform to drive abstract actuators, which "move" the graphical objects (the robot and its arms).

Fig. 5 Integrating a real robot into the simulation: a real robot operates alongside virtual robots and virtual obstacles in the DRS simulation

4. SIMULATION PLATFORM DESIGN

The DRS simulation platform developed by the author and his group is described in this section. The emphasis of the design, in addition to validity, is on modularity, expandability and simplicity.

4.1 Overall Architecture

A multi-processor, networked architecture has been employed to distribute the intensive computation required by the simulation platform. The main system modules are:

• Unix operating system (OS)
• console and graphical monitor (C/M)
• simulation kernel (Kernel)
• robot processes

The system design and implementation allow further distribution of the computation to more processors. At the other extreme, the entire system can run, without modification, on a single computer. Porting the simulation platform to other SGI based systems is immediate.

Fig. 6 Multi-processor based simulation platform: the console/monitor runs on a 4-processor SGI Onyx Reality Engine 2, networked with the processors hosting the robot processes

4.2 Monitor

Written in SGI's Open Inventor, the display monitor process runs on an SGI Onyx Reality Engine 2 graphics supercomputer. Designed as a pure monitor to the simulation, this process samples the system at regulated time intervals and displays the sampled operating scene on the graphics screen. It does not have to run along with the simulation.

Displaying in 3-D at all times, this module allows viewing of the operating scene from any distance and angle. The main functionality of the display monitor is:
• to serve as an interactive scene editor, which allows a user to set up a virtual DRS experiment by placing fixed and movable obstacles and robots from a "warehouse" into the operating environment; an operating scene can be saved into, and retrieved from, files;
• to work as a simulation console for switching between scene editing mode and simulation mode;
• to monitor the dynamic simulation scene when the system is operating.

The main data structure used in the graphics monitor process is a directed acyclic graph (DAG), which is traversed periodically by the graphical engine of Open Inventor. A node of the DAG represents an object and/or an operation in the scene. The attributes of a node contain characteristics of the object, such as shape, dimension and color. The construction and manipulation of this DAG according to Open Inventor is outside the scope of this paper; interested readers should consult (SGI, 1995).

The DAG is modified in real time by a stream of operations commanded by another module called the Simulation Kernel. Major and minor time markers, whose duration is definable by the user, are inserted into the stream, allowing synchronized graphics updates in real time.

This stream may be saved into a file and played back, using a digital "VCR", at normal speed (display updated upon every major time marker), in slow motion (display updated upon every minor time marker), or in event mode, where the display is modified upon every updating command. The following is an example of a portion of this type of stream:

...TEEEEtEttETtttTEEtEEtEEtEEEEET...

where T denotes a major time marker (constant), t a minor time marker (constant), and E represents an event or a graphical command (variable). In this case there are four sub time intervals, separated by minor time markers, within a main time interval.
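A digital "VCR" playback over such a stream could look like the following Python sketch; the single-character encoding T/t/E is taken from the example above, while the function name, the mode names and the update callback are assumptions.

def play(stream, mode, update):
    # mode: "normal" -> refresh the display at every major time marker 'T'
    #       "slow"   -> refresh at every minor (and major) time marker
    #       "event"  -> refresh after every event/graphical command 'E'
    for ch in stream:
        if mode == "event" and ch == "E":
            update()
        elif mode == "slow" and ch in ("t", "T"):
            update()
        elif mode == "normal" and ch == "T":
            update()

frames = []
play("TEEEEtEttETtttTEEtEEtEEtEEEEET", "normal", lambda: frames.append("redraw"))
print(len(frames))    # one display refresh per major time marker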

4.3 Simulation Kernel

The Simulation Kernel and its relationship with the other modules of the simulation platform are illustrated in Fig. 7.

Fig. 7 The Simulation Kernel and its relationship to other system components: the kernel contains the soft clock and the simulated sensors/actuators (built from their technical specifications), emits the display stream to the simulation console, and serves the task programs of robots R1, R2, ..., RN through the interface to the Simulation Kernel

Scheduler

The Scheduler serializes all instantaneous events at a resolution comparable to the time and spatial bandwidth of the system. Responsible for scheduling all simulated events, it also checks for possible collisions between robots and objects. Major and minor time markers are emitted to the display stream at regulated (logical) time intervals.

The Scheduler is driven by a logical software clock, which implies that the kernel may not run in real time. However, a real-time graphical display can still be achieved, as the graphics display is time synchronized with the markers in the stream. This method has the advantage of smoothing out the uneven occurrence of events in the operating scene.
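The Scheduler's loop might be organized roughly as in the Python sketch below (the class name, the event queue and the marker spacing are assumptions of this sketch): events due in the current sub-interval are serialized and dispatched, a collision check is run, and a minor or major time marker is appended to the display stream as the logical clock advances.

import heapq, itertools

class Scheduler:
    def __init__(self, stream, major_dt=0.1, minors_per_major=4):
        self.stream = stream                  # e.g. a list collecting 'T'/'t'/'E'
        self.minor_dt = major_dt / minors_per_major
        self.minors_per_major = minors_per_major
        self.clock = 0.0                      # logical simulation time
        self.queue = []                       # (time, seq, callable)
        self._seq = itertools.count()         # tie-breaker for equal time stamps

    def post(self, at, event):
        heapq.heappush(self.queue, (at, next(self._seq), event))

    def run(self, until, check_collisions=lambda: None):
        minors = 0
        while self.clock < until:
            next_tick = self.clock + self.minor_dt
            # serialize and dispatch every instantaneous event due in this sub-interval
            while self.queue and self.queue[0][0] <= next_tick:
                _, _, event = heapq.heappop(self.queue)
                event()
                self.stream.append("E")
            check_collisions()
            self.clock = next_tick
            minors += 1
            self.stream.append("T" if minors % self.minors_per_major == 0 else "t")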

Simulated Sensors/Actuators

When a robot is loaded into the simulation, its simulated sensors and actuators are installed in the simulation kernel. The latter takes a sensing command from a robot (through the robot/kernel interface), consults the scene database for information, and returns the "sensed" information back to the robot within its pre-specified response time. Upon an actuation command from a robot, the kernel "actuates" the proper devices in the scene database, and the effect is reflected instantly on the graphics display.
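The dispatch just described could be sketched as follows (an assumption of this sketch, not the platform's actual interface): the kernel looks up the simulated device installed for a robot, consults the scene database, and delivers the sensed value only after the device's pre-specified response time has elapsed on the logical clock.

class SimulatedKernel:
    def __init__(self, scene_db, scheduler):
        self.scene_db = scene_db
        self.scheduler = scheduler
        self.devices = {}                     # (robot_id, device_id) -> device model

    def install(self, robot_id, device_id, device):
        # Called when a robot is loaded into the simulation.
        self.devices[(robot_id, device_id)] = device

    def sense(self, robot_id, device_id, reply):
        device = self.devices[(robot_id, device_id)]
        value = device.read(self.scene_db)    # consult the scene database
        # deliver the value only after the device's pre-specified response time
        self.scheduler.post(self.scheduler.clock + device.RESPONSE_TIME_S,
                            lambda: reply(value))

    def actuate(self, robot_id, device_id, command):
        device = self.devices[(robot_id, device_id)]
        device.apply(command, self.scene_db)  # the effect is emitted to the display stream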
Robot/Kernel Interface

The Robot/Kernel Interface allows a simulated robot to pass a sensing or actuation command to the Simulation Kernel, and to obtain from it the returned sensory information or an acknowledgment of an actuation. The interface contains one routine:

ret = kernel(...)

The robot builder will have to use this routine to operate the sensors and actuators installed in the simulation kernel. Specifically, he must:
• write a collection of sub-programs, in the form of an OOP class, callable from robot task programs. Running with the robot processes, these sub-programs all call the routine kernel() to access information in the simulation kernel (a rough sketch of such wrapper classes follows below);
• write a class to serve the sensor/actuator commands initiated by the robot processes through the robot/kernel interface. Linked with the kernel, the routines in this class access information in the scene database to achieve the desired sensing/actuating effects. Access to the scene database is provided by the simulation platform. This class of programs must be approved by the simulation system manager before it can be integrated with the simulation kernel.
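Below is a rough sketch of the first, robot-side class; since the argument list of kernel() is not legible in the source, the signature used here (a device name, a command, and optional parameters) is purely hypothetical.

def kernel(device, command, params=None):
    # Placeholder for the real robot/kernel interface routine; its actual
    # argument list is defined by the simulation platform.
    raise NotImplementedError

class ProximitySensorProxy:
    # Callable from robot task programs; every call goes through kernel().
    def __init__(self, device_name="proximity_0"):
        self.device = device_name

    def closest_object(self):
        return kernel(self.device, "sense")

class DriveProxy:
    # Actuator wrapper: translates a task-level motion request into a kernel call.
    def __init__(self, device_name="drive_0"):
        self.device = device_name

    def move_to(self, x, y):
        return kernel(self.device, "actuate", {"target": (x, y)})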



Scene Database

The Scene Database is the operating world of the robot simulation. Built-in, simple but powerful data structures and operating primitives are used to support all sensing and actuation activities. For instance, there is a data structure on which all spatial reasoning by a sensor can be based. The platform allows users to submit special-purpose scene database structures and operating primitives, such as a mechanism supporting sign-board based inter-robot communication (Wang and Premvuti, 1994).
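A minimal sketch of one such spatial reasoning primitive is given below (the class name and the brute-force nearest-object query are assumptions of this sketch); an abstract proximity sensor, for example, could be served entirely by a query of this kind.

import math

class SceneDatabase:
    def __init__(self):
        self.objects = {}                     # name -> (x, y) position

    def place(self, name, x, y):
        self.objects[name] = (x, y)

    def nearest(self, x, y, exclude=None):
        # Closest object to (x, y): the kind of primitive on which an abstract
        # proximity sensor's spatial reasoning could be based.
        best, best_d = None, math.inf
        for name, (ox, oy) in self.objects.items():
            if name == exclude:
                continue
            d = math.hypot(ox - x, oy - y)
            if d < best_d:
                best, best_d = name, d
        return best, best_d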

Stream Generation

A stream that contains the major and minor time markers and the graphical display commands is generated by the Stream Generator. At the user's discretion, this stream may contain other event logs (messages) not necessarily related to the graphics display. These event logs can be recorded and played back along with the graphics information at a later time for algorithm debugging, performance evaluation, or demonstration/review purposes.

5. ON-GOING DEVELOPMENT

Several experiments are being developed at this time:
• distributed traffic control based on (Wang and Premvuti, 1994)
• micro-robot soccer (KAIST, 1996)
• pattern forming with mobile robots (Sugihara and Suzuki, 1987)
• mobile robot motion planning (Harinarayan and Lumelsky, 1994)

6. CONCLUSION

This paper introduces several important concepts in designing and operating a computer simulation platform for DRS development and experiments. The DRS simulation platform developed by the author's research group is reported.

7. ACKNOWLEDGMENTS

The author would like to thank Dr. Tim Lueth of Humboldt Univ., Dr. Matthew Barth, Mr. Abdul Tabbara and Mr. Andrew Forslund of the Robotics Laboratory at UCR for valuable discussions.

This work is partially supported by the US Army Research Office through grant DAAH04-94-G-0011, and by NATO through grant CRG-950340.

REFERENCES

Asama, H., T. Fukuda, T. Arai and I. Endo, editors (1994). Distributed and Autonomous Robotic Systems. Springer-Verlag, Tokyo.

Harinarayan, K. and V. Lumelsky (1994). Sensor-based Motion Planning for Multiple Mobile Robots in an Uncertain Environment. Proc. of IROS'94, Munich, Germany, Sept. 12-16, 1994, pp. 1485-1492.

KAIST (1996). Micro-robot Soccer Tournament.

SGI (1995). Open Inventor Reference Manual.

Sugihara, K. and I. Suzuki (1987). Distributed Motion Coordination of Multiple Mobile Robots. Proc. of the 5th IEEE International Symposium on Intelligent Control, Sept. 1987, Philadelphia, PA, pp. 138-143.

Wang, J. and S. Premvuti (1994). Fully Distributed Traffic Regulation and Control of Multiple Mobile Robots in Discrete Space. In: Distributed and Autonomous Robotic Systems (H. Asama, T. Fukuda, T. Arai and I. Endo, eds.). Springer-Verlag, Tokyo.

Wang, J. (1997). Robot Bandwidth and Cooperation under DRS. In preparation.