Hardware-based, Trajectory-Controlled Visual Servoing of Mobile Microrobots


5th IFAC Symposium on Mechatronic Systems, Marriott Boston Cambridge, Cambridge, MA, USA, Sept. 13-15, 2010

Daniel Jasper, Claas Diederichs, Sergej Fatikow
Division Microrobotics and Control Engineering, Oldenburg University, Oldenburg, Germany (e-mail: [email protected])

Abstract: Visual servoing based on camera or microscopic feedback has become a widely acknowledged technique for the positioning of robots. With a hardware-based image processing and tracking architecture, classic timing downsides such as low update rates, latency and jitter can be bypassed, making visual servoing efficient. Applying this tracking to mobile nanohandling robots, a compact high-performance micro- and nanopositioning system is realized. Trajectory control becomes feasible due to the deterministic timing behavior of the position tracking as well as the reliable open-loop control of the robots. Measurement results demonstrate the excellent characteristics of the CMOS camera-based tracking system, the mobile nanohandling robots and the trajectory-controlled positioning. Although initially aimed at coarse positioning, the system achieves accuracies below 1 µm.

Keywords: Tracking, FPGA, trajectory control, signal generation

1. INTRODUCTION


Many applications on the micro- or nanoscale rely on visual feedback because the required position information of tools and specimens cannot be provided by other sensors. Many robotic systems do not have internal sensors, or their sensors might have insufficient accuracy. Furthermore, mounting inaccuracies and time-variant effects such as drift can render the sensors useless. If a robot is closed-loop controlled using an imaging device as sensor, this is called visual servoing. For micro- and nanohandling, visual servoing can be accomplished using cameras for coarse positioning (Estaña and Wörn, 2003) and optical or electron microscopes for fine positioning (Sievers and Fatikow, 2006). A major downside of current visual servoing implementations is the comparatively slow and nondeterministic timing behavior of the image processing.

The crucial steps during visual servoing are image acquisition and image processing. Image acquisition requires a camera to read all pixel sensors and transfer the obtained data into the memory of a computer. The computer then performs image processing operations to calculate the pose information of the tracked robot. Both operations consume a considerable amount of time, leading to latency in the control loop. Furthermore, if the involved computers and communication protocols are not real-time capable, there is an unpredictable jitter on the update rate. Lastly, due to the amount of transferred data and the required processing time, there is a limit on the achievable update rate of such a system. This is a severe limitation, especially for high-speed cameras that can deliver more than 50 images per second.

This paper describes a mechatronic system architecture that uses a camera to position a mobile nanohandling robot (see Fig. 1a). The robot's position can be determined using two infrared LEDs mounted to its bottom, which are visible as bright regions in the camera's image (see Fig. 1b). By calculating the weighted center of gravity of these regions (Eichhorn et al., 2008), the position can be extracted with sub-pixel accuracy. A new algorithm for the calculation of the weighted center of gravity is presented which can be implemented in hardware, thus minimizing latency and jitter.

An initial open- and closed-loop control of the mobile nanohandling robots was first presented in Jasper and Edeler (2008). In this paper, a more accurate open-loop control combining rotational and translational movement is derived and supplemented by characterization measurements. Based on the precise open-loop control, trajectory-controlled visual servoing is implemented, capitalizing on the predictable sensor system.

Fig. 1. a) Bottom-up view of the mobile robot. b) Image taken by the tracking camera with infrared filter. (Scale bars: 10 mm.)


2. HARDWARE-BASED LED-TRACKING

To show how these limitations can be overcome, the tracking of infrared LEDs mounted to a mobile nanohandling robot is used as an example (see Fig. 1a). The two infrared LEDs are observed by a CMOS camera, leading to the two bright regions in Fig. 1b. The robot's position can be extracted precisely by calculating the weighted center of gravity of the two LED regions. In Eichhorn et al. (2008), the LED regions are detected using a flood-fill algorithm. This is a so-called offline algorithm, which requires the entire image to be available in memory. Instead, for the hardware implementation, a stream algorithm was developed that processes each pixel directly after reception from the camera. The weighted center of gravity (c_x, c_y) of a region containing N pixels p_i is calculated by:

\[
c_x = \frac{s_x}{s_v} = \frac{\sum_{i=0}^{N} x(p_i)\, v(p_i)}{\sum_{i=0}^{N} v(p_i)} \tag{1}
\]
\[
c_y = \frac{s_y}{s_v} = \frac{\sum_{i=0}^{N} y(p_i)\, v(p_i)}{\sum_{i=0}^{N} v(p_i)} \tag{2}
\]
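As a small worked instance of Eqs. (1) and (2), consider a hypothetical region of three pixels in row y = 4 at x = 10, 11, 12 with gray values 100, 200 and 300:

\[
s_v = 100 + 200 + 300 = 600, \qquad
s_x = 10\cdot 100 + 11\cdot 200 + 12\cdot 300 = 6800, \qquad
s_y = 4 \cdot 600 = 2400,
\]
\[
c_x = \frac{6800}{600} \approx 11.33\,\mathrm{px}, \qquad
c_y = \frac{2400}{600} = 4\,\mathrm{px},
\]

i.e. the brighter right-hand pixel pulls the center of gravity to a sub-pixel position to the right of the geometric center.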

x(p_i) and y(p_i) are the x- and y-coordinates of p_i, whereas v(p_i) is p_i's gray value. The goal of the algorithm is to map each pixel p_{x,y} of the image with v(p_{x,y}) exceeding a certain threshold to a region R_i and to calculate the values c_x and c_y for each region. The concept of the algorithm is sketched in Fig. 2. It uses two memories. A line memory stores the region each pixel of the previous line was attributed to (gray shading in Fig. 2). A region memory stores the values s_x, s_y and s_v for each of the regions. Thus, the following steps are executed upon reception of each pixel p:

(1) If v(p) is below the threshold, skip the remaining steps.
(2) Consider the pixels immediately above and to the left of p:
    (a) If neither pixel belongs to a region, p belongs to a new region.
    (b) If both pixels belong to the same region, or only one pixel belongs to a region, p belongs to that region.
    (c) If the two pixels belong to different regions, the regions need to be joined, i.e. the values s_x, s_y and s_v in the region memory need to be summed.
(3) Add the values of p to s_x, s_y and s_v of p's region.
(4) Overwrite the line memory with p's region.

After reading the last pixel, the values c_x and c_y can be calculated for each region using two divisions. For small numbers of regions, this can be done with negligible delay. In a hardware-based implementation, all of these steps need to be executed at the pixel clock frequency. In particular, step (2c) is challenging, as the joining of regions requires multiple memory operations and several cases have to be distinguished. This is implemented in a pipeline architecture that is described in detail in Diederichs (2010). A software model of the stream algorithm is sketched below.
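To make the stream processing concrete, the following Python sketch emulates the per-pixel steps above on a gray-value image stored as a list of rows. It is only a software model of the hardware pipeline, not the FPGA implementation; the threshold handling and the region-joining bookkeeping (a dictionary of accumulators plus an alias table) are assumptions chosen for readability.

```python
def track_regions(image, threshold):
    """Stream-process an image row by row and return the weighted
    center of gravity (cx, cy) of every bright region.

    image: list of rows, each row a list of gray values
    threshold: minimum gray value for a pixel to count as bright
    """
    width = len(image[0])
    line = [None] * width          # region label of each pixel of the previous line
    regions = {}                   # label -> [sx, sy, sv]
    alias = {}                     # joined labels point to their surviving label
    next_label = 0

    def resolve(label):
        # follow the alias chain created by region joins
        while label in alias:
            label = alias[label]
        return label

    for y, row in enumerate(image):
        current = [None] * width
        for x, v in enumerate(row):
            if v < threshold:                               # step (1)
                continue
            above = resolve(line[x]) if line[x] is not None else None
            left = current[x - 1] if x > 0 else None
            left = resolve(left) if left is not None else None

            if above is None and left is None:              # step (2a): new region
                label = next_label
                next_label += 1
                regions[label] = [0, 0, 0]
            elif above is None or left is None or above == left:
                label = above if above is not None else left  # step (2b)
            else:                                           # step (2c): join regions
                label, other = left, above
                ro, rl = regions.pop(other), regions[label]
                regions[label] = [rl[0] + ro[0], rl[1] + ro[1], rl[2] + ro[2]]
                alias[other] = label

            sx, sy, sv = regions[label]
            regions[label] = [sx + x * v, sy + y * v, sv + v]  # step (3)
            current[x] = label                                 # step (4)
        line = current

    # final divisions per region, Eqs. (1) and (2)
    return {lbl: (sx / sv, sy / sv) for lbl, (sx, sy, sv) in regions.items()}
```

On a frame with the two LED spots of Fig. 1b, the function returns two sub-pixel centroids; as described above, the per-pixel steps run at the pixel clock in the hardware implementation and the two divisions per region are performed only once per frame.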

Fig. 2. Center of gravity algorithm. For pixel p, the regions to the left of and above p (solid squares) need to be examined.

3. CLOSED-LOOP CONTROL

The closed-loop control based on the hardware-based tracking is demonstrated for a mobile nanohandling robot. The nanohandling robots perform a stepwise motion based on the stick-slip actuation principle. Fig. 3 shows a bottom-up view of the actuation unit based on laser-structured piezoceramic actuators (Edeler et al., 2008). Three active piezo segments can rotate a ruby hemisphere around its center point (Fig. 3 left). A single actuation unit (Fig. 3 middle) is comprised of three such ruby hemispheres. These hemispheres in turn can rotate a steel sphere using a friction contact. Three such steel sphere actuators are embedded into a printed circuit board (PCB) and carry the mobile robot (Fig. 3 right). Additionally, two infrared LEDs are mounted to the PCB to facilitate the tracking described in the previous section. Such a robot presents a particularly challenging control problem because multiple degrees of freedom need to be controlled using control channels that are not independent. Furthermore, environmental influences such as the cabling and contaminations on the working surface significantly influence the robot's behavior. To simplify the control, the 27 active piezoceramic segments of a robot are grouped into six channels. The mobile robots and their control channels have been extensively described in Jasper and Edeler (2008).

3.1 Open-loop control

The motion behavior of such a robot is influenced by the signal amplitude and frequency. With the signal amplitude, the step length and movement direction in all degrees of freedom are controlled. The six channels of a mobile robot are shown in Fig. 4. For purely translational movement, the channels are further grouped into three channels as shown in Fig. 4b. For rotational movements, only two channels are required (see Fig. 4c). Based on geometric considerations, the following equations model the motion behavior:

\[
m_x = \frac{u_1 - (u_2 + u_3)\,\sin 60^\circ}{1200\,\mathrm{Vpp}}, \quad
m_y = \frac{(u_2 - u_3)\,\cos 60^\circ}{1200\,\mathrm{Vpp}}, \quad
m_\varphi = \frac{u_a - u_b}{1800\,\mathrm{Vpp}}, \tag{3}
\]

with u_1 = u_{1a} + u_{1b}, u_a = u_{1a} + u_{2a} + u_{3a}, etc. Dividing by 1200 Vpp and 1800 Vpp, respectively, normalizes the maximum step length m for a given maximum amplitude of 300 Vpp. Fig. 5 shows the maximum normalized step lengths in all degrees of freedom, taking into account the 300 Vpp amplitude limit on each of the channels u_{1a} through u_{3b}.
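For illustration, the following sketch evaluates Eq. (3) for a given set of the six channel amplitudes. Representing polarity as the sign of the amplitude and the function name are assumptions of this sketch; the example reproduces the purely rotational case (m_ϕ = 1) discussed with Fig. 5 below.

```python
import math

V_TRANS = 1200.0   # Vpp, normalization constant for translation (Eq. 3)
V_ROT = 1800.0     # Vpp, normalization constant for rotation (Eq. 3)

def normalized_motion(u1a, u1b, u2a, u2b, u3a, u3b):
    """Evaluate Eq. (3): normalized step components from the six
    channel amplitudes (in Vpp, signed to encode polarity)."""
    u1, u2, u3 = u1a + u1b, u2a + u2b, u3a + u3b
    ua = u1a + u2a + u3a
    ub = u1b + u2b + u3b
    mx = (u1 - (u2 + u3) * math.sin(math.radians(60))) / V_TRANS
    my = (u2 - u3) * math.cos(math.radians(60)) / V_TRANS
    mphi = (ua - ub) / V_ROT
    return mx, my, mphi

# Example: opposite polarity on the a- and b-halves of every channel yields a
# purely rotational step; at the 300 Vpp limit this gives m_phi = 1.
print(normalized_motion(300, -300, 300, -300, 300, -300))   # -> (0.0, 0.0, 1.0)
```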

Fig. 3. Mobile robot actuation unit (active segments, ruby hemispheres, steel spheres, solder contacts, LEDs and PCB).


Fig. 4. a) The six channels of a mobile robot; b) three groups for translational movement; c) two groups for rotational movement.

Without rotation (Fig. 5 left), the maximum step lengths form a hexagon. The hexagonal shape is caused by the mechanical design using three piezo segments to rotate a ruby hemisphere. With an increased rotational component (Fig. 5 right), the translational movement decreases down to purely rotational movement with m_ϕ = ±1. The hexagon shown for m_ϕ = 0.5 demonstrates that the step length can be divided equally between translational and rotational movement.

To validate this model and characterize the robot, multiple actuation parameter sets on the edges of the graph in Fig. 5 were tested. The robot was set to perform 1000 steps at a constant step frequency and the net displacements Δx, Δy and Δϕ were recorded. As 1000 steps lead to rotations smaller than 1.5°, the rotation virtually does not influence the lateral movement's direction or distance. Using the information about the camera resolution and lens (see Section 4), these displacements have been translated into distance measurements and divided by 1000 to obtain the length of a single step. The result is shown in Fig. 6 and closely matches the model. The remaining deviations are caused by manufacturing tolerances of the individual actuators and the force exerted by the robot's cabling. Based on the characterization results, the normalization constants K_N and R_N in Eq. 3 (1200 Vpp and 1800 Vpp, respectively) can be replaced by the constants K and R in order to calculate the correct step length:

\[
K = 1.85\,\mathrm{Vpp/nm} \tag{4}
\]
\[
R = 200\,\mathrm{kVpp/^\circ} \tag{5}
\]

This characterization has been performed at multiple frequencies between 100 Hz and 40 kHz without a significant difference in behavior. Thus, step size and direction are independent of the actuation frequency.

Fig. 5. Normalized maximum step lengths for the different degrees of freedom.

The next step of the open-loop control is the application of the described model in order to move linearly over a certain distance (Δx, Δy, Δϕ) within a specific time Δt. The distances are given in relation to the robot's local coordinate system (see Section 3.2). This is accomplished in multiple steps. First, the ratio between Δx and Δy is translated into the movement angle α, the movement distance Δd and the maximum step length l. l is the possible step length if no rotation is used; the cosine term is caused by the hexagonal shape:

\[
\alpha = \arctan\frac{\Delta y}{\Delta x} \tag{6}
\]
\[
\Delta d = \sqrt{\Delta x^2 + \Delta y^2} \tag{7}
\]
\[
l = 160\,\mathrm{nm} \cdot \cos(\alpha \bmod 30^\circ) \tag{8}
\]

Next, the total number of steps n is calculated based on the number of purely translational steps n_T and the number of purely rotational steps n_R. As there is an inversely proportional relationship between the two step lengths, n is also the number of steps required if the steps are partly translational and partly rotational:

\[
n = n_T + n_R = \frac{\Delta d}{l} + \frac{\Delta \varphi}{0.0015^\circ} \tag{9}
\]

Using n, the normalized motion values m_x, m_y and m_ϕ as well as the actuation frequency f can be calculated:

\[
m_x = \sin(\alpha) \cdot n_T / n \tag{10}
\]
\[
m_y = \cos(\alpha) \cdot n_T / n \tag{11}
\]
\[
m_\varphi = n_R / n \tag{12}
\]
\[
f = n / \Delta t \tag{13}
\]
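The step-planning procedure of Eqs. (6)-(13) can be condensed into a few lines. The sketch below is an illustration only: the use of atan2 for the quadrant, the sign handling of Δϕ and the unit choices are assumptions, while the 160 nm maximum step length and the 0.0015° rotational step are taken from Eqs. (8) and (9).

```python
import math

STEP_TRANS = 160e-9      # m, maximum translational step length (Eq. 8)
STEP_ROT = 0.0015        # degrees per purely rotational step (Eq. 9)

def plan_linear_motion(dx, dy, dphi, dt):
    """Translate a desired displacement (dx, dy in meters, dphi in degrees)
    over the time dt (seconds) into normalized motion values and a step
    frequency, following Eqs. (6)-(13)."""
    alpha = math.degrees(math.atan2(dy, dx))              # Eq. (6)
    dist = math.hypot(dx, dy)                             # Eq. (7)
    l = STEP_TRANS * math.cos(math.radians(alpha % 30))   # Eq. (8)
    n_t = dist / l                                        # translational steps
    n_r = abs(dphi) / STEP_ROT                            # rotational steps
    n = n_t + n_r                                         # Eq. (9)
    if n == 0:
        return 0.0, 0.0, 0.0, 0.0                         # nothing to do
    mx = math.sin(math.radians(alpha)) * n_t / n          # Eq. (10)
    my = math.cos(math.radians(alpha)) * n_t / n          # Eq. (11)
    mphi = math.copysign(n_r / n, dphi)                   # Eq. (12)
    f = n / dt                                            # Eq. (13), steps per second
    return mx, my, mphi, f

# Example: 10 um in x and y plus a 0.5 degree rotation within 50 ms.
print(plan_linear_motion(10e-6, 10e-6, 0.5, 0.05))
```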

With these values and an inversion of Eq. 3, the actuation amplitudes for all channels can be derived.

Fig. 6. Recorded step length during the mobile robot's characterization (x and y steps in nm, ϕ steps in 10^-3 °).


3.2 Closed-loop trajectory control

The goal of the closed-loop trajectory control is to move the robot along a well-defined trajectory. For such a trajectory, the desired pose p_d can be calculated for any point in time t. As the mobile robots used for this paper feature three degrees of freedom, p_d is a vector of x_d, y_d and ϕ_d. The control approach presented here can be employed for arbitrary trajectories. To simplify the application of the robotic system, trajectories are not defined for the robot itself but for an arbitrary point relative to the robot, e.g. the tool center point (TCP). Fig. 7 shows the relation of the different coordinate systems. The robot is positioned within the camera coordinate system C_C. Due to the robot's design, the position extracted by the LED tracking is identical to the rotational center point of the robot, i.e. the origin of the robot's coordinate system C_R. The TCP is visible in the microscope image (coordinate system C_M) and its position relative to the robot is fully defined by the angle α and the distance a. Thus, coordinates on the desired trajectories are first transformed into the robot's coordinate system using a, α and ϕ.

With this transformation, most operations required for micro- and nanohandling can be described by linear trajectories. Starting at t = 0 and p_d(0) = (s_x, s_y, s_ϕ), the chosen reference point moves with constant velocities v_x, v_y and ω_ϕ along all three degrees of freedom:

\[
p_d(t) = \begin{pmatrix} s_x + v_x \cdot t \\ s_y + v_y \cdot t \\ s_\varphi + \omega_\varphi \cdot t \end{pmatrix} \tag{14}
\]

Important examples of such trajectories are translational movements, where ω_ϕ = 0, and rotations around the tool tip, where v_x = v_y = 0. For each control iteration, i.e. sensor update, p_d is calculated and translated into the robot's coordinate system. Then, the difference between the desired robot position and the measured position is calculated, leading to the control deviation e.

The hardware-based tracker delivers the pose information with negligible jitter and latency. However, the camera itself delays the sensor update by one frame, because the image is captured (pixels exposed) only for a few µs before the transfer of the pixel data takes most of the time of the update interval. Thus, a sensor update actually contains the robot's pose information at the time of the previous update. Due to its low mass and comparatively low maximum speed, the robot can reach all velocities in less than 1 ms. Thus, its response time is negligible.

Fig. 7. Coordinate systems and relations.

The control algorithm compensates for this delay as shown in Fig. 8. At each iteration, the measured pose m_n is received, which is outdated. The algorithm then calculates the robot's estimated pose e_n by adding the movement that was calculated during the previous iteration. With e_n, two control deviations can be calculated. First, the distance between e_n and the following trajectory position t_{n+1} is the movement that the robot should execute in order to remain on the trajectory. Directly using the open-loop control, this represents a proportional controller part. Additionally, the distance between e_n and t_n can be used, as it represents the movement errors of the open-loop control. The integral of this distance is added to the control values, forming an integral controller part. This leads to a PI controller with the control parameters K_P and K_I:

\[
s = K_P\,(t_{n+1} - e_n) + K_I \sum_{i=0}^{n} (t_i - e_i) \tag{15}
\]

Fig. 8. Measured poses m and estimated poses e while moving along a trajectory t.
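The control iteration around Eqs. (14) and (15), including the one-frame delay compensation, can be sketched as follows. This is a simplified software model: the gains K_P = 1 and K_I = 0.05 and the 145 Hz update interval are taken from the text, while the function names, the pose representation as a plain 3-vector and the omission of the TCP coordinate transformation are assumptions.

```python
KP, KI = 1.0, 0.05        # controller gains given in the text
DT = 1.0 / 145.0          # s, sensor update interval for the region of interest

def desired_pose(t, start, vel):
    """Eq. (14): linear trajectory of the chosen reference point."""
    return [s + v * t for s, v in zip(start, vel)]

def control_loop(read_pose, execute_motion, start, vel, n_updates):
    """One possible realization of the delay-compensating PI controller.

    read_pose():           returns the measured pose m_n (one frame old)
    execute_motion(s, dt): commands the open-loop motion s within dt
    """
    integral = [0.0, 0.0, 0.0]
    last_command = [0.0, 0.0, 0.0]
    for n in range(n_updates):
        m_n = read_pose()
        # estimated current pose: outdated measurement plus the motion
        # commanded during the previous iteration
        e_n = [m + c for m, c in zip(m_n, last_command)]
        t_n = desired_pose(n * DT, start, vel)
        t_next = desired_pose((n + 1) * DT, start, vel)
        integral = [acc + (t - e) for acc, t, e in zip(integral, t_n, e_n)]
        s = [KP * (tn1 - e) + KI * acc
             for tn1, e, acc in zip(t_next, e_n, integral)]   # Eq. (15)
        execute_motion(s, DT)       # handled by the open-loop control of Sec. 3.1
        last_command = s
```

Accumulating (t_n - e_n) rather than the raw measurement error keeps the integral term insensitive to the known one-frame sensor delay.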

s is then used as the motion (Δx, Δy, Δϕ) that has to be completed within the time Δt until the next sensor update. This is done using the approach described in Section 3.1. As the open-loop control is accurate, K_P = 1 already leads to good movement behavior. The accuracy is slightly increased by the integral part with K_I = 0.05.

4. SYSTEM ARCHITECTURE

Fig. 9 shows a diagram of the developed system architecture. It is comprised of a mobile nanohandling robot with a dedicated control unit and an imaging system with a camera and an FPGA-based LED tracker. In the described system, the nanohandling robot operates on a glass surface. Each robot has a pair of infrared LEDs mounted to its bottom (see Fig. 1). These LEDs are detected by a CMOS camera mounted beneath the glass surface. The working range observed by the camera is approx. 8×5 cm². With its 752×480 px resolution, a single pixel images an area of roughly 0.01 mm². The LED tracker and the control unit are connected using the controller area network (CAN). At the maximum baud rate of 1 Mbit/s, the transfer of a single pose update takes approx. 200 µs.

4.1 FPGA-based LED tracker

The hardware-based tracking system is implemented on a Xilinx Spartan-3E FPGA (XC3S1200E). A softcore embedded processor (MicroBlaze) with a clock frequency of 50 MHz is used.

Fig. 9. Mechatronic system architecture for hardware-based closed-loop visual servoing.


The camera transfers the pixel data via a serial LVDS interface, which is converted to parallel TTL signals by a deserializer. The recovered pixel data and the control signals are connected to inputs of the FPGA.


The hardware part of the tracking algorithm calculates parameters for the weighted center of gravity method for each bright region in the image while the pixels are processed. When an image is fully transferred, the hardware part raises an interrupt at the embedded processor. The interrupt handler calculates the robot’s position and sends it using the CAN bus. Additionally, a high-speed USB interface can be used to transfer the captured images to a computer for other image processing tasks.
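The pose computation inside the interrupt handler is not detailed in the paper; a plausible minimal version, assuming the robot position is the midpoint of the two LED centroids and the orientation is the angle of their connecting line, could look like this (the function name and the choice of midpoint are assumptions of this sketch):

```python
import math

def robot_pose_from_leds(led_a, led_b):
    """Combine the two LED centroids (in pixels) into a robot pose (x, y, phi).

    Assumes the tracked position is the midpoint between the LEDs and the
    orientation is the angle of the connecting line; the real mapping
    depends on the robot's geometry."""
    (xa, ya), (xb, yb) = led_a, led_b
    x = (xa + xb) / 2.0
    y = (ya + yb) / 2.0
    phi = math.degrees(math.atan2(yb - ya, xb - xa))
    return x, y, phi

# The resulting pose would then be packed into a CAN frame for the control unit.
print(robot_pose_from_leds((120.4, 88.2), (131.0, 88.5)))
```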


4.2 Robot control unit

In order to actuate the mobile nanohandling robot described above, high-voltage actuation signals for six channels have to be generated. For stick-slip actuation, sawtooth-shaped signals are used. All six signals need to be synchronous, because a simultaneous slip on all actuators leads to better movement behavior. Thus, all signals have the same frequency and phase, whereas amplitude and polarity need to be controlled individually for each channel. Furthermore, a change of signal parameters should be applied in less than 100 µs to minimize additional latency in the control loop. To meet these requirements, a custom hardware control unit was developed consisting of a digital and an analog part. The digital part is comprised of a microcontroller that implements the control algorithm and handles all communications including sensor data, control commands and status updates. Using the SPI protocol, it controls an FPGA that generates the data for a single sawtooth-shaped signal. It employs a technique called direct digital synthesis (DDS) to allow for signals of arbitrary frequencies generated from a fixed clock source. The analog part transforms the generated digital signal using a high-speed DAC to create a single sawtooth-shaped signal with 3 Vpp amplitude. This signal is multiplied by six four-quadrant multiplying DACs. The outputs of these DACs are six sawtooth-shaped signals with a freely selectable polarity and amplitude in the range of 0 to 3 Vpp. As required, all six signals have the same frequency and a simultaneous slip phase. The signals are amplified using commercial high-voltage power amplifiers with an inverting gain of 100, thus creating the required 300 Vpp output amplitude on each channel. The amplifiers can supply up to 100 mA into the piezo ceramics' capacitive loads and create slew rates as high as 400 V/µs. The measured settling time for a full-range step is 1.5 µs.
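Direct digital synthesis boils down to a phase accumulator that overflows at the programmed signal frequency; its upper bits directly form the sawtooth samples for the DAC. The sketch below illustrates the principle only; the 50 MHz clock, the 32-bit accumulator and the 12-bit DAC width are illustrative assumptions, not necessarily the values used in the control unit.

```python
def dds_sawtooth(frequency, f_clock=50e6, acc_bits=32, dac_bits=12, n_samples=8):
    """Minimal direct digital synthesis (DDS) sketch: a phase accumulator
    clocked at f_clock wraps around once per signal period; its upper bits
    are the sawtooth samples sent to the DAC."""
    tuning_word = int(frequency * (1 << acc_bits) / f_clock)
    acc = 0
    samples = []
    for _ in range(n_samples):
        acc = (acc + tuning_word) & ((1 << acc_bits) - 1)   # wraps once per period
        samples.append(acc >> (acc_bits - dac_bits))        # upper bits -> DAC code
    return samples

# Example: first samples of a 10 kHz sawtooth derived from the fixed clock.
print(dds_sawtooth(10e3))
```

Because only the tuning word changes when a new frequency is commanded, the signal parameters can be updated well within the 100 µs budget mentioned above.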

5. RESULTS

The performance of the described mechatronic system is analyzed in two steps. First, the noise, accuracy and linearity of the LED tracking are tested. Second, the closed-loop positioning is verified in terms of repositioning accuracy and its ability to stay on a predefined trajectory.

5.1 Sensor system analysis

The employed camera delivers pixels at a rate of 24 MHz, which leads to a full-frame update rate of 58 Hz.

Fig. 10. Positionings on a grid to evaluate the sensor's nonlinearity (1 px ≈ 100 µm).

For most of the experiments, a region of interest with a size of 752×150 px was selected, leading to a 145 Hz update rate. In order to evaluate the system's measurement noise, the robot was left stationary while recording sensor data. In a 20 s interval, the standard deviation was 0.002 pixels, corresponding to 200 nm, for the x and y position and 0.003° for ϕ. The maximum deviation, i.e. the difference between the minimal and maximal sensor values, was 0.014 px and 0.02°, respectively. The average update interval was 6.988 ms with a negligible standard deviation of only 1.4 µs. The difference between the maximum and minimum update interval was 10 µs, underlining the system's excellent jitter behavior. The resolution and repeat accuracy are only limited by the noise.

To examine the sensor's linearity, a measurement was conducted against a reference sensor. For this purpose, an optical microscope was mounted above the robot and a 50 µm glass sphere was tracked in the microscope's image. The maximum noise and thus accuracy of the microscope-based tracking is approx. 250 nm, which is sufficient as a reference for the camera-based LED tracking. The mobile robot was then closed-loop positioned onto a linear grid spanning 2 px of the LED tracking (see Fig. 10). Each position was approached ten times from different directions and measured with the reference sensor. There is virtually no nonlinearity, and the deviations between the two sensors are mostly caused by the noise.

5.2 Closed-loop control

An important key property of closed-loop control is the repeat accuracy, also called repositioning accuracy. To measure the repeat accuracy, an optical microscope was used as a reference, similar to the linearity measurements in the previous section. For 1000 repetitions, the robot was moved randomly and brought back to its original position using closed-loop positioning.


Fig. 11. Error over time while driving along a linear trajectory at two different speeds (1 px ≈ 100 µm).

The final position after each repositioning was recorded with high precision using the optical microscope. The standard deviation within the measurements was 0.0025 pixels (250 nm) and the deviation between the minimum and maximum result, i.e. the maximum error, was 0.018 pixels. The test was repeated at different positions with similar results. As the results are virtually identical to the sensor noise, the noise can be assumed to be the limiting factor on the repeat accuracy.

To verify the mobile robot's capability to stay on a predefined trajectory, several benchmark trajectories have been tested. Fig. 11 shows the deviation from the trajectory when driving along the edges of a 10 px, i.e. approx. 1 mm, square without rotating. For the fast movement, each edge was set to require 350 ms with 50 ms stops at each corner. Thus, the robot needs to move with a maximum velocity of 3 mm/s and the entire square requires 1.6 s. The slow movement is four times slower. The movement errors on the fast trajectory are up to 0.05 px, because with the still limited (145 Hz) update rate, the controller cannot react fast enough to compensate motion deviations. For the slower trajectory, the error remains at the noise level.

Fig. 12 shows a single edge movement. For the first two sensor updates on the trajectory, an offset between the trajectory and the measurement value is visible. This is caused by the robot not exactly corresponding to the open-loop control model. From the third update on, this deviation is compensated for by the integral part of the controller and the measurements appear identical to the calculated trajectory.

Fig. 12. Measurements of a linear trajectory and a magnification of its first part (1 px ≈ 100 µm).

6. CONCLUSIONS AND FUTURE WORKS

This paper presents a fully integrated, hardware-based tracking and control architecture. The tracking algorithm has a noise and resolution below 0.01 px, corresponding to approx. 1 µm. Update rates of 150 Hz can be achieved with negligible jitter and delay. The control approach facilitates precise trajectory control of mobile nanohandling robots. The control loop is executed on real-time capable microcontrollers and dedicated embedded logic without the assistance of a computer. A computer can be used to send high-level control commands to the robot controller, requiring virtually no computation time. Thus, the control performance and accuracy are independent of the computer's performance and load.

The system will be improved with respect to several requirements. First, the tracking of multiple robots in the same camera image will be implemented, facilitating cooperative operation.


The key challenge is the mapping of the individual LEDs to the different robots and thus the transfer of pose information to the correct robot controller. Second, the controller will be extended to fuse the information of different sensors. Thus, an object in the microscope image can be moved along a trajectory while maintaining the correct orientation via the LED tracking's ϕ information. Third, the open-loop control will be changed to be adaptive. The exact motion characteristics of a robot highly depend on its cabling conditions. Hence, they can differ significantly in different working areas. The adaptive control will constantly measure the real behavior and update its internal robot model.

ACKNOWLEDGEMENTS

Parts of this work were supported by the European Community: Project NanoHand (IP 034274).

REFERENCES

Diederichs, C. (2010). Hardware-software co-design tracking system for predictable high-speed mobile microrobot position control. In Proc. of 5th IFAC Symposium on Mechatronic Systems.
Edeler, C., Jasper, D., and Fatikow, S. (2008). Development, control and evaluation of a mobile platform for microrobots. In Proc. of 17th IFAC World Congress.
Eichhorn, V., Fatikow, S., Dahmen, C., Edeler, C., Jasper, D., and Stolle, C. (2008). Automated microfactory inside a scanning electron microscope. In Proc. of 6th Int. Workshop on Microfactories (IWMF).
Estaña, R. and Wörn, H. (2003). Moiré-based positioning system for microrobots. In Proc. of Int. Conference on Optical Measurement Systems for Industrial Inspection III, volume 5144, 431-442. SPIE.
Jasper, D. and Edeler, C. (2008). Characterization, optimization and control of a mobile platform. In Proc. of 6th Int. Workshop on Microfactories (IWMF).
Sievers, T. and Fatikow, S. (2006). Real-time object tracking for the robot-based nanohandling in a scanning electron microscope. Journal of Micromechatronics, Special Issue on Micro/Nanohandling, 3(3-4), 267-284.