A Vision Based Wheel Slip Estimation Technique for Mining Vehicles

Xiaojing Song*, Lakmal Seneviratne* and Kaspar Althoefer*

* Division of Engineering, King's College London, London, WC2R 2LS, United Kingdom.

Abstract: Slip plays a vital role in traction control when a mining vehicle traverses soft soil. This paper presents a vision-based wheel slip estimation technique for mining vehicles. A downward-looking camera is mounted at a special tilt angle to observe both the rotary motion of the wheel and the wheel's translatory motion relative to the soil. An optical flow algorithm is developed to estimate the wheel angular velocity and the wheel translatory velocity by tracking salient features selected separately from the soil surface and the wheel tire surface. The specially oriented camera enables the vision-based algorithm to estimate wheel slip without wheel odometry information. The proposed technique has been validated on a linear test rig, showing good performance.

Keywords: slip estimation, optical flow, camera model.

1. INTRODUCTION

Mining is a cooperative activity between mining equipment and the people who operate it. If mining vehicles had high-level autonomy, the mining industry could be more productive, gain access to currently unworkable areas, and operate efficiently in unsafe, extreme environments. Slip plays a vital role in traction control when a mining vehicle traverses extreme terrains (Le, 1999), such as soft soil and mud; accurate measurement of wheel slip is therefore necessary.

In the past decades, many techniques have been developed to estimate wheel slip (Iagnemma and Dubowsky, 2004; Bevly et al., 2000; Ojeda et al., 2006; Le et al., 1997; Cherouat et al., 2005). In general, these techniques can be divided into two groups: dynamic model-based slip estimation methods and kinematic model-based slip estimation methods. To estimate slip parameters, the kinematic model-based methods require the motion states of the wheel as inputs, such as the wheel position and orientation, the wheel longitudinal and lateral velocities, and the wheel angular velocity. The wheel angular velocity is usually obtained from the onboard wheel encoder. For the other motion states, many existing techniques can be applied, such as the global positioning system (GPS), inertial measurement units (IMUs), and the fifth wheel. Nevertheless, GPS/DGPS is prone to errors due to signal loss and interference in forests and areas crowded with tall buildings, and IMUs drift over time without bound, causing cumulative errors in position and velocity.

In recent years, vision-based approaches have become attractive alternatives for vehicle motion state estimation. Since vision sensors have no contact with the soil, they are particularly capable of estimating motion states in the presence of slip. Compared to IMUs, vision-based approaches do not accumulate errors in their velocity estimates and are thus better suited to low-speed motion. Visual sensors also have the advantages of being small in size, low in power consumption, and passive in

nature. The methodology of vision-based approaches is to recover the motion states of the wheel by tracking distinctive visual features across multiple images. In (Helmick et al., 2005), a Kalman filter integrates data from visual odometry and an IMU to estimate slip parameters. Reference (Nister et al., 2004) employs both a single camera and a stereo camera for trajectory estimation and navigation purposes. An omni-directional vision system has been used to estimate the translational and rotational motions of an indoor mobile robot (Bunschoten and Krose, 2003).






Fig. 1. A downward-looking camera is mounted to capture images that contain both the soil surface and the wheel tire surface.

To measure wheel slip, a wheel encoder is normally required to provide the wheel angular velocity, while a speed sensor is needed to give the wheel's actual translatory speed. In this study, the use of a single camera with a special orientation allows wheel slip to be measured without wheel odometry information. In Fig. 1, a single camera is mounted on the vehicle facing downward at a tilt angle. This special orientation enables the camera to view both the soil surface and the wheel tire surface. Hence, the rotary motion of the wheel and the wheel's translatory motion relative to the soil are both observed by this onboard


camera. Each image captured by the camera contains two regions: a soil surface region and a wheel tire surface region (Fig. 2). The soil surface region is used to estimate the wheel translatory velocity, while the wheel tire surface region is used to estimate the wheel angular velocity.

Fig. 2. The captured image is divided into two regions: a soil surface region and a wheel tire surface region.

Two sets of salient natural features are selected separately from the two regions and tracked separately by an optical flow algorithm. A camera projection model is specially built to map the movements of both feature sets on the image plane to the actual wheel motion. The proposed estimation method was validated on a linear test rig, in tests covering slips ranging from 0% to 100%. The test results show good agreement between the slip estimates given by the proposed technique and the measurements given by sensors fitted to the test rig. The proposed slip estimation technique can potentially serve as a stand-alone system for wheel slip estimation on mining vehicles that must traverse slippery terrains, such as soft soil and mud.

The paper is organized as follows. Section 2 gives the wheel slip definition and the slip estimation system framework. Section 3 describes the slip estimation algorithm and the camera projection model. Section 4 gives test results. Conclusions are drawn in Section 5.

2. SLIP DEFINITION AND SLIP ESTIMATION SYSTEM

Figure 1 shows a single wheel traversing rough terrain. Note that the slip angle is not considered in this paper. The slippage of a single wheel is quantified by a longitudinal slip. When a driving torque is applied, the wheel longitudinal slip is defined as (Wong, 1978):

$$i = \left(1 - \frac{V_x}{\omega r}\right) \times 100\% \qquad (1)$$

where $V_x$ is the longitudinal velocity of the wheel center (also called the translatory velocity), $\omega$ is the angular velocity of the wheel, and $r$ is the rolling radius of the free-rolling wheel. When a driving torque is applied, the wheel rotates without the equivalent translatory progression; thus $\omega r > V_x$.

Figure 3 shows the framework of the slip estimation system. It comprises the following key components: image pre-processing, feature detection, feature tracking, image transformation parameter estimation, camera motion projection, and slip estimation. The system receives image sequences as inputs. It first filters noise from the raw images and selects salient features that are evaluated as reliably trackable. The tracking process then estimates the feature locations $F_i$ and $Q_i$ on adjacent images. Given the feature locations, the image transformation between adjacent images, $\hat{R}$ and $\hat{T}$, is estimated. A camera motion projection model is then used to give estimates of the wheel translatory velocity, $\hat{V}_x$, and the wheel angular velocity, $\hat{\omega}$. The wheel slip $\hat{i}$ is thereby estimated.

Fig. 3. Slip estimation system framework.
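For concreteness, the sketch below shows how these stages might be chained from standard OpenCV building blocks. It is a minimal illustration under assumptions, not the paper's implementation: the region masks (soil_mask, tire_mask) are assumed precomputed from the fixed camera geometry of Fig. 2, the chosen parameter values are illustrative, and the Section 3.4 projection step is only indicated in a comment.

```python
# Hedged sketch of the Fig. 3 pipeline using standard OpenCV calls.
import cv2

def track_region(prev_img, curr_img, mask):
    """Select salient features in one region (Sec. 3.1) and track them (Sec. 3.2)."""
    pts = cv2.goodFeaturesToTrack(prev_img, maxCorners=200,
                                  qualityLevel=0.01, minDistance=7, mask=mask)
    nxt, status, _err = cv2.calcOpticalFlowPyrLK(prev_img, curr_img, pts, None)
    ok = status.ravel() == 1
    return pts[ok].reshape(-1, 2), nxt[ok].reshape(-1, 2)

def interframe_transform(Q, F):
    """RANSAC-robust estimate of the inter-frame image transformation (Sec. 3.3)."""
    M, _inliers = cv2.estimateAffinePartial2D(Q, F, method=cv2.RANSAC,
                                              ransacReprojThreshold=2.0)
    return M[:, :2], M[:, 2]  # rotation(+scale) block ~R-hat, translation ~T-hat

def interframe_motion(prev_img, curr_img, soil_mask, tire_mask):
    """Per frame pair: pre-filter, then one transform estimate per region."""
    prev_img = cv2.GaussianBlur(prev_img, (5, 5), 0)  # image pre-processing
    curr_img = cv2.GaussianBlur(curr_img, (5, 5), 0)
    soil = interframe_transform(*track_region(prev_img, curr_img, soil_mask))
    tire = interframe_transform(*track_region(prev_img, curr_img, tire_mask))
    # The Sec. 3.4 projection model then maps the soil-region transform to
    # V_x-hat, the tire-region transform to omega-hat, and Eq. (1) gives i-hat.
    return soil, tire
```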

3. SLIP ESTIMATION ALGORITHM

3.1 Feature Selection

After being filtered by a Gaussian filter, the image sequences are ready for further processing. The next step is to select two sets of natural, distinctive features from the soil surface region and the wheel tire surface region for separate tracking. Small windows of pixels are evaluated in terms of their trackability, and only those with good trackability are selected. Owing to its good reliability and high computational efficiency, the feature detection method presented in (Shi and Tomasi, 1994) is adopted, in which a Hessian matrix $M$ reflects the texture pattern of a small window of pixels and determines the trackability of the window:

$$M = \begin{bmatrix} \Sigma I_x I_x & \Sigma I_x I_y \\ \Sigma I_y I_x & \Sigma I_y I_y \end{bmatrix} = \Sigma \begin{bmatrix} I_x \\ I_y \end{bmatrix} \begin{bmatrix} I_x & I_y \end{bmatrix} = \Sigma \nabla I (\nabla I)^T \qquad (2)$$

To guarantee that a feature window can be reliably tracked, $M$ should be full rank, above the noise level, and well-conditioned. Full rank indicates that the intensity gradients within the small window vary in both the horizontal and vertical directions. To be above the image noise, both eigenvalues of $M$ are required to be large. Being well-conditioned means that the eigenvalues of $M$ do not differ by several orders of magnitude. Two small eigenvalues of $M$ represent a roughly constant intensity profile within the window; a large and a small eigenvalue correspond to a unidirectional texture pattern; two large eigenvalues indicate a good texture pattern. Thus, the eigenvalues of the Hessian matrix $M$ are used to evaluate the trackability of small windows in this paper. A feature is accepted as a valid candidate for tracking if the minimum eigenvalue of $M$ is greater than a predefined threshold $\lambda_c$ (Shi and Tomasi, 1994):

$$\min(\lambda_1, \lambda_2) \geq \lambda_c \qquad (3)$$

Fig. 4. Two sets of features are selected from the two regions, the wheel tire surface region (upper row) and the soil surface region (lower row), and are tracked separately within their own regions across sequential images.

The selection criterion is applied in both regions to select salient features.
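To make the criterion concrete, here is a minimal NumPy sketch of (2)-(3) evaluated over non-overlapping windows; the window size and threshold $\lambda_c$ are illustrative choices, not the paper's values.

```python
# Minimum-eigenvalue trackability test of Eqs. (2)-(3), sketched in NumPy.
import numpy as np

def trackable_windows(img: np.ndarray, win: int = 7, lam_c: float = 1e3):
    """Return centers of windows whose matrix M has min eigenvalue >= lam_c."""
    Iy, Ix = np.gradient(img.astype(float))          # intensity gradients
    half = win // 2
    keep = []
    for r in range(half, img.shape[0] - half, win):
        for c in range(half, img.shape[1] - half, win):
            gx = Ix[r - half:r + half + 1, c - half:c + half + 1].ravel()
            gy = Iy[r - half:r + half + 1, c - half:c + half + 1].ravel()
            # M = sum over the window of grad(I) grad(I)^T, Eq. (2)
            M = np.array([[gx @ gx, gx @ gy],
                          [gy @ gx, gy @ gy]])
            if np.linalg.eigvalsh(M)[0] >= lam_c:    # Eq. (3): min eigenvalue test
                keep.append((r, c))
    return keep
```

In practice, OpenCV's cv2.goodFeaturesToTrack implements this same minimum-eigenvalue test with non-maximum suppression and accepts a region mask, which suits the two-region selection used here.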

3.2 Optical Flow for Feature Tracking

Optical flow, also called image velocity, is the apparent motion of the image intensity/brightness patterns. The rotary motion of the wheel and the wheel's translatory motion relative to the soil can be estimated from the planar optical flow. Over the last three decades, many optical flow techniques have been proposed (Campbell et al., 2004). Among them, the most attractive are differential methods, region-based matching methods, energy-based methods, and phase-based methods. Experiments on synthetic and real image sequences have shown that first-order differential methods perform more reliably and robustly than the others (Barron et al., 1992). In this paper, we use a first-order differential method combined with the Newton-Raphson iterative method to track the selected features. Features selected from the soil surface region and features selected from the wheel tire surface region are tracked separately.

Let $I_1$ and $I_2$ represent the intensity/brightness of a feature window in sequential images. We assume that the intensity value of a feature window remains the same in subsequent images. Hence, our aim is to estimate the feature locations by minimizing the residual error:

$$E(\mathbf{d}) = \int_A \left[ I_2\!\left(\mathbf{L} + \tfrac{\mathbf{d}}{2}\right) - I_1\!\left(\mathbf{L} - \tfrac{\mathbf{d}}{2}\right) \right]^2 d\mathbf{L} \qquad (4)$$

where $\mathbf{L} = [x, y]$ is the location of a feature window in a local image frame, $\mathbf{d} = [dx, dy]$ is the movement of the feature window with respect to the local image frame, and $A$ represents the small region of the feature window. Omitting high-order terms, the Taylor series expansions of $I_2$ and $I_1$ give:

$$I_2\!\left(\mathbf{L} + \tfrac{\mathbf{d}}{2}\right) \approx I_2(\mathbf{L}) + \frac{dx}{2}\frac{\partial I_2(\mathbf{L})}{\partial x} + \frac{dy}{2}\frac{\partial I_2(\mathbf{L})}{\partial y} \qquad (5)$$

and

$$I_1\!\left(\mathbf{L} - \tfrac{\mathbf{d}}{2}\right) \approx I_1(\mathbf{L}) - \frac{dx}{2}\frac{\partial I_1(\mathbf{L})}{\partial x} - \frac{dy}{2}\frac{\partial I_1(\mathbf{L})}{\partial y} \qquad (6)$$

Substituting (5) and (6) into (4) gives:

$$E(\mathbf{d}) = \int_A \left\{ I_2(\mathbf{L}) - I_1(\mathbf{L}) + \frac{dx}{2}\frac{\partial [I_1(\mathbf{L}) + I_2(\mathbf{L})]}{\partial x} + \frac{dy}{2}\frac{\partial [I_1(\mathbf{L}) + I_2(\mathbf{L})]}{\partial y} \right\}^2 d\mathbf{L} \qquad (7)$$

Let

$$g(\mathbf{L}) = \left[ \frac{\partial [I_1(\mathbf{L}) + I_2(\mathbf{L})]}{\partial x}, \; \frac{\partial [I_1(\mathbf{L}) + I_2(\mathbf{L})]}{\partial y} \right]^T \qquad (8)$$

Substituting (8) into (7) gives:

$$E(\mathbf{d}) = \int_A \left[ I_2(\mathbf{L}) - I_1(\mathbf{L}) + \tfrac{1}{2}\,\mathbf{d}^T g(\mathbf{L}) \right]^2 d\mathbf{L} \qquad (9)$$

To minimize the residual error $E(\mathbf{d})$, its derivative with respect to $\mathbf{d}$ is forced to zero, i.e. $\partial E(\mathbf{d})/\partial \mathbf{d} = 0$. Rearranging gives:

$$\int_A [I_1(\mathbf{L}) - I_2(\mathbf{L})]\, g(\mathbf{L})\, d\mathbf{L} = \left[ \frac{1}{2} \int_A g(\mathbf{L})\, g(\mathbf{L})^T d\mathbf{L} \right] \mathbf{d} \qquad (10)$$

Let

$$D = \int_A [I_1(\mathbf{L}) - I_2(\mathbf{L})]\, g(\mathbf{L})\, d\mathbf{L} \qquad (11)$$

and

$$G = \frac{1}{2} \int_A g(\mathbf{L})\, g(\mathbf{L})^T d\mathbf{L} \qquad (12)$$

Then (10) can be rewritten as:

$$\mathbf{d} = G^{-1} D \qquad (13)$$

Equation (13) can be solved using the Newton-Raphson iterative method.
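As a minimal sketch of the solve behind (11)-(13) for a single pair of already-extracted, same-sized feature windows (NumPy only): a full tracker would re-sample the window of $I_2$ shifted by $\mathbf{d}$ and repeat, Newton-Raphson style.

```python
# One displacement update d = G^{-1} D for a pair of feature windows,
# following Eqs. (8) and (11)-(13).
import numpy as np

def flow_step(I1: np.ndarray, I2: np.ndarray) -> np.ndarray:
    """Single-step window displacement estimate [dx, dy]."""
    I1, I2 = I1.astype(float), I2.astype(float)
    gy, gx = np.gradient(I1 + I2)            # g(L), Eq. (8): gradients of I1 + I2
    gx, gy = gx.ravel(), gy.ravel()
    diff = (I1 - I2).ravel()
    D = np.array([diff @ gx, diff @ gy])     # Eq. (11)
    G = 0.5 * np.array([[gx @ gx, gx @ gy],
                        [gy @ gx, gy @ gy]]) # Eq. (12)
    return np.linalg.solve(G, D)             # Eq. (13): d = G^{-1} D
```

A pyramidal implementation such as OpenCV's cv2.calcOpticalFlowPyrLK performs this iteration with subpixel interpolation and coarse-to-fine handling of large motions.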

3.3 Image Transformation Estimation and Outlier Rejection

The motion of the camera can be decoupled into a translational motion in a plane and a rotational motion about the optical axis. Define $R$ as the rotation matrix and $T$ as the translation vector characterizing the image transformation between adjacent images. $R$ and $T$ are linked by the feature locations in adjacent images:

$$F = RQ + T \qquad (14)$$

where $F$ and $Q$ denote the feature locations in the current and next images, respectively. Image transformation estimation is thus the problem of inferring $R$ and $T$ from the feature locations in adjacent images. Since quantization noise is present, more than three corresponding points are required to reduce the effect of noise and determine the transformation parameters. Thus, tens or hundreds of features are used to


achieve high accuracy of estimation. Taking the estimation error into account gives:

$$F_i = RQ_i + T + E_i \quad (i = 1, 2, \ldots, n) \qquad (15)$$

where $n$ represents the total number of features. Minimizing the weighted sum of squares gives:

$$\sum_{i=1}^{n} w_i \| E_i \|^2 = \sum_{i=1}^{n} w_i \| F_i - RQ_i - T \|^2 \qquad (16)$$

where the weights $w_i$ are evenly assigned, $w_i = 1/n$. Equation (16) can be solved either using a closed-form solution or an iterative minimization process. The closed-form solution offers the advantage of computational efficiency but suffers from numerical instability when outliers (incorrectly tracked features) are present. Hence, these outliers must be removed, as they may otherwise introduce considerable deviations into the estimation results. In this study, the random sample consensus (RANSAC) statistical method (Fischler and Bolles, 1981) is employed to reject outliers and find probable transformation parameters. The algorithm is iterated over a sufficient number of feature sets to identify all outliers. After rejecting the outliers, a closed-form solution is applied to re-estimate the image transformation parameters based on all inliers (correctly tracked features). The method in (Schonemann and Carroll, 1970), which applies Lagrange multipliers, is used for the final estimation of $R$ and $T$.
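As a sketch of this step, assuming matched 2-D point arrays Q and F from the tracker: a simple RANSAC loop around a closed-form rigid fit. The SVD-based Procrustes solution below is a standard stand-in for the Lagrange-multiplier solution of (Schonemann and Carroll, 1970); the iteration count and inlier threshold are illustrative.

```python
# RANSAC outlier rejection around a closed-form rigid fit of F ~= R Q + T.
import numpy as np

def rigid_fit(Q: np.ndarray, F: np.ndarray):
    """Closed-form least-squares R, T (SVD/Procrustes) for row-wise 2-D points."""
    qc, fc = Q.mean(axis=0), F.mean(axis=0)
    U, _, Vt = np.linalg.svd((F - fc).T @ (Q - qc))
    S = np.diag([1.0, np.sign(np.linalg.det(U @ Vt))])  # guard against reflection
    R = U @ S @ Vt
    return R, fc - R @ qc

def ransac_rigid(Q, F, iters=200, tol=2.0, rng=np.random.default_rng(0)):
    """Pick the R, T hypothesis with most inliers, then re-fit on all inliers."""
    best = None
    for _ in range(iters):
        idx = rng.choice(len(Q), size=2, replace=False)  # minimal set for R, T
        R, T = rigid_fit(Q[idx], F[idx])
        err = np.linalg.norm(F - (Q @ R.T + T), axis=1)  # reprojection residuals
        inliers = err < tol
        if best is None or inliers.sum() > best.sum():
            best = inliers
    return rigid_fit(Q[best], F[best])
```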

3.4 Geometrical Projection Model

The wheel's translatory motion relative to the soil and the wheel's rotary motion are computed from the image transformations using a geometrical projection model. We assume that the distance from the camera to the soil surface remains the same and that the optical axis of the camera is perpendicular to the soil surface during the camera motion. This hypothesis allows a simplified pinhole camera model to be built that maps image transformations to the camera motion. The model used for the estimation of the wheel translatory velocity is similar to the one used in our previous work (Song et al., 2008) and is therefore not described in this paper. Here we focus on the model used for the estimation of the wheel rotary velocity. As shown in Fig. 5, we can derive:

$$L - L_0 = r[\cos(\theta - \beta) - \cos(\beta - \theta_0)] \qquad (17)$$

and

$$H_0 - H = \frac{f}{x_0} X_0 - \frac{f}{x} X = \frac{f}{x_0} X_0 - \frac{f}{x} \left[ X_0 - r\sin(\beta - \theta_0) - r\sin(\theta - \beta) \right] \qquad (18)$$

Fig. 5. Camera projection model.

Since $|L - L_0| - |H_0 - H| = 0$, the following relation is induced:

$$f X_0 \left( \frac{1}{x_0} - \frac{1}{x} \right) + \frac{fr}{x} \left[ \sin(\beta - \theta_0) + \sin(\theta - \beta) \right] - r\left[ \cos(\theta - \beta) - \cos(\beta - \theta_0) \right] = 0 \qquad (19)$$
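As an illustration, (19) can be solved numerically for $\theta$: bracket a sign change of the left-hand side on a coarse grid, then refine with a root finder. The sketch below uses SciPy's brentq; every numeric value is an illustrative assumption, not a value from the paper, and $x$ is read here as the tracked feature's image coordinate in the current frame.

```python
# Numerical solution of Eq. (19) for the wheel rotary angle theta.
import numpy as np
from scipy.optimize import brentq

# Illustrative values only (pixels, metres, radians); not from the paper.
f_px = 1000.0            # focal length in pixels
r = 0.099                # wheel radius [m]
beta = np.deg2rad(40)    # camera tilt angle (Table 1 lists 40 deg)
theta0 = np.deg2rad(5)   # rotary angle at the previous frame
x0 = 640.0               # half the horizontal resolution (2*x0 = 1280, Table 1)
X0 = 0.15                # distance from point A to the optical axis [m]
x = 600.0                # tracked feature's image coordinate, current frame

def lhs(theta):
    """Left-hand side of Eq. (19); its root is the rotary angle theta."""
    return (f_px * X0 * (1 / x0 - 1 / x)
            + (f_px * r / x) * (np.sin(beta - theta0) + np.sin(theta - beta))
            - r * (np.cos(theta - beta) - np.cos(beta - theta0)))

# Bracket a sign change on a coarse grid, then refine with Brent's method.
grid = np.linspace(theta0, theta0 + np.pi / 2, 200)
vals = lhs(grid)
i = np.flatnonzero(np.sign(vals[:-1]) != np.sign(vals[1:]))[0]
theta = brentq(lhs, grid[i], grid[i + 1])
print(f"rotary angle theta ~ {np.degrees(theta):.2f} deg")
```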

Given the camera focal length $f$, the wheel radius $r$, the camera tilt angle $\beta$, the camera resolution $2x_0$, and the distance $X_0$ from point A to the optical axis, the rotary angle $\theta$ can be obtained by solving (19). In addition, the distance $X_0$ can be obtained using (20):

$$X_0 = [D_0 - r\sin\theta_0] \sin\alpha \qquad (20)$$

where the distance $D_0$ is easily measured and $\alpha$ denotes the angle of view of the camera. The wheel rotary angle $\theta$ changes as the wheel rotates. By tracking the features selected on the wheel tire surface, the rotary angle of the wheel between adjacent images is estimated. The focal length of the camera is determined using the Matlab camera calibration toolbox (Bouguet, 2007).

4. EXPERIMENTS

4.1 Experiment Setup

To validate the proposed wheel slip estimation technique, an experimental study was carried out on a linear test rig where wheel slips can be accurately measured and controlled. The translational-only test rig comprises a wheel assembly, a CMOS camera, a carriage assembly, two DC motors and two built-in encoders, and a soil-filled container, as shown in Figs. 6 and 7. The specifications of the camera and the wheel assembly are listed in Table 1. The chain and the wheel were driven by the chain motor and the wheel motor, respectively. The velocities of the chain and the wheel were controlled by changing the motor voltages; wheel slips were thus generated by creating a speed difference between the chain and the wheel. The voltage applied to each motor can be varied from 0 V to 24 V, changing the velocity from 0 m/s to 0.05 m/s. The wheel slips were varied from 0% to 100% in steps of 10%. The velocities of the chain and the wheel were acquired from the built-in encoders, which were connected to a data acquisition card (NI DAQ 6601) transferring data to a host PC. The encoders worked in a synchronized mode at a sampling rate of 100 Hz. The built-in encoders originally produced 48 pulses/revolution; fitted with gearboxes with gear ratios of 500:1, they produced 24000

pulses/revolution. When the wheel was driven, the CMOS camera was triggered to capture image sequences at a constant frame rate of 30 Hz. For each test, the measured slips were obtained from the chain encoder and the wheel encoder, while the estimated slips were obtained from the proposed slip estimation method.

Two categories of tests were conducted on the test rig: slip-constant tests and slip-varying tests. In the slip-constant tests, the chain and the wheel were commanded to move at constant speeds during each run, with slips ranging from 0% to 100% in steps of 10%. In the slip-varying tests, the wheel and the chain were commanded to move at varying speeds by adjusting the power applied to the motors.
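As a quick sanity check of this encoder scaling, a sketch assuming the wheel speed is recovered from pulse-count differences at the 100 Hz sampling rate:

```python
# Encoder-to-speed conversion implied by the setup above.
import math

PULSES_PER_REV = 48 * 500        # 48 pulses/rev through a 500:1 gearbox = 24000
SAMPLE_RATE_HZ = 100.0           # synchronized encoder sampling rate
WHEEL_RADIUS_M = 0.198 / 2       # Table 1: 0.198 m wheel diameter

def wheel_speed(pulse_delta: int) -> float:
    """Wheel circumferential speed (omega * r) [m/s] from pulses per 10 ms sample."""
    revs_per_s = (pulse_delta / PULSES_PER_REV) * SAMPLE_RATE_HZ
    return 2 * math.pi * WHEEL_RADIUS_M * revs_per_s

# e.g. 19 pulses per sample -> about 0.05 m/s, the rig's maximum speed
print(f"{wheel_speed(19):.3f} m/s")
```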

Fig. 6. Schematic drawing of the test rig.

Fig. 7. A tilted downward-looking camera is mounted to observe both the soil and the wheel.

Table 1. Specifications of the onboard CMOS camera and wheel assembly.

CMOS camera                        Wheel assembly
resolution: 1280 × 1024            weight: 7.850 kg
frame rate: 30 fps                 diameter: 0.198 m
view angle: 40.4° × 30.8°          width: 0.098 m
height over terrain: 0.28 m        wheel surface material: rubber
tilt angle: 40°                    wheel surface: stripped teeth

4.2 Experimental Results

The measured slip is obtained as:

$$i_{ms}(\%) = \frac{V_{chain} - V_{wheel}}{V_{chain}} \times 100 \qquad (21)$$

where $V_{chain}$ and $V_{wheel}$ denote the chain velocity and the wheel velocity measured by the encoders. The raw data from the encoders contain high-frequency noise; a low-pass filter circuit was therefore used to reduce the noise. The estimated slip is computed as:

$$i_{et}(\%) = \frac{\hat{V}_{chain} - \hat{V}_{wheel}}{\hat{V}_{chain}} \times 100 \qquad (22)$$

where $\hat{V}_{chain}$ and $\hat{V}_{wheel}$ denote the chain velocity and the wheel velocity estimated by the proposed algorithm. In this study, we define a normalized root-mean-square (RMS) error to evaluate the performance of the proposed slip estimation method:

$$e_{i\_rms}(\%) = \sqrt{ \frac{1}{N} \sum_{n=1}^{N} \left( \frac{i_{ms}^{n} - i_{et}^{n}}{i_{ms}^{n}} \right)^2 } \times 100 \qquad (23)$$

where $N$ represents the total number of data points, and $i_{ms}^{n}$ and $i_{et}^{n}$ denote the measured and estimated slips at the $n$th data point, respectively.
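A small sketch of the evaluation metrics (21) and (23); the slip traces below are illustrative numbers, not data from the tests.

```python
# Measured slip, Eq. (21), and normalized RMS error, Eq. (23).
import numpy as np

def measured_slip(v_chain, v_wheel):
    """Eq. (21): ground-truth slip [%] from chain and wheel encoder speeds."""
    return (v_chain - v_wheel) / v_chain * 100.0

def normalized_rms_error(i_ms, i_et):
    """Eq. (23): normalized RMS error [%] between measured and estimated slips."""
    i_ms, i_et = np.asarray(i_ms, float), np.asarray(i_et, float)
    return np.sqrt(np.mean(((i_ms - i_et) / i_ms) ** 2)) * 100.0

# Illustrative: a nominal 50% slip, measured vs. estimated over five samples.
i_ms = measured_slip(np.full(5, 0.04), np.full(5, 0.02))   # -> 50% each
i_et = np.array([49.2, 50.5, 50.1, 49.7, 50.3])
print(f"e_i_rms = {normalized_rms_error(i_ms, i_et):.2f}%")
```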

Fig. 8. Test results covering slips from 0% to 100%. Solid lines represent estimated slips, while dash-dot lines (-·-·) represent measured slips.

Figure 8 shows the results of the slip-constant tests obtained from the test rig, covering slips from 0% to 100% in steps of 10%. From Fig. 8, it is clearly seen that the estimated slips (solid lines) match the measured slips (dash-dot lines) reasonably well. Table 2 gives the normalized root-mean-square (RMS) errors (%), which quantify the accuracy of the proposed algorithm. As listed in Table 2, the estimates show good agreement with the measurements. The maximum normalized RMS error is 8.39%, occurring when the wheel slip is between 0% and 10%; the RMS errors in the other slip ranges are much lower. In particular, the RMS errors in the higher slip ranges of 70% - 100% are rather small (less than 1%).

By adjusting the power applied to the chain motor and the wheel motor, varying wheel slips were generated to test the


capability of the proposed method in time-variant cases. Fig. 9 shows the results of a slip-varying test conducted on the test rig. In Fig. 9, it is clearly seen that the estimates obtained from the proposed algorithm agree well with the measurements; the normalized RMS error of this slip-varying test is 0.31%.

Table 2. Normalized RMS errors of the slip ratio, $e_{i\_rms}(\%)$, obtained from tests conducted on the test rig.

Slip range      e_i_rms (%)     Slip range      e_i_rms (%)
0% - 10%        8.39            50% - 60%       1.41
10% - 20%       3.81            60% - 70%       1.46
20% - 30%       4.25            70% - 80%       0.51
30% - 40%       1.51            80% - 90%       0.38
40% - 50%       0.37            90% - 100%      0.13

Fig. 9. Results obtained from a slip-varying test.

5. CONCLUSIONS

Mining is a cooperative activity between mining equipment and the people who operate it. To enhance the autonomy of mining vehicles that face the task of traversing extreme terrain, wheel slip is a key parameter that must be accurately estimated. In this paper, we propose a vision-based method for this purpose. A downward-looking camera is mounted at a special tilt angle to observe both the rotary motion of the wheel and the wheel's translatory motion relative to the soil. An optical flow algorithm is developed to track features selected separately from the soil surface and the wheel tire surface and to detect wheel slip. The special orientation of the camera enables wheel slip to be measured without wheel odometry information. The proposed method has been validated on a linear test rig where wheel slips can be accurately measured and controlled. The test results show that the estimated wheel slips are in good agreement with the measurements.

However, some problems need to be addressed in future work. First, changes in the distance between the camera and the soil surface significantly affect the estimation accuracy; a height measurement and compensation method is therefore required. Second, the proposed method will be extended to two-dimensional cases, where the side slip angle must be included.


REFERENCES

Barron, J. L., Fleet, D. J., Beauchemin, S. S. and Burkitt, T. (1992). Performance of Optical Flow Techniques. IEEE Conf. CVPR, pp. 236-242.

Bevly, D. M., Gerdes, J. C., Wilson, C. and Zhang, G. (2000). The Use of GPS Based Velocity Measurements for Improved Vehicle State Estimation. Proceedings of the American Control Conference, Chicago, IL, pp. 2538-2542.

Bouguet, J-Y. (2007). Camera Calibration Toolbox for Matlab. [Online]. Available: http://www.vision.caltech.edu/bouguetj/calib_doc.

Bunschoten, R. and Krose, B. (2003). Visual Odometry from an Omnidirectional Vision System. IEEE Int. Conf. Robotics and Automation, vol. 1, pp. 577-583.

Campbell, J., Sukthankar, R. and Nourbakhsh, I. (2004). Techniques for Evaluating Optical Flow for Visual Odometry in Extreme Terrain. IEEE Int. Conf. Intelligent Robots and Systems, vol. 4, pp. 3704-3711.

Cherouat, H., Braci, M. and Diop, S. (2005). Vehicle Velocity, Side Slip Angles and Yaw Rate Estimation. IEEE Int. Sym. Industrial Electronics, vol. 1, pp. 349-354.

Fischler, M. and Bolles, R. (1981). Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography. Communications of the ACM, vol. 24, no. 6, pp. 381-395.

Helmick, D. M., Cheng, Y., Clouse, D. S., Bajracharya, M., Matthies, L. H. and Roumeliotis, S. I. (2005). Slip Compensation for a Mars Rover. IEEE/RSJ Int. Conf. Intelligent Robots and Systems, pp. 2806-2813.

Iagnemma, K. and Dubowsky, S. (2004). Mobile Robots in Rough Terrain: Estimation, Motion Planning and Control with Application to Planetary Rovers, vol. 12. Springer-Verlag, Berlin, Heidelberg.

Le, A. T., Rye, D. C. and Durrant-Whyte, H. F. (1997). Estimation of Track-Soil Interactions for Autonomous Tracked Vehicles. IEEE Int. Conf. Robotics and Automation, vol. 2, pp. 1388-1393.

Le, A. T. (1999). Modelling and Control of Tracked Vehicles. PhD Thesis, University of Sydney.

Nister, D., Naroditsky, O. and Bergen, J. (2004). Visual Odometry. IEEE Computer Society Conf. Computer Vision and Pattern Recognition, vol. 1, pp. 652-659.

Ojeda, L., Cruz, D., Reina, G. and Borenstein, J. (2006). Current-Based Slippage Detection and Odometry Correction for Mobile Robots and Planetary Rovers. IEEE Trans. Robotics, vol. 22, pp. 366-378.

Schonemann, P. H. and Carroll, R. M. (1970). Fitting One Matrix to Another under Choice of a Central Dilation and a Rigid Motion. Psychometrika, vol. 35, pp. 245-255.

Shi, J. and Tomasi, C. (1994). Good Features to Track. IEEE Int. Conf. Computer Vision and Pattern Recognition, pp. 593-600.

Song, X., Song, Z., Seneviratne, L. D. and Althoefer, K. (2008). Optical Flow-Based Slip and Velocity Estimation Technique for Unmanned Skid-Steered Vehicles. IEEE/RSJ Int. Conf. Intelligent Robots and Systems, pp. 101-106.

Wong, J. Y. (1978). Theory of Ground Vehicles. New York: Wiley.