Computers and Electronics in Agriculture 25 (2000) 11–28
Ground based sensing systems for autonomous agricultural vehicles
T. Hague, J.A. Marchant, N.D. Tillett
Silsoe Research Institute, Wrest Park, Silsoe MK45 4HS, UK
Abstract
This paper examines ground based (as opposed to satellite based) sensing methods for vehicle position fixing. Sensors are considered in various categories: motion measurement (odometry, inertial), artificial landmarks (laser positioning, millimetre wave radar), and local feature detection (sonar, machine vision). Particular emphasis is placed on technologies which have proven successful beyond the field of agriculture, and on machine vision because of its topicality. The importance of sensor fusion, using a sound theoretical framework, is emphasised. The most common technique, the Kalman filter, is outlined and practical points are discussed. As an example system, the autonomous vehicle developed at Silsoe Research Institute is described. This vehicle does not use an absolute positioning system; rather, it navigates using local features, in this case the crop plants. It uses a sensor package that includes machine vision, odometers, accelerometers and a compass, with sensor fusion accomplished using an extended Kalman filter. © 2000 Elsevier Science B.V. All rights reserved.
Keywords: Sensors; Navigation; Localisation; Autonomous vehicles; Data fusion; Kalman filter
1. Introduction
1.1. Background
The concept of fully autonomous agricultural vehicles is far from new; examples of early 'driverless tractor' prototypes using leader cable guidance systems, dating back to the fifties and sixties, attest to this (Morgan, 1958). Whilst interest in automated agricultural vehicles has endured, it has done so in relative isolation from the field of mobile robotics.
Research in mobile robotics also dates back to the sixties, when the robot 'Shakey' (Nilsson, 1969) at Stanford Research Institute navigated, using machine vision, through a set of carefully prepared rooms. Research has for the most part concerned itself with small indoor laboratory robots, operating in relatively well controlled environments. In the thirty years which have elapsed since then, the localisation or position fixing problem has been an important research topic, and the Kalman filter has emerged as a method for its solution. Despite the different environment and conditions, this approach is equally applicable to the navigation of autonomous agricultural vehicles, as will be illustrated.
1.2. Characteristics of the agricultural localisation problem
The agricultural environment offers a very different set of circumstances to those encountered by a laboratory mobile robot. The lack of clutter typically present in the indoor environment simplifies the problem in one respect; however, a number of additional complications are raised:
- The operating areas are large.
- Ground surfaces may be uneven.
- Depending on the operation, wheel slippage may be far from negligible.
- Cultivation operations may interfere with underground cables, etc.
- Environmental conditions (rain, fog, dust, etc.) may affect sensor observations.
- Low cost systems are required.
1.3. Sensor systems
Sensors for automated systems can be separated into two categories. In one group are those sensors which measure the internal state of the system, e.g. encoders reporting the joint angles of a robot arm. For a vehicle which is free ranging within its environment, such sensors are largely restricted to motion and acceleration measurement. The second group are those sensors which make external observations; these can further be divided by usage into those which give observations of artificial navigational landmarks (beacons), and those which offer observations of local environment features of importance for specific tasks, e.g. crop plant locations for a vehicle which follows plant rows.
1.4. Aims and scope of this paper
We begin with a limited review of sensing methods applicable to agricultural vehicle position fixing. It is not our intention to give a full review of previous research on automatic vehicle guidance, for which there are several papers available, e.g. (Warner, 1965; Tillett, 1991a). We give particular attention to technologies which have proven successful beyond the field of agriculture. Some sensing methods are omitted, namely those which are largely of historical interest, and satellite based positioning systems, which are described elsewhere in this issue.
In Sections 2–4, various sensing methods are discussed, organised according to the categories set out in Section 1.3. In Section 5, we address the fusion of disparate types of data from multiple sensors to maintain a reliable estimate of the vehicle position or state. Section 6 briefly describes an example system which demonstrates the application of these methods. Our concluding comments comprise Section 7.
2. Motion measurement
2.1. Odometry
Odometers are a much criticised, although widely used, sensor. The rotation of one or more ground wheels is used to measure vehicle motion. The appeal of odometry is that it is simple and low cost. The disadvantage is susceptibility to errors from a variety of sources, including wheel slippage and variation in the effective wheel rolling radius and the track width of the wheels. These errors are cumulative, giving a rapid loss in positional accuracy. Conventional wisdom suggests that odometric measurements should not be taken from driven wheels but from separate odometry wheels, to reduce the effects of slip. For hard floor surfaces, thin odometry wheels are ideal to give a close approximation to a point contact. In the agricultural environment, this may be impractical: separate odometry wheels must have a contact point on the ground which is collinear with that of the driven wheels, or they must be castored, resulting in an awkward or vulnerable mechanical arrangement. Odometry wheels carrying little vertical load will be adversely affected by the rough soil surface, and may accumulate dirt, enlarging the effective radius. Various measures have been devised to reduce, or compensate for, systematic odometric errors in laboratory robot vehicles (Borenstein, 1996). It is important to realise that the physical processes involved in wheel slippage are different for vehicles on a soil surface, where the slip is largely movement in the soil beneath the wheels; thus models of odometric measurement for agricultural vehicles need further attention.
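As an illustration, a minimal sketch (not code from the paper; the wheel radius, track width and encoder resolution are assumed example values) of the usual differential odometry update, which recovers planar vehicle motion from incremental wheel encoder counts:

```python
import math

# Assumed example parameters, not taken from the paper.
WHEEL_RADIUS = 0.25    # effective rolling radius (m); varies with dirt and load
TRACK_WIDTH = 1.0      # lateral separation of the odometry wheels (m)
TICKS_PER_REV = 512    # encoder counts per wheel revolution

def odometry_update(x, y, theta, left_ticks, right_ticks):
    """Advance the pose (x, y, theta) from incremental encoder counts.

    Any error in WHEEL_RADIUS or TRACK_WIDTH, and any wheel slip,
    accumulates without bound -- the cumulative drift discussed above.
    """
    dl = 2.0 * math.pi * WHEEL_RADIUS * left_ticks / TICKS_PER_REV
    dr = 2.0 * math.pi * WHEEL_RADIUS * right_ticks / TICKS_PER_REV
    ds = 0.5 * (dl + dr)               # distance moved by the vehicle centre
    dtheta = (dr - dl) / TRACK_WIDTH   # change in heading
    # Midpoint approximation to the heading over the interval.
    x += ds * math.cos(theta + 0.5 * dtheta)
    y += ds * math.sin(theta + 0.5 * dtheta)
    return x, y, theta + dtheta
```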
2.2. Inertial sensors
Inertial sensors, i.e. accelerometers and gyroscopes, have been used in a number of vehicle applications (Schönberg et al., 1996; Nebot et al., 1997b) as an alternative to odometers for dead reckoning. In addition to avoiding the difficulty of modelling odometric errors, inertial systems have the advantage that they can be produced in an encapsulated and robust package. Unfortunately, both the zero point and the output scale of these types of sensor are prone to thermal drift (Barshan and Durrant-Whyte, 1993) to a varying degree depending upon the type of the sensor (and inevitably linked to cost). Moreover, accelerometers are sensitive not only to accelerations of the vehicle, but also to the
gravitational acceleration; therefore estimation of attitude for the full three rotational degrees of freedom is necessary, even when heading in the horizontal plane alone is of interest. Since the accelerations experienced by an autonomous vehicle in normal operation are typically quite small, the effect of bias point drift and inclination errors can be significant. When inertial sensor data are integrated to give position and orientation (once for gyros, and twice in the case of accelerometers), these sources of error can lead to rapid positional drift.
2.3. Other sensors for dead reckoning
Grouped together here are a variety of miscellaneous methods for measuring the heading and speed of a vehicle, which may be used to provide a short-term stable navigation capacity. Rotational motion can be determined by observing distant features (classically, celestial bodies), a principle exploited in the sun sensor used on vehicles for unmanned planetary exploration (Cozman and Krotkov, 1995). Alternatively, a correlation approach can be used to match topographic features of the horizon (Stein and Medioni, 1995). The use of a geomagnetic compass is also possible; a three axis fluxgate magnetometer allows an implementation with no moving parts. The difficulties in using such a compass are twofold. First, since the Earth's magnetic field acts in a horizontal plane only at the equator, at other locations the vehicle roll and pitch must be known to obtain an accurate heading angle. Second, local magnetic fields and magnetic materials (e.g. nearby electric motors and vehicle steel work) adversely affect the readings. Vehicle ground speed is commonly measured using Doppler effect sensors. Signals in the microwave range penetrate crop and stubble to give a true ground speed. Because the frequency shift is proportional to speed, the shift is small and slow to measure accurately at low vehicle speeds, so Doppler effect sensors have the limitation of a low update rate.
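To illustrate the first difficulty, the following sketch applies the standard tilt compensation for a three axis fluxgate magnetometer; it assumes roll and pitch are supplied by another sensor or estimator, and the axis sign conventions are illustrative:

```python
import math

def compass_heading(mx, my, mz, roll, pitch):
    """Tilt-compensated heading from three axis magnetometer readings.

    mx, my, mz are field components in the vehicle frame; roll and
    pitch (radians) must be known from elsewhere. Sign conventions
    depend on the chosen axis definitions.
    """
    # Rotate the measured field back into the local horizontal plane.
    xh = (mx * math.cos(pitch)
          + my * math.sin(roll) * math.sin(pitch)
          + mz * math.cos(roll) * math.sin(pitch))
    yh = my * math.cos(roll) - mz * math.sin(roll)
    return math.atan2(-yh, xh)  # heading relative to magnetic north
```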
2.4. General properties of dead reckoning systems
In summary, dead reckoning sensors suffer from a common limitation. This is not so much a failing of the sensors themselves as of the method; since motion information is integrated to give position, any small bias in the sensor output will accumulate and result in positional drift. However, odometers and inertial systems have a high bandwidth, and are fairly reliable in the short term.
3. Artificial landmarks
3.1. Laser positioning systems
A variety of laser based positioning systems are possible. A simple configuration uses three or more detectors positioned around the workspace. A vehicle mounted
laser is swept in a horizontal plane. The time at which the beam is detected is communicated to the positioning system, which uses triangulation to find the location of the vehicle. This system has the disadvantage of requiring a communication link between the vehicle and the (fixed) detectors. The GEC/Caterpillar free-ranging industrial AGV uses a horizontally scanned laser to determine the bearing to a number of reflective beacons, each of which carries a bar code for identification. The bearing measurements are used to estimate vehicle location. Systems which rely on a vehicle mounted laser have a significant drawback when used on rough terrain; tilt of the vehicle from the horizontal may cause the laser beam to miss the targets, unless the beam is diverged vertically. This divergence seriously reduces the practical operating range, output power being limited by the need for the system to be eye-safe. An alternative is to fix the laser or lasers in the field, and mount the detector or retroreflective beacon on the vehicle. Such systems have been used by Shmulevich et al. (1989); Matsuo et al. (1997) automated a small tractor using a commercially available total station surveying tool in this way.
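As an illustration of the triangulation involved (a generic least squares sketch, not the GEC/Caterpillar algorithm; the beacon coordinates are assumed example values), vehicle position and heading can be recovered from the bearings, measured relative to the vehicle heading, of three or more identified beacons at known locations:

```python
import numpy as np
from scipy.optimize import least_squares

# Known beacon positions (m); illustrative values.
BEACONS = np.array([[0.0, 0.0], [100.0, 0.0], [50.0, 80.0]])

def residuals(pose, bearings):
    """Predicted minus measured beacon bearings, wrapped to [-pi, pi]."""
    x, y, theta = pose
    predicted = np.arctan2(BEACONS[:, 1] - y, BEACONS[:, 0] - x) - theta
    err = predicted - bearings
    return np.arctan2(np.sin(err), np.cos(err))

def fix_position(bearings, initial_guess):
    """Least squares estimate of (x, y, heading) from vehicle-relative bearings."""
    return least_squares(residuals, initial_guess, args=(bearings,)).x

# e.g. fix_position(measured_bearings, np.array([50.0, 40.0, 0.0]))
```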
3.2. Millimetre wave radar
Millimetre wave radar has been used on a large AGV for cargo handling (Durrant-Whyte, 1996); the radar is scanned horizontally, and measures range and bearing to a set of trihedral aluminium reflectors. The reflectors may be covered by a polarising grating to enable discrimination from other objects. Conceptually similar to the laser based systems, this radar system has the advantage that a range of a few hundred metres can be achieved, despite the beam being diverged vertically to accommodate a small degree of vehicle tilt. Millimetre wave radar is also less susceptible to climatic disruption than optical sensors. The major disadvantage is that suitable radar systems are not readily available, and require considerable expertise to construct.
3.3. General properties of beacon based methods
Beacon based methods rely upon observations of a set of targets, the locations of which are known a priori. This requires some care in beacon installation. The sensor observations are a direct function of position, and so are not prone to the drift that affects dead reckoning systems. Complications arise since a set of observations may be ambiguous (if the beacons are not uniquely identifiable), and unreliable because of both false detections and failure to detect obscured beacons. The rate at which observations are available is generally low, one scan per second being typical for laser and radar systems.
4. Local feature detection
4.1. Sonar
The use of sonar for navigation, mapping and obstacle avoidance in indoor environments has received much study. Interest has been stimulated by the availability of the inexpensive Polaroid 6500 ultrasonic ranging unit, originally devised as a camera auto-focus device. The challenging problem of interpreting sonar data has attracted academic interest, further fuelled by the tantalising observation that in nature, bats appear to find little difficulty in navigating by sonar.¹ Some significant results on sonar interpretation for the indoor environment are the certainty grid approach (Elfes, 1987) and the method of identifying and tracking geometric features (Leonard and Durrant-Whyte, 1992; Kleeman and Kuc, 1995). The large corpus of literature on sonar data interpretation deals almost exclusively with the indoor environment, where features behave as specular reflectors at the wavelengths in question (around 10 mm). In an outdoor environment most surfaces, particularly those of natural objects, do not behave in this way. The signal returned from natural, diffusely reflecting surfaces is of much smaller amplitude than that from a smooth reflecting surface such as a laboratory wall. Further difficulties in the outdoor environment include air movement and ambient ultrasonic noise, a great deal of which may be generated by an agricultural vehicle. Recently there has been interest in an alternative type of sonar sensor, originally designed as an aid for the blind. Rather than directly measuring the time of flight of a burst of ultrasonic sound (as the Polaroid unit does), a constant transmission frequency modulated (CTFM) system is used; sound is emitted continuously, at a swept frequency. The frequency difference between detected echoes and the currently emitted sound gives a measure of range; in the blind aid, this is presented as an audio frequency signal to the user. One useful property of this system is that it gives not only the range to the nearest object; multiple reflecting surfaces all contribute to the frequency spectrum of the output. Using FFT analysis of the returned signal, and subsequent classification by a neural network, Harper and McKerrow (1997) claim that four different species of plant may be discriminated under laboratory conditions.
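The CTFM range measurement principle can be illustrated with a short sketch (the figures are illustrative, not the blind-aid hardware parameters): an echo from range r returns after a delay of 2r/c, during which the emitted frequency has advanced by the sweep rate times that delay, so each reflecting surface maps to a beat frequency proportional to its range.

```python
C = 343.0          # speed of sound in air (m/s); temperature dependent
SWEEP_RATE = 50e3  # assumed transmitter sweep rate (Hz per second)

def range_from_beat(beat_hz):
    """Range of one reflector from its beat frequency in a CTFM sonar."""
    # Echo delay is 2r/C; the emitted frequency advances by
    # SWEEP_RATE * delay over that time, giving the beat frequency.
    return C * beat_hz / (2.0 * SWEEP_RATE)

print(range_from_beat(1000.0))  # a 1 kHz beat corresponds to about 3.4 m
```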
4.2. Machine vision
Machine vision is now a well established part of engineering research and is finding its way into practical applications. The principles can be found in a number of texts, e.g. Davies (1990); here we focus on the application of machine vision to agricultural engineering, with vehicles particularly in mind.

¹ It is worth noting that the bat is one of few creatures which must use sonar because of its nocturnal habit; the majority of other animals are vision based. There is perhaps an important message here.
There are a number of mass market applications for imaging technology (for example personal computing, multimedia, home video, surveillance, digital photography), all of which drive costs down, making the technology attractive for agriculture. Useful characteristics include:
- Sensing is non contact.
- A large amount of information is collected quickly.
- The potential exists to be both cheap and powerful.
Whereas some difficulties are:
- Storing and processing the data.
- Extracting usable information from images.
- Dealing with natural objects.
- Operating under natural lighting conditions.
The advantages are significant where delicate moving objects are involved, e.g. inspection of produce (Yang and Marchant, 1996), monitoring of animals (Schofield and Marchant, 1996) or treatment of crops (Brivot and Marchant, 1996). For guidance of vehicles there is an important advantage over the sensors of the previous sections: as well as measuring vehicle position, machine vision can measure the positions of objects in the scene. It can also potentially say something about those objects, e.g. whether they are plants or weeds (Brivot and Marchant, 1996; Benlloch et al., 1997).
The first difficulty is diminishing constantly as memory density and processor power increase and costs reduce. However, expectations in terms of image resolution, frame rate, and modality (monochrome, colour, non-visible bands, etc.) are likely to increase to absorb the extra capability. The last three difficulties can be extremely severe (Tillett, 1991b) but may be tractable, e.g. (Pla et al., 1993). The problem is compounded by the fact that humans and other animals seem to have little difficulty in interpreting scenes, even natural outdoor ones. Thus the layman, and many scientists and engineers outside the field, do not appreciate that there is a problem. This can lead to over expectation for the technology, with potential for disillusionment and inappropriate dismissal. In short, the integration of machine vision into practical agriculture may take some time to arrive, but it will be worth the wait.
Particular problems with machine vision from vehicles include camera (i.e. vehicle) motion. There can be two sources of blurring. Firstly, with a normal analogue output camera, there may be a gross misregistration between the two fields that make up an interlaced frame. Using typical figures for field rate and vehicle speed this can represent about 30 mm on the ground, a figure that is significant compared with, say, the size of a weed. The simple solution is to use only one type of field (even or odd), but this reduces the resolution of the system. However, the theoretical resolution may not be usable in practice anyway because of camera vibration. With more modern digital output cameras the scan can be non-interlaced, as they are not tied to an existing TV standard, but they are expensive and may remain so until there is a mass market for them. Secondly, blurring is caused by the motion during the integration time of the sensor. This is much smaller and can be significantly reduced at agricultural vehicle speeds by using an electronic shutter. Natural lighting provides at least one advantage in that there is normally plenty of it. Shutter speeds of one millisecond or less are achievable without special cameras.
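The figures quoted above can be checked with simple arithmetic (a sketch; the 1.5 m/s vehicle speed and 50 Hz field rate are assumed, representative values):

```python
def interlace_misregistration_mm(speed_m_s, field_rate_hz=50.0):
    """Ground shift between the two fields of an interlaced frame."""
    return 1000.0 * speed_m_s / field_rate_hz

def integration_blur_mm(speed_m_s, shutter_s=1e-3):
    """Ground blur accumulated during the sensor integration time."""
    return 1000.0 * speed_m_s * shutter_s

print(interlace_misregistration_mm(1.5))  # 30 mm, the figure quoted above
print(integration_blur_mm(1.5))           # 1.5 mm with a 1 ms shutter
```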
A further problem is the large range of light intensities that can be found in an outdoor scene. Temporal changes can occur over a relatively long time scale (cloudy vs. sunny day), or a short one (small clouds covering the sun on a windy day). Automatic control of the lens aperture or sensor integration time may be possible, but it is not obvious which scene property to measure for control. Average or spot intensity measurements, as used in automatic cameras, may not preserve the features in the scene that are important for automatic image analysis. Spatial intensity changes can be very large (bright sunshine to deep shadow in the same scene) and cannot be accommodated by exposure control. They can also be well outside the range of a CCD image sensor. New sensor technologies having a wider dynamic range may provide a solution to the contrast problem.
The difficulties of automatic interpretation of natural scenes have been dealt with elsewhere (Tillett, 1991b). Briefly, in addition to natural lighting, problems arise from the non-uniform and unpredictable nature of the objects, the difficulty of controlling their presentation to the camera, overlapping objects, and so on. It is extremely difficult to develop a vision algorithm to interpret natural scenes reliably. This means that, despite machine vision being a useful and powerful sensing method, it must be combined with other sensors in a proper framework if it is to be used successfully to guide vehicles.
4.3. General properties of local feature based systems
Navigation on the basis of local features differs from beacon based navigation in that the feature locations are not known precisely a priori. A possible, though complex, solution is to estimate vehicle position and incrementally construct a map of the features at the same time. Later in this paper, we describe an alternative approach which relies upon the principle that the absolute position of a vehicle is relatively unimportant; it is position relative to the objects with which the machine interacts that is of consequence.
5. Sensor data fusion
5.1. Multi sensor systems
In the absence of an ideal sensor for localisation, it is necessary to use sensors in combination in order to achieve acceptable results. Dead reckoning systems (odometry or inertial) and landmark based sensing systems (whether the landmarks be natural features, beacons or satellites) have complementary strengths: the high frequency response of dead reckoning can be combined with the low frequency, but drift free, response of landmark based systems.
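This complementarity can be illustrated with a deliberately simple one dimensional sketch (illustrative only; the Kalman filter of the next section performs this weighting in a principled way):

```python
def fuse(position, odo_increment, landmark_fix=None, gain=0.2):
    """One cycle of a simple complementary estimator for 1-D position.

    odo_increment: high-bandwidth dead reckoning motion since the last cycle.
    landmark_fix: slower, noisy but drift free absolute fix, when available.
    gain: how strongly a fix pulls the estimate back (0..1).
    """
    position += odo_increment        # integrate motion: smooth, but drifts
    if landmark_fix is not None:     # occasional absolute observation
        position += gain * (landmark_fix - position)
    return position
```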
5.2. The Kalman filter
The Kalman filter (KF) (Kalman, 1960; Bar-Shalom and Fortmann, 1988) provides a sound theoretical framework for multi-sensor data fusion. Kalman filters have been widely used in the navigation of missiles (Hostetler and Andreas, 1983), space probes (Campbell et al., 1983) and robot vehicles. The Kalman filter is a fundamental part of the positioning systems of both the GEC/Caterpillar industrial AGV and the cargo handling AGV (Durrant-Whyte, 1996) mentioned earlier; moreover, it is central to GPS receiver operation (Glazer, 1980). The approach relies upon tracking the position of the vehicle (or more generally, the state of the system) over time. Once the filter has been initialised, it is never again necessary to locate the vehicle from scratch; rather, the vehicle is located with the benefit of an estimate of the last known position.
It is clearest to begin with the case of a linear time invariant system, for which the Kalman filter is an optimal state estimator. In reality, the kinematics of most practical vehicles are non-linear, and so a modified form, the extended Kalman filter or EKF, is used. Consider a linear system where the dynamics of state x can be modelled by a discrete time state transition equation of the form:
$$x(k+1) = A\,x(k) + B\,u(k) + v(k) \tag{1}$$
where A and B are constant matrices, and u(k) is a vector of inputs to the system at time step k. A sensing process which provides observations of some navigational features can be modelled:
$$y(k+1) = H\,x(k+1) + w(k+1) \tag{2}$$
here H is another constant matrix. The vector quantities v(k) and w(k) represent disturbance inputs, which are assumed to be Gaussian and white. In the absence of perfect sensors, the true state x(k) cannot be known exactly; the best that can be achieved is to maintain an estimate. Let that estimate be denoted by $\hat{x}$. Further, let the notation $\hat{x}(k|j)$ represent the estimate of state x at time step k, based upon sensor data up to and including that at time step j. The filter functions in a predict/correct fashion. At time step k+1, a prediction $\hat{x}(k+1|k)$ is first made by propagating the old estimate $\hat{x}(k|k)$ through the process model Eq. (1), i.e.
$$\hat{x}(k+1|k) = A\,\hat{x}(k|k) + B\,u(k) \tag{3}$$
In order to represent the reliability of state estimate $\hat{x}$, its covariance P is also maintained:
$$P(k+1|k) = A\,P(k|k)\,A^{T} + Q \tag{4}$$
where Q is the covariance of disturbance input v. As a practical example, Eqs. (3) and (4) might be used to predict the new position of a vehicle based upon the last known position and its speed (perhaps from odometric measurements). The important point here is that the old state
estimate is still a valuable source of information a small instant later in time, provided that something is known of the dynamics of the state over the intervening period. This prediction $\hat{x}(k+1|k)$ can now be improved by incorporating new sensor observations of the landmark objects, as described in Eq. (2). These observations may be incorporated to give the corrected estimate $\hat{x}(k+1|k+1)$ using the following equations:
$$P^{-1}(k+1|k+1)\,\hat{x}(k+1|k+1) = P^{-1}(k+1|k)\,\hat{x}(k+1|k) + H^{T}R^{-1}\,y(k+1) \tag{5}$$
$$P^{-1}(k+1|k+1) = P^{-1}(k+1|k) + H^{T}R^{-1}H \tag{6}$$
Here R is the covariance of w, or the observation noise. This correction process can be expressed in various (algebraically equivalent) ways; the equations given above are the less common inverse covariance form, i.e. expressed in terms of the inverse of the covariance matrix. This is known as the information matrix, and $P^{-1}(k+1|k+1)$ is a measure of the information content, or accuracy, of estimate $\hat{x}(k+1|k+1)$. Thus Eq. (5) combines the state prediction with the new landmark observations by summation, weighted according to the information content of each source. As might be expected, the information matrix of the new estimate is the sum of the information provided by the two sources. It should be noted that Eqs. (5) and (6) are nothing more than the recursive form of the least squares estimator.
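For concreteness, Eqs. (3) through (6) translate directly into code; the following is a generic sketch in the inverse covariance form given above (not the authors' implementation), using numpy:

```python
import numpy as np

def kf_predict(x, P, A, B, u, Q):
    """Eqs. (3) and (4): propagate the state estimate and its covariance."""
    x_pred = A @ x + B @ u
    P_pred = A @ P @ A.T + Q
    return x_pred, P_pred

def kf_correct(x_pred, P_pred, y, H, R):
    """Eqs. (5) and (6): the inverse covariance (information) form update."""
    P_inv = np.linalg.inv(P_pred)
    R_inv = np.linalg.inv(R)
    info = P_inv + H.T @ R_inv @ H                       # Eq. (6)
    P_new = np.linalg.inv(info)
    x_new = P_new @ (P_inv @ x_pred + H.T @ R_inv @ y)   # Eq. (5)
    return x_new, P_new
```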
5.3. Extensions and practicalities

5.3.1. Validation of observations: the correspondence problem
Since the KF tracks the state estimate over time, it is possible to use the prior estimate of the state at time k+1, obtained from Eq. (3), in Eq. (2) to determine the expected sensor observations. This can be used both to resolve ambiguities over, say, which of a set of beacons has given rise to a given observation, and to test the likelihood of an observation being valid. A χ² test can be applied to reject outlying observations.
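Such a validation gate might be sketched as follows (a generic formulation using the innovation and its covariance, rather than the information form above; the 95% level matches that used in Section 6.3):

```python
import numpy as np
from scipy.stats import chi2

def validate(y, x_pred, P_pred, H, R, confidence=0.95):
    """Accept an observation only if its innovation passes a chi-squared gate."""
    innovation = y - H @ x_pred              # observed minus expected
    S = H @ P_pred @ H.T + R                 # innovation covariance
    d2 = innovation @ np.linalg.inv(S) @ innovation
    return d2 <= chi2.ppf(confidence, df=len(y))
```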
5.3.2. Asynchronous updates
For many sensing systems, e.g. a rotating laser scanner, the observations arrive asynchronously rather than at regular time intervals. There is nothing in the formulation of the KF given above which requires that the predict/correct process be performed at a fixed time step. For observations which arrive asynchronously, the prediction can be made when an observation arrives, on the basis of the time at which the observation was taken.

5.3.3. Time varying and non-linear systems
Although the KF has been described for the simple case of a time invariant linear system, some generalisations are possible. The matrices A, B, H, R and Q may be time varying. Moreover, it is possible to use the KF in an extended form (the EKF) to estimate the state of a non-linear system. Consider replacing Eq. (1) with a general non-linear model:
$$x(k+1) = F(x(k), u(k), k) + v(k) \tag{7}$$
It is simply a matter of applying a Taylor series expansion to Eq. (7), truncated to first or second order terms, to give a form which can be used in the same way as a linear model. The filter is no longer optimal, but is sufficient for a very wide range of applications. The same principle applies also to a non-linear observation model. In autonomous vehicle navigation problems, this extended form is almost always necessary.
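As a sketch of this linearisation (generic, truncated to first order terms): the Jacobian of F, here approximated by finite differences, takes the place of A in the covariance prediction of Eq. (4):

```python
import numpy as np

def jacobian(F, x, u, k, eps=1e-6):
    """Numerical Jacobian of F with respect to the state x."""
    n = len(x)
    J = np.zeros((n, n))
    fx = F(x, u, k)
    for i in range(n):
        dx = np.zeros(n)
        dx[i] = eps
        J[:, i] = (F(x + dx, u, k) - fx) / eps
    return J

def ekf_predict(F, x, P, u, k, Q):
    """EKF prediction: non-linear state propagation, linearised covariance."""
    A = jacobian(F, x, u, k)               # stands in for A in Eq. (4)
    return F(x, u, k), A @ P @ A.T + Q
```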
5.3.4. Other forms of noise
In Section 5.2, the disturbance inputs v(k) and w(k) are assumed to be Gaussian and white. This is a fundamental assumption in the derivation and proof of the Kalman filter and its optimality for linear systems. The noise assumption is often cited as a limitation of the Kalman filter approach. In reality, of course, Gaussian white noise exists only in mathematical texts. Fortunately there are means of dealing with sensor data corrupted by coloured (temporally correlated) noise. The output of an accelerometer, for example, is typically corrupted by an offset error which varies slowly with time (because of thermal drift). For accurate results, it is common to estimate the current value of this offset by incorporating it as an extra element of the state vector. In more complex cases, frequency domain techniques can be used to separate data from temporally correlated noise, for example in the processing of GPS data (Nebot et al., 1997a; van Bergeijk et al., 1998).
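As an illustrative sketch of such state augmentation (the matrices are example values, not the authors' model): a one dimensional speed estimate driven by a biased accelerometer, with the slowly varying bias appended to the state as a random walk, so that the filter of Section 5.2 estimates it alongside the speed.

```python
import numpy as np

dt = 0.02  # assumed sample interval (s)
# State [speed, bias]: the measured acceleration is the true value plus
# a slowly varying bias, so integrating (measured - bias) gives speed.
A = np.array([[1.0, -dt],     # speed += dt * (measured acceleration - bias)
              [0.0, 1.0]])    # bias modelled as a random walk
B = np.array([[dt],
              [0.0]])         # input u = measured acceleration
H = np.array([[1.0, 0.0]])    # e.g. an odometer observes speed only
Q = np.diag([1e-3, 1e-6])     # only slow drift permitted in the bias
# With these matrices, the kf_predict/kf_correct sketch of Section 5.2
# estimates the bias alongside the speed, correcting the thermal drift.
```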
6. Example system
6.1. Background
Some of the principles outlined in previous sections have been embodied in an autonomous vehicle developed at Silsoe Research Institute (Fig. 1). A brief account will be given here in order to illustrate some of the points already made; full details can be found in previous publications (Hague and Tillett, 1996; Brivot and Marchant, 1996; Hague et al., 1997). The vehicle has been developed in the context of precise treatment of plants, in particular for crop protection. Chemicals may be significantly reduced in quantity if they can be targeted in the most effective way. It may also be possible to abandon chemicals in some operations if mechanical treatments can be performed accurately. Accurate treatment on a highly localised scale may open up the possibility of (or even require) a careful application method that could be slower than existing treatments. In this case, an autonomous vehicle would be advantageous as it would remove the need to have a driver on board for long periods. Our vehicle is not
meant to be in any way a prototype for a commercially viable machine. Rather, it was developed as a test bed to investigate the control and sensing problems involved in autonomous operation of agricultural field machinery. We have chosen to start by attempting to treat a transplanted cauliflower crop. Like many other crops, it is grown in well defined rows. To navigate successfully, determination of vehicle position in an absolute reference frame is not necessary. In fact, navigation with respect to landmarks in the vehicle's environment will suffice. In our case the landmarks are the cauliflower rows themselves. Machine vision has been adopted both to locate the crop rows for navigation purposes, and to provide data for treatment targeting.
6.2. Machine vision
In order to differentiate between vegetation (plants and weeds) and soil, we exploit the fact that vegetation is much more reflective in the near infra red (around 800 nm) than soil. CCD cameras are sensitive beyond 1000 nm, and often contain filters to block radiation outside the visible range. If this filter is replaced with one that blocks visible light but passes near infra red, images can be obtained with an enhanced contrast between vegetation and soil. We use a Kodak Wratten filter 89B.
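The resulting segmentation step can be sketched as follows (illustrative; the fixed threshold is an assumption, and would need to adapt to natural lighting in practice):

```python
import numpy as np

def segment_vegetation(nir_image, threshold=128):
    """Boolean vegetation mask from a near infra red image.

    With a visible-blocking, NIR-passing filter, vegetation appears
    markedly brighter than soil. The fixed threshold is a
    simplification; natural lighting would demand an adaptive one.
    """
    return np.asarray(nir_image) > threshold
```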
Fig. 1. The Silsoe experimental autonomous vehicle.
Fig. 2. Example of row finding: (a) original image; (b) located row structure.
Fig. 2a shows a typical scene, in the near infra red, containing cauliflower. The field of view provides only limited information for identifying the row structure which will be used for vehicle guidance. This means that any technique must be especially robust to uncertainty in the data. Uncertainties arise from variability in planting pattern, different plant sizes, missing plants, presence of weeds, image noise because of natural lighting variations and many other causes. To cope with the low information content we make use of both the data and the prior knowledge that is available, and use a robust method to find the row structure in the image. Our method (Marchant and Brivot, 1995) makes use of a specially designed Hough transform (Davies, 1990). The method integrates information over each row and over a number of rows in the image. It also makes use of prior knowledge of row spacing and camera calibration. Fig. 2b shows the located row structure superimposed on the original image.
The differentiation of plants from weeds is an area of continuing research. One approach takes advantage of various observed differences in appearance between the two. For example, the weed patches are of a lower intensity than the plants, and the grey levels within the patches vary more rapidly. A high-pass filter passed over the image gives a higher response in the weed patches than in the plants, and subtracting the high-pass filtered image from the original thus enhances the existing contrast between plants and weeds. In previous work, Brivot and Marchant (1996) have combined this basic idea with some morphological processing (e.g. erosion, dilation, hole filling) to improve the accuracy of plant/weed/soil boundary location. Fig. 3a shows a typical image with the segmentation in Fig. 3b. An alternative approach (Southall et al., 1998) uses the location of detected plant material for classification; features matching the known planting pattern of the crop are assumed to be crop, other features are identified as weeds. For greatest robustness, it is expected that a combined algorithm using both feature location and appearance for classification will be needed.
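The appearance based enhancement just described might be sketched as follows (the Gaussian low-pass filter and its width are assumed details, not the published implementation):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def enhance_plants(nir_image, sigma=2.0):
    """Suppress rapidly varying weed patches relative to plants.

    The high-pass image responds strongly where grey levels vary
    quickly (the weed patches); subtracting it from the original,
    as described in the text, is equivalent to low-pass smoothing,
    which pulls the weed patches towards their lower local mean.
    """
    img = np.asarray(nir_image, dtype=float)
    high_pass = img - gaussian_filter(img, sigma)  # assumed filter width
    return img - high_pass
```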
6.3. Vehicle navigation
For fully autonomous operation, the vehicle must be capable of following plant beds using data from the machine vision system, with sufficient robustness to tolerate patches of missing crop or heavy weed infestation. When the end of the row is reached, the vehicle needs sufficient navigation capability to reposition itself at the head of the next bed without manual intervention. As a consequence of the transient nature of the crop rows, a system which requires detailed prior maps of the plant locations would be impractical.
To avoid the problem of mapping the field on-line, a system has been implemented which does not rely upon absolute positioning. Instead, an alternative coordinate system and means of path description is adopted. The path is composed of a list of segments, each defined by their length and a curvature function. The curvature functions used in practice are linear functions of distance, i.e. the path is composed of clothoid curves (which include circular arcs and straight lines). This allows paths to be constructed where curvature is a piecewise linear function of distance; these smooth paths are advantageous in reducing wheel slip. The segments aligned with crop rows are taken to be nominally straight, and are flagged in the path description as requiring the use of the machine vision system. For vehicle control purposes, all segments are augmented with a required forward speed, as in the sketch below.
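Pose along such a segment can be computed by integrating the curvature function, as in this sketch (an illustrative numerical integration, not the authors' implementation):

```python
import math

def follow_segment(x, y, theta, length, kappa0, kappa1, step=0.01):
    """Integrate pose along one path segment of the kind described above.

    Curvature varies linearly from kappa0 (1/m) at the start of the
    segment to kappa1 at its end; straight lines (0, 0) and circular
    arcs (k, k) are special cases of the clothoid.
    """
    s = 0.0
    while s < length:
        ds = min(step, length - s)
        kappa = kappa0 + (kappa1 - kappa0) * s / length
        theta += kappa * ds          # heading is the integral of curvature
        x += math.cos(theta) * ds    # position integrates the heading
        y += math.sin(theta) * ds
        s += ds
    return x, y, theta
```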
Fig. 3. Example segmentation: (a) original image; (b) segmented image: crop shown in white, weeds grey, soil black.
Fig. 4. Vehicle dead reckoning performance.
Rather than measuring position in a Cartesian coordinate system, the navigation system uses an extended Kalman filter to estimate directly the lateral offset and heading errors from the path, as well as forward distance along the path. To do this requires a fairly complex process model described elsewhere (Hague and Tillett, 1996; Hague et al., 1997). In addition to the machine vision system already summarised, a number of sensors are used to provide a dead reckoning capability. This provides the level of performance needed to bridge any small areas where the vision system cannot successfully locate the crop rows, and to perform a turn at the end of the row, during which the vision system is not available. The sensor package used includes left and right wheel odometers, forward and lateral accelerometers and a three axis fluxgate compass. This package is carefully selected; for the compass data to be accurately interpreted, the roll and pitch angles of the vehicle must be known. The accelerometers are sensitive both to vehicle acceleration proper and to gravitational acceleration as a result of tilt; the odometers register vehicle speed. The process model includes the derivative relationship between speed and acceleration, thus allowing the required roll and pitch angles to be estimated. The accuracy achieved by this system can be seen in Fig. 4, which shows the path taken by the vehicle during a headland turn. The accumulated position error over this turn is only 60 mm.
While travelling down the crop row, the machine vision system provides additional input to the EKF navigator. For greater robustness, the readings provided by the vision system are validated using a χ² test against predictions made on the basis of dead reckoning; data of less than 95% confidence are discarded. This eliminates erroneous data, typically arising where the wrong set of plant rows has been located by the vision system. The benefit of combined use of dead reckoning and machine vision is illustrated in Fig. 5. The three traces show results from a run of the vehicle over a bed of cauliflower; the effect of integrating odometric data is shown alongside the raw machine vision derived offset data. As can be seen, the position estimate derived from odometry alone is subject to unbounded drift, whilst the raw vision data are corrupted with significant levels of noise. Also shown is the output of a Kalman filter combining the two sources of information; the result retains some of the low noise level of the odometric data whilst eliminating the drift.

Fig. 5. Example of sensor data fusion.
7. Conclusions
In conclusion, we have discussed a variety of sensing methods for autonomous vehicle navigation. Advances in autonomous systems in a wide variety of application areas can be drawn upon to provide valuable input to agricultural automation. Certain aspects of the navigation problem which are unique to, or particularly acute in, the agricultural environment require further research, for example terrain-related issues. In the absence of a navigation sensor which is perfect or even adequate on its own, a combination of imperfect sensors must suffice. Sensor data fusion techniques allow state estimates to be extracted from sensor data which are corrupted by noise. Careful selection of sensor packages exploiting the complementary strengths of individual sensors is necessary to allow all the required data to be extracted from an economical sensor suite. Finally, the need for cost effective systems favours sensor types which are (or will be) produced in large volume, but it is unlikely that the market for automated agricultural equipment will be large enough to create sufficient demand alone. Thus devices which have a mass market in other areas (e.g. multimedia and automotive) are the most likely candidates for incorporation into practical systems.
References
Bar-Shalom, Y., Fortmann, T., 1988. Tracking and Data Association. Academic Press, New York.
Barshan, B., Durrant-Whyte, H.F., 1993. An inertial navigation system for a mobile robot. In: IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2243–2248.
Benlloch, J.V., Heisel, T., Christensen, S., Rodas, A., 1997. Image processing techniques for determination of weeds in cereals. In: Bio-Robotics 97, EurAgEng/IFAC, Gandia, Spain, pp. 195–200.
Borenstein, J., 1996. Measurement and correction of systematic odometry errors in mobile robots. IEEE Trans. Robot. Autom. 12 (6), 869–880.
Brivot, R., Marchant, J.A., 1996. Segmentation of plants and weeds for a precision crop protection robot using infrared images. IEE Proc. Vis. Image Signal Process. 143 (2), 118–124.
Campbell, J.K., Synnott, S.P., Bierman, G.J., 1983. Voyager orbit determination at Jupiter. IEEE Trans. Autom. Control 28 (3), 256–268.
Cozman, F., Krotkov, E., 1995. Localisation using a computer vision sextant. In: IEEE International Conference on Robotics and Automation, Nagoya, Japan, pp. 106–111.
Davies, E.R., 1990. Machine Vision: Theory, Algorithms, Practicalities. Academic Press, London.
Durrant-Whyte, H.F., 1996. An autonomous guided vehicle for cargo handling applications. Int. J. Robot. Res. 15 (5), 407–440.
Elfes, A., 1987. Sonar-based real-world mapping and navigation. IEEE Trans. Robot. Autom. 3 (3), 249–265.
Glazer, B.G., 1980. GPS receiver operation. In: Global Positioning System, vol. 1. The Institute of Navigation, pp. 81–86 (red books).
Hague, T., Tillett, N.D., 1996. Navigation and control of an autonomous horticultural robot. Mechatronics 6 (2), 165–180.
Hague, T., Marchant, J.A., Tillett, N.D., 1997. Autonomous robot navigation for precision horticulture. In: IEEE International Conference on Robotics and Automation, Albuquerque, NM, pp. 1880–1885.
Harper, N.L., McKerrow, P.J., 1997. Recognition of plants with CTFM ultrasonic range data using a neural network. In: IEEE International Conference on Robotics and Automation, Albuquerque, NM, pp. 3244–3249.
Hostetler, L.D., Andreas, R.D., 1983. Nonlinear Kalman filtering techniques for terrain-aided navigation. IEEE Trans. Autom. Control 28 (3), 315–322.
Kalman, R.E., 1960. A new approach to linear filtering and prediction problems. J. Basic Eng. Trans. ASME, D 82, 35–45.
Kleeman, L., Kuc, R., 1995. Mobile robot sonar for target localisation and classification. Int. J. Robot. Res. 14 (4), 295–318.
Leonard, J., Durrant-Whyte, H.F., 1992. Directed Sonar Sensing for Mobile Robot Navigation. Kluwer Academic, Kingston-upon-Thames.
Marchant, J.A., Brivot, R., 1995. Real time tracking of plant rows using a Hough transform. Real Time Imaging 5 (1), 363–371.
Matsuo, Y., Yukumoto, O., Aburata, K., Noguchi, N., 1997. Research on tilling robot (part 4): unmanned operation with the total station of automatic tracking type. In: Japanese Society of Agricultural Machinery 56th Annual Meeting, pp. 57–58.
Morgan, K.E., 1958. A step towards an automatic tractor. Farm Mech. 10 (13), 440–441.
Nebot, E., Durrant-Whyte, H.F., Scheding, S., 1997a. Frequency domain modeling of aided GPS with application to high-speed vehicle navigation systems. In: IEEE International Conference on Robotics and Automation, Albuquerque, NM, pp. 1892–1897.
Nebot, E., Sukkarieh, S., Durrant-Whyte, H.F., 1997b. Inertial navigation aided with GPS information. In: Proceedings of the Fourth Annual Conference on Mechatronics and Machine Vision in Practice, Toowoomba, Queensland, Australia, pp. 169–174.
Nilsson, N.J., 1969. A mobile automaton: an application of artificial intelligence techniques. In: First International Joint Conference on Artificial Intelligence, Washington, DC, pp. 509–520.
Pla, F., Juste, F., Ferri, F., 1993. Feature extraction of spherical objects in image analysis: an application to robotic citrus harvesting. Comput. Electron. Agric. 8 (1), 57–72.
Schofield, C.P., Marchant, J.A., 1996. Measuring the size and shape of pigs using image analysis. In: AgEng 96, European Society of Agricultural Engineers, Madrid, Spain.
Schönberg, T., Ojala, M., Suomela, J., Torpo, A., Halme, A., 1996. Positioning an autonomous off-road vehicle by using fused DGPS and inertial navigation. Int. J. Syst. Sci. 27 (8), 745–752.
Shmulevich, I., Zeltzer, G., Brunfeld, A., 1989. Laser scanning method for guidance of field machinery. Trans. Am. Soc. Agric. Eng. 32 (2), 425–430.
Southall, B., Marchant, J.A., Hague, T., Buxton, B.F., 1998. Model based tracking for navigation and segmentation. In: Burkhardt, H., Neumann, B. (Eds.), Proceedings of the Fifth European Conference on Computer Vision, Freiburg, pp. 797–811.
Stein, F., Medioni, G., 1995. Map based localisation using the panoramic horizon. IEEE Trans. Robot. Autom. 11 (6), 892–896.
Tillett, N.D., 1991a. Automatic guidance sensors for agricultural field machines: a review. J. Agric. Eng. Res. 50, 167–187.
Tillett, R.D., 1991b. Image analysis for agricultural processes: a review of potential opportunities. J. Agric. Eng. Res. 50, 247–258.
van Bergeijk, J., Goense, D., Keesman, K.J., Speelman, L., 1998. Digital filters to integrate global positioning system and dead reckoning. J. Agric. Eng. Res. 70, 135–143.
Warner, M.G.R., 1965. The automation of agricultural field work. In: Proc. IMechE Convention on Advances in Automatic Control, 197, 78–84.
Yang, Q., Marchant, J.A., 1996. Accurate blemish detection with active contour models. Comput. Electron. Agric. 14 (1), 77–90.