Copyright © IFAC Intelligent Components and Instruments for Control Applications, Malaga, Spain, 1992
THE USE OF IMAGE PROCESSING IN SATELLITE ATTITUDE CONTROL

D. Croft

EMBL, Meyerhofstrasse 1, 6900 Heidelberg, Germany
Abstract. Accurate orientation control is essential for any communication satellite. One way of achieving this is to point a video camera down at the Earth, and compare the observed position of the Earth with its ideal position. If there are any errors, the satellite's thrusters are activated to correct them. Although this idea is simple in theory, there are many problems in practice. This paper looks at one of those problems: finding the position of the Earth in the image obtained from the camera. This has been broken down into two subproblems: i) locating the Earth/space boundary, and ii) using data gathered from i) to estimate an Earth centre position. Finding the Earth/space boundary is non-trivial, because the Earth is in general only partially illuminated, and there are various forms of noise present in the image. Also, the image processing must be carried out in real time, since it is being used as part of a control system. Given a (generally incomplete) set of points on the Earth/space boundary, finding a centre position is fairly straightforward. Two approaches are described here, and accuracy comparisons made.

Keywords. Boundary detection, circle, domain constraints, image processing, nonlinear least-squares, satellite attitude control, subpixel accuracy, thresholding, tracking.
INTRODUCTION
Global communications systems are coming to rely more and more on satellite links for conveying data. Generally, these satellites are put into geostationary orbits¹. This makes it easy to set up ground stations, since an antenna can be pointed skywards in a fixed direction. However, in order to maintain a good signal-to-noise ratio, the satellite's own antenna must maintain its pointing direction accurately. This means that the orientation (or attitude) of the satellite, relative to the Earth, must be kept as constant as possible.

Attitude control is generally done by observing the Earth and comparing its position with the expected position when the satellite is ideally oriented. If there is a discrepancy, one or more of the satellite thrusters can be activated to reduce it to zero.

Traditionally, this has been done by scanning the Earth in the infra-red, and comparing the East and West extremes of its perimeter with the edges of the image. The problem with this technique is that it uses equipment with moving parts.

An optical method for obtaining Earth position discrepancy data has been proposed (Beard et al., 1990), using a camera on a chip with a wide-angle lens to observe the Earth. Typical Earth images are shown in Figs. 1 and 2. An edge detector would find discontinuities in the image; the discontinuity of most interest is the Earth/space boundary. Data from the edge detector would be fed into a circular matched filter. The strongest point in the filter's output corresponds to the Earth's centre position.

Edge detection and matched filtering over entire images are computationally expensive. It has recently been suggested² that if there were constraints inherent in the domain, it might be possible to simplify the algorithms used. Two significant constraints are explored in this paper: using an annulus to limit the range over which image processing is performed, and using localised thresholding.

The specifications for the attitude control system give maximum pointing errors. Thus, if the system is working properly, the actual region of the image within which the Earth/space boundary will be found is quite small (only a few pixels). Any image processing can, therefore, be confined to an annulus, defined by the maximum pointing errors.

Boundary detection could be done by applying an edge detector within the annulus. However, a much faster method is to use a thresholding tracker to follow around the boundary. Threshold values can be calculated locally, within segments of the annulus, so that gradations in illumination over the Earth's surface can be taken into account.

Centre estimates can be made from points collected from the boundary. Two methods have been tried: a geometrical method, and a nonlinear least-squares method. It is assumed in both cases that the visible portion of the Earth's boundary is a circular arc.

¹ A geostationary orbit is in the equatorial plane, and has a period of exactly one sidereal day, so that the satellite remains above the same point on the Earth's surface at all times.

² By Dr C.N. Duncan, of the Meteorology Department at the University of Edinburgh.

EARTH/SPACE BOUNDARY DETECTION

Introduction

The first step in locating the position of the Earth as seen by the satellite's on-board camera is to find the boundary between the edge of the Earth and deep space. To do this, advantage is taken of the fact that the Earth is lit by the sun, whereas space is nearly black.
Figure 1: METEOSAT full Earth image
Figure 2: METEOSAT partially illuminated Earth image
However, there are a number of problems. Firstly, a perfect circular image of the Earth is almost never available; there is usually some degree of self-shadowing, where only part of the Earth is illuminated. In the worst case, the sun is behind the Earth, and all that can be seen is a thin band where sunlight has diffused through the atmosphere. Secondly, images will be subject to noise. In part, this will be thermal and digitisation noise. However, the camera will also be exposed to damaging radiation, which can result in some pixels being permanently on or permanently off.
Constraining the Problem

As already mentioned, the Earth's centre will never, under normal conditions, drift by more than a few pixels from the centre of the image. Hence, we can construct an annulus within which we can expect to find the Earth/space boundary (see Fig. 3). Any image processing can be confined within this annulus, rather than being applied to the entire image.

This also means that when the visible Earth boundary deviates from the circular (as it will in any partially illuminated images of the Earth), very little of the non-circular portion of the boundary will be included within the annulus. Excluding this portion of the boundary will greatly improve the accuracy with which the centre position can be estimated.
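The annulus constraint amounts to a simple mask over the image. The following minimal sketch (Python/NumPy, not from the paper) shows the idea; the image size, expected Earth radius and pointing-error figures are illustrative assumptions, not values from the system described here.

```python
import numpy as np

def annulus_mask(shape, centre, radius, max_err):
    """Boolean mask selecting only pixels whose distance from the nominal
    Earth/space boundary circle is within the maximum pointing error."""
    rows, cols = np.indices(shape)
    dist = np.hypot(rows - centre[0], cols - centre[1])
    return (dist >= radius - max_err) & (dist <= radius + max_err)

# Illustrative values only: a 256x256 image, an expected Earth radius of
# 100 pixels, and a maximum pointing error of 4 pixels.
mask = annulus_mask((256, 256), centre=(128, 128), radius=100.0, max_err=4.0)
```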
Tracking

Tracking algorithms use local information to follow around discontinuities in an image, e.g. an intensity discontinuity such as the Earth/space boundary discussed in this paper (see Rosenfeld & Kak, 1982, for a more detailed discussion). They can be a faster way of finding edges than conventional mask convolution methods, since they use information already gathered to extend an edge.

Tracking has two phases: scanning and following. During the scanning phase, the system will look along a straight line through the image until it finds a discontinuity that will make a suitable start point. From this start point, the system will attempt to follow along an edge.

As a tracker follows an edge, it marks already tracked points, so that they will not be considered again. In order to grow the track, the pixels around the last point on the track are examined pairwise. If the intensity of one of the pair is above a given threshold (i.e. it is part of the Earth) and the intensity of the other is below the threshold (i.e. it is in space), then the above-threshold point is marked as being on the Earth/space boundary, and becomes the new end of track³. The above procedure is then repeated. The track will thus grow until either i) there are no pairs that meet the "one above threshold and one below threshold" criterion, or ii) a previously tracked point is encountered.
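The edge-following step can be sketched as follows. This is a simplified, single-direction illustration of the pairwise test just described, not the flight implementation; the neighbour ordering, the step limit and the single fixed threshold are assumptions, since the actual system tracks in both directions from each start point with per-spoke thresholds.

```python
import numpy as np

# 8-neighbour offsets around the current end of track, in circular order
NEIGHBOURS = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
              (1, 1), (1, 0), (1, -1), (0, -1)]

def follow_boundary(image, start, threshold, max_steps=2000):
    """Greedy thresholding tracker (simplified sketch).

    Adjacent neighbour pairs around the current end of track are examined;
    when one pixel of a pair is at or above the threshold ("Earth") and the
    other is below it ("space"), the above-threshold pixel becomes the new
    end of track.  Tracking stops when no suitable pair remains or only
    previously tracked pixels qualify.
    """
    image = np.asarray(image)
    tracked = {start}
    track = [start]
    current = start
    for _ in range(max_steps):
        r, c = current
        next_point = None
        for (dr0, dc0), (dr1, dc1) in zip(NEIGHBOURS, NEIGHBOURS[1:] + NEIGHBOURS[:1]):
            p0, p1 = (r + dr0, c + dc0), (r + dr1, c + dc1)
            if not all(0 <= pr < image.shape[0] and 0 <= pc < image.shape[1]
                       for pr, pc in (p0, p1)):
                continue
            v0, v1 = image[p0], image[p1]
            if (v0 >= threshold) != (v1 >= threshold):   # one Earth pixel, one space pixel
                candidate = p0 if v0 >= threshold else p1
                if candidate not in tracked:
                    next_point = candidate
                    break
        if next_point is None:
            break
        tracked.add(next_point)
        track.append(next_point)
        current = next_point
    return track
```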
Spokes and Thresholds

Two questions arise from the previous subsection: how are the straight lines for the first phase of tracking laid down on the image, and how are thresholds for distinguishing Earth from space determined?

The answer to both of these questions is: spokes. A set of evenly spaced spokes, radiating from the centre of the image, is notionally constructed. The portion of each spoke actually used by the system only extends a few pixels each side of the annulus (see Fig. 3).
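A minimal sketch of the spoke construction is given below (Python/NumPy, illustrative only). The count of 32 spokes matches the implementation described next; the extra margin either side of the annulus is an assumption.

```python
import numpy as np

def spoke_pixels(centre, radius, max_err, n_spokes=32, margin=2):
    """Pixel coordinates along each notional spoke (sketch).

    Spokes radiate from the image centre at evenly spaced angles; only the
    portion from a few pixels inside the annulus to a few pixels outside it
    is kept.
    """
    spokes = []
    r_lo, r_hi = radius - max_err - margin, radius + max_err + margin
    radii = np.arange(r_lo, r_hi + 1.0)
    for k in range(n_spokes):
        theta = 2.0 * np.pi * k / n_spokes
        rows = np.round(centre[0] + radii * np.sin(theta)).astype(int)
        cols = np.round(centre[1] + radii * np.cos(theta)).astype(int)
        spokes.append(list(zip(rows, cols)))
    return spokes
```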
Thresholds are selected "locally" for each spoke. A histogram is constructed from the intensities of all the pixels lying under a spoke. Then, the modal method is used for finding a threshold (see Rosenfeld & Kak, 1982); that is, the threshold intensity is selected as the histogram minimum point lying between the two largest peaks on the histogram. As currently implemented, the system has 32 spokes. Starting with spoke 0, scanning is done by looking at the pixels along the spoke sequentially, until an over-threshold pixel is encountered. This pixel becomes the start point for the edge following phase of tracking; the tracker then follows as far as possible in both clockwise and anticlockwise directions from each spoke around the Earth/space boundary. The boundary points gathered in this manner are passed on to the centre locating algorithm.
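A sketch of the modal threshold selection for one spoke follows (Python/NumPy). The bin count and the fallback used when the histogram has fewer than two peaks are assumptions added for illustration.

```python
import numpy as np

def modal_threshold(spoke_pixels, bins=64):
    """Modal threshold selection (sketch): build an intensity histogram for
    the pixels under one spoke, find the two largest peaks, and return the
    intensity at the histogram minimum lying between them."""
    hist, edges = np.histogram(spoke_pixels, bins=bins)
    # local maxima of the histogram
    peaks = [i for i in range(1, bins - 1)
             if hist[i] >= hist[i - 1] and hist[i] > hist[i + 1]]
    if len(peaks) < 2:
        # degenerate histogram: fall back to the mean intensity (assumption)
        return float(np.mean(spoke_pixels))
    lo, hi = sorted(sorted(peaks, key=lambda i: hist[i])[-2:])
    valley = lo + int(np.argmin(hist[lo:hi + 1]))
    return float(edges[valley])   # bin edge at the valley, used as the threshold
```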
Some Embellishments

One of the big advantages of selecting thresholds from local conditions is that the significant intensity variations over a normal Earth image can be accommodated: different threshold values apply along different segments of the boundary. However, because the number of pixels on which the histograms are based is small, noise can cause significant peaks, and erroneous threshold selection can sometimes occur. Another problem is that there can be substantial jumps in threshold value from one spoke to the next. This means that there will be discontinuities in the tracked boundary, possibly giving rise to systematic errors in the centre estimate.

To reduce the effects of local errors, two techniques were investigated: smoothing and interpolation. Threshold smoothing looks at the threshold found for a given spoke, and compares it with the thresholds on the immediately adjacent spokes. If it is intermediate between them, it is left unchanged. If it is greater than both or less than both, then it is adjusted to be the mean of the three threshold values. This tends to reduce the influence of "outlier" threshold values.

Threshold interpolation takes the thresholds on two adjacent spokes, and interpolates the threshold linearly between them. Thus, instead of a sudden step in threshold between two spokes, there is a gradual change. This has the advantage of giving only a single track around the Earth/space boundary.

Smoothing and interpolation can be combined to enhance performance; both are sketched below.
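Both embellishments are easy to state in code. The sketch below (Python/NumPy, illustrative only) assumes the spoke thresholds are stored in angular order and wrap around the annulus; that wrap-around and the radian-based indexing are assumptions, not details from the paper.

```python
import numpy as np

def smooth_thresholds(t):
    """Threshold smoothing (sketch): any spoke threshold that is above or
    below both of its neighbours is replaced by the mean of the three."""
    t = np.asarray(t, dtype=float)
    out = t.copy()
    n = len(t)
    for i in range(n):
        prev, nxt = t[(i - 1) % n], t[(i + 1) % n]   # spokes wrap around the annulus
        if (t[i] > prev and t[i] > nxt) or (t[i] < prev and t[i] < nxt):
            out[i] = (prev + t[i] + nxt) / 3.0
    return out

def interpolated_threshold(t, angle):
    """Threshold interpolation (sketch): linear interpolation between the
    thresholds of the two spokes adjacent to a boundary angle (radians)."""
    t = np.asarray(t, dtype=float)
    n = len(t)
    pos = (angle % (2 * np.pi)) / (2 * np.pi) * n    # fractional spoke index
    i = int(np.floor(pos)) % n
    frac = pos - np.floor(pos)
    return (1.0 - frac) * t[i] + frac * t[(i + 1) % n]
```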
A number of images, both synthetic and "real" (derived from METEOSAT pictures of the Earth), were used to test the boundary detecting algorithms (see Croft, 1991). Here, results from just one of these tests, on a METEOSAT image, will be presented. This image is one of the "worst case" ones, where the sun is behind the Earth, and all that is visible is a thin strip of atmosphere (see Fig. 4).

The results shown in Table 1 give statistical comparisons of the distance of the points actually found on the boundary from the "ideal" circular boundary, centred on the centre of the image (distance values are in pixels). Results from a Sobel edge detector are also included for reference.

In this experiment, threshold interpolation produced a significant improvement in accuracy when compared to the unaided thresholding tracker. Over a larger number of experiments, interpolation plus smoothing tends to give the most accurate results (see Croft, 1991, for more discussion of this).
³ Another approach, which has also been examined, is to look at the difference in intensity between the pixels in a pair, and compare this with a "difference threshold". Only the absolute intensity criterion is discussed in this paper, for the sake of brevity.
Figure 3: Boundary detection using spokes and annulus
Figure 4: METEOSAT test image
Table 1: Effects of Tracking Algorithm on Edge Point Location Accuracy

Boundary pt extraction method    point count   mean pt err   SD pt err
Sobel + modal thresholding           328          0.962        0.626
Thresholding tracker                 135          0.526        0.315
Thresholding tracker + s             140          0.546        0.331
Thresholding tracker + i              48          0.394        0.262
Thresholding tracker + s + i         111          0.466        0.314

(where + s indicates smoothing and + i indicates interpolation)
CENTRE LOCATION

Introduction

To a very good approximation, the Earth is a sphere. Hence, the image of the Earth as seen from a satellite will be circular. Knowing the height of the satellite above the Earth's surface, and the angular field of view of the camera, it is easy to calculate the radius (in pixels) of the Earth as it should appear in the image. Having tracked the Earth/space boundary, we also have a list of points around the circumference of the Earth. In general, these will not span the entire circumference; there will be gaps, small ones due to noise, and large ones due to the Earth's partial illumination.

How can these points be used to locate the Earth's centre? Two techniques are considered in this paper, least squares and geometric.

Least Squares

The Levenberg-Marquardt nonlinear least squares method, as described in Press et al., 1988, can be used in centre location with a few minor modifications. The user must supply a function that, given a value of the independent variable, x, computes:

1. y, and
2. ∂y/∂a_i,

where a_1, a_2, ..., a_N are the N parameters to be found for the nonlinear equation y = f(x; a_1, a_2, ..., a_N).

We can express the equation for a circle as:

    (x - x_c)^2 + (y - y_c)^2 = R^2

where R is the radius of the circle, (x_c, y_c) is its centre coordinate, and (x, y) is a point on its circumference. We can rearrange this equation to get a solution for y, with x the independent variable:

    y = y_c - sqrt(R^2 - (x_c - x)^2)

The parameters we are interested in finding are the centre coordinates; we can write a_1 = x_c and a_2 = y_c. By differentiation, we get:

    ∂y/∂x_c = (x_c - x) / sqrt(R^2 - (x_c - x)^2)
    ∂y/∂y_c = 1
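As an illustration only, here is a minimal sketch of such a fit using SciPy's Levenberg-Marquardt driver rather than the Numerical Recipes routine used in the original work. The residual is the model y minus the observed y, the Jacobian columns are the two partial derivatives above, and the clamping of the square-root argument, the initial guess (x0, y0) and the lower-branch assumption are additions for illustration, not details from the paper.

```python
import numpy as np
from scipy.optimize import least_squares

def fit_centre_lm(xs, ys, R, x0, y0):
    """Levenberg-Marquardt centre fit (sketch).

    Model (lower branch of the circle, as in the text):
        y = yc - sqrt(R**2 - (xc - x)**2),  parameters a1 = xc, a2 = yc.
    Assumes the tracked points lie on the lower semicircle and that R is
    known from the imaging geometry.
    """
    xs, ys = np.asarray(xs, float), np.asarray(ys, float)

    def residuals(p):
        xc, yc = p
        root = np.sqrt(np.maximum(R**2 - (xc - xs)**2, 1e-12))
        return (yc - root) - ys

    def jacobian(p):
        xc, yc = p
        root = np.sqrt(np.maximum(R**2 - (xc - xs)**2, 1e-12))
        d_xc = (xc - xs) / root        # dy/dxc
        d_yc = np.ones_like(xs)        # dy/dyc
        return np.column_stack((d_xc, d_yc))

    result = least_squares(residuals, x0=[x0, y0], jac=jacobian, method='lm')
    return result.x                    # estimated (xc, yc)
```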
Geometric

Given two points, (x_0, y_0) and (x_1, y_1), on the circumference of a circle, plus the radius of the circle, R, it is possible to calculate the centre position of the circle.

If we write:

    u = x_1 - x_0
    v = y_1 - y_0
    w = (x_0^2 - x_1^2) + (y_0^2 - y_1^2)

then we can show that y_c can be obtained by solving the quadratic:

    (v^2/u^2 + 1) y_c^2 + (2 x_0 v/u + v w/u^2 - 2 y_0) y_c + (x_0^2 + y_0^2 - R^2 + x_0 w/u + w^2/(4 u^2)) = 0

and x_c can be calculated from:

    x_c = -(2 v y_c + w) / (2 u)

We simply pick pairs of points from the list of Earth/space boundary pixels, and slot them into these equations. By using a large number of pairs, and averaging the x_c and y_c values obtained, we can get an accurate centre estimate. For maximum accuracy, the angle subtended by the two boundary points and the centre of the image is selected to be as near to 90° as possible.
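A hedged sketch of the geometric estimator follows (Python/NumPy, not the original implementation): the quadratic above is solved for y_c, x_c follows from the linear relation, and the per-pair estimates are averaged. The root-selection rule (keep the root nearer a prior guess such as the image centre) and the degenerate-case handling are assumptions; the near-90° pair selection described above is omitted for brevity.

```python
import numpy as np

def centre_from_pair(p0, p1, R, prior=(0.0, 0.0)):
    """Centre estimate from one pair of boundary points (sketch).

    Two circles of radius R pass through any two points; the root nearer
    the prior guess is kept (assumption, not from the paper).
    """
    (x0, y0), (x1, y1) = p0, p1
    u, v = x1 - x0, y1 - y0
    w = (x0**2 - x1**2) + (y0**2 - y1**2)
    if abs(u) < 1e-9:                  # degenerate: points vertically aligned
        return None
    a = v**2 / u**2 + 1.0
    b = 2.0 * x0 * v / u + v * w / u**2 - 2.0 * y0
    c = x0**2 + y0**2 - R**2 + x0 * w / u + w**2 / (4.0 * u**2)
    disc = b**2 - 4.0 * a * c
    if disc < 0.0:                     # points further apart than 2R
        return None
    candidates = []
    for yc in ((-b + np.sqrt(disc)) / (2.0 * a), (-b - np.sqrt(disc)) / (2.0 * a)):
        xc = -(w + 2.0 * v * yc) / (2.0 * u)
        candidates.append((xc, yc))
    return min(candidates, key=lambda p: (p[0] - prior[0])**2 + (p[1] - prior[1])**2)

def centre_geometric(points, R, prior=(0.0, 0.0)):
    """Average the per-pair estimates over many point pairs."""
    estimates = [centre_from_pair(points[i], points[j], R, prior)
                 for i in range(len(points)) for j in range(i + 1, len(points))]
    estimates = [e for e in estimates if e is not None]
    return tuple(np.mean(estimates, axis=0))
```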
The centre location algorithms were tested with a number of synthetic images, to observe the effect of pixel position and intensity noise. Both the geometric and the least-squares methods proved robust in the presence of noise (see Croft, 1991, for full details). The effects of using different tracking algorithms were also investigated. For the METEOSAT image mentioned in the boundary detection sections (see Fig. 4), the errors in centre estimate are shown in Table 2.

In this example (and in several other experiments), the least squares method obtained the centre location exactly. The geometric method works best if thresholds have been both smoothed and interpolated at the tracking stage.

CONCLUSIONS

A system for identifying Earth/space boundary pixels in satellite-based images of the Earth has been implemented, using computationally cheap image processing techniques. Two methods for using this data to locate the Earth's centre coordinates have been described. These coordinates can usually be determined to an accuracy of better than 0.1 pixels, even in the presence of significant noise. Threshold smoothing and interpolation at the tracking stage help to boost the accuracy with which the boundary is detected. The least squares centre finding algorithm usually gives a more accurate estimate than the geometrical algorithm.

Improvements could be made to the system as it currently exists; e.g. it could be speeded up significantly by using integer instead of floating point arithmetic. Other constraints on the tracked points might be implemented, e.g. rejecting portions of arc that do not fit within certain curvature limits (see Pavlidis & Horowitz, 1974). It would be instructive to estimate the comparative computational complexity of the geometric and the least-squares centre finding algorithms. Experimentation with other centre locating techniques, e.g. Tamas, 1991, would be interesting.

It may also be possible to generalise the method to cover a broader range of problems. E.g., a "pseudo annulus" could define a band around an arbitrary shape, limiting the amount of computation needed to find its boundary. If the object were moving, so that it was in a different position in each frame, then knowledge of its position in the previous frame and some approximate velocity vector would give the position in the current frame. Hence, the object could be tracked. Such ideas have potential applications in robotics and in cloud tracking.
ACKNOWLEDGEMENTS
I would like to thank Mitch Harris for many useful suggestions, and my tutor Charles Duncan for keeping me on the right track. I would also like to thank the Meteorology Department at the University of Edinburgh for their support in this work.
Table 2: Effects of Tracking Algorithm on Centre Location Accuracy

Boundary pt extraction method    Least squares posn err   Geometric posn err
Thresholding tracker                     0.000                  0.879
Thresholding tracker + i                 0.000                  1.072
Thresholding tracker + s                 0.000                  0.971
Thresholding tracker + i + s             0.000                  0.911

(where + s indicates smoothing, + i indicates interpolation, and the "error" is the distance, in pixels, of the estimated centre from the centre of the image)
REFERENCES

C.I. Beard, S.D. Hayward, M. Caola, C.N. Duncan and I. MacLaren (Sep. 1990). Interim report on mosaic earth sensor feasibility study. British Aerospace (Space Systems) internal document TP 8760.

D. Croft (1991). An automated technique for finding earth centre in satellite images. MSc report, Dept. of Meteorology, Univ. of Edinburgh, Scotland.

T. Pavlidis and S.L. Horowitz (Aug. 1974). Segmentation of plane curves. IEEE Trans. on Computers, vol. C-23, no. 8, pp. 860-870.

W.H. Press, B.P. Flannery, S.A. Teukolsky and W.T. Vetterling (1988). Numerical recipes in C, pp. 540-547. Cambridge University Press.

A. Rosenfeld and A.C. Kak (1982). Digital picture processing. Academic Press.

A. Tamas (1991). On Circles Recognition. Vision Interface '91.