Target Geolocation for Agricultural Applications via an Octorotor

5th IFAC Conference on Sensing, Control and Automation for Agriculture
August 14-17, 2016. Seattle, Washington, USA


Available online at www.sciencedirect.com

IFAC-PapersOnLine 49-16 (2016) 027–032

Christian A. Garcia*, Yunjun Xu**

*University of Central Florida, Orlando, FL 32816 USA (e-mail: [email protected])
**University of Central Florida, Orlando, FL 32816 USA (Tel: 407-823-1745; e-mail: [email protected])

Abstract: In this paper, an orthographic projection based Kalman filter is designed to estimate the location of identified targets for use in disease detection applications for strawberry trees or other shallow crops. The process model and measurement model are formulated, relating the global target position to its corresponding image pixel and taking into account assumptions pertinent to a typical strawberry orchard. An octorotor is custom-designed and built, and is expected to carry a spectral camera in hover flight. The camera producing the images of the target is stabilized by a gimbal. An orthographic projection model is adopted here because of the shallow depth of the crop when compared to the camera altitude. This projection approach leads to a Kalman filter design. Firstly, the approach is tested through simulation. Secondly, it is validated in an ad-hoc experiment using a recorded video taken by the octorotor.

© 2016, IFAC (International Federation of Automatic Control) Hosting by Elsevier Ltd. All rights reserved.

Keywords: UAV, Precision Agriculture, Kalman Filters, Tracking Applications.

1. INTRODUCTION

Growers traditionally rely on labor-intensive methods for agricultural operations, e.g. manually scouting for diseases, which can be imprecise, slow, and prone to human error (Defterli et al 2016). Recently, emerging robotic technologies have seen use in agriculture and have the potential to revolutionize traditional farming approaches and provide quick, cost-effective crop monitoring, pesticide application, and accurate crop yield predictions (Defterli et al 2016, Li et al 2015). Unmanned aerial vehicles (UAVs) are one such type of robotic technology that has been incorporated into agricultural applications because of the vehicles' relatively low cost when compared to manned flight vehicles, easy deployment, and ability to hover (Berni et al 2009; Gomez-Candon et al 2013).

While a wide range of agricultural tasks are using UAVs, the application of interest to this research is detecting diseases in strawberry crops (Xu et al 2014). Disease detection using UAVs is generally performed with a multispectral, hyperspectral, or thermal imaging camera carried on-board, which detects abnormal radiation levels in the infrared spectrum given off by the plants (Berni et al 2009). However, the images and spectral data gathered are typically analyzed at a ground station and not on-board.

To find the suspected location of the diseased plants, the images are tagged with the GPS coordinates of where they were taken, and the region photographed is identified through geo-referencing (Gomez-Candon et al 2013). As can be observed, that process still requires constant human input to manually analyze the images and identify the suspected location. This research looks to evolve the disease detection process further by providing a precise, quantitative localization of the disease for further investigation, determined directly on-board and without human input.

Different sensors have been used on UAVs to identify targets in more general applications, such as laser range-finders (Savage & La Scala 2007) and electro-optical infrared sensors (Wu 1995), but the most common sensor is the camera (Campbell & Wheeler 2006). Gimballed camera systems are commonly used to achieve a stabilized vision platform (Campbell & Wheeler 2006). Once the pixel location of a suspected plant is detected, estimation methods can be used to track its corresponding coordinates. For the perspective projection model, which is nonlinear, extended Kalman filters (EKFs) (Savage & La Scala 2007) and sigma-point unscented Kalman filters (Campbell & Wheeler 2006) have been investigated.

In this paper, a disease detection scenario tailored for strawberry orchards, with an octorotor UAV hovering above the target, is studied. It should be observed that the geolocation method presented uses pixel coordinates as its measurement and hence requires a spectral camera that outputs pixelated images on-board the UAV. It is also assumed that the UAV is capable of finding diseased regions on-board without use of a ground station. The UAV used in this research has a camera gimbal to provide a rapidly stabilizing platform for the vision system. This was taken into account in the geolocation derivation.

To better model this scenario and achieve a low computational cost in the geolocation process, three assumptions were taken into account. First, the target, a diseased strawberry tree, is approximately stationary, and any motion of the target is due to small external disturbances, such as wind, and is modelled as such. The second assumption is that, since the geolocation estimation will occur when the octorotor is hovering with the gimbal focused towards the ground below it, the camera direction and the ground are approximately perpendicular. Lastly, because the tree height is small compared to the flight altitude, the target tree can be considered a shallow structure and treated as two-dimensional in the image, making an orthographic projection model valid (Szeliski 2010). The orthographic projection model is linear with regard to the target position. Therefore a Kalman filter, which is optimal and will estimate the target in the least time (Simon 2006), can be designed for this case.

In Section 2, the definition of the geolocation problem in this study is presented, including the process model and measurement model derivations. The estimator algorithm is also presented. In Section 3, a brief description of the octorotor platform, custom-designed for agricultural disease detection applications, is provided. In Section 4, a simulation of the geolocation algorithm is presented. Section 5 provides a test of the algorithm using images obtained from the octorotor in hover flight, and concluding remarks are given in Section 6.

2. ORTHOGRAPHIC PROJECTION BASED KALMAN FILTER DESIGN FOR GEOLOCATION

2.1 Geolocation Problem Definition

The geolocation problem, as analysed in this paper, is to determine the target location based on images taken from a UAV, using the UAV's measured GPS position and the gimbal's attitude information. It is assumed that the diseased tree pixels are successfully identified through hyperspectral analysis criteria (Sankaran et al 2010). For this research's purposes, an estimation error of less than half the distance between rows is desired (e.g. less than 0.5 m for a 1 m row distance), so as to avoid mistakenly estimating the disease over to an adjacent row of trees. It is also desired that during the disease detection operation, the octorotor will be hovering in place at a fixed position and orientation.

2.2 Process Model

The state variables are the x and y components of the target position (with a subscript "t") in the local East-North-Up (ENU) frame, denoted by the superscript "E". The tree height, denoted as z_t^{(E)}, is considered constant, with a common height assumed. The noise associated with the x and y tree position coordinates is denoted by w ~ N(0, Q), an uncorrelated Gaussian random variable with zero mean and covariance Q (Simon 2006). The state vector is x = [x_t^{(E)}, y_t^{(E)}]^T and the process model is given by the expression

\dot{x} = dx/dt = 0 + w    (1)

An Euler integration scheme is used to propagate the states, and the propagation of the states at each time step k is

x_k = x_{k-1} + \dot{x}_{k-1} \Delta t    (2)
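As an illustration of Eqs. (1)-(2), the propagation step can be written in a few lines of Python. This is a minimal sketch, not the authors' implementation; the time step and noise level shown are placeholder values.

import numpy as np

def propagate_state(x, dt, q_std=0.1):
    # Eq. (1): the nominal dynamics are zero (the tree is stationary);
    # process noise w ~ N(0, Q) captures small disturbances such as wind.
    w = np.random.normal(0.0, q_std, size=2)
    x_dot = np.zeros(2) + w
    # Eq. (2): Euler propagation of the state at time step k
    return x + x_dot * dt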

2.3 Measurement Model

Initially, the octorotor position measurements, given by the GPS unit in terms of latitude (lat), longitude (lon), and altitude (alt), are converted to the Earth-Centered-Inertial (ECI) frame as (Campbell & Wheeler 2006)

x_B^{(ECI)} = \begin{bmatrix} (alt + R_E) C(lat) C(lon) \\ (alt + R_E) C(lat) S(lon) \\ (alt + R_E) S(lat) \end{bmatrix}    (3)

where R_E is the radius of the Earth at the current latitude, and C(.) and S(.) are the cosine and sine functions. The resulting vector is in the ECI frame. The relative position vector in the local Earth frame is shown as

x_B^{(E)} = R_{ECI}^{E} ( x_B^{(ECI)} - x_{home}^{(ECI)} )    (4)

R_{ECI}^{E} = \begin{bmatrix} -S(lon) & C(lon) & 0 \\ -S(lat) C(lon) & -S(lat) S(lon) & C(lat) \\ C(lat) C(lon) & C(lat) S(lon) & S(lat) \end{bmatrix}    (5)

where the home vector is the position vector of the origin of the E frame expressed in the ECI frame. The octorotor position vector in the E frame will be used to determine the camera position vector. Since the E frame is used from this point on as the standard frame, its notation will be dropped.
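A numerical sketch of Eqs. (3)-(5) in Python follows; it is illustrative rather than the authors' code, assumes a spherical Earth with a placeholder radius value, and mirrors the frame notation above.

import numpy as np

R_EARTH = 6.371e6  # Earth radius in meters (placeholder spherical value)

def gps_to_eci(lat_deg, lon_deg, alt):
    # Eq. (3): GPS measurement to the Earth-Centered-Inertial (ECI) frame
    lat, lon = np.radians(lat_deg), np.radians(lon_deg)
    r = alt + R_EARTH
    return np.array([r * np.cos(lat) * np.cos(lon),
                     r * np.cos(lat) * np.sin(lon),
                     r * np.sin(lat)])

def eci_to_local(x_eci, x_home_eci, lat_deg, lon_deg):
    # Eqs. (4)-(5): relative position in the local Earth (ENU) frame
    lat, lon = np.radians(lat_deg), np.radians(lon_deg)
    R_eci_to_e = np.array(
        [[-np.sin(lon),                np.cos(lon),               0.0],
         [-np.sin(lat) * np.cos(lon), -np.sin(lat) * np.sin(lon), np.cos(lat)],
         [ np.cos(lat) * np.cos(lon),  np.cos(lat) * np.sin(lon), np.sin(lat)]])
    return R_eci_to_e @ (x_eci - x_home_eci)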

A visual representation of the camera model is displayed in Fig. 1. As shown, the image plane is approximately parallel to the x-y plane of the E frame. Based on the assumption that the target is a shallow object with respect to the camera, an orthographic projection can be applied (Szeliski 2010), differing from the common perspective projection method used in (Campbell & Wheeler 2006). The x and z components of the Earth and camera frames are also shown in Fig. 1. In a pin-hole camera model, the target point x_t, the camera point x_c, and the target projection in the image plane (i.e. the pixel vector x_p, measured in length) all lie in a line (Campbell & Wheeler 2006).

Fig. 1. Camera model.

The relative position vector, in the camera frame, can then be expressed as

x_{c \to t}^{(C)} = R_E^C ( x - x_c )    (6)


The rotation matrix from the local Earth frame to the camera frame is composed of the rotation from the Earth frame to the octorotor body frame and the rotation from the body frame to the camera frame as R_E^C = R_B^C R_E^B.

The rotation matrix from the local ENU frame to the octorotor body frame, in North-East-Down (NED) orientation, is

R_E^B = \begin{bmatrix} 1 & 0 & 0 \\ 0 & C\phi & S\phi \\ 0 & -S\phi & C\phi \end{bmatrix} \begin{bmatrix} C\theta & 0 & -S\theta \\ 0 & 1 & 0 \\ S\theta & 0 & C\theta \end{bmatrix} \begin{bmatrix} C\psi & S\psi & 0 \\ -S\psi & C\psi & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} 0 & 1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & -1 \end{bmatrix}    (7)

in which \phi, \theta, \psi are the Euler angles. The perspective projection method performed in (Campbell & Wheeler 2006) treats the UAV and gimballed camera as a point mass, and the gimbal kinematics are not modeled. In this case, however, the gimbal kinematics are modeled because of the specific size and geometry of the gimbal being used. The camera frame axes are originally aligned with the body frame axes, and the gimbal Euler angles represent deviations from these axes. The gimbal rotation sequence was defined based on the geometric configuration of the gimbal, as shown in Fig. 2. The gimbal pitch angle \theta_g is defined positive clockwise, to make a downward rotation of the camera positive, while the roll and yaw angles, \phi_g and \psi_g, are kept positive counter-clockwise.

Fig. 2. Gimbal and gimbal model.

The rotation matrix from the body frame to the camera frame is

R_B^C = R_{\theta_g} R_{\phi_g} R_{\psi_g} = \begin{bmatrix} C\theta_g & 0 & S\theta_g \\ 0 & 1 & 0 \\ -S\theta_g & 0 & C\theta_g \end{bmatrix} \begin{bmatrix} 1 & 0 & 0 \\ 0 & C\phi_g & S\phi_g \\ 0 & -S\phi_g & C\phi_g \end{bmatrix} \begin{bmatrix} C\psi_g & S\psi_g & 0 \\ -S\psi_g & C\psi_g & 0 \\ 0 & 0 & 1 \end{bmatrix}    (8)

The position vector from the octorotor body to the camera, expressed in the body frame, is obtained by chaining the gimbal link offsets as

x_{B \to c}^{(B)} = x_{B \to b}^{(B)} + x_{b \to 1}^{(B)} + R_{\psi_g}^T x_{1 \to 2}^{(1)} + R_{\psi_g}^T R_{\phi_g}^T x_{2 \to 3}^{(2)} + (R_B^C)^T x_{3 \to c}^{(C)}    (9)

The base of the gimbal, denoted by b, is aligned with the octorotor body frame, and the camera frame is aligned with intermediate frame 3. The camera position vector is given as

x_c = x_B + (R_E^B)^T x_{B \to c}^{(B)}    (10)

where x_B is found from Eq. (4).

The orthographic projection can be represented mathematically with a projection matrix given as

H = \begin{bmatrix} 1/\lambda_y & 0 \\ 0 & 1/\lambda_z \end{bmatrix} \begin{bmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}    (11)

The first matrix in the product is the scale factor matrix. The scale factors (\lambda_y, \lambda_z) depend on the distance between the camera and the target, which is a function of the octorotor altitude, and have units of meters per pixel. They can be determined experimentally by taking images with the camera aligned perpendicular to a flat object of known dimension, centered in the image, and at a known distance. The object's physical dimension and the number of pixels it spans across the image can be divided to form the reference ratio

\lambda_{y,ref} = y_{actual} / y_{pixel}    (12)

The reference ratio can then be scaled for any distance, based on the property of similar triangles, as

\lambda_y = \lambda_{y,ref} \, x / x_{ref}    (13)

This process was repeated multiple times for validation and was also used for the scale factor in the z image direction. The depth component of the camera frame is the x component, and it is hence omitted in the projection, denoted by the second matrix in the product of Eq. (11) (Szeliski 2010). Combining Eqs. (6)-(11), the measurement model can be written as

y = H R_E^C ( x - x_c )    (14)

The measurement model can also be written as the sum of a term involving the state vector, a known term D, and sensor noise as

y = Cx + D + v    (15)

where the terms C and D are expressed as

C = \begin{bmatrix} [R_E^C]_{2,1}/\lambda_y & [R_E^C]_{2,2}/\lambda_y \\ [R_E^C]_{3,1}/\lambda_z & [R_E^C]_{3,2}/\lambda_z \end{bmatrix}    (16)

D = \begin{bmatrix} [R_E^C]_2 ( [0 \; 0 \; z_t]^T - x_c ) / \lambda_y \\ [R_E^C]_3 ( [0 \; 0 \; z_t]^T - x_c ) / \lambda_z \end{bmatrix}    (17)

and are composed of components from the second and third rows of the rotation matrix R_E^C. The measurement noise is v ~ N(0, R), where R is the noise covariance matrix (Simon 2006). Note that the measurement equation is linear with respect to the states, and the state vector contains only the unknown components x and y, since the z component of the tree is considered constant and incorporated into D.
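Under the assumptions above, assembling C and D of Eqs. (16)-(17) takes only a few lines; the following is a minimal sketch with illustrative variable names.

import numpy as np

def measurement_matrices(R_EC, x_c, z_t, lam_y, lam_z):
    # Eq. (16): rows 2 and 3 of R_E^C, first two columns, scaled to pixels
    C = np.array([R_EC[1, :2] / lam_y,
                  R_EC[2, :2] / lam_z])
    # Eq. (17): known term from the camera position and constant tree height
    offset = np.array([0.0, 0.0, z_t]) - x_c
    D = np.array([R_EC[1] @ offset / lam_y,
                  R_EC[2] @ offset / lam_z])
    return C, D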


2.4 Kalman Filter based Geolocation Estimation

Based on the process and measurement models developed in the previous sections, a Kalman filter is designed to estimate the location of the target tree. The covariance matrices are

Q = \begin{bmatrix} E[w_x^2] & 0 \\ 0 & E[w_y^2] \end{bmatrix}    (18)

and

R = \begin{bmatrix} E[v_x^2] & 0 \\ 0 & E[v_y^2] \end{bmatrix}    (19)

Then the Kalman gain and the estimated target position are respectively calculated using

K = P C^T R^{-1}    (20)

\dot{\hat{x}} = 0 + K ( y - \hat{y} ) = K ( y - C\hat{x} - D )    (21)

Based on (Simon 2006), the covariance matrix is propagated using Eqs. (16), (18), and (19) according to

\dot{P} = Q - P C^T R^{-1} C P    (22)

The update equations used in the algorithm are Euler integration schemes for the state and its covariance, based on Eqs. (21) and (22) respectively, at each time step. Note in Eq. (21) that the known term D is determined at each time step from measured information and can be subtracted from the measured pixels directly.
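The full update of Eqs. (20)-(22), with the Euler integration described above, reduces to a short loop body. This sketch assumes C and D are rebuilt from the current measurements at every step; it is an illustration under those assumptions, not the authors' code.

import numpy as np

def kalman_step(x_hat, P, y, C, D, Q, R, dt):
    R_inv = np.linalg.inv(R)
    K = P @ C.T @ R_inv                    # Eq. (20): Kalman gain
    x_hat_dot = K @ (y - C @ x_hat - D)    # Eq. (21): estimate derivative
    P_dot = Q - P @ C.T @ R_inv @ C @ P    # Eq. (22): covariance derivative
    return x_hat + x_hat_dot * dt, P + P_dot * dt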

3. OCTOROTOR DESIGN

An octorotor platform, shown in Fig. 3, is designed as part of a cooperative, autonomous network for disease detection in strawberry orchards (Xu et al 2014). The geolocation estimation model is tailored for use with this platform. The u-blox LEA-6 GPS module used on the octorotor has a 2.5 m - 4 m circular error probability. The octorotor was maneuvered manually for the experiment conducted in Section 5. It is equipped with the RCTimer Legacy aerial gimbal as well as a fiberglass protection casing, designed and fabricated in-house, for the sensors on-board. A Canon PowerShot A90, seen in Fig. 2, was used in place of a spectral camera. Its image size is 640 by 480 pixels.

Fig. 3. Octorotor platform.

4. SIMULATION

A simulation was performed to validate the filter, using video of a red target taken with the PowerShot camera but not on-board the octorotor, for simplicity; the octorotor GPS and attitude values were instead simulated as constant signals. In the simulation, the camera scale factors are both set to 9.15 mm/pixel at a 6.1 m reference distance. Eq. (13) is used to project these scale factors to different altitudes. A 15 sec stationary video of a red target is taken with a disturbance, added by jerking the camera, to demonstrate the estimator's robustness. A red-color pixel intensity threshold was applied in MATLAB to find the red box in each video frame, and the center pixels of the contour with the highest area were then selected. The filter estimate was initialized at the simulated octorotor position, which corresponds to an initial estimated pixel at the center of the image. The estimation algorithm is run at 30 Hz, which is the frame rate of the video.
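The red-target extraction described above was done in MATLAB; an equivalent OpenCV sketch in Python, with placeholder threshold values, could look as follows.

import cv2
import numpy as np

def red_target_centroid(frame_bgr):
    # Placeholder red threshold: strong red channel, weak green and blue
    b, g, r = cv2.split(frame_bgr)
    mask = ((r > 150) & (g < 100) & (b < 100)).astype(np.uint8) * 255
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)  # contour with highest area
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])  # center pixel (x, y)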

The octorotor's nominal position in the simulation is set to 28.3382714° latitude, -81.3392998° longitude, and 6.35 m altitude. The home position is set to 28.3382480° latitude, -81.3393222° longitude, and 0 m altitude. The desired angle values are all 0° except for the gimbal pitch, which is 90°. The process noise terms are each 0.01 m/s. The measured pixels of the target center were (223, 248) before the disturbance and (226, 244) afterwards. The standard deviations used in Eq. (19) are 3.83 pixels and 2.94 pixels, respectively. The inverse operations of Eqs. (3)-(5) are used to convert position back to GPS coordinates.

In Table 1, the actual target position is presented along with the final estimated values and their errors of 0.08 m and 0.04 m in the x and y components, respectively. The error present can be attributed to the camera not being placed perfectly perpendicular to the target, as this video was not stabilized using the gimbal. The simulated octorotor position is also shown. The latitude and longitude coordinates are provided as well, displayed relative to the given home position.

The position estimation and pixel results are presented across time in Figs. 4 and 5 and compared to the known and measured values, respectively. Note that the disturbance shown in Fig. 5 manifests itself in the corresponding position estimates of Fig. 4. The estimator recovers within 5 seconds from the disturbance, which is about 17 pixels in magnitude.

Fig. 4. Target position estimation results.


Table 1. Geolocation Estimation Simulated Results

            UAV Pos.    Actual Pos.   Est. Pos.   Error
x (m)       2.19        2.5           1.5         0.08
y (m)       2.58        2.5           1.5         0.04
z (m)       6.35        0.10          -           -
lat (°)     2.34E-5     2.34E-5       2.30E-5     -
lon (°)     2.24E-5     1.63E-5       1.72E-5     -

Fig. 5. Measured and estimated pixels in x and y.

5. EXPERIMENT

An experiment is performed to demonstrate the estimation algorithm using real flight data. A video of a red box in a grassy field is taken from the octorotor in flight, and an image algorithm to detect the box in each frame, similar to that of Section 4, is implemented. The centroid pixel measurement is treated as the diseased tree pixels. When using a spectral camera, the color intensity criterion used here should be replaced with a spectral or thermal intensity criterion, depending on the desired identification scheme. Additionally, new scale factor values for the spectral camera will need to be determined, based on Eqs. (12)-(13) or other appropriate methods. Note that the contour selection method used will not work for spectral images if contour region definition is not available for these images, or in general for multiple targets, since this method is based on identifying the greatest contour area available. The filter covariance values would also need to be tuned for better performance with a different camera; however, the overall results still hold for use with an appropriate spectral camera.

The video is recorded at 20 frames per second for 25 seconds. The target box is 25 cm long, 15.5 cm wide, and 5 cm tall. The logged octorotor GPS and attitude data corresponding to the video are used post-flight. The yaw angle of the gimbal was observed to be about 120° from the front arm, defined as its origin. The home position is taken as the GPS coordinates of the octorotor at the start of the log. The algorithm was run at the sampling rate of the video. The variances of the measured signals, 134 pixels and 45 pixels, calculated post-flight, are used as the values of the R matrix in Eq. (19). The results are provided in Table 2 and Figs. 6-10, where the vehicle motion is displayed first for reference.

Fig. 6. Octorotor trajectory in the x (left), y (middle), and z (right) directions.

Fig. 7. Target position estimation in the x and y directions.

Fig. 8. Measured and estimated pixels in the x and y directions.

Fig. 9. Trace of the position estimate covariance matrix.

Fig. 10. Still-shot of the target tracking video from the octorotor.


Table 2. Geolocation Estimation Experiment Results

                Actual Position   Estimated Position   Position Error
x (m)           7.930             7.952                0.022
y (m)           -5.517            -5.527               0.010
latitude (°)    28.5998280        28.5998279           1E-7
longitude (°)   -81.1964760       -81.1964758          2E-7

As seen in Figs. 6 and 8, disturbances in the hover are present and are reflected in the pixel measurements. The shifts present in Fig. 8, in both the x and y directions, follow the movement of the octorotor in Fig. 6, which corroborates their correlation. At higher, more realistic octorotor flight altitudes, the same horizontal vehicle motion will have less effect on the filter convergence. The pixel tracking in Fig. 8 and the position estimate of Fig. 7 demonstrate convergence. The trace of the covariance matrix of the estimate, seen in Fig. 9, converges to approximately zero, minimizing the estimation error. The actual target position, as shown in Fig. 7, was determined by evaluating the UAV GPS position at the time instant when the centroid crossed the center of the image. The image frame corresponding to this time instant is provided in Fig. 10 for reference. The estimator is able to track the signal effectively, even with the disturbance present. The target position estimate comes within 0.03 m in both directions, which easily meets the desired error of 0.5 m established in the problem definition.

6. CONCLUSIONS

A geolocation algorithm is developed for direct on-board tracking of identified diseased strawberry trees. It is designed to be used with hyperspectral cameras that produce pixelated measurements of the tree. The process and measurement models are developed specifically for the assumptions applicable to strawberry orchards and gimballed UAVs. An orthographic projection method is used to relate the pixel measurements to a local position vector in a linear fashion, and a Kalman filter is applied to estimate the position optimally. Simulation and an ad-hoc experiment validate the effectiveness of this approach.

ACKNOWLEDGMENT

The authors would like to thank the United States Department of Agriculture - National Institute of Food and Agriculture for financial support (#2013-67021-20934). They would also like to thank Davis Drake for providing his piloting expertise, as well as the two Senior Design groups of 2013-2014 and 2014-2015 who contributed immensely to the design and construction of the octorotor platform.

REFERENCES

Berni, J., Zarco-Tejada, P., Suarez, L., Gonzalez-Dugo, V., and Fereres, E. (2009). Remote Sensing of Vegetation from UAV Platforms using Lightweight Multispectral and Thermal Imaging Sensors. International Archives of the Photogrammetry, Remote Sensing, and Spatial Information Sciences, vol. 38, pp. 1-6. International Society for Photogrammetry and Remote Sensing.
Campbell, M. and Wheeler, M. (2006). A Vision Based Geolocation Tracking System for UAVs. AIAA Guidance, Navigation, and Control Conference and Exhibit, pp. 1942-1959. AIAA, Washington, D.C.
Defterli, S.G., Shi, Y., Xu, Y., and Ehsani, R. (2016). Review of Robotic Technology for Strawberry Production. Applied Engineering in Agriculture, American Society of Agricultural and Biological Engineers, to be published.
Gomez-Candon, D., Lopez-Granados, F., Caballero-Novella, J.J., Pena-Barragan, J.M., Gomez-Casero, M.T., Jurado-Exposito, M., and Garcia-Torres, L. (2013). Semi-automatic Detection of Artificial Terrestrial Targets for Remotely Sensed Image Georeferencing. IEEE Geoscience and Remote Sensing Letters, vol. 10 (1), pp. 184-188.
Li, N., Remeikas, C., Xu, Y., Jayasuriya, S., and Ehsani, R. (2015). Task Assignment and Trajectory Planning Algorithm for a Class of Cooperative Agricultural Robots. ASME Journal of Dynamic Systems, vol. 137 (5), pp. 051004-1 - 051004-9.
Sankaran, S., Mishra, A., Ehsani, R., and Davis, C. (2010). A Review of Advanced Techniques for Detecting Plant Disease. Computers and Electronics in Agriculture, vol. 72 (1), pp. 1-13.
Savage, C.O. and La Scala, B.F. (2007). Accurate Target Geolocation using Cooperative Observers. Proceedings of the 2007 Information, Decision and Control Conference, pp. 248-253. IEEE, New York.
Simon, D. (2006). Optimal State Estimation: Kalman, H-Infinity, and Nonlinear Approaches, Chapters 4, 8, and 13. John Wiley and Sons, Hoboken, New Jersey.
Szeliski, R. (2010). Computer Vision: Algorithms and Applications, Chapters 1-2. Springer, New York.
Wu, Y.A. (1995). E/O Target Geolocation Determination. Proceedings of the 34th IEEE Conference on Decision and Control, vol. 3, pp. 2766-2771. IEEE, New York.
Xu, Y., Ehsani, R., Kaplan, J., Ahmed, I., Kuzma, W., Orlandi, J., Nehila, K., Waller, K., and Defterli, S.G. (2014). An Octo-Rotor Ground Network for Autonomous Strawberry Disease Detection - Year 1 Status Update. 2nd International Conference on Robotics and Associated High-Technologies and Equipment for Agriculture and Forestry, Madrid, Spain, pp. 457-466.