Proceedings, 6th IFAC Conference on Bio-Robotics
Beijing, China, July 13-15, 2018
Available online at www.sciencedirect.com
IFAC PapersOnLine 51-17 (2018) 1–4
Development of a Navigation System for a Smart Farm
H. Gan*, W.S. Lee**

* Department of Agricultural and Biological Engineering, University of Florida, Gainesville, FL 32611 USA (Tel: 352-392-1684; e-mail: [email protected])
** Department of Agricultural and Biological Engineering, University of Florida, Gainesville, FL 32611 USA (e-mail: [email protected])
Abstract: Autonomous agricultural robots will be important for future smart farms. Autonomous navigation, as one of the robots' abilities, could save labor for driving vehicles and provide accurate and consistent localization to perform farm operations. The goal of this study was to develop a navigation system that could guide a field robot to travel from a farm station to a citrus grove and visit each tree autonomously with obstacle avoidance ability. The system was developed with consideration of the concept of future smart farms, in which internet of things and big data analysis would be implemented.

© 2018, IFAC (International Federation of Automatic Control) Hosting by Elsevier Ltd. All rights reserved.

Keywords: Field robotics; Internet of things; Navigation; Robot operating system; Smart farm.
1. INTRODUCTION

The most important abilities of autonomous agricultural robots can be grouped into four categories, being navigation, detection, action and mapping (Auat Cheein and Carelli, 2013). Navigation is the first step, which could also rely on detection and mapping in some cases, such as navigation in orchards. Due to the bigger canopy sizes of most specialty crops, GPS signals could frequently be blocked. Thus, navigation in orchards is considered more challenging than in the open field of most agronomic crops.

Over the years, many methods have been created to navigate ground robots in orchards. Machine vision and laser scanner based methods were used most frequently. Subramanian et al. (2006) developed a machine vision and laser radar (LiDAR) based guidance system for citrus grove navigation and achieved average positioning errors of 28 mm using machine vision and 25 mm using the laser radar in a straight citrus grove alleyway. Barawid et al. (2007) developed a navigation system for an orchard using a two-dimensional laser scanner. They applied the Hough transform to fit lines along detected tree canopies and provide lateral offset and heading measurements. This method achieved mean lateral and heading errors of 0.11 m and 1.5°, respectively. Similarly, Bayar et al. (2015) developed a model-based control method in which a laser scanner was used to detect the relative positions of fruit trees for central line calculation. They also used wheel encoders for dead reckoning to locate the position of the vehicle in a row and to estimate the paths of the vehicle when turning from one row to the next. Sharifi and Chen (2015) classified RGB images taken from a mobile robot in an orchard row into classes based on graph partitioning theory and then applied the Hough transform to determine the central path in a row for navigation. All these works presented satisfactory results in cases of single row following using machine vision or laser scanning techniques.

Multi-sensor fusion is another technique used for robot navigation in orchards. Kise et al. (2002) used an RTK-GPS and an IMU to develop a steering control algorithm for an autonomous tractor. Iida and Burks (2002) combined DGPS and ultrasonic sensors to provide navigation of a tractor in orchards. Hansen et al. (2011) fused odometry and gyro measurements with line features created by a 2D laser scanner using derivative-free Kalman filters and navigated a tractor in orchards. In orchard navigation, multi-sensor fusion methods, especially GPS-based sensor fusion methods, were not studied as much as machine vision and LiDAR-based methods. However, GPS-based navigation solutions are still used frequently in practice due to their simplicity and robustness to environmental noises.

So far, most studies of orchard navigation have focused on single row guidance or single orchard navigation (row following plus turning). There are no systems that attempt to solve the navigation problem for a farm system, that is, to navigate vehicles autonomously in an entire farm. Thanks to the recent advance of technologies such as the internet of things (IoT), deep learning, and big data, creating smart farms has become a popular research topic. From a smart farm point of view, an orchard navigation system should have the ability to guide a ground vehicle from a farm station to the orchard and travel in each row automatically. Besides, the navigation system should be monitored remotely in real time and be part of the farm's IoT system. Thus, we aimed to create such a navigation system for a citrus grove. The objectives include creating a system that can: (1) navigate between any two points in a farm with obstacle avoidance ability; (2) travel autonomously in a citrus grove with obstacle avoidance ability; (3) enable real-time monitoring and control of the robot; and (4) collect and send real-time sensor data to a remote station.

2. MATERIALS AND METHODS

2.1 The Robot Platform
The robot platform was an unmanned ground vehicle (Husky, ClearPath Robotics, Kitchener, Canada), as shown in Fig. 1. It was a rugged robot designed for running on all terrains, with external dimensions of 900 × 670 × 390 mm and a weight of 50 kg. It could carry a payload of up to 75 kg and run at a maximum speed of 1 m/s.
Fig. 1. The robot platform – ClearPath Husky

2.2 The CPU, Sensors and External Hardware

The robot was equipped with a mini computer (Intel chip-based Mini-ITX, VIA Technologies, Taipei, Taiwan) for data processing. Sensors on the robot platform included a GPS receiver, an inertial measurement unit (IMU), wheel encoders, a LiDAR (UTM-30LX, Hokuyo Automatic Co., Ltd., Japan) and a Kinect camera (Kinect v2, Microsoft Co., WA, USA). External hardware included a WiFi router for wireless communication and a radio module for remote control.
2.3 Software Platform

Robot Operating System (ROS) was the main software platform used to develop the autonomous navigation system. It is an operating-system-like platform that collects and connects software frameworks. The core elements of ROS are nodes and topics. A node is simply a running process that performs computations, and different nodes communicate with each other through topics. For instance, a node is created for each sensor; the node receives sensor data and publishes the data to a publicly shared space that every node and user in the same network can access.

Based on ROS, a network was built between the robot's computer and a remote computer using a WiFi router so that both computers could communicate and share data with each other. On the robot, the computer received and processed sensor data and applied navigation algorithms to control the robot. At the remote station, the computer stored and visualized the data and sent commands to the robot when necessary.
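As a minimal sketch of this node/topic pattern, the rospy node below publishes GPS fixes on a topic; the node name, topic name and publishing rate are illustrative assumptions rather than details from the paper.

```python
import rospy
from sensor_msgs.msg import NavSatFix

def gps_node():
    rospy.init_node('gps_sensor_node')               # one node per sensor
    pub = rospy.Publisher('gps/fix', NavSatFix, queue_size=10)
    rate = rospy.Rate(5)                             # assumed 5 Hz publishing rate
    while not rospy.is_shutdown():
        fix = NavSatFix()                            # would be filled from the receiver driver
        fix.header.stamp = rospy.Time.now()
        pub.publish(fix)                             # any node on the ROS network can subscribe
        rate.sleep()

if __name__ == '__main__':
    gps_node()
```

On the remote computer, pointing ROS_MASTER_URI at the robot's ROS master is enough for a rospy.Subscriber('gps/fix', NavSatFix, callback) to receive the same stream over the WiFi network.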
2.4 The Navigation Method

The navigation algorithm was developed utilizing all the sensors: the GPS, IMU, wheel encoders, LiDAR and the RGB sensor of the Kinect. There were two navigation modes that enabled the robot to visit any accessible location on a farm and travel in a citrus grove automatically. In the first mode, the GPS, IMU and wheel encoder data were integrated by an extended Kalman filter to provide accurate odometry information. Then a Simultaneous Localization and Mapping (SLAM) algorithm combined the odometry data with the LiDAR data, which enabled the robot to do path planning and to navigate based on laser scans (Fig. 2). To visualize the position of the robot and send waypoints to it, a map server was built on the remote computer, and the Google map data was cached in the server so that a user could use the map offline.

Fig. 2. The sensor fusion diagram for navigation mode 1. GPS, IMU and wheel encoders were fused with an extended Kalman filter and generated filtered odometry data, which were then combined with laser scans for the SLAM algorithm
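As an illustration of the mode-1 fusion, the sketch below implements a planar extended Kalman filter whose prediction step uses wheel-encoder speed and IMU yaw rate, and whose correction step uses GPS position fixes. This is a simplified stand-in, not the filter actually used in the study (in ROS this role is typically filled by a package such as robot_localization), and all noise parameters are assumed.

```python
import numpy as np

class PlanarEKF:
    """Fuses wheel odometry + IMU (predict) with GPS fixes (update)."""

    def __init__(self):
        self.x = np.zeros(3)        # state: [x, y, yaw] in a local metric frame
        self.P = np.eye(3)          # state covariance

    def predict(self, v, omega, dt, q_pos=0.05, q_yaw=0.01):
        # v: speed from the wheel encoders; omega: yaw rate from the IMU.
        x, y, yaw = self.x
        self.x = np.array([x + v * dt * np.cos(yaw),
                           y + v * dt * np.sin(yaw),
                           yaw + omega * dt])
        F = np.array([[1.0, 0.0, -v * dt * np.sin(yaw)],   # motion-model Jacobian
                      [0.0, 1.0,  v * dt * np.cos(yaw)],
                      [0.0, 0.0,  1.0]])
        self.P = F @ self.P @ F.T + np.diag([q_pos, q_pos, q_yaw]) * dt

    def update_gps(self, gps_xy, r=0.5):
        # gps_xy: a GPS fix already projected into the local metric frame.
        H = np.array([[1.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0]])                    # GPS observes position only
        R = np.eye(2) * r ** 2
        innovation = np.asarray(gps_xy) - H @ self.x
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ innovation
        self.P = (np.eye(3) - K @ H) @ self.P
```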
In the second mode, vision data from the RGB sensor of the Kinect was added. The vision data was processed to determine the robot's relative position and orientation in a row and to assign real-time waypoints to the SLAM algorithm as navigation goals.
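The paper does not name the interface through which these waypoints are handed to the planner; as one hedged illustration, in ROS a real-time waypoint is commonly sent as a goal to the move_base action server of the navigation stack. The coordinates below are placeholders.

```python
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

def send_waypoint(x, y):
    # Send one navigation goal in the map frame and wait for the planner.
    client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
    client.wait_for_server()
    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = 'map'
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = x
    goal.target_pose.pose.position.y = y
    goal.target_pose.pose.orientation.w = 1.0        # heading left to the local planner
    client.send_goal(goal)
    client.wait_for_result()

if __name__ == '__main__':
    rospy.init_node('waypoint_sender')
    send_waypoint(2.0, 0.0)                          # placeholder waypoint
```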
2.5 Machine Vision Algorithms

In the second navigation mode, machine vision was used to determine the robot's relative position and orientation in a row using the RGB sensor data from the Kinect.
The algorithm detected the upper edges of citrus canopies and fit two lines along the edges using Hough transform. Then the central line of a row was calculated from the two lines of the upper edges. Based on the position and direction of the central line, the robot’s relative position and orientation could
be determined mathematically. The detailed steps are described below:

1. Convert the RGB image (each video frame) into a Green_ness image using equation (1) and a grey-scale image.
2. Apply a Green_ness threshold and a brightness threshold to detect citrus canopies.
3. Convert the filtered image into a Canny edge map.
4. Apply the probabilistic Hough line transform to find lines along the upper edges of the citrus canopies.
5. Fit two lines to the left and right upper edges.
6. Calculate the intersection of the two lines and the central line.
7. Calculate the angle between the robot's heading and the center line of the road.
8. Calculate the distance between the center of the robot and the center line of the road.

(1)
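The eight steps can be sketched in OpenCV as below. Equation (1), which defines the Green_ness image, is not reproduced in this text, so the sketch assumes the common excess-green index 2G − R − B in its place; the thresholds and Hough parameters are likewise illustrative rather than the authors' values.

```python
import cv2
import numpy as np

def row_center(frame_bgr, green_thresh=40, bright_thresh=60):
    img = frame_bgr.astype(np.int16)
    b, g, r = img[..., 0], img[..., 1], img[..., 2]
    greenness = np.clip(2 * g - r - b, 0, 255).astype(np.uint8)  # step 1 (assumed form of eq. (1))
    grey = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)           # step 1: grey-scale image
    mask = (greenness > green_thresh) & (grey > bright_thresh)   # step 2: canopy mask
    edges = cv2.Canny(np.where(mask, grey, 0).astype(np.uint8), 50, 150)  # step 3
    segs = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=60,
                           minLineLength=80, maxLineGap=20)      # step 4
    if segs is None:
        return None
    h, w = grey.shape
    left = [s for s in segs[:, 0] if s[0] + s[2] < w]            # segment midpoint left of centre
    right = [s for s in segs[:, 0] if s[0] + s[2] >= w]
    if not left or not right:
        return None

    def fit(side):                                   # step 5: one line per side, x = m*y + c
        pts = np.asarray(side).reshape(-1, 2)
        return np.polyfit(pts[:, 1], pts[:, 0], 1)

    (ml, cl), (mr, cr) = fit(left), fit(right)
    if abs(ml - mr) < 1e-6:
        return None
    y_v = (cr - cl) / (ml - mr)                      # step 6: intersection of the two lines
    mc, cc = (ml + mr) / 2, (cl + cr) / 2            # step 6: central line as their bisector
    heading = np.degrees(np.arctan(mc))              # step 7: heading angle to the centre line
    offset = (mc * h + cc) - w / 2                   # step 8: lateral offset (px) at image bottom
    return heading, offset, y_v
```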
2.6 Data visualization and storage

Sensor data were visualized and stored in real time through the WiFi network. The computer on the robot and the computer at the remote station were synchronized in ROS, and so were the sensor data. The open source data visualization library MAPVIZ was used to show the Google map as well as all other sensor data. Raw data were recorded in ROS bagfiles, which can be used to play the data back. Fig. 3 shows the data that were collected and visualized in MAPVIZ.
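As a small sketch of the playback just mentioned, the rosbag Python API can iterate over recorded messages; the file and topic names here are illustrative.

```python
import rosbag

# Replay recorded GPS fixes from a bagfile (names are placeholders).
with rosbag.Bag('field_run.bag') as bag:
    for topic, msg, stamp in bag.read_messages(topics=['gps/fix']):
        print(stamp.to_sec(), topic)
```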
Fig. 3. Data visualization using the MAPVIZ library

3. RESULTS AND DISCUSSIONS

3.1 Evaluation of the navigation method

The two navigation modes were evaluated on localization accuracy, obstacle avoidance ability and row following performance. To test the localization accuracy, seven waypoints were set on the Google map using MAPVIZ. When the robot stopped at each waypoint, it collected 50 GPS points, and their mean was taken as the true location. An average localization error of 0.43 m was achieved using the extended Kalman filter. Table 1 shows the results of the seven runs.

Table 1. Localization accuracy of the robot using the extended Kalman filter.

# of Run     1      2      3      4      5      6      7      Mean error
Error (m)    0.27   0.41   0.38   0.49   0.56   0.29   0.61   0.43

The obstacle avoidance and row following performance was first evaluated in simulation software named Gazebo. In Gazebo, the robot's model and a simplified model of a row of citrus trees were built, and the navigation algorithms were evaluated in simulation. Fig. 4 shows the Gazebo model in which the robot traveled along a curved row of citrus trees.

Fig. 4. Gazebo simulation of the Husky robot traveling in a curved row of citrus trees using the navigation algorithms

Field evaluation was also conducted in a citrus grove in Citra, Florida, USA. The robot was able to successfully travel in a row. The actual localization accuracy in a row will be evaluated in further studies. Fig. 5 shows an example video frame from the Kinect in which the upper edges of the tree canopies were detected and the center line of the row was calculated and displayed.
Fig. 5. Real-time central line (orange solid line) detection by machine vision, and determination of the robot's relative position and orientation

3.2 Data visualization

Real-time data visualization was achieved wirelessly on the remote computer. Fig. 6 shows an example of MAPVIZ displaying the Google map, the GPS locations (the yellow line), and the live video stream from the Kinect.

Fig. 6. The MAPVIZ software visualizing the Google map, the GPS locations (the yellow line) and the live video stream (top-left corner of the map) from the Kinect

3.3 Discussions

The navigation system was designed to be applied in a smart farm. When combined, the two navigation modes had the ability to guide the robot to visit each citrus tree in a citrus grove without human interference. For tasks such as yield mapping and canopy phenotyping, it had the capability to carry sensors and fulfill all the tasks by itself. The system was connected to the central computer at the remote station through the WiFi network. It shared sensor data and processed data with the remote computer in real time. Such a configuration can be expanded by adding more sensing systems to form a larger IoT. In this study, the WiFi network was built using a consumer-level router, which had small coverage. For real smart farm applications, an industrial-scale WiFi network should be used to cover the entire farm. Nevertheless, the concept of the system is suitable for a future smart farm, where sensing systems would be integrated into an IoT and sensor data would be monitored, visualized and analyzed using big data tools.

4. CONCLUSIONS

A navigation system was developed to guide a ground robot to travel from a farm station to a citrus grove and to follow each row of the grove automatically while avoiding obstacles encountered on the way. The sensor data and the robot's status were monitored and recorded from the farm station in real time. The system also enabled the computer at the farm station to interrupt and control the robot manually in emergency situations. The communication between the robot and the farm station was established over a WiFi network, to which more sensing systems could be added easily. The system represented a simple IoT and can be expanded to an IoT of any size.

REFERENCES

Auat Cheein, F.A. & Carelli, R., 2013. Agricultural robotics: Unmanned robotic service units in agricultural tasks. IEEE Industrial Electronics Magazine, 7(3), pp. 48–58.

Barawid, O.C. et al., 2007. Development of an autonomous navigation system using a two-dimensional laser scanner in an orchard application. Biosystems Engineering, 96(2), pp. 139–149.

Bayar, G., Bergerman, M., Koku, A.B. & Ilhan Konukseven, E., 2015. Localization and control of an autonomous orchard vehicle. Computers and Electronics in Agriculture, 115, pp. 118–128.

Hansen, S., Bayramoglu, E., Andersen, J.C., Ravn, O., Andersen, N. & Poulsen, N.K., 2011. Orchard navigation using derivative free Kalman filtering. In American Control Conference (ACC), 2011, pp. 4679–4684. IEEE.
Iida, M. & Burks, T.F., 2002. Ultrasonic sensor development for automatic steering control of orchard tractor. Pp. 221–229 in Automation Technology for Off-Road Equipment, Proceedings of the July 26-27, 2002 Conference (Chicago, Illinois, USA) 701P0502. (doi:10.13031/2013.10010).
Kise, M., Noguchi, N., Ishii, K. & Terao, H., 2002. The development of the autonomous tractor with steering controller applied by optimal control. Pp. 367–373 in Automation Technology for Off-Road Equipment, Proceedings of the July 26-27, 2002 Conference (Chicago, Illinois, USA) 701P0502. (doi:10.13031/2013.10026).

Sharifi, M. & Chen, X., 2015. A novel vision based row guidance approach for navigation of agricultural mobile robots in orchards. In Automation, Robotics and Applications (ICARA), 2015 6th International Conference on, pp. 251–255. IEEE.

Subramanian, V., Burks, T.F. & Arroyo, A.A., 2006. Development of machine vision and laser radar based autonomous vehicle guidance systems for citrus grove navigation. Computers and Electronics in Agriculture, 53(2), pp. 130–143.