Application of an image and environmental sensor network for automated greenhouse insect pest monitoring


Journal Pre-proof (Full length article)

Application of an Image and Environmental Sensor Network for Automated Greenhouse Insect Pest Monitoring
Dan Jeric Arcega Rustia, Chien Erh Lin, Jui-Yung Chung, Yi-Ji Zhuang, Ju-Chun Hsu, Ta-Te Lin

PII: S1226-8615(19)30471-6
DOI: https://doi.org/10.1016/j.aspen.2019.11.006
Reference: ASPEN 1473
To appear in: Journal of Asia-Pacific Entomology
Received Date: 28 July 2019
Revised Date: 18 October 2019
Accepted Date: 12 November 2019

Please cite this article as: D.J.A. Rustia, C. Erh Lin, J-Y. Chung, Y-J. Zhuang, J-C. Hsu, T-T. Lin, Application of an Image and Environmental Sensor Network for Automated Greenhouse Insect Pest Monitoring, Journal of Asia-Pacific Entomology (2019), doi: https://doi.org/10.1016/j.aspen.2019.11.006

This is a PDF file of an article that has undergone enhancements after acceptance, such as the addition of a cover page and metadata, and formatting for readability, but it is not yet the definitive version of record. This version will undergo additional copyediting, typesetting and review before it is published in its final form, but we are providing this version to give early visibility of the article. Please note that, during the production process, errors may be discovered which could affect the content, and all legal disclaimers that apply to the journal pertain.

© 2019 Published by Elsevier B.V. on behalf of Korean Society of Applied Entomology.

Application of an Image and Environmental Sensor Network for Automated Greenhouse Insect Pest Monitoring

Dan Jeric Arcega Rustia a, Chien Erh Lin a, Jui-Yung Chung b, Yi-Ji Zhuang c, Ju-Chun Hsu c, Ta-Te Lin a,*

a. Department of Bio-Industrial Mechatronics Engineering, National Taiwan University
b. Tainan District Agricultural Research and Extension Station, Council of Agriculture
c. Department of Entomology, National Taiwan University

* Corresponding author: Ta-Te Lin, Professor, Ph.D.
Department of Bio-Industrial Mechatronics Engineering, National Taiwan University
No. 1, Roosevelt Rd., Sec. 4, Taipei, Taiwan, ROC
TEL: 886-2-33665331; FAX: 886-2-23929416
E-mail: [email protected]

ABSTRACT

This work presents an automated insect pest counting and environmental condition monitoring system using integrated camera modules and an embedded system as the sensor node in a wireless sensor network. The sensor node can simultaneously acquire images of sticky paper traps and measure temperature, humidity, and light intensity levels in a greenhouse. An image processing algorithm was applied to automatically detect and count insect pests on a sticky trap, with a 93% average temporal detection accuracy compared with manual counting. The integrated monitoring system was implemented with multiple sensor nodes in a greenhouse, and experiments were performed to test its performance. Experimental results show that the automatic counting of the monitoring system is comparable with manual counting, and that the insect pest count information can be recorded continuously and effectively. Information on insect pest concentrations was further analyzed temporally and spatially together with environmental factors. Analyses of the experimental data reveal that the normalized hourly increase in the insect pest count appears to be associated with changes in light intensity, temperature, and relative humidity. With the proposed system, laborious manual counting can be circumvented, and timely assessment of insect pest and environmental information can be achieved. The system also offers an efficient tool for long-term insect pest behavior observations, as well as for practical applications in integrated pest management (IPM).

Keywords: greenhouse management; integrated pest management; image processing; support vector machines; wireless sensor network.


Introduction

Integrated pest management (IPM) is one of the core components of a holistic and efficient greenhouse management program. To take effective pest control action, it is necessary to monitor the activity and density of insect pests. One of the simplest methods for monitoring insect pests in greenhouses is the use of sticky paper traps. A typical sticky paper trap consists of a sticky glue layer mounted on a piece of colored cardboard to which insect pests are attracted. Sticky paper traps can yield quantitative information on the approximate population density and variety of insects in greenhouses or open fields. As a result, more precise pest control and management strategies can be applied, reducing pesticide use and crop damage. Currently, the identification and counting of insect pests on sticky paper traps are mainly done by manual inspection (Bashir et al., 2014; Devi and Roy, 2017), and there is still no standardized way to quantitatively automate the counting process (Liu et al., 2017). Manually counting and inspecting sticky paper traps can be very tedious, especially when the production area is relatively large. Additionally, insect pests are not easily identified by human inspection unless it is done by experts. Automating the process is therefore very helpful, allowing insect pest populations to be monitored more efficiently than by manual inspection (Miranda et al., 2015; Rustia and Lin, 2017; Liu et al., 2017).


Many works show the potential of using image processing algorithms to automatically count insect pests on sticky paper traps (Barbedo, 2014; Cho et al., 2007; Qiao et al., 2008). However, most algorithms were developed for sticky paper trap images obtained under controlled lighting (Cho et al., 2007; Liu et al., 2017; Espinoza et al., 2016; Moerkens et al., 2019). This only partially solves the problem, as it still requires manual collection of sticky paper traps from the field, and it loses the temporal information that instantaneous monitoring can provide. It is advisable to track the insect pest population using automatic counting algorithms so that insect pest outbreaks can be prevented early and more effectively (Solis-Sánchez et al., 2009; Weeks et al., 1999; Liu et al., 2017).

With the advent of low-cost embedded systems, the development of new technologies for agriculture has become faster and more feasible. One of the most useful yet challenging uses of such devices is wireless imaging. Several wireless imaging systems have been developed for insect monitoring that automatically collect sticky paper trap images using RGB cameras installed at different locations (Zhong et al., 2018; Miranda et al., 2015; López Granado et al., 2012). In those systems, the number of insects on the sticky paper trap images is counted through image processing algorithms. Likewise, Martin et al. (2008) developed a video camera network that detects insect pests on sticky paper traps by real-time video analysis for early pest detection. However, there has not been much work on integrating wireless imaging and environmental sensing specifically for insect pest monitoring, which could have great potential in many IPM applications.

An automated insect pest identification and environmental monitoring system can be used for continuous and precise monitoring of insect populations and activity. With simultaneous measurement of environmental conditions, it is possible to further investigate insect behavior as affected by factors such as temperature, relative humidity, and light intensity. Controlled experiments by Liang et al. (2010) showed that thrips flight activity peaks at certain times, such as in the morning and in the evening, usually at temperatures around 28°C. Sengonca and Liu (1999) analyzed the effects of humidity and temperature on the life cycle of whiteflies, showing that both parameters can cause a faster growth rate but a higher death rate during hot days. More recent studies show that whiteflies can be trapped using artificial light sources owing to their positive attraction to light (Shimoda and Honda, 2013; Stukenberg et al., 2015). However, those experiments monitored the insect pest populations and environmental conditions by manual inspection, which required considerable attention and effort. With a continuous insect pest population and environmental monitoring system, the flight activity and reproductive behavior of insect pests can be analyzed seamlessly, and insect pest outbreaks can be predicted and prevented.


This study aims to design and test an integrated system of imaging and environmental sensors that simultaneously and continuously acquires images of sticky paper traps and measures the temperature, humidity, and light intensity levels in a greenhouse. The presented system can be used for insect pest monitoring in both controlled and uncontrolled environments, enabling further analysis of the dynamics of insect pest behavior in relation to environmental conditions. An image processing algorithm was developed to automatically detect and count insect pests on sticky paper traps. The sensor modules are linked in a wireless sensor network to provide instantaneous insect pest count and environmental information. The performance of the automated monitoring system was tested in a nursery greenhouse, and its efficacy was demonstrated through analyses of spatial and temporal insect pest concentration data in association with environmental conditions.

Materials and methods

Wireless sensor node design

Each wireless sensor node is made up of three core components: a Raspberry Pi 3 embedded system (Raspberry Pi Foundation, Inc., Cambridge, United Kingdom) as the main processing module, a Raspberry Pi Camera v2 module, and add-on environmental sensors, as illustrated in the block diagram in Fig. 1a. The Raspberry Pi 3 is an ARM Cortex-based embedded single-board computer capable of image/video acquisition, sensor handling, and many other tasks. The wireless sensor node is designed in a modular manner, so environmental sensors can be replaced or added depending on the needs of the application. It can be used to detect sudden changes in environmental conditions inside the farm that may be related to insect pest appearance and activity. The specifications of the environmental sensors are listed in Table 1. The Raspberry Pi Camera v2 is a fixed-focus, 8-megapixel RGB camera connected to the Raspberry Pi 3 via the camera serial interface (CSI). As shown in Fig. 1c, the wireless sensor node is hung 8-10 cm above the plants, with the camera positioned approximately 8 cm from the sticky paper trap. Based on actual measurements, the device consumes around 400 mA while running the sensor data collection program in the background, and around 600 mA while data are sent to the server via WiFi. In our experiments, each wireless sensor node was connected to an AC power socket available in the greenhouse. The sensor node enclosures are built to withstand vertically dripping water owing to their waterproof top design; however, the back and bottom faces of the device are not completely protected from water sprays. In terms of protection from dust and water, the device can be rated up to IPX3 on the Ingress Protection scale.

Wireless sensor network architecture


The wireless sensor network is organized in a star topology with multiple integrated wireless sensor nodes, as shown in Fig. 2. Through the star topology, each node can communicate seamlessly and directly transmit its data through the Internet via a router. Each wireless sensor node sends the environmental data every 5 minutes using the UDP protocol and sends the images via HTTP POST. To prevent data loss, all data are also stored directly on the back-up SD memory card of each node. All the gathered data are stored on a central server running Windows 7 with an Intel Core i5 processor and an NVIDIA GTX630 GPU. The server uses Apache as its web server software and MySQL as its database. Image processing and data analysis are done on the server, and the processed data are made available online through a website that can be accessed from a computer or smartphone. The website displays the approximate spatial location of each node together with the processed sticky paper trap images for monitoring on-site conditions. Temporal data are also provided to observe changes in insect pest count and environmental conditions. Using both the temporal and spatial information collected, data analyses relating the insect pest count and environmental conditions are displayed on the website. Meanwhile, the base station acts as an on-site real-time display for the farm owner that shows similar data to the website.
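As a rough sketch of the node-to-server transmission described above (the server address, port, and JSON payload layout are illustrative assumptions, not the actual protocol used in the paper):

```python
import json
import socket

SERVER_ADDR = ("192.0.2.10", 5005)  # hypothetical server IP and UDP port

def make_reading(node_id, temp_c, rh_pct, lux):
    """Serialize one 5-minute environmental reading as a UDP datagram payload."""
    return json.dumps({
        "node": node_id,
        "temperature": temp_c,  # degrees C
        "humidity": rh_pct,     # % relative humidity
        "light": lux,           # light intensity
    }).encode("utf-8")

def send_reading(payload, server=SERVER_ADDR):
    """Fire-and-forget UDP send; each node also keeps a back-up copy on its SD card."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, server)
```

The much larger trap images would go over HTTP POST rather than UDP, which trades speed for delivery confirmation on the bulkier payload.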


Experimental setup and data collection

The functionality of the system was initially tested in the laboratory after hardware and software implementation. To further evaluate its performance, the system was installed and tested in a 528.8 m2 seedling nursery greenhouse in Chiayi County, Taiwan, with tomato seedlings as the main crop. The experimental setup is shown in Fig. 3. The node locations, in the crop areas or by the entrances, were suggested by the greenhouse operators. At these locations, seven wireless sensor nodes and one base station were installed. From the operators' past observations using sticky paper traps, the most common insects found inside the greenhouse are whiteflies, thrips, flies, and aphids. According to past results, several observed factors possibly affect the number of insect pests in the greenhouse. The first is the frequent opening of the entrance door, which causes a sudden burst of air flow that disturbs the insects and causes them to fly. Secondly, the rotating in-house fans (R fan in Fig. 3) and wall-mounted fans (W fan in Fig. 3), which are constantly turned on all day regardless of the temperature inside the farm, can cause a similar phenomenon, because the air flow can disturb the activity of the insect pests and cause them to seek a more stable location (Isaacs et al., 1999). Lastly, temperature affects the growth and development rate of insect pests (Jaworski and Hilszczański, 2013). From these assumptions, it can be inferred that insect pest activities and population density depend mostly on the insects' ideal environmental conditions for reproduction and flight, in agreement with related literature (Sengonca and Liu, 1999; Jaworski and Hilszczański, 2013; Isaacs et al., 1999).

The experiments were conducted during the summer season in Taiwan for approximately 45 days, segmented into three observation periods of 15 days each. The observation periods were separated to allow replacement of the A5-sized sticky paper traps whenever they were almost full of captured insects. Sticky paper trap images were transmitted to the server every 10 minutes from 7 AM to 7 PM. Image acquisition at night was avoided since it could affect the behavior and activity of the insect pests. In fact, very little increase in the insect pest count was observed at night in previous experiments, since most of the insects inside the farm are more active during daytime (Isaacs et al., 1999; Jha et al., 2006; Shimoda and Honda, 2013).
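The capture schedule above (one trap image every 10 minutes, daytime only) reduces to a simple time-window check, sketched here as a minimal stand-alone function:

```python
from datetime import datetime, time

CAPTURE_START = time(7, 0)   # 7 AM
CAPTURE_END = time(19, 0)    # 7 PM
INTERVAL_MIN = 10            # one trap image every 10 minutes

def should_capture(now: datetime) -> bool:
    """True when a trap image should be taken: within the daytime window
    (night acquisition is avoided so as not to disturb the insects) and
    on a 10-minute boundary."""
    in_window = CAPTURE_START <= now.time() <= CAPTURE_END
    return in_window and now.minute % INTERVAL_MIN == 0
```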

Automatic insect pest counting method

The developed image processing algorithm aims to segment objects from the background and filter out non-insect objects in a sticky paper trap image so as to obtain the correct number of insects. All image processing procedures are performed on the remote server using the OpenCV 3.1 C++ image processing library (Bradski, 2000) and the Qt 5.7.0 cross-platform software development tool (The Qt Company, Espoo, Finland). The sticky paper trap images, with a resolution of 3280 × 2464 pixels, undergo the series of processing steps shown in Fig. 4. First, RGB-to-LUV color model conversion is performed on the acquired image to extract the V-channel color component. Next, the objects in the sticky paper trap image are segmented by removing the yellow background using static binary thresholding (Moerkens et al., 2019; Cho et al., 2007; Boissard et al., 2008). From image analysis, V-channel values of 85 to 120 were selected as the segmentation threshold for removing the yellow background, based on the average histogram of many images collected in a preliminary experiment under various daytime lighting conditions. Specifically, the V-channel values at low illumination range from about 85 to 100, and at normal illumination from 100 to 120; the threshold range therefore covers the variability in lighting conditions. For hole removal and touching-object separation, morphological closing is applied, followed by a median blur for denoising. Selective blob detection is then performed on the pre-processed image to locate the centroids of the blobs (objects) in the image. Blob analysis selects objects by area: blobs smaller than 16 × 16 or larger than 128 × 128 are filtered out, since blobs smaller than 16 × 16 mostly resemble dirt and dust, while blobs larger than 128 × 128 are usually glares. Each remaining blob is cropped into an individual 128 × 128 RGB image as input later on


to the classifier. The cropping size was determined from the sizes of different insects: small insects measure less than 64 × 64 pixels, while medium and large insects fall below 128 × 128. The pre-defined size ensures that insect body parts such as legs and wings are not cut off by cropping. The cropped 128 × 128 RGB images obtained from the blob centroids are classified by a support vector machine (SVM). Finally, the images are labeled after classification by enclosing insects in red boxes and non-insects in blue boxes. The number of insects is determined from the number of blobs (objects) classified as insects.

SVM is a machine learning classification method in which two or more classes are linearly or non-linearly separated according to their features. Compared to other classification methods, SVM is fast but requires trial and error to find optimal parameters (Chang and Lin, 2011). To prepare the SVM training samples, the selective blob detection method was used to automatically crop 128 × 128 RGB images from the 3280 × 2464 sticky paper trap images, giving a total of 2500 training samples and 1500 validation samples. The insect objects were classified by entomologists through manual inspection of the collected images, verified against the sticky paper traps using microscopes. The insect objects found included aphids, gnats, flies, thrips, and whiteflies. Non-insect objects comprised background, water droplets, glares, shadows, and other tiny foreign objects such as dust and dirt.

The feature extraction method for the insect pest algorithm forms feature vectors from the raw 3-channel RGB pixel values of each 128 × 128 image. These feature vectors are the inputs to the trained SVM model for classification into insect or non-insect classes. The main advantage of using raw RGB pixel values for SVM classification is that they are more effective for generalized classification cases, especially when there is insufficient prior knowledge of the classes. As in facial recognition (Azzopardi et al., 2016; Kim et al., 2002), insects and non-insects have many variants; using traditional color or geometric features makes the classifier too constrained to a specific feature set and weak against newly encountered variants of the classification targets. This was borne out here, as the classifier model in this work surpassed our previous model based on color, shape, and morphological features, which had a lower average F1-score of 0.93 (Rustia et al., 2017). The SVM classification model was trained using the CvSVM class in the OpenCV 3.1 library, by feeding the raw RGB pixel feature vectors from the sorted 128 × 128 insect and non-insect training images to the classifier. An SVM model with an RBF (radial basis function) kernel was used, trained automatically on the 2500 training samples using the OpenCV built-in model training function.
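The segmentation and blob-filtering rules described above can be sketched in a few lines (pure Python on a nested-list "image"; the polarity of the V-channel band and the reading of the 16 × 16 / 128 × 128 limits as area bounds are assumptions on our part):

```python
V_LO, V_HI = 85, 120  # V-channel segmentation band reported in the text

def foreground_mask(v_channel):
    """Binary mask from the LUV V channel: pixels inside the 85-120 band
    are treated as yellow trap background (polarity assumed) and set to 0."""
    return [[0 if V_LO <= v <= V_HI else 1 for v in row] for row in v_channel]

def keep_blob(width, height):
    """Area filter applied before SVM classification: discard blobs smaller
    than 16 x 16 (mostly dirt and dust) or larger than 128 x 128 (mostly glare)."""
    return 16 * 16 <= width * height <= 128 * 128
```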

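The classification step can be illustrated with the RBF-SVM decision function itself, a toy sketch with hand-picked support vectors rather than the trained model (in the actual system, each feature vector is the flattened raw RGB pixel data of a 128 × 128 crop, and the insect/non-insect label convention below is assumed):

```python
import math

def rbf_kernel(x, v, gamma):
    """RBF kernel K(x, v) = exp(-gamma * ||x - v||^2)."""
    dist2 = sum((a - b) ** 2 for a, b in zip(x, v))
    return math.exp(-gamma * dist2)

def svm_classify(x, support_vectors, dual_coefs, bias, gamma=0.5):
    """Sign of the SVM decision function f(x) = sum_i (alpha_i * y_i) * K(x_i, x) + b,
    with the positive class taken as 'insect'."""
    score = sum(c * rbf_kernel(x, sv, gamma)
                for c, sv in zip(dual_coefs, support_vectors)) + bias
    return "insect" if score > 0 else "non-insect"
```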

Insect pest counting algorithm evaluation

The insect pest counting algorithm was evaluated at two levels: object level and image level. Object-level evaluation measures the performance of the classifier model alone, excluding the other components of the algorithm. It involves testing the trained SVM RBF classifier model on individual 128 × 128 test images by 5-fold cross-validation, using the 1500 validation samples, which are distinct from the training samples. The following derived indices were calculated for evaluation (Ding and Taylor, 2016):

Precision = TI / (TI + FI)    (1)

Recall = TI / (TI + FNI)    (2)

F1 = 2 × (Precision × Recall) / (Precision + Recall)    (3)

where TI = true insect, TNI = true non-insect, FI = false insect, and FNI = false non-insect.

For image-level evaluation, the algorithm was tested on 3280 × 2464 test images from the three experiments performed with the system, by comparing the automatic count of the algorithm with the manual count by entomologists. Throughout this text, the accuracy of the algorithm is based on the relative absolute difference between the manual and automatic counts:

Detection Accuracy (%) = (1 − |A − M| / M) × 100%    (4)


where A = the (true insect) automatic count by the algorithm and M = the manual count by entomologists. The algorithm was tested both spatially and temporally based on Equation (4). Spatial accuracy is computed from the absolute difference between the final automatic and manual insect pest counts, while temporal accuracy is computed from the absolute difference between the automatic and manual counts at a specified time. The spatial insect pest counts were also statistically analyzed using a t-test at the 0.05 significance level in R version 3.6.0 (R Core Team, 2013). These evaluation methods verify the capability of the algorithm to count the insect pests on the sticky paper traps correctly across locations and over time.

Results

Sample detection results

The mechanism of the algorithm is demonstrated by the sample processed images in Fig. 5. The foreground of the original RGB image (Fig. 5a) was separated from the background (Fig. 5b), then open blobs were closed using the morphological closing operation (Fig. 5c). A median filter (Fig. 5d) was applied to remove noise, and the blobs were then detected using blob analysis (Fig. 5e). Fig. 5e shows that glares, whether big or small, may be detected as blobs, as seen with the clustered glares at the lower left of the image. As the results in Fig. 5f show, however, glares smaller than a blob size of 16 × 16 were excluded


from the SVM classification stage to reduce the error in counting. The sequence of image processing steps demonstrates the effectiveness of the algorithm in filtering non-insect objects that are present in the image.
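The evaluation metrics defined in Eqs. (1)-(4) are straightforward to compute; a minimal sketch:

```python
def precision(ti, fi):
    """Eq. (1): TI / (TI + FI)."""
    return ti / (ti + fi)

def recall(ti, fni):
    """Eq. (2): TI / (TI + FNI)."""
    return ti / (ti + fni)

def f1_score(p, r):
    """Eq. (3): harmonic mean of precision and recall."""
    return 2 * p * r / (p + r)

def detection_accuracy(auto_count, manual_count):
    """Eq. (4): (1 - |A - M| / M) * 100, with M the manual reference count."""
    return (1 - abs(auto_count - manual_count) / manual_count) * 100
```

For example, an automatic count of 93 against a manual count of 100 gives a detection accuracy of 93%.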

Insect pest counting algorithm evaluation

In object-level evaluation, the trained SVM RBF kernel model for classifying insect and non-insect objects achieved an average F1-score of 0.97 (insect: F1-score = 0.98; non-insect: F1-score = 0.95). On inspecting the validation results, the most common source of misclassification was appearance similarity between objects, such as insects that look like dirt or stains, which comprised 3.9% of the 5.5% total error from non-insect misclassification; the remaining 1.6% came from rare cases where multiple insects appear in a single 128 × 128 image or objects touch/overlap. This means the trained model was able to distinguish between insect and non-insect objects with minimal error.

The algorithm was then tested both spatially and temporally at the image level based on Equation (4). After the 15 days of each experiment, the total counts per node and the averages of the final insect pest counts were obtained, as shown in Fig. 6. The average spatial accuracy of the algorithm ranged from 90% to 96% for each individual node. The t-statistics from a paired t-test comparing the mean automatic and manual counts show values


close to zero, which means the automatic counting results were not significantly different from the manual counts. However, there were still instances in which the algorithm failed to achieve high detection accuracy, as seen in the counting comparisons for node 3 in Fig. 6c. The discrepancies are caused by dirt left by water droplets, which is sometimes identified as insects. This is a challenging problem, since such unexpected instances occur whenever the plants in the greenhouse are watered; maintaining the cleanliness of the greenhouse is therefore also recommended to avoid counting errors. Nevertheless, these results show that the system meets its goal of accurately counting the number of insect pests at each individual node.

The temporal accuracy was also tested, to verify that the system can reflect temporal insect pest count information, which is useful for early insect pest detection. Temporal accuracy is obtained from the absolute difference in count per image. To illustrate, the final daily counts of the nodes with the maximum and minimum counts in each experiment were selected, as shown in Fig. 7. The automatic and manual counts are consistent with each other. Including all the other nodes, the average temporal accuracy of the algorithm per individual node is about 93%, based on the absolute difference of counts per image taken at different times. One main cause of error in temporal counting is the presence of droplets close to the insects, which disappear after a certain period of time. Another source of error is whenever glares are captured together with the


insect, which causes missed detections. These errors are difficult to avoid owing to variations in lighting and operational conditions in the greenhouse when the sticky paper trap images are acquired.

Temporal information from the insect pest counts

Fig. 8a shows the temporal insect pest counts from the three consecutively conducted experiments, while Fig. 8b shows the total insect pest count in each experiment. Compared to the other nodes, node 4 had the highest temporal insect pest count over the three experiments. The first experiment also showed a sudden increase in count on day 7 (Fig. 8b), suggesting that external factors such as environmental or weather conditions disturbed the behavior of the insect pests. Comparing the three 15-day experiments, the first experiment has the higher pest count. Meanwhile, Fig. 9 shows the average hourly normalized frequency of change (Δ) in insect count, temperature, relative humidity, and light intensity for the three experiments. The temporal change in each environmental parameter was obtained by taking the derivative of the measured time series from the sensor nodes. The results indicate a possible relationship between the changes in environmental conditions and the insect pest count, as suspected from the results shown in Fig. 8b. This shows an advantage of the proposed automated system over manual inspection: the system can record


the counts and environmental conditions simultaneously and continuously. Environmental effects on insect behavior can therefore be studied further.
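A minimal sketch of this change analysis, using first differences as a stand-in for the derivative and normalizing by the peak absolute change so that counts and environmental variables share one scale (the exact normalization used in the paper is not specified, so this is an assumption):

```python
def hourly_change(series):
    """First difference of an hourly-averaged series: delta[i] = x[i+1] - x[i]."""
    return [b - a for a, b in zip(series, series[1:])]

def normalize_changes(deltas):
    """Scale the changes by the maximum absolute change, mapping them into
    [-1, 1] so parameters with different units can be compared on one axis."""
    peak = max((abs(d) for d in deltas), default=0) or 1
    return [d / peak for d in deltas]
```

Applying both functions to each hourly series (insect count, temperature, humidity, light) yields directly comparable Δ curves like those in Fig. 9.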

Spatial information from the insect pest counts

One purpose of using a wireless camera and environmental sensor network, aside from making sticky paper trap inspection more convenient, is to determine possible target locations for more precise pesticide application or management. Furthermore, the environmental data obtained can be used to identify the environmental parameter with the greatest effect on insect activity. Fig. 10 shows the average insect pest counts and environmental sensor data collected from each node over the three experiments, plotted over the spatial layout of the greenhouse shown in Fig. 3. The average measured data from each sensor node are shown in Figs. 10a, 10b, 10c, and 10d for insect pest count, temperature, relative humidity, and light intensity, respectively, and the corresponding spatial distributions are displayed in Figs. 10e, 10f, 10g, and 10h. For the insect pest count, sensor node 4 has the highest average values (Fig. 10a), most likely caused by the sudden gusts of wind when the door where sensor node 4 was installed is opened. This kind of information can be used to find the hotspots of insect activity in a greenhouse and to further improve its management. It can also be observed that there were fewer variations in insect


counts at nodes 1 and 2. This is possibly due to the W fans (shown in Fig. 3), which are continuously in operation and cause a constant, subtle airflow through the greenhouse. Since nodes 1 and 2 are closest to the W fans, fewer insects were detected there, as expected.

To further analyze the data obtained by the system, the composition of insect pests detected on the sticky paper trap images was also inspected manually by entomologists, as shown in Fig. 11. Among the insects detected, the percentages of the most harmful insect pests (whiteflies, thrips, and aphids) were given particular attention. Fig. 11 shows that a high percentage of the target insect counts were thrips, meaning the environmental conditions during the observation periods were possibly more favorable for the reproduction and activity of thrips (Jha et al., 2009; Yadav and Chang, 2014). Whiteflies were also present, but there were very few aphids. Some other insects were found as well, including gnats, flies, and moths. This shows that the automatic insect pest count results consist mostly of insects that should be monitored carefully.

Discussion

System and algorithm evaluation


It was demonstrated that the automatic and manual counts are consistent with each other. The small differences in the counts at each node are minimal, so the automatic counts reflect the count information needed. The application was shown not to require overly complex algorithms for successful detection and identification, although this excludes very special cases, such as extremely crowded insect pests, that the presented algorithm cannot easily handle. Rather, the results highlight the potential of implementing an insect pest counting algorithm even in uncontrolled environments. On manual inspection, some misclassifications stem from dirt classified as insects; the dirt sometimes comes from dried water splashes and airborne dust. This can be prevented by maintaining the cleanliness of the greenhouse, for example by keeping soil and other foreign objects from sticking to the traps. Another source of error is glares captured together with the insects, which cause missed detections. These errors are difficult to avoid owing to variations in lighting and operational conditions in the greenhouse when the sticky paper trap images are acquired. This implies a limitation of any automatic insect counting algorithm: it is nearly impossible to detect the objects perfectly without accounting for non-insect objects. Nevertheless, these results show that the system meets its goal of accurately counting insect pests at each individual node despite the uncontrolled environment. This differs from some previously developed methods, which do

-21-

not consider the presence of non-insect objects in the image and perform the detection without excluding the false detections.

Possible effects of environmental conditions on insect pest behavior

Differences can be observed in the rate of increase of the insect pest count at each node installed at different locations in the greenhouse. The rates of increase at each node were also consistent across the 15-day experiment periods, since no specific action was taken to reduce the insect population (Fig. 8). However, the three 15-day experiments differed in weather conditions. In experiment 1, a typhoon hit the area around days 8-12 (Fig. 8b), while the weather was consistently dry in experiments 2 and 3. Despite its roofing and protection, the greenhouse was still affected by the harsh weather, especially the strong gusts of wind, which disturbed the activity of the insect pests. The wind also caused large, sudden variations in temperature and humidity throughout the farm, prompting the insect pests to seek more suitable locations. As a result, the insect pests took flight, and some of them were captured by the sticky paper traps (Isaacs et al., 1999; Liang et al., 2010). Based on other literature, the temporal information from the insect pest count and the instantaneous environmental data can possibly be used together to further understand insect


pest population dynamics (Jha et al., 2009; Yadav and Chang, 2014). In our previous work, similarities were found between the temporal changes in the insect pest count and environmental parameters such as temperature, relative humidity, and light intensity (Rustia and Lin, 2017). For the three 15-day experiments, it was apparent that the flight of the insect pests is not uniform throughout the day (Fig. 9). Instead, insect activity is more frequent when there is a drastic change in environmental conditions, especially during the morning or late afternoon. This means that peak insect pest activity is affected by changes in the monitored environmental conditions. This is consistent with Stukenberg et al. (2015), in which insect activity was reported to be highly affected by light, and with the experimental results of Jha et al. (2009), which showed that thrips follow a specific flight pattern. Therefore, the information obtained from the proposed system is valuable for studying insect pest behavior in association with environmental conditions, and could be used to develop an alarm system and to predict insect pest flight. Among the environmental parameters observed, the humidity levels were high most of the time, as shown in Fig. 10c. The general ideal humidity level for reproduction of the target insects is around 70-80% RH (Yadav and Chang, 2014), and the ideal temperature for reproduction of insect pests is around 25-34°C (Wang and Tsai, 1996; Antignus et al., 2001). Therefore, the relative humidity and temperature measured in the experiments were in the range suitable for reproduction of insects. This
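As a sketch of how the temporal count and environmental streams could be analyzed jointly, the hourly change in the cumulative trap count can be correlated with each environmental series. The DataFrame layout and column names below ('count', 'temperature', 'humidity', 'light') are assumptions for illustration, not the system's actual data schema:

```python
import pandas as pd

def hourly_activity_correlation(df):
    """Correlate hourly insect activity with environmental parameters.

    `df` is assumed to be indexed by timestamp, with a cumulative 'count'
    column plus 'temperature', 'humidity', and 'light' columns. Activity
    is taken as the hourly increase in the cumulative trap count."""
    hourly = df.resample("1h").mean()
    # New captures per hour; cumulative trap counts should not decrease,
    # so negative diffs (e.g. after trap replacement) are clipped to 0.
    activity = hourly["count"].diff().clip(lower=0)
    env = hourly[["temperature", "humidity", "light"]]
    # Pearson correlation of each environmental series with activity.
    return env.apply(lambda col: col.corr(activity))
```

A strong positive coefficient for, say, temperature would mirror the kind of count-environment similarity reported in our previous work; the function is only a sketch of the analysis idea, not the published method.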


means that the system was able to obtain the same observations as those collected in controlled experiments. By combining the insect pest count and environmental information, researchers can also use the system to reduce the burden of inspecting sticky paper traps and reading environmental sensor data loggers. This is even more helpful when studying insect pest behavior over larger areas, which would otherwise require more people to conduct the observations. Since the flight rate of insects was found to be associated with environmental conditions, the information collected by the automated system can further be used by the greenhouse manager to devise environmental control plans that reduce the flight rate of insect pests, so that pesticide application can be avoided or optimized during crop production.
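As a minimal sketch of how the collected environmental data could feed such a control plan, the reproduction-favorable ranges cited above (roughly 70-80% RH and 25-34°C) can be turned into a simple hourly flag. The function name, and the decision to act on these exact thresholds, are assumptions for illustration only:

```python
def reproduction_risk(temperature_c, humidity_rh):
    """Return True when conditions fall in the ranges reported as favorable
    for reproduction of the target insect pests: around 25-34 deg C (Wang
    and Tsai, 1996; Antignus et al., 2001) and 70-80% RH (Yadav and Chang,
    2014)."""
    return 25.0 <= temperature_c <= 34.0 and 70.0 <= humidity_rh <= 80.0

# A manager could scan the hourly sensor log and count risky hours:
hourly_log = [(28.5, 76.0), (31.0, 72.5), (23.0, 85.0)]  # made-up readings
risky_hours = sum(reproduction_risk(t, rh) for t, rh in hourly_log)
print(risky_hours)  # prints 2
```

In practice such a flag would be one input among several (e.g. observed count trends), not a standalone trigger for environmental control.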

Conclusion

The hardware and software design and implementation of an integrated imaging and environmental sensor network system for automated insect pest monitoring were presented. This work improves on using wireless cameras alone by integrating both wireless cameras and environmental sensors to collect more potentially valuable information for many possible insect pest monitoring applications. The key contribution of this work is an automated, integrated system that provides quantitative and precise acquisition of insect pest and environmental information simultaneously, facilitating IPM and greenhouse management. With this system, laborious manual counting can be circumvented and timely assessment of insect pest and environmental information can be achieved. It is possible to quantitatively and effectively investigate the effects of environmental factors on the appearance and activity of insect pests in an uncontrolled agricultural environment. Furthermore, the insect pest counting algorithm is able to overcome the constraints of instantaneous observation, which may involve the presence of glare, dirt, and droplets, a primary problem in insect pest counting. Currently, the algorithm cannot identify insects by species, but this is considered one of the possible improvements to the system. The system should also be installed at more experimental sites for comparison. Several potential applications, such as early warning of insect pest infestation, pesticide control, and insect behavior studies, can be further developed based on the proposed system.

Acknowledgements

This work was supported by the Taiwan Council of Agriculture, ROC (Grant No. 106AS-18.2.1-ST-a1). The authors also wish to thank Mr. Yu-Chia Lu for his kind permission and assistance in testing the system in a seedling nursery greenhouse.


References

Antignus, Y., Nestel, D., Cohen, S., Lapidot, M., 2001. Ultraviolet-deficient greenhouse environment affects whitefly attraction and flight-behavior. Environ. Entomol., 30(2), 394-399. https://doi.org/10.1603/0046-225X-30.2.394.

Azzopardi, G., Greco, A., Vento, M., 2016. Gender recognition from face images using a fusion of SVM classifiers. In: International Conference on Image Analysis and Recognition (ICIAR), Póvoa de Varzim, Portugal, pp. 533-538.

Barbedo, J. G. A., 2014. Using digital image processing for counting whiteflies on soybean leaves. J. Asia-Pac. Entomol., 17, 685-694. https://doi.org/10.1016/j.aspen.2014.06.014.

Bashir, M. A., Alvi, A. M., Naz, H., 2014. Effectiveness of sticky traps in monitoring of insects. J. Agric., Food, Environ. Sci., 1.

Boissard, P., Martin, V., Moisan, S., 2008. A cognitive vision approach to early pest detection in greenhouse crops. Comput. Electron. Agric., 62(2), 81-93.

Bradski, G., 2000. The OpenCV library. Dr. Dobb's Journal of Software Tools.

Chang, C.-C., Lin, C.-J., 2011. LIBSVM: A library for support vector machines. ACM Trans. Intell. Syst. Technol., 2(27), 1-27. https://doi.org/10.1145/1961189.1961199.


Cho, J., Choi, J., Qiao, M., Ji, C.-W., Kim, H. Y., Uhm, K. B., Chon, T.-S., 2007. Automatic identification of whiteflies, aphids and thrips in greenhouse based on image analysis. Int. J. Math. Comput. Simul., 1, 46-53.

Devi, M. S., Roy, K., 2017. Comparable study on different coloured sticky traps for catching of onion thrips, Thrips tabaci Lindeman. J. Entomol. Zool. Stud., 5(2), 669-671.

Ding, W., Taylor, G., 2016. Automatic moth detection from trap images for pest management. Comput. Electron. Agric., 123(Supplement C), 17-28. https://doi.org/10.1016/j.compag.2016.02.003.

Espinoza, K., Valera, D. L., Torres, J. A., López, A., Molina-Aiz, F. D., 2016. Combination of image processing and artificial neural networks as a novel approach for the identification of Bemisia tabaci and Frankliniella occidentalis on sticky traps in greenhouse agriculture. Comput. Electron. Agric., 127(Supplement C), 495-505. https://doi.org/10.1016/j.compag.2016.07.008.

Isaacs, R., Willis, M. A., Byrne, D. N., 1999. Modulation of whitefly take-off and flight orientation by wind speed and visual cues. Physiol. Entomol., 24(4), 311-318. https://doi.org/10.1046/j.1365-3032.1999.00144.x.


Jaworski, T., Hilszczański, J., 2013. The effect of temperature and humidity changes on insects development and their impact on forest ecosystems in the expected climate change. For. Res. Pap., 74(4), 345-355. https://doi.org/10.2478/frp-2013-0033.

Jha, V. K., Seal, D. R., Schuster, D. J., Kakkar, G., 2009. Diel flight pattern and periodicity of chilli thrips (Thysanoptera: Thripidae) on selected hosts in South Florida. In: Annual Meeting of the Florida State Horticultural Society, pp. 267-271.

Kim, I. K., Kim, J. H., Jung, K., 2002. Face recognition using support vector machines with local correlation kernels. Int. J. Pattern Recognit. Artif. Intell., 16, 97-111.

Liang, X.-H., Lei, Z. R., Wen, J.-Z., Zhu, M.-L., 2010. The diurnal flight activity and influential factors of Frankliniella occidentalis in the greenhouse. Insect Sci., 17(6), 535-541.

Liu, H., Lee, S.-H., Chahl, J. S., 2017. A review of recent sensing technologies to detect invertebrates on crops. Precis. Agric., 18(4), 635-666. https://doi.org/10.1007/s11119-016-9473-6.

López Granado, O., Martínez-Rach, M., Migallón, H., Malumbres, M., Bonastre Pina, A., Serrano Martín, J., 2012. Monitoring pest insect traps by means of low-power image sensor technologies. Sens. (Basel, Switzerland), 12, 15801-15819. https://doi.org/10.3390/s121115801.

Martin, V., Moisan, S., Paris, B., Nicolas, O., 2008. Towards a video camera network for early pest detection in greenhouses. In: ENDURE International Conference, La Grande-Motte, France.

Miranda, J. L., Gerardo, B. D., Tanguilig III, B. T., 2015. Pest identification using image processing techniques in detecting image pattern through neural network. Int. J. Adv. Image Process. Tech., 1(4), 4-9.

Moerkens, R., Brenard, N., Bosmans, L., Reybroeck, E., Janssen, D., Hemming, J., Sluydts, V., 2019. Protocol for semi-automatic identification of whiteflies Bemisia tabaci and Trialeurodes vaporariorum on yellow sticky traps. J. Appl. Entomol., 0(0), 1-7. https://doi.org/10.1111/jen.12630.

Qiao, M., Lim, J., Ji, C. W., Chung, B.-K., Kim, H.-Y., Uhm, K.-B., Myung, C. S., Cho, J., Chon, T.-S., 2008. Density estimation of Bemisia tabaci (Hemiptera: Aleyrodidae) in a greenhouse using sticky traps in conjunction with an image processing system. J. Asia-Pac. Entomol., 11(1), 25-29. https://doi.org/10.1016/j.aspen.2008.03.002.

R Core Team, 2018. R: A language and environment for statistical computing. Vienna, Austria: R Foundation for Statistical Computing.

Rustia, D. J. A., Lin, C. E., Chung, J.-Y., Lin, T.-T., 2017. An object classifier using support vector machines for real-time insect pest counting. In: 2017 Conference on Bio-Mechatronics and Agricultural Machinery Engineering, Taipei, Taiwan, pp. 275-278.

Rustia, D. J. A., Lin, T.-T., 2017. An IoT-based wireless imaging and sensor node system for remote greenhouse pest monitoring. Chem. Eng. Trans., 58, 601-606. https://doi.org/10.3303/CET1758101.

Sengonca, C., Bo, L., 1999. Laboratory studies on the effect of temperature and humidity on the life table of the whitefly, Aleurotuberculatus takahashi David & Subramaniam (Hom., Aleyrodidae) from southeastern China. J. Pest Sci., 72, 45-48. https://doi.org/10.1007/BF02771095.

Shimoda, M., Honda, K.-I., 2013. Insect reactions to light and its applications to pest management. Appl. Entomol. Zool., 48(4), 413-421. https://doi.org/10.1007/s13355-013-0219-x.

Solis-Sánchez, L. O., García-Escalante, J. J., Castañeda-Miranda, R., Torres-Pacheco, I., Guevara-González, R., 2009. Machine vision algorithm for whiteflies (Bemisia tabaci Genn.) scouting under greenhouse environment. J. Appl. Entomol., 133(7), 546-552. https://doi.org/10.1111/j.1439-0418.2009.01400.x.


Stukenberg, N., Gebauer, K., Poehling, H. M., 2015. Light emitting diode (LED)-based trapping of the greenhouse whitefly (Trialeurodes vaporariorum). J. Appl. Entomol., 139(4), 268-279. https://doi.org/10.1111/jen.12172.

Wang, K., Tsai, J. H., 1996. Temperature effect on development and reproduction of silverleaf whitefly (Homoptera: Aleyrodidae). Ann. Entomol. Soc. Am., 89(3), 375-384. https://doi.org/10.1093/aesa/89.3.375.

Weeks, P. J. D., O'Neill, M. A., Gaston, K. J., Gauld, I. D., 1999. Automating insect identification: exploring the limitations of a prototype system. J. Appl. Entomol., 123(1), 1-8. https://doi.org/10.1046/j.1439-0418.1999.00307.x.

Xia, C., Chon, T.-S., Ren, Z., Lee, J.-M., 2015. Automatic identification and counting of small size pests in greenhouse conditions with low computational cost. Ecol. Inf., 29(Part 2), 139-146. https://doi.org/10.1016/j.ecoinf.2014.09.006.

Yadav, R., Chang, N.-T., 2014. Effects of temperature on the development and population growth of the melon thrips, Thrips palmi, on eggplant, Solanum melongena. J. Insect Sci., 14, 78. https://doi.org/10.1673/031.014.78.

Zhong, Y., Gao, J., Lei, Q., Zhou, Y., 2018. A vision-based counting and recognition system for flying insects in intelligent agriculture. Sens. (Basel, Switzerland), 18(5), 1489. https://doi.org/10.3390/s18051489.


Table 1. Specifications of environmental sensors used in the sensing module

Sensor model name | Environmental data | Interface | Resolution | Accuracy | Range | Response time
AM2301 (Guangzhou Aosong Electronics Co., Guangzhou, China) | Relative humidity | One-wire | 0.1% RH | ±3% RH | 0-100% RH | >2 seconds
AM2301 (same module) | Ambient temperature | One-wire | 0.1°C | ±0.5°C | -40 to 80°C | >2 seconds
BH1750 (ROHM Semiconductor, Kyoto, Japan) | Ambient light | I2C | 1 lux | 1.2 (sensor out / actual lux) | 1-65535 lux | >2 seconds

List of Figures

Fig. 1. Integrated wireless imaging and environmental sensor module: (a) device hardware block diagram; (b) actual device set-up in a greenhouse with its corresponding components; (c) schematic drawing with dimensions; (d) full 3D model.

Fig. 2. Schematic diagram of the integrated wireless imaging and environmental sensor network.

Fig. 3. Greenhouse layout and the approximate installation locations of the wireless sensor nodes. The representations below show the actual appearance of each object shown in the layout.

Fig. 4. Image processing algorithm flow chart for insect pest counting.

Fig. 5. Sample image outputs of the insect pest counting algorithm: (a) original RGB image; (b) extracted V component after applying static thresholding; (c) shows (b) after undergoing the morphological closing operation; (d) shows (c) after applying median blur; (e) the original image with detected blobs marked using red circles; (f) final output image including objects selected using blob analysis and identified using SVM, with classified insects marked using red boxes and non-insects using blue boxes.

Fig. 6. Manual vs. automatic insect pest spatial counting comparison showing the final count values of both methods: (a), (b), and (c) show count comparisons for experiments 1, 2, and 3, respectively. The columns on the furthest right show the comparison of mean values for all nodes. Paired t-tests (p < 0.05) were performed, with the t-statistic values shown at the upper right of each plot.

Fig. 7. Temporal manual vs. automatic insect pest counting comparisons showing the final counts per day from manual and automatic counting: (a), (b), and (c) show count comparisons for experiments 1, 2, and 3, respectively. Only two sets of data are displayed, representing the maximum and minimum counts from 7 sensor nodes.

Fig. 8. Temporal insect pest count information from three 15-day experiments: (a) shows the connected real-time insect pest counts from the three 15-day experiments with their corresponding node numbers, with dashed lines used as dividers for each 15-day experiment; (b) shows the average temporal insect count of the three 15-day experiments.

Fig. 9. Normalized hourly frequency of changes in insect pest count and its association with the average environmental parameters of temperature, relative humidity, and light intensity: (a), (b), and (c) show plots for experiments 1, 2, and 3, respectively.

Fig. 10. Average insect pest counts and average environmental parameters measured in the three experiments: (a), (b), (c), and (d) show the mean bar charts for insect pest counts, temperature, relative humidity, and light intensity with the corresponding standard deviations for each node; (e), (f), (g), and (h) show spatial contour plots indicating the relative location of each node and their intensity levels.


Fig. 11. Percentages for different kinds of insects: whitefly, thrips, aphid, and other insects, captured on the sticky paper traps in the three experiments.

Highlights

- The system can monitor insect count and environmental parameters simultaneously.
- The average temporal accuracy of the insect pest counting algorithm is 93%.
- Spatial and temporal information of insect activity can be effectively obtained.
- Insect activity affected by environment can be investigated with the system.

Conflicts of interest

All authors declare that there is no conflict of interest.
