An Automatic Plant Growth Measurement System for Plant Factory

Wei-Tai Chen, Yu-Hui Flora Yeh, Ting-Yu Liu, Ta-Te Lin*

Department of Bio-Industrial Mechatronics Engineering, National Taiwan University, Taiwan, R.O.C. (Tel: 886-2-3366-5331; e-mail: [email protected]).

Abstract: Plant factories are closed cultivation spaces where environmental conditions are controllable. In order to find the most suitable environmental conditions for plant growth, plant features can provide a good indication. Traditional direct measurement methods are simple but destructive and laborious. In this research, an automatic plant growth measurement system composed of a stereo vision system and a weight measurement system was developed. The growth of Boston lettuces was monitored and measured with the automatic system. The camera, mounted on a sliding rail, was moved by a linear actuator to extend the field of view over the planting bed. Using image processing methods and stereo vision techniques, panoramic images of the planting bed were constructed and plant geometric features such as projected leaf area, maximum and minimum radii, and plant volume were calculated. At the same time, plant weights were continuously recorded by the weight measurement instrument. Both the vision-based system and the weight measurement system are non-destructive, so plant features can be monitored throughout the whole growth cycle without affecting plant growth. The plant growth measurement system developed in this research can obtain a large number of plant features easily compared to traditional destructive measurement methods. In addition, the relationship between the projected leaf area and the fresh weight of Boston lettuces is also discussed. The results have proved the feasibility and functionality of this integrated system, which would provide valuable information for plant factory management and agricultural research applications.

Keywords: Plant Factory, Plant Growth Measurement, Hydroponics, Image Processing, Load Cell

1. INTRODUCTION

A plant factory is a closed cultivation space in which environmental conditions such as lighting period, humidity, and temperature are controllable. For the purpose of optimizing plant production with limited resources, determining the most suitable growing environment for plants is one of the primary tasks in a plant factory. Examining the correlation between environmental conditions and plant production in a plant factory application, Sase et al. (1988) measured the weight and leaf area of lettuces grown under different light sources. The results showed no significant differences in fresh weight, dry weight, or plant area between lettuces grown under 400 W high-pressure sodium lamps and 400 W metal halide lamps. Their research therefore indicated that the spectral power distribution of radiation from lamps has no influence on fresh or dry matter production within a specific range of photosynthetic photon flux.

As plant feature measurements provide a good indication of growth, fast and diverse plant feature measurements are desirable. Traditional direct measurement methods are simple and reliable, but they are time consuming and laborious. In contrast, vision-based methods are non-destructive and efficient means of acquiring a large number of measurements. He et al. (2003) successfully constructed 3D models from 2D images and extracted plant growth information, including average height, projected leaf area, and volume, using a stereological algorithm.

This has made the stereo-vision-based technique a representative non-destructive plant feature measurement method. Chien and Lin (2005) reconstructed the three-dimensional structure of selected vegetable seedlings by incorporating two side-view images with a top-view image. Their research also applied orthogonal images to improve leaf number and leaf area estimations for vegetable seedlings. Later, Lin et al. (2012) incorporated image processing algorithms into an automatic measurement platform to monitor the growth of Boston lettuces. With a camera mounted on a sliding rail and based on stereo vision technology, features of Boston lettuces such as projected leaf area, plant height, plant volume, and equivalent diameter were extracted.

Plant weight is another important plant feature for growth measurement. Traditional weight measurements often use electronic balances to measure plant fresh weight, a process that is not only laborious but also destructive to the plants. In order to overcome this problem, Bengough and MacKenzie (1994) developed a non-destructive root weight measurement method to continuously measure the force exerted by the roots of seedling pea. The root force and elongation were measured by connecting a load cell to the root tips and attaching a linear variable differential transformer to the tube covering the pea. The results showed that the root growth rate responded within minutes to a change in force, followed by a period of growth rate adjustment lasting several hours.

Van Henten and Bontsema (1995) showed that there is a linear relationship between the leaf area and the dry weight of lettuces through image processing methods and destructive plant weight measurements. This indicated the possibility of a non-destructive plant growth measurement method that uses the plant leaf area to estimate the plant dry weight. Later, Takaichi et al. (1996) constructed a non-destructive plant weight measurement system to measure the transpiration and water uptake rate of tomato plants. Their system continuously measured the plant weight using two electronic balances and two communicating water pots.

In this research, an automatic plant growth measurement system was developed to monitor the growth status of Boston lettuces in a plant factory. Plant features such as projected leaf area and volume were calculated through image processing algorithms and stereo vision techniques. The weights of the plants were simultaneously obtained from load cells mounted on the weight measurement instruments. In addition, the plant features and fresh weights of the lettuces were analysed to demonstrate the functionality and efficiency of the system.

2. MATERIALS AND METHODS

2.1 System design and operations

The automatic plant growth measurement system was developed and put into practice in a closed-type plant factory. Boston lettuce was selected as the leafy plant to be monitored. The plant growth measurement system monitored 8 Boston lettuces simultaneously on a planting bed. The day and night temperatures were set to 23 °C and 19 °C respectively, with a 16-hour light period. In order to monitor plant growth continuously, images of the planting bed were acquired every 30 minutes, and plant weights were measured and recorded every minute.

Fig. 1. Data flow and control signal diagram.

The data flow and control signals of the system are illustrated in Fig. 1. The plant growth measurement system consists of two parts: a stereo vision system and a weight monitoring system. The camera of the stereo vision system, a Logitech® HD Pro Webcam C910, was driven by a motor to capture images above the planting bed in the plant factory. Plant features were calculated by applying image processing algorithms to the images acquired by the camera. In the weight monitoring system, plant weights were automatically and continuously measured by the weighing instruments utilizing LDB-2kg load cells.

The hardware design of the automatic plant growth measurement system is integrated with the vision-based measurement system developed in Lin et al. (2012), and its components are illustrated in Fig. 2. The stereo vision system was set up on a planting shelf in a plant factory. An upside-down F-shaped arm was fixed on the slider of the belt driven by a linear actuator, and the arm extended over the planting bed with a camera mounted on it. In order to capture images of the plants on the planting bed, the camera was moved to fixed locations using a belt driven by a brushless DC electric motor and a brushless motor servo driver. In each image acquisition and recording process, a series of repetitive movements, including camera movement, stopping at specified locations, and image acquisition, was carried out until the end of the planting bed was reached, before returning to the starting position. Panoramic images of the planting bed were constructed through image processing steps such as image stitching, contour extraction, and foreground segmentation. Advanced image processing algorithms such as Speeded-Up Robust Features (SURF) (Bay et al., 2008) and Random Sample Consensus (RANSAC) (Fischler and Bolles, 1981) were applied. The 2D and 3D features, such as projected leaf area, plant height, and plant volume, were calculated for later analyses.
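As an illustration of this acquisition cycle, the sketch below shows how the move-stop-capture sequence could be scripted in Python with OpenCV. The rail positions, the interval handling, and the move_camera_to() command are assumptions standing in for the servo-driver interface, which is not detailed here.

```python
import time
import cv2  # OpenCV, used here only for image capture

# Hypothetical rail positions (mm along the linear actuator), chosen so that
# adjacent images overlap enough for later stitching; values are illustrative.
CAPTURE_POSITIONS_MM = [0, 150, 300, 450, 600, 750]
CAPTURE_INTERVAL_S = 30 * 60  # images of the planting bed every 30 minutes

def move_camera_to(position_mm):
    # Placeholder for the servo-driver command that moves the slider along the
    # belt; the actual communication protocol is not described in the paper.
    pass

def capture_pass(camera):
    """One acquisition pass: move, stop, capture at each position, then return."""
    frames = []
    for pos in CAPTURE_POSITIONS_MM:
        move_camera_to(pos)
        time.sleep(1.0)            # let vibrations settle before imaging
        ok, frame = camera.read()  # grab a frame from the webcam
        if ok:
            frames.append((pos, frame))
    move_camera_to(CAPTURE_POSITIONS_MM[0])  # return to the starting position
    return frames

if __name__ == "__main__":
    cam = cv2.VideoCapture(0)  # the Logitech C910 exposed as a UVC webcam
    while True:
        images = capture_pass(cam)
        # ... hand the images to the stitching / feature-extraction pipeline ...
        time.sleep(CAPTURE_INTERVAL_S)
```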

Fig. 2. Hardware design of the automatic plant growth measurement system.

Besides image information, another plant feature that was monitored automatically and continuously as the plants grew is the plant weight. The plant weight measurement instrument was built around a load cell (model LDB-2kg), a transducer which converts a force (weight) into an analogue electrical signal. Fig. 3 illustrates the hardware design of our self-developed plant weight measurement instrument. The instrument was built from acrylic material and has three main components: the load cell, a top disk, and a bottom disk. The load cell is mounted on the top disk and connected to the bottom disk so as to measure the weight applied to the top disk. Each Boston lettuce was planted in a plant holder with its roots immersed in the nutrient solution of the hydroponics system, and the plant holder was attached to the top disk. Hence, the weight of the lettuce creates a downward force on the top disk. The loading limit of this weight measurement instrument is 2 kg, with a precision of 0.2 g.
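As a minimal sketch of how the load-cell signal could be turned into a logged weight, the function below applies a linear (two-point) calibration and quantises the result to the stated 0.2 g precision. The tare and slope constants, and the assumption that the raw signal arrives as a digitised count, are illustrative rather than the instrument's actual interface; only the 2 kg capacity and 0.2 g precision come from the specification above.

```python
# Conversion of raw load-cell readings to plant fresh weight (sketch).

TARE_COUNTS = 8_400_000      # raw reading with an empty plant holder (assumed)
COUNTS_PER_GRAM = 420.0      # slope from a two-point calibration (assumed)
CAPACITY_G = 2000.0          # 2 kg loading limit of the instrument
PRECISION_G = 0.2            # stated precision of the instrument

def counts_to_grams(raw_counts: int) -> float:
    """Convert a raw reading to grams, clamped to capacity and rounded to 0.2 g."""
    grams = (raw_counts - TARE_COUNTS) / COUNTS_PER_GRAM
    grams = max(0.0, min(grams, CAPACITY_G))         # keep within the 0-2 kg range
    return round(grams / PRECISION_G) * PRECISION_G  # quantise to 0.2 g steps
```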

Fig. 3. Weight measurement instrument design.

2.2 Image processing methodology

As this research is an extension of Lin et al. (2012), we incorporated its plant feature calculation methods into the automatic plant growth measurement system. Fig. 4 shows the image processing procedure used to extract plant features from the acquired images. In particular, some OpenCV library functions were utilized in the plant feature calculation steps.

Fig. 4. Flowchart of the plant feature extraction procedure.
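As a rough illustration of the pipeline in Fig. 4, the sketch below chains the main OpenCV calls described in the following paragraphs: SURF matching, RANSAC-filtered homography estimation, GrabCut background removal, and SGBM disparity computation. The parameter values are assumptions, and SURF requires the opencv-contrib build (cv2.xfeatures2d); this is not the authors' exact implementation.

```python
import cv2
import numpy as np

def estimate_shift(img_left, img_right):
    """Estimate the geometric relationship between adjacent images (SURF + RANSAC)."""
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)  # opencv-contrib build
    kp1, des1 = surf.detectAndCompute(img_left, None)
    kp2, des2 = surf.detectAndCompute(img_right, None)

    matcher = cv2.BFMatcher(cv2.NORM_L2)
    matches = matcher.knnMatch(des1, des2, k=2)
    good = [m for m, n in matches if m.distance < 0.7 * n.distance]  # ratio test

    src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    # RANSAC keeps only inliers that share a consistent relative relationship
    H, inlier_mask = cv2.findHomography(dst, src, cv2.RANSAC, 5.0)
    return H

def segment_plants(image, rect):
    """Remove the white background inside a bounding rectangle with GrabCut."""
    mask = np.zeros(image.shape[:2], np.uint8)
    bgd, fgd = np.zeros((1, 65), np.float64), np.zeros((1, 65), np.float64)
    cv2.grabCut(image, mask, rect, bgd, fgd, 5, cv2.GC_INIT_WITH_RECT)
    fg = (mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD)
    return fg.astype(np.uint8)  # binary plant mask

def disparity_map(img_left, img_right):
    """Semi-global block matching between two adjacent views of the same plant."""
    sgbm = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=9)
    disp = sgbm.compute(cv2.cvtColor(img_left, cv2.COLOR_BGR2GRAY),
                        cv2.cvtColor(img_right, cv2.COLOR_BGR2GRAY))
    return disp.astype(np.float32) / 16.0  # SGBM returns fixed-point disparities
```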

In order to apply stereo vision techniques to calculate plant features, the relative positions between adjacent images are required. Since the camera was mounted on an arm and moved horizontally by the motor, the positions at which adjacent images were taken could be controlled. The distance between adjacent images was set to be small, so that there was enough overlap between adjacent images to calculate the relative position between them. The first step was finding and matching corresponding points between adjacent images with the SURF algorithm. However, some points were falsely matched by the SURF function. In order to filter out these errors, another algorithm, RANSAC, was applied. The RANSAC function compares random pairs of corresponding points and finds the relative relationship between them. The falsely matched points were filtered out by keeping only the inliers, which are corresponding points sharing a similar relative relationship. In order to eliminate the errors caused by changes in the lighting angle, the minimum cut algorithm (Davis, 1998) was applied to find stitching borders with minimal error. These processes were repeated until all adjacent images were stitched into one elongated image, the panoramic image.

In order to calculate the plant features from the panoramic image, the plant images have to be segmented from the background. Since the plants were green and the background was white, the plant areas could easily be located within square regions using a color histogram threshold. The background was then removed from these regions using the GrabCut algorithm (Rother et al., 2004). The disparities of the plants were calculated with the semi-global block matching (SGBM) function from the OpenCV library, a modified version of the semi-global matching (SGM) method (Hirschmüller, 2005). The SGBM algorithm matches blocks of pixels between adjacent images and uses a sub-pixel metric to calculate disparity images. Finally, plant features such as projected leaf area and plant volume can be calculated from the disparity image. The projected leaf area was calculated from the contour area extracted by the GrabCut algorithm, while the plant volume was the sum of all heights within this area, calculated from the SGBM disparities.

3. RESULTS AND DISCUSSION

The automatic plant growth measurement system developed in this research non-destructively and continuously measured the features and weights of Boston lettuces planted in the plant factory. During the 14-day plant growth observation period, the weights of 8 Boston lettuces were acquired every minute and images of 4 Boston lettuces were captured every 30 minutes. Fig. 5a shows the panoramic image of four lettuces in the same row of the planting bed. Since the camera travelled across the whole planting bed on the linear actuator, the system was able to monitor and measure a number of plants effortlessly. Moreover, the background of the panoramic image was segmented from the plant areas by applying the color threshold and GrabCut algorithm, as illustrated in Fig. 5b. Then, 2D plant features such as projected leaf area, width, length, maximum radii, and minimum radii were calculated, as illustrated in Fig. 5c. These features are important indicators of plant growth. As a result, this system can be further applied to find the most suitable environmental conditions.

Applying the stereo vision algorithm to the panoramic image, the disparity map was constructed as shown in Fig. 5d. The disparity map represents the depth information: a darker pixel in the disparity map indicates that the corresponding position is closer to the camera, and vice versa. The plant volume was then calculated from the disparity map. It is believed that the disparity map can be used to rebuild a 3D virtual model of the plants in the future. As there were not enough features for the SURF algorithm to find when the leaf areas were small, a couple of green squares were placed between the lettuces to increase the number of corresponding points in the image stitching step.
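A minimal sketch of the volume estimate described above (summing per-pixel heights over the segmented plant area) is given below. Converting disparity to metric height needs the focal length, the camera travel between views (baseline), and the camera-to-bed distance from calibration, so those parameters are assumptions here.

```python
import numpy as np

def plant_volume(disparity, plant_mask, focal_px, baseline_mm, bed_depth_mm):
    """Approximate plant volume as the sum of pixel heights over the plant area.

    disparity    : disparity map (pixels) from the SGBM step
    plant_mask   : binary plant mask from the GrabCut step
    focal_px     : camera focal length in pixels (assumed known from calibration)
    baseline_mm  : distance the camera moved between the two views
    bed_depth_mm : camera-to-planting-bed distance for an empty bed
    """
    disp = np.where(plant_mask > 0, disparity.astype(float), np.nan)
    with np.errstate(divide="ignore", invalid="ignore"):
        depth_mm = focal_px * baseline_mm / disp            # standard stereo relation
    height_mm = np.clip(bed_depth_mm - depth_mm, 0, None)   # height above the bed
    # Summing per-pixel heights gives a volume proxy in mm * pixel^2 units;
    # multiplying by the ground-plane area of one pixel would yield mm^3.
    return np.nansum(height_mm)
```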

Fig. 5. (a) Panoramic image of the planting bed; (b) background segmentation; (c) illustration of plant width, plant length, maximum radii, and minimum radii; (d) disparity map of the panoramic image in grey scale.

Fig. 6a depicts the growth curves of the examined Boston lettuces over a growing period of 14 days. The results show that although the growth rates differed considerably between individuals, most of them shared a similar pattern of growth. The variability in this case, although not desirable in a plant factory, indicates variation in environmental conditions such as lighting during growth. In the plant factory, plants are cultivated in a hydroponic nutrient solution. As the buoyancy exerted by the nutrient solution on the roots could affect the weight, the weight measurements oscillated during the first few days. However, the effect of the buoyancy became negligible as the plants grew bigger. The projected leaf area and volume of the Boston lettuces are illustrated in Fig. 6b and 6c. Comparing these two features, it is apparent that the projected leaf area and the volume of these lettuces grew in a similar pattern. The results also showed, intuitively, that plants with a larger volume were heavier. However, as the lettuces grew taller than the effective region of the stereo vision system after 12 days of observation, only 12 days of plant volume results are presented in Fig. 6c.

Fig. 6. Plant growth curves of Boston lettuces: (a) weight, (b) projected leaf area, (c) plant volume.

As the plant features and weights of the Boston lettuces were measured simultaneously by our system, the relationship between the projected leaf area and the fresh weight can be discussed in greater detail. A logarithmic regression was applied to fit the relationship between the leaf area and the fresh weight for each Boston lettuce. The R-squared values of the logarithmic regressions for the 4 lettuces are all higher than 0.94, and the R-squared value is still higher than 0.86 when all 4 lettuces are considered together. The results show that there is a positive correlation between the leaf area and the weight of Boston lettuces, as expected. The logarithmic function appears to be appropriate for describing the relationship between the projected leaf area and the plant fresh weight. This result shows the possibility of using vision-based measurement methods to acquire plant features and then estimate the plant weight from those features. In other words, the plant weight can be estimated from images alone. Furthermore, with the automatic fresh weight measurement system, the calibration of the plant projected area against the fresh weight can be easily established, whereas this calibration procedure is usually destructive and laborious. The results show that the plant growth measurement system developed in this research can not only measure plant features and weight, but is also able to help the plant factory find suitable environmental conditions.
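A short sketch of such a logarithmic fit is shown below. Which variable is log-transformed, and the variable names, are assumptions, since the text only states that a logarithmic regression relates the projected leaf area and the fresh weight.

```python
import numpy as np

def fit_log_regression(leaf_area, fresh_weight):
    """Fit fresh_weight ≈ a * ln(leaf_area) + b and return (a, b, r_squared)."""
    x = np.log(np.asarray(leaf_area, dtype=float))
    y = np.asarray(fresh_weight, dtype=float)
    a, b = np.polyfit(x, y, 1)                  # least-squares line in log-x space
    residuals = y - (a * x + b)
    ss_res = np.sum(residuals ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    r_squared = 1.0 - ss_res / ss_tot           # coefficient of determination
    return a, b, r_squared

# Usage with the measured series from the system, e.g.:
# a, b, r2 = fit_log_regression(projected_leaf_area_cm2, fresh_weight_g)
```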

Fig. 7. Logarithmic regression of plant projected leaf area with respect to weight of Boston lettuces.

4. CONCLUSION

A vision-based measurement system was integrated with a weight measurement instrument to build an automatic plant growth measurement system. The successful results of plant growth monitoring and feature measurement have proved the feasibility and functionality of this system. Both plant features and weight can be obtained simultaneously and non-destructively throughout the whole growth cycle. Also, the weight and projected leaf area of Boston lettuces were found to have a positive correlation through the logarithmic regression analysis; the R-squared values of the logarithmic regressions for the 4 individual plants are all higher than 0.94. It is believed that this integrated system can provide useful information for plant factory management and various agricultural research areas.

REFERENCES

Bay, H., Ess, A., Tuytelaars, T., and Gool, L.V. (2008). SURF: Speeded-Up Robust Features. Computer Vision and Image Understanding, 110(3), 346-359.

Bengough, A.G., and MacKenzie, C.J. (1994). Simultaneous measurement of root force and elongation for seedling pea roots. Journal of Experimental Botany, 45(270), 95-102.

Chien, C.F., and Lin, T.T. (2005). Non-destructive growth measurement of selected vegetable seedlings using orthogonal images. Transactions of the ASAE, 48(5), 1953-1961.

Davis, J. (1998). Mosaics of scenes with moving objects. IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 354-360.

Fischler, M.A., and Bolles, R.C. (1981). Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography. Communications of the ACM, 24(6), 381-395.

He, D., Matsuura, Y., Kozai, T., and Ting, K. (2003). A binocular stereovision system for transplant growth variables analysis. Applied Engineering in Agriculture, 19(5), 611-617.

Hirschmüller, H. (2005). Accurate and efficient stereo processing by semi-global matching and mutual information. IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2, 807-814.

Lin, T.T., Chien, C.F., Liao, W.C., Chung, K.C., and Chang, J.M. (2006). Machine vision systems for plant growth measurement and modelling. Environment Control in Biology, 44(3), 181-187.

Lin, T.T., Lai, T.C., Liu, T.Y., Yeh, Y.H., Liu, C.C., and Chung, W.C. (2012). An automatic vision-based plant growth measurement system for leafy vegetables. International Workshop on Computer Image Analysis in Agriculture.

Rother, C., Kolmogorov, V., and Blake, A. (2004). "GrabCut" - Interactive foreground extraction using iterated graph cuts. ACM Transactions on Graphics, 23(3), 309-314.

Sase, S., Ikeda, H., and Takezono, T. (1988). Plant production in the artificial environment. Symposium on High Technology in Protected Cultivation, 230, 323-328.

Takaichi, M., Shimaji, H., and Higashide, T. (1996). Monitoring of the change in fresh weight of plants grown in water culture. International Symposium on Plant Production in Closed Ecosystems, 413-418.

Van Henten, E.J., and Bontsema, J. (1995). Non-destructive crop measurements by image processing for crop growth control. Journal of Agricultural Engineering Research, 61(2), 97-105.