Length Measurement of Potato Leaf using Depth Camera

Proceedings of the 6th IFAC Conference on Bio-Robotics
Beijing, China, July 13-15, 2018

Available online at www.sciencedirect.com (ScienceDirect)

IFAC PapersOnLine 51-17 (2018) 314–320

Li Wu*, Yaowei Long**, Hong Sun**, Ning Liu**, Wenyun Wang**, Qiaoxue Dong*

*China Agriculture University, Beijing 100083, China (e-mail: [email protected])
**Key Laboratory of Modern Precision Agriculture System Integration Research, Ministry of Education, China Agriculture University, Beijing 100083, China (Tel: 86-010-62737838)

Abstract: Leaf length is one of the important parameters for crop growth estimation. In order to obtain plant leaf length accurately, this paper proposes a slope linear hypothesis in which the leaf length is measured in 3-D space with a right-angle model. A binocular stereo vision system with RGB and depth image output was applied, so that the projection of a leaf could be obtained in both the RGB and the depth image. The depth camera is used to obtain one right-angle side, which corrects the measurement result from the RGB image. Four methods were compared for extraction accuracy. First, RGB images without leaf segmentation were used to extract the leaf length ($\vec{L}_1$) on the horizontal projection plane. Second, leaf lengths ($\vec{L}_2$) were calculated with the right-angle model correction, applying the depth image to $\vec{L}_1$. Third, the RGB images were preprocessed with color image segmentation and a morphological operation, and the leaf length ($\vec{L}_3$) was extracted. Fourth, the corrected leaf lengths ($\vec{L}_4$) were obtained by applying the depth image correction to $\vec{L}_3$. Four prediction models were established to analyze the $\vec{L}_1$, $\vec{L}_2$, $\vec{L}_3$, and $\vec{L}_4$ results. The prediction model based on the $\vec{L}_2$ measurement has the best performance, with $R_c^2$ = 0.7178 and $R_v^2$ = 0.833.

2405-8963 © 2018, IFAC (International Federation of Automatic Control) Hosting by Elsevier Ltd. All rights reserved. Peer review under responsibility of International Federation of Automatic Control. 10.1016/j.ifacol.2018.08.197

Keywords: depth image, color-segmentation, morphological operation, image processing technology, leaf length

1. INTRODUCTION

Potato is an important food crop, and detecting its appearance parameters is significant for estimating its growing condition. Leaves, as the main photosynthetic organs of the plant, are important for studying the plant phenotype, and leaf length is one of the characteristics that indicate leaf area and biomass; it therefore needs to be measured for crop growth estimation. The traditional manual method of leaf length measurement is destructive, cumbersome, and labour-consuming, and is not beneficial to the growth of the plant.

In recent years, many studies have used image processing techniques to extract leaf area (J. M. Chen et al. 1992; Yongjiao Wang et al. 2006; Zhongzhi Wang et al. 2010; Rouphael et al. 2010). Chaohui Lü et al. (2010) measured leaf area by pixel-number statistics. DH Jung et al. (2015) integrated morphological and pixel-value analysis to measure the fresh weight of lettuce grown in a closed hydroponic system, with coefficients of determination above 0.93 and standard error of prediction (SEP) values below 5 g. C Liu et al. (2015) focused on a measuring method based on image processing techniques to detect rice height and projected area from crop images.

In order to improve measurement accuracy, M Mora et al. (2016) proposed a modified cover photography method based on specific image segmentation algorithms that exclude contributions from non-leaf materials; the results showed a significant improvement in the estimation of LAI compared to ground truth data. Hong Sun et al. (2016) proposed a method for the chlorophyll index of corn canopy leaves in the field: an LS-SVM prediction model of a field corn chlorophyll content index was established after image segmentation and grey value parameter calculation, with $R_c^2$ = 0.56 and $R_v^2$ = 0.41. B Chen et al. (2017) presented a method for single leaf area measurement based on counting the leaf pixels in a digital leaf image. IS Nasution (2017) combined the L*a*b* color space, Otsu's thresholding, morphological operations, and connected component analysis to estimate the leaf area of Bellis perennis.

Although the above studies removed the tedious and destructive shortcomings of traditional leaf measurement, none of them took into account the uneven slope that a leaf adopts under the influence of gravity. Some information is lost in the vertical projection onto the image plane, which reduces measurement accuracy. A depth camera can compensate for the information lost in the RGB image to a certain extent. In this paper, a depth camera was applied to image the leaves of a potato branch and obtain the RGB image and the depth image at the same time. The images were processed and the length values were extracted from the RGB image and from the RGB-depth combination respectively, and the proposed correction improved the coefficient of determination of the length measurement.


2. MATERIALS AND METHODS


The leaf length was measured by image processing, and four methods were compared in this research. The first was length extraction from the RGB image without preprocessing, marked as $\vec{L}_1$ in the data analysis. The second was length extraction from the combination of the depth image and the RGB image, again without preprocessing, marked as $\vec{L}_2$. The third was length extraction from the RGB image with preprocessing, marked as $\vec{L}_3$. The fourth was length extraction from the combination of the depth image and the RGB image, in which the length was measured after morphological preprocessing, marked as $\vec{L}_4$.
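As a compact reference, the relation between the four length estimates under the right-angle correction introduced later (formula (3)) can be sketched as follows; the numeric values are purely illustrative and not taken from the paper.

```python
import math

def corrected_length(projected_mm: float, height_diff_mm: float) -> float:
    """Right-angle correction used later in formula (3): sqrt(projection^2 + height^2)."""
    return math.sqrt(projected_mm ** 2 + height_diff_mm ** 2)

# Illustrative values only (not from the paper): projected lengths measured on
# the RGB image with and without preprocessing, and the leaf height difference.
bc_raw = 28.4     # projection from the unprocessed RGB image (mm)
bc_seg = 27.3     # projection after colour-segmentation + morphology (mm)
ac     = 10.0     # height difference of the leaf from the depth image (mm)

L1 = bc_raw                           # method 1: RGB only, no preprocessing
L2 = corrected_length(bc_raw, ac)     # method 2: depth-corrected L1
L3 = bc_seg                           # method 3: RGB with preprocessing
L4 = corrected_length(bc_seg, ac)     # method 4: depth-corrected L3
print(round(L1, 1), round(L2, 1), round(L3, 1), round(L4, 1))
```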

2.1 Experimental Materials and Method

The imaging system used in this experiment is a depth camera, model FM810-HD from Tuyang Technology. It is an integrated binocular stereo vision system with two synchronized cameras. The outputs are an RGB image and a depth image in BMP format, where the depth image is calculated from the binocular disparity between the two cameras. The depth output is calibrated and tested before commercial sale, so the depth image can be used directly. According to the system application manual, the working distance is 600 mm to 800 mm. The horizontal/vertical angle of view (H/V) is 60°/46° for the RGB image and 56°/46° for the depth image. The resolution of the RGB image is 1280 × 960 and that of the depth image is 237 × 235.

In the experiment, circular flowerpots with a standard soil height were used to remove the effect of differing soil heights. Firstly, 30 potato branches were picked from 12 potato plants. Secondly, the sampled branches were inserted into the standard flowerpots to simulate their natural growth on the plants. The leaves on each branch were measured manually one by one to obtain the true length values. After numbering the leaves, the manual leaf length measurement was conducted as follows: (1) each potato branch was divided into a left and a right side, and the leaves on each side were numbered in the order from the leaf tip to the petiole, recorded as Left (1, 2, ..., n) or Right (1, 2, ..., n); (2) the true length of each leaf was measured and marked as $\vec{L}$ in the data analysis.

Then the images were collected. The imaging system was used to collect RGB images and depth images of the 30 potato branches: the standard flowerpots holding the branches were placed in turn vertically below the depth camera, as shown in Fig. 1. 30 sets of images were obtained, each consisting of one RGB image and one depth image. 155 leaves from the 30 potato branches were finally selected and analysed.
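The working distance and angles of view quoted above imply the area covered by each image; the quick check below uses only the stated specifications (the coverage calculation itself is not given in the paper).

```python
import math

# Angles of view reported for the system: 60°/46° (H/V) for the RGB camera,
# 56°/46° for the depth camera; working distance 600-800 mm.
def coverage_mm(distance_mm: float, fov_deg: float) -> float:
    """Width of the imaged area at a given distance for a given angle of view."""
    return 2 * distance_mm * math.tan(math.radians(fov_deg) / 2)

for d in (600, 700, 800):
    print(d, round(coverage_mm(d, 60), 1), round(coverage_mm(d, 46), 1))
# e.g. at 700 mm the RGB view covers roughly 808 mm x 594 mm
```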

In order to assess the accuracy and applicability of the measurement results, linear regression analysis was conducted between the lengths extracted by the image-based methods and the true leaf lengths. 117 samples were used as the modeling (calibration) set and 38 samples as the prediction (validation) set.
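The prediction models reported in Section 3 are linear regressions of the image-extracted lengths against the manual lengths, evaluated on the modeling and validation sets separately. A minimal sketch with synthetic stand-in data is given below; scikit-learn is an assumption, since the paper does not state which statistics tool was used.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

# Synthetic stand-in data: image-based lengths (e.g. L2) and manual lengths (mm).
rng = np.random.default_rng(0)
L_image = rng.uniform(10, 80, size=155)
L_manual = 1.05 * L_image + rng.normal(0, 4, size=155)

# 117 samples for the modeling set, 38 for the prediction (validation) set.
idx = rng.permutation(155)
cal, val = idx[:117], idx[117:]

model = LinearRegression().fit(L_image[cal].reshape(-1, 1), L_manual[cal])
Rc2 = r2_score(L_manual[cal], model.predict(L_image[cal].reshape(-1, 1)))
Rv2 = r2_score(L_manual[val], model.predict(L_image[val].reshape(-1, 1)))
print(f"R_c^2 = {Rc2:.4f}, R_v^2 = {Rv2:.4f}")
```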

2.2 Image Processing Flow

Regarding the four length measurement methods mentioned above, the image processing steps follow the flow chart shown in Fig. 2. The measurement considers the horizontal and the vertical direction, which are obtained from the RGB image and the depth image respectively.

Fig. 2 Flow chart of image processing: collection of RGB and depth images; rectification of the RGB or depth image; for the RGB image, optional colour-segmentation and morphological operation, then leaf length extraction with or without preprocessing; for the depth image, height difference extraction; finally, leaf length correction

On the horizontal projection plane, the leaf length was directly measured in the RGB image in the first method. In the vertical direction, the height difference, which represents the leaf inclination, was obtained from the depth image. In the second method, this height value was used to correct the leaf length measured in the RGB image; this step is referred to as the depth correction.

Fig. 1 Imaging illustration of a branch of a potato plant


In order to examine the influence of preprocessing on the image measurement, the leaf length was also extracted from preprocessed images. In the third method, the leaves were segmented based on the colour features of the RGB image and a morphological operation was performed, so that the background and the leaf stems were removed before the length was measured. In the fourth method, after the third method was completed, the height difference from the depth image was used to correct the leaf length.

The third method therefore works on the preprocessed image. The preprocessing procedure includes colour-segmentation and a morphological operation; through these steps, the interfering stems were removed from the branches in the image. Usually, the background of a plant image is soil, which has larger red (r) and blue (b) values, while its green (g) value is always smaller than the g value of the plant itself, where r, g, and b are the normalized colour components (r = R/(R + G + B), g = G/(R + G + B), b = B/(R + G + B)). Indicators such as r − b, g − b, (g − b)/(r − g), and (2g − r − b) are effective for distinguishing plants from non-plant backgrounds, and (2g − r − b) is the most effective (Jianqin Zhang et al. 2004). This paper therefore used the (2g − r − b) component to segment the potato branches from the background.
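A minimal sketch of this (2g − r − b) segmentation in Python/NumPy is given below; the threshold value and the synthetic colours are assumptions, not values from the paper.

```python
import numpy as np

def plant_mask_2g_r_b(bgr: np.ndarray, thresh: float = 0.05) -> np.ndarray:
    """Segment plant pixels with the normalized (2g - r - b) index."""
    img = bgr.astype(np.float64)
    s = img.sum(axis=2) + 1e-6                               # R + G + B, avoiding division by zero
    b, g, r = img[..., 0] / s, img[..., 1] / s, img[..., 2] / s
    index = 2 * g - r - b                                    # normalized excess-green style index
    return (index > thresh).astype(np.uint8) * 255

# Synthetic example: brownish "soil" background with a green "leaf" patch (BGR order).
img = np.full((120, 120, 3), (40, 70, 110), np.uint8)        # soil colour
img[30:90, 30:90] = (40, 160, 60)                            # leaf colour
mask = plant_mask_2g_r_b(img)
print("leaf pixels found:", int(np.count_nonzero(mask)))     # 3600 for the 60x60 patch
```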

2.3 Image Calibration

Calibration was performed for both the RGB and the depth image measurements. Considering the horizontal and vertical directions, the RGB image was calibrated through its pixel count and the depth image through its grayscale. An image can be defined as a two-dimensional function f(x, y), where x and y are spatial coordinates and f is the brightness at coordinate (x, y); when x, y, and f are finite discrete values, it is a digital image. Pixels are the smallest units of a digital image, each having coordinates and a brightness value. The brightness of a grayscale image is referred to as its grayscale level, and a grayscale image generally has 0 to 255 levels (Rafael C. Gonzalez et al. 2014). Image calibration therefore covers both spatial coordinates and greyscales.

1. Calibration of the RGB image: This paper used the diameter of a standard circular flowerpot for calibration. The diameter d (mm) of the flowerpot, measured by hand, normalizes the corresponding number of pixels p (pixels). The normalized unit d/p (mm/pixel) converts the pixel distance between any two points in the RGB image, or in the grayscale image processed by the morphological operation, into a distance l (mm). Given the number of pixels s between the two points, the conversion is shown in formula (1):

$l = s \times d / p$    (1)

2. Calibration of the depth image: The depth image was obtained with the working distance of the depth camera system set between a minimum of 600 mm and a maximum of 800 mm. Since the depth image is essentially a greyscale image, its grayscale has 256 levels ranging from 0 to 255, and a specific greyscale level represents a specific height. Spreading the 200 mm working range over the 256 levels gives the calibration unit of the depth image. For any pixel of the depth image with known greyscale k, the height value h (mm), indicating the distance from the camera, is obtained from formula (2):

$h = 0.78125 \times k$    (2)
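Both calibrations reduce to a single scale factor. The sketch below mirrors formulas (1) and (2); the 235 mm diameter and 236.111-pixel span are the values reported later in Section 3.1, while the 37-pixel leaf span is illustrative.

```python
def pixels_to_mm(s_pixels: float, d_mm: float, p_pixels: float) -> float:
    """Formula (1): l = s * d / p, using the flowerpot diameter as the scale reference."""
    return s_pixels * d_mm / p_pixels

def grayscale_to_height_mm(k: int) -> float:
    """Formula (2): h = 0.78125 * k, i.e. the 200 mm working range spread over 256 levels."""
    return 0.78125 * k

# The 235 mm pot diameter and its 236.111-pixel span are reported in Section 3.1;
# the 37-pixel leaf span is an illustrative value.
print(pixels_to_mm(37, 235, 236.111))   # projected leaf length in mm (about 36.8)
print(grayscale_to_height_mm(128))      # height for grayscale level 128 (100.0 mm)
```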

2.4 Morphological Operation

After the image segmentation, mathematical morphology was used to manipulate the shape and structure of the plant regions. The descriptive language of mathematical morphology is set theory, which provides a unified and powerful tool for the problems encountered in image processing. Erosion (corrosion), dilation, and the opening operation are common morphological operations (Qiuqi Ruan et al. 2007).

Erosion: Given sets A and B in $Z^2$, the erosion of A by B, denoted $A \ominus B$, is defined as $A \ominus B = \{ x \mid (B)_x \subseteq A \}$. That is, the erosion of A by B is the set of all points x such that B, translated by x, is contained in A. With a structuring element (SE), erosion shrinks the image regions.
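A sketch of the opening operation on a binary plant mask with OpenCV is shown below; the synthetic mask and the specific structuring element size are illustrative (the paper selects square SEs of 5 to 15 pixels per image).

```python
import cv2
import numpy as np

# Synthetic binary mask standing in for the (2g - r - b) segmentation result:
# a large blob (leaf) connected to a thin line (stem).
mask = np.zeros((200, 200), np.uint8)
cv2.circle(mask, (70, 70), 40, 255, -1)          # leaf blob
cv2.line(mask, (70, 110), (70, 190), 255, 3)     # thin stem

# Opening = erosion followed by dilation; it removes thin structures such as
# stems while preserving the larger leaf regions.
se_size = 9                                       # square SE (illustrative size)
se = cv2.getStructuringElement(cv2.MORPH_RECT, (se_size, se_size))

eroded  = cv2.erode(mask, se)                     # erosion: shrinks regions
dilated = cv2.dilate(mask, se)                    # dilation: expands regions
opened  = cv2.morphologyEx(mask, cv2.MORPH_OPEN, se)  # opening: erosion then dilation

print("foreground before:", int(np.count_nonzero(mask)))
print("foreground after: ", int(np.count_nonzero(opened)))  # stem removed, leaf kept
```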

Dilation: Given sets A and B in $Z^2$, the dilation of A by B, denoted $A \oplus B$, is defined as $A \oplus B = \{ x \mid (\hat{B})_x \cap A \neq \varnothing \}$. The dilation is obtained by first reflecting B about its origin and then translating it by x; $A \oplus B$ is the set of all translations x for which the reflected B and A share at least one element. With a structuring element (SE), dilation expands the image regions.

Opening: Let A be the original image and B the structuring element. The opening of A by B, denoted $A \circ B$, is defined as $A \circ B = (A \ominus B) \oplus B$; that is, A is first eroded by B and the result is then dilated by B.

2.5 Leaf Length Measurement

The leaf length between the leaf tip and the leaf petiole, on both the unprocessed and the preprocessed RGB images, was measured with ImageJ. In addition, the maximum and minimum grayscale of each leaf in the depth image were obtained with ImageJ. ImageJ is Java-based public image processing software developed by the National Institutes of Health.


In order to correct the horizontal projection measurement from the RGB image, this paper proposes a slope linear hypothesis of the leaf posture with the help of the depth image. The hypothesis assumes that the leaves on the potato branches in the standard flowerpot take the posture shown in Fig. 3 and that the main vein of each leaf is perfectly straight. The real length of the leaf, denoted BA (from B to A), then corresponds to the hypotenuse of a right triangle whose first right-angled edge, BC, is the projection seen in the RGB image. According to this triangle model, the leaf length can be corrected with the other right-angled edge, AC, given by the height difference in the depth image.

Fig. 3 The slope linear hypothesis of a single leaf

Fig. 3 shows that the measurement in the RGB image is only BC, so information is lost. The depth image keeps the AC information and is captured simultaneously with the RGB image; it can therefore be used to compensate for the information lost in the RGB image. Formula (3) is then used to approximate the length of the leaf, so as to correct and improve the accuracy of the image-based measurement:

$BA = \sqrt{BC^{2} + AC^{2}}$    (3)

The measurement combines the RGB and depth images. Firstly, based on the slope linear hypothesis of the leaf posture, one right-angled edge of the slope linear model was extracted from the RGB images, with and without preprocessing:

1) Leaf length without preprocessing: No preprocessing was performed on the 155 leaves in the 30 RGB images. The calibrated RGB images were opened in ImageJ and the leaf length was measured with the two-point distance function. It was marked as $\vec{L}_1$.

2) Leaf length with preprocessing: The 30 images were first colour-segmented using their (2g − r − b) components and then processed with the morphological opening operation. Square and disk structuring elements (SE) of the same size were compared, and the square SE gave the better segmentation of the leaves. Because the stems differ in size between images, square SEs in the range of 5 to 15 pixels were chosen per image according to the stem size, with the sole purpose of obtaining the best segmentation in each image. The leaf length was then measured with ImageJ in the same way and marked as $\vec{L}_3$.

Secondly, the other right-angled edge (AC) of the slope linear model, which represents how much the leaf is inclined, was extracted from the depth image. The 30 calibrated depth images were opened in ImageJ and its regional greyscale statistics function was applied to each leaf: the area of each leaf was selected, in the order of the 155 consecutively numbered leaves, to extract its maximum and minimum greyscale. Their difference is denoted Δh. Using the calibration unit of the depth image, the height difference H is obtained from Δh by formula (4); this height difference, which represents the inclination of the leaf, is marked as $\vec{H}$.

$H = 0.78125 \times \Delta h$    (4)

Based on the slope linear hypothesis, the leaf length is corrected by formula (3). Because there are two ways of measuring the RGB image, BC takes two forms: B'C', the leaf as it appears in the unprocessed RGB image, and B''C'', the leaf after colour-segmentation and the morphological operation have changed its shape. The four methods obtain the leaf length from formula (3) with the corresponding value of BC:

1. Method 1: formula (3) reduces to B'A' = B'C'. B'A' ($\vec{L}_1$) is the length measured in the RGB image without morphological operation.

2. Method 2: formula (3) becomes $B''A'' = \sqrt{(B'C')^{2} + AC^{2}}$. B''A'' ($\vec{L}_2$) is calculated from B'C' and AC, where B'C' is the length measured in the RGB image without morphological operation and AC is the height difference extracted from the depth image.

3. Method 3: formula (3) reduces to B'''A''' = B''C''. B'''A''' ($\vec{L}_3$) is the length measured in the RGB image with morphological operation.

4. Method 4: formula (3) becomes $B''''A'''' = \sqrt{(B''C'')^{2} + AC^{2}}$. B''''A'''' ($\vec{L}_4$) is calculated from B''C'' and AC, where B''C'' is the length measured in the RGB image with morphological operation and AC is the height difference extracted from the depth image.
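A sketch of the depth correction for a single leaf, combining formulas (3) and (4), is given below. A rectangular NumPy region of interest stands in for ImageJ's per-leaf greyscale statistics; the ROI coordinates, the synthetic depth values, and the projected length are hypothetical.

```python
import numpy as np

# Synthetic calibrated depth image (grayscale 0-255, resolution 237 x 235 as in
# the paper); in the paper the depth image comes from the FM810-HD system.
depth = np.random.default_rng(1).integers(100, 120, size=(235, 237), dtype=np.uint8)

# Greyscale statistics inside one leaf's region of interest (a rectangular ROI
# stands in for ImageJ's per-leaf region selection; coordinates are hypothetical).
y0, y1, x0, x1 = 50, 90, 120, 160
leaf = depth[y0:y1, x0:x1]
delta_h = int(leaf.max()) - int(leaf.min())      # grey level difference of the leaf

H  = 0.78125 * delta_h                           # formula (4): grey levels -> mm
BC = 28.4                                        # projected length from the RGB image (mm), illustrative
BA = np.sqrt(BC ** 2 + H ** 2)                   # formula (3): right-angle correction
print(f"delta_h = {delta_h}, H = {H:.2f} mm, corrected length BA = {BA:.2f} mm")
```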



3. RESULTS AND ANALYSIS

3.1 Image Calibration Results

30 RGB images and 30 depth images were acquired with the depth camera system, as shown in Fig. 4. The calibrations of the RGB and depth images were performed, and ImageJ was used to measure the leaf lengths and calculate the height differences. According to the total pixel count (236.111 pixels) measured by ImageJ across the standard flowerpot's diameter (235 mm), the conversion unit between pixel and length in the RGB image is obtained as 1.3026 pixels/mm, as shown in Fig. 4(a). Similarly, from the maximum (800 mm) and minimum (600 mm) working distances of the depth camera and the maximum (255) and minimum (0) grayscale of the depth image, the conversion unit between grayscale and depth is obtained as 0.78125 mm/grayscale. As shown in Fig. 4(b), the grayscale difference between two pixels of a single leaf in the depth image represents the height difference between them. Using formula (4), the difference between the maximum and minimum grayscale of each leaf was converted into H; thus the height differences $\vec{H}$ of the leaves were obtained.

Fig. 4 Collection of (a) RGB image and (b) depth image

3.2 Length Measurement Results by RGB Image

The leaf lengths of the 155 leaves measured on the RGB images without preprocessing ($\vec{L}_1$) were obtained with ImageJ. The statistics are shown in Table 1: the maximum, minimum, and mean of $\vec{L}_1$ are 72.271 mm, 9.269 mm, and 28.437 mm respectively. To show the measurement accuracy and applicability, a prediction model was established for $\vec{L}_1$ and $\vec{L}$. The linear regression of $\vec{L}_1$ is shown in Fig. 5(a); its coefficient of determination is $R_{c1}^2$ = 0.6752 in the modeling set and $R_{v1}^2$ = 0.731 in the validation set.

3.3 Length Measurement Results by Depth Correction without Preprocessing

The leaf lengths of the 155 leaves on the RGB images without preprocessing were corrected by the depth image to give $\vec{L}_2$. The statistics are shown in Table 1: the maximum, minimum, and mean of $\vec{L}_2$ are 75.509 mm, 10.058 mm, and 30.222 mm respectively, all larger than the corresponding values of $\vec{L}_1$. A prediction model was also established for $\vec{L}_2$ and $\vec{L}$. The linear regression of $\vec{L}_2$ is shown in Fig. 5(b); $R_{c2}^2$ = 0.7178 and $R_{v2}^2$ = 0.833, both larger than $R_{c1}^2$ and $R_{v1}^2$ respectively.

Table 1 Statistics of $\vec{L}$, $\vec{L}_1$, $\vec{L}_2$

Leaf length    Max (mm)    Min (mm)    Mean (mm)    Standard deviation (mm)
L              86          14          33.270       12.596
L1             72.271      9.269       28.437       12.275
L2             75.509      10.058      30.222       12.828

Fig. 5 Linear regression analysis of (a) $\vec{L}_1$ and (b) $\vec{L}_2$

3.4 Length Measurement Results by RGB Image with Morphological Preprocessing

The morphological preprocessing removed the interference of the stems attached to the leaves when the leaf length was measured on the image, as shown in Fig. 6(a) and Fig. 6(b). Compared with the leaves in Fig. 6(b), the leaves in Fig. 6(a) have a clearer leaf tip and leaf petiole; different types of SE give different segmentation results. Because the black background is easily confused with the leaves, Fig. 6(c) and Fig. 6(d) show Fig. 6(a) and Fig. 6(b) after grayscale brightening and complementing, which is beneficial for measurement. After the morphological operation, the leaf lengths were measured on the RGB images. The statistics of $\vec{L}_3$ are shown in Table 2: the maximum, minimum, and mean of $\vec{L}_3$ are 74.437 mm, 7.381 mm, and 27.348 mm respectively. A prediction model was established for $\vec{L}_3$ and $\vec{L}$. The linear regression of $\vec{L}_3$ is shown in Fig. 7(a); its coefficient of determination is $R_{c3}^2$ = 0.6744 in the modeling set and $R_{v3}^2$ = 0.6863 in the validation set.

Fig. 6 Morphological operation with different SE: (a) 'square' SE; (b) 'disk' SE; (c) 'square' SE after brightness adjustment and complement; (d) 'disk' SE after brightness adjustment and complement

3.5 Length Measurement Results by Depth Correction with Morphological Preprocessing

After the morphological operation, the leaf length is further corrected by the depth image to give $\vec{L}_4$. The statistics are shown in Table 2: the maximum, minimum, and mean of $\vec{L}_4$ are 75.667 mm, 9.130 mm, and 29.179 mm respectively, all larger than the corresponding values of $\vec{L}_3$. A prediction model was also established for $\vec{L}_4$ and $\vec{L}$. The linear regression of $\vec{L}_4$ is shown in Fig. 7(b); $R_{c4}^2$ = 0.7097 and $R_{v4}^2$ = 0.8103, both larger than $R_{c3}^2$ and $R_{v3}^2$ respectively.

Table 2 Statistics of $\vec{L}$, $\vec{L}_3$, $\vec{L}_4$

Leaf length    Max (mm)    Min (mm)    Mean (mm)    Standard deviation (mm)
L              86          14          32.270       12.596
L3             72.437      7.381       27.348       12.281
L4             75.667      9.130       29.179       12.880

Fig. 7 Linear regression analysis of (a) $\vec{L}_3$ and (b) $\vec{L}_4$

4. CONCLUSIONS

An imaging system based on integrated binocular stereo vision was used to obtain depth images as well as RGB images. This paper proposes a slope linear hypothesis to improve the measurement accuracy through depth correction, and four methods were compared. First, RGB images without leaf segmentation were used to extract the leaf length ($\vec{L}_1$) on the horizontal projection plane. Second, leaf lengths ($\vec{L}_2$) were calculated with the right-angle model correction, applying the depth image to $\vec{L}_1$. Third, the RGB images were preprocessed with colour image segmentation and a morphological operation, and the leaf length ($\vec{L}_3$) was extracted. Fourth, the corrected leaf lengths ($\vec{L}_4$) were obtained by applying the depth image correction to $\vec{L}_3$.

The main results of this paper are as follows:

1. The RGB images were colour-segmented and processed with a morphological operation to remove the interference of the background and the leaf stems.

2. Comparing the maximum, minimum, and mean values of $\vec{L}$, $\vec{L}_1$, $\vec{L}_2$, $\vec{L}_3$, and $\vec{L}_4$, the statistics of $\vec{L}_2$ are clearly the closest to $\vec{L}$. The reason why the statistics of $\vec{L}_4$ are not as close as those of $\vec{L}_2$ is twofold: on the one hand, excessive light changes the apparent shape of the leaf when the colour-segmentation is performed; on the other hand, the end point at the leaf petiole is changed by the morphological opening operation. As a result, $\vec{L}_3$, measured on the preprocessed RGB images, is slightly smaller than $\vec{L}_1$, and even though $\vec{L}_4$ combines $\vec{H}$ with $\vec{L}_3$ in the same way that $\vec{L}_2$ combines $\vec{H}$ with $\vec{L}_1$, $\vec{L}_4$ is slightly smaller than $\vec{L}_2$.

3. Four prediction models were established for $\vec{L}_1$, $\vec{L}_2$, $\vec{L}_3$, and $\vec{L}_4$ against $\vec{L}$. For the $\vec{L}_2$ model, $R_{c2}^2$ = 0.7178 and $R_{v2}^2$ = 0.833; its $R_c^2$ and $R_v^2$ are both larger than those of the other models.

In this paper, the ideal condition of a single potato branch inserted in a flowerpot was assumed. In the field, however, branches are interlaced with each other, and further study is needed for this situation.

ACKNOWLEDGMENTS

This work was supported by the National Natural Science Foundation funded projects (31501219, 31701319), the Chinese High Technology Research and Development Research Fund (2016YFD0300600-2016YFD0300606, 2016YFD0300600-2016YFD0300610), and the Key Laboratory Project (BKBD2017KF03, KF2018W003, 2018TC020).

REFERENCES


B Chen, Z Fu, Y Pan, J Wang, Z Zeng (2017). Single leaf area measurement using digital camera image. Computer & Computing Technologies in Agriculture IV, 345, 525-530.
C Liu, L Chen, C Lv, W Ren, SA University (2015). The Plant Type Parameters Measurement Algorithm of Rice Based on Image Processing Technology. Journal of Agricultural Mechanization Research.
Chaohui Lü, Hui Ren, Yibin Zhang, Yinhua Shen (2010). Leaf Area Measurement Based on Image Processing. International Conference on Measuring Technology and Mechatronics Automation.
DH Jung, SH Park, XZ Han, HJ Kim (2015). Image processing methods for measurement of lettuce fresh weight, 40(1), 89-93.
Hong Sun, Xiang Chen, Meng Cheng, Minzan Li, Wei Yang, Qin Zhang (2016). Image Analysis of Field Mobile Acquisition and Diagnosis of Maize Chlorophyll Content. ICSaid.
IS Nasution (2017). Non-Destructive Measurement for Estimating Leaf Area of Bellis perennis.
J. M. Chen, T. A. Black (1992). Defining Leaf Area Index for Non-flat Leaves. Plant, Cell and Environment, 15, 421-429.
Jianqin Zhang, Xiu Wang, Jianhua Gong, Xuzhang Xue (2004). Realization of Leaf Area Measurement System Based on Machine Vision Technology. Progress in Natural Science, 14.
M Mora, F Avila, M Carrasco-Benavides, G Maldonado, S Fuentes (2016). Automated computation of leaf area index from fruit trees using improved image processing algorithms applied to canopy cover digital photographies. Computers & Electronics in Agriculture, 123(C), 195-202.
Qiuqi Ruan (2007). Digital Image Processing, Second Edition, chapter 9. Publishing House of Electronics Industry, Beijing.
Rafael C. Gonzalez, Richard E. Woods, Steven L. Eddins (2014). Digital Image Processing Using MATLAB, Second Edition, chapter 1. Publishing House of Electronics Industry, Beijing.
Yongjiao Wang, Yin Zhang, Sanyuan Zhang (2006). Approach to Measure Plant Leaf Area Based on Image Process. Computer Engineering, 32(8), 210-212.
Y. Rouphael, A.H. Mouneimne, A. Ismail, E. Mendoza-de, C.M. Rivera, G. Colla (2010). Modeling Individual Leaf Area of Rose (Rosa hybrid L.) Based on Leaf Length and Width Measurement. Photosynthetica, 48(1), 9-15.
Zhongzhi Wang, Jinrui Zhang (2010). A Measurement Approach of Leaf Area Based on Digital Image Processing. Microcomputer Applications.
