Proceedings of the 6th IFAC Conference on Bio-Robotics
Beijing, China, July 13-15, 2018
Available online at www.sciencedirect.com
ScienceDirect
IFAC PapersOnLine 51-17 (2018) 700–705
Chlorophyll Content Detection of Field Maize Using RGB-NIR Camera

Junyi Zhang*,**, Minzan Li*, Zichun Sun*, Haojie Liu*, Hong Sun*, Wei Yang*

* Key Laboratory of Modern Precision Agriculture System Integration Research, Ministry of Education, China Agricultural University, Beijing 100083, China (Tel: 86-010-62737838; e-mail: [email protected])
** Henan University of Animal Husbandry and Economy, Zhengzhou 450046, China

Abstract: In order to rapidly detect the chlorophyll content in the field, an RGB-NIR camera was used to collect multi-spectral images of the maize canopy. The SPAD value was measured at the same time to indicate the chlorophyll content. The multi-spectral images were then processed. Considering the features of the multi-spectral images, after median filtering and segmentation to remove the background effect of field soil, dry straw and weeds, 15 image parameters were extracted, including the average grey value of each channel (AR, AG, AB and ANIR), the vegetation indices (ANDVI, ANDGI, ARVI, ADVI), the vegetation coverage index (VCI), the hue average (AH) and the image texture parameters (energy (AASM), moment of inertia (ACON), correlation (ACOR), entropy (AEN) and inverse difference moment (AL)). The sensitive parameters were selected by correlation coefficient analysis and the RF method; there were 4 and 8 selected parameters, respectively. After modelling, the LS-SVM model built with the RF-selected parameters showed better detection accuracy, with Rc2 = 0.87 and Rv2 = 0.74. It could be used for rapid and non-destructive detection of the chlorophyll content of field maize.

© 2018, IFAC (International Federation of Automatic Control) Hosting by Elsevier Ltd. All rights reserved.

Keywords: Multi-spectral image, chlorophyll content, image parameter, vegetation index, RF method.

1. INTRODUCTION

The chlorophyll content and leaf area index of crops have a remarkable influence on light utilization, dry matter accumulation and harvest. The development and application of modern spectrum analysis technology and digital image processing technology provide a quick and non-destructive method for crop detection (Houborg et al. 2015; Liang et al. 2012; Li et al. 2015).

In crop monitoring, the spectral image helps to acquire the vegetation spectrum and appearance features of the crop leaves or canopy. It indicates the vitality of crop vegetation through spectral parameters (Kanke et al. 2016; Pan et al. 2013; Feng et al. 2010; Gong et al. 2014). The leaf area index (LAI) can also be measured to show the crop biomass, which closely relates to plant density, height, leaf area and so on (Lu et al. 2011; Meyer et al. 1987; Leblanc et al. 2005; Casa et al. 2005).

However, the crop canopy monitoring accuracy may be reduced by the influences of straw, weeds and soil background in the dynamic collection environment (Wu et al. 2015). In order to analyse the chlorophyll content, it is necessary to segment the crop canopy images, extract detection parameters and remove invalid data before the detection modelling.

In this study, a developed multi-spectral imaging system was used to collect maize canopy images and rapidly detect the chlorophyll content in the field. The images were processed following segmentation and parameter extraction. The sensitive image parameters were analysed and a model was built for chlorophyll detection of field maize.

2. MATERIALS AND METHODS

2.1 Field Image Acquisition

A 2-CCD multi-spectral imaging system was used to obtain the maize canopy images (Fig. 1). The system included a 2-CCD image sensor and a controller (Qian et al. 2014). The multi-spectral image sensor could synchronously obtain image signals covering visible light (Blue (B), Green (G), Red (R); 400~700 nm) and near infrared light (NIR, 760~1000 nm). The centre wavelengths were 470 nm, 550 nm, 620 nm and 800 nm in the B, G, R and NIR images, respectively. The system software was installed on the controller side, with collector connection, acquisition parameter setting, image display, storage and processing functions.

Fig.1. 2-CCD multispectral imaging system

2405-8963 © 2018, IFAC (International Federation of Automatic Control) Hosting by Elsevier Ltd. All rights reserved. Peer review under responsibility of International Federation of Automatic Control. 10.1016/j.ifacol.2018.08.114
The position monitoring system was developed to record the image location. When an image was captured, the GPS data were recorded synchronously.

2.2 Field experiment

The field experiment was conducted in Yangling, Shaanxi province on July 5-8, 2014. The maize variety was Zheng Dan 958 and the phenological period was the seedling stage. The sensor was set 50 cm above the maize canopy. The lens focal length was 6 mm, the viewing angle was about 50°, and the image size was 1024×768 pixels in TIFF format. 327 pairs of maize canopy multispectral images were obtained. The images were then checked and invalid samples were removed; 49 samples were invalid and 278 valid samples were left (Yao et al. 2015). Synchronously, the chlorophyll content index measured by a SPAD device and the GPS data were recorded.

2.3 Multi-spectral image processing

The multi-spectral images were processed following pre-processing, canopy image segmentation, detection parameter extraction and analysis, as shown in Fig. 2. The algorithm was implemented using the Image Processing Toolbox in MATLAB 2014a.

Fig.2. Multispectral image and data processing

(1) Image filtering and segmentation

Median filtering was selected to reduce the random noise in the images (Wu et al. 2015). According to the colour and shape characteristics of the different objects in the images, the segmentation of the maize canopy was conducted in two steps. Firstly, the HSI (Hue, Saturation, and Intensity) colour model was used to remove the soil background from the RGB image with the hue (H) threshold [π/4, 6π/5]. Secondly, the maize canopy was segmented by a variable threshold algorithm based on local statistics and a region labelling algorithm. The local threshold segmentation method used a [3, 3] pixel matrix as the search window. The local threshold was defined by the standard deviation and mean of the neighbourhood pixel matrix (Drever et al. 2006; Lima et al. 2016; Aja-Fernández et al. 2015), which can be expressed as equation (1):

g(i,j) = \begin{cases} 1, & f(i,j) > 30\,\mathrm{SIG} \ \text{and} \ f(i,j) > 1.5\,\mathrm{MEAN} \\ 0, & \text{otherwise} \end{cases}    (1)

In which, f(i,j) is the grey value of the input image, SIG is the pixel standard deviation inside the matrix window, MEAN is the overall average grey value of the image, a and b, respectively, use the empirical values of 30 and 1.5, and g(i,j) is the binary image after the segmentation.
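A minimal sketch of this two-step segmentation is given below, assuming the registered RGB and NIR channels are available as NumPy arrays. The paper implemented the pipeline in MATLAB, so the Python/NumPy version is only an illustrative stand-in; the choice of the green channel as the grey image f(i,j) and the handling of border pixels are assumptions, while the window size, the coefficients 30 and 1.5, and the hue range [π/4, 6π/5] follow the text.

```python
import numpy as np
from scipy.ndimage import median_filter, uniform_filter

def hsi_hue(rgb):
    """Hue of the HSI model (radians, 0..2*pi) from an RGB float image scaled to [0, 1]."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    num = 0.5 * ((r - g) + (r - b))
    den = np.sqrt((r - g) ** 2 + (r - b) * (g - b)) + 1e-12
    theta = np.arccos(np.clip(num / den, -1.0, 1.0))
    return np.where(b <= g, theta, 2.0 * np.pi - theta)

def segment_canopy(rgb, hue_lo=np.pi / 4, hue_hi=6 * np.pi / 5, a=30.0, b=1.5, win=3):
    """Two-step canopy segmentation: HSI hue threshold, then local variable threshold (eq. 1)."""
    # Pre-processing: median filtering to suppress random noise
    rgb = np.dstack([median_filter(rgb[..., k], size=3) for k in range(3)])

    # Step 1: remove soil background with the hue threshold [pi/4, 6*pi/5]
    hue = hsi_hue(rgb.astype(np.float64) / 255.0)
    green_mask = (hue >= hue_lo) & (hue <= hue_hi)

    # Step 2: local variable threshold (eq. 1) on a grey image f(i, j);
    # using the green channel here is an assumption, the paper only says "input image"
    gray = rgb[..., 1].astype(np.float64)
    local_mean = uniform_filter(gray, size=win)
    sig = np.sqrt(np.maximum(uniform_filter(gray ** 2, size=win) - local_mean ** 2, 0.0))
    mean_all = gray.mean()                       # MEAN: overall average grey value of the image
    g = (gray > a * sig) & (gray > b * mean_all) # eq. (1) with a = 30, b = 1.5
    return green_mask & g
```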
(2) Image parameter extraction

After the image segmentation, the detection parameters were extracted based on the colour and texture properties. The average grey values of the R, G, B and NIR channels (AR, AG, AB, ANIR) were measured and used to calculate the vegetation indices, including NDVI (Normalized Difference Vegetation Index, ANDVI), NDGI (Normalized Difference Greenness Index, ANDGI), RVI (Ratio Vegetation Index, ARVI), and DVI (Difference Vegetation Index, ADVI). These parameters are listed in Table 1.

Table 1. Image detecting parameters
Name | Abbreviation | Formula
Average grey value of M band | AM | Σ grey value of M band / number of pixels
Normalized difference vegetation index | ANDVI | (ANIR − AR) / (ANIR + AR)
Normalized difference greenness index | ANDGI | (AG − AR) / (AG + AR)
Ratio vegetation index | ARVI | ANIR / AR
Difference vegetation index | ADVI | ANIR − AR
Hue mean | AH | Σ hue value / number of image pixels
Vegetation coverage index | VCI | Σ crop pixels / total number of image pixels
Note: M means NIR, R, G and B, respectively, for the near infrared, red, green and blue band average grey value.
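As an illustration of how the colour parameters of Table 1 can be derived from a segmented image, the sketch below computes the per-channel average grey values over the canopy mask, the four vegetation indices, the hue mean and VCI. Variable and function names are hypothetical and the Python form is a stand-in for the MATLAB implementation; averaging over canopy pixels (rather than the whole image) is an assumption.

```python
import numpy as np

def table1_parameters(r, g, b, nir, hue, canopy_mask):
    """Colour parameters of Table 1 from single-channel arrays and a boolean canopy mask."""
    m = canopy_mask
    A_R, A_G, A_B, A_NIR = (float(ch[m].mean()) for ch in (r, g, b, nir))
    return {
        "A_R": A_R, "A_G": A_G, "A_B": A_B, "A_NIR": A_NIR,
        "A_NDVI": (A_NIR - A_R) / (A_NIR + A_R),  # normalized difference vegetation index
        "A_NDGI": (A_G - A_R) / (A_G + A_R),      # normalized difference greenness index
        "A_RVI":  A_NIR / A_R,                    # ratio vegetation index
        "A_DVI":  A_NIR - A_R,                    # difference vegetation index
        "A_H":    float(hue[m].mean()),           # hue mean over canopy pixels (assumption)
        "VCI":    float(m.sum()) / m.size,        # crop pixels / total image pixels
    }
```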
In addition, considering the adjacent grey spatial distribution features, the image texture parameters were calculated based on the grey level co-occurrence matrix. The image texture parameters were calculated by equations (2)~(6), including energy (ASM), moment of inertia (CON), correlation (COR), entropy (EN) and inverse difference moment (L). They were used for the maize chlorophyll index analysis and are presented as energy (AASM), moment of inertia (ACON), correlation (ACOR), entropy (AEN) and inverse difference moment (AL), respectively.
ASM = \sum_{i=0}^{L-1} \sum_{j=0}^{L-1} P^{2}(i,j)    (2)

CON = \sum_{i=0}^{L-1} \sum_{j=0}^{L-1} (i-j)^{2} P(i,j)    (3)

COR = \dfrac{\sum_{i=0}^{L-1} \sum_{j=0}^{L-1} i j P(i,j) - \mu_x \mu_y}{\sigma_x^{2} \sigma_y^{2}}    (4)

EN = -\sum_{i=0}^{L-1} \sum_{j=0}^{L-1} P(i,j) \log P(i,j)    (5)

L = \sum_{i=0}^{L-1} \sum_{j=0}^{L-1} \dfrac{P(i,j)}{1+(i-j)^{2}}    (6)
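Equations (2)~(6) are standard grey level co-occurrence matrix (GLCM) statistics, where P(i,j) is the normalized co-occurrence matrix and L the number of grey levels. The sketch below computes them with scikit-image (version 0.19 or later), an assumed stand-in for the MATLAB toolbox used in the paper; the pixel distance 1, the 0° direction and the 64-level quantization are assumptions, since the paper does not state them.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_texture(gray_u8, levels=64):
    """A_ASM, A_CON, A_COR, A_EN and A_L (eq. 2-6) from one co-occurrence matrix."""
    # Quantize the 8-bit image to fewer grey levels so the matrix stays well populated.
    img = (gray_u8.astype(np.float64) / 256.0 * levels).astype(np.uint8)
    glcm = graycomatrix(img, distances=[1], angles=[0], levels=levels,
                        symmetric=True, normed=True)
    P = glcm[:, :, 0, 0]                                         # normalized GLCM, P(i, j)
    return {
        "A_ASM": float((P ** 2).sum()),                          # energy, eq. (2)
        "A_CON": float(graycoprops(glcm, "contrast")[0, 0]),     # moment of inertia, eq. (3)
        "A_COR": float(graycoprops(glcm, "correlation")[0, 0]),  # correlation; scikit-image
                                                                 # normalizes by sigma_x*sigma_y
        "A_EN":  float(-(P * np.log(P + 1e-12)).sum()),          # entropy, eq. (5)
        "A_L":   float(graycoprops(glcm, "homogeneity")[0, 0]),  # inverse difference moment, eq. (6)
    }
```

In practice the co-occurrence matrix would be built from the segmented canopy region rather than the full frame; that masking step is omitted here for brevity.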
2.4 Data processing

(1) Parameter selection for chlorophyll content detection

In order to determine which properties were more sensitive to the chlorophyll content, the correlations between the extracted parameters and the chlorophyll content were analysed by the correlation coefficient method and the Random Frog (RF) algorithm. The RF algorithm is a variable selection method originally proposed for the analysis of gene expression data (Li et al. 2012). The main workflow is as follows (a sketch is given after this list):
(1) Initialize the tuneable parameters and randomly generate an initial variable subset V0 containing Q variables;
(2) Generate a random number according to the normal distribution N(Q, Q). The integer closest to this random number is defined as Q*, the number of variables contained in the candidate variable subset V*;
(3) Establish a candidate variable subset V* containing Q* variables using the relation between Q* and Q;
(4) Use the probability calculated from RMSECV to decide whether to accept V*; if accepted, set V1 = V*; iterate for N steps;
(5) Analyse the candidate variable subsets generated at each step and calculate the selection frequency of each variable.
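The following compact sketch follows the workflow above under simplifying assumptions: a plain least squares regressor replaces the sub-model usually used to compute RMSECV, the second argument of N(Q, Q) is taken as a standard deviation, and the acceptance rule (accept a worse candidate with probability proportional to the RMSECV ratio, factor eta = 0.1) as well as the iteration count are illustrative, not values reported in the paper.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

def rmsecv(X, y, subset, cv=5):
    """Cross-validated RMSE of a simple linear model on the selected variables."""
    scores = cross_val_score(LinearRegression(), X[:, subset], y,
                             cv=cv, scoring="neg_mean_squared_error")
    return float(np.sqrt(-scores.mean()))

def random_frog(X, y, q0=2, n_iter=1000, eta=0.1, seed=None):
    """Return the selection frequency of every column of X (the 15 image parameters here)."""
    rng = np.random.default_rng(seed)
    p = X.shape[1]
    v = rng.choice(p, size=q0, replace=False)          # step (1): initial subset V0
    counts = np.zeros(p)
    err = rmsecv(X, y, v)
    for _ in range(n_iter):
        # step (2): candidate subset size Q* drawn around the current size
        q_star = int(np.clip(np.rint(rng.normal(len(v), len(v))), 1, p))
        if q_star <= len(v):                           # step (3): shrink or grow V*
            cand = rng.choice(v, size=q_star, replace=False)
        else:
            extra = rng.choice(np.setdiff1d(np.arange(p), v),
                               size=q_star - len(v), replace=False)
            cand = np.concatenate([v, extra])
        err_cand = rmsecv(X, y, cand)
        # step (4): accept the candidate according to its RMSECV
        if err_cand <= err or rng.random() < eta * err / err_cand:
            v, err = cand, err_cand
        counts[v] += 1                                 # step (5): selection frequency
    return counts / n_iter
```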
(2) Data set division and modelling

In order to build a chlorophyll content diagnosis model of the maize canopy, the valid image samples were divided into a modelling set and a validation set by the SPXY (Sample set Partitioning based on joint X-Y distance) algorithm. The SPXY algorithm is developed from the Kennard-Stone (KS) algorithm based on statistics (Galvão et al. 2005). The classification function of the SPXY algorithm partitioned the sampling data according to the joint distance dxy(p, q). At the same time, dx(p, q) and dy(p, q) were divided by their maximum values in the dataset to make sure that the samples had the same weight in both the x and y spaces. The standardized x and y distance formula is:

d_{xy}(p,q) = \dfrac{d_x(p,q)}{\max d_x(p,q)} + \dfrac{d_y(p,q)}{\max d_y(p,q)}, \quad p,q \in [1, N]    (7)

In this research, the basis of the SPXY algorithm was the joint distance of the multispectral image parameters and the SPAD values.

According to the analysis mentioned above, the Least Squares Support Vector Machine algorithm (LS-SVM) was used to build the detection model of chlorophyll content at the maize seedling stage.
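A sketch of SPXY partitioning based on equation (7) is shown below, assuming Euclidean distances for d_x between image parameter vectors and absolute differences for d_y between SPAD values; the Kennard-Stone style selection loop is a common implementation, not necessarily the exact code used by the authors.

```python
import numpy as np

def spxy_split(X, y, n_cal=200):
    """Split samples into calibration / validation sets with the SPXY joint distance (eq. 7)."""
    dx = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)   # d_x(p, q)
    dy = np.abs(y[:, None] - y[None, :])                          # d_y(p, q)
    d = dx / dx.max() + dy / dy.max()                             # eq. (7): equal weight in x and y

    # Kennard-Stone style selection on the joint distance: start from the two most
    # distant samples, then repeatedly add the sample whose minimum distance to the
    # already selected set is largest.
    selected = list(np.unravel_index(np.argmax(d), d.shape))
    remaining = [i for i in range(len(y)) if i not in selected]
    while len(selected) < n_cal:
        dist_to_sel = d[np.ix_(remaining, selected)].min(axis=1)
        selected.append(remaining.pop(int(np.argmax(dist_to_sel))))
    return np.array(selected), np.array(remaining)   # calibration indices, validation indices

# Example: 200 of the 278 valid samples for calibration, 78 for validation
# cal_idx, val_idx = spxy_split(params_matrix, spad_values, n_cal=200)
```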
3. RESULTS AND DISCUSSION

3.1 Image pre-processing and segmentation

The multi-spectral images of the maize canopy were collected in the field; the original images are shown in Fig.3, in which (a) is the RGB image and (b) is the NIR image. There is background interference from soil and dry straw in the images. Before the canopy segmentation, the images were pre-processed by median filtering, which improved the image contrast while preserving edge information as much as possible. The image segmentation was then conducted. Firstly, according to the HSI colour space, green vegetation was initially separated from the soil and straw background, as shown in Fig.3 (c) and (d). Secondly, local threshold segmentation was used to further remove the field straw interference. The results for the RGB and NIR images are shown in Fig.3 (e) and (f).

[Fig.3 panels: (a) original RGB image; (b) original NIR image; (c) segmented RGB image based on H; (d) segmented NIR image based on H; (e) RGB image segmentation result; (f) NIR image segmentation result]
Fig.3. Processing results of multispectral images
3.2 Image parameter extraction and correlation analysis

The detection parameters were extracted, including the average grey value of each band (AR, AG, AB, ANIR), the vegetation indices (ANDVI, ANDGI, ARVI, ADVI), the hue mean (AH), the vegetation coverage index (VCI), and the image texture parameters (AASM, ACON, ACOR, AEN and AL). The relationship between the parameters and the SPAD values was analysed. The results showed that the colour and vegetation index characteristics of the image had higher correlation coefficients than the texture parameters. The correlation coefficients between the detection parameters and the SPAD values are listed in Table 2. As a result, 4 parameters were selected by the correlation coefficient method: AG, AB, ANDVI and ARVI.

Table 2. Correlation coefficient between image parameters and SPAD value (r)
Parameter | r | Parameter | r | Parameter | r
AR | 0.29 | ANDVI | -0.33 | AASM | -0.11
AG | 0.32 | ANDGI | -0.10 | AEN | -0.06
AB | 0.32 | ARVI | 0.33 | ACON | 0.29
ANIR | -0.04 | ADVI | -0.24 | ACOR | -0.26
AH | -0.07 | VCI | 0.11 | AL | -0.12
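A brief sketch of the correlation screening: Pearson r between each extracted parameter and the SPAD values, keeping the parameters whose |r| exceeds a cut-off. The 0.30 cut-off shown here is an assumption, chosen only because it reproduces the four parameters AG, AB, ANDVI and ARVI from the values in Table 2.

```python
import numpy as np

def correlation_screen(X, y, names, r_min=0.30):
    """Pearson correlation of every image parameter with SPAD; keep |r| > r_min."""
    r = np.array([np.corrcoef(X[:, k], y)[0, 1] for k in range(X.shape[1])])
    keep = [n for n, rk in zip(names, r) if abs(rk) > r_min]
    return dict(zip(names, np.round(r, 2))), keep
```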
3.3 Parameter selection by RF

The RF algorithm was used to calculate the selection frequency of each parameter for chlorophyll index (SPAD value) detection. It was run fifteen times and the mean value was taken as the final result, shown in Table 3. Among the average grey values of each channel, the selection frequencies of AR and AG were higher than 0.8 and better than those of AB and ANIR. For the vegetation indices, the selection frequencies of ANDVI and ARVI were higher than 0.9, better than those of ANDGI and ADVI. Among the image texture parameters, the selection frequencies of AASM, AEN, ACON and AL were all higher than 0.8. This suggests that there were significant changes in the canopy images as the chlorophyll content index increased. The 8 parameters with a selection frequency above 0.8 were applied as the sensitive variables for the chlorophyll content detection.

Among these parameters, AR and AG reflect the spectral characteristics of plants in the red and green regions, which are the sensitive absorption band of chlorophyll and the green reflection band of vegetation, respectively. ANDVI and ARVI are two vegetation indices which reflect the growth state and vegetation coverage information in the image.
Table 3. Parameter selection by RF algorithm
Image parameter | Selection frequency | Image parameter | Selection frequency
AR | 0.8385 | ANDVI | 0.9638
AG | 0.8026 | ANDGI | 0.3372
AB | 0.0811 | ARVI | 0.9523
ANIR | 0.7541 | ADVI | 0.4498
AH | 0.0212 | VCI | 0.0857
AASM | 0.9784 | AEN | 0.9294
ACON | 0.8134 | ACOR | 0.4882
AL | 0.9482 | |
3.4 Detection model of chlorophyll content

The valid samples were divided into a calibration set and a verification set by the SPXY algorithm before modelling. 200 of the 278 valid samples were used to build the chlorophyll content index diagnosis model and 78 samples were used for verification. The estimation model of the chlorophyll content index (SPAD value) of maize leaves was established based on the LS-SVM algorithm. According to the parameters selected by the correlation coefficient and RF methods, two models were established. After normalization, the sensitive parameters were scaled to [0.1, 0.9]. The image parameters were input as independent variables and the SPAD values were taken as the dependent variable. The results of the modelling are shown in Fig.4. Firstly, a model was established with the 4 parameters selected by the correlation coefficient method (AG, AB, ANDVI and ARVI). As shown in Fig.4(a), the modelling accuracy was Rc2 = 0.56 and the validation accuracy was Rv2 = 0.41. Secondly, a model was established with the 8 parameters selected by the RF method (AR, AG, ANDVI, ARVI, AASM, AEN, ACON and AL). As shown in Fig.4(b), the calibration accuracy Rc2 was 0.87 and the verification Rv2 was 0.74. Comparing the two models, the detection model established with the RF-selected parameters could be used for rapid and non-destructive detection of the chlorophyll content of field maize.
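LS-SVM regression reduces to solving one linear system, sketched below with an RBF kernel. The [0.1, 0.9] scaling follows the text, while the kernel width, the regularization constant and all variable names are illustrative assumptions (the paper does not report them).

```python
import numpy as np

def scale_01_09(X):
    """Scale each column to [0.1, 0.9], as described in the text."""
    lo, hi = X.min(axis=0), X.max(axis=0)
    return 0.1 + 0.8 * (X - lo) / (hi - lo)

def rbf(A, B, sigma):
    """RBF (Gaussian) kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

class LSSVR:
    """Least squares SVM regression (Suykens formulation): one linear system to solve."""
    def __init__(self, gamma=10.0, sigma=1.0):   # regularization and RBF width: assumed values
        self.gamma, self.sigma = gamma, sigma

    def fit(self, X, y):
        n = len(y)
        K = rbf(X, X, self.sigma)
        A = np.zeros((n + 1, n + 1))
        A[0, 1:] = 1.0
        A[1:, 0] = 1.0
        A[1:, 1:] = K + np.eye(n) / self.gamma
        sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
        self.b, self.alpha, self.X = sol[0], sol[1:], X
        return self

    def predict(self, Xq):
        return rbf(Xq, self.X, self.sigma) @ self.alpha + self.b

# Example with hypothetical names: X_sel holds the 8 RF-selected parameters,
# cal_idx / val_idx come from the SPXY split sketched earlier.
# Xs = scale_01_09(X_sel)
# model = LSSVR().fit(Xs[cal_idx], spad[cal_idx])
# pred = model.predict(Xs[val_idx])
```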
[Fig.4(a): scatter plot of measured vs. predicted SPAD values for the modelling and verification sets, Rc2 = 0.56, Rv2 = 0.41]
(a) Detection results by correlation coefficient selected parameters
(b) Detection results by RF selected parameters
Fig.4. Results of chlorophyll content detection models

4. CONCLUSION

In this paper, multi-spectral images were measured and processed to rapidly detect the chlorophyll content of the field maize canopy. The image detection parameters were extracted to build the detection model. The results are summarized as follows:

(1) 15 parameters were extracted and analysed, including the average grey value of each channel (AR, AG, AB and ANIR), the vegetation indices (ANDVI, ANDGI, ARVI, ADVI), the vegetation coverage index (VCI), the hue average (AH) and the image texture parameters (energy (AASM), moment of inertia (ACON), correlation (ACOR), entropy (AEN) and inverse difference moment (AL)).

(2) The sensitive parameters were selected by correlation coefficient analysis and the RF method; there were 4 and 8 selected parameters, respectively. After modelling, the LS-SVM model built with the RF-selected parameters showed better detection accuracy, with Rc2 = 0.87 and Rv2 = 0.74. It could be used for rapid and non-destructive detection of the chlorophyll content of field maize.

ACKNOWLEDGMENTS

This research was supported by the National Key Research and Development Program (2016YFD0300600-2016YFD0300606, 2016YFD0300600-2016YFD0300610), the National Natural Science Fund (Grant No. 31501219) and the projects (BKBD-2017KF03, KF2018W003, 2018TC020).

REFERENCES

Houborg, R., Mccabe, M., Cescatti, A., Gao, F., Schull, M., & Gitelson, A. (2015). Joint leaf chlorophyll content and leaf area index retrieval from Landsat data using a regularized model inversion system (REGFLEC). Remote Sensing of Environment, 159, 203-221.
Liang, L., Yang, M., Zhang, L., Lin, H., & Zhou, X. (2012). Chlorophyll content inversion with hyperspectral technology for wheat canopy based on support vector regression algorithm. Transactions of the Chinese Society of Agricultural Engineering, 28(20), 162-171.
Li, Z., Jin, X., Wang, J., Yang, G., Nie, C., & Xu, X., et al. (2015). Estimating winter wheat (Triticum aestivum) LAI and leaf chlorophyll content from canopy reflectance data by integrating agronomic prior knowledge with the PROSAIL model. International Journal of Remote Sensing, 36(10), 2634-2653.
Kanke, Y., Tubaña, B., Dalen, M., & Harrell, D. (2016). Evaluation of red and red-edge reflectance-based vegetation indices for rice biomass and grain yield prediction models in paddy fields. Precision Agriculture, 17(5), 507-530.
Pan, B., Zhao, G. X., Zhu, X. C., Liu, H. T., Liang, S., & Tian, D. D. (2013). Estimation of chlorophyll content in apple tree canopy based on hyperspectral parameters. Spectroscopy & Spectral Analysis, 33(8), 2203-2206.
Feng, Y., Fan, Y. M., Li, J. L., Qian, Y. R., Yan, W., & Jie, Z. (2010). Estimating LAI and CCD of rice and wheat using hyperspectral remote sensing data. Transactions of the Chinese Society of Agricultural Engineering, 26(2), 237-243.
Gong, Z., Zhao, Y., Zhao, W., Lin, C., & Cui, T. (2014). Estimation model for plant leaf chlorophyll content based on the spectral index content. Acta Ecologica Sinica, 34(20), 9-15.
Lu, X. M., Huang, Q., Sun, X. C., Zhang, T. M., Liu, H. Z., & Zhong, X. H., et al. (2011). Application of image processing technology in rice canopy leaf area index. Chinese Agricultural Science Bulletin, 27(3), 65-68.
Meyer, G. E., & Davison, D. A. (1987). Electronic image plant growth measurement system. Transactions of the ASAE, 30(1), 165-208.
Leblanc, S. G., Chen, J. M., Fernandes, R., Deering, D. W., & Conley, A. (2005). Methodology comparison for canopy structure parameters extraction from digital hemispherical photography in boreal forests. Agricultural & Forest Meteorology, 129(3-4), 187-207.
Casa, R., & Jones, H. G. (2005). LAI retrieval from multiangular image classification and inversion of a ray tracing model. Remote Sensing of Environment, 98(4), 414-428.
Wu, Q., Sun, H., Li, M. Z., Song, Y. Y., & Zhang, Y. E. (2015). Research on maize multispectral image accurate segmentation and chlorophyll index estimation. Spectroscopy & Spectral Analysis, 35(1), 178.
Qian, W., Hong, S., Li, M. Z., & Wei, Y. (2014). Development and application of crop monitoring system for detecting chlorophyll content of tomato seedlings. International Journal of Agricultural & Biological Engineering, 7(2), 138-145.
Yao, W., Li, M., Yi, Z., Meng, Z., Hong, S., & Song, Y. (2015). Performance analysis of vehicle-mounted multi-spectral imaging system at different vehicle speeds. Transactions of the Chinese Society for Agricultural Machinery.
Drever, L., Robinson, D. M., Mcewan, A., & Roa, W. (2006). A local contrast-based approach to threshold segmentation for PET target volume delineation. Medical Physics, 33(6), 1583-1594.
Lima, K. A. B., Aires, K. R. T., & Reis, F. W. P. D. (2016). Adaptive method for segmentation of vehicles through local threshold in the Gaussian mixture model. Intelligent Systems (pp. 204-209). IEEE.
Aja-Fernández, S., Curiale, A. H., & Vegas-Sánchez-Ferrero, G. (2015). A local fuzzy thresholding methodology for multiregion image segmentation. Knowledge-Based Systems, 83(1), 1-12.
Li, H. D., Xu, Q. S., & Liang, Y. Z. (2012). Random frog: an efficient reversible jump Markov chain Monte Carlo-like approach for variable selection with applications to gene selection and disease classification. Analytica Chimica Acta, 740, 20-26.
Galvão, R. K., Araujo, M. C., José, G. E., Pontes, M. J., Silva, E. C., & Saldanha, T. C. (2005). A method for calibration and validation subset partitioning. Talanta, 67(4), 736-740.