Bioenergy Crop Identification at Field Scale Using VHR Airborne CIR Imagery


Muhammad Abdullah Sohl, Patric Schlager, Klaus Schmieder, and H.M. Rafique

Muhammad Abdullah Sohl was with the Department of Geomatics, Computer Science and Mathematics, Stuttgart University of Applied Sciences, 70174 Stuttgart, Germany; and is currently with the Water Resource Division, National Engineering Services Pakistan (Pvt) Limited, Lahore, Pakistan ([email protected]). Patric Schlager was with the Institute of Landscape and Plant Ecology (320), University of Hohenheim, 70599 Stuttgart, Germany; and is currently with the Armenian National Agrarian University, 74 Teryan, 0009 Yerevan, Armenia. Klaus Schmieder is with the Institute of Landscape and Plant Ecology (320), University of Hohenheim, 70599 Stuttgart, Germany. Hafiz Muhammad Rafique is with the School of Physical Sciences, Department of Physics, University of the Punjab, Quaid-i-Azam, Lahore 54590, Pakistan.

Abstract

The present study is aimed at developing a methodology to extract maize, a predominant energy crop, and to efficiently map its spatial distribution in a Natura 2000 region of northern Germany. Following a GEOBIA approach, segmentation was performed on two hierarchical levels: level 1 consisted of field boundaries, and level 2 represented variations within level 1. Decision rules were developed for level 2 based on spectral information, vegetation indices, standard deviations, and knowledge of crop phenology. Level 2 image objects were classified first; subsequently, the classification was transferred to level 1. Maize covered 10.6 percent of the total study area. The presented methodology gives the advanced user the flexibility to integrate expert knowledge into the classifier. In addition, the decision rules executed very quickly and helped to produce results with high accuracy.

Introduction

Energy is a major contributor to economic and social development. However, the way it is produced, distributed, and used can have a direct impact on several climatic issues and environmental resources (Kaltschmitt and Weber, 2006). Due to the adverse effects of conventional energy resources on the environment, and the finite supply of fossil fuels, many countries are converting towards renewable energy supplies. Hydropower, wind power, solar energy, geothermal energy, and bioenergy are the key renewable energies. Among these, bioenergy plays an important role due to its flexibility in supplying heat, electricity, and fuel, and its capacity for storage. Over recent years, many communities and local municipalities in Germany have set themselves the target of achieving self-sufficiency with renewable energies. In addition, political support programs such as the “Renewable Energies Act” have created a boom in the renewable energy market. Consequently, large covers of maize (a bioenergy crop) appeared in Lüchow-Dannenberg (northern Germany) and its Natura 2000 protected sites (Dziewiaty et al., 2007). As a habitat, maize has disadvantageous properties for agricultural breeding birds, which are used as indicator species for biodiversity on agricultural sites (ibid).


Area and yield of energy crops grown within a certain region are documented by agricultural statistics; their spatial distribution, however, is difficult to access by traditional methods, even though it is of great interest since spatial land-use patterns affect habitat quality. Land-cover classification has been carried out at the pixel level since the launch of the Landsat satellites, and it is still widely implemented. Yet, in very high-resolution (VHR) data, single pixels do not represent the objects of interest (geons) because the pixel size is far below the targeted object size. Thus, GEographic Object Based Image Analysis (GEOBIA), or OBIA, evolved as a new paradigm (Blaschke et al., 2014). GEOBIA allows incorporating image information beyond the spectral information used in pixel-based image analysis, such as shape, texture, and relationships of objects (Hay and Blaschke, 2010). This resolved the “salt and pepper” problem found in traditional pixel-based classification of VHR data (Yu et al., 2006; Pu et al., 2011). GEOBIA is far superior to pixel-based classification when working with VHR data (Castilla et al., 2008; Johansen et al., 2010; Benjamin et al., 2013; Tehrany et al., 2013). Advances in object-based classification are reviewed in Aplin and Smith (2008) and Arvor et al. (2013), whereas Blaschke (2010) provided the most comprehensive literature review on GEOBIA research to date.

Numerous methods have been adopted recently to classify agricultural land cover. GEOBIA proved to be more successful in land-cover mapping because crops and habitats exist in patches rather than small elements (Hernando et al., 2012b). As a result, GEOBIA approaches are increasingly practiced for land-cover classification and habitat mapping (Hay and Castilla, 2008; Tiede et al., 2008; Addink et al., 2012; Hernando et al., 2012a). Within GEOBIA, Nearest Neighbor (NN) classification and rule sets have proved their effectiveness for classifying maize (Xu et al., 2004; Brooks et al., 2006; Yu et al., 2006; Tehrany et al., 2013). Integrating expert knowledge in the classification (pixel-based or object-based) has been shown to increase accuracy (Bach et al., 2003; Stolz et al., 2005; Lucas et al., 2007; Conrad et al., 2010; Rittl et al., 2013). Pena-Barragan et al. (2011) carried out object-based classification of different crops using multitemporal ASTER satellite data in Yolo County, California; numerous vegetation indices, textural features, and knowledge of crop phenology were integrated into the classification process, and the overall accuracy reached 79 percent. De Wit and Clevers (2004) documented per-field classification within a GEOBIA framework to be three times faster and more efficient than pixel-based classification. Agricultural land cover and habitat mapping are often performed on multi-temporal data in order to achieve a high level of accuracy.


However, uncertainty remains in traditional classification approaches when working with single-date imagery, limiting their applicability to ecological research tasks and political demands. Therefore, the objective of the current study was to achieve high accuracy by efficiently classifying the bioenergy crop maize using mono-temporal VHR color-infrared (CIR) imagery.

Study Area and Dataset

Study Area

Lucie (DE2933401) covers an area of 8,229 ha and is part of the district of Lüchow-Dannenberg in Lower Saxony, Germany (Figure 1). It is a special protection area (SPA) for birds and habitats under the umbrella of Natura 2000 (Birds Directive 2009/147/EC; Habitats Directive 92/43/EEC), i.e., an area protected under EU regulations for the conservation of endangered habitats, wild fauna, and flora (Europarl, 2011). The agricultural land cover of the study area comprises maize, wheat, rye, barley, rape, grassland, and potato.

Dataset

The Digital UltraCamXp, an airborne sensor, was used to capture the VHR CIR imagery in July 2010. The UltraCamXp is a product of Microsoft Corporation with an image format of 196 megapixels (17,310 pixels across track and 11,310 pixels along track). It has a pixel pitch of 6 μm and achieves a ground sampling distance (GSD) of 2.9 cm at a flying height of 500 m (Microsoft, 2010). The imagery for this study was captured at an average flight height of 5,930 m, yielding a 35 cm GSD at 16-bit radiometric resolution.
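As a quick plausibility check, the quoted GSD values follow from the usual scale relation between flying height, pixel pitch, and focal length. The snippet below is only an illustration; the focal length of roughly 100.5 mm is an assumption for this example and is not stated in the paper.

```python
# Rough check of the ground sampling distances reported above.
PIXEL_PITCH_M = 6e-6      # 6 um pixel pitch (Microsoft, 2010)
FOCAL_LENGTH_M = 0.1005   # assumed focal length of ~100.5 mm (not from the paper)

def gsd(flying_height_m: float) -> float:
    """GSD = flying height * pixel pitch / focal length."""
    return flying_height_m * PIXEL_PITCH_M / FOCAL_LENGTH_M

print(round(gsd(500), 3))    # ~0.030 m, close to the 2.9 cm quoted for 500 m
print(round(gsd(5930), 3))   # ~0.354 m, consistent with the 35 cm GSD of this survey
```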

Figure 1. Location map of study area. Scale bar refers to expanded SPA: Lucie.


The aerial images had 60 percent forward overlap and 30 percent side overlap. The imagery was radiometrically corrected for dark-current offset and the vignetting effect. Camera calibration data (interior orientation parameters), ground control points (GCPs), global positioning system (GPS) data, and inertial measurement unit (IMU) data were provided for orthorectification. Additionally, ALK (Automatisierte Liegenschaftskarte) vector data was available to aid information extraction. The ALK is a German cadastral map containing boundary information on land parcels, which can serve to obtain better segmentation and improved classification results. A field survey of Lüchow-Dannenberg was performed in July 2010 to collect reference data on the existing agricultural land cover. During the field survey, land-cover locations, photographs, and supplementary crop parameters (e.g., crop height, soil cover) were recorded. The survey area was divided and labeled into ~1.5 square kilometer grids (Figure 2). The reference data collected in the vicinity of the study area partially covered grid numbers 737, 738, 773, 774, 946, 947, 982, 983, 1018, and 1019.

Methodology

Data Preprocessing

The aerial images, camera interior orientation parameters, and GPS and IMU data were integrated in the MATCH-AT (INPHO) software to perform aerial triangulation (AT). Sixty-one (61) images comprising four strips were needed to cover the whole area of Lucie, with twenty-three (23) reasonably distributed ground control points (GCPs). GCPs at the border of the image block were used as control points, while the remaining points served as checkpoints. GCPs were measured stereoscopically for utmost precision. Subsequently, a digital terrain model (DTM) with a 15 m grid was produced in INPHO’s MATCH-T® software using the DTM flat strategy, and the DTM was further edited for maximum precision. Orthophotos were produced using the nearest-neighbor resampling technique with 10 percent overlap and a 5 percent clip area. The final pixel size was chosen according to the original 35 cm GSD.

Figure 2. Reference data available for the sites.


Afterwards, the orthophotos were mosaicked applying the “Global Tilting” radiometric adjustment and the “Adaptive Feathering” mosaic adjustment. Global Tilting is an image-group adjustment that independently compares each channel of an image to the corresponding channels of the adjacent images and applies the best radiometric parameters; it compensates for intensity, color, and contrast variations between neighboring images (Trimble, 2011). Adaptive Feathering generates seamless mosaics by computing an automatic blending function that controls how the individual images are stitched into the output mosaic (ibid). Computing radiance and ground reflectance would only have been possible with a fully calibrated camera and/or with color targets placed in the flight area and measured during image acquisition. Since these data were not available, the final output pixels remained radiometrically corrected (dark current, vignetting, color-balanced) digital numbers (DN).
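The sketch below illustrates the general idea of a feathering blend for two overlapping orthophoto tiles, weighting each pixel by its distance to the tile border so that seams fade out gradually. It is a minimal illustration of the principle only, not the OrthoVista Adaptive Feathering implementation; the function and array layout are assumptions of this example.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def feather_blend(img_a, img_b, mask_a, mask_b):
    """Blend two overlapping orthophoto tiles with distance-based weights.

    img_a, img_b   : float arrays (rows, cols, bands) on a common grid,
                     with invalid pixels set to 0
    mask_a, mask_b : boolean arrays marking the valid pixels of each tile
    """
    # Weight each pixel by its distance to the tile border, so that the
    # influence of a tile fades out towards its edge (the "blending function").
    w_a = distance_transform_edt(mask_a)
    w_b = distance_transform_edt(mask_b)
    w_sum = np.where(w_a + w_b > 0, w_a + w_b, 1.0)
    return (img_a * w_a[..., None] + img_b * w_b[..., None]) / w_sum[..., None]
```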


Workflow

The workflow consists of three main parts: segmentation, analysis (spectral signatures), and classification. Segmentation divides an image into homogeneous regions or “image objects” (Baatz and Schäpe, 2000). The spectral signature, or spectral response curve, of an object is its reflected intensity plotted over a range of wavelengths; objects can be identified or separated from one another based on their spectral signatures and their variations. Classification is the process of categorizing an image into meaningful land-cover classes such as forest, sand, snow, vegetation, and water (Tso and Mather, 2009). Figure 3 shows the complete workflow of the methodology adopted in the present study.

Segmentation

In the object-based approach, image objects (segments) serve as the basis for classification. Different types of segmentation were tested using Trimble eCognition® 8.64. In a preliminary investigation, multi-resolution segmentation (Baatz and Schäpe, 2000) proved to be the most suitable segmentation technique for the test area. However, applying multi-resolution segmentation to the whole study area resulted in workstation memory problems and software crashes. Thus, an alternative optimized segmentation strategy was devised, comprising a combination of quadtree segmentation, multi-resolution region-grow segmentation, and spectral difference segmentation. The segmentation was implemented on two hierarchical levels (Figure 4). Chessboard segmentation was used to rasterize the field boundaries, resulting in level 1 image objects. The optimized segmentation strategy produced level 2 image objects without compromising much on their quality and took about 60 hours to segment the whole study area. Level 2 image objects were used as the smallest classification unit.
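For readers unfamiliar with quadtree segmentation, the toy function below recursively splits a single band into quadrants until each block is spectrally homogeneous. It only illustrates the principle behind the first step of the optimized strategy; the homogeneity threshold and minimum block size are arbitrary example values, not the parameters used in eCognition.

```python
import numpy as np

def quadtree_segments(band: np.ndarray, max_std: float = 1200.0,
                      min_size: int = 16) -> np.ndarray:
    """Toy quadtree splitter: recursively split a raster into quadrants until
    each block is homogeneous (std <= max_std) or reaches the minimum size.
    Returns an integer label image (one label per final block)."""
    labels = np.zeros(band.shape, dtype=np.int32)
    next_label = [1]

    def split(r0, r1, c0, c1):
        block = band[r0:r1, c0:c1]
        if (block.std() <= max_std or
                (r1 - r0) <= min_size or (c1 - c0) <= min_size):
            labels[r0:r1, c0:c1] = next_label[0]
            next_label[0] += 1
            return
        rm, cm = (r0 + r1) // 2, (c0 + c1) // 2
        split(r0, rm, c0, cm); split(r0, rm, cm, c1)
        split(rm, r1, c0, cm); split(rm, r1, cm, c1)

    split(0, band.shape[0], 0, band.shape[1])
    return labels
```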

Figure 4. Segmentation hierarchical levels; level 1 involved rasterizing parcel boundaries whereas any variations within a parcel were detected at level 2.

Figure 3. Workflow of classification. The scheme represents all the steps involved in bioenergy crop classification.


Figure 5. Spectral signatures of maize in grids 982 and 983. The NIR, Red, and Green channels are plotted on the X-axis, and the mean reflectance values (16-bit data) on the Y-axis.

Figure 6. Spectral signatures of grassland. The NIR, Red, and Green channels are plotted on the X-axis, and the mean reflectance values (16-bit data) on the Y-axis.

Analysis: Spectral Signatures

Detailed spectral analysis was performed to develop the classification decision rules. The reference data of grids 982 and 983 was used as a training set to produce spectral signatures of all agricultural land covers. The spectral signatures of maize in grids 982 and 983 are plotted in Figure 5. The variation within maize was due to different sowing times and growth stages of the crop. Mean reflectance values of maize clustered between 23,000 and 28,000 in the NIR band. Apart from maize, grassland showed the largest within-class variation. Some mowed grasslands, with NIR mean values between 25,000 and 30,000, exhibited a spectral response similar to that of late maize (Figure 6). The variation in grassland signatures resulted from different harvesting times.


The spectral response of the remaining crops in grid 982 revealed that, apart from rye, all crops have a higher spectral response in the NIR than maize (compare Figures 5 and 7). In addition, for these crops the difference between the NIR and Red bands is highly pronounced.
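The per-object statistics underlying Figures 5 to 7 (band means and standard deviations per image object) can be reproduced along the following lines. This is a schematic sketch, assuming the three bands are stacked as NIR, Red, Green and that the segmentation is available as an integer label image; it is not the eCognition feature computation itself.

```python
import numpy as np

def object_statistics(image: np.ndarray, objects: np.ndarray) -> dict:
    """Per-object mean and standard deviation for each band.

    image   : (rows, cols, 3) array with NIR, Red, Green as 16-bit DNs
    objects : (rows, cols) integer label image from the segmentation
    returns : {object_id: {'mean': (nir, red, green), 'std': (nir, red, green)}}
    """
    stats = {}
    for obj_id in np.unique(objects):
        if obj_id == 0:          # 0 = unlabeled background
            continue
        pixels = image[objects == obj_id].astype(np.float64)   # shape (n, 3)
        stats[obj_id] = {'mean': tuple(pixels.mean(axis=0)),
                         'std': tuple(pixels.std(axis=0))}
    return stats
```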

Classification Rules Development

Spectral analysis showed that some maize regions had spectral characteristics similar to those of mowed grassland, while some maize areas, such as medium maize, resembled the spectral response of rye. From a classification point of view, it was not possible to discriminate maize from the rest of the crops with a high level of accuracy because of the within-class variation of maize, rye, and grassland. Therefore, to narrow down the problem, maize was divided into three subcategories, primarily on the basis of spectral response: early maize, medium maize, and late maize.


Figure 7. Spectral signatures of the remaining crops. The NIR, Red, and Green channels are plotted on the X-axis, and the mean reflectance values (16-bit data) on the Y-axis.

Table 1. Maize (Late, Medium, and Early) Class Description in Trimble eCognition®

Late Maize: Mean NIR >= 15000; Mean NIR <= 29500; Mean Red >= 28000; Mean Red <= 43000; Mean Green >= 27000; NDVI >= -0.3; NDVI <= 0; Green/Red <= 1; NIR/Green <= 1; SD Green >= 1850

Medium Maize: Mean NIR >= 15000; Mean NIR <= 28500; Mean Red >= 19000; Mean Green >= 23000; Mean Green <= 30000; NDVI >= -0.3; NDVI <= 0.02; Green/Red <= 1; SD NIR >= 1200; SD Red >= 1500; SD Green >= 1700

Early Maize: Mean NIR >= 15000; Mean NIR <= 30000; Mean Red >= 19000; Mean Red <= 27000; Mean Green >= 23000; Mean Green <= 31000; NDVI >= -0.2; NDVI <= 0.17; Green/Red >= 1; SD NIR >= 1200; SD Red >= 1500; SD Green >= 1750

Early-seasoned and early-matured maize, mid-seasoned and medium-matured maize, and late-seasoned and late-matured maize were classified as early, medium, and late maize, respectively (Carsky et al., 2001). This subdivision served purely internal classification purposes, to map the overall maize cover accurately, and has nothing to do with the actual definition of early, medium, and late maturity of the crop. Spectral values alone were not sufficient to map maize precisely; consequently, additional parameters were devised to eliminate confusion in separating these crops. These parameters included standard deviations, the NDVI, and simple band ratios. The final decision rules developed are shown in Table 1.
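Expressed in code, the decision rules of Table 1 amount to simple threshold tests on per-object features. The sketch below assumes per-object band means and standard deviations (16-bit DNs) in a plain dictionary and derives the NDVI and the simple ratios from the band means; it mirrors the thresholds of Table 1 and the execution order described in the next section, but it is an illustration rather than the eCognition rule set itself.

```python
def classify_maize(obj: dict) -> str:
    """Apply the Table 1 decision rules to one level 2 image object.

    obj holds per-object features: mean_nir, mean_red, mean_green,
    sd_nir, sd_red, sd_green (16-bit DNs).
    """
    nir, red, green = obj['mean_nir'], obj['mean_red'], obj['mean_green']
    ndvi = (nir - red) / (nir + red)

    late = (15000 <= nir <= 29500 and 28000 <= red <= 43000 and green >= 27000
            and -0.3 <= ndvi <= 0 and green / red <= 1 and nir / green <= 1
            and obj['sd_green'] >= 1850)

    medium = (15000 <= nir <= 28500 and red >= 19000
              and 23000 <= green <= 30000 and -0.3 <= ndvi <= 0.02
              and green / red <= 1 and obj['sd_nir'] >= 1200
              and obj['sd_red'] >= 1500 and obj['sd_green'] >= 1700)

    early = (15000 <= nir <= 30000 and 19000 <= red <= 27000
             and 23000 <= green <= 31000 and -0.2 <= ndvi <= 0.17
             and green / red >= 1 and obj['sd_nir'] >= 1200
             and obj['sd_red'] >= 1500 and obj['sd_green'] >= 1750)

    # Mirror the execution order described below: late, then early, then medium.
    if late:
        return 'late maize'
    if early:
        return 'early maize'
    if medium:
        return 'medium maize'
    return 'unclassified'
```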

Classification Rules Implementation

First, the late-maize rule set was executed at level 2 (Figure 8). Bare soil near houses, having spectral characteristics similar to those of late maize, was included in the late-maize class. Subsequently, the early-maize rule set was implemented on the remaining unclassified image objects of level 2. Some small image objects, i.e., trees, were also included in the early-maize class. Finally, the medium-maize rule set was applied to the remaining unclassified objects at level 2. In the next step, early maize, medium maize, and late maize were combined into one class, namely “combined maize.”


Figure 8. Implementing early maize, medium maize, and late maize rule set.



Figure 9. Eliminating small misclassified image objects: (a) Final result of maize classification at level 2 (white color), and (b) Classifying image objects of level 1 based on level 2 (white color).

Figure 10. Spatial distribution of maize in the study area.


Misclassification was found in built-up and tree areas. Most of these misclassified image objects were not adjacent to each other. Therefore, all image objects in the class “combined maize” were merged together, and small misclassified image objects were eliminated by a query: all image objects of the combined maize class with an area smaller than 10,000 pixels were set to unclassified. Through this query, most of the small misclassified image objects were successfully removed from the combined maize class. However, some areas within other crops were still classified as maize (Figure 9a), and this misclassification could not be removed at the level 2 image objects. These misclassified regions were eliminated from the maize class by classifying the level 1 image objects (field boundaries) with the query: classify an image object at level 1 as maize if 50 percent or more of its area is covered by the level 2 class “combined maize”; otherwise the image object remains unclassified (Figure 9b).
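The two clean-up queries described above (removing combined-maize patches smaller than 10,000 pixels and labeling a level 1 parcel as maize when at least 50 percent of its area is covered by level 2 maize) can be approximated outside eCognition roughly as follows. The label images and the connected-component step are assumptions of this sketch, not the software's internal operations.

```python
import numpy as np
from scipy.ndimage import label

def classify_parcels(parcels, maize_level2, min_object_px=10000, coverage=0.5):
    """Transfer the level 2 'combined maize' result to the level 1 parcels.

    parcels      : (rows, cols) integer label image of field boundaries (level 1)
    maize_level2 : (rows, cols) boolean mask of 'combined maize' at level 2
    """
    # Step 1: drop isolated combined-maize patches smaller than min_object_px.
    patches, n = label(maize_level2)
    sizes = np.bincount(patches.ravel())
    keep = np.zeros(n + 1, dtype=bool)
    keep[1:] = sizes[1:] >= min_object_px
    cleaned = keep[patches]

    # Step 2: a parcel is labeled maize if >= 50 percent of its area is maize.
    maize_parcels = set()
    for pid in np.unique(parcels):
        if pid == 0:                      # 0 = no parcel / background
            continue
        inside = parcels == pid
        if cleaned[inside].mean() >= coverage:
            maize_parcels.add(pid)
    return maize_parcels
```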

Results and Discussion

In the study area, 874 ha of the total 8,229 ha were classified as maize, i.e., 10.6 percent of the whole study area. According to government statistics, maize covered 11 percent of the whole Lüchow-Dannenberg district; this study therefore corroborates the statistical information provided by the German government. Despite the boom in renewable energy sources driven by political support, the 10.6 percent share shows that maize has not been overproduced. Therefore, Lucie is effectively helping to protect endangered bird species by conserving their habitats. The spatial distribution of maize is shown in Figure 10. To measure the accuracy of the classification, an error matrix, also known as a confusion matrix, was created (Story and Congalton, 1986; Congalton, 1991). Table 2 shows the error matrix for grid 738 to 774.

Table 2. Error Matrix of Grid 738 to 774

                         Maize         Others          Total    User's Accuracy
Maize                841196.79      100606.69      941803.48             89.32%
Others               153584.80     3201166.80     3354751.60             95.42%
Total                994781.59     3301773.49     4296555.08
Producer's Accuracy     84.56%         96.95%                   Overall = 94.08%

The overall accuracy is the simplest estimator that can be extracted from an error matrix. The overall accuracy of this study was 94.08 percent, which is quite high; however, this value is somewhat biased by the fact that only two classes were used. For a more detailed analysis of the accuracy of the individual classes, user's and producer's accuracies were calculated. The user's accuracy of maize, which reflects the error of commission, was 89.32 percent; about 11 percent of the area classified as maize actually belonged to other agricultural land cover. The producer's accuracy of maize, which reflects the error of omission, was 84.56 percent, indicating that just over 15 percent of the maize was omitted. The higher user's accuracy compared to the producer's accuracy demonstrates that the rule set for maize was somewhat strict, which led to some maize image objects being classified as the others class. The Kappa statistic was calculated to measure the agreement beyond that expected by chance (Cohen, 1960). Its value ranges from 1 (perfect agreement) to 0 (agreement by chance). The Kappa value of 0.83 shows the robustness of the adopted methodology.
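For reference, the accuracy figures discussed above follow directly from the error matrix in Table 2; the short computation below reproduces the overall accuracy, the user's and producer's accuracies, and the Kappa value of about 0.83.

```python
import numpy as np

# Error matrix of grid 738 to 774 (Table 2): rows = classified, columns = reference.
matrix = np.array([[841196.79,  100606.69],    # classified as maize
                   [153584.80, 3201166.80]])   # classified as others

total = matrix.sum()
overall = np.trace(matrix) / total                   # ~0.9408
users = np.diag(matrix) / matrix.sum(axis=1)         # ~0.8932 (maize), ~0.9542 (others)
producers = np.diag(matrix) / matrix.sum(axis=0)     # ~0.8456 (maize), ~0.9695 (others)

# Cohen's kappa: agreement beyond what is expected by chance.
expected = (matrix.sum(axis=1) * matrix.sum(axis=0)).sum() / total**2
kappa = (overall - expected) / (1 - expected)        # ~0.83
print(overall, users, producers, kappa)
```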


In this study, GEOBIA rules implemented on single-date VHR imagery (35 cm) clearly resulted in high user's and producer's accuracies, showing that remotely sensed VHR airborne imagery was highly efficient for mapping the bioenergy crop in the Natura 2000 area of Lucie. A GSD of 35 cm yielded high accuracy in mapping maize, even though the imagery only contained information in the NIR, Red, and Green channels. In this research, maize showed diverse spectral responses as a result of different sowing times and phenology. Due to the large variation within the maize class, it presented spectral signatures similar to those of other agricultural covers, i.e., grassland and rye. The rules developed for splitting maize into three subclasses helped to map maize precisely. The object-based classification rules provided the opportunity to integrate expert knowledge by incorporating features based on spectral values, i.e., vegetation indices ranging from simple ratios to the NDVI, standard deviations, and knowledge of crop phenology. The NDVI compensated for variation caused by different illumination conditions and viewing aspects, and the standard deviations allowed limiting or masking the range of spectral values (Jensen, 2005). Simple ratios and standard deviations were introduced into the classification because the NDVI alone could not extract maize accurately; the NDVI is negatively affected by soil surface reflection within a land-cover class (Huete and Jackson, 1988). In the present study, maize, rye, and grassland were the crops most influenced by exposure of the underlying soil. For this reason, very specific thresholds were devised so that the influence of other crops on the maize class remained minimal, i.e., to keep the error of commission low. The execution of the developed rules was very fast, and the rules helped to produce highly accurate results. Stolz et al. (2005) implemented the ENPOC rule-based classifier to classify maize but obtained very low maize accuracy due to the similar spectral signatures of maize and grassland; the present study overcomes this problem by dividing maize into three subclasses. Hernando et al. (2012b) also mapped Natura 2000 habitats by applying rules (based on threshold values) to image objects, with an overall accuracy of 83.6 percent. Probability rules applied to accurate field boundaries increase classification accuracy, especially for agricultural land-cover mapping. Lucas et al. (2007) mapped agricultural land covers with an overall accuracy of 84.9 percent using field boundaries, and Conrad et al. (2010) classified different irrigated crops on a per-field basis with an overall accuracy of 80 percent.

Conclusion and Recommendations

Very high-resolution (VHR) CIR imagery showed great potential for agricultural land-cover classification. User-defined rules implemented on image objects proved their effectiveness in extracting the desired land cover with a high level of accuracy. The presented methodology helps to overcome the traditional “salt and pepper” problem of VHR data and gives the advanced user the flexibility to integrate expert knowledge into the classifier. Good cadastral data is very useful for agricultural land-cover classification; however, the accuracy of the field boundaries is of critical importance, since it is assumed that only one class exists within one field. The field boundaries ought to match the actual crop boundaries for most of the area; otherwise, per-field classification should be avoided, or field boundaries should be created or updated according to De Wit and Clevers (2004), Conrad et al. (2010), or Montaghi et al. (2013). Along with its advantages, developing rules based on threshold values also has downsides. Rules that are too lenient tend to include other crops in the desired class, and rules that are too strict tend to miss significant portions of the desired class. The development of rules takes considerable time and expert knowledge of the desired classes. Moreover, these rules are implemented on image objects created by segmentation, and the segmentation of such high-resolution data takes ample time and requires substantial computer processing resources.


For future work, rules could be developed for the remaining agricultural land covers (wheat, rye, grassland, etc.) of the study area, and the efficiency of rule-based classification could be tested when working with several classes. Since maize was one of the most difficult classes to identify in the study area, it is expected that the classification methodology will also be successful, with high accuracies, for other crops. The rule set development in this particular case was time-consuming, but it should be noted that maize had the largest within-class variation. For simpler cases, the NDVI could be sufficient to separate land covers, as documented by Hernando et al. (2012a). In order for the rule set to be transferable to multi-temporal data and data from other sensors, the DNs of the orthophotos should be converted to percent ground reflectance before decision-rule development and classification. Moreover, newer accuracy assessment approaches for object-based image classification (Radoux et al., 2010), such as the Object Fate Analysis (OFA) matrix (Schöpfer et al., 2008; Hernando et al., 2012a), could be applied to further enhance the analysis.

Acknowledgments

The authors are grateful to the German Federal Ministry of Education and Research (BMBF) and the Institute of Landscape and Plant Ecology (320), University of Hohenheim, for financing the project and for providing the data, software, and agricultural assistance that made this research possible. We also thank the Trimble eCognition team for freely providing a current copy of the eCognition software.

References

Addink, A.E., F.M.B. Van-Coillie, and S.M. De-Jong, 2012. Introduction to the GEOBIA 2010 special issue: From pixels to geographic objects in remote sensing image analysis, International Journal of Applied Earth Observation and Geoinformation, 15:1–6. URL: http://dx.doi.org/10.1016/j.jag.2011.12.001 (last date accessed: 03 June 2015). Aplin, P., and G.M. Smith, 2008. Advances in object-based image classification, International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, 37(B7):725–728. Arvor, D., L. Durieux, S. Andrés, and M.A. Laporte, 2013. Advances in Geographic Object-Based Image Analysis with ontologies: A review of main contributions and limitations from a remote sensing perspective, ISPRS Journal of Photogrammetry and Remote Sensing, 82(0):125–137: doi:10.1016/j.isprsjprs.2013.05.003. Bach, H., M. Braun, G. Lampart, and W. Mauser, 2003. The use of remote sensing for hydrological parameterization of Alpine catchments, Hydrology and Earth System Sciences, 7(6):862–876. Baatz, M., and M. Schäpe, 2000. Multiresolution segmentation - An optimization approach for high quality multi-scale image segmentation, Angewandte Geographische Informations-Verarbeitung XII (J. Strobl, T. Blaschke, and G. Griesebner, editors), Wichmann Verlag, Karlsruhe, pp. 12–23. Benjamin A.B., A.W. Timothy, F.C. Jamison, and E.M. Brenden, 2013. Does spatial resolution matter? A multi-scale comparison of object-based and pixel-based methods for detecting change associated with gas well drilling operations, International Journal of Remote Sensing, 34(5):1633–1651. Blaschke, T., 2010. Object based image analysis for remote sensing, ISPRS Journal of Photogrammetry and Remote Sensing, 65(1):2–16. Blaschke, T., G.J. Hay, M. Kelly, S. Lang, P. Hofmann, E. Addink, R. Queiroz Feitosa, F. van der Meer, H. van der Werff, F. van Coillie, and D. Tiede, 2014. Geographic Object-Based Image Analysis - Towards a new paradigm, ISPRS Journal of Photogrammetry and Remote Sensing, 87(0):180–191: doi:10.1016/j.isprsjprs.2013.09.014. Brooks, C., D. Schaub, R. Powell, N. French, and R. Shuchman, 2006. Multi-temporal and multiplatform agricultural land cover classification in southeastern Michigan, Proceedings of the ASPRS 2006 Annual Conference, 01-05 May, Reno, Nevada.


Carsky, R.J., B.B. Singh, and B. Oyewole, 2001. Contribution of early season cowpea to late season maize in the savanna zone of West Africa, Biological Agriculture & Horticulture, 18(4):303–315 Castilla, G., G.J. Hay, and J.R. Ruiz, 2008. Size-constrained region merging (SCRM): An automated delineation tool for assisted photointerpretation, Photogrammetric Engineering & Remote Sensing, 74(4):409–419. Cohen, J., 1960. A coefficient of agreement for nominal scales, Educational and Psychological Measurement, 20:37–46. Congalton, R.G., 1991. A review of assessing the accuracy of classifications of remotely sensed data, Remote Sensing of Environment, 37:35–46 Conrad, C., S. Fritsch, J. Zeidler, G. Rücker, and S. Dech, 2010. Perfield irrigated crop classification in arid central Asia using SPOT and ASTER data, Remote Sensing, 2(4):1035–1056. De Wit, A.J.W., and J. Clevers, 2004. Efficiency and accuracy of perfield classification for operational crop mapping, International Journal of Remote Sensing, 25(20):4091–4112. Dziewiaty, K., and P. Bernardy, 2007. Auswirkungen zunehmender Biomassenutzung (EEG) auf die Artenvielfalt - Erarbeitung von Handlungsempfehlungen für den Schutz der Vögel der Agrarlandschaft - Endbericht F+E-Vorhaben, Bundesministerium für Umwelt, Naturschutz und Reaktorsicherheit, 128 S, Berlin. Europarl, 2011. Parliamentary questions, European Parliament, URL: http://www.europarl.europa.eu/sides/getAllAnswers. do?reference=E-2010-011192&language=EN (last date accessed: 03 June 2015). Hay, G.J., and G. Castilla, 2008. Geographic Object-Based Image Analysis (GEOBIA): A new name for a new discipline? ObjectBased Image Analysis - Spatial Concepts for Knowledge-driven Remote Sensing Applications (T. Blaschke, S. Lang, and G.J. Hay, editors, Springer Verlag, Chapter 1.4, pp 81–92. Hay, G.J., and T. Blaschke, 2010. Forward: Special issue on geographic object-based image analysis (GEOBIA), Photogrammetric Engineering & Remote Sensing, 76(2):121–122. Hernando, A., D. Tiede, F. Albrecht, and S. Lang, 2012a. Spatial and thematic assessment of object-based forest stand delineation using an OFA-matrix, International Journal of Applied Earth Observation and Geoinformation, 19:214–225. Hernando, A., L.A. Arroyo, J. Velazquez, and R. Tejera, 2012b. Objects-based Image Analysis for mapping Natura 2000 habitats to improve forest management, Photogrammetric Engineering & Remote Sensing, 78(9):991–999. Huete, A.R., and R.D. Jackson, 1988. Soil and atmosphere influences on the spectra of partial canopies, Remote Sensing of the Environment, 25:89–105. Jensen, J.R., 2005, Introductory Digital Image Processing: A Remote Sensing Perspective, Third edition, Englewood Cliffs, New Jersey, Pearson Prentice Hall. Johansen, K., L.A. Arroyo, and S. Phinn, 2010. Comparison of geo-object based and pixel-based change detection of riparian environments using high spatial resolution multi-spectral imagery, Photogrammetric Engineering & Remote Sensing, 76(2):123–136. Kaltschmitt, M., and M. Weber, 2006. Markets for solid biofuels within the EU-15, Biomass and Bioenergy, 30:897. Lucas, R., A. Rowlands, A. Brown, S. Keyworth, and P. Bunting, 2007. A rule-based classification of multi-temporal satellite imagery for habitat and agricultural land-cover mapping, ISPRS Journal of Photogrammetry and Remote Sensing, 62:165–185. Microsoft, 2010. 
UltraCam-Xp Technical Specifications, URL: http://download.microsoft.com/download/7/4/3/743EFD09-258B-4BFA-8D56-3148C60DD137/UCAMTechnicalDocuments/UltraCamXp-Specs.pdf (last date accessed: 03 June 2015). Montaghi, A., R. Larsen, and M.H. Greve, 2013. Accuracy assessment measures for image segmentation goodness of the Land Parcel Identification System (LPIS) in Denmark, Remote Sensing Letters, 4(10):946–955: doi:10.1080/2150704X.2013.817709. Pena-Barragan, M.J., M.K. Ngugi, R.E. Plant, and J. Six, 2010. Object-based crop identification using multiple vegetation indices, textural features and crop phenology, Remote Sensing of Environment, 115(6):1301–1316.


Pu, R., S. Landry, and Q. Yu, 2011. Object-based urban detailed land cover classification with high spatial resolution IKONOS imagery, International Journal of Remote Sensing, 32(12):3285–3308. Radoux, J., P. Bogaert, and P. Defourny, 2010. Overall accuracy estimation for geographic object-based image classification, Accuracy 2010 Symposium, Leicester, UK. Richards, J.A., and X. Jia, 2006. Remote Sensing Digital Image Analysis: An Introduction, 4th edition, Springer, Heidelberg, 209 p. Rittl, T., M. Cooper, R.J. Heck, and M.V.R. Ballester, 2013. Object-based method outperforms per-pixel method for land cover classification in a protected area of the Brazilian Atlantic rainforest region, Pedosphere, 23(3):290–297: doi:10.1016/S1002-0160(13)60018-1. Schöpfer, E., S. Lang, and F. Albrecht, 2008. Object-fate analysis: Spatial relationships for the assessment of object transition and correspondence, Object-Based Image Analysis (T. Blaschke, S. Lang, and G.J. Hay, editors), Springer, Berlin, Heidelberg, pp. 786–801. Stolz, R., M. Braun, M. Probeck, R. Weidinger, and W. Mauser, 2005. Land use classification in complex terrain: The role of ancillary knowledge, EARSeL eProceedings, 4(1):94–105. Story, M., and R.G. Congalton, 1986. Accuracy assessment: A user’s perspective, Photogrammetric Engineering & Remote Sensing, 52(3):397–399.


Tehrany, M.S., B. Pradhan, and M.N. Jebur, 2013. A comparative assessment between object and pixel-based classification approaches for land use/land cover mapping using SPOT 5 imagery, Geocarto International, pp. 1–19: doi:10.1080/10106049.2013.768300. Tiede, D., S. Lang, and D. Hölbling, 2008. Class modelling of biotope complexes - Success and remaining challenges, Proceedings of GEOBIA 2008 - Pixels, Objects, Intelligence: GEOgraphic Object Based Image Analysis for the 21st Century, 05-08 August, Calgary, Alberta, Canada, pp. 6768–6774. Trimble, 2011. Reference Manual Inpho OrthoVista 4.6, pp. 56–65, URL: ftp://76.162.39.185/INPHO/ReferenceManual_OrthoVista_(English)_46.pdf (last date accessed: 03 June 2015). Tso, B., and P.M. Mather, 2009. Classification Methods for Remotely Sensed Data, Taylor and Francis, 95 p. Xu, W., B. Wu, J. Huang, Y. Zhang, and Y. Tian, 2004. Segmentation and classification approach of land cover mapping using QuickBird image, IGARSS ’04, Proceedings 2004 IEEE International (Volume 5), 20-24 September, Anchorage, Alaska, pp. 3368–3370. Yu, Q., P. Gong, N. Clinton, G. Biging, M. Kelly, and D. Schirokauer, 2006. Object-based detailed vegetation classification with airborne high spatial resolution remote sensing imagery, Photogrammetric Engineering & Remote Sensing, 72(7):799–811.

(Received 12 December 2013; accepted 08 April 2014; final version 25 September 2014)
