Automatic identification of butterfly species based on local binary patterns and artificial neural network

Applied Soft Computing 28 (2015) 132–137


Yılmaz Kaya a,∗, Lokman Kayci b, Murat Uyar c

a Department of Computer Engineering, Siirt University, 56100 Siirt, Turkey
b Department of Biology, Siirt University, 56100 Siirt, Turkey
c Department of Electrical and Electronics Engineering, Siirt University, 56100 Siirt, Turkey

Article info

Article history:
Received 10 September 2012
Received in revised form 13 October 2014
Accepted 30 November 2014
Available online 8 December 2014

Keywords:
Butterfly identification
Local binary patterns
Texture features
Artificial neural network

Abstract

Butterflies are classified primarily according to their outer morphological characters. When classification by outer morphology is not possible, their genital characters must be analyzed. The genital characteristics of a butterfly can be determined using various chemical substances and methods; currently, these processes are carried out manually by preparing genital slides of the collected butterflies through a series of steps. For some groups of butterflies, molecular techniques must be applied for identification, which is expensive. In this study, a computer vision method is proposed for automatically identifying butterfly species as an alternative to conventional identification methods. The method is based on local binary patterns (LBP) and an artificial neural network (ANN). A total of 50 butterfly images belonging to five species were used to evaluate the effectiveness of the proposed method. Experimental results demonstrate that the proposed method achieves high accuracy rates for butterfly species identification. © 2014 Elsevier B.V. All rights reserved.

1. Introduction

Insects, the most species-rich group in the animal kingdom, are represented by about one and a half million species. The order Lepidoptera, which comprises butterflies and moths and is distinguished from its closest relatives, the Trichoptera, by scales rather than hairs on the wings and by sucking mouthparts in the adults, is one of the richest insect orders, with more than 170,000 species. Wing shape, textures and colors vary in butterflies over a great range, and the patterns on butterfly wings often play an important role in distinguishing species at first glance. While such features are used as taxonomic characters as long as they are consistent within a species, the external structural features of the genital organs, especially of male individuals, must be examined to separate species that look very similar at first glance [1]. These techniques, however, are difficult to apply and time-consuming. In recent years, molecular studies have been added to these identification characteristics [2]. The studies carried out so far, although not completely decisive, have provided supporting characters for the morphological characters used in butterfly identification.

∗ Corresponding author. Tel.: +90 506 488 49 90. E-mail address: [email protected] (Y. Kaya). http://dx.doi.org/10.1016/j.asoc.2014.11.046 1568-4946/© 2014 Elsevier B.V. All rights reserved.

The aim of this study is to design an automatic computer vision system that correctly identifies butterfly species based on texture features of butterfly images. To our knowledge, there are few studies in the literature that apply machine vision and machine learning to the identification of butterfly species. In previous studies, we used spatial energies of Gabor filters (GF) at different orientations and frequencies with various classification methods to identify five butterfly species; the highest accuracy was 97% with ELM, and the accuracy obtained with ANN was 92% [3]. In another study, we obtained 96.3% classification accuracy using the gray-level co-occurrence matrix (GLCM) with multinomial logistic regression (MLR) for 10 butterfly species, while the accuracy obtained with ANN was 93.2% [4]. In addition, 92.85% accuracy was obtained with GLCM and ANN for 14 butterfly species [5]. Furthermore, we employed GLCM and the local binary pattern (LBP) for 19 butterfly species; the highest identification accuracies were obtained with ELM, at 98.25% and 96.45% respectively, while the accuracies obtained with ANN were 93.16% and 89.47% respectively [6].

For the classification of butterflies, texture features extracted from the surface of the butterfly wings were used. Texture analysis plays an important role in many image analysis applications. Texture can be defined as the visual or tactile surface characteristic of objects [7]. It can


Fig. 1. Selected samples from the five butterfly species.

also be formed by a single surface via variations in shape, illumination, shadows, absorption and reflectance [8,9]. Although in general there is no information on the cause of these variations, differences in image pixels provide a practical means of analyzing the textural properties of objects [10–14]. In this paper, we used the local binary pattern (LBP) operator to analyze the patterns of the butterfly species and extract their texture properties. The LBP texture analysis operator was introduced as a robust descriptor of microstructures in images [15], and it has already been used in a large number of computer vision applications such as visual inspection, image retrieval, remote sensing, biomedical image analysis, face image analysis, motion analysis, environment modeling and outdoor scene analysis [16–21]. The main advantages of this operator are its tolerance to illumination changes and its computational simplicity, which make it possible to analyze images in challenging real-time settings [17].

In the present study, we aim to show that texture features are also decisive among the outer morphological features used in identification. The study consists of two stages: first, texture features were obtained from the butterfly images, and then classification was performed with an artificial neural network (ANN). In recent years, ANNs have attracted increasing interest in image processing. Their advantage is that they can model nonlinear relationships and require only input and output data, without an explicit model of the underlying process. As a result of this study, the identification and classification of butterfly species using LBP texture features and ANN showed significant success.

The rest of this paper is organized as follows. Section 2 gives an overview of the butterfly species used in the study; Section 3 describes the LBP operator; Section 4 describes the feature extraction based on LBP. In Section 5, we describe the application of the artificial neural network (ANN) for pattern recognition. In Section 6 the proposed method is given. The experimental results are presented in Section 7, and Section 8 provides concluding remarks.

2. Data set

In this study, species belonging to the family Papilionidae were collected from Mount Erek, Van, between May 2002 and August 2003. Field studies were carried out at altitudes of 1800–3200 m. In the field, butterflies were caught with a net trap. After being killed in jars containing ethyl acetate, the butterflies were placed into special envelopes prepared in advance, together with labels recording the collection information. At the end of the field studies, samples in temporary storage boxes were put into softening containers. After softening for 2–3 days, the samples were pinned with standard insect pins of the appropriate sizes (no. 0 and no. 1). The samples were then spread on stretching boards and dried to prepare them as standard museum material. The drying process lasted one week in an incubator set to 50–55 °C. Locality labels were added to the samples removed from the stretching boards, and the samples were then arranged in collection drawers.

External and internal (genital) morphological features were considered in the identification of the samples. Together with the morphological features of the extremities and the organs on the head and thorax, the textures and colors on the upper and lower sides of the wings were considered in the identification based on external morphological

features. Slides of the male genital organs were prepared for samples that could not be identified from the external morphological features alone. Identification was made by comparing the genital structures with the related literature. Various handbooks, revision studies and comparison materials were used in identification, listed here in alphabetical order: Carbonell [22], Hesselbarth et al. [23], Skala [24] and Tolman [25]. Identification labels bearing the scientific names were pinned to the samples at the end of the identification process.

To characterize the texture features of the species used in the study, 10 images of each of the 5 species were taken with a Nikon professional camera. Butterfly species belonging to the family Papilionidae distributed in the Van Lake basin were used in the study, as shown in Fig. 1.

3. Local binary pattern

The local binary pattern (LBP) operator was proposed to measure local contrast in texture analysis [15]. It is defined as a gray-scale invariant texture measure, derived from a general definition of texture in a local neighborhood [17]. LBP can be seen as a unifying texture model that describes the structure of a texture with micro-textons and their statistical distribution rules. The original LBP operator, introduced by Ojala et al. [26], is a powerful means of texture description. For each pixel in an image, a binary code is produced by thresholding its neighborhood with the value of the center pixel. The basic version of the LBP operator considers only the eight neighbors of a pixel, but the definition has been extended to circular neighborhoods with any number of pixels [15,26]. Different LBP operators can be defined according to the neighborhood (see Fig. 2). In general, the operator is denoted LBP_{P,R}, where P is the number of neighbors and R is the radius of the neighborhood, so that a set of multiscale operators can be defined.

At a given pixel position (x_k, y_k), the LBP operator labels the pixels of the image by using the value of the center pixel as a threshold for its neighborhood. If a neighboring pixel value is greater than or equal to the center pixel value, that pixel takes the value 1; otherwise it takes 0. An LBP code for the neighborhood is then formed (Fig. 3). Fig. 3 shows the basic LBP, where P and R are 8 and 1, respectively. The decimal value of this binary code gives the local structural information around the given pixel. The mathematical formulation of LBP for a pixel is as follows:

LBP(x) = \sum_{i=1}^{P} S\big(G(x_i) - G(x)\big)\, 2^{i-1}    (1)

S(t) = \begin{cases} 1, & t \ge 0 \\ 0, & t < 0 \end{cases}    (2)

where x is the location of the center pixel, x_i is the location of the i-th neighboring pixel and G(·) is the pixel intensity value. Note that each bit of the LBP code has the same significance level, and two successive bit values may have totally different meanings. In fact, the LBP code may be interpreted as a kernel structure index.
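For concreteness, the following is a minimal NumPy sketch of the basic LBP(8,1) computation described by Eqs. (1) and (2); it is not the authors' implementation, the function name is ours, and border pixels are simply skipped.

```python
import numpy as np

def lbp_8_1(gray):
    """Basic LBP (P = 8, R = 1): threshold the eight neighbours of each pixel
    against the centre value (Eq. (2)) and pack the bits into a code (Eq. (1)).
    Border pixels are skipped for simplicity."""
    g = np.asarray(gray, dtype=np.int32)
    h, w = g.shape
    centre = g[1:-1, 1:-1]
    code = np.zeros_like(centre)
    # Neighbour offsets, ordered so that neighbour i contributes bit i.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = g[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        code += (neighbour >= centre).astype(np.int32) << bit
    return code.astype(np.uint8)
```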


Fig. 2. Circularly symmetric neighbor sets for different LBP operators.

Fig. 3. Calculating the original LBP code.

Fig. 4. (a) Original image, (b) gray-scaled image, (c) processed by the LBP operator (R = 1, P = 8), (d) processed by the LBP operator (R = 2, P = 8), (e) processed by the LBP operator (R = 3, P = 8) and (f) histogram of 64 LBP values.

By definition, the LBP operator is unaffected by any monotonic gray-scale transformation that preserves the pixel intensity order in a local neighborhood (Fig. 4). By applying this procedure, an LBP image is formed with pixel values ranging between 0 and 255. Each LBP value corresponds to a different pattern, and the histogram of the LBP image shows how often each of these 256 patterns appears in a given texture [27]. However, it is possible to decrease the number of patterns in an LBP histogram by using only uniform patterns, without losing much information. An LBP pattern is uniform if it contains at most two bitwise transitions from 0 to 1 or from 1 to 0 in its binary representation when the binary string is considered circular. For example, 11100001 (with 2 transitions) is a uniform pattern, whereas 11110101 (with 4 transitions) is a non-uniform pattern.
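The uniformity test described above can be expressed as a small helper that counts the circular 0/1 transitions of a P-bit code; the function names are illustrative, not from the paper.

```python
def transitions(code, bits=8):
    """Count circular 0/1 transitions in a `bits`-bit LBP code."""
    pattern = [(code >> i) & 1 for i in range(bits)]
    return sum(pattern[i] != pattern[(i + 1) % bits] for i in range(bits))

def is_uniform(code, bits=8):
    """Uniform patterns contain at most two circular transitions."""
    return transitions(code, bits) <= 2

# The examples from the text: 11100001 is uniform, 11110101 is not.
assert is_uniform(0b11100001)
assert not is_uniform(0b11110101)
```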

4. Feature extraction

Studies of saccadic eye movements indicate that local appearance plays an important role in classification [28]. People

can recognize objects because they seek particular regions where discriminating information is located. LBP features computed over a butterfly's whole wings represent only the micro-patterns, without any information about their locations. All the LBP values compose a texture spectrum, called the texture pattern matrix or LBP matrix, which is a map of the butterfly image. This map shows the distribution of gray levels and the main texture information of the image. The approach presented in this paper exploits this finding by dividing the butterfly images into sub-blocks and comparing the similarities between these sub-blocks. The butterfly images are divided into 64 sub-blocks for feature extraction. This representation captures the local textures of the images.

Five texture features are extracted from the LBP matrix: average (F1), deviation (F2), energy (F3), entropy (F4) and correlation (F5). Let f(x, y) be the texture pattern matrix of size M × N. The five features are defined as follows:

F1 = \frac{1}{MN} \sum_{x=0}^{M-1} \sum_{y=0}^{N-1} f(x, y)    (3)


Fig. 5. Artificial neuron model.

F2 = \sqrt{\frac{1}{MN} \sum_{x=0}^{M-1} \sum_{y=0}^{N-1} \big(f(x, y) - AVE\big)^2}    (4)

F3 = \sqrt{\frac{1}{MN} \sum_{x=0}^{M-1} \sum_{y=0}^{N-1} f^2(x, y)}    (5)

F4 = -\sum_{x=0}^{M-1} \sum_{y=0}^{N-1} \frac{f(x, y)}{ENE} \ln\left(\frac{f(x, y)}{ENE}\right)    (6)

F5 = \frac{\sum_{x=0}^{M-1} \sum_{y=0}^{N-1} x\, y\, f(x, y) - \mu_x \mu_y}{\sigma_x \sigma_y}    (7)

where AVE denotes the average (F1), ENE the energy (F3), μ the mean and σ the standard deviation.
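A hedged sketch of Eqs. (3)-(7) applied to an LBP (texture pattern) matrix is given below; the entropy normalizes by the energy ENE as in Eq. (6), the correlation uses index means and deviations weighted by f, which is our reading of Eq. (7), and a small epsilon is added to avoid division by zero.

```python
import numpy as np

def lbp_matrix_features(f):
    """Average, deviation, energy, entropy and correlation of a texture
    pattern (LBP) matrix f of size M x N, following Eqs. (3)-(7)."""
    f = np.asarray(f, dtype=np.float64)
    M, N = f.shape
    eps = 1e-12

    F1 = f.mean()                               # average, Eq. (3)
    F2 = np.sqrt(((f - F1) ** 2).mean())        # deviation, Eq. (4)
    F3 = np.sqrt((f ** 2).mean())               # energy, Eq. (5)
    p = f / (F3 + eps)
    F4 = -(p * np.log(p + eps)).sum()           # entropy, Eq. (6)

    # Correlation, Eq. (7): index means/deviations weighted by f (our reading).
    x = np.arange(M)[:, None]
    y = np.arange(N)[None, :]
    w = f / (f.sum() + eps)
    mu_x, mu_y = (x * w).sum(), (y * w).sum()
    sd_x = np.sqrt((((x - mu_x) ** 2) * w).sum())
    sd_y = np.sqrt((((y - mu_y) ** 2) * w).sum())
    F5 = ((x * y * w).sum() - mu_x * mu_y) / (sd_x * sd_y + eps)
    return np.array([F1, F2, F3, F4, F5])
```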

5. Artificial neural network

The artificial neural network (ANN) method is based on findings about the biological nervous system [29]. In an artificial neural system, nerve cells are joined together in a variety of ways to form networks. An ANN consists of layers, namely the input layer, the hidden layers and the output layer. The input layer receives data from the external world and the output layer presents the results to the user; the hidden layers between them are where the data are processed. The number of nerve cells in the hidden layers is significant for the performance as well as the size of the network. Usually the output of each neuron is determined by a nonlinear activation function such as the sigmoid or the hyperbolic tangent. ANNs are trained by experience: when an unknown input is applied to the network, it can generalize from past experience and produce a new result [30]. The output of a neuron is determined by Eq. (8); Fig. 5 shows a fundamental representation of an artificial neuron [31–33].

y(t + 1) = F\left(\sum_{j=1}^{n} w_{ij} x_j(t) - \theta_i\right), \qquad net_i = \sum_{j=1}^{n} w_{ij} x_j - \theta_i    (8)

In Fig. 5, the inputs of the network are represented by x_n, and each input is multiplied by a connection weight w_n. The x_n × w_n products are summed and fed through an activation function to generate the output; θ_i is a bias value and F(·) is the activation function. ANN models have been used for pattern matching, nonlinear system modeling, communications, the electrical and electronics industry, energy production, the chemical industry, medical applications, data mining and control because of their parallel processing capabilities [30]. Many different parameters must be decided when designing an ANN, such as the number of layers, the number of neurons per layer, the number of training iterations, the learning rate and the activation function.
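A minimal numerical sketch of the neuron of Fig. 5 and Eq. (8) follows; the sigmoid activation and the toy weights, inputs and bias are illustrative assumptions.

```python
import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def neuron_output(x, w, theta, activation=sigmoid):
    """Eq. (8): net = sum_j w_j * x_j - theta, output y = F(net)."""
    net = np.dot(w, x) - theta
    return activation(net)

# Toy values (hypothetical): five inputs, one neuron.
x = np.array([1.2, 0.4, 1.6, 3.1, 0.7])    # e.g. a five-feature vector
w = np.array([0.2, -0.5, 0.1, 0.05, 0.3])  # connection weights
print(neuron_output(x, w, theta=0.1))
```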

6. Proposed method for identifying butterfly species

The idea of using LBPs for butterfly species description is motivated by the fact that butterflies can be seen as a composition of micro-patterns that are properly described by this operator. The method for classifying the images proposed in this paper is based on texture descriptors extracted from the butterfly images.

Fig. 6. Block diagram of the experimental design.
Fig. 7. Scatter plot of the F1–F3 feature pair.

As illustrated in Fig. 6, our approach can be divided into four main steps (a pipeline sketch follows the list):

• Step 1: The objective of image preprocessing is to remove irrelevant noise and to enhance the image features that are important for further analysis; otherwise noise may complicate classification and reduce the recognition rate. In this paper, a preprocessing and tessellation step was implemented, which included normalizing the image and dividing it into N × N overlapping cells. The images were then resized to 256 × 256 pixels.
• Step 2: The objective of the feature extraction step is to produce a representative feature vector that accurately captures the unique and salient texture features of each butterfly image. First, the texture descriptors were calculated from a set of selected cells. Then, LBP features were extracted from the butterfly images using the feature extraction methods in Eqs. (3)–(7), giving five-dimensional feature sets for the training and testing data. The dimensions describe the different features resulting from the LBP matrix; that is, the total size of the data set is 5 × 50, where 5 is the feature dimension of each butterfly and 50 comes from 10 samples per class multiplied by 5 classes. A representation of the data set obtained from the LBP matrix is given in Fig. 7, where part of the input vector to be classified is presented as a two-dimensional scatter plot. This representation allows the nature of the features to be assessed clearly; as seen in Fig. 7, the scatter plot shows distinctive regions that can separate the butterfly species.
• Step 3: The objective of the classification step is to automatically identify the features occurring in an image in terms of the object class. In this stage, the LBP matrix features were applied as input to the ANN. The architecture of the ANN is a multilayer perceptron, and the output of the classification step is the butterfly species. The ANN training parameters used in this study are given in Table 1.
• Step 4: In this stage, classification results are described in terms of a 5 × 5 confusion matrix. The diagonal elements represent the correctly classified butterfly species; the off-diagonal elements represent misclassified butterfly species.
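The pipeline sketch below strings the four steps together under several assumptions: it reuses the hypothetical lbp_8_1 and lbp_matrix_features helpers from the earlier sketches, relies on scikit-image and scikit-learn, uses placeholder path and label lists, and substitutes scikit-learn's MLP and standard stratified 10-fold cross-validation for the MATLAB-style network and the repeated 25/25 split described in the paper.

```python
import numpy as np
from skimage import color, io, transform
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import cross_val_predict
from sklearn.neural_network import MLPClassifier

def image_features(path):
    """Steps 1-2: grayscale, resize to 256 x 256, LBP image, five features."""
    gray = color.rgb2gray(io.imread(path))                    # assumes RGB input
    gray = transform.resize(gray, (256, 256), anti_aliasing=True)
    lbp = lbp_8_1((gray * 255).astype(np.uint8))              # sketch from Section 3
    return lbp_matrix_features(lbp)                           # Eqs. (3)-(7)

# Placeholder data: fill in with the actual image paths and species labels.
paths, labels = [...], [...]
X = np.array([image_features(p) for p in paths])
y = np.array(labels)

# Steps 3-4: MLP classifier, cross-validated predictions, confusion matrix.
clf = MLPClassifier(hidden_layer_sizes=(50,), activation='logistic', max_iter=2000)
pred = cross_val_predict(clf, X, y, cv=10)
print(confusion_matrix(y, pred))
```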

Table 1
ANN architecture and training parameters.

Architecture
  The number of layers                   3
  The number of neurons in the layers    Input: 5, hidden: 50, output: 5
  The initial weights and biases         Random
  Activation functions                   Hidden: sigmoidal, output: purelin

Training parameters
  Performance function                   MSE
  Learning rule                          Levenberg–Marquardt back-propagation
  Learning rate                          0.6
  Moment constant                        0.98
  Performance goal                       2E−10
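For readers who want a rough modern analogue of the Table 1 network, the hedged Keras sketch below mirrors the 5-50-5 architecture, sigmoid hidden units, linear output and MSE loss; Levenberg–Marquardt training is not available in Keras, so Adam is substituted, and this is not the authors' implementation.

```python
import tensorflow as tf

# 5 -> 50 -> 5 network with sigmoid hidden units, linear ('purelin') outputs
# and an MSE loss, roughly following Table 1.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(5,)),                        # five LBP-matrix features
    tf.keras.layers.Dense(50, activation='sigmoid'),   # hidden layer
    tf.keras.layers.Dense(5, activation='linear'),     # one output per species
])
model.compile(optimizer='adam', loss='mse')            # LM unavailable; Adam used
model.summary()
```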

7. Results

This paper presents the classification of butterfly images with the LBP texture analysis operator. Texture analysis methods have been developed for gray-scale images, intuitively for good reasons: humans can easily capture the textures on a surface even without color information. Fig. 4 shows a photograph of a butterfly and its gray-scale version. The ANN is used for the classification process, with five classes corresponding to the five butterfly species. In the study, 50 butterfly image samples were used. The classification was carried out with the ANN using LBP feature sets obtained for different numbers of neighbors (P = 8, 12, 16, 24) and radii (R = 1, 2, 3, 4). In these experiments, a 10-fold cross-validation scheme was applied in which 25 samples were used for training the ANN and the remaining 25 samples were used as the test data set; the scheme was repeated ten times. The confusion matrices for the different LBP operators are given in Tables 2–5, and the performance values obtained with the ANN for the different LBP operators are given in Table 6. The experimental results in Table 6 show that the best classification accuracy for butterfly identification, 98.00%, was obtained with the LBP8,3, LBP8,4 and LBP12,4 operators. In general, most misclassifications occurred for species BS4.

Table 2
Confusion matrix for LBP8,1.

Butterfly species (BS)    BS1    BS2    BS3    BS4    BS5    Accuracy %
BS1                         9      0      0      1      0    90
BS2                         0     10      0      0      0    100
BS3                         0      0     10      0      0    100
BS4                         2      0      0      7      1    70
BS5                         1      0      0      0      9    90

Table 3
Confusion matrix for LBP12,2.

Butterfly species (BS)    BS1    BS2    BS3    BS4    BS5    Accuracy %
BS1                        10      0      0      0      0    100
BS2                         0     10      0      0      0    100
BS3                         0      0     10      0      0    100
BS4                         2      0      0      8      0    80
BS5                         0      0      0      0     10    100

Table 4
Confusion matrix for LBP16,3.

Butterfly species (BS)    BS1    BS2    BS3    BS4    BS5    Accuracy %
BS1                        10      0      0      0      0    100
BS2                         0     10      0      0      0    100
BS3                         0      0     10      0      0    100
BS4                         2      0      0      8      0    80
BS5                         0      0      0      0     10    100

Table 5
Confusion matrix for LBP24,4.

Butterfly species (BS)    BS1    BS2    BS3    BS4    BS5    Accuracy %
BS1                        10      0      0      0      0    100
BS2                         0     10      0      0      0    100
BS3                         0      0     10      0      0    100
BS4                         2      0      0      8      0    80
BS5                         0      0      0      0     10    100

Table 6
Classification accuracy for different P and R.

          R = 1 (%)    R = 2 (%)    R = 3 (%)    R = 4 (%)
P = 8        90           96           98           98
P = 12       90           96           96           98
P = 16       92           96           96           96
P = 24       90           96           96           96
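The per-class accuracy column of Tables 2–5 can be reproduced from a confusion matrix as sketched below, assuming rows correspond to the true species; the Table 2 matrix is used as the example.

```python
import numpy as np

def per_class_accuracy(cm):
    """Correct (diagonal) counts divided by the row totals (true-class counts)."""
    cm = np.asarray(cm, dtype=float)
    return 100.0 * np.diag(cm) / cm.sum(axis=1)

# Confusion matrix for LBP8,1 (Table 2), rows = true species BS1..BS5.
cm_lbp_8_1 = np.array([[ 9,  0,  0,  1,  0],
                       [ 0, 10,  0,  0,  0],
                       [ 0,  0, 10,  0,  0],
                       [ 2,  0,  0,  7,  1],
                       [ 1,  0,  0,  0,  9]])
print(per_class_accuracy(cm_lbp_8_1))   # -> [ 90. 100. 100.  70.  90.]
```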


8. Conclusion

The colors of butterflies, the shapes of their wings and their textures vary over a great range, to such an extent that these features play an important role in distinguishing species at first glance. While such features serve as taxonomic characters as long as they are consistent within a species, an examination of the external genital organs of male individuals is sometimes necessary to separate very similar species. In recent years, karyological research has shown that chromosome numbers and sizes are important for distinguishing species in some Agrodiaetus species (Lycaenidae). For these reasons, new classification techniques are very important. Although various techniques are used to distinguish butterfly species, machine learning and machine vision techniques have not yet been used sufficiently.

In this study, we have presented a view-based system for the recognition and identification of butterfly species with LBP and ANN. LBP has been successfully applied in image processing and image analysis, and due to its texture-discriminative properties and very low computational cost, it has become very popular in pattern recognition. LBPs are considered to be among the texture descriptors with the best results; they perform statistical feature extraction by binarizing the neighborhood of every image pixel with a local threshold determined by the central pixel. LBP features were extracted from the butterfly images and the classification process was evaluated with an ANN using the LBP features as inputs. Experimental results showed that the LBP operator can describe the main characteristics of butterfly images effectively. The best classification accuracy for butterfly identification based on the LBP operators was 98.00%.

References

[1] L. Kayci, Erek Dağı (Van) Papilionoidea ve Hesperioidea Ekolojisi ve Faunası Üzerine Araştırmalar (Lepidoptera), Priamus Suppl. 6 (2007) 1–47.
[2] P.D. Hebert, T.R. Gregory, The promise of DNA barcoding for taxonomy, Syst. Biol. 54 (5) (2005) 852–859.
[3] Y. Kaya, L. Kayci, R. Tekin, A computer vision system for the automatic identification of butterfly species via Gabor-filter-based texture features and extreme learning machine: GF + ELM, TEM J. 2 (1) (2013) 13–20.
[4] L. Kayci, Y. Kaya, A vision system for automatic identification of butterfly species using a grey-level co-occurrence matrix and multinomial logistic regression, Zool. Middle East 60 (1) (2014) 57–64.
[5] Y. Kaya, L. Kayci, Application of artificial neural network for automatic detection of butterfly species using color and texture features, Vis. Comput. 30 (1) (2014) 71–79.
[6] Y. Kaya, L. Kayci, R. Tekin, Ö.F. Ertuğrul, Evaluation of texture features for automatic detecting butterfly species using extreme learning machine, J. Exp. Theor. Artif. Intell. 26 (2) (2014) 267–281.


[7] A. Sengur, İ. Turkoglu, M.C. Ince, Wavelet packet neural networks for texture classification, Expert Syst. Appl. 32 (2) (2007) 527–533.
[8] M. Mirmehdi, X. Xie, J. Suri, Handbook of Texture Analysis, Imperial College Press, USA, 2008, pp. 133.
[9] M. Karabatak, M.C. Ince, A. Sengur, Wavelet domain association rules for efficient texture classification, Appl. Soft Comput. 11 (1) (2011) 32–38.
[10] A. Jain, G. Healey, A multiscale representation including opponent color features for texture recognition, IEEE Trans. Image Process. 7 (1) (1998) 124–128.
[11] B. Guo, R.I. Damper, S.R. Gunn, A fast separability-based feature-selection method for high-dimensional remotely sensed image classification, Pattern Recognit. 41 (2008) 1653–1662.
[12] Z. Hui, W. Runsheng, W. Cheng, A novel extended local-binary-pattern operator for texture analysis, Inform. Sci. 178 (2008) 4314–4325.
[13] A. Sengur, Wavelet transform and adaptive neuro-fuzzy inference system for color texture classification, Expert Syst. Appl. 34 (3) (2008) 2120–2128.
[14] X. Guang-ming, An identification method of malignant and benign liver tumors from ultrasonography based on GLCM texture features and fuzzy SVM, Expert Syst. Appl. 37 (2008) 6737–6741.
[15] T. Ojala, M. Pietikäinen, T. Mäenpää, Multiresolution gray-scale and rotation invariant texture classification with local binary patterns, IEEE Trans. Pattern Anal. Mach. Intell. 24 (2002) 971–987.
[16] M. Pietikäinen, Image Analysis with Local Binary Patterns, Springer-Verlag, Berlin/Heidelberg, 2005, pp. 115–118.
[17] B. Kurt, V.V. Nabiyev, Down syndrome recognition using local binary patterns and statistical evaluation of the system, Expert Syst. Appl. 38 (2011) 8690–8695.
[18] J. Bongjin, K. Taewan, K. Daijin, A compact local binary pattern using maximization of mutual information for face analysis, Pattern Recognit. 44 (2011) 532–543.
[19] N. Loris, L. Alessandra, B. Sheryl, Local binary patterns variants as texture descriptors for medical image analysis, Artif. Intell. Med. 49 (2011) 117–125.
[20] W. Zhang, Y.S. Frank, J. Ningde, L. Yinfeng, Recognition of gas–liquid two-phase flow patterns based on improved local binary pattern operator, Int. J. Multiph. Flow 36 (2010) 793–797.
[21] N. Loris, B. Sheryl, L. Alessandra, A local approach based on a Local Binary Patterns variant texture descriptor for classifying pain states, Expert Syst. Appl. 37 (2010) 7888–7894.
[22] F. Carbonell, Contribution à la connaissance du genre Agrodiaetus Hübner (1822), position taxinomique d'Agrodiaetus anticarmon Koçak, 1983 (Lepidoptera, Lycaenidae), Linneana Belgica 16 (7) (1998) 263–265.
[23] G. Hesselbarth, H.V. Oorschot, S. Wagener, Die Tagfalter der Türkei, Bochum, 1995.
[24] P. Skala, New taxa of the genus Hyponephele MUSCHAMP, 1915 from Iran and Turkey (Lepidoptera, Nymphalidae), Linneana Belgica 19 (1) (2003) 41–50.
[25] T. Tolman, Butterflies of Britain and Europe, vol. 320, Harper Collins Publishers, London, 1997.
[26] T. Ojala, M. Pietikäinen, D. Harwood, A comparative study of texture measures with classification based on feature distributions, Pattern Recognit. 29 (1996) 51–59.
[27] E. Tekeli, M. Çetin, A. Erçil, Shape and data driven texture segmentation using local binary patterns, in: 15th European Signal Processing Conference (EUSIPCO 2007), Poznan, Poland, September 3–7, 2007.
[28] S. Moore, R. Bowden, Local binary patterns for multi-view facial expression recognition, Comput. Vis. Image Underst. 115 (4) (2011) 541–558.
[29] C. Bishop, Neural Networks for Pattern Recognition, Oxford University Press, Oxford, 1995.
[30] L. Fausett, Fundamentals of Neural Networks, Prentice Hall, New York, 1994.
[31] S.B. Park, J.W. Lee, S.K. Kim, Content-based image classification using neural network, Pattern Recognit. Lett. 25 (2004) 287–300.
[32] H. Kuo-Yi, Application of artificial neural network for detecting Phalaenopsis seedling diseases using color and texture features, Comput. Electron. Agric. 57 (2007) 3–11.
[33] Y. Kaya, A. Yesilova, M.N. Almalı, An application of expectation and maximization, multiple imputation and neural network for missing value, World Appl. Sci. J. 9 (5) (2010) 561–566.