The Kolmogorov–Sinai entropy in the setting of fuzzy sets for image texture analysis and classification

Pattern Recognition 53 (2016) 229–237

Tuan D. Pham
Department of Biomedical Engineering, Linköping University, 58183 Linköping, Sweden

Article history: Received 3 September 2014; Received in revised form 17 December 2015; Accepted 19 December 2015; Available online 29 December 2015.

Keywords: Chaos; Kolmogorov–Sinai entropy; Fuzzy sets; Uncertainty; Texture analysis; Image classification

Abstract

The Kolmogorov–Sinai (K–S) entropy is used to quantify the average amount of uncertainty of a dynamical system through a sequence of observations. Sequence probabilities therefore play a central role in the computation of the entropy rate to determine whether the dynamical system under study is deterministically non-chaotic, deterministically chaotic, or random. This paper extends the notion of the K–S entropy to measure the entropy rate of imprecise systems using sequence membership grades, in which the underlying deterministic paradigm is replaced with the degree of fuzziness. While constructing sequential probabilities for the calculation of the K–S entropy is difficult in practice, the estimate of the K–S entropy in the setting of fuzzy sets in an image is feasible and can be useful for modeling uncertainty of pixel distributions in images. The fuzzy K–S entropy is illustrated as an effective feature for image analysis and texture classification.

© 2015 Elsevier Ltd. All rights reserved.

1. Introduction

Texture is an important feature for describing an attribute of an image; it is a phenomenon existing in natural scenes, physical and biological appearances, and artwork. Mathematical methods such as measures of smoothness, coarseness, and spatial regularity of distributions of pixels are often used to quantify and classify the texture content of different types of images. Although texture is inherently present in images, it is easy to recognize but difficult to define [1]. This is because texture is subject to human perception, and therefore there is no single precise mathematical definition of texture [2]. In general, there are three main approaches to the analysis of texture: statistical, structural, and spectral [3]. Statistical approaches characterize textures as smooth, coarse, grainy, and so on. Structural techniques deal with the arrangement of image primitives, such as the description of texture based on regularly spaced parallel lines. Spectral techniques are based on properties of the frequency content of an image and detect global periodicity by taking into account high-energy, narrow peaks in the spectrum. Texture analysis has a long and rich development in image processing and computer vision [4–7], with a textbook devoted to the topic of image texture [8]. In fact, the extraction of texture features continues to receive considerable attention in both technical developments [9–14] and applications [15–20].

Tel.: +46 13 286778. E-mail address: [email protected]

http://dx.doi.org/10.1016/j.patcog.2015.12.012
0031-3203/© 2015 Elsevier Ltd. All rights reserved.

It has been pointed out that texture analysis is difficult because of the lack of effective methods for characterizing texture at different scales and in terms of spatial multiresolution; therefore, techniques such as the wavelet transform and Gabor filters can be useful for texture analysis [6,21]. Although there are many developments of methods and their applications for texture analysis, the challenge of classifying texture in images still remains. This is because the properties of texture are ill-defined and complex in nature, owing to the high degree of local spatial variation of image intensity in orientation and scale. Such complexity makes it a burdensome task for current mathematical models to effectively discriminate various types of textural information. As a matter of fact, information and uncertainty are a coupled entity that exists and needs to be addressed in many complex problems of pattern recognition [22]. It is the uncertainty in information that makes real-life patterns both predictable and unpredictable at once. For example, in performance art, we know the pianist will play all the notes that Beethoven wrote, but the performance seems like it can go anywhere spontaneously. This is known as "the uncertainty principle and pattern recognition" [23]. A major school of thought for quantifying the predictability or uncertainty in complex information processing is the theory of chaos and nonlinear dynamics. Chaos is the study of surprises, of the nonlinear and the unpredictable, while traditional science deals with supposedly predictable phenomena. In spite of its wide application in many fields of science and engineering, techniques developed for measuring nonlinear behavior in chaotic signals are mainly concerned with time-series data. Little effort has been spent on the formulation and extension of chaos analysis for quantifying the unpredictability of the nature of texture. The study presented in this paper was motivated by the utilization of the theories of chaos and fuzzy sets for addressing dynamic uncertainty to characterize the spatial arrangement of the distribution of intensities in textural images. In fact, it has been pointed out that the theory of chaos and fuzzy logic are among the most interesting fields of mathematical research, and these two theories were applied to study the semantic dynamics of self-reference [24].

The rest of this paper is organized as follows. Section 2 briefly describes the concept of the Kolmogorov–Sinai (K–S) entropy. The notion of the fuzzy K–S entropy is introduced in Section 3. Section 4 presents the estimate of the fuzzy K–S entropy in images. Experimental results on different types of textural images and benchmark data are discussed in Section 5. Finally, Section 6 concludes the research findings.

2. K–S entropy

The theory of chaos and the notion of entropy measure in information theory have been found useful for solving various problems in complex systems [25–27], ranging from electromagnetic transmission of information to medicine and biology [28–30]. The three most well-known quantitative measures of chaos are the Lyapunov exponents, the Kolmogorov–Sinai (K–S) entropy, and mutual information [31,32]. This paper particularly focuses on the K–S entropy [33] and extends its notion to measure the average amount of uncertainty rate under imprecise information. Although the Shannon entropy cannot be used to identify chaotic systems, its concept forms the basis for the formulation of the K–S entropy [34]. In the analysis of time series, the K–S entropy [35–37], which is thought of as a number indicating the average time rate of newly created information induced by the time evolution of chaotic trajectories, is also known as the measure-theoretic entropy or metric entropy. It has three important properties: (1) sequence probabilities, (2) an entropy rate, and (3) limits.

To describe the state-space characteristics of a dynamical system, consider a two-dimensional state-space region represented by a box that is divided into smaller boxes or cells, each side of which has length ϵ. As the system evolves with time, the trajectories of the dynamical system propagate over a number of cells covered by the state-space region. Regarding the first property, the K–S entropy measures the uncertainty of a system associated with a sequence of outcomes or observations of the trajectories after m units of time as follows:

S_m = -k \sum_{i=1}^{N_m} p_i \log(p_i),   (1)

where S_m is the Shannon entropy calculated at time m, p_i is the probability of a trajectory in the ith cell after m units of time, N_m is the number of possible phase-space routes that the system visits at time m, k is a constant taken as 1 to simplify the mathematical expression, and p_i \log(p_i) = 0 when p_i = 0.

The second property of the K–S entropy is characterized by the rate of change of entropy as the system travels over time. The K–S entropy H_m after m units of time of length τ is expressed as

H_m = \frac{1}{\tau}(S_{m+1} - S_m),   (2)

where H_m is the rate of change of the entropy in going from time t = mτ to t = (m+1)τ. Thus, the average K–S entropy, denoted as H_{KS}, when the number of time steps, denoted as N_t, approaches infinity to cover the entire attractor, is

H_{KS} = \lim_{N_t \to \infty} \frac{1}{N_t \tau} \sum_{m=0}^{N_t - 1} (S_{m+1} - S_m) = \lim_{N_t \to \infty} \frac{1}{N_t \tau}(S_{N_t} - S_0).   (3)

It should be noted that either time steps or space steps can serve as step sizes, referring to either time evolution or spatial evolution, respectively. To maintain a coherent mathematical presentation of the K–S entropy for time series and images, which are interchangeably addressed in the following sections, the term "time step" will be used to indicate either time or space.

The third property of the K–S entropy requires two limits: one limit takes the time interval to zero, that is, \lim_{\tau \to 0}; the second limit takes the bin or cell size ϵ to zero, that is, \lim_{\epsilon \to 0}. Taking the above three properties together, the complete definition of the K–S entropy is the average entropy per unit time in the limit of time approaching infinity, and in the limits of the cell size and of the time interval going to zero. In mathematical notation,

H_{KS} = \lim_{\tau \to 0} \lim_{\epsilon \to 0} \lim_{N_t \to \infty} \frac{1}{N_t \tau}(S_{N_t} - S_0).   (4)

For discrete systems or when H_{KS} is applied to iterated maps, τ is set to 1 and \lim_{\tau \to 0} is dropped from Eq. (4) [31,32]; thus the K–S entropy becomes

H_{KS} = \lim_{\epsilon \to 0} \lim_{N_t \to \infty} \frac{1}{N_t}(S_{N_t} - S_0).   (5)
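As a numerical illustration of Eqs. (1), (2), and (5), and not part of the original paper, the following Python sketch estimates block entropies of a coarse-grained trajectory and their successive differences; the plateau of the differences approximates H_KS. The logistic-map example and the two-cell partition are illustrative assumptions.

```python
import numpy as np
from collections import Counter

def block_entropy(symbols, m):
    """Shannon entropy S_m of length-m blocks (Eq. (1) with k = 1), estimated
    from the empirical block probabilities of a symbolic sequence."""
    blocks = [tuple(symbols[i:i + m]) for i in range(len(symbols) - m + 1)]
    counts = Counter(blocks)
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    return float(-np.sum(p * np.log(p)))

def entropy_rate_estimates(symbols, m_max):
    """Entropy differences S_{m+1} - S_m (Eq. (2) with tau = 1); their plateau
    approximates the K-S entropy of a discrete-time system (Eq. (5))."""
    S = [block_entropy(symbols, m) for m in range(1, m_max + 2)]
    return [S[m + 1] - S[m] for m in range(m_max)]

# Example: coarse-grain the logistic map x -> 4x(1-x) into two cells (x < 0.5 or not).
x = 0.4
traj = []
for _ in range(20000):
    x = 4.0 * x * (1.0 - x)
    traj.append(0 if x < 0.5 else 1)
print(entropy_rate_estimates(traj, 5))  # values near log(2) ~ 0.693 for this map
```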

3. K–S entropy of fuzzy sets

Let X = {x} be a collection of points. A fuzzy set A in X is defined as a set of ordered pairs A = {(x, μ_A(x)) | x ∈ X}, where μ_A(x) is called the fuzzy membership function for the fuzzy set A, which maps each element of X to a real number in the interval [0, 1] [38]. The entropy of the fuzzy set A, denoted by D(A), is a measure of the degree of its fuzziness, which has the following three properties [39]:

1. D(A) = 0 if A is a crisp (non-fuzzy) set, that is, if μ_A(x) ∈ {0, 1} for all x;
2. D(A) is maximum if and only if μ_A(x) = 0.5 (most fuzzy) for all x ∈ X; and
3. for all x, D(A) ≥ D(A*), where μ_{A*}(x) is any "sharpened" (less fuzzy) version of μ_A(x), such that μ_{A*}(x) ≥ μ_A(x) if μ_A(x) ≥ 0.5 and μ_{A*}(x) ≤ μ_A(x) if μ_A(x) ≤ 0.5.

The above properties formally define a measure of fuzziness that is related to the vague distinction between a set X and its negation: the closer X is to its negation, the more fuzzy X. Motivated by the concept of the Shannon entropy [40], the entropy of a discrete fuzzy set A is defined as [39]

D(A) = -\sum_{i=1}^{n} \left[ \mu_A(x_i)\log(\mu_A(x_i)) + (1 - \mu_A(x_i))\log(1 - \mu_A(x_i)) \right].   (6)
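As a small illustration of Eq. (6), and not from the paper, the following sketch computes the De Luca–Termini entropy of a discrete fuzzy set and checks properties 1 and 2 above; the clipping constant is an implementation assumption to handle the 0·log(0) = 0 convention.

```python
import numpy as np

def fuzzy_entropy(mu, eps=1e-12):
    """De Luca-Termini entropy of a discrete fuzzy set (Eq. (6))."""
    mu = np.clip(np.asarray(mu, dtype=float), eps, 1.0 - eps)
    return float(-np.sum(mu * np.log(mu) + (1.0 - mu) * np.log(1.0 - mu)))

print(fuzzy_entropy([0.0, 1.0, 1.0]))   # ~0: crisp set (property 1)
print(fuzzy_entropy([0.5, 0.5, 0.5]))   # maximal: most fuzzy (property 2)
```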

Based on the definition of the entropy of a fuzzy set, the uncertainty of a fuzzy system in the context of the K–S entropy, measured with a sequence of observations after m units of time, can be defined as

D_m = k \sum_{i=1}^{N_m} F_i,   (7)

where k is previously defined as a constant and taken to be 1, and F_i is Shannon's function [39,40] of a fuzzy trajectory in the ith cell:

F_i = -\mu_i \log(\mu_i) - (1 - \mu_i)\log(1 - \mu_i),   (8)

where μ_i ∈ [0, 1] is the fuzzy membership grade of a trajectory in the ith cell (it is not necessary that \sum_i \mu_i = 1 because the notion of


a fuzzy set is nonstatistical in nature). By now, it can be seen that Eq. (7) is the K–S entropy of a finite number of fuzzy trajectories distributed over a number of cells in the state-space region of a dynamical system.

Based on the discussion in [32] about the meaning of probability in the formulation of the K–S entropy, two typical observations are addressed here to provide some insight into the differences between the use of probability and fuzzy membership in the calculation of the K–S entropy. First, suppose that all the trajectories evolve from cell to cell as time goes on. Then the probability is 1 for the occupied cell and 0 for the unoccupied cells, giving S_m = 0 for all m (Eq. (1)). Thus, for the use of probability in the K–S entropy, constant entropy corresponds to what would be interpreted as regular (deterministic) dynamics. However, fuzzy trajectories have partial membership of belonging to all cells. In this case, D_m > 0 for all m (Eq. (7)), interpreted as fuzzy dynamics. Second, suppose that the number of occupied cells N_m increases with time and that all the occupied cells have the same probability, 1/N_m; then the entropy is S_m = log(N_m) [32]. But according to Eq. (8), by taking both μ_i and its complement (1 − μ_i) into account, D_m > log(N_m). So both S_m and D_m increase as the logarithm of the number of occupied cells, but D_m increases with larger entropy. For the case when each of the M trajectory points moves to its own cell, S_m = log(M). This entropy indicates a random process [31,32]. Once again, by taking both μ_i and its complement into consideration, D_m > log(M), heuristically referring to a fuzzy system. In sum, these observations about the probabilistic entropy and the fuzzy entropy of a dynamical system may be stated as: the fuzzy entropy is an upper bound of the probabilistic entropy. This is analogous to the connection between possibility (fuzziness) and probability in that any possibility distribution defines a set of admissible probability distributions [41]. In other words, if an event is possible, it may not be probable; however, if an event is impossible, it is bound to be improbable [41]. Furthermore, this fuzzy entropy is considered as an entropy of a generalized set, which is a measure of information not necessarily related to random events [39].

Similarly to Eq. (2), the K–S entropy of fuzzy sets after m units of time can be expressed as [42]

G_m = \frac{1}{\tau}(D_{m+1} - D_m).   (9)

Now the sequential probabilities and the Shannon entropy embedded in the K–S entropy are replaced, in the setting of fuzzy sets, with sequential fuzzy membership grades and the fuzzy entropy, respectively. The motivation for introducing the fuzzy entropy into the computation of the K–S entropy is to model the characteristics of a nonlinear dynamical system that is subject to imprecision rather than to the chance of occurrence of an event. Probability and fuzziness are related, but their fundamental notions are different [43–45]. Fuzziness, which is a type of deterministic uncertainty, refers to the ambiguity of an event: it can be modeled with a degree to which an event occurs, but not whether it occurs. Probability arises from the question of whether or not an event occurs. Moreover, probability asserts that the event is crisply defined and that the law of non-contradiction holds, that is, A ∩ A^c = ∅, where A is a fuzzy set and A^c its complement. Uncertainty due to fuzziness, in contrast, occurs when the law of non-contradiction does not hold [45]. Finally, based on Eq. (5), the complete formula for the K–S entropy of fuzzy sets is expressed as

G_{KS} = \lim_{\epsilon \to 0} \lim_{N_t \to \infty} \frac{1}{N_t}(D_{N_t} - D_0).   (10)
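To make the comparison between S_m and D_m discussed above concrete, the following check (an illustration, not from the paper) uses equal probabilities and, as an assumption, equal membership grades over N occupied cells; the complement term in Eq. (8) makes the fuzzy entropy exceed log(N).

```python
import numpy as np

N = 8                               # number of occupied cells
p = np.full(N, 1.0 / N)             # equal probabilities
S_m = -np.sum(p * np.log(p))        # Shannon entropy: log(N)

mu = np.full(N, 1.0 / N)            # equal membership grades (illustrative choice)
F = -mu * np.log(mu) - (1 - mu) * np.log(1 - mu)
D_m = F.sum()                       # fuzzy entropy, Eq. (7) with k = 1

print(S_m, np.log(N))               # 2.079... == log(8)
print(D_m, D_m > S_m)               # larger, since the complement term is positive
```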

The sequence probabilities involved in Eq. (1), which can be of any type of probability, are the likelihoods that the system will follow each of the various possible routes over successive time intervals; these probabilities are computed using the product rule. The sequence fuzzy membership values associated with Eq. (8) are the degrees expressing the possible routes the system will pass through; the product rule for combining conditional probabilistic events is therefore replaced with the fuzzy product rule, to be consistent with the concept of a fuzzy relationship. Let A and B be two fuzzy sets, each of which is associated with each x ∈ X in a Euclidean n-space R^n. The product of A and B is a fuzzy set T = A × B, which is defined using the fuzzy intersection rule as follows [38,46]:

T[\mu_A(x), \mu_B(x)] = \min[\mu_A(x), \mu_B(x)] \quad \forall x \in X.   (11)

In a more general expression, the product of fuzzy sets A_1, A_2, \ldots, A_n is a fuzzy relationship T = A_1 \times A_2 \times \cdots \times A_n, which is defined as [46]

T[\mu_{A_1}(x), \mu_{A_2}(x), \ldots, \mu_{A_n}(x)] = \min_i[\mu_{A_i}(x)] \quad \forall x \in X.   (12)
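For clarity, and not as part of the paper, a one-line sketch of the fuzzy product rule of Eqs. (11)–(12): the membership grade of a route is the minimum of the grades encountered along it. The example route grades are hypothetical.

```python
import numpy as np

def sequence_membership(grades):
    """Fuzzy 'product' of a sequence of membership grades (Eqs. (11)-(12)):
    the min operator replaces the probabilistic product rule."""
    return float(np.min(grades))

print(sequence_membership([0.9, 0.6, 0.8]))   # 0.6
```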

Two alternatives have been suggested for obtaining the inner limit of Eq. (5) [31], or similarly the inner limit of Eq. (10), that is, \lim_{N_t \to \infty}. For the K–S entropy of fuzzy sets, the first alternative is to divide each entropy G_m by the associated time to get an entropy rate; the plot of the entropy rates versus time then allows the asymptotic entropy rate to be estimated as time increases. However, this can be impractical in many applications because the data must be large enough for the time events to reach the asymptotic limit or convergence. The other alternative resorts to an entropy difference [40], denoted by Δ_m, as follows:

\Delta_m = D_{m+1} - D_m.   (13)

It has been shown that, when plotted against time, both the average entropy rate and the entropy difference reach the same asymptotic value of the average entropy rate, but the plot of Δ_m versus time is the better approximation and converges sooner [40]. Therefore, the use of the entropy difference is more favorable, because it can alleviate the demand for a large dataset and consequently save computational effort. The mathematical expression for the K–S entropy of fuzzy sets estimated by the method of the entropy difference becomes

G_{KS} = \lim_{\epsilon \to 0} \lim_{m \to \infty} \Delta_m.   (14)
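A minimal helper sketch (not from the paper) for Eqs. (13)–(14): the entropy differences are simply first differences of the D_m sequence, and the asymptotic value is read off a plateau of their plot; averaging the tail is only a crude stand-in for visual inspection.

```python
import numpy as np

def entropy_differences(D):
    """Entropy differences Delta_m = D_{m+1} - D_m (Eq. (13)) for a sequence
    of fuzzy entropies D_1, D_2, ..."""
    return np.diff(np.asarray(D, dtype=float))

def plateau_value(deltas, tail=10):
    """Crude estimate of the asymptotic value of Delta_m (Eq. (14)):
    average the last few differences instead of reading the plot."""
    return float(np.mean(deltas[-tail:]))
```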

The procedure for calculating the K–S entropy of fuzzy sets using Eq. (14) is outlined as follows (a code sketch of these steps is given below).

Estimating the K–S entropy of fuzzy sets:
1. Select a cell size ϵ.
2. For a given time step, estimate the sequence fuzzy membership grades for all possible routes, and compute D_m.
3. Repeat Step 2 for the next time step to obtain D_{m+1}.
4. Compute the entropy difference Δ_m.
5. Repeat Steps 2–4 for higher successive times.
6. Plot the entropy differences versus time steps and estimate the asymptotic value of the entropy difference.
7. Stop if Δ_m becomes asymptotic toward a constant. Otherwise, go to the next step.
8. Repeat Steps 1–6 for smaller and smaller ϵ.

Having outlined a procedure for estimating the K–S entropy of fuzzy sets, the observation of the plot of the entropy differences versus time may help identify one of the mechanisms underlying the system, assuming noiseless data: nonchaotic, fuzzy chaotic, or completely fuzzy system (the conventional K–S entropy is used to distinguish whether a dynamical system is characterized by a nonchaotic or deterministic process, deterministic chaos, or randomness [31]).

A basic definition of the K–S entropy [47] is that it is the average rate of creation of information contained in a sequence of measurements. However, modeling sequential information for the calculation of the corresponding sequential probabilities cannot be readily done for many practical problems, such as the modeling of uncertainty in images. In this paper, the focus is on the construction of the K–S entropy for images, in which the use of the theory of fuzzy sets and fuzzy cluster analysis allows a convenient procedure for quantifying chaos that can be utilized as a novel feature for texture analysis and classification.
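The following sketch (not the author's code) implements Steps 2–6 of the procedure above for a fixed cell size ϵ; `route_grades` is a hypothetical placeholder for the membership estimation of Step 2, and Step 8 would wrap the whole function in a loop over decreasing ϵ.

```python
import numpy as np

def fuzzy_ks_entropy_differences(route_grades, m_max):
    """Sketch of Steps 2-6 for a fixed cell size. `route_grades(m)` is a
    user-supplied function returning the sequence fuzzy membership grades of
    all routes after m time steps (values in [0, 1]). Returns the entropy
    differences Delta_m; their plateau estimates G_KS (Eq. (14))."""
    def D(mu, eps=1e-12):                       # Eqs. (7)-(8) with k = 1
        mu = np.clip(np.asarray(mu, float), eps, 1 - eps)
        return float(np.sum(-mu * np.log(mu) - (1 - mu) * np.log(1 - mu)))

    deltas = []
    for m in range(1, m_max):                   # Steps 2-5
        deltas.append(D(route_grades(m + 1)) - D(route_grades(m)))   # Eq. (13)
    return np.array(deltas)                     # Step 6: plot and read the plateau
```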

4. K–S entropy of fuzzy information in images

This section develops a model for estimating the K–S entropy of fuzzy information in images. In fact, image properties are not random unless they are corrupted with noise. The spatial information or the appearance of various objects in many real images is inherently imprecise and non-random. A random process in an image is considered when the histogram of the image is used and treated as a probability density function that expresses the probable occurrence of a certain gray level in the image. In many images, pixels that represent different objects are very likely to have gray values that are very similar to each other. In other words, the descriptions of the image objects are vague or fuzzy. The histograms of two images containing ill-defined and well-defined objects are therefore narrow and wide, respectively.

Following the above analysis, uncertainty in an image, which is inherently due to the imprecise description of the image content, can be mathematically modeled by an arbitrary partition of the image space using the fuzzy c-means algorithm [48]. Let an image of N pixels be arbitrarily partitioned into a number of imprecise clusters using the fuzzy c-means algorithm [48]. Mathematically, let M_{fc} be a fuzzy c-partition space and X be a subset of the real g-dimensional vector space R^g: X = {x_1, x_2, \ldots, x_N} \subset R^g, where x_k = (x_{k1}, x_{k2}, \ldots, x_{kg}) \in R^g. The fuzzy c-means (FCM) clustering is based on the minimization of the fuzzy objective function J_F : M_{fc} \times R^{cg} \to R^+, which is defined as [48]

J_F(U, v) = \sum_{i=1}^{N} \sum_{j=1}^{c} (\mu_{ij})^q [d(x_i, v_j)]^2,   (15)

where q ∈ [1, ∞) is the fuzzy weighting exponent, U ∈ M_{fc}, v = (v_1, v_2, \ldots, v_c) ∈ R^{cg}, and d(x_i, v_j) is any inner-product norm metric induced on R^g. The fuzzy objective function expressed in (15) is subject to the following constraints:

\sum_{j=1}^{c} \mu_{ij} = 1, \quad i = 1, \ldots, N,   (16)

where μ_{ij} ∈ [0, 1], i = 1, \ldots, N; j = 1, \ldots, c.

The objective function J_F(U, v) is a squared-error clustering criterion that is minimized to optimally determine U and v. A solution to the minimization of the objective function is obtained by iteratively updating U and v until some convergence criterion is reached [48]. The fuzzy membership grades and cluster centers of the FCM are updated as

\mu_{iu} = \frac{1}{\sum_{j=1}^{c} \left[ d(x_i, v_u) / d(x_i, v_j) \right]^{2/(q-1)}}, \quad 1 \le u \le c,   (17)

v_j = \frac{\sum_{i=1}^{N} (\mu_{ij})^q x_i}{\sum_{i=1}^{N} (\mu_{ij})^q}, \quad \forall j.   (18)

Thus, given fuzzy c-partitions, the FCM assigns each pixel to the c clusters with its respective membership grades. In other words, FCM-based cluster analysis can be utilized to model uncertainty in an image in the context of imprecise boundaries or fuzzy classes.

The selection of the size of c is equivalent to Step 1 of the procedure for estimating the K–S entropy of fuzzy sets. To construct the sequence membership grades of an image, the fuzzy c-means partition is performed with a given c. The sequence fuzzy membership grades at each route at m+1 are computed by taking the minimum values of the fuzzy membership grades of the corresponding gray levels determined by the FCM at m and m+1 (Step 2 and Step 3), based on which the entropy difference Δ_m can be calculated (Step 4). A practical way is to calculate the entropy difference defined in Eq. (13) using the fuzzy membership grades for each cluster obtained from the FCM in any forward orientation of the image. Typical orientations are the vertical and horizontal directions of the image, which refer to the entropy differences in the image rows and columns, respectively. In fact, if a texture image is isotropic, the computation of the fuzzy K–S entropy is independent of the orientation of the image. The use of omnidirectional orientations for the computation of the fuzzy K–S entropy in images is justified in the case of anisotropic characteristics. Practical studies on extracting spatial statistical features of two-dimensional images have suggested that information obtained from the row-wise and column-wise directions of images is a good numerical approximation [49,50]. The plot of the sum of all the entropy differences in the image rows or image columns against the corresponding image rows or columns, which are the iterative steps, allows the estimate of G_KS when a plateau, where the function has or seems to have a constant value (indicating chaos), is visible on the curve.
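As an illustration of the construction above, and not the author's code, the following sketch runs a standard FCM (Eqs. (15)–(18)) on the gray levels of an image with c = 2, combines the membership grades of consecutive rows with the min rule (Eqs. (11)–(12)), and accumulates the entropy differences of Eq. (13) row by row. The exact way the paper forms sequence grades is only summarized in the text, so this is one plausible reading under the stated assumptions.

```python
import numpy as np

def fcm_1d(x, c=2, q=2.0, iters=100, tol=1e-3, seed=0):
    """Standard fuzzy c-means on scalar gray levels (Eqs. (15)-(18))."""
    rng = np.random.default_rng(seed)
    v = rng.choice(x, size=c).astype(float)            # initial cluster centers
    for _ in range(iters):
        d = np.abs(x[None, :] - v[:, None]) + 1e-9     # distances d(x_i, v_j)
        w = d ** (-2.0 / (q - 1.0))
        mu = w / w.sum(axis=0, keepdims=True)          # membership update, Eq. (17)
        v_new = (mu ** q @ x) / (mu ** q).sum(axis=1)  # center update, Eq. (18)
        if np.max(np.abs(v_new - v)) < tol:
            v = v_new
            break
        v = v_new
    return mu, v                                       # mu has shape (c, len(x))

def shannon_function_sum(mu, eps=1e-12):
    """Shannon's function of Eq. (8), summed over cells (Eq. (7), k = 1)."""
    mu = np.clip(mu, eps, 1 - eps)
    return float(np.sum(-mu * np.log(mu) - (1 - mu) * np.log(1 - mu)))

def row_entropy_differences(img, cluster=0, c=2, q=2.0):
    """Entropy differences Delta_m (Eq. (13)) over image rows for one fuzzy
    cluster; the grades of rows m and m+1 are combined with the min rule."""
    mu, _ = fcm_1d(img.astype(float).ravel(), c=c, q=q)
    mu_img = mu[cluster].reshape(img.shape)            # per-pixel membership grades
    deltas = []
    for m in range(img.shape[0] - 1):
        d_m = shannon_function_sum(mu_img[m])
        d_m1 = shannon_function_sum(np.minimum(mu_img[m], mu_img[m + 1]))  # min rule
        deltas.append(d_m1 - d_m)
    return np.array(deltas)   # plot against row index and read off the plateau
```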

5. Experimental results

5.1. Analysis of typical textural images

Fig. 1 shows the gray-scale images of (a) "Lena" (512 × 512), (b) a partial cancer cell (603 × 1819), (c) a partial normal cell (678 × 1447), and (d) human abdominal organs (246 × 366). The cancer cell is of a cell line derived from a human head and neck squamous cell carcinoma (SCC-61). The normal cell is of mouse embryonic fibroblast (MEF) cells. Both cell images were produced by the combination of focused-ion-beam and scanning-electron-microscope techniques. The organelles of the cancer and normal cells can be visually observed in the intracellular space. The image of abdominal organs is an extracted region of a computed tomography (CT) scan of a patient. The reasons for selecting these four images are that the image of Lena has been noted as a good test image because of its detail, flat regions, shading, and texture [51]; the two cell images are considered valuable biological images where texture analysis can be useful to differentiate the microscopic morphology of cancer and normal cells in the intracellular space [52]; and the CT image of abdominal organs is a typical medical image whose texture-based features have been found to be informative for detecting small-bowel ischemia [53].

Figs. 2–5 show the changes of the entropy difference with vertical (row-wise) and horizontal (column-wise) orientations of the images of Lena, SCC-61, MEF, and abdominal organs, respectively, for the number of clusters c = 2 and the fuzzy weighting exponent q = 2 of the FCM expressed in Eq. (15). The value of G_KS for each of the four images was estimated as the value at which its entropy difference reaches some positive constant, indicated by a plateau (a state of little or no change) in the plot of the entropy difference against the iteration. Typical effects of the image size on the plots of the entropy differences against time steps can be observed in Figs. 2–5, in which the oscillations at earlier entropy differences are wider for the smaller images (Fig. 1(a): Lena and (d): abdominal organs) than for the images of larger sizes (Fig. 1(b): SCC-61 cell and (c): MEF cell).


Fig. 1. Images of (a) Lena, (b) SCC-61 (cancer) cell, (c) MEF (normal) cell, and (d) abdominal organs.

Fig. 2. Change of entropy difference against space of the Lena image with c = 2 and m = 2: (a) horizontal orientation, and (b) vertical orientation; the entropy difference becomes asymptotic with value taken at the end of the curve: (a) 1.12, and (b) 6.20.

It is known that the results of the FCM can be sensitive to different values of q ∈ [1, ∞). There is no unique theoretical basis for the optimal selection of this fuzzy exponent q; the most widely adopted value for q is 2. Therefore, q = 2 and 3, and c = 2, 3, and 4, were used to estimate G_KS. The means and standard deviations of the fuzzy K–S entropy G_KS of the four images were calculated for each fixed value of q with varied c = 2, 3, and 4. These results are shown in Tables 1 and 2 for q = 2 and 3, respectively. It is noted that each fuzzy cluster provides an estimate for G_KS, and the results shown in Tables 1 and 2 are selected according to the best visible plateaus of the plots of the entropy difference versus the iteration. Based on Tables 1 and 2, the estimate of the K–S entropy varies with different values of q and c. In all cases, the estimates of the K–S entropy obtained in the vertical orientation of the images are higher than those in the horizontal orientation.

Fig. 3. Change of entropy difference against space of the SCC-61 (cancer) cell image with c = 2 and m = 2: (a) horizontal orientation, and (b) vertical orientation; the entropy difference becomes asymptotic with value taken at the end of the curve: (a) 0.14, and (b) 14.53.

To test whether the proposed G_KS is sensitive to noise, the Lena image (Fig. 1(a)) was degraded with Gaussian noise with a mean of zero and a variance of 0.005, as shown in Fig. 6. Using the number of clusters c = 2 and q = 2 in the FCM, the values of G_KS are 9.411 and 9.216 for the horizontal and vertical orientations of the noisy image, respectively (see Fig. 7). Table 3 shows the values of G_KS of Lena's face image degraded with Gaussian noise with a mean of zero and various levels of variance; these values become higher with a larger value of the variance and are higher than those obtained for the original image without added noise. This observation suggests that the pixels can be considered as fuzzy random variables, which are an important concept of uncertainty modeling in the connection of randomness with fuzziness [55–59].
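A small sketch (not from the paper) of the noise test above: additive zero-mean Gaussian noise with the stated variances, assuming the image is scaled to [0, 1] as in common imnoise-style conventions.

```python
import numpy as np

def add_gaussian_noise(img, var=0.005, seed=0):
    """Degrade an image (scaled to [0, 1]) with zero-mean Gaussian noise of the
    given variance, clipping back to the valid range."""
    rng = np.random.default_rng(seed)
    noisy = img + rng.normal(0.0, np.sqrt(var), img.shape)
    return np.clip(noisy, 0.0, 1.0)
```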

Table 1
Means and standard deviations of K–S entropy in the setting of fuzzy sets for images with various numbers of clusters and q = 2 in the FCM.

Image / # clusters    Horizontal       Vertical
Lena's face
  2                   1.12 ± 0.02      6.20 ± 0.04
  3                   1.35 ± 0.01      1.36 ± 0.03
  4                   1.75 ± 0.00      4.20 ± 0.33
SCC-61 cell
  2                   0.14 ± 0.01      14.53 ± 0.36
  3                   0.12 ± 0.00      18.49 ± 0.19
  4                   8.17 ± 0.01      19.40 ± 0.03
MEF cell
  2                   13.45 ± 0.01     32.88 ± 0.04
  3                   16.51 ± 0.03     37.97 ± 0.06
  4                   10.05 ± 0.03     33.57 ± 0.09
Abdominal organs
  2                   2.78 ± 0.02      6.80 ± 0.05
  3                   1.33 ± 0.00      4.97 ± 0.01
  4                   4.50 ± 0.01      10.12 ± 0.03

Fig. 4. Change of entropy difference against space of the MEF (normal) cell image with c = 2 and m = 2: (a) horizontal orientation, and (b) vertical orientation; the entropy difference becomes asymptotic with value taken at the end of the curve: (a) 13.45, and (b) 32.88.

Table 2
Means and standard deviations of K–S entropy in the setting of fuzzy sets for images with various numbers of clusters and q = 3 in the FCM.

Image / # clusters    Horizontal       Vertical
Lena's face
  2                   5.63 ± 0.16      7.81 ± 0.28
  3                   2.85 ± 0.15      5.26 ± 0.80
  4                   4.57 ± 0.09      7.28 ± 2.24
SCC-61 cell
  2                   9.21 ± 0.07      29.67 ± 1.59
  3                   2.86 ± 0.12      8.78 ± 0.31
  4                   2.82 ± 0.08      15.32 ± 0.62
MEF cell
  2                   7.55 ± 0.31      19.73 ± 1.02
  3                   6.72 ± 0.48      14.52 ± 0.90
  4                   1.41 ± 0.01      4.66 ± 0.38
Abdominal organs
  2                   2.02 ± 0.32      4.14 ± 0.21
  3                   2.02 ± 0.20      3.31 ± 0.17
  4                   2.39 ± 0.15      3.85 ± 0.18

Fig. 5. Change of entropy difference against space of the CT scan of a human abdomen with c = 2 and m = 2: (a) horizontal orientation, and (b) vertical orientation; the entropy difference becomes asymptotic with value taken at the end of the curve: (a) 2.78, and (b) 6.80.

Fig. 6. Lena image degraded with Gaussian noise of 0 mean and 0.005 variance.

5.2. Classification of textural images

The texture database [60] is a collection of 25 image texture classes, where each class consists of 40 image samples. All images are in grayscale JPG format, each of size 640 × 480 pixels. The database can be downloaded publicly at the following URL: http://www-cvr.ai.uiuc.edu/ponce_grp/data. Most recently, the same dataset was further studied for comparing benchmark results with fractal analysis [61]. Fig. 8 shows some typical images of the texture database, where each image belongs to a different class. In this experiment, each original image was divided into 25 sub-images of size 128 × 96 pixels. The fuzzy K–S entropy values were extracted from the sub-images, where q = 2 (the most widely adopted value in practice) and c = 2 (representing the background and foreground intensities) were specified in the FCM, giving feature vectors of 4 fuzzy K–S entropy values: two for each fuzzy cluster in the row-wise and column-wise directions. Because of the dynamic range of the fuzzy K–S entropy values in the images, the extracted K–S entropy values were partitioned by means of the LBG vector quantization (VQ) [62] with codebooks of 4, 8, 16, and 32 codevectors. To compare with the benchmark results reported in [61], the support-vector-machine (SVM) classifier was used to train and test the images, where the Gaussian radial basis function was used as the kernel function. The tests were carried out with 10-fold cross-validation that divided each dataset into 10 subsets of equal size, where 9 subsets were used for training to test on the remaining subset; this was repeated 10 times and an average accuracy was computed.
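For reference, and not the paper's code, the classification protocol described above can be sketched with scikit-learn as follows; `X` is assumed to be the matrix of 4-dimensional fuzzy K–S entropy feature vectors (one row per sub-image) and `y` the class labels, and the feature file names are hypothetical.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# X: (n_subimages, 4) fuzzy K-S entropy features (assumed precomputed), y: labels.
X = np.load('fkse_features.npy')   # hypothetical file names
y = np.load('labels.npy')

svm = SVC(kernel='rbf', gamma='scale')   # Gaussian radial basis function kernel
scores = cross_val_score(svm, X, y, cv=10)   # 10-fold cross-validation
print('SVM 10-fold CV accuracy:', scores.mean())
```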

Fig. 7. Change of entropy difference against space of the Lena image degraded with Gaussian noise of mean = 0 and variance = 0.005, with c = 2 and m = 2: (a) horizontal orientation, and (b) vertical orientation; a plateau in entropy difference at (a) 9.411 ± 0.00 (G_KS), and (b) 9.216 ± 0.00 (G_KS).

Table 3
K–S entropy in the setting of fuzzy sets for Lena's face image degraded with Gaussian noise of zero mean and various variances, with c = 2 and q = 2 in the FCM.

Noise variance    Horizontal       Vertical
0.005             9.411 ± 0.00     9.216 ± 0.00
0.01              14.378 ± 0.00    10.833 ± 0.00
0.02              19.052 ± 0.00    18.293 ± 0.00

Table 4 shows the classification rates of the images obtained from the proposed and other methods reported in [61], including the segmentation-based fractal texture analysis (SFTA), fast fractal stack (FFS), Haralick (6 GLCM features), Gabor (mean and standard deviation of the response from a filter bank), histogram (16 bins), basic features (mean, contrast, skewness, kurtosis, entropy, and standard deviation of the image), and feature combination (Haralick, histogram, basic features, and Zernike moments [64]). In comparison with the other methods, the use of the fuzzy K–S entropy (FKSE) achieves the best performance.

Table 4
SVM-based classification rates (%) of texture images using 8 different features.

Feature                 Classification rate (%)
SFTA                    90.80
FFS                     75.30
Haralick                82.40
Histogram               75.60
Feature combination     86.90
Gabor                   87.70
Basic texture           13.20
Fuzzy K–S entropy       95.70

Furthermore, because the Haralick method, which is also known as the gray-level co-occurrence matrix (GLCM) [63], is one of the most popular methods for texture analysis, the performance of the fuzzy K–S entropy was also compared with the GLCM using the k-NN classifier, where k = 1. The reason for using one of the simplest classification methods, the non-parametric k-NN, is to test the performance of the features themselves. The division of the original images and the VQ design were carried out with the same procedures for both types of features. The entropy of the GLCM was calculated as the GLCM feature, using the position operator "1 pixel to the right and 1 pixel down" and 8 gray levels. Table 5 shows the total average classification rates of the GLCM and FKSE with codebooks of 4, 8, 16, and 32 codevectors; the results obtained from the fuzzy K–S entropy are almost twice as high as those obtained from the GLCM entropy. These results show the relatively significant superiority of the fuzzy K–S entropy over the GLCM entropy for the classification of the various textural patterns of this dataset.

Table 5
Total average k-NN-based classification rates (%) of texture images using the GLCM entropy and the fuzzy K–S entropy.

Codevectors    GLCM entropy    Fuzzy K–S entropy
4              13              25
8              15              26
16             15              26
32             14              25

Fig. 8. Fifteen samples of the surface texture image database [60]: (a) bark, (b) brick, (c) carpet, (d) corduroy, (e) floor, (f) fur, (g) glass, (h) granite, (i) knit, (j) marble, (k) pebble, (l) upholstery, (m) wallpaper, (n) water, and (o) wood.
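For completeness, a sketch (not from the paper) of the baseline GLCM entropy feature described above, using the position operator "1 pixel to the right and 1 pixel down" and 8 gray levels; the 8-bit input range is an assumption, and the co-occurrence matrix is accumulated directly with numpy rather than a library routine.

```python
import numpy as np

def glcm_entropy(img, levels=8):
    """Entropy of a gray-level co-occurrence matrix with offset (1 right, 1 down),
    after quantizing an 8-bit image to `levels` gray levels."""
    q = np.clip(np.floor(img.astype(float) / 256.0 * levels).astype(int), 0, levels - 1)
    glcm = np.zeros((levels, levels), dtype=float)
    a = q[:-1, :-1].ravel()          # reference pixels
    b = q[1:, 1:].ravel()            # neighbors one step right and one step down
    np.add.at(glcm, (a, b), 1.0)
    p = glcm / glcm.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))
```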

5.3. Discussion on numbers of clusters and time steps in estimating values of the fuzzy K–S entropy in images

According to the formulation of the K–S entropy of fuzzy sets expressed in Eq. (14), the estimation of the entropy of fuzzy information in an image rests on the choice of the number of clusters c, which plays a similar role to that of the number of cells for a deterministic system. In fact, Eq. (14) requires two general types of theoretical operations to estimate the entropy. The first type of operation is to select an arbitrary size ϵ, keep ϵ constant, and compute the entropy rate as time gets larger and approaches infinity, in order to estimate the asymptotic entropy rate. This involves the inner limit of Eq. (10) (lim as N_t → ∞) or Eq. (14) (lim as m → ∞). The numbers of time steps, which are the sizes (rows and columns) of the image, used in the calculation of the entropy difference approximately reached the asymptotic limit, that is, convergence. This implies that, given a sufficient size of an image, the convergence of the entropy difference is feasible for the computation of the fuzzy K–S entropy.

The second theoretical operation is to repeat the whole procedure with smaller and smaller cell size ϵ (increased partitions or possible states). This operation gives the outer limit (lim as ϵ → 0) of Eq. (10) or Eq. (14), to get estimates of the inner limit. This type of operation equivalently requires increasing the number of clusters c (increased partitions) specified in the FCM clustering of an image. In principle, c can be increased up to N − 1, where N is the number of pixels in the image. However, using such large numbers for c, the cluster analysis results are unlikely to be meaningful. For the practical purposes of this study, the estimates of asymptotic values of the entropy difference for different partitions may not be necessary: firstly, these estimates are computationally expensive, and secondly, the asymptotic value of the entropy difference obtained from a constant value of c for each image can be appropriately used as a feature for pattern classification. What is more important is a good choice of c, which is an influential factor for effective cluster analysis of the data under study.

In the analysis of the four typical textural images (Fig. 1), the numbers of clusters in these images are not well defined and may subjectively range from 2 to 4. Therefore, three numbers of clusters (c = 2, 3, and 4) were adopted to estimate an average value of the fuzzy K–S entropy. As mentioned earlier, in the experiment using the texture database, the selection of c = 2 was due to the assumption that all images approximately consist of pixels of a background and an object. In this study, the FCM results converged within a tolerance of 0.001 in the partitioning of all images. Furthermore, it was observed that different initializations of the fuzzy membership matrix resulted in stable computations of the cluster centers. For example, using a sub-image of size 128 × 96 of the floor image shown in Fig. 8(e), the first and second runs of the FCM took 82 and 81 iterations, respectively, to reach convergence in the computation of the two cluster centers, and both FCM runs gave the same values of the cluster centers for the two image intensity partitions: 133.5474 and 82.2095. Therefore, the experimental results reported earlier were based on a single run of the FCM.

6. Conclusion

A new concept for calculating the K–S entropy in the framework of the theory of fuzzy sets has been presented and applied to extract a useful feature for texture image analysis and classification, whereas the modeling of the K–S entropy for images is otherwise difficult to construct. The fuzzy K–S entropy of an image is a number that characterizes the nonlinear behavior of an ensemble of fuzzy membership grades of a partition of an image space. The extended framework of the K–S entropy can be useful for the analysis of imprecise uncertainty existing in many real-life pattern recognition problems.

Issues for further development of the current study include optimal selections of the number of clusters, c, and the fuzzy weighting exponent, q, of the FCM in order to produce an optimal fuzzy K–S entropy estimate of the image data. On the other hand, the estimates of the fuzzy K–S entropy using different values of c and q can be utilized as feature vectors for the classification of complex patterns in images inherently existing in biological and medical pattern recognition problems [65,54]. In general, the use of the FCM presented in this paper serves as a potential way for computing the fuzzy K–S entropy of images. Other methods for estimating the fuzzy K–S entropy for data of high dimensions are worth developing. It is also of interest to further study the proposed fuzzy K–S entropy to quantify information underlined with fuzzy random variables, which assign fuzzy subsets to corresponding possible random outcomes [66]. In the context of images, the variability of the pixel distribution in an image can be characterized by randomness to include some noise source, whereas the content of the image appearance is fuzzy and often described using typical approximate linguistic expressions such as "regular", "nearly regular", "stochastic", and "nearly stochastic" texture [67].

Conflict of interest The author declares that there are no conflicts of interest.

Acknowledgments The author thanks the Handling Editor, Nicole Vincent, and the two anonymous reviewers for their constructive comments and suggestions, which helped improve the paper. The microscope (cell) images were provided by Kazuhisha Ichikawa of the Institute of Medical Science of the University of Tokyo. The CT (abdomen) image was provided by Taichiro Tsunoyama of the Teikyo University School of Medicine. Satoshi Haga of the University of Aizu assisted the author in partial coding and testing the proposed method with the image data. This research was carried out while the author was with the University of Aizu, and supported in part by FY2014 CRF Scheme under Grant no. P-36.

References

[1] D.A. Forsyth, J. Ponce, Computer Vision: a Modern Approach, Prentice Hall, New Jersey, 2003. [2] M. Nixon, A. Aguado, Feature Extraction and Image Processing for Computer Vision, 3rd edition, Academic Press, London, 2012. [3] R.C. Gonzalez, R.E. Woods, Digital Image Processing, 3rd edition, Pearson Prentice Hall, New Jersey, 2010. [4] J. Mao, A.K. Jain, Texture classification and segmentation using multiresolution simultaneous autoregressive models, Pattern Recognit. 25 (1992) 173–188. [5] J. Zhang, T. Tan, Brief review of invariant texture analysis methods, Pattern Recognit. 35 (2002) 735–747.


[6] E. de Ves, D. Acevedo, A. Ruedin, X. Benavent, A statistical model for magnitudes and angles of wavelet frame coefficients and its application to texture retrieval, Pattern Recognit. 47 (2014) 2925–2939. [7] Z. Ji, W. Wang, Detect foreground objects via adaptive fusing model in a hybrid feature space, Pattern Recognit. 47 (2014) 2952–2961. [8] M. Petrou, O.G. Sevilla, Image Processing: Dealing with Texture, Wiley, New York, 2006. [9] C. Scharfenberger, A. Wong, D.A. Clausi, Structure-guided statistical textural distinctiveness for salient region detection in natural images, IEEE Trans. Image Process. 24 (2015) 457–470. [10] J.M. Guo, H. Prasetyo, Content-based image retrieval using features extracted from halftoning-based block truncation coding, IEEE Trans. Image Process. 24 (2015) 1010–1024. [11] J. Xie, L. Zhang, J. You, S. Shiu, Effective texture classification by texton encoding induced statistical features, Pattern Recognit. 48 (2015) 447–457. [12] A. Hafiane, K. Palaniappan, G. Seetharaman, Joint adaptive median binary patterns for texture classification, Pattern Recognit. 48 (2015) 2609–2620. [13] T. Song, H. Li, F. Meng, Q. Wu, B. Luo, Exploring space-frequency co-occurrences via local quantized patterns for texture representation, Pattern Recognit. 48 (2015) 2621–2632. [14] S. Hegenbart, A. Uhl, A scale- and orientation-adaptive extension of local binary patterns for texture classification, Pattern Recognit. 48 (2015) 2633–2644. [15] R. Parekh, Using texture analysis for medical diagnosis, IEEE MultiMed. 19 (2012) 28–37. [16] E. Rexhepaj, M. Agnarsdottir, J. Bergman, P.H. Edqvist, M. Bergqvist, M. Uhlen, W.M. Gallagher, E. Lundberg, F. Ponten, A texture based pattern recognition approach to distinguish melanoma from non-melanoma cells in histopathological tissue microarray sections, PLoS One 8 (5) (2013) e62070, http://dx.doi. org/10.1371/journal.pone.0062070. [17] J.S. Nelson, O.I. Christianson, B.A. Harkness, M.T. Madsen, E. Mah, S.R. Thomas, H. Zaidi, E. Samei, Improved nuclear medicine uniformity assessment with noise texture analysis, J. Nucl. Med. 55 (2014) 169–174. [18] M. Knop, B.A. Edgar, Tracking protein turnover and degradation by microscopy: photo-switchable versus time-encoded fluorescent proteins, Open Biol. 4 (2014) 140002, http://dx.doi.org/10.1098/rsob.140002. [19] A. Depeursinge, A. Foncubierta-Rodriguez, D. Van De Ville, H. Muller, Threedimensional solid texture analysis in biomedical imaging: review and opportunities, Med. Image Anal. 18 (2014) 176–196. [20] I. Theodorakopoulos, D. Kastaniotis, G. Economou, S. Fotopoulos, HEp-2 cells classification via sparse representation of textural features fused into dissimilarity space, Pattern Recognit. 47 (2014) 2367–2378. [21] T. Chang, C.C.J. Kuo, Texture analysis and classification with tree-structured wavelet transform, IEEE Trans. Image Process. 2 (1993) 429–441. [22] G.J. Klir, Uncertainty and Information: Foundations of Generalized Information Theory, Wiley-IEEE Press, New Jersey, 2005. [23] R. Fritz, The uncertainty principle and pattern recognition, URL: 〈 http://www. robertfritz.com/index.php?content ¼writingnr&news_id ¼102〉, 2008 (accessed 15 August 2014). [24] P. Grim, Self-reference and chaos in fuzzy logic, IEEE Trans. Fuzzy Syst. 1 (1993) 237–253. [25] A.M. Fraser, Information and entropy in strange attractors, IEEE Trans. Inf. Theory 35 (1989) 245–262. [26] S. Baptista, E.J. Ngamga, Paulo R.F. Pinto, J. Margarida Brito, Kurths, Kolmogorov–Sinai entropy from recurrence times, Phys. Lett. 
A 374 (2010) 1135–1140. [27] J. Gao, F. Liu, J. Zhang, J. Hu, Y. Cao, Information entropy as a basic building block of complexity theory, Entropy 15 (2013) 3396–3418. [28] F.J. Escribano, A. Wagemakers, M.A.F. Sanjuan, Chaos-based turbo systems in fading channels, IEEE Trans. Circuits Syst. I 61 (2014) 530–541. [29] J.E. Skinner, M. Molnar, T. Vybiral, M. Mitra, Application of chaos theory to biology and medicine, Integr. Physiol. Behav. Sci. 27 (1992) 39–53. [30] T.D. Pham, The butterfly effect in ER dynamics and ER-mitochondrial contacts, Chaos Solitons Fractals 65 (2014) 5–19. [31] G.P. Williams, Chaos Theory Tamed, Joseph Henry Press, Washington, DC, 1997. [32] R.C. Hilborn, Chaos and Nonlinear Dynamics, 2nd edition, Oxford University Press, New York, 2000. [33] L.S. Young, Entropy, Lyapunov exponents, and Hausdorff dimension in differentiable dynamical systems, IEEE Trans. Circuits Syst. 30 (1983) 599–607. [34] E. Ott, Chaos in Dynamical Systems, 2nd edition, Cambridge University Press, New York, 2002.


[35] A.N. Kolmogorov, Entropy per unit time as a metric invariant of automorphism, Dokl. Russ. Acad. Sci. 124 (1959) 754–755. [36] Y.G. Sinai, On the notion of entropy of a dynamical system, Dokl. Russ. Acad. Sci. 124 (1959) 768–771. [37] Y.G. Sinai, Introduction to Ergodic Theory, Princeton University Press, Princeton, 1976. [38] L.A. Zadeh, Fuzzy sets, Inf. Control 8 (1965) 338–353. [39] A. De Luca, S. Termini, A definition of a nonprobabilistic entropy in the setting of fuzzy sets theory, Inf. Control 20 (1972) 301–312. [40] C.E. Shannon, W. Weaver, The Mathematical Theory of Communication, University of Illinois Press, Urbana, 1949. [41] L.A. Zadeh, Fuzzy sets as a basis for a theory of possibility, Fuzzy Sets Syst. 1 (1978) 3–28. [42] T.D. Pham, Classification of complex biological aging images using fuzzy Kolmogorov–Sinai entropy, J. Phys. D: Appl. Phys. 47 (2014) 402–485. [43] J.C. Bezdek, P.K. Sankar (Eds.), Fuzzy Models for Pattern Recognition: Methods that Search for Structures in Data, IEEE Press, New York, 1992. [44] D. Dubois, H. Prade, Fuzzy sets and probability: misunderstandings, bridges and gaps, in: Proceedings of the 2nd IEEE International Conference on Fuzzy Systems, 1993, pp. 1059–1068. [45] B. Kosko, Fuzziness vs. probability, Int. J. Gen. Syst. 17 (1990) 211–240. [46] R.R. Yager, D.P. Filev, Essentials of Fuzzy Modeling and Control, John Wiley & Sons, New York, 1994. [47] H. Atmanspacher, H. Scheingraber, A fundamental link between system theory and statistical mechanics, Found. Phys. 17 (1987) 939–963. [48] J.C. Bezdek, Pattern Recognition with Fuzzy Objective Function Algorithms, Plenum Press, New York, 1981. [49] G. Ramstein, M. Raffy, Analysis of the structure of radiometric remotelysensed images, Int. J. Remote Sens. 10 (1989) 1049–1073. [50] T.D. Pham, Geostatistical entropy for texture analysis: an indicator kriging approach, Int. J. Intell. Syst. 29 (2014) 253–265. [51] D.C. Munson, A note on Lena, IEEE Trans. Image Process. 5 (1) (1996) 〈http:// www.cs.cmu.edu/  chuck/lennapg/editor.html〉 (accessed 07 August 2015).. [52] T.D. Pham, K. Ichikawa, Spatial chaos and complexity in the intracellular space of cancer and normal cells, Theor. Biol. Med. Model. 10 (2013) 62, http://dx. doi.org/10.1186/1742-4682-10-62. [53] T. Tsunoyama, T.D. Pham, T. Fujita, T. Sakamoto, Identification of intestinal wall abnormalities and ischemia by modeling spatial uncertainty in computed tomography imaging findings, Comput. Methods Programs Biomed. 117 (2014) 30–39. [54] Y. Chen, T.D. Pham, Entropy and regularity dimension in complexity analysis of cortical surface structure in early Alzheimer's disease and aging, J. Neurosci. Methods 215 (2013) 210–217. [55] H. Kwakernaak, Fuzzy random variables–I. Definitions and theorems, Inf. Sci. 15 (1978) 1–29. [56] H. Kwakernaak, Fuzzy random variables–II. Algorithms and examples for the discrete case, Inf. Sci. 17 (1979) 253–278. [57] M.L. Puri, Fuzzy random variables, J. Math. Anal. Appl. 114 (1986) 409–422. [58] E.P. Klement, Fuzzy random variables, Ann. Univ. Sci. Bp. Sect. Comput. 12 (1991) 143–149. [59] V. Kratschmer, Constraints on belief functions imposed by fuzzy random variables: some technical remarks on Romer-Kandel, IEEE Trans. Syst. Man Cybern.: Part B 28 (1998) 881–883. [60] S. Lazebnik, C. Schmid, J. Ponce, A sparse texture representation using local affine regions, EEE Trans. Pattern Anal. Mach. Intell. 27 (2005) 1265–1278. [61] A.F. Costa, G. Humpire-Mamani, A.J.M. 
Traina, An efficient algorithm for fractal analysis of textures, in: Proceedings of the 25th SIBGRAPI Conference on Graphics, Patterns and Images, 2012, pp. 39–46. [62] Y. Linde, A. Buzo, R. Gray, An algorithm for vector quantizer design, IEEE Trans. Commun. 28 (1980) 84–95. [63] R.M. Haralick, K. Shanmugan, I. Dinstein, Textural features for image classification, IEEE Trans. Syst. Man Cybern. 3 (1973) 610–621. [64] A. Khotanzad, Y.H. Hong, Invariant image recognition by Zernike moments, IEEE Trans. Pattern Anal. Mach. Intell. 12 (1990) 489–497. [65] N. Orlov, L. Shamir, T. Macura, J. Johnston, D.M. Eckley, I.G. Goldberg, WNDCHARM: multi-purpose image classification using compound image transforms, Pattern Recognit. Lett. 29 (2008) 1684–1693. [66] M.A. Gila, M. Lopez-Díaza, D.A. Ralescu, Overview on the development of fuzzy random variables, Fuzzy Sets Syst. 157 (2006) 409–422. [67] W.C. Lin, J. Hays, C. Wu, V. Kwatra, Y. Liu, Quantitative evaluation of near regular texture synthesis algorithms, in: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2006, pp. 427–434.

Tuan D. Pham received his Ph.D. degree in Civil Engineering in 1995 from the University of New South Wales, Sydney, Australia. He is a Professor of Biomedical Engineering at Linkoping University, Sweden. Prior to this current position, he held positions as a Professor and a Leader of the Aizu Research Cluster for Medical Engineering and Informatics at The University of Aizu, Japan; and Group Leader of Bioinformatics Research at The University of New South Wales, Canberra, Australia. His current research interests include image processing, pattern recognition, fractals and chaos applied to biology and medicine. His research has been funded by the Australian Research Council, JSPS (Japan), academic institutions, and industry. He has served as an Area Editor, Associate Editor, and Editorial Board Member of several journals and book series including Pattern Recognition (Elsevier), Current Bioinformatics (Bentham), Recent Patents on Computer Science (Bentham), Proteomics Insights (open access journal, Libertas Academica Press), Book Series on Bioinformatics and Computational BioImaging (Artech House), International Journal of Computer Aided Engineering and Technology (Inderscience Publishers). Pham has served as chair and technical committee member of more than 30 international conferences in the fields of image processing, pattern recognition, and computational life sciences.