A domain attraction criterion for interval fuzzy neural networks


Computers and Mathematics with Applications 58 (2009) 508–513


Tingwen Huang a, Chuandong Li b, Zhigang Zeng c,*

a Texas A&M University at Qatar, Doha, P.O. Box 5825, Qatar
b Department of Computer Science and Engineering, Chongqing University, Chongqing 400030, China
c Department of Control Science and Engineering, Huazhong University of Science and Technology, Wuhan, Hubei 430074, China

Article history: Received 12 July 2008; Received in revised form 6 December 2008; Accepted 24 March 2009

Keywords: Domain attraction; Interval fuzzy neural networks; Stability

Abstract. In this paper, we study qualitative properties of equilibrium points in a class of interval fuzzy neural networks and obtain an estimate of the domain of robust attraction of locally exponentially stable equilibrium points. Both the conditions and the estimate are formulated in terms of the parameter intervals, so they are easily verifiable. The result can be used to evaluate the fault-tolerance capability of interval fuzzy neural networks.

1. Introduction

In the past two decades, the dynamics of neural networks, and especially their stability, have been extensively investigated [1–19] because of important applications in areas such as pattern recognition and artificial intelligence, and many profound results on dynamical behaviors have been obtained. In parallel, Yang et al. [15–17] proposed fuzzy neural networks, which combine fuzzy logic with traditional neural networks and can be used in image processing and pattern recognition. In practice, the stability of fuzzy neural networks is as important as that of traditional neural networks. Yang et al. [15–17] have investigated the existence and uniqueness of the equilibrium point and the stability of fuzzy neural networks without delays.

A neural network can become unstable because of unavoidable modeling errors, external disturbances and parameter fluctuations during implementation on very-large-scale integration chips. This implies that a well-designed neural network should have a certain robustness against such errors, disturbances and fluctuations. To deal with this problem for neural networks with uncertainty, Liao and Yu [11] introduced interval neural networks. Since then, researchers have obtained various criteria for the robust stability of neural networks with or without delays. However, most of the published works on the dynamics of fuzzy neural networks consider global stability. Qualitative analysis of local dynamical properties is another important aspect of the study of fuzzy neural networks. Recently, the qualitative study of the local dynamics of Hopfield neural networks has received considerable attention. Cao and Tao [1] and Yang et al. [18,19] derived various estimates of the domains of attraction of locally exponentially stable equilibrium points for Hopfield neural networks. To the best of our knowledge, there have been no results on the estimate of the domains of attraction of locally exponentially stable equilibrium points in interval fuzzy neural

The first author is grateful for the support of Texas A&M University at Qatar. This work was supported by the Natural Science Foundation of China under Grant 60774051, the Program for New Century Excellent Talents in Universities of China under Grant NCET-06-0658, and the Fok Ying Tung Education Foundation under Grant 111068. * Corresponding author. E-mail addresses: [email protected] (T. Huang), [email protected] (C. Li), [email protected] (Z. Zeng).



networks in the literature so far. In this paper, we focus on estimating the attraction domain of locally exponentially stable equilibrium points under conditions that are easily verifiable.

The rest of the paper is organized as follows. In Section 2, the problem formulation and preliminaries are given. In Section 3, a sufficient criterion for estimating the domain of attraction of interval fuzzy neural networks is obtained. Finally, conclusions are drawn in Section 4.

2. Problem formulation and preliminaries

In this paper, we consider fuzzy neural networks described by the following form:

$$\frac{dx_i}{dt} = -d_i x_i(t) + \sum_{j=1}^{n}\xi_{ij} f_j(x_j(t)) + \sum_{j=1}^{n} b_{ij}\mu_j + G_i + \bigwedge_{j=1}^{n}\gamma_{ij} f_j(x_j(t)) + \bigvee_{j=1}^{n}\delta_{ij} f_j(x_j(t)) + \bigwedge_{j=1}^{n} T_{ij}\mu_j + \bigvee_{j=1}^{n} H_{ij}\mu_j, \quad i = 1,\dots,n \tag{1}$$

where γ_ij, δ_ij, T_ij and H_ij are elements of the fuzzy feedback MIN template, the fuzzy feedback MAX template, the fuzzy feedforward MIN template and the fuzzy feedforward MAX template, respectively; b_ij are elements of the feedforward template; d_i ≥ 0; ⋀ and ⋁ denote the fuzzy AND and fuzzy OR operations, respectively; x_i, μ_i and G_i denote the state, input and bias of the ith neuron, respectively; f_i is the activation function. In this paper, we assume that the activation functions are bounded and satisfy the following condition:

H: f(x) = (f_1(x_1), ..., f_n(x_n)) is a twice differentiable bounded function defined on R and satisfies

$$0 < \max_{1\le i\le n}\ \sup_{-\infty < x_i < \infty} |f_i''(x_i)| = M_f < \infty. \tag{2}$$
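To make the role of the fuzzy templates concrete, the following is a minimal numerical sketch of model (1), in which the fuzzy AND and OR are taken as componentwise min and max over j (the usual reading in fuzzy cellular neural networks); the tanh activation, the forward-Euler step and all template values are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

def fuzzy_nn_rhs(x, D, Xi, Gamma, Delta, B, T, H, mu, G, f=np.tanh):
    """Right-hand side of model (1): fuzzy AND = min over j, fuzzy OR = max over j."""
    fx = f(x)
    return (-D * x
            + Xi @ fx                      # ordinary feedback template
            + B @ mu + G                   # feedforward template and bias
            + np.min(Gamma * fx, axis=1)   # fuzzy feedback MIN template
            + np.max(Delta * fx, axis=1)   # fuzzy feedback MAX template
            + np.min(T * mu, axis=1)       # fuzzy feedforward MIN template
            + np.max(H * mu, axis=1))      # fuzzy feedforward MAX template

# illustrative two-neuron example, integrated by forward Euler
rng = np.random.default_rng(0)
n = 2
D = np.array([1.0, 1.2])
Xi, Gamma, Delta = 0.1 * rng.standard_normal((3, n, n))
B, T, H = 0.1 * rng.standard_normal((3, n, n))
mu, G = np.array([0.5, -0.3]), np.array([0.1, 0.0])

x = np.array([0.2, -0.1])
for _ in range(5000):
    x = x + 0.001 * fuzzy_nn_rhs(x, D, Xi, Gamma, Delta, B, T, H, mu, G)
print("approximate equilibrium:", x)
```

Any other bounded, twice differentiable activation satisfying (2) could be substituted for np.tanh in this sketch.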

In the practical implementation of neural networks, the deviations and perturbations of the connection weights are in general bounded. Therefore, the coefficients d_i, ξ_ij, γ_ij and δ_ij may be intervalized as follows:

$$\begin{aligned}
D_I &:= [\underline{D}, \overline{D}] = \{D = \mathrm{diag}(d_i) : \underline{D} \le D \le \overline{D},\ \text{i.e.},\ \underline{d}_i \le d_i \le \overline{d}_i,\ i = 1,\dots,n\},\\
\Xi_I &:= [\underline{\Xi}, \overline{\Xi}] = \{\Xi = (\xi_{ij})_{n\times n} : \underline{\Xi} \le \Xi \le \overline{\Xi},\ \text{i.e.},\ \underline{\xi}_{ij} \le \xi_{ij} \le \overline{\xi}_{ij},\ i,j = 1,\dots,n\},\\
\Gamma_I &:= [\underline{\Gamma}, \overline{\Gamma}] = \{\Gamma = (\gamma_{ij})_{n\times n} : \underline{\Gamma} \le \Gamma \le \overline{\Gamma},\ \text{i.e.},\ \underline{\gamma}_{ij} \le \gamma_{ij} \le \overline{\gamma}_{ij},\ i,j = 1,\dots,n\},\\
\Delta_I &:= [\underline{\Delta}, \overline{\Delta}] = \{\Delta = (\delta_{ij})_{n\times n} : \underline{\Delta} \le \Delta \le \overline{\Delta},\ \text{i.e.},\ \underline{\delta}_{ij} \le \delta_{ij} \le \overline{\delta}_{ij},\ i,j = 1,\dots,n\}.
\end{aligned} \tag{3}$$

Moreover, for convenience, we let, for i, j = 1, ..., n,

$$\xi_{ij}^{*} = \max\{|\underline{\xi}_{ij}|, |\overline{\xi}_{ij}|\}, \quad \gamma_{ij}^{*} = \max\{|\underline{\gamma}_{ij}|, |\overline{\gamma}_{ij}|\}, \quad \delta_{ij}^{*} = \max\{|\underline{\delta}_{ij}|, |\overline{\delta}_{ij}|\}. \tag{4}$$
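As a small illustration of (3) and (4), the interval magnitudes ξ*_ij, γ*_ij, δ*_ij are elementwise maxima of the absolute interval endpoints; the sketch below also forms ζ*_ij = ξ*_ij + γ*_ij + δ*_ij, the combined quantity used in Theorem 1 below (the interval endpoints are hypothetical examples, not data from the paper).

```python
import numpy as np

# illustrative interval endpoints (lower, upper) for a two-neuron network
Xi_lo,    Xi_hi    = np.array([[-0.2, 0.1], [0.0, -0.3]]), np.array([[0.3, 0.2], [0.1, 0.1]])
Gamma_lo, Gamma_hi = np.array([[-0.1, 0.0], [0.1, -0.2]]), np.array([[0.2, 0.1], [0.2, 0.0]])
Delta_lo, Delta_hi = np.array([[-0.1, -0.1], [0.0, 0.0]]), np.array([[0.1, 0.2], [0.1, 0.1]])

def star(lo, hi):
    """Elementwise magnitude bound as in (4): max(|lower|, |upper|)."""
    return np.maximum(np.abs(lo), np.abs(hi))

xi_star, gamma_star, delta_star = star(Xi_lo, Xi_hi), star(Gamma_lo, Gamma_hi), star(Delta_lo, Delta_hi)
zeta_star = xi_star + gamma_star + delta_star   # zeta*_ij used in Theorem 1
print(zeta_star)
```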

It is noted that bounded activation functions always guarantee the existence of an equilibrium point for model (1). Let x* = (x_1*, ..., x_n*)^T be an equilibrium point of model (1). To simplify the notation later, we shift the equilibrium point x* of (1) to the origin by using the transformation y(t) = x(t) − x*. Model (1) then becomes

$$\frac{dy_i}{dt} = -d_i y_i(t) + \sum_{j=1}^{n}\xi_{ij}\bigl(f_j(x_j(t)) - f_j(x_j^*)\bigr) + \bigwedge_{j=1}^{n}\gamma_{ij} f_j(x_j(t)) - \bigwedge_{j=1}^{n}\gamma_{ij} f_j(x_j^*) + \bigvee_{j=1}^{n}\delta_{ij} f_j(x_j(t)) - \bigvee_{j=1}^{n}\delta_{ij} f_j(x_j^*), \quad i = 1,\dots,n. \tag{5}$$

Definition 1. The equilibrium point x* of (1) is said to be globally exponentially stable if there exist constants λ > 0 and M > 0 such that

$$\|x(t) - x^*\| \le M \|x(0) - x^*\|\, e^{-\lambda t} \tag{6}$$

for all t ≥ 0.

Definition 2. Model (1) is said to be robustly exponentially stable if its unique equilibrium point x* ∈ R^n is globally exponentially stable for any D ∈ D_I, Ξ ∈ Ξ_I, Γ ∈ Γ_I, ∆ ∈ ∆_I.


Definition 3. The domain of robust attraction of the equilibrium x* is the maximal region Ω such that every solution x(t) of model (1), with coefficients in the intervals (3), with x(0) ∈ Ω approaches x* robustly.

Definition 4. For any continuous function f: R → R, its Dini time-derivative is defined as

$$\frac{df(t)}{dt} = \limsup_{\Delta t \to 0} \frac{f(t + \Delta t) - f(t)}{\Delta t}.$$

In order to prove the main result regarding the attraction domain of model (1), we need the following lemma.

Lemma 1 ([15]). For any a_ij ∈ R and x_j, y_j ∈ R, i, j = 1, ..., n, we have the following estimates:

$$\left|\bigwedge_{j=1}^{n} a_{ij} x_j - \bigwedge_{j=1}^{n} a_{ij} y_j\right| \le \sum_{1\le j\le n} |a_{ij}|\cdot|x_j - y_j| \tag{7}$$

and

$$\left|\bigvee_{j=1}^{n} a_{ij} x_j - \bigvee_{j=1}^{n} a_{ij} y_j\right| \le \sum_{1\le j\le n} |a_{ij}|\cdot|x_j - y_j|. \tag{8}$$
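A quick numerical spot-check of Lemma 1 (an illustrative sketch with random data, not part of the paper's argument): the gap between the fuzzy MIN (or MAX) of a_ij x_j and of a_ij y_j never exceeds the sum of |a_ij||x_j − y_j|.

```python
import numpy as np

rng = np.random.default_rng(1)
for _ in range(1000):
    a = rng.standard_normal(5)           # one row a_ij, j = 1..5
    x, y = rng.standard_normal((2, 5))
    bound = np.sum(np.abs(a) * np.abs(x - y))
    assert abs(np.min(a * x) - np.min(a * y)) <= bound + 1e-12   # inequality (7)
    assert abs(np.max(a * x) - np.max(a * y)) <= bound + 1e-12   # inequality (8)
print("Lemma 1 held on all random samples")
```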

3. Robust attraction domain for interval fuzzy neural networks

In this section, we use the Lyapunov method to obtain the attraction domain for interval fuzzy neural networks. The main result is presented as the following theorem.

Theorem 1. Suppose that x* = (x_1*, ..., x_n*) is an equilibrium point of model (1) with coefficients satisfying (3), and that there is a ρ > 0 such that

$$\eta_i = 2 d_i - \frac{1}{\rho}\sum_{j=1}^{n}\zeta_{ij}^{*} - \rho \sum_{j=1}^{n}\zeta_{ji}^{*}\, f_i'(x_i^*)^2 > 0,$$

where ζ*_ij = |ξ*_ij| + |γ*_ij| + |δ*_ij|. Then, we have the following:

(a) x* is locally robustly exponentially stable.

(b) Let

$$g_i = \sqrt{\Bigl(\rho M_f f_i'(x_i^*) \sum_{j=1}^{n}\zeta_{ji}^{*}\Bigr)^{2} + \eta_i\,\rho M_f^{2} \sum_{j=1}^{n}\zeta_{ji}^{*}}$$

and

$$r = \min_{1\le i\le n} \frac{-2\rho M_f f_i'(x_i^*) \sum_{j=1}^{n}\zeta_{ji}^{*} + 2 g_i}{\rho M_f^{2} \sum_{j=1}^{n}\zeta_{ji}^{*}};$$

then the open ball B(x*, r_c) = {x ∈ R^n : ||x − x*||_∞ < (1/√n) r} is contained in the domain of robust attraction of x*.
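The condition η_i > 0 and the radius r in Theorem 1 are computable directly from the interval data; the sketch below, under illustrative assumptions (tanh activations, for which M_f = sup|tanh''| = 4/(3√3), and hypothetical interval bounds and equilibrium), checks the criterion and evaluates the estimated attraction radius.

```python
import numpy as np

def attraction_radius(d, zeta_star, f_prime_at_eq, M_f, rho=1.0):
    """Check Theorem 1 and return (eta, r); r is None if eta_i <= 0 for some i.

    d             : vector of d_i
    zeta_star     : matrix zeta*_ij = xi*_ij + gamma*_ij + delta*_ij
    f_prime_at_eq : vector of f_i'(x_i*)
    M_f           : bound on |f_i''| from assumption (H)
    rho           : the free parameter rho > 0 in Theorem 1
    """
    row = zeta_star.sum(axis=1)            # sum_j zeta*_ij
    col = zeta_star.sum(axis=0)            # sum_j zeta*_ji (assumed nonzero here)
    eta = 2 * d - row / rho - rho * col * f_prime_at_eq ** 2
    if np.any(eta <= 0):
        return eta, None
    g = np.sqrt((rho * M_f * f_prime_at_eq * col) ** 2 + eta * rho * M_f ** 2 * col)
    r = np.min((-2 * rho * M_f * f_prime_at_eq * col + 2 * g) / (rho * M_f ** 2 * col))
    return eta, r

# illustrative two-neuron data (hypothetical, not from the paper)
M_f = 4 / (3 * np.sqrt(3))                 # sup |tanh''(x)|
d = np.array([1.5, 1.8])
zeta_star = np.array([[0.3, 0.2], [0.1, 0.4]])
f_prime_at_eq = np.array([0.9, 0.8])       # f_i'(x_i*) at an assumed equilibrium

eta, r = attraction_radius(d, zeta_star, f_prime_at_eq, M_f, rho=1.0)
print("eta =", eta)
if r is not None:
    print("ball radius in the infinity norm:", r / np.sqrt(len(d)))
```

If η_i ≤ 0 for the chosen ρ, the criterion may still hold for another ρ > 0, since Theorem 1 only requires the existence of some ρ.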

Proof. First, we prove part (a). Since η_i > 0, pick an ε > 0 satisfying min_{1≤i≤n} η_i − ε > 0. Consider the Lyapunov function

$$V(x(t)) = e^{\varepsilon t}\sum_{i=1}^{n}\bigl(x_i(t) - x_i^*\bigr)^2 = e^{\varepsilon t}\sum_{i=1}^{n} y_i(t)^2, \tag{9}$$

where y_i(t) = x_i(t) − x_i*. Let

$$g_i(\varepsilon) = \sqrt{\Bigl(\rho M_f f_i'(x_i^*) \sum_{j=1}^{n}\zeta_{ji}^{*}\Bigr)^{2} + (\eta_i - \varepsilon)\,\rho M_f^{2} \sum_{j=1}^{n}\zeta_{ji}^{*}}$$

and

$$r(\varepsilon) = \min_{1\le i\le n} \frac{-2\rho M_f f_i'(x_i^*) \sum_{j=1}^{n}\zeta_{ji}^{*} + 2 g_i(\varepsilon)}{\rho M_f^{2} \sum_{j=1}^{n}\zeta_{ji}^{*}}. \tag{10}$$

First we claim the following:

Claim 1: if |x_i(t) − x_i*| < r(ε), then dV(x(t))/dt < 0.


For i = 1, ..., n, by (5) and Lemma 1, we have the following:

$$\begin{aligned}
\frac{dV(x(t))}{dt} &= \varepsilon e^{\varepsilon t}\sum_{i=1}^{n} y_i(t)^2 + 2 e^{\varepsilon t}\sum_{i=1}^{n} y_i(t)\frac{dy_i}{dt}\\
&= \varepsilon e^{\varepsilon t}\sum_{i=1}^{n} y_i(t)^2 + 2 e^{\varepsilon t}\sum_{i=1}^{n} y_i(t)\Biggl[-d_i y_i(t) + \sum_{j=1}^{n}\xi_{ij}\bigl(f_j(x_j(t)) - f_j(x_j^*)\bigr)\\
&\qquad + \bigwedge_{j=1}^{n}\gamma_{ij} f_j(x_j(t)) - \bigwedge_{j=1}^{n}\gamma_{ij} f_j(x_j^*) + \bigvee_{j=1}^{n}\delta_{ij} f_j(x_j(t)) - \bigvee_{j=1}^{n}\delta_{ij} f_j(x_j^*)\Biggr]\\
&= e^{\varepsilon t}\sum_{i=1}^{n}(\varepsilon - 2 d_i)\, y_i(t)^2 + 2 e^{\varepsilon t}\sum_{i=1}^{n} y_i(t)\Biggl[\sum_{j=1}^{n}\xi_{ij}\bigl(f_j(x_j(t)) - f_j(x_j^*)\bigr)\\
&\qquad + \bigwedge_{j=1}^{n}\gamma_{ij} f_j(x_j(t)) - \bigwedge_{j=1}^{n}\gamma_{ij} f_j(x_j^*) + \bigvee_{j=1}^{n}\delta_{ij} f_j(x_j(t)) - \bigvee_{j=1}^{n}\delta_{ij} f_j(x_j^*)\Biggr]\\
&\le e^{\varepsilon t}\sum_{i=1}^{n}(\varepsilon - 2 d_i)\, y_i(t)^2 + 2 e^{\varepsilon t}\sum_{i=1}^{n}|y_i(t)|\sum_{j=1}^{n}\bigl[|\xi_{ij}| + |\gamma_{ij}| + |\delta_{ij}|\bigr]\,\bigl|f_j(x_j(t)) - f_j(x_j^*)\bigr|\\
&\le e^{\varepsilon t}\sum_{i=1}^{n}(\varepsilon - 2 d_i)\, y_i(t)^2 + 2 e^{\varepsilon t}\sum_{i=1}^{n}|y_i(t)|\sum_{j=1}^{n}\bigl[\xi_{ij}^{*} + \gamma_{ij}^{*} + \delta_{ij}^{*}\bigr]\,\bigl|f_j(x_j(t)) - f_j(x_j^*)\bigr|.
\end{aligned} \tag{11}$$

Now, using the inequality 2ab ≤ (1/ρ)a² + ρb² and assumption (H), the above inequality is simplified as follows:

$$\begin{aligned}
\frac{dV(x(t))}{dt} &\le e^{\varepsilon t}\sum_{i=1}^{n}(\varepsilon - 2 d_i)\, y_i(t)^2 + 2 e^{\varepsilon t}\sum_{i=1}^{n}|y_i(t)|\sum_{j=1}^{n}\zeta_{ij}^{*}\,\bigl|f_j(x_j(t)) - f_j(x_j^*)\bigr|\\
&\le e^{\varepsilon t}\sum_{i=1}^{n}(\varepsilon - 2 d_i)\, y_i(t)^2 + e^{\varepsilon t}\sum_{i=1}^{n}\sum_{j=1}^{n}\Bigl[\frac{1}{\rho}\zeta_{ij}^{*}\, y_i(t)^2 + \rho\,\zeta_{ij}^{*}\bigl|f_j(x_j(t)) - f_j(x_j^*)\bigr|^2\Bigr]\\
&= e^{\varepsilon t}\sum_{i=1}^{n}\Bigl(\varepsilon - 2 d_i + \frac{1}{\rho}\sum_{j=1}^{n}\zeta_{ij}^{*}\Bigr) y_i(t)^2 + e^{\varepsilon t}\sum_{i=1}^{n}\sum_{j=1}^{n}\rho\,\zeta_{ij}^{*}\bigl|f_j(x_j(t)) - f_j(x_j^*)\bigr|^2\\
&= e^{\varepsilon t}\sum_{i=1}^{n}\Bigl(\varepsilon - 2 d_i + \frac{1}{\rho}\sum_{j=1}^{n}\zeta_{ij}^{*}\Bigr) y_i(t)^2 + e^{\varepsilon t}\sum_{i=1}^{n}\sum_{j=1}^{n}\rho\,\zeta_{ji}^{*}\bigl|f_i(x_i(t)) - f_i(x_i^*)\bigr|^2\\
&= e^{\varepsilon t}\sum_{i=1}^{n}\Bigl(\varepsilon - 2 d_i + \frac{1}{\rho}\sum_{j=1}^{n}\zeta_{ij}^{*}\Bigr) y_i(t)^2 + e^{\varepsilon t}\sum_{i=1}^{n}\sum_{j=1}^{n}\rho\,\zeta_{ji}^{*}\Bigl(f_i'(x_i^*)\, y_i(t) + \frac{1}{2} f_i''(\xi_i)\, y_i(t)^2\Bigr)^{2},\\
&\qquad\qquad \xi_i \in (x_i^*, x_i(t)) \text{ or } \xi_i \in (x_i(t), x_i^*),\\
&\le e^{\varepsilon t}\sum_{i=1}^{n}\Biggl(\varepsilon - 2 d_i + \frac{1}{\rho}\sum_{j=1}^{n}\zeta_{ij}^{*} + \rho\sum_{j=1}^{n}\zeta_{ji}^{*}\Bigl(f_i'(x_i^*)^2 + M_f f_i'(x_i^*)\, y_i(t) + \frac{1}{4} M_f^{2}\, y_i(t)^2\Bigr)\Biggr) y_i(t)^2.
\end{aligned} \tag{12}$$

Let z = |y_i(t)| and

$$h(z) = \varepsilon - 2 d_i + \frac{1}{\rho}\sum_{j=1}^{n}\zeta_{ij}^{*} + \rho\sum_{j=1}^{n}\zeta_{ji}^{*}\Bigl(f_i'(x_i^*)^2 + M_f f_i'(x_i^*)\, z + \frac{1}{4} M_f^{2} z^2\Bigr).$$

Then h(z) is a quadratic function of z. When 0 < z < r(ε), where r(ε) is defined in (10), we have h(z) < 0, which implies that dV(x(t))/dt < 0. Therefore, we have completed the proof of the claim. Now we claim the following:

Claim 2: if ||x(0) − x*||_∞ < (1/√n) r(ε), then dV(x(t))/dt < 0 for all t > 0.


Suppose the claim is not correct; then there exists a t_0 > 0 such that dV(x(t_0))/dt ≥ 0. Since dV(x(t))/dt is continuous, let t_min = min{t ≥ 0 : dV(x(t))/dt = 0}. Thus, dV(x(t))/dt ≤ 0 when t ∈ [0, t_min]. Clearly, V(x(t)) is non-increasing on [0, t_min], so V(x(t_min)) ≤ V(x(0)), and we have the following estimate:

$$\|x(t_{\min}) - x^*\|_\infty \le \sqrt{e^{\varepsilon t_{\min}}\sum_{i=1}^{n}\bigl(x_i(t_{\min}) - x_i^*\bigr)^2} = \sqrt{V(x(t_{\min}))} \le \sqrt{V(x(0))} \le \sqrt{n}\,\|x(0) - x^*\|_\infty < r(\varepsilon). \tag{13}$$

Since ||x(t_min) − x*||_∞ < r(ε), by Claim 1 we have dV(x(t_min))/dt < 0, which contradicts dV(x(t_min))/dt = 0. Thus, we have proved Claim 2.

Now we are ready to prove part (a). Let x(t) be any solution other than x* of model (1) whose coefficients satisfy (3) and whose initial value satisfies ||x(0) − x*||_∞ ≤ (1/√n) r(ε). By the result of Claim 2, we have V(x(t)) < V(x(0)) for all t > 0 and

$$\begin{aligned}
\|x(t) - x^*\|_\infty &\le \sqrt{e^{-\varepsilon t}\, e^{\varepsilon t}\sum_{i=1}^{n}\bigl(x_i(t) - x_i^*\bigr)^2} = \sqrt{e^{-\varepsilon t}\, V(x(t))}\\
&\le \sqrt{e^{-\varepsilon t}\, V(x(0))} = \sqrt{e^{-\varepsilon t}\sum_{i=1}^{n}\bigl(x_i(0) - x_i^*\bigr)^2}\\
&\le n^{\frac{1}{2}}\, e^{-\frac{1}{2}\varepsilon t}\,\|x(0) - x^*\|_\infty.
\end{aligned} \tag{14}$$

Thus, x* is locally robustly exponentially stable. The proof of part (a) is completed.

Now we prove part (b). Any solution with initial value x(0) ∈ B(x*, r_c) satisfies ||x(0) − x*||_∞ < (1/√n) r, so there exists an ε > 0 satisfying min_{1≤i≤n} η_i − ε > 0 and ||x(0) − x*||_∞ < (1/√n) r(ε), where r(ε) is defined in (10). By the result of part (a), we have ||x(t) − x*||_∞ ≤ n^{1/2} e^{−εt/2} ||x(0) − x*||_∞. According to Definition 3, x(0) is in the domain of robust attraction of x*. Thus, we have completed the proof of Theorem 1. □

4. Conclusion

In this paper, we have explored the local stability of interval fuzzy neural networks. A criterion on the attraction domain for interval fuzzy neural networks has been obtained using the Lyapunov method. It is believed that robust stability is very important in designing neural networks. Thus, the results presented in this paper are useful in the application and design of neural networks, since the conditions are easy to check in practice.

References

[1] J. Cao, Q. Tao, An estimation of the domain of attraction and convergence rate for Hopfield associative memory and an application, Journal of Computer and System Sciences 60 (2000) 179–186.
[2] J. Cao, T. Chen, Globally exponentially robust stability and periodicity of delayed neural networks, Chaos, Solitons and Fractals 22 (2004) 957–963.
[3] J. Cao, J. Wang, Global exponential stability and periodicity of recurrent neural networks with time delays, IEEE Transactions on Circuits and Systems I 52 (2005) 920–931.
[4] A. Chen, L. Huang, Z. Liu, J. Cao, Periodic bidirectional associative memory neural networks with distributed delays, Journal of Mathematical Analysis and Applications 317 (2006) 80–102.
[5] A. Chen, L. Huang, J. Cao, Existence and stability of almost periodic solution for BAM neural networks with delays, Applied Mathematics and Computation 137 (2003) 177–193.
[6] T. Chen, S. Amari, Stability of asymmetric Hopfield networks, IEEE Transactions on Neural Networks 12 (2001) 159–163.
[7] T. Huang, J. Cao, C. Li, Necessary and sufficient condition for the absolute exponential stability of a class of neural networks with finite delay, Physics Letters A 352 (2006) 94–98.
[8] T. Huang, Exponential stability of fuzzy cellular neural networks with unbounded distributed delay, Physics Letters A 351 (2006) 48–52.
[9] C. Li, X. Liao, T. Huang, Global stability analysis for delayed neural networks via an interval matrix approach, IET Control Theory and Applications 1 (2007) 743–748.
[10] X. Liao, J. Wang, J. Cao, Global and robust stability of interval Hopfield neural networks with time-varying delays, International Journal of Neural Systems 13 (2003) 171–182.
[11] X.F. Liao, J.B. Yu, Robust stability for interval Hopfield neural networks with time delay, IEEE Transactions on Neural Networks 9 (1998) 1042–1045.
[12] C. Li, X. Liao, R. Zhang, A global exponential robust stability criterion for interval delayed neural networks with variable delays, Neurocomputing 69 (2006) 803–809.
[13] Z. Zeng, J. Wang, Complete stability of cellular neural networks with time-varying delays, IEEE Transactions on Circuits and Systems I 53 (2006) 944–955.
[14] L. Wang, Y. Lin, Global robust stability for shunting inhibitory CNNs with delays, International Journal of Neural Systems 14 (2004) 229–235.
[15] T. Yang, L.B. Yang, C.W. Wu, L.O. Chua, Fuzzy cellular neural networks: Theory, in: Proc. of IEEE International Workshop on Cellular Neural Networks and Applications, 1996, pp. 181–186.


[16] T. Yang, L.B. Yang, C.W. Wu, L.O. Chua, Fuzzy cellular neural networks: Applications, in: Proc. of IEEE International Workshop on Cellular Neural Networks and Applications, 1996, pp. 225–230.
[17] T. Yang, L.B. Yang, The global stability of fuzzy cellular neural network, IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications 43 (1996) 880–883.
[18] X. Yang, X.F. Liao, S. Bai, D. Evans, Robust exponential stability and domains of attraction in a class of interval neural networks, Chaos, Solitons and Fractals 26 (2005) 445–451.
[19] X. Yang, X.F. Liao, C. Li, D. Evans, New estimate on the domains of attraction of equilibrium points in continuous Hopfield neural networks, Physics Letters A 351 (2006) 161–166.