
An analysis of global exponential stability of bidirectional associative memory neural networks with constant time delays

Weirui Zhao (a,b,*), HuanShui Zhang (a), Shulan Kong (c)

(a) Shenzhen Graduate School, Harbin Institute of Technology, HIT Campus of ShenZhen University Town, Xili, Shenzhen 518055, PR China
(b) Department of Mathematics, Wuhan University of Technology, 122 Luoshi Road, Wuhan, Hubei 430070, PR China
(c) School of Mathematical Sciences, Qufu Normal University, Qufu, Shandong 273165, PR China

Neurocomputing 70 (2007) 1382–1389. Received 7 December 2005; received in revised form 26 March 2006; accepted 8 June 2006. Communicated by T. Heskes. Available online 14 October 2006.

*Corresponding author. E-mail address: [email protected] (W. Zhao).

Abstract

This paper presents a new sufficient condition for the existence, uniqueness, and global exponential stability of the equilibrium point of bidirectional associative memory (BAM) neural networks with constant delays. The results are compared with previous results in the literature, showing that the criteria obtained here provide an additional set of tests for determining the global exponential stability of BAM neural networks with constant delays. © 2006 Elsevier B.V. All rights reserved.

Keywords: Bidirectional associative memory neural networks; Time delays; Global exponential stability; Lyapunov functional

1. Introduction

Neural networks have been intensively studied in the past decade and applied in various fields, such as the design of associative memories and the solution of optimization problems. These applications rely crucially on the dynamical behavior of the designed neural network; therefore, studying the stability of equilibria is of prime importance. In recent years, researchers [1–8,10–13,15–27] have paid particular attention to bidirectional associative memory (BAM) neural networks with time delays, because these networks have been shown to be useful models in pattern recognition, optimization, and automatic control engineering. Some useful results on the uniqueness and global asymptotic stability of the equilibrium of BAM neural networks with delays can be found in [3–8,15,18,22,24]. As is well known, fast convergence of a system is essential for real-time computation, and the rate of exponential convergence determines the speed of neural computations. On the other hand, the exponential stability property guarantees that, whatever transformation occurs, the network's ability to rapidly store activity patterns remains invariant through self-organization. Thus, it is not only theoretically interesting but also practically important to establish exponential stability and to estimate the exponential convergence rate for BAM neural networks. Global exponential stability for neural networks with delays has been investigated, and some sufficient conditions have been derived in [6–8,15].

In this paper, we focus on the global exponential stability of BAM neural networks with time delays. By constructing a suitable Lyapunov functional, we obtain new criteria for global exponential stability that generalize previous results. Consider the bidirectional associative memory neural network model with delays described by the state equations [4]:

$$\dot u_i(t) = -a_i u_i(t) + \sum_{j=1}^{m} w_{ij}\, s_j^{(1)}\big(z_j(t-\tau_1)\big) + J_i^{(1)}, \qquad i = 1, 2, \ldots, n,$$
$$\dot z_j(t) = -b_j z_j(t) + \sum_{i=1}^{n} v_{ji}\, s_i^{(2)}\big(u_i(t-\tau_2)\big) + J_j^{(2)}, \qquad j = 1, 2, \ldots, m, \tag{1.1}$$

where $a_i$ and $b_j$ denote the neuron charging time constants and passive decay rates, respectively; $w_{ij}$ and $v_{ji}$ are synaptic connection strengths; $s_j^{(1)}$ and $s_i^{(2)}$ represent the activation functions of the neurons and the propagational signal functions, respectively; and $J_i^{(1)}$ and $J_j^{(2)}$ are the exogenous inputs.

In order to establish the global exponential stability conditions for neural networks (1.1), and to make a precise comparison between our stability conditions and previous results derived in the literature, we adopt some traditional assumptions on the activation functions.

Assumption A1. There exist positive constants $\alpha_i$, $i = 1, 2, \ldots, m$, and $\beta_j$, $j = 1, 2, \ldots, n$, such that
$$0 \le \frac{s_i^{(1)}(x) - s_i^{(1)}(y)}{x - y} \le \alpha_i, \qquad i = 1, 2, \ldots, m,$$
and
$$0 \le \frac{s_j^{(2)}(x) - s_j^{(2)}(y)}{x - y} \le \beta_j, \qquad j = 1, 2, \ldots, n,$$
for all $x, y \in \mathbb{R}$, $x \ne y$. We write $\alpha = \mathrm{diag}(\alpha_1, \ldots, \alpha_m)$ and $\beta = \mathrm{diag}(\beta_1, \ldots, \beta_n)$.

Assumption A2. The activation functions are bounded; that is, $|s_i^{(1)}(x)| \le M_i^{(1)}$, $i = 1, 2, \ldots, m$, and $|s_j^{(2)}(x)| \le M_j^{(2)}$, $j = 1, 2, \ldots, n$, where $M_i^{(1)}$ and $M_j^{(2)}$ denote some positive constants.

Let $\tau = \max\{\tau_1, \tau_2\}$. Initial conditions for (1.1) are of the form $\phi = (\phi_1, \ldots, \phi_n, \phi_{n+1}, \ldots, \phi_{n+m}) \in C = C([-\tau, 0]; \mathbb{R}^{n+m})$, where $C$ is the space of continuous functions equipped with the norm $\|\phi\|_2 = \sup_{-\tau \le t \le 0} \big(\phi^T(t)\phi(t)\big)^{1/2}$. For any initial condition $\phi \in C$, systems (1.1) admit a unique solution, denoted $x(t; \phi) = (u(t; \phi), z(t; \phi)) = (u_1(t; \phi), \ldots, u_n(t; \phi), z_1(t; \phi), \ldots, z_m(t; \phi))$. To simplify the notation, the dependence on the initial condition $\phi$ will not be indicated unless necessary.

For convenience, for a real $n \times n$ matrix $B = (b_{ij})$, $B^T$ and $B^{-1}$ denote the transpose and the inverse of $B$, respectively. The notation $B > 0$ ($B < 0$) means that $B$ is symmetric and positive definite (negative definite). $\|B\|_2$ denotes the norm of $B$ induced by the Euclidean vector norm, i.e., $\|B\|_2 = (\lambda_{\max}(B^T B))^{1/2}$, where $\lambda_{\max}(M)$ is the maximum eigenvalue of the matrix $M$. $I_n$ denotes the $n \times n$ identity matrix. Sometimes we write $x(t)$ as $x$, $f(x(t))$ as $f(x)$, and the transpose of $A^{-1}$ as $A^{-T}$.

We will need the following definition.

Definition 1.1 (Khalil [14]). The equilibrium point $x^* = (u^*, z^*)$ is said to be globally exponentially stable if there exist positive constants $k$ and $\gamma$ such that, for any solution $x(t) = (u(t), z(t))$ of systems (1.1) with initial function $\phi \in C([-\tau, 0]; \mathbb{R}^{n+m})$,
$$\|x(t) - x^*\|_2 \le \gamma\, \|\phi - x^*\|_2\, e^{-kt} \quad \text{for all } t \ge 0.$$
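As a purely illustrative aside (not part of the original article), the following Python sketch integrates model (1.1) by forward Euler with delay buffers, so that the exponential decay described in Definition 1.1 can be observed numerically. All parameter values, and the choice of tanh activations, are invented for the demonstration.

```python
# Hypothetical illustration: forward-Euler simulation of model (1.1) with
# constant delays. Parameters are made up; tanh satisfies Assumption A1.
import numpy as np

n, m = 2, 2
A = np.array([1.0, 1.0])                    # a_i: charging time constants
B = np.array([1.0, 1.0])                    # b_j: passive decay rates
W = np.array([[0.2, 0.2], [-0.2, 0.2]])     # w_ij (n x m)
V = np.array([[0.2, 0.2], [-0.2, 0.2]])     # v_ji (m x n)
J1 = np.zeros(n)                            # exogenous inputs J_i^(1)
J2 = np.zeros(m)                            # J_j^(2)
s1 = np.tanh                                # s^(1), Lipschitz constant 1
s2 = np.tanh                                # s^(2)

h, tau1, tau2, T = 0.01, 0.5, 0.8, 20.0
d1, d2 = int(tau1 / h), int(tau2 / h)
steps = int(T / h)
d = max(d1, d2)

u = np.zeros((steps + 1, n)); z = np.zeros((steps + 1, m))
u_hist = np.ones((d, n)); z_hist = -np.ones((d, m))   # constant initial function phi
u[0], z[0] = u_hist[-1], z_hist[-1]

def delayed(buf, cur, k, lag):
    """State 'lag' steps in the past, falling back to the initial function."""
    idx = k - lag
    return cur[idx] if idx >= 0 else buf[idx]

for k in range(steps):
    z_del = delayed(z_hist, z, k, d1)       # z(t - tau_1)
    u_del = delayed(u_hist, u, k, d2)       # u(t - tau_2)
    u[k + 1] = u[k] + h * (-A * u[k] + W @ s1(z_del) + J1)
    z[k + 1] = z[k] + h * (-B * z[k] + V @ s2(u_del) + J2)

# With zero inputs the origin is the equilibrium here; the norm should decay
# roughly like gamma * exp(-k t) when the stability conditions below hold.
norms = np.linalg.norm(np.hstack([u, z]), axis=1)
print(norms[::200])
```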

The organization of this paper is as follows. In Section 2, we establish a new criterion for the global exponential stability of the equilibrium point of neural networks with delays. Comparisons with previous results are provided in Section 3. Section 4 presents our conclusions.

2. Main results

Based on some facts about positive-definite matrices and integral inequalities, we present the main results in this section. First, we have the following lemma due to [27].

Lemma 2.1. Given any real matrices $X$, $Y$, $C$ of appropriate dimensions and a scalar $\varepsilon_0 > 0$, where $C > 0$, the following inequality holds:
$$X^T Y + Y^T X \le \varepsilon_0\, X^T C X + \frac{1}{\varepsilon_0}\, Y^T C^{-1} Y.$$
In particular, if $X$ and $Y$ are vectors, then $X^T Y \le (X^T X + Y^T Y)/2$.
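Lemma 2.1 is easy to spot-check numerically. The snippet below is an illustration we add (it is not from the paper): it draws random $X$, $Y$, $C$, and $\varepsilon_0$ and verifies that the difference between the two sides is positive semidefinite.

```python
# Randomized sanity check of Lemma 2.1:
# X^T Y + Y^T X <= eps0 * X^T C X + (1/eps0) * Y^T C^{-1} Y  (as matrices).
import numpy as np

rng = np.random.default_rng(0)
k, n = 4, 3          # X, Y are k x n; C is k x k, symmetric positive definite
for trial in range(1000):
    X = rng.standard_normal((k, n))
    Y = rng.standard_normal((k, n))
    M = rng.standard_normal((k, k))
    C = M @ M.T + k * np.eye(k)           # force C > 0
    eps0 = float(rng.uniform(0.1, 10.0))
    lhs = X.T @ Y + Y.T @ X
    rhs = eps0 * X.T @ C @ X + (1.0 / eps0) * Y.T @ np.linalg.inv(C) @ Y
    gap = rhs - lhs                        # should be positive semidefinite
    assert np.linalg.eigvalsh((gap + gap.T) / 2).min() > -1e-9
print("Lemma 2.1 inequality held on all random trials")
```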

Before we establish a criterion for the global exponential stability of systems (1.1), we give a lemma concerning the existence and uniqueness of the equilibrium point of (1.1).

Lemma 2.2. Suppose that in systems (1.1) Assumption A1 is satisfied. The neural networks defined by (1.1) have a unique equilibrium point if there exist symmetric positive diagonal matrices $P_1 = \mathrm{diag}(p_1^{(1)}, \ldots, p_n^{(1)})$ and $P_2 = \mathrm{diag}(p_1^{(2)}, \ldots, p_m^{(2)})$, symmetric positive-definite matrices $Q_1, Q_2$, and factorizations $W = W_1 W_2$, $V = V_1 V_2$ such that
$$\Omega_1 = 2P_1 A \beta^{-1} - P_1 W_1 Q_1^{-1} W_1^T P_1 - V_2^T Q_2 V_2 > 0,$$
$$\Omega_2 = 2P_2 B \alpha^{-1} - P_2 V_1 Q_2^{-1} V_1^T P_2 - W_2^T Q_1 W_2 > 0,$$
where $Q_1, Q_2, W_1, W_2, V_1, V_2$ are constant matrices with appropriate dimensions.
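As an illustration of Lemma 2.2 (ours, not the authors'), one can solve $H(x) = 0$ (defined in the proof below) from several random initial guesses; when the lemma's conditions hold, every run should land on the same equilibrium. The parameters are invented, and scipy's general-purpose root finder merely stands in for the analysis.

```python
# Hedged sketch: locate the equilibrium of (1.1) from random starts.
import numpy as np
from scipy.optimize import fsolve

n = m = 2
A = np.diag([1.0, 1.0]); B = np.diag([1.0, 1.0])
W = np.array([[0.3, 0.3], [-0.3, 0.3]]); V = np.array([[0.3, 0.3], [-0.3, 0.3]])
I1 = np.array([0.5, -0.2]); J2 = np.array([0.1, 0.4])   # the inputs I, J of (2.1)

def H(x):
    u, z = x[:n], x[n:]
    return np.concatenate([-A @ u + W @ np.tanh(z) + I1,
                           -B @ z + V @ np.tanh(u) + J2])

rng = np.random.default_rng(1)
roots = [fsolve(H, 10 * rng.standard_normal(n + m)) for _ in range(5)]
print(np.ptp(np.array(roots), axis=0))   # ~0 => all runs found one equilibrium
```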

Proof. Let $x^* = (u^*, z^*) = (u_1^*, \ldots, u_n^*, z_1^*, \ldots, z_m^*)^T$ be an equilibrium point of neural network model (1.1). Then $x^*$ satisfies
$$-Au^* + W S^{(1)}(z^*) + I = 0, \qquad -Bz^* + V S^{(2)}(u^*) + J = 0, \tag{2.1}$$
where $I = (J_1^{(1)}, \ldots, J_n^{(1)})^T$ and $J = (J_1^{(2)}, \ldots, J_m^{(2)})^T$. Let
$$H(x) = (H_1(x), H_2(x))^T = \big({-Au} + W S^{(1)}(x) + I,\; {-Bz} + V S^{(2)}(x) + J\big)^T, \tag{2.2}$$
where $S^{(1)}(x) = (s_1^{(1)}(z_1), \ldots, s_m^{(1)}(z_m))^T$, $S^{(2)}(x) = (s_1^{(2)}(u_1), \ldots, s_n^{(2)}(u_n))^T$, $x = (u, z)$, $u = (u_1, \ldots, u_n) \in \mathbb{R}^n$, and $z = (z_1, \ldots, z_m) \in \mathbb{R}^m$.

Obviously, the solutions of $H(x) = 0$ are the equilibrium points of (1.1). Therefore, (1.1) has a unique equilibrium point if $H(x)$ is a homeomorphism of $\mathbb{R}^{n+m}$. From [9], we know that $H(x)$ is a homeomorphism of $\mathbb{R}^{n+m}$ if $H(x) \ne H(y)$ for all $x \ne y$, and $\|H(x)\| \to \infty$ as $\|x\| \to \infty$.

Let $S(x) = (S^{(1)}(x), S^{(2)}(x))^T$, and let $x$ and $y$ be two vectors with $x \ne y$. Under the assumptions on the activation functions, $x \ne y$ leads to two cases: (i) $x \ne y$ and $S(x) - S(y) \ne 0$; (ii) $x \ne y$ and $S(x) - S(y) = 0$. From (2.2) we can write
$$H_1(x) - H_1(y) = -A(u_x - u_y) + W\big(S^{(1)}(x) - S^{(1)}(y)\big),$$
$$H_2(x) - H_2(y) = -B(z_x - z_y) + V\big(S^{(2)}(x) - S^{(2)}(y)\big), \tag{2.3}$$
where $x = (u_x, z_x)$, $y = (u_y, z_y)$, $u_x = (u_{x1}, \ldots, u_{xn})$, $u_y = (u_{y1}, \ldots, u_{yn}) \in \mathbb{R}^n$, $z_x = (z_{x1}, \ldots, z_{xm})$, $z_y = (z_{y1}, \ldots, z_{ym}) \in \mathbb{R}^m$, $A = \mathrm{diag}(a_1, \ldots, a_n)$, and $B = \mathrm{diag}(b_1, \ldots, b_m)$. For brevity, write $\Delta_1 = S^{(1)}(x) - S^{(1)}(y)$ and $\Delta_2 = S^{(2)}(x) - S^{(2)}(y)$.

First, consider case (i), where $x \ne y$ and $S(x) - S(y) \ne 0$, so that there exists $k \in \{1, 2\}$ with $S^{(k)}(x) \ne S^{(k)}(y)$. Multiplying both sides of the first equation in (2.3) by $2\Delta_2^T P_1$ results in
$$2\Delta_2^T P_1 \big(H_1(x) - H_1(y)\big) = -2\Delta_2^T P_1 A (u_x - u_y) + 2\Delta_2^T P_1 W \Delta_1. \tag{2.4}$$
Since $(u_{xi} - u_{yi})\big(s_i^{(2)}(u_{xi}) - s_i^{(2)}(u_{yi})\big) \ge (1/\beta_i)\big(s_i^{(2)}(u_{xi}) - s_i^{(2)}(u_{yi})\big)^2$, we have
$$-\Delta_2^T P_1 A (u_x - u_y) \le -\Delta_2^T P_1 A \beta^{-1} \Delta_2.$$
Let the Cholesky factorization of $Q_1$ be $Q_1 = K_1^T K_1$. Rewriting $W = (W_1 K_1^{-1})(K_1 W_2)$, we have
$$2\Delta_2^T P_1 \big(H_1(x) - H_1(y)\big) \le -2\Delta_2^T P_1 A\beta^{-1} \Delta_2 + 2\big[\Delta_2^T P_1 W_1 K_1^{-1}\big]\big[K_1 W_2 \Delta_1\big].$$
By Lemma 2.1,
$$2\Delta_2^T P_1 \big(H_1(x) - H_1(y)\big) \le -2\Delta_2^T P_1 A\beta^{-1} \Delta_2 + \Delta_2^T P_1 W_1 Q_1^{-1} W_1^T P_1 \Delta_2 + \Delta_1^T W_2^T Q_1 W_2 \Delta_1. \tag{2.5}$$
Similarly,
$$2\Delta_1^T P_2 \big(H_2(x) - H_2(y)\big) \le -2\Delta_1^T P_2 B\alpha^{-1} \Delta_1 + \Delta_1^T P_2 V_1 Q_2^{-1} V_1^T P_2 \Delta_1 + \Delta_2^T V_2^T Q_2 V_2 \Delta_2. \tag{2.6}$$
Adding (2.5) and (2.6) yields
$$2\Delta_2^T P_1 \big(H_1(x) - H_1(y)\big) + 2\Delta_1^T P_2 \big(H_2(x) - H_2(y)\big) \le -\Delta_1^T \Omega_2 \Delta_1 - \Delta_2^T \Omega_1 \Delta_2 < 0, \tag{2.7}$$
the last inequality being strict because $\Delta_1 \ne 0$ or $\Delta_2 \ne 0$ in case (i). Since the left-hand side of (2.7) would vanish if $H(x) = H(y)$, and $P_1$, $P_2$ are positive diagonal matrices, we conclude that $H(x) \ne H(y)$ whenever $x \ne y$ and $S(x) \ne S(y)$.

Now consider case (ii), where $x \ne y$ and $S(x) - S(y) = 0$. In this case, (2.3) gives
$$H(x) - H(y) = -\begin{pmatrix} A & 0 \\ 0 & B \end{pmatrix}(x - y) \ne 0,$$
implying that $H(x) \ne H(y)$ for all $x \ne y$.

In the following, we show that the conditions of Lemma 2.2 also imply that $\|H(x)\| \to \infty$ as $\|x\| \to \infty$. To this end, setting $y = 0$ in (2.7) yields
$$2\Delta_2^T P_1 \big(H_1(x) - H_1(0)\big) + 2\Delta_1^T P_2 \big(H_2(x) - H_2(0)\big) \le -\lambda_{\min}\, \|S(x) - S(0)\|_2^2, \tag{2.8}$$
where now $\Delta_k = S^{(k)}(x) - S^{(k)}(0)$ and $\lambda_{\min}$ denotes the smallest eigenvalue of the positive-definite matrices $\Omega_1$ and $\Omega_2$. Writing $H_1 = (h_1^{(1)}, \ldots, h_n^{(1)})^T$ and $H_2 = (h_1^{(2)}, \ldots, h_m^{(2)})^T$, it follows from (2.8) that
$$\lambda_{\min} \|S(x) - S(0)\|_2^2 \le \Big|\sum_{i=1}^{m} 2p_i^{(2)} \big(s_i^{(1)}(z_{xi}) - s_i^{(1)}(0)\big)\big(h_i^{(2)}(x) - h_i^{(2)}(0)\big)\Big| + \Big|\sum_{i=1}^{n} 2p_i^{(1)} \big(s_i^{(2)}(u_{xi}) - s_i^{(2)}(0)\big)\big(h_i^{(1)}(x) - h_i^{(1)}(0)\big)\Big|$$
$$\le 2p \Big(\sum_{i=1}^{m} \big|s_i^{(1)}(z_{xi}) - s_i^{(1)}(0)\big|\big|h_i^{(2)}(x) - h_i^{(2)}(0)\big| + \sum_{i=1}^{n} \big|s_i^{(2)}(u_{xi}) - s_i^{(2)}(0)\big|\big|h_i^{(1)}(x) - h_i^{(1)}(0)\big|\Big)$$
$$\le 2p\, \|S(x) - S(0)\|_\infty\, \|H(x) - H(0)\|_1,$$
where $p = \max\{p_1^{(1)}, \ldots, p_n^{(1)}, p_1^{(2)}, \ldots, p_m^{(2)}\}$. Using the fact that $\|S(x) - S(0)\|_\infty \le \|S(x) - S(0)\|_2$, we obtain
$$\lambda_{\min} \|S(x) - S(0)\|_\infty \le 2p\, \|H(x) - H(0)\|_1.$$
Noting that $\|S(x) - S(0)\|_\infty \ge \|S(x)\|_\infty - \|S(0)\|_\infty$ and $\|H(x) - H(0)\|_1 \le \|H(x)\|_1 + \|H(0)\|_1$, we get
$$\|H(x)\|_1 \ge \frac{\lambda_{\min} \|S(x)\|_\infty - \lambda_{\min} \|S(0)\|_\infty - 2p\,\|H(0)\|_1}{2p},$$
from which it can be concluded that $\|H(x)\| \to \infty$ as $\|S(x)\| \to \infty$, or equivalently $\|H(x)\| \to \infty$ as $\|x\| \to \infty$. Hence $H(x)$ is a homeomorphism of $\mathbb{R}^{n+m}$, implying that the neural systems (1.1) have an equilibrium point and that this equilibrium point is unique. □

To present sufficient conditions for the global exponential stability of the equilibrium point of neural systems (1.1), we first shift the equilibrium point to the origin. By the transformation $x(t) = u(t) - u^*$, $y(t) = z(t) - z^*$, the neural network model (1.1) can be rewritten as
$$\dot x(t) = -Ax(t) + Wf\big(y(t - \tau_1)\big), \qquad \dot y(t) = -By(t) + Vg\big(x(t - \tau_2)\big), \tag{2.9}$$

where $x(t) = (x_1(t), \ldots, x_n(t))^T$, $y(t) = (y_1(t), \ldots, y_m(t))^T$, $A = \mathrm{diag}(a_1, \ldots, a_n)$, $B = \mathrm{diag}(b_1, \ldots, b_m)$, $W = (w_{ij})_{n \times m}$, $V = (v_{ji})_{m \times n}$,
$$f(y(t)) = \big(f_1(y_1(t)), \ldots, f_m(y_m(t))\big)^T, \qquad g(x(t)) = \big(g_1(x_1(t)), \ldots, g_n(x_n(t))\big)^T.$$
For the transformed systems (2.9), we have
$$f_i(y_i(t)) = s_i^{(1)}(y_i(t) + z_i^*) - s_i^{(1)}(z_i^*), \qquad i = 1, 2, \ldots, m,$$
$$g_i(x_i(t)) = s_i^{(2)}(x_i(t) + u_i^*) - s_i^{(2)}(u_i^*), \qquad i = 1, 2, \ldots, n.$$
Assumption A1 implies that
$$0 \le \frac{f_i(y_i)}{y_i} \le \alpha_i \ \text{ and } \ f_i(0) = 0, \qquad i = 1, 2, \ldots, m,$$
$$0 \le \frac{g_i(x_i)}{x_i} \le \beta_i \ \text{ and } \ g_i(0) = 0, \qquad i = 1, 2, \ldots, n.$$
Obviously, $(u^*, z^*)$ is the globally exponentially stable equilibrium point of (1.1) if and only if the origin of (2.9) is globally exponentially stable. Thus, in the following, we only consider the global exponential stability of the origin of (2.9). We are now ready to present our main global exponential stability result.

Theorem 2.3. Suppose that in systems (2.9) Assumption A1 is satisfied. The origin of neural systems (2.9) is globally exponentially stable if there exist symmetric positive diagonal matrices $P_1 = \mathrm{diag}(p_1^{(1)}, \ldots, p_n^{(1)})$ and $P_2 = \mathrm{diag}(p_1^{(2)}, \ldots, p_m^{(2)})$, symmetric positive-definite matrices $Q_1, Q_2$, and factorizations $W = W_1 W_2$, $V = V_1 V_2$ such that
$$\Omega_1 = 2P_1 A \beta^{-1} - P_1 W_1 Q_1^{-1} W_1^T P_1 - V_2^T Q_2 V_2 > 0,$$
$$\Omega_2 = 2P_2 B \alpha^{-1} - P_2 V_1 Q_2^{-1} V_1^T P_2 - W_2^T Q_1 W_2 > 0,$$
where $Q_1, Q_2, W_1, W_2, V_1, V_2$ are constant matrices with appropriate dimensions.
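The two matrix conditions of Theorem 2.3 are straightforward to test numerically for candidate matrices. The helper below is a sketch we add for illustration (it is not from the paper): it forms $\Omega_1$ and $\Omega_2$ and checks their smallest eigenvalues.

```python
# Hedged helper: test the conditions of Theorem 2.3 for supplied P_k, Q_k and
# factorizations W = W1 W2, V = V1 V2, via smallest eigenvalues.
import numpy as np

def theorem_2_3_holds(A, B, alpha, beta, P1, P2, Q1, Q2, W1, W2, V1, V2):
    Q1i, Q2i = np.linalg.inv(Q1), np.linalg.inv(Q2)
    O1 = (2 * P1 @ A @ np.linalg.inv(beta)
          - P1 @ W1 @ Q1i @ W1.T @ P1 - V2.T @ Q2 @ V2)
    O2 = (2 * P2 @ B @ np.linalg.inv(alpha)
          - P2 @ V1 @ Q2i @ V1.T @ P2 - W2.T @ Q1 @ W2)
    sym = lambda X: (X + X.T) / 2           # guard against round-off asymmetry
    return (np.linalg.eigvalsh(sym(O1)).min() > 0 and
            np.linalg.eigvalsh(sym(O2)).min() > 0)

# Example usage, anticipating the choice made in Corollary 2.4 below:
n = m = 2
A = B = np.eye(2); alpha = beta = np.eye(2)
W = V = np.array([[0.5, 0.5], [-0.5, 0.5]])
gamma = 1 / (np.sqrt(2) * 0.5)
print(theorem_2_3_holds(A, B, alpha, beta,
                        P1=np.eye(n), P2=np.eye(m),
                        Q1=gamma * W.T @ W, Q2=gamma * V.T @ V,
                        W1=W, W2=np.eye(m), V1=V, V2=np.eye(n)))  # True
```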

Proof. We employ the following positive-definite Lyapunov functional:
$$V(x(t), y(t), t) = \varepsilon_1 V_1(x(t), y(t)) + V_2(x(t), y(t), t), \tag{2.10}$$
where
$$V_1(x(t), y(t)) = x^T(t)x(t) + y^T(t)y(t),$$
$$V_2(x(t), y(t), t) = 2\sum_{i=1}^{m} p_i^{(2)} \int_0^{y_i(t)} f_i(s)\,ds + 2\sum_{i=1}^{n} p_i^{(1)} \int_0^{x_i(t)} g_i(s)\,ds + \int_{t-\tau_1}^{t} f^T(y(v)) R_1 f(y(v))\,dv + \int_{t-\tau_2}^{t} g^T(x(v)) R_2 g(x(v))\,dv,$$
for a positive constant $\varepsilon_1$ and positive-definite matrices $R_1$ and $R_2$, all of which will be determined later.

The derivative of $V$ along the trajectories of (2.9) is $\dot V = \varepsilon_1 \dot V_1 + \dot V_2$, where
$$\dot V_1 = 2x^T(t)\big[{-Ax(t)} + Wf(y(t-\tau_1))\big] + 2y^T(t)\big[{-By(t)} + Vg(x(t-\tau_2))\big]$$
and
$$\dot V_2 = 2f^T(y(t)) P_2 \big[{-By(t)} + Vg(x(t-\tau_2))\big] + 2g^T(x(t)) P_1 \big[{-Ax(t)} + Wf(y(t-\tau_1))\big] + f^T(y(t)) R_1 f(y(t)) - f^T(y(t-\tau_1)) R_1 f(y(t-\tau_1)) + g^T(x(t)) R_2 g(x(t)) - g^T(x(t-\tau_2)) R_2 g(x(t-\tau_2)).$$

Rewrite $\dot V_1$ in the following form:
$$\dot V_1 = -2x^T A x - 2y^T B y + 2x^T A^{1/2} A^{-1/2} W f(y(t-\tau_1)) + 2y^T B^{1/2} B^{-1/2} V g(x(t-\tau_2)).$$
From Lemma 2.1, it follows that
$$\dot V_1 \le -x^T A x - y^T B y + f^T(y(t-\tau_1)) W^T A^{-1} W f(y(t-\tau_1)) + g^T(x(t-\tau_2)) V^T B^{-1} V g(x(t-\tau_2))$$
$$\le -x^T A x - y^T B y + M\big[f^T(y(t-\tau_1)) f(y(t-\tau_1)) + g^T(x(t-\tau_2)) g(x(t-\tau_2))\big],$$
where $M = \max\big(\|W^T A^{-1} W\|_2,\, \|V^T B^{-1} V\|_2\big) \ge 0$.

Since $f_i(y_i(t))\, y_i(t) \ge \alpha_i^{-1} \big(f_i(y_i(t))\big)^2$, $i = 1, \ldots, m$, and $g_j(x_j(t))\, x_j(t) \ge \beta_j^{-1} \big(g_j(x_j(t))\big)^2$, $j = 1, \ldots, n$,
$$-f^T(y(t)) P_2 B y(t) \le -f^T(y(t)) P_2 B \alpha^{-1} f(y(t)) \quad \text{and} \quad -g^T(x(t)) P_1 A x(t) \le -g^T(x(t)) P_1 A \beta^{-1} g(x(t)).$$

Let the Cholesky factorizations of $Q_1$ and $Q_2$ be $Q_1 = K_1^T K_1$ and $Q_2 = K_2^T K_2$. Rewriting $W = (W_1 K_1^{-1})(K_1 W_2)$ and $V = (V_1 K_2^{-1})(K_2 V_2)$, we have
$$\dot V_2 \le -f^T(y(t))\big(2P_2 B \alpha^{-1}\big) f(y(t)) - g^T(x(t))\big(2P_1 A \beta^{-1}\big) g(x(t)) + 2\big(f^T(y(t)) P_2 V_1 K_2^{-1}\big)\big(K_2 V_2\, g(x(t-\tau_2))\big) + 2\big(g^T(x(t)) P_1 W_1 K_1^{-1}\big)\big(K_1 W_2\, f(y(t-\tau_1))\big) + f^T(y(t)) R_1 f(y(t)) - f^T(y(t-\tau_1)) R_1 f(y(t-\tau_1)) + g^T(x(t)) R_2 g(x(t)) - g^T(x(t-\tau_2)) R_2 g(x(t-\tau_2)).$$
By Lemma 2.1,
$$\dot V_2 \le -f^T(y(t))\big(2P_2 B \alpha^{-1} - P_2 V_1 Q_2^{-1} V_1^T P_2 - R_1\big) f(y(t)) - g^T(x(t))\big(2P_1 A \beta^{-1} - P_1 W_1 Q_1^{-1} W_1^T P_1 - R_2\big) g(x(t)) - f^T(y(t-\tau_1))\big(R_1 - W_2^T Q_1 W_2\big) f(y(t-\tau_1)) - g^T(x(t-\tau_2))\big(R_2 - V_2^T Q_2 V_2\big) g(x(t-\tau_2)).$$

Since $\Omega_1 > 0$ and $\Omega_2 > 0$, there exists $\varepsilon_2 > 0$ such that $\Omega_2 - 2\varepsilon_2 I_m > 0$ and $\Omega_1 - 2\varepsilon_2 I_n > 0$. Set $R_1 = W_2^T Q_1 W_2 + \varepsilon_2 I_m$ and $R_2 = V_2^T Q_2 V_2 + \varepsilon_2 I_n$, which are symmetric positive-definite matrices. It follows that
$$\dot V_2 \le -f^T(y(t))\big(\Omega_2 - 2\varepsilon_2 I_m + \varepsilon_2 I_m\big) f(y(t)) - g^T(x(t))\big(\Omega_1 - 2\varepsilon_2 I_n + \varepsilon_2 I_n\big) g(x(t)) - \varepsilon_2 f^T(y(t-\tau_1)) f(y(t-\tau_1)) - \varepsilon_2 g^T(x(t-\tau_2)) g(x(t-\tau_2))$$
$$\le -\varepsilon_2 f^T(y(t)) f(y(t)) - \varepsilon_2 g^T(x(t)) g(x(t)) - \varepsilon_2 f^T(y(t-\tau_1)) f(y(t-\tau_1)) - \varepsilon_2 g^T(x(t-\tau_2)) g(x(t-\tau_2)).$$

Choose $\varepsilon_1 > 0$ such that $M\varepsilon_1 \le \varepsilon_2$. Then
$$\dot V(x(t), y(t), t) \le -\varepsilon_1 x^T(t) A x(t) - \varepsilon_1 y^T(t) B y(t).$$

Let $a = \min\{a_1, \ldots, a_n, b_1, \ldots, b_m\}$, $\theta = \max\{\alpha_1, \ldots, \alpha_m, \beta_1, \ldots, \beta_n\}$, $p = \max\{p_1^{(1)}, \ldots, p_n^{(1)}, p_1^{(2)}, \ldots, p_m^{(2)}\}$, and $r = \max\{\|R_1\|_2, \|R_2\|_2\}$. Obviously, $V(x(t), y(t), t)$ is a positive-definite and radially unbounded Lyapunov functional. Choose $\varepsilon > 0$ satisfying the following condition:
$$\varepsilon\varepsilon_1 + \varepsilon p\theta - \varepsilon_1 a + r\theta^2 \tau \varepsilon e^{\varepsilon\tau} < 0. \tag{2.11}$$

Note that $2p_i^{(2)} \int_0^{y_i(t)} f_i(s)\,ds \le 2p \int_0^{y_i(t)} \alpha_i s\,ds \le p\theta\, y_i^2(t)$ and $2p_i^{(1)} \int_0^{x_i(t)} g_i(s)\,ds \le 2p \int_0^{x_i(t)} \beta_i s\,ds \le p\theta\, x_i^2(t)$. We then have
$$\frac{d}{dt}\big(e^{\varepsilon t} V(x(t), y(t), t)\big) = \varepsilon e^{\varepsilon t} V + e^{\varepsilon t} \dot V \le e^{\varepsilon t}\big(\varepsilon\varepsilon_1 + \varepsilon p\theta - \varepsilon_1 a\big)\big(x^T(t)x(t) + y^T(t)y(t)\big) + \varepsilon e^{\varepsilon t} \int_{t-\tau_1}^{t} f^T(y(v)) R_1 f(y(v))\,dv + \varepsilon e^{\varepsilon t} \int_{t-\tau_2}^{t} g^T(x(v)) R_2 g(x(v))\,dv. \tag{2.12}$$

Integrating both sides of (2.12) from $0$ to $s$, we obtain
$$e^{\varepsilon s} V(x(s), y(s), s) - V(x(0), y(0), 0) \le \int_0^s e^{\varepsilon t}\big(\varepsilon\varepsilon_1 + \varepsilon p\theta - \varepsilon_1 a\big)\big(x^T(t)x(t) + y^T(t)y(t)\big)\,dt + \varepsilon \int_0^s e^{\varepsilon t} \int_{t-\tau_1}^{t} f^T(y(v)) R_1 f(y(v))\,dv\,dt + \varepsilon \int_0^s e^{\varepsilon t} \int_{t-\tau_2}^{t} g^T(x(v)) R_2 g(x(v))\,dv\,dt. \tag{2.13}$$

Estimating the second and third terms on the right-hand side of (2.13) by changing the order of integration and using $0 \le \tau_i \le \tau$, we have
$$\varepsilon \int_0^s e^{\varepsilon t} \int_{t-\tau_1}^{t} f^T(y(v)) R_1 f(y(v))\,dv\,dt \le \varepsilon \int_0^s e^{\varepsilon t} \int_{t-\tau}^{t} f^T(y(v)) R_1 f(y(v))\,dv\,dt$$
$$= \varepsilon \int_{-\tau}^{s} \Big(\int_{\max\{v, 0\}}^{\min\{v+\tau, s\}} e^{\varepsilon t}\,dt\Big) f^T(y(v)) R_1 f(y(v))\,dv \le \varepsilon \int_{-\tau}^{s} \Big(\int_{v}^{v+\tau} e^{\varepsilon t}\,dt\Big) f^T(y(v)) R_1 f(y(v))\,dv$$
$$\le \int_{-\tau}^{s} \tau\varepsilon e^{\varepsilon(v+\tau)}\, f^T(y(v)) R_1 f(y(v))\,dv \le \int_{-\tau}^{s} \tau\varepsilon e^{\varepsilon(v+\tau)}\, \|R_1\|_2\, \theta^2\, y^T(v) y(v)\,dv$$
$$\le r\theta^2 \tau\varepsilon e^{\varepsilon\tau} \Big(\int_{-\tau}^{0} e^{\varepsilon v}\, y^T(v) y(v)\,dv + \int_{0}^{s} e^{\varepsilon v}\, y^T(v) y(v)\,dv\Big) \tag{2.14}$$

and
$$\varepsilon \int_0^s e^{\varepsilon t} \int_{t-\tau_2}^{t} g^T(x(v)) R_2 g(x(v))\,dv\,dt \le r\theta^2 \tau\varepsilon e^{\varepsilon\tau} \Big(\int_{-\tau}^{0} e^{\varepsilon v}\, x^T(v) x(v)\,dv + \int_{0}^{s} e^{\varepsilon v}\, x^T(v) x(v)\,dv\Big). \tag{2.15}$$

Substituting (2.14) and (2.15) into (2.13) and using (2.11), we obtain
$$e^{\varepsilon s} V(x(s), y(s), s) - V(x(0), y(0), 0) \le \int_0^s e^{\varepsilon t}\big(\varepsilon\varepsilon_1 + \varepsilon p\theta - \varepsilon_1 a + r\theta^2\tau\varepsilon e^{\varepsilon\tau}\big)\big(x^T(t)x(t) + y^T(t)y(t)\big)\,dt + r\theta^2\tau\varepsilon e^{\varepsilon\tau} \int_{-\tau}^{0} e^{\varepsilon v}\big(x^T(v)x(v) + y^T(v)y(v)\big)\,dv$$
$$\le r\theta^2\tau\varepsilon e^{\varepsilon\tau} \int_{-\tau}^{0} e^{\varepsilon v}\big(x^T(v)x(v) + y^T(v)y(v)\big)\,dv \le r\theta^2\tau^2\varepsilon e^{\varepsilon\tau}\, \|\phi\|_2^2 =: M_1 \|\phi\|_2^2.$$
Therefore,
$$V(x(t), y(t), t) \le \big(V(x(0), y(0), 0) + M_1 \|\phi\|_2^2\big)\, e^{-\varepsilon t} \quad \text{for all } t > 0. \tag{2.16}$$
Moreover,
$$V(x(0), y(0), 0) = \varepsilon_1\big(x^T(0)x(0) + y^T(0)y(0)\big) + 2\sum_{i=1}^{m} p_i^{(2)} \int_0^{y_i(0)} f_i(s)\,ds + 2\sum_{i=1}^{n} p_i^{(1)} \int_0^{x_i(0)} g_i(s)\,ds + \int_{-\tau_1}^{0} f^T(y(v)) R_1 f(y(v))\,dv + \int_{-\tau_2}^{0} g^T(x(v)) R_2 g(x(v))\,dv \le \big(\varepsilon_1 + p\theta + r\theta^2\tau\big)\|\phi\|_2^2 =: M_2 \|\phi\|_2^2.$$
According to (2.10), (2.16), and the above inequality,
$$\varepsilon_1 \big\|(x(t), y(t))^T\big\|_2^2 = \varepsilon_1\big(x^T(t)x(t) + y^T(t)y(t)\big) \le V(x(t), y(t), t) \le (M_1 + M_2)\,\|\phi\|_2^2\, e^{-\varepsilon t} \quad \text{for all } t > 0,$$
that is,
$$\big\|(x(t), y(t))^T\big\|_2 \le \sqrt{\frac{M_1 + M_2}{\varepsilon_1}}\; \|\phi\|_2\; e^{-(\varepsilon/2)t}. \tag{2.17}$$
Inequality (2.17) implies that the origin of systems (2.9) is globally exponentially stable. □

When $W$ and $V$ are nonsingular (so that, in particular, $n = m$), choosing $W_1 = W$, $W_2 = I_m$, $V_1 = V$, $V_2 = I_n$, we have:

Corollary 2.4. Suppose that in systems (2.9) Assumption A1 is satisfied and that $W$ and $V$ are nonsingular. The origin of neural systems (2.9) is globally exponentially stable if there exists a positive constant $\gamma$ such that
$$\Omega_1 = 2A\beta^{-1} - \frac{1}{\gamma} I_n - \gamma V^T V > 0, \qquad \Omega_2 = 2B\alpha^{-1} - \frac{1}{\gamma} I_m - \gamma W^T W > 0.$$

Proof. Choose $P_1 = I_n$, $P_2 = I_m$, $Q_1 = \gamma W^T W$, $Q_2 = \gamma V^T V$, $W_1 = W$, $W_2 = I_m$, $V_1 = V$, $V_2 = I_n$; then the $\Omega_1$ and $\Omega_2$ in Theorem 2.3 become
$$\Omega_1 = 2A\beta^{-1} - \frac{1}{\gamma} I_n - \gamma V^T V \quad \text{and} \quad \Omega_2 = 2B\alpha^{-1} - \frac{1}{\gamma} I_m - \gamma W^T W.$$
The corollary then follows from Theorem 2.3. □
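Since Corollary 2.4 involves a single scalar $\gamma$, a crude grid search suffices to look for a certificate. The following sketch is ours (not from the paper), with made-up example data.

```python
# Hedged sketch: search a gamma > 0 satisfying Corollary 2.4.
import numpy as np

def corollary_2_4_gamma(A, B, alpha, beta, W, V,
                        grid=np.logspace(-3, 3, 2000)):
    n = A.shape[0]
    for g in grid:
        O1 = 2 * A @ np.linalg.inv(beta) - np.eye(n) / g - g * V.T @ V
        O2 = 2 * B @ np.linalg.inv(alpha) - np.eye(n) / g - g * W.T @ W
        if (np.linalg.eigvalsh((O1 + O1.T) / 2).min() > 0 and
                np.linalg.eigvalsh((O2 + O2.T) / 2).min() > 0):
            return g       # a feasible gamma certifies exponential stability
    return None            # no feasible gamma found on this grid

W = V = np.array([[0.6, 0.6], [-0.6, 0.6]])      # c = 0.6 < 1/sqrt(2)
print(corollary_2_4_gamma(np.eye(2), np.eye(2), np.eye(2), np.eye(2), W, V))
```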

The following is a special case of Corollary 2.4.

Corollary 2.5. Suppose that in systems (2.9) Assumption A1 is satisfied and that $W$ and $V$ are symmetric positive-definite matrices. The origin of neural systems (2.9) is globally exponentially stable if there exists a positive constant $\gamma$ such that
$$\Omega_1 = 2A\beta^{-1} - \frac{1}{\gamma} W - \gamma V > 0, \qquad \Omega_2 = 2B\alpha^{-1} - \frac{1}{\gamma} V - \gamma W > 0.$$

Proof. Choose $P_1 = I_n$, $P_2 = I_m$, $Q_1 = \gamma W^{-1}$, $Q_2 = \gamma V^{-1}$, $W_1 = I_n$, $W_2 = W$, $V_1 = I_m$, $V_2 = V$; then the $\Omega_1$ and $\Omega_2$ in Theorem 2.3 become
$$\Omega_1 = 2A\beta^{-1} - \frac{1}{\gamma} W - \gamma V^T \quad \text{and} \quad \Omega_2 = 2B\alpha^{-1} - \frac{1}{\gamma} V - \gamma W^T,$$
where $V^T = V$ and $W^T = W$ by symmetry. The corollary then follows from Theorem 2.3. □

3. Comparison with previous results

In this section, we compare our results with previous results derived in the literature. Let us first restate those stability results.

Theorem 3.1 (Arik [4]). Suppose that in systems (2.9) Assumptions A1 and A2 are satisfied. The origin of neural systems (2.9) is globally asymptotically stable if the following conditions hold:
$$\Omega_3 = -\alpha^{-1} B \alpha^{-1} - 2B\alpha^{-1} + VV^T + I_m + W^T A^{-1} W < 0,$$
$$\Omega_4 = -\beta^{-1} A \beta^{-1} - 2A\beta^{-1} + WW^T + I_n + V^T B^{-1} V < 0.$$

Theorem 3.2 (Chen et al. [7]). Suppose that in systems (2.9) Assumption A1 is satisfied. The origin of neural systems (2.9) is globally exponentially stable if
$$\max\Bigg[\max_{1 \le j \le m} \Big(\sum_{i=1}^{n} \frac{1}{a_i}\, |w_{ij}|\, \alpha_j\Big),\; \max_{1 \le i \le n} \Big(\sum_{j=1}^{m} \frac{1}{b_j}\, |v_{ji}|\, \beta_i\Big)\Bigg] < 1$$
and
$$l + \mu + \max(w, v) < 2\theta,$$
where
$$l = \max_{1 \le j \le m} \Big(\sum_{i=1}^{n} |w_{ij}|\, \alpha_j\Big), \qquad \mu = \max_{1 \le i \le n} \Big(\sum_{j=1}^{m} |v_{ji}|\, \beta_i\Big), \qquad \theta = \min_{1 \le i \le n,\, 1 \le j \le m} \{a_i, b_j\},$$
$$w = \max_{1 \le i \le n} \Big(\sum_{j=1}^{m} |w_{ij}|\Big), \qquad v = \max_{1 \le j \le m} \Big(\sum_{i=1}^{n} |v_{ji}|\Big).$$
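Under our reading of the reconstructed statement of Theorem 3.2 (the scanned original is garbled, so the exact index placement above is a best-effort assumption), its quantities can be computed as follows; for the parameters of Example 3.3 below, this reproduces $l = \mu = w = v = 2|c|$, matching the text.

```python
# Hedged reconstruction check of the scalar conditions of Theorem 3.2.
import numpy as np

def chen_conditions(a, b, alpha, beta, W, V):
    # a (n,), b (m,): decay rates; alpha (m,), beta (n,): Lipschitz constants
    c1 = max((np.abs(W[:, j]) * alpha[j] / a).sum() for j in range(W.shape[1]))
    c2 = max((np.abs(V[:, i]) * beta[i] / b).sum() for i in range(V.shape[1]))
    l = max((np.abs(W[:, j]) * alpha[j]).sum() for j in range(W.shape[1]))
    mu = max((np.abs(V[:, i]) * beta[i]).sum() for i in range(V.shape[1]))
    w = np.abs(W).sum(axis=1).max()
    v = np.abs(V).sum(axis=1).max()
    theta = min(a.min(), b.min())
    return max(c1, c2) < 1 and l + mu + max(w, v) < 2 * theta

c = 0.3    # |c| < 1/3: Theorem 3.2 should apply
W = V = np.array([[c, c], [-c, c]])
ones = np.ones(2)
print(chen_conditions(ones, ones, ones, ones, W, V))   # True for |c| < 1/3
```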

In order to show that the condition obtained in this paper provides a new sufficient criterion for determining the equilibrium and stability properties of systems (2.9), we consider the following examples.

Example 3.3 (Arik [4]). Assume that the network parameters of systems (2.9) are given as follows:
$$W = V = \begin{pmatrix} c & c \\ -c & c \end{pmatrix}, \qquad A = B = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}, \qquad \alpha = \beta = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}.$$
The $\Omega_1$ and $\Omega_2$ in Corollary 2.4 become
$$\Omega_1 = \Omega_2 = \begin{pmatrix} 2 - \frac{1}{\gamma} - 2\gamma c^2 & 0 \\ 0 & 2 - \frac{1}{\gamma} - 2\gamma c^2 \end{pmatrix}.$$
In this case, in order to ensure that there exists a positive constant $\gamma$ such that $\Omega_1$ and $\Omega_2$ are positive definite, $c$ must be chosen as $|c| < \frac{1}{\sqrt 2}$ (the choice $\gamma = 1/(\sqrt 2\, |c|)$ maximizes the diagonal entries, giving $2 - 2\sqrt 2\, |c| > 0$). Therefore, according to Theorem 2.3, the origin of systems (2.9) is globally exponentially stable when $|c| < \frac{1}{\sqrt 2}$. However, the origin of systems (2.9) is only concluded to be globally asymptotically stable when $|c| < \frac{1}{\sqrt 2}$ according to Theorem 3.1. Moreover, $l = \mu = w = v = 2|c|$ and $\theta = 1$; thus the origin of systems (2.9) is globally exponentially stable when $|c| < \frac{1}{3}$ according to Theorem 3.2. Based on the above results, if we choose $\frac{1}{3} \le |c| < \frac{1}{\sqrt 2}$, then the conditions of Theorem 3.2 are not satisfied, whereas the conditions of Corollary 2.4 still hold.
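A quick numerical scan (ours, not the authors') confirms the threshold in Example 3.3: a feasible $\gamma$ for Corollary 2.4 exists up to $|c|$ just below $1/\sqrt 2 \approx 0.707$ and disappears beyond it.

```python
# Hedged check of Example 3.3: is 2 - 1/gamma - 2*gamma*c^2 > 0 for some gamma?
import numpy as np

def feasible(c, grid=np.linspace(1e-3, 50, 20000)):
    return bool((2 - 1 / grid - 2 * grid * c**2 > 0).any())

for c in (0.30, 0.50, 0.70, 0.75):
    print(f"c = {c:.2f}: Corollary 2.4 feasible = {feasible(c)}")
# Expect True for 0.30, 0.50, 0.70 and False for 0.75, while Theorem 3.2
# only certifies |c| < 1/3.
```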

Example 3.4. Assume that the network parameters of systems (2.9) are given as follows:
$$W = V = \begin{pmatrix} 0 & 1 \\ 0 & -1 \end{pmatrix}, \qquad A = B = a\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}, \qquad \alpha = \beta = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}.$$
Choose $P_1 = P_2 = \mathrm{diag}(a\gamma_1, a\gamma_2)$ and $Q_1 = Q_2 = \mathrm{diag}(\gamma_1, \gamma_2)$ (the positive constants $\gamma_1, \gamma_2$ will be determined later), together with $W_1 = V_1 = I_2$ and $W_2 = V_2 = W$; then the $\Omega_1$ and $\Omega_2$ in Theorem 2.3 become
$$\Omega_1 = \Omega_2 = \begin{pmatrix} a^2\gamma_1 & 0 \\ 0 & (a^2 - 1)\gamma_2 - \gamma_1 \end{pmatrix}.$$
If $a > 1$, choosing for instance $\gamma_2 = 1$ and $\gamma_1 = (a^2 - 1)/2$ makes $\Omega_1 = \Omega_2$ positive definite. Hence, the conditions of Theorem 2.3 hold for all $a > 1$.

Now let us check the conditions of Theorem 3.1 for the same network parameters. In this case,
$$\Omega_3 = \Omega_4 = -\begin{pmatrix} 3a - 2 & 1 \\ 1 & 3a - 2 - \frac{2}{a} \end{pmatrix}.$$
In order to ensure the negative definiteness of $\Omega_3$ and $\Omega_4$, $a$ must be chosen as $a > \frac{4}{3}$. Thus, it can be concluded that if $1 < a \le \frac{4}{3}$, the conditions of Theorem 3.1 are not satisfied, whereas the conditions of Theorem 2.3 still hold for $1 < a \le \frac{4}{3}$.
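The dichotomy in Example 3.4 can be checked numerically as well; the sketch below (ours, not from the paper) tests the diagonal choice above for Theorem 2.3 and the negative definiteness required by Theorem 3.1.

```python
# Hedged check of Example 3.4: Omega_1 = Omega_2 = diag(a^2*g1, (a^2-1)*g2 - g1)
# with the diagonal choice above, versus Theorem 3.1's requirement.
import numpy as np

def theorem_2_3_ok(a):
    g2 = 1.0
    g1 = (a**2 - 1) / 2            # the feasible choice suggested in the text
    return g1 > 0 and a**2 * g1 > 0 and (a**2 - 1) * g2 - g1 > 0

def theorem_3_1_ok(a):
    # Omega_3 = Omega_4 = -[[3a-2, 1], [1, 3a-2-2/a]] must be negative definite
    M = np.array([[3 * a - 2, 1.0], [1.0, 3 * a - 2 - 2 / a]])
    return np.linalg.eigvalsh(M).min() > 0

for a in (1.1, 4 / 3, 1.5):
    print(f"a = {a:.3f}: Thm 2.3 {theorem_2_3_ok(a)}, Thm 3.1 {theorem_3_1_ok(a)}")
# Expect: a = 1.100 -> True/False; a = 1.333 -> True/False; a = 1.500 -> True/True
```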

Remark 3.5. The above examples show that Theorem 2.3 can certify stability in cases where Theorems 3.1 and 3.2 fail; in this sense, Theorem 2.3 generalizes Theorems 3.1 and 3.2.

4. Conclusion

In this paper, we obtained new results on the global exponential stability of BAM neural networks with delays. By the Lyapunov functional method and a technique based on integral inequalities, global exponential stability criteria were derived. A comparison between our results and previous results shows that our results establish a new set of global exponential stability criteria for BAM neural networks with constant delays, and these conditions are less restrictive than those in the earlier references.

Acknowledgment

The authors would like to thank the anonymous reviewers and the editor for their constructive comments.

References

[1] S. Arik, Global asymptotical stability of a larger class of delayed neural networks, in: Proceedings of the ISCAS 2003, 2003, pp. 721–724.

[2] S. Arik, An analysis of exponential stability of delayed neural networks with time varying delays, Neural Networks 17 (2004) 1027–1031.
[3] S. Arik, Global asymptotic stability analysis of bidirectional associative memory neural networks, IEEE Trans. Neural Networks 16 (3) (2005) 580–586.
[4] S. Arik, V. Tavsanoglu, Global asymptotic stability analysis of bidirectional associative memory neural networks with constant time delays, Neurocomputing 68 (1–2) (2005) 161–176.
[5] J. Cao, M. Dong, Exponential stability of delayed bidirectional associative memory networks, Appl. Math. Comput. 135 (1) (2003) 105–112.
[6] J. Cao, L. Wang, Exponential stability and periodic oscillatory solution in BAM networks with delays, IEEE Trans. Neural Networks 13 (2) (2002) 457–463.
[7] A. Chen, J. Cao, L. Huang, Exponential stability of BAM neural networks with transmission delays, Neurocomputing 57 (2004) 435–454.
[8] T. Chen, L. Rong, Global exponential stability in Hopfield and bidirectional associative memory neural networks with time delays, Chin. Ann. Math. 25B (2) (2004) 255–262.
[9] M. Forti, A. Tesi, New conditions for global stability of neural networks with application to linear and quadratic programming problems, IEEE Trans. Circuits Syst. I 42 (7) (1995) 354–366.
[10] H. Huang, D. Ho, J. Cao, Analysis of global exponential stability and periodic solutions of neural networks with time-varying delays, Neural Networks 18 (2) (2005) 161–170.
[11] C. Hwang, C. Cheng, T. Liao, Globally exponential stability of generalized Cohen–Grossberg neural networks with delays, Phys. Lett. A 319 (1–2) (2003) 157–166.
[12] H. Jiang, Z. Teng, Some new results for recurrent neural networks with varying-time coefficients and delays, Phys. Lett. A 338 (2005) 446–460.
[13] H. Jiang, Z. Teng, A new criterion on the global exponential stability for cellular neural networks with multiple time-varying delays, Phys. Lett. A 338 (2005) 461–471.
[14] H.K. Khalil, Nonlinear Systems, Macmillan, New York, 1992.
[15] C. Li, X. Liao, R. Zhang, Delay-dependent exponential stability analysis of bi-directional associative memory neural networks with time delay: an LMI approach, Chaos Solitons Fractals 24 (2005) 1119–1134.
[16] X. Li, L. Huang, H. Zhu, Global stability of cellular neural networks with constant and variable delays, Nonlinear Anal. 53 (2) (2003) 319–333.
[17] T. Liao, J. Yan, C. Cheng, C. Hwang, Globally exponential stability condition of a class of neural networks with time-varying delays, Phys. Lett. A 339 (2005) 333–342.
[18] X. Liao, J. Yu, G. Chen, Novel stability criteria for bidirectional associative memory neural networks with time delays, Int. J. Circuit Theory Appl. 30 (2002) 519–546.
[19] H. Lu, Absolute exponential stability analysis of delayed neural networks, Phys. Lett. A 336 (2005) 133–140.
[20] W. Lu, L. Rong, T. Chen, Global convergence of delayed neural network systems, Int. J. Neural Syst. 13 (2) (2003) 1–12.
[21] H. Qi, L. Qi, Deriving sufficient conditions for global asymptotic stability of delayed neural networks via nonsmooth analysis, IEEE Trans. Neural Networks 15 (1) (2004) 99–109.
[22] J. Zhang, Y. Yang, Global stability of bidirectional associative memory neural networks with time delays, Int. J. Circuit Theory Appl. 29 (2001) 185–196.
[23] Q. Zhang, X. Wei, J. Xu, Delay-dependent exponential stability of cellular neural networks with time-varying delays, Chaos Solitons Fractals 23 (4) (2005) 1363–1369.
[24] H. Zhao, Global stability of bidirectional associative memory neural networks with distributed delays, Phys. Lett. A 30 (2002) 519–546.
[25] H. Zhao, Global exponential stability and periodicity of cellular neural networks with variable delays, Phys. Lett. A 336 (2005) 331–341.
[26] W. Zheng, J. Zhang, Global exponential stability of a class of neural networks with variable delays, Comput. Math. Appl. 49 (2005) 895–902.
[27] L. Zhou, M. Zhou, Stability analysis of a class of generalized neural networks with delays, Phys. Lett. A 337 (2005) 203–215.

Weirui Zhao received the B.S. degree in mathematics from Huazhong Normal University, Wuhan, China, and the M.S. and Ph.D. degrees from Fudan University, Shanghai, China, all in mathematics/applied mathematics, in 1989, 1994, and 2003, respectively. From June 1989 to September 2000, he was with the Hubei Institute for Nationalities. In June 2003, he joined the Department of Mathematics, Wuhan University of Technology, Wuhan, China. He is currently a Postdoctoral Research Fellow at the Shenzhen Graduate School, Harbin Institute of Technology. He is the author or coauthor of 10 journal and conference papers. His research interests include stability theory, nonlinear systems, neural networks, and applied mathematics.

Huanshui Zhang graduated in mathematics from Qufu Normal University in 1986 and received his M.S. and Ph.D. degrees in control theory and signal processing from Heilongjiang University, PR China, and Northeastern University, PR China, in 1991 and 1997, respectively. He worked as a Postdoctoral Fellow at Nanyang Technological University from 1998 to 2001 and as a Research Fellow at Hong Kong Polytechnic University from 2001 to 2003. He joined Shandong Taishan College in 1986 as an Assistant Professor and became an Associate Professor in 1994. He is currently a Professor at the Shenzhen Graduate School of Harbin Institute of Technology. His interests include optimal estimation, robust filtering and control, time-delay systems, singular systems, wireless communication, signal processing, and neural networks.

Shulan Kong is an Associate Professor at the School of Mathematical Sciences, Qufu Normal University. She graduated in mathematics from Yantai Normal College in 1992 and received her M.S. and Ph.D. degrees in mathematics from Qufu Normal University and Shandong University in 1995 and 2004, respectively. She is currently a Postdoctoral Fellow at the Shenzhen Graduate School, Harbin Institute of Technology. Her research interests include nonlinear control systems, wireless communication systems, combinatorial optimization, and neural networks.