Neurocomputing 74 (2011) 1503–1509
Robust delay-dependent exponential stability for uncertain stochastic neural networks with mixed delays

Feiqi Deng a,1, Mingang Hua a,b,2, Xinzhi Liu b,*, Yunjian Peng a, Juntao Fei c

a College of Automation Science and Engineering, South China University of Technology, Guangzhou 510640, PR China
b Department of Applied Mathematics, University of Waterloo, Waterloo, Ontario, Canada N2L 3G1
c College of Computer and Information, Hohai University, Changzhou 213022, PR China
Article info

Article history: Received 15 July 2008; received in revised form 2 July 2009; accepted 2 August 2010; available online 16 March 2011. Communicated by J. Cao.

Keywords: Stochastic neural networks; Exponential stability; Linear matrix inequality (LMI)

Abstract

This paper is concerned with the robust delay-dependent exponential stability of uncertain stochastic neural networks (SNNs) with mixed delays. Based on a novel Lyapunov-Krasovskii functional method, some new delay-dependent stability conditions are presented in terms of linear matrix inequalities, which guarantee that the uncertain stochastic neural networks with mixed delays are robustly exponentially stable. Numerical examples are given to illustrate the effectiveness of our results. © 2011 Elsevier B.V. All rights reserved.
1. Introduction

In recent years, dynamical neural networks have attracted considerable attention, as they have proved essential in many applications such as pattern recognition, associative memories, optimization, signal processing and knowledge acquisition; see [1-3] for example. In such applications, it is of prime importance to know whether the neural network converges globally to a unique equilibrium point for each external input vector. That is, stability is one of the main properties of neural networks and a crucial feature in their design. Meanwhile, in the implementation of artificial neural networks in practical applications, it is sometimes necessary to introduce delays in the signals transmitted among neurons, and these delays are often sources of oscillation, instability and poor performance of the networks. Thus, many authors have studied the stability of neural networks with delays; see [4-12] and the references therein. On the other hand, due to modeling and measurement errors, linearization approximations, and so on, neural networks are often disturbed by parameter uncertainties and stochastic perturbations, which may cause undesirable dynamic behavior or poor performance. Therefore, the robust stability analysis of uncertain
* Corresponding author. Tel.: +1 519 8884567x36007; fax: +1 519 746 4319.
E-mail addresses: [email protected], [email protected] (F. Deng), [email protected], [email protected] (X. Liu).
1 Tel.: +86 20 22236726, +86 87110940.
2 Now the author is with the College of Computer and Information, Hohai University, Changzhou 213022, PR China.
doi:10.1016/j.neucom.2010.08.027
stochastic neural networks (SNNs) with delays has been extensively studied; see [13-17] for example. As pointed out in [18,19], neural networks usually have a spatial extent due to the presence of a multitude of parallel pathways with a variety of axon sizes and lengths, and hence there is a distribution of propagation delays over a period of time. Many researchers have therefore studied the stability of classes of stochastic neural networks with both discrete and distributed delays; see [20-28] for example. Recently, by introducing slack matrices, the delay-dependent robust stability of uncertain stochastic neural networks with time-varying delay was studied in [29,30]. In particular, a less conservative delay-dependent condition for exponential stability in mean square of a class of uncertain stochastic Hopfield neural networks with discrete and distributed time-varying delays was derived in [31]. Nevertheless, although the above results are effective, further research on this topic is needed to obtain less conservative exponential stability conditions. In this paper, a free-weighting matrix method is proposed to study the robust delay-dependent exponential stability of uncertain stochastic neural networks with mixed delays. As in [9,10,29-31], by introducing some freely chosen matrices, a new robust delay-dependent exponential stability problem for a class of uncertain stochastic neural networks with mixed delays is considered. The mixed delays comprise discrete and distributed time delays, and the parameter uncertainties are time-varying and norm-bounded. By utilizing a novel Lyapunov-Krasovskii functional method, a linear matrix inequality (LMI) approach is developed to establish sufficient conditions for the uncertain stochastic neural networks with mixed delays to be robustly exponentially
stable. Numerical examples are given to illustrate the effectiveness and reduced conservatism of our results compared with existing stability criteria.
2. Problem description

Throughout this paper, $R^n$ denotes the $n$-dimensional Euclidean space and $R^{n\times m}$ the set of $n\times m$ real matrices. $I$ is the identity matrix, $\|\cdot\|$ stands for the Euclidean norm of a vector or the spectral norm of a matrix, and $\lambda_{\min}(\cdot)$ denotes the minimum eigenvalue of the corresponding matrix. $M^T$ stands for the transpose of the matrix $M$. For symmetric matrices $X$ and $Y$, the notation $X>Y$ (respectively, $X\ge Y$) means that $X-Y$ is positive definite (respectively, positive semi-definite). $(\Omega,\mathcal F,\{\mathcal F_t\}_{t\ge 0},P)$ is a complete probability space with a filtration $\{\mathcal F_t\}_{t\ge 0}$ satisfying the usual conditions, i.e., it is right continuous and $\mathcal F_0$ contains all $P$-null sets. $C([-\bar\tau,0];R^n)$ denotes the family of continuous functions $\varphi$ from $[-\bar\tau,0]$ to $R^n$ with the norm $\|\varphi\|=\sup_{-\bar\tau\le s\le 0}\|\varphi(s)\|$, and $C^b_{\mathcal F_0}([-\bar\tau,0];R^n)$ the family of all bounded, $\mathcal F_0$-measurable, $C([-\bar\tau,0];R^n)$-valued random variables. $E\{\cdot\}$ stands for the mathematical expectation, and $*$ denotes a block that is readily inferred by symmetry.

Consider the following uncertain stochastic neural network with mixed delays:

$$dx(t)=\Big[-A(t)x(t)+W_0(t)f(x(t))+W_1(t)f(x(t-\tau(t)))+E_1(t)\int_{t-r(t)}^{t}f(x(s))\,ds\Big]\,dt+\Big[C(t)x(t)+D(t)x(t-\tau(t))+W_2(t)f(x(t))+W_3(t)f(x(t-\tau(t)))+E_2(t)\int_{t-r(t)}^{t}f(x(s))\,ds\Big]\,dw(t),\qquad(1)$$

$$x(t)=\varphi(t),\quad t\in[-\bar\tau,0],\quad \bar\tau=\max\{\tau,r\},\qquad(2)$$

where $x(t)=[x_1(t),x_2(t),\ldots,x_n(t)]^T\in R^n$ is the state vector associated with the neurons; $A(t)=A+\Delta A(t)$, $W_0(t)=W_0+\Delta W_0(t)$, $W_1(t)=W_1+\Delta W_1(t)$, $C(t)=C+\Delta C(t)$, $D(t)=D+\Delta D(t)$, $W_2(t)=W_2+\Delta W_2(t)$, $W_3(t)=W_3+\Delta W_3(t)$, $E_1(t)=E_1+\Delta E_1(t)$, $E_2(t)=E_2+\Delta E_2(t)$; $A=\mathrm{diag}\{a_1,a_2,\ldots,a_n\}$ is a diagonal matrix with positive entries $a_i>0$, $i=1,2,\ldots,n$; $C$ and $D$ are known constant matrices with appropriate dimensions; and the pairs $(W_0,W_2)$, $(W_1,W_3)$ and $(E_1,E_2)$ are the connection weight matrices, the discretely delayed connection weight matrices, and the distributively delayed connection weight matrices, respectively. $\Delta A(t)$, $\Delta W_0(t)$, $\Delta W_1(t)$, $\Delta C(t)$, $\Delta D(t)$, $\Delta W_2(t)$, $\Delta W_3(t)$, $\Delta E_1(t)$ and $\Delta E_2(t)$ are unknown time-varying matrices with appropriate dimensions which represent the system uncertainties and the stochastic perturbation uncertainties, respectively. $w(t)=[w_1(t),w_2(t),\ldots,w_m(t)]^T\in R^m$ is a Brownian motion defined on the complete probability space $(\Omega,\mathcal F,\{\mathcal F_t\}_{t\ge 0},P)$, and $f(x(t))=[f_1(x_1(t)),f_2(x_2(t)),\ldots,f_n(x_n(t))]^T$ denotes the neuron activation function with $f(0)=0$. The discrete delay $\tau(t)$ and the distributed delay $r(t)$ satisfy

$$0\le\tau(t)\le\tau,\quad \dot\tau(t)\le\mu,\quad 0\le r(t)\le r,\qquad(3)$$

where $\tau$, $\mu$ and $r$ are constants, and $\varphi(t)\in C([-\bar\tau,0];R^n)$ is the initial function.

To derive the main results, the following assumptions are made throughout this paper.

Assumption 1. The activation function $f(x)$ is bounded and satisfies the Lipschitz condition

$$|f(x)|\le|Lx|,\qquad(4)$$

where $L$ is a known constant matrix.

Assumption 2. The admissible parameter uncertainties are assumed to be of the following form:

$$[\Delta A(t)\ \Delta W_0(t)\ \Delta W_1(t)\ \Delta C(t)\ \Delta D(t)\ \Delta W_2(t)\ \Delta W_3(t)\ \Delta E_1(t)\ \Delta E_2(t)]=HF(t)[G_1\ G_2\ G_3\ G_4\ G_5\ G_6\ G_7\ G_8\ G_9],\qquad(5)$$

where $H$ and $G_1,\ldots,G_9$ are known real constant matrices with appropriate dimensions, and $F(t)$ is an unknown real time-varying matrix with Lebesgue measurable elements bounded by

$$F^T(t)F(t)\le I.\qquad(6)$$

Let $x(t;\varphi)$ denote the state trajectory of the neural network (1) from the initial data $x(\theta)=\varphi(\theta)$ on $-\bar\tau\le\theta\le 0$ in $L^2_{\mathcal F_0}([-\bar\tau,0];R^n)$; then system (1) admits a trivial solution, see e.g. [32]. The problem addressed in this paper is to develop robust exponential stability conditions for the uncertain stochastic neural network with mixed delays. Let us introduce the following definition and lemmas for later use.

Definition 1. The trivial solution of the neural network (1) is said to be robustly exponentially stable in the mean square sense if there exist constants $\alpha>0$ and $\beta>0$ such that

$$E\|x(t,\varphi)\|^2\le\alpha e^{-\beta t}\sup_{-\bar\tau\le s\le 0}E\|\varphi(s)\|^2$$

for all admissible uncertainties satisfying (5)-(6).

Lemma 1 (see Boyd et al. [33]). Given constant matrices $\Gamma_1$, $\Gamma_2$ and $\Gamma_3$ with appropriate dimensions, where $\Gamma_1^T=\Gamma_1$ and $\Gamma_2^T=\Gamma_2>0$, then

$$\Gamma_1+\Gamma_3^T\Gamma_2^{-1}\Gamma_3<0$$

if and only if

$$\begin{bmatrix}\Gamma_1 & \Gamma_3^T\\ \Gamma_3 & -\Gamma_2\end{bmatrix}<0\quad\text{or}\quad\begin{bmatrix}-\Gamma_2 & \Gamma_3\\ \Gamma_3^T & \Gamma_1\end{bmatrix}<0.$$

Lemma 2 (see Wang et al. [34]). For given matrices $H$, $E$ and $F(t)$ with $F^T(t)F(t)\le I$ and any scalar $\varepsilon>0$, the following inequality holds:

$$HF(t)E+E^TF^T(t)H^T\le\varepsilon HH^T+\varepsilon^{-1}E^TE.$$

Lemma 3 (see Gu et al. [35]). For any constant matrix $M\in R^{n\times n}$, $M=M^T>0$, scalar $\gamma>0$ and vector function $\omega:[0,\gamma]\to R^n$ such that the integrations are well defined, the following inequality holds:

$$\Big(\int_0^\gamma\omega(s)\,ds\Big)^TM\Big(\int_0^\gamma\omega(s)\,ds\Big)\le\gamma\int_0^\gamma\omega^T(s)M\omega(s)\,ds.$$
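The three lemmas above can be sanity-checked numerically on random data. The following sketch (illustrative values only, not taken from the paper) verifies the Schur-complement equivalence of Lemma 1 on a random instance, the matrix bound of Lemma 2, and a Riemann-sum version of the integral inequality of Lemma 3, using NumPy.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3

def spd(k):
    """Random symmetric positive definite k x k matrix."""
    X = rng.standard_normal((k, k))
    return X @ X.T + k * np.eye(k)

# Lemma 1 (Schur complement): Gamma1 + Gamma3^T Gamma2^{-1} Gamma3 < 0
# iff [[Gamma1, Gamma3^T], [Gamma3, -Gamma2]] < 0.
Gamma1 = -spd(n)
Gamma2 = spd(n)
Gamma3 = 0.1 * rng.standard_normal((n, n))
schur = Gamma1 + Gamma3.T @ np.linalg.solve(Gamma2, Gamma3)
block = np.block([[Gamma1, Gamma3.T], [Gamma3, -Gamma2]])
assert (np.linalg.eigvalsh(schur).max() < 0) == (np.linalg.eigvalsh(block).max() < 0)

# Lemma 2: H F E + E^T F^T H^T <= eps H H^T + eps^{-1} E^T E when F^T F <= I.
H = rng.standard_normal((n, n))
E = rng.standard_normal((n, n))
U, _, Vt = np.linalg.svd(rng.standard_normal((n, n)))
F = 0.9 * U @ Vt                      # a contraction, so F^T F <= I
eps = 0.7
gap = eps * H @ H.T + E.T @ E / eps - (H @ F @ E + E.T @ F.T @ H.T)
assert np.linalg.eigvalsh(gap).min() >= -1e-9   # gap is positive semidefinite

# Lemma 3 (Jensen-type inequality), discretized by Riemann sums, step h:
# (int w)^T M (int w) <= gamma * int w^T M w.
M = spd(n)
gamma, N = 2.0, 400
h = gamma / N
w = rng.standard_normal((N, n))       # samples of w(s) on [0, gamma]
iw = h * w.sum(axis=0)
lhs = iw @ M @ iw
rhs = gamma * h * np.einsum('ki,ij,kj->', w, M, w)
assert lhs <= rhs + 1e-9
```

The discrete form of Lemma 3 holds exactly for sums (it is a Cauchy-Schwarz inequality in the inner product induced by $M$), so the check passes for any sample of $\omega$.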
3. Main results

In this section we develop robust delay-dependent exponential stability conditions for the uncertain stochastic neural network (1).

Theorem 1. Given constants $\tau$, $\mu$ and $r$ satisfying (3), the trivial solution of the stochastic neural network (1) is robustly exponentially stable if there exist positive definite matrices $P$, $Q_1$, $Q_2$, $R$, $Z$, matrices $N_i$, $S_i$, $M_i$, $T_i$ $(i=1,2,\ldots,7)$ and scalars $\beta_1>0$, $\beta_2>0$, $\varepsilon_1>0$, $\varepsilon_2>0$ such that the following LMI holds:

$$\begin{bmatrix} U & -N & -T & ME_1+SE_2+\varepsilon_1\chi_1^TG_8+\varepsilon_2\chi_2^TG_9 & MH & SH\\ * & -\frac{1}{\tau}R & 0 & 0 & 0 & 0\\ * & * & -\frac{1}{\tau}R & 0 & 0 & 0\\ * & * & * & -\frac{1}{r}Z+\varepsilon_1G_8^TG_8+\varepsilon_2G_9^TG_9 & 0 & 0\\ * & * & * & * & -\varepsilon_1I & 0\\ * & * & * & * & * & -\varepsilon_2I \end{bmatrix}<0,\qquad(7)$$

where $U=(\Sigma_{ij})_{7\times7}+\varepsilon_1\chi_1^T\chi_1+\varepsilon_2\chi_2^T\chi_2$ and

$\Sigma_{11}=Q_1+Q_2+\beta_1L^TL+N_1+N_1^T-M_1A-A^TM_1^T+S_1C+C^TS_1^T$,
$\Sigma_{12}=-N_1+N_2^T-A^TM_2^T+S_1D+C^TS_2^T+T_1$,
$\Sigma_{13}=P+N_3^T-M_1-A^TM_3^T+C^TS_3^T$,
$\Sigma_{14}=N_4^T-S_1-A^TM_4^T+C^TS_4^T$,
$\Sigma_{15}=N_5^T+M_1W_0-A^TM_5^T+S_1W_2+C^TS_5^T$,
$\Sigma_{16}=N_6^T+M_1W_1-A^TM_6^T+S_1W_3+C^TS_6^T$,
$\Sigma_{17}=N_7^T-A^TM_7^T+C^TS_7^T-T_1$,
$\Sigma_{22}=-(1-\mu)Q_1+\beta_2L^TL-N_2-N_2^T+S_2D+D^TS_2^T+T_2+T_2^T$,
$\Sigma_{23}=-N_3^T-M_2+D^TS_3^T+T_3^T$,
$\Sigma_{24}=-N_4^T-S_2+D^TS_4^T+T_4^T$,
$\Sigma_{25}=-N_5^T+M_2W_0+S_2W_2+D^TS_5^T+T_5^T$,
$\Sigma_{26}=-N_6^T+M_2W_1+S_2W_3+D^TS_6^T+T_6^T$,
$\Sigma_{27}=-N_7^T+T_7^T+D^TS_7^T-T_2$,
$\Sigma_{33}=\tau R-M_3-M_3^T$,
$\Sigma_{34}=-M_4^T-S_3$,
$\Sigma_{35}=M_3W_0-M_5^T+S_3W_2$,
$\Sigma_{36}=M_3W_1-M_6^T+S_3W_3$,
$\Sigma_{37}=-M_7^T-T_3$,
$\Sigma_{44}=P-S_4-S_4^T$,
$\Sigma_{45}=M_4W_0-S_5^T+S_4W_2$,
$\Sigma_{46}=M_4W_1-S_6^T+S_4W_3$,
$\Sigma_{47}=-S_7^T-T_4$,
$\Sigma_{55}=-\beta_1I+rZ+M_5W_0+W_0^TM_5^T+S_5W_2+W_2^TS_5^T$,
$\Sigma_{56}=M_5W_1+W_0^TM_6^T+S_5W_3+W_2^TS_6^T$,
$\Sigma_{57}=W_0^TM_7^T+W_2^TS_7^T-T_5$,
$\Sigma_{66}=-\beta_2I+M_6W_1+W_1^TM_6^T+S_6W_3+W_3^TS_6^T$,
$\Sigma_{67}=W_1^TM_7^T+W_3^TS_7^T-T_6$,
$\Sigma_{77}=-Q_2-T_7-T_7^T$,

$N=[N_1^T\ N_2^T\ N_3^T\ N_4^T\ N_5^T\ N_6^T\ N_7^T]^T$, $S=[S_1^T\ S_2^T\ S_3^T\ S_4^T\ S_5^T\ S_6^T\ S_7^T]^T$, $M=[M_1^T\ M_2^T\ M_3^T\ M_4^T\ M_5^T\ M_6^T\ M_7^T]^T$, $T=[T_1^T\ T_2^T\ T_3^T\ T_4^T\ T_5^T\ T_6^T\ T_7^T]^T$,

$\chi_1=[-G_1\ 0\ 0\ 0\ G_2\ G_3\ 0]$, $\chi_2=[G_4\ G_5\ 0\ 0\ G_6\ G_7\ 0]$.

Proof. For simplicity, denote

$$y(t)=-A(t)x(t)+W_0(t)f(x(t))+W_1(t)f(x(t-\tau(t)))+E_1(t)\int_{t-r(t)}^{t}f(x(s))\,ds,\qquad(8)$$

$$g(t)=C(t)x(t)+D(t)x(t-\tau(t))+W_2(t)f(x(t))+W_3(t)f(x(t-\tau(t)))+E_2(t)\int_{t-r(t)}^{t}f(x(s))\,ds;\qquad(9)$$

then system (1) can be rewritten as

$$dx(t)=y(t)\,dt+g(t)\,dw(t).\qquad(10)$$

Choose a Lyapunov-Krasovskii functional candidate as

$$V(x_t,t)=x^T(t)Px(t)+\int_{t-\tau(t)}^{t}x^T(s)Q_1x(s)\,ds+\int_{t-\tau}^{t}x^T(s)Q_2x(s)\,ds+\int_{-\tau}^{0}\int_{t+\theta}^{t}y^T(s)Ry(s)\,ds\,d\theta+\int_{-r}^{0}\int_{t+\theta}^{t}f^T(x(s))Zf(x(s))\,ds\,d\theta,\qquad(11)$$

where $P>0$, $Q_1>0$, $Q_2>0$, $R>0$ and $Z>0$ are to be determined. For the stochastic process $\{x_t,t\ge 0\}$, Itô's differential formula gives

$$dV(x_t,t)=\mathcal{L}V(x_t,t)\,dt+2x^T(t)Pg(t)\,dw(t),\qquad(12)$$

where

$$\mathcal{L}V(x_t,t)=2x^T(t)Py(t)+g^T(t)Pg(t)+x^T(t)Q_1x(t)-(1-\dot\tau(t))x^T(t-\tau(t))Q_1x(t-\tau(t))+x^T(t)Q_2x(t)-x^T(t-\tau)Q_2x(t-\tau)+\tau y^T(t)Ry(t)-\int_{t-\tau}^{t}y^T(s)Ry(s)\,ds+rf^T(x(t))Zf(x(t))-\int_{t-r}^{t}f^T(x(s))Zf(x(s))\,ds$$
$$\le 2x^T(t)Py(t)+g^T(t)Pg(t)+x^T(t)(Q_1+Q_2)x(t)-x^T(t-\tau)Q_2x(t-\tau)-(1-\mu)x^T(t-\tau(t))Q_1x(t-\tau(t))+\tau y^T(t)Ry(t)-\int_{t-\tau(t)}^{t}y^T(s)Ry(s)\,ds-\int_{t-\tau}^{t-\tau(t)}y^T(s)Ry(s)\,ds+rf^T(x(t))Zf(x(t))-\int_{t-r(t)}^{t}f^T(x(s))Zf(x(s))\,ds.\qquad(13)$$

By Lemma 3 and $0<\tau(t)\le\tau$, $0<\tau-\tau(t)\le\tau$ and $0<r(t)\le r$,

$$-\int_{t-\tau(t)}^{t}y^T(s)Ry(s)\,ds\le-\frac{1}{\tau}\Big(\int_{t-\tau(t)}^{t}y(s)\,ds\Big)^TR\Big(\int_{t-\tau(t)}^{t}y(s)\,ds\Big),\qquad(14)$$

$$-\int_{t-\tau}^{t-\tau(t)}y^T(s)Ry(s)\,ds\le-\frac{1}{\tau}\Big(\int_{t-\tau}^{t-\tau(t)}y(s)\,ds\Big)^TR\Big(\int_{t-\tau}^{t-\tau(t)}y(s)\,ds\Big),\qquad(15)$$

$$-\int_{t-r(t)}^{t}f^T(x(s))Zf(x(s))\,ds\le-\frac{1}{r}\Big(\int_{t-r(t)}^{t}f(x(s))\,ds\Big)^TZ\Big(\int_{t-r(t)}^{t}f(x(s))\,ds\Big).\qquad(16)$$

Let

$$e^T(t)=[x^T(t)\ \ x^T(t-\tau(t))\ \ y^T(t)\ \ g^T(t)\ \ f^T(x(t))\ \ f^T(x(t-\tau(t)))\ \ x^T(t-\tau)].$$

From (8)-(10), the following equations are true for any matrices $N$, $T$, $M$ and $S$ with appropriate dimensions:

$$2e^T(t)N\Big[x(t)-x(t-\tau(t))-\int_{t-\tau(t)}^{t}y(s)\,ds-\int_{t-\tau(t)}^{t}g(s)\,dw(s)\Big]=0,\qquad(17)$$

$$2e^T(t)T\Big[x(t-\tau(t))-x(t-\tau)-\int_{t-\tau}^{t-\tau(t)}y(s)\,ds-\int_{t-\tau}^{t-\tau(t)}g(s)\,dw(s)\Big]=0,\qquad(18)$$

$$2e^T(t)M\Big[-A(t)x(t)+W_0(t)f(x(t))+W_1(t)f(x(t-\tau(t)))+E_1(t)\int_{t-r(t)}^{t}f(x(s))\,ds-y(t)\Big]=0,\qquad(19)$$

$$2e^T(t)S\Big[C(t)x(t)+D(t)x(t-\tau(t))+W_2(t)f(x(t))+W_3(t)f(x(t-\tau(t)))+E_2(t)\int_{t-r(t)}^{t}f(x(s))\,ds-g(t)\Big]=0.\qquad(20)$$

It is obvious from (4) that

$$\beta_1x^T(t)L^TLx(t)-\beta_1f^T(x(t))f(x(t))\ge0,\qquad(21)$$

$$\beta_2x^T(t-\tau(t))L^TLx(t-\tau(t))-\beta_2f^T(x(t-\tau(t)))f(x(t-\tau(t)))\ge0.\qquad(22)$$

Thus, combining (12)-(22), we get

$$dV(x_t,t)\le\zeta^T(t)\Phi\zeta(t)\,dt+\xi(dw(t)),\qquad(23)$$
where

$$\Phi=\begin{bmatrix}(\Sigma_{ij})_{7\times7} & -N & -T & ME_1+SE_2\\ * & -\frac{1}{\tau}R & 0 & 0\\ * & * & -\frac{1}{\tau}R & 0\\ * & * & * & -\frac{1}{r}Z\end{bmatrix}+\Delta\Omega_1+\Delta\Omega_2,$$

$$\Delta\Omega_1=MHF(t)[\chi_1\ 0\ 0\ G_8]+[\chi_1\ 0\ 0\ G_8]^TF^T(t)H^TM^T,$$

$$\Delta\Omega_2=SHF(t)[\chi_2\ 0\ 0\ G_9]+[\chi_2\ 0\ 0\ G_9]^TF^T(t)H^TS^T,$$

$$\zeta^T(t)=\Big[e^T(t)\ \ \int_{t-\tau(t)}^{t}y^T(s)\,ds\ \ \int_{t-\tau}^{t-\tau(t)}y^T(s)\,ds\ \ \int_{t-r(t)}^{t}f^T(x(s))\,ds\Big],$$

$$\xi(dw(t))=2x^T(t)Pg(t)\,dw(t)-2e^T(t)N\int_{t-\tau(t)}^{t}g(s)\,dw(s)-2e^T(t)T\int_{t-\tau}^{t-\tau(t)}g(s)\,dw(s).$$

By Lemma 2, there exist scalars $\varepsilon_1>0$ and $\varepsilon_2>0$ such that the following inequalities hold:

$$\Delta\Omega_1\le\varepsilon_1^{-1}MHH^TM^T+\varepsilon_1[\chi_1\ 0\ 0\ G_8]^T[\chi_1\ 0\ 0\ G_8],\qquad(24)$$

$$\Delta\Omega_2\le\varepsilon_2^{-1}SHH^TS^T+\varepsilon_2[\chi_2\ 0\ 0\ G_9]^T[\chi_2\ 0\ 0\ G_9].\qquad(25)$$

By Lemma 1, it is easily derived that the LMI (7) implies $\Phi<0$. Set $\lambda=\lambda_{\min}\{-\Phi\}$; then from (23) we get

$$E\{dV(x_t,t)\}\le E\{\zeta^T(t)\Phi\zeta(t)\}\,dt\le-\lambda E\{\|x(t)\|^2+\|x(t-\tau(t))\|^2+\|x(t-r(t))\|^2\}\,dt.\qquad(26)$$

Then, using a method similar to that of [31], we can prove that system (1) is robustly exponentially stable. This completes the proof.

Remark 1. In terms of an LMI, Theorem 1 provides a sufficient condition for the exponential stability of the uncertain stochastic neural network with mixed delays (1). For a given $\mu$, it is desirable to find the maximum allowable delays $\tau$ and $r$ that guarantee the robust exponential stability of (1).

Remark 2. By using the free-weighting matrix method [9,10,30] to study the delay-dependent stability of uncertain stochastic neural networks with mixed delays, a new delay-dependent stability condition has been derived which is less conservative than the existing conditions given in [29-31].

In what follows, we show that our result can be specialized to several cases, including those studied in the literature. The following two corollaries are consequences of Theorem 1, and hence their proofs are omitted.

We first consider the case $\Delta E_1(t)=\Delta E_2(t)=0$; that is, the uncertain neural network (1) takes the following form studied in [31]:

$$dx(t)=\Big[-A(t)x(t)+W_0(t)f(x(t))+W_1(t)f(x(t-\tau(t)))+E_1\int_{t-r(t)}^{t}f(x(s))\,ds\Big]\,dt+\Big[C(t)x(t)+D(t)x(t-\tau(t))+W_2(t)f(x(t))+W_3(t)f(x(t-\tau(t)))+E_2\int_{t-r(t)}^{t}f(x(s))\,ds\Big]\,dw(t).\qquad(27)$$

Corollary 1. Given constants $\tau$, $\mu$ and $r$ satisfying (3), the trivial solution of the stochastic neural network (27) is robustly exponentially stable if there exist positive definite matrices $P$, $Q_1$, $Q_2$, $R$, $Z$, matrices $N_i$, $S_i$, $M_i$, $T_i$ $(i=1,2,\ldots,7)$ and scalars $\beta_1>0$, $\beta_2>0$, $\varepsilon_1>0$, $\varepsilon_2>0$ such that the following LMI holds:

$$\begin{bmatrix} U & -N & -T & ME_1+SE_2 & MH & SH\\ * & -\frac{1}{\tau}R & 0 & 0 & 0 & 0\\ * & * & -\frac{1}{\tau}R & 0 & 0 & 0\\ * & * & * & -\frac{1}{r}Z & 0 & 0\\ * & * & * & * & -\varepsilon_1I & 0\\ * & * & * & * & * & -\varepsilon_2I \end{bmatrix}<0,\qquad(28)$$

with $U$ as in Theorem 1.

Remark 3. Corollary 1 provides a less conservative delay-dependent exponential stability condition for a class of uncertain stochastic neural networks with mixed delays. Note that, as discussed in [31], the stability problems studied in [20,21,28] are also special cases of this paper, and the stability conditions of those papers can easily be obtained from Corollary 1.

Next, we consider the case $E_1(t)=E_2(t)=0$; that is, the uncertain neural network (1), without distributed delays, takes the following form studied in [30,31]:

$$dx(t)=[-A(t)x(t)+W_0(t)f(x(t))+W_1(t)f(x(t-\tau(t)))]\,dt+[C(t)x(t)+D(t)x(t-\tau(t))+W_2(t)f(x(t))+W_3(t)f(x(t-\tau(t)))]\,dw(t).\qquad(29)$$

Corollary 2. Given constants $\tau$ and $\mu$, the trivial solution of the stochastic neural network (29) is robustly exponentially stable if there exist positive definite matrices $P$, $Q_1$, $Q_2$, $R$, matrices $N_i$, $S_i$, $M_i$, $T_i$ $(i=1,2,\ldots,7)$ and scalars $\beta_1>0$, $\beta_2>0$, $\varepsilon_1>0$, $\varepsilon_2>0$ such that the following LMI holds:

$$\begin{bmatrix} \bar U & -N & -T & MH & SH\\ * & -\frac{1}{\tau}R & 0 & 0 & 0\\ * & * & -\frac{1}{\tau}R & 0 & 0\\ * & * & * & -\varepsilon_1I & 0\\ * & * & * & * & -\varepsilon_2I \end{bmatrix}<0,\qquad(30)$$

where $\bar U=(\bar\Sigma_{ij})_{7\times7}+\varepsilon_1\chi_1^T\chi_1+\varepsilon_2\chi_2^T\chi_2$, $\bar\Sigma_{ij}=\Sigma_{ij}$ $(i,j=1,2,\ldots,7,\ (i,j)\ne(5,5))$ and $\bar\Sigma_{55}=-\beta_1I+M_5W_0+W_0^TM_5^T+S_5W_2+W_2^TS_5^T$.

Remark 4. In many cases the derivative of the delay is unknown. By choosing $Q_1=0$, Theorem 1 and Corollaries 1 and 2 reduce to delay-dependent, rate-independent stability conditions, which require neither the differentiability (nor even the continuity) of the time-varying delay nor that its derivative be less than 1. Hence, Theorem 1 and Corollaries 1 and 2 are less conservative than the conditions given in [29-31].

4. Numerical examples

In this section, we use three examples to illustrate the effectiveness and reduced conservatism of our results.
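Criteria of this kind are checked numerically as LMI (semidefinite) feasibility problems, e.g., with the Matlab LMI toolbox mentioned in Section 5 or tools such as CVX/CVXPY, and the maximum allowable delay is then located by bisecting over $\tau$ and re-testing feasibility. As a minimal, hypothetical sketch of the underlying computation (not the full condition (7)), the following solves the classical Lyapunov inequality for the delay-free, noise-free core of (1), $dx(t)=-Ax(t)\,dt$, with an illustrative $A$, and certifies stability by checking $P>0$ with SciPy.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

A = np.diag([4.0, 6.0])      # illustrative positive diagonal matrix, as assumed for (1)
Adrift = -A                  # nominal delay-free drift of (1) is -A x(t)
Q = np.eye(2)

# Solve Adrift^T P + P Adrift = -Q; a symmetric P > 0 certifies exponential
# stability of dx = Adrift x dt (the Lyapunov LMI with the inequality
# tightened to an equality against a chosen -Q < 0).
P = solve_continuous_lyapunov(Adrift.T, -Q)
P = (P + P.T) / 2            # symmetrize against round-off
assert np.all(np.linalg.eigvalsh(P) > 0)
```

For the delayed, uncertain system itself, one would instead declare $P$, $Q_1$, $Q_2$, $R$, $Z$, the $N_i$, $S_i$, $M_i$, $T_i$ and the scalars $\beta_i$, $\varepsilon_i$ as decision variables and test feasibility of (7), (28) or (30); bisection over $\tau$ then yields maximum allowable delays such as those reported in the tables of Section 4.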
Example 1. Consider the uncertain stochastic neural network (27) with the parameters of [31, Example 1]; in particular, $A=\mathrm{diag}\{4,6\}$, $E_1=E_2=I$, $L=0.2I$ and $G_i=[1\ 1]$ $(i=1,2,\ldots,7)$, with the remaining weight matrices $C$, $D$, $W_0$, $W_1$, $W_2$, $W_3$ and $H$ as given there.

When $\mu=0.5$ or $\mu$ is unknown, the method of [31] shows that the uncertain stochastic neural network (27) is robustly exponentially stable for $0<\tau\le0.0363$, $0<r\le0.0363$. By setting $Q_1=0$ in Corollary 1, we find that (27) is robustly exponentially stable for $0<\tau\le0.0514$, $0<r\le0.0514$. Fig. 1 shows the state response of the delayed SNN (27).

Fig. 1. State response of SNNs (27): components $x_1(t)$ and $x_2(t)$ for $t\in[0,3]$ s.

Example 2. Consider the uncertain stochastic neural network (29) with two-dimensional parameters including $A=\mathrm{diag}\{3,4\}$, $W_2=W_3=0.1I$, $L=0.1I$ and $G_i=[0.1\ 0.1]$ $(i=1,2,\ldots,7)$.

For different $\mu$, Corollary 2 gives the maximum time delay $\tau$ for which the uncertain stochastic neural network (29) is robustly exponentially stable. When $\mu$ is unknown, we obtain the allowable maximum delay $\tau$ by setting $Q_1=0$ in Corollary 2. For comparisons of our results with [30,31], see Table 1. Fig. 2 shows the state response of the delayed SNN (29).

Table 1
Maximum allowable delay $\tau$ for different $\mu$.

                    $\mu=0$    $\mu=0.5$   $\mu=0.9$   unknown $\mu$
Chen et al. [30]    0.1444     0.0965      0.0543      0.0543
Li et al. [31]      0.4153     0.3051      0.1675      0.1675
Corollary 2         0.4153     0.3259      0.2368      0.2368

In particular, when $\Delta C(t)=\Delta D(t)=W_2(t)=W_3(t)=0$, the uncertain stochastic neural network becomes the one studied in [29], which is found there to be globally stable for all $0\le\tau\le0.0317$. Corollary 2 of this paper, Theorem 2 of [30] and Theorem 2 of [31] all provide sufficient conditions for the exponential stability of this uncertain stochastic neural network; for comparisons of our results with [29-31], see Table 2.

Table 2
Maximum allowable delay $\tau$ for different $\mu$.

                    $\mu=0$    $\mu=0.5$   $\mu=0.9$   unknown $\mu$
Huang et al. [29]   -          -           -           0.0317
Chen et al. [30]    0.1464     0.1027      0.0579      0.0579
Li et al. [31]      0.4199     0.3198      0.1747      0.1747
Corollary 2         0.4199     0.3391      0.2470      0.2470

Fig. 2. State response of SNNs (29): components $x_1(t)$ and $x_2(t)$ for $t\in[0,3]$ s.

Example 3. Consider the uncertain stochastic neural network (1) with two-dimensional parameters including $A=\mathrm{diag}\{5,7\}$, $L=0.3I$ and $G_i=[1\ 1]$ $(i=1,2,\ldots,9)$.

By Theorem 1, when $\mu=0$, we conclude that the uncertain stochastic neural network (1) is robustly exponentially stable for $0<\tau\le1.6414$, $0<r\le1.6414$. When $\mu=0.5$, it is robustly exponentially stable for $0<\tau\le0.281$, $0<r\le0.281$. When $\mu$ is unknown, by setting $Q_1=0$ in Theorem 1, we find that (1) is robustly exponentially stable for $0<\tau\le0.207$, $0<r\le0.207$. Fig. 3 shows the state response of the delayed SNN (1).

Fig. 3. State response of SNNs (1): components $x_1(t)$ and $x_2(t)$ for $t\in[0,3]$ s.
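State responses like those in Figs. 1-3 can be generated with an Euler-Maruyama scheme. The sketch below simulates a simplified version of (1): constant delay $\tau$, a tanh activation (which satisfies Assumption 1 with $L=I$), and no distributed-delay or uncertainty terms. All matrix values are illustrative rather than the paper's.

```python
import numpy as np

rng = np.random.default_rng(1)
A = np.diag([4.0, 6.0])                  # illustrative, positive diagonal
W0 = np.array([[0.3, -0.2], [0.1, 0.4]])
W1 = np.array([[-0.2, 0.1], [0.2, -0.3]])
C = 0.3 * np.eye(2)                      # diffusion term g(t) = C x(t)

tau, dt, T = 0.2, 1e-3, 3.0
d = int(round(tau / dt))                 # delay measured in steps
steps = int(round(T / dt))
x = np.empty((steps + 1, 2))
x[:d + 1] = [1.5, -2.0]                  # constant initial function phi

for k in range(d, steps):
    drift = -A @ x[k] + W0 @ np.tanh(x[k]) + W1 @ np.tanh(x[k - d])
    dw = rng.standard_normal() * np.sqrt(dt)     # scalar Brownian increment
    x[k + 1] = x[k] + drift * dt + (C @ x[k]) * dw

assert np.all(np.isfinite(x))            # trajectory is well defined
```

Plotting the two components $x_1(t)$, $x_2(t)$ of `x` (e.g., with matplotlib) yields decaying sample paths qualitatively similar to Figs. 1-3, since the linear drift here is strongly contracting.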
5. Conclusions

The robust exponential stability of uncertain stochastic neural networks with mixed delays has been investigated in this paper. Based on a new Lyapunov-Krasovskii functional method, an LMI approach has been developed to establish exponential stability conditions for uncertain stochastic neural networks with mixed delays; these conditions can be tested easily using the Matlab LMI toolbox. New stability conditions have been derived by introducing freely chosen matrices, and the conditions depend on the size of the time delays. Numerical examples have been presented to illustrate the effectiveness and reduced conservatism of our results compared with existing ones.
Acknowledgments

This work was supported by the Fundamental Research Funds for the Central Universities under Grant No. 2010B23014, the Natural Science Foundation of China under Grant Nos. 61074056 and 60874114, and NSERC Canada.

References

[1] S. Haykin, Neural Networks: A Comprehensive Foundation, Prentice Hall, New Jersey, 1998.
[2] G. Joya, M.A. Atencia, F. Sandoval, Hopfield neural networks for optimization: study of the different dynamics, Neurocomputing 43 (2002) 219–237.
[3] S.S. Young, P.D. Scott, N.M. Nasrabadi, Object recognition using multilayer Hopfield neural network, IEEE Trans. Image Process. 6 (1997) 357–372.
[4] J. Cao, H.X. Li, L. Han, Novel results concerning global robust stability of delayed neural networks, Nonlinear Anal.: R. World Appl. 7 (2006) 458–469.
[5] J. Cao, K. Yuan, H.X. Li, Global asymptotical stability of recurrent neural networks with multiple discrete delays and distributed delays, IEEE Trans. Neural Networks 17 (2006) 1646–1651.
[6] X.Z. Liu, K.L. Teo, B.J. Xu, Exponential stability of impulsive high-order Hopfield-type neural networks with time-varying delays, IEEE Trans. Neural Networks 16 (2005) 1329–1339.
[7] Q. Wang, X.Z. Liu, Exponential stability of impulsive cellular neural networks with time delay via Lyapunov functionals, Appl. Math. Comput. 194 (2007) 186–198.
[8] S. Xu, Y. Chu, J. Lu, New results on global exponential stability of recurrent neural networks with time-varying delays, Phys. Lett. A 352 (2006) 371–379.
[9] Y. He, G.P. Liu, D. Rees, New delay-dependent stability criteria for neural networks with time-varying delay, IEEE Trans. Neural Networks 18 (2007) 310–314.
[10] Y. He, G.P. Liu, D. Rees, M. Wu, Stability analysis for neural networks with time-varying interval delay, IEEE Trans. Neural Networks 18 (2007) 1850–1854.
[11] W.-H. Chen, X. Lu, New delay-dependent exponential stability criteria for neural networks with variable delays, Phys. Lett. A 351 (2006) 53–58.
[12] W.-H. Chen, X. Lu, D.-Y. Liang, Global exponential stability for discrete-time neural networks with variable delays, Phys. Lett. A 358 (2006) 186–198.
[13] S. Blythe, X. Mao, X. Liao, Stability of stochastic delay neural networks, J. Franklin Inst. 338 (2001) 481–495.
[14] H. Huang, J. Cao, Exponential stability analysis of uncertain stochastic neural networks with multiple delays, Nonlinear Anal.: R. World Appl. 8 (2007) 646–653.
[15] J.H. Zhang, P. Shi, J.Q. Qiu, Novel robust stability criteria for uncertain stochastic Hopfield neural networks with time-varying delays, Nonlinear Anal.: R. World Appl. 8 (2007) 1349–1357.
[16] L. Wan, J. Sun, Mean square exponential stability of stochastic delayed Hopfield neural networks, Phys. Lett. A 343 (2005) 306–318.
[17] Z. Wang, H. Shu, J. Fang, X. Liu, Robust stability for stochastic Hopfield neural networks with time delays, Nonlinear Anal.: R. World Appl. 7 (2006) 1119–1128.
[18] D.W. Tank, J.J. Hopfield, Neural computation by concentrating information in time, Proc. Natl. Acad. Sci. 84 (1987) 1896–1900.
[19] B. De Vries, J.C. Principe, The gamma model—a new neural network for temporal processing, Neural Networks 5 (1992) 565–576.
[20] Z. Wang, S. Lauria, J. Fang, X. Liu, Exponential stability of uncertain stochastic neural networks with mixed time-delays, Chaos Solitons Fractals 32 (2007) 62–72.
[21] Z. Wang, J. Fang, X. Liu, Global stability of stochastic high-order neural networks with discrete and distributed delays, Chaos Solitons Fractals 36 (2008) 388–396.
[22] Z. Wang, Y. Liu, M. Li, X. Liu, Stability analysis for stochastic Cohen–Grossberg neural networks with mixed time delays, IEEE Trans. Neural Networks 17 (2006) 814–820.
[23] Z. Wang, Y. Liu, K. Fraser, X. Liu, Stochastic stability of uncertain Hopfield neural networks with discrete and distributed delays, Phys. Lett. A 354 (2006) 288–297.
[24] Y. Liu, Z. Wang, X. Liu, An LMI approach to stability analysis of stochastic high-order Markovian jumping neural networks with mixed time delays, Nonlinear Anal.: Hybrid Syst. 2 (2008) 110–120.
[25] Y. Liu, Z. Wang, X. Liu, On global exponential stability of generalized stochastic neural networks with mixed time-delays, Neurocomputing 70 (2006) 314–326.
[26] W. Su, Y. Chen, Global robust stability criteria of stochastic Cohen–Grossberg neural networks with discrete and distributed time-varying delays, Commun. Nonlinear Sci. Numer. Simulation 14 (2009) 520–528.
[27] H. Zhang, Y. Wang, Stability analysis of Markovian jumping stochastic Cohen–Grossberg neural networks with mixed time delays, IEEE Trans. Neural Networks 19 (2008) 366–370.
[28] J. Zhang, P. Shi, J. Qiu, H. Yang, A new criterion for exponential stability of uncertain stochastic neural networks with mixed delays, Math. Comput. Modelling 47 (2008) 1042–1051.
[29] H. Huang, G. Feng, Delay-dependent stability for uncertain stochastic neural networks with time-varying delay, Physica A 381 (2007) 93–103.
[30] W.H. Chen, X.M. Lu, Mean square exponential stability of uncertain stochastic delayed neural networks, Phys. Lett. A 372 (2008) 1061–1069.
[31] H. Li, B. Chen, Q. Zhou, S. Fang, Robust exponential stability for uncertain stochastic neural networks with discrete and distributed time-varying delays, Phys. Lett. A 372 (2008) 3385–3394.
[32] X. Mao, Stochastic Differential Equations and Their Applications, Horwood, Chichester, 1997.
[33] S. Boyd, L. El Ghaoui, E. Feron, V. Balakrishnan, Linear Matrix Inequalities in System and Control Theory, SIAM, Philadelphia, PA, 1994.
[34] Y. Wang, L. Xie, C.E. de Souza, Robust control of a class of uncertain nonlinear systems, Syst. Control Lett. 19 (1992) 139–149.
[35] K. Gu, V.L. Kharitonov, J. Chen, Stability of Time-Delay Systems, Birkhäuser, Boston, 2003.
Feiqi Deng received his Ph.D. degree in Control Theory and Control Engineering from South China University of Technology in June 1997. From July 1997 to September 1999 he had been an associate professor in the College of Automation Science and Engineering, South China University of Technology. Since October 1999 he has been a professor and the director of the Systems Engineering Institute of College of Automation Science and Engineering, South China University of Technology. Since 1997 he also served concurrently as a professor of the Department of Automatic Control, Yanshan University. His main research interests include stability, stabilization and robust and variable structure control theory of complex systems, including time-delay systems, nonlinear systems and stochastic systems, and machine learning. In these areas,
he has authored/co-authored more than 400 journal papers and a book in English and Chinese. From 2002 he has been serving as a vice chief editor of Journal of South China University of Technology and as an editor of Control Theory and Applications and Journal of Systems Engineering and Electronics.
Mingang Hua was born in Jiangsu Province, China, in 1980. He is now a Ph.D. candidate in Automatic Control Theory and Engineering at South China University of Technology, Guangzhou, China. His current research interests include stochastic systems and stochastic neural networks.

Xinzhi Liu received the B.Sc. degree in Mathematics from Shandong Normal University, Jinan, China in 1982, and the M.Sc. and Ph.D. degrees, both in Applied Mathematics, from the University of Texas, Arlington in 1987 and 1988, respectively. He was a Post-Doctoral Fellow at the University of Alberta, Edmonton, AB, Canada from 1988 to 1990. He joined the Department of Applied Mathematics, University of Waterloo, Waterloo, ON, Canada in 1990, where he became an associate professor in 1994 and a professor in 1997. His research areas include systems analysis, stability theory, hybrid dynamical systems, impulsive control, chaos synchronization, nonlinear oscillations, artificial neural networks, and communication security. He is the author or coauthor of over 200 research articles, 2 research monographs and 10 edited books. He is the chief editor of the journal DCDIS Series A: Mathematical Analysis, the chief editor of the journal DCDIS Series B: Applications and Algorithms, and an associate editor of several other scientific journals. Dr. Liu has served as general chair for several international scientific conferences.

Yunjian Peng was born in Hunan Province, China in 1974. He obtained his Ph.D. degree in Automatic Control Theory and Engineering from the South China University of Technology (SCUT), China in 2007. He is currently a lecturer at SCUT. His current research interests are stability analysis and control theory of stochastic dynamic systems.

Juntao Fei received the B.S. degree in Electrical Engineering from Hefei University of Technology, China in 1991, the M.S. degree in Electrical Engineering from the University of Science and Technology of China in 1998, and the M.S. and Ph.D. degrees in Mechanical Engineering from The University of Akron, OH, USA in 2003 and 2007, respectively. During 2002-2003, he was a visiting scholar at the University of Virginia, VA, USA. He was a postdoctoral research fellow and visiting assistant professor at the University of Louisiana, LA, USA from 2007 to 2009. He is currently a professor in the College of Computer and Information at Hohai University, China. He has authored and/or coauthored more than 60 refereed technical papers and 2 books. His research interests include adaptive control, nonlinear control, intelligent control, dynamics and control of MEMS, and smart materials and structures.