ISA Transactions ∎ (∎∎∎∎) ∎∎∎–∎∎∎
Contents lists available at ScienceDirect
ISA Transactions journal homepage: www.elsevier.com/locate/isatrans
Delay-dependent exponential passivity of uncertain cellular neural networks with discrete and distributed time-varying delays

Yuanhua Du a,*, Shouming Zhong a, Jia Xu b, Nan Zhou c

a School of Mathematical Sciences, University of Electronic Science and Technology of China, Chengdu, Sichuan 611731, PR China
b Department of Financial Affairs Office, Sichuan University of Arts and Science of China, Dazhou, Sichuan 635000, PR China
c School of Automation Engineering, University of Electronic Science and Technology of China, Chengdu, Sichuan 611731, PR China
Article info
Article history: Received 25 February 2014; Received in revised form 3 July 2014; Accepted 15 November 2014

Abstract
This paper is concerned with delay-dependent exponential passivity analysis for uncertain cellular neural networks with discrete and distributed time-varying delays. By decomposing the delay interval into multiple equidistant subintervals and multiple nonuniform subintervals, suitable augmented Lyapunov–Krasovskii functionals are constructed on these intervals. A set of novel sufficient conditions is obtained to guarantee exponential passivity of the considered system. Finally, two numerical examples are provided to demonstrate the effectiveness of the proposed results. © 2014 ISA. Published by Elsevier Ltd. All rights reserved.

Keywords: Cellular neural networks; Exponential passivity; Linear matrix inequality; Delay decomposition; Lyapunov–Krasovskii functional; Time-varying delays
1. Introduction

As is well known, cellular neural networks (CNNs), proposed by Chua and Yang in [1,2], have been extensively studied in both theory and applications. Parameter uncertainties are commonly encountered in modeling cellular neural networks because of inaccuracies and changes in the environment of the model, and they can destroy the stability of the system. More specifically, the connection weights of the neurons depend on certain resistance and capacitance values that inevitably introduce uncertainties during the parameter identification process; deviations and perturbations in parameters are the main sources of uncertainty. It is therefore important to investigate the dynamical behavior of cellular neural networks with uncertainty (see, e.g., [3–8] and the references therein). On the other hand, passivity theory originated in circuit theory, and it plays an important role in the analysis and design of linear and nonlinear systems, especially high-order systems [9]. The essence of passivity theory is that the passive properties of a system can keep the system internally stable. Very recently, the exponential passivity of neural networks with time-varying delays has been studied in [10–12], where sufficient conditions have been obtained for the considered neural networks
* Corresponding author. Tel.: +86 28 61831290; fax: +86 28 61831280. E-mail address: [email protected] (Y. Du).
to be exponentially passive. However, little attention has been paid to distributed time-varying delays. Moreover, delay-dependent stability conditions, which contain information on the time delay, are usually less conservative than delay-independent ones, especially for neural networks with small delays. More general delay decomposition approaches were studied in [16–19], where a piecewise delay method was first proposed. In this paper, in order to obtain less conservative sufficient conditions, we first decompose the delay interval $[-h_u, 0]$ into $[-h_u, -h_u/2]$ and $[-h_u/2, 0]$. Second, we decompose $[-h_u/2, 0]$ into $N$ equidistant subintervals, $[-h_u/2, 0] = \bigcup_{j=1}^{N} [-j\delta, -(j-1)\delta]$ with $\delta = h_u/2N$, and choose different weighting matrices on each subinterval. Finally, we decompose $[-h_u, -h_u/2]$ into $M$ nonuniform subintervals, $[-h_u, -h_u/2] = \bigcup_{i=1}^{M} [-h_i, -h_{i-1}]$, with scalars satisfying $h_u/2 = h_0 < h_1 < \cdots < h_M = h_u$, again choosing different weighting matrices on each subinterval. Up to now, there have been almost no results on delay-dependent exponential passivity of cellular neural networks with discrete and distributed time-varying delays, which motivates this work. Motivated by the above discussion, we study the problem of delay-dependent exponential passivity for a class of cellular neural networks with discrete and distributed time-varying delays. The aim of this paper is to derive new sufficient conditions for such systems, based on a Lyapunov functional with a novel delay decomposition. We also provide two numerical examples to demonstrate the effectiveness of the proposed results.
http://dx.doi.org/10.1016/j.isatra.2014.11.005 0019-0578/& 2014 ISA. Published by Elsevier Ltd. All rights reserved.
Please cite this article as: Du Y, et al. Delay-dependent exponential passivity of uncertain cellular neural networks with discrete and distributed time-varying delays. ISA Transactions (2015), http://dx.doi.org/10.1016/j.isatra.2014.11.005i
2. Model description and preliminaries

Consider the cellular neural networks with discrete and distributed time-varying delays described by the state equations

\[
\begin{cases}
\dot{x}(t) = -Ax(t) + B_0\, g(x(t)) + B_1\, g(x(t-h(t))) + B_2 \int_{t-\tau(t)}^{t} g(x(s))\,ds + u(t),\\
y(t) = g(x(t)) + g(x(t-h(t))) + u(t),
\end{cases} \tag{1}
\]

where $x(t) = [x_1(t), x_2(t), \ldots, x_n(t)]^T \in \mathbb{R}^n$ is the neuron state vector, $y(t) \in \mathbb{R}^n$ is the output vector of the network, $n$ denotes the number of neurons, $g(x(t)) = [g_1(x_1(t)), g_2(x_2(t)), \ldots, g_n(x_n(t))]^T \in \mathbb{R}^n$ denotes the activation function, $A = \mathrm{diag}(a_1, a_2, \ldots, a_n)$ is a positive diagonal matrix, $B_0 = (b^0_{ij})_{n\times n}$, $B_1 = (b^1_{ij})_{n\times n}$ and $B_2 = (b^2_{ij})_{n\times n}$ are the interconnection matrices representing the weight coefficients of the neurons, and $u(t) \in \mathbb{R}^n$ is an external input vector to the neurons. The delays $h(t)$ and $\tau(t)$ are time-varying and satisfy

\[
0 \le \tau(t) \le \tau_u, \qquad 0 \le h(t) \le h_u, \qquad \dot{h}(t) \le h_D.
\]

The activation functions $g_j(\cdot)$ $(j = 1, 2, \ldots, n)$ are assumed to be nondecreasing, bounded and globally Lipschitz; that is,

\[
0 \le \frac{g_j(\xi_1) - g_j(\xi_2)}{\xi_1 - \xi_2} \le l_j, \qquad g_j(0) = 0, \tag{2}
\]

for all $\xi_1, \xi_2 \in \mathbb{R}$ with $\xi_1 \ne \xi_2$, where $l_j > 0$ $(j = 1, 2, \ldots, n)$; we denote $L = \mathrm{diag}(l_1, l_2, \ldots, l_n)$. It is noted from [4] that $g_j(\cdot)$ then also satisfies

\[
0 \le \frac{g_j(\xi_j)}{\xi_j} \le l_j, \qquad \forall\, \xi_j \ne 0,\ j = 1, \ldots, n. \tag{3}
\]

In this paper, we deal with the problem of exponential passivity for system (1). The proposed criterion is delay-dependent with respect to both $\tau(t)$ and $h(t)$ and provides a convergence rate guaranteeing the exponential passivity of system (1). Before deriving our main results, we need the following definition and lemmas.

Definition 2.1. The neural network is said to be exponentially passive from input $u(t)$ to output $y(t)$ if there exist an exponential Lyapunov function (also called an exponential storage function) $V(x_t)$ and a constant $\rho > 0$ such that for all $u(t)$, all initial conditions $x(t_0)$ and all $t \ge t_0$,

\[
\dot{V}(x_t) + \rho V(x_t) \le 2\, y^T(t)\, u(t). \tag{4}
\]
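To make the model concrete, the dynamics (1) can be exercised numerically. The sketch below simulates a two-neuron instance of (1) with forward Euler integration; all matrices and delay choices are hypothetical illustrative values (not parameters from this paper), and g = tanh is used because it satisfies the sector conditions (2)–(3) with L = I.

```python
import numpy as np

# Euler simulation of a two-neuron instance of model (1).  The matrices
# below are hypothetical illustrative values (not from the paper);
# g = tanh satisfies the sector conditions (2)-(3) with L = I.
def simulate(T=20.0, dt=0.001):
    A = np.diag([2.0, 3.0])                       # positive diagonal A
    B0 = np.array([[0.2, -0.1], [0.1, 0.2]])
    B1 = np.array([[0.1, 0.0], [0.0, 0.1]])
    B2 = np.array([[0.05, 0.0], [0.0, 0.05]])
    hu, tau_u = 0.5, 0.5                          # delay bounds
    g = np.tanh
    n = int(T / dt)
    x = np.zeros((n + 1, 2))
    x[0] = [0.2, -0.1]
    for k in range(n):
        t = k * dt
        h = (hu / 2) * (1 + np.sin(t)) / 2        # time-varying h(t) in [0, hu/2]
        kh = max(k - int(h / dt), 0)
        ktau = max(k - int(tau_u / dt), 0)
        dist = g(x[ktau:k + 1]).sum(axis=0) * dt  # approximates the distributed term
        dx = -A @ x[k] + B0 @ g(x[k]) + B1 @ g(x[kh]) + B2 @ dist
        x[k + 1] = x[k] + dt * dx                 # forward Euler, u(t) = 0
    return x

x = simulate()
print(np.linalg.norm(x[-1]) < 1e-3)  # True: the state decays toward the origin
```

With $u(t) \equiv 0$ and a dominant $-A$ term, the trajectory decays, which is the internal-stability behavior that passivity analysis is designed to certify.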
Here $\dot{V}(x_t)$ denotes the total derivative of $V(x_t)$ along the state trajectories $x(t)$ of system (1). The parameter $\rho$ carries exponential convergence information through an upper bound on the exponential Lyapunov function: the larger $\rho$ is, the tighter the resulting bound on the Lyapunov function.

Lemma 2.1 (Boyd et al. [14], Schur complement). Given constant matrices $Z_1, Z_2, Z_3$ with $Z_1 = Z_1^T$ and $Z_2 = Z_2^T > 0$, then $Z_1 + Z_3^T Z_2^{-1} Z_3 < 0$ if and only if

\[
\begin{bmatrix} Z_1 & Z_3^T \\ Z_3 & -Z_2 \end{bmatrix} < 0
\quad\text{or}\quad
\begin{bmatrix} -Z_2 & Z_3 \\ Z_3^T & Z_1 \end{bmatrix} < 0.
\]

Lemma 2.2 (Sun et al. [15]). For any constant matrix $M > 0$, any scalars $a$ and $b$ with $a < b$, and a vector function $x : [a,b] \to \mathbb{R}^n$ such that the integrals concerned are well defined,

\[
\left[\int_a^b x(s)\,ds\right]^T M \left[\int_a^b x(s)\,ds\right] \le (b-a) \int_a^b x^T(s)\, M\, x(s)\,ds.
\]

Lemma 2.3. For all real vectors $a, b$ and any matrix $Q > 0$ with appropriate dimensions,

\[
2\, a^T b \le a^T Q^{-1} a + b^T Q b.
\]

Lemma 2.4 (Zhang and Han [13]). For any constant matrix $X \in \mathbb{R}^{n\times n}$ with $X = X^T > 0$, a scalar $h := h(t) > 0$, and a vector-valued function $\dot{x} : [-h, 0] \to \mathbb{R}^n$ such that the following integrations are well defined,

\[
-h \int_{-h}^{0} \dot{x}^T(t+s)\, X\, \dot{x}(t+s)\,ds \le
\begin{bmatrix} x(t) \\ x(t-h) \end{bmatrix}^T
\begin{bmatrix} -X & X \\ * & -X \end{bmatrix}
\begin{bmatrix} x(t) \\ x(t-h) \end{bmatrix}.
\]

Lemma 2.5 (Zhu et al. [10]). Given matrices $Q = Q^T$, $H$, $E$ and $R = R^T > 0$ with appropriate dimensions, then

\[
Q + HFE + E^T F^T H^T < 0
\]

holds for all $F$ satisfying $F^T F \le R$ if and only if there exists an $\varepsilon > 0$ such that

\[
Q + \varepsilon H H^T + \varepsilon^{-1} E^T R E < 0.
\]

3. Main results

In this section, we propose a new exponential passivity criterion for the cellular neural networks with discrete and distributed time-varying delays.

Theorem 3.1. For given scalars $h_u > 0$, $\tau_u > 0$, $\rho > 0$ and $h_D < 1$, and integers $N, M \ge 1$ with $\delta = h_u/2N$ and grid points $h_u/2 = h_0 < h_1 < \cdots < h_M = h_u$, the delayed cellular neural network (1) is exponentially passive if there exist a positive diagonal matrix $P_2$, positive definite matrices $P_1$, $U$, $Q_j$, $M_j$, $R_j$ $(j = 1, 2, \ldots, N)$, $\begin{bmatrix} X_i & Y_i \\ * & Z_i \end{bmatrix}$, $W_i$ $(i = 1, 2, \ldots, M)$, $N_1$, $N_2$, and positive diagonal matrices $K_j$ $(j = 0, 1, \ldots, N)$, $K_t$ and $S_i$ $(i = 0, 1, \ldots, M)$ with appropriate dimensions such that the following LMIs hold:

\[
\Phi_1 = \begin{bmatrix} \Delta + \Xi_1 & \Gamma & \Theta_1 \\ * & \Lambda & \Theta_2 \\ * & * & -R \end{bmatrix} < 0, \tag{5}
\]

\[
\Phi_2 = \begin{bmatrix} \Delta + \Xi_2 & \Gamma & \Theta_1 \\ * & \Lambda & \Theta_2 \\ * & * & -R \end{bmatrix} < 0, \tag{6}
\]

where $\Delta = [\vartheta_{ij}]_{(N+M+3)\times(N+M+3)}$ with

\[
\vartheta_{ij} = \begin{cases}
-P_1 A - A P_1 + Q_1 - e^{-\rho\delta} R_1 + N_1 + \rho P_1, & i = j = 1,\\
e^{-\rho(i-1)\delta}(Q_i - Q_{i-1}) - e^{-\rho i\delta} R_i - e^{-\rho(i-1)\delta} R_{i-1}, & i = j = 2, 3, \ldots, N,\\
-e^{-\rho N\delta}(R_N + Q_N), & i = j = N+1,\\
-(1-h_D)\, e^{-\rho h_u} N_1, & i = j = N+2,\\
-e^{-\rho h_0} X_1 - e^{-\rho h_1} W_1, & i = j = N+3,\\
e^{-\rho h_{i-N-3}}(X_{i-N-2} - X_{i-N-3}) - e^{-\rho h_{i-N-2}} W_{i-N-2} - e^{-\rho h_{i-N-3}} W_{i-N-3}, & i = j = N+4, \ldots, N+M+2,\\
-e^{-\rho h_M}(X_M + W_M), & i = j = N+M+3,\\
e^{-\rho i\delta} R_i, & j = i+1,\ i \in \{1, 2, \ldots, N-1\},\\
e^{-\rho h_{i-N-2}} W_{i-N-2}, & j = i+1,\ i \in \{N+3, \ldots, N+M+2\},\\
0, & \text{otherwise}.
\end{cases}
\]
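The Schur complement of Lemma 2.1 is what turns the nonlinear matrix condition behind (5)–(6) into a linear matrix inequality. The equivalence can be exercised numerically; the random matrices below are purely illustrative.

```python
import numpy as np

# Numerical sanity check of the Schur complement (Lemma 2.1):
#   Z1 + Z3^T Z2^{-1} Z3 < 0   iff   [[Z1, Z3^T], [Z3, -Z2]] < 0,
# tested on random symmetric matrices (illustrative, not from the paper).
rng = np.random.default_rng(0)

def is_neg_def(S):
    return bool(np.all(np.linalg.eigvalsh((S + S.T) / 2) < 0))

agree = True
for _ in range(200):
    G = rng.normal(size=(3, 3))
    Z2 = G @ G.T + np.eye(3)                      # Z2 = Z2^T > 0
    Z3 = rng.normal(size=(3, 3))
    Z1 = rng.normal(size=(3, 3))
    Z1 = (Z1 + Z1.T) / 2 - 2 * np.eye(3)          # Z1 = Z1^T, shifted
    small = is_neg_def(Z1 + Z3.T @ np.linalg.inv(Z2) @ Z3)
    big = is_neg_def(np.block([[Z1, Z3.T], [Z3, -Z2]]))
    agree &= (small == big)
print(agree)  # True: the two conditions always coincide
```

The same reduction is applied twice in this paper: once to absorb the $\dot{x}^T R \dot{x}$ term into (5)–(6), and once more in Theorem 3.2 to absorb the uncertainty terms.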
The remaining matrices in (5) and (6) are defined as follows:

\[
\Gamma = [\iota_{ij}]_{(N+M+3)\times(N+M+5)}, \quad
\iota_{ij} = \begin{cases}
P_1 B_0 - A^T P_2 + L K_0 + \rho P_2, & i = j = 1,\\
L K_{i-1}, & i = j = 2, 3, \ldots, N+1,\\
L K_t, & i = j = N+2,\\
-e^{-\rho h_0} Y_1 + L S_0, & i = j = N+3,\\
e^{-\rho h_{i-N-3}}(Y_{i-N-2} - Y_{i-N-3}) + L S_{i-N-3}, & i = j = N+4, \ldots, N+M+2,\\
-e^{-\rho h_M} Y_M + L S_M, & i = j = N+M+3,\\
P_1 B_1, & i = 1,\ j = N+2,\\
P_1 B_2, & i = 1,\ j = N+M+4,\\
P_1, & i = 1,\ j = N+M+5,\\
0, & \text{otherwise},
\end{cases}
\]

\[
\Lambda = [\lambda_{ij}]_{(N+M+5)\times(N+M+5)}, \quad
\lambda_{ij} = \begin{cases}
M_1 + \tau_u^2 U + P_2 B_0 + B_0^T P_2 + N_2 - 2K_0, & i = j = 1,\\
e^{-\rho(i-1)\delta}(M_i - M_{i-1}) - 2K_{i-1}, & i = j = 2, 3, \ldots, N,\\
-e^{-\rho N\delta} M_N - 2K_N, & i = j = N+1,\\
-2K_t - (1-h_D)\, e^{-\rho h_u} N_2, & i = j = N+2,\\
-e^{-\rho h_0} Z_1 - 2S_0, & i = j = N+3,\\
e^{-\rho h_{i-N-3}}(Z_{i-N-2} - Z_{i-N-3}) - 2S_{i-N-3}, & i = j = N+4, \ldots, N+M+2,\\
-e^{-\rho h_M} Z_M - 2S_M, & i = j = N+M+3,\\
-e^{-\rho\tau_u} U, & i = j = N+M+4,\\
-2I, & i = j = N+M+5,\\
P_2 B_1, & i = 1,\ j = N+2,\\
P_2 B_2, & i = 1,\ j = N+M+4,\\
P_2 - I, & i = 1,\ j = N+M+5,\\
-I, & i = N+2,\ j = N+M+5,\\
0, & \text{otherwise},
\end{cases}
\]

\[
\Xi_1 = [\psi_{ij}]_{(N+M+3)\times(N+M+3)}, \quad
\psi_{ij} = \begin{cases}
-2e^{-\rho k\delta} R_k, & i = j = N+2,\\
-e^{-\rho k\delta} R_k, & i = k,\ j = k+1,\\
e^{-\rho k\delta} R_k, & i \in \{k, k+1\},\ j = N+2,\\
0, & \text{otherwise},
\end{cases}
\]

\[
\Xi_2 = [\varphi_{ij}]_{(N+M+3)\times(N+M+3)}, \quad
\varphi_{ij} = \begin{cases}
-2e^{-\rho h_\kappa} W_\kappa, & i = j = N+2,\\
e^{-\rho h_\kappa} W_\kappa, & i = N+2,\ j \in \{N+\kappa+2,\ N+\kappa+3\},\\
-e^{-\rho h_\kappa} W_\kappa, & i = N+\kappa+2,\ j = N+\kappa+3,\\
0, & \text{otherwise},
\end{cases}
\]

where $k \in \{1, \ldots, N\}$ and $\kappa \in \{1, \ldots, M\}$ index the subintervals containing $h(t)$ (see the proof), and

\[
R = \delta^2 (R_1 + R_2 + \cdots + R_N) + \sum_{i=1}^{M} \hat{h}_i^2 W_i, \qquad \hat{h}_i = h_i - h_{i-1},
\]
\[
\Theta_1 = [-A\ \ 0_{n\times(N+M+2)n}]^T, \qquad
\Theta_2 = [B_0\ \ 0_{n\times Nn}\ \ B_1\ \ 0_{n\times(M+1)n}\ \ B_2\ \ I].
\]

Proof. Consider the following Lyapunov–Krasovskii functional:
\[
V(x_t) = \sum_{i=1}^{5} V_i(x_t), \tag{7}
\]

where

\[
V_1(x_t) = x^T(t)\, P_1\, x(t) + 2 \sum_{j=1}^{n} p_{2j} \int_{0}^{x_j(t)} g_j(s)\,ds,
\]
\[
V_2(x_t) = \sum_{j=1}^{N} \int_{t-j\delta}^{t-(j-1)\delta} e^{-\rho(t-s)}
\begin{bmatrix} x(s) \\ g(x(s)) \end{bmatrix}^T
\begin{bmatrix} Q_j & 0 \\ 0 & M_j \end{bmatrix}
\begin{bmatrix} x(s) \\ g(x(s)) \end{bmatrix} ds
+ \sum_{j=1}^{N} \delta \int_{-j\delta}^{-(j-1)\delta} \int_{t+\theta}^{t} e^{-\rho(t-s)}\, \dot{x}^T(s) R_j \dot{x}(s)\,ds\,d\theta,
\]
\[
V_3(x_t) = \sum_{i=1}^{M} \int_{t-h_i}^{t-h_{i-1}} e^{-\rho(t-s)}
\begin{bmatrix} x(s) \\ g(x(s)) \end{bmatrix}^T
\begin{bmatrix} X_i & Y_i \\ * & Z_i \end{bmatrix}
\begin{bmatrix} x(s) \\ g(x(s)) \end{bmatrix} ds
+ \sum_{i=1}^{M} \hat{h}_i \int_{-h_i}^{-h_{i-1}} \int_{t+\theta}^{t} e^{-\rho(t-s)}\, \dot{x}^T(s) W_i \dot{x}(s)\,ds\,d\theta,
\]
\[
V_4(x_t) = \tau_u \int_{t-\tau_u}^{t} \int_{s}^{t} e^{-\rho(t-u)}\, g^T(x(u))\, U\, g(x(u))\,du\,ds,
\]
\[
V_5(x_t) = \int_{t-h(t)}^{t} e^{-\rho(t-s)}\, x^T(s) N_1 x(s)\,ds
+ \int_{t-h(t)}^{t} e^{-\rho(t-s)}\, g^T(x(s)) N_2 g(x(s))\,ds,
\]

with $\hat{h}_i = h_i - h_{i-1}$ $(i = 1, 2, \ldots, M)$. Taking the derivative of (7) along the solutions of system (1) yields

\[
\dot{V}_1(x_t) = 2x^T(t) P_1 \dot{x}(t) + 2g^T(x(t)) P_2 \dot{x}(t), \tag{8}
\]
\[
\dot{V}_2(x_t) \le -\rho V_2(x_t) + \sum_{j=1}^{N} \Big[ e^{-\rho(j-1)\delta} x^T(t-(j-1)\delta) Q_j x(t-(j-1)\delta) - e^{-\rho j\delta} x^T(t-j\delta) Q_j x(t-j\delta)
\]
\[
\qquad + e^{-\rho(j-1)\delta} g^T(x(t-(j-1)\delta)) M_j g(x(t-(j-1)\delta)) - e^{-\rho j\delta} g^T(x(t-j\delta)) M_j g(x(t-j\delta)) \Big]
\]
\[
\qquad + \sum_{j=1}^{N} \Big[ \delta^2 \dot{x}^T(t) R_j \dot{x}(t) - \delta e^{-\rho j\delta} \int_{t-j\delta}^{t-(j-1)\delta} \dot{x}^T(s) R_j \dot{x}(s)\,ds \Big], \tag{9}
\]
\[
\dot{V}_3(x_t) \le -\rho V_3(x_t) + \sum_{i=1}^{M} e^{-\rho h_{i-1}}
\begin{bmatrix} x(t-h_{i-1}) \\ g(x(t-h_{i-1})) \end{bmatrix}^T
\begin{bmatrix} X_i & Y_i \\ * & Z_i \end{bmatrix}
\begin{bmatrix} x(t-h_{i-1}) \\ g(x(t-h_{i-1})) \end{bmatrix}
- \sum_{i=1}^{M} e^{-\rho h_i}
\begin{bmatrix} x(t-h_i) \\ g(x(t-h_i)) \end{bmatrix}^T
\begin{bmatrix} X_i & Y_i \\ * & Z_i \end{bmatrix}
\begin{bmatrix} x(t-h_i) \\ g(x(t-h_i)) \end{bmatrix}
\]
\[
\qquad + \sum_{i=1}^{M} \Big[ \hat{h}_i^2 \dot{x}^T(t) W_i \dot{x}(t) - \hat{h}_i e^{-\rho h_i} \int_{t-h_i}^{t-h_{i-1}} \dot{x}^T(s) W_i \dot{x}(s)\,ds \Big], \tag{10}
\]
\[
\dot{V}_4(x_t) \le -\rho V_4(x_t) + \tau_u^2\, g^T(x(t)) U g(x(t)) - e^{-\rho\tau_u} \tau_u \int_{t-\tau_u}^{t} g^T(x(s)) U g(x(s))\,ds, \tag{11}
\]
\[
\dot{V}_5(x_t) \le -\rho V_5(x_t) + x^T(t) N_1 x(t) - (1-h_D) e^{-\rho h_u} x^T(t-h(t)) N_1 x(t-h(t))
+ g^T(x(t)) N_2 g(x(t)) - (1-h_D) e^{-\rho h_u} g^T(x(t-h(t))) N_2 g(x(t-h(t))).
\]

(a) We first disclose the interrelationship between $x(t-h(t))$ and $x(t), x(t-\delta), \ldots, x(t-N\delta)$ by utilizing the integral terms in $\dot{V}_2(x_t)$. When the continuous function $h(t)$ satisfies $0 \le h(t) \le h_u/2$, there exists a positive integer $k \in \{1, 2, \ldots, N\}$ such that $h(t) \in [(k-1)\delta, k\delta]$. In this situation,

\[
-\delta e^{-\rho k\delta} \int_{t-k\delta}^{t-(k-1)\delta} \dot{x}^T(s) R_k \dot{x}(s)\,ds
= -\delta e^{-\rho k\delta} \int_{t-k\delta}^{t-h(t)} \dot{x}^T(s) R_k \dot{x}(s)\,ds
- \delta e^{-\rho k\delta} \int_{t-h(t)}^{t-(k-1)\delta} \dot{x}^T(s) R_k \dot{x}(s)\,ds \tag{12}
\]
\[
\le -[k\delta - h(t)]\, e^{-\rho k\delta} \int_{t-k\delta}^{t-h(t)} \dot{x}^T(s) R_k \dot{x}(s)\,ds
- [h(t) - (k-1)\delta]\, e^{-\rho k\delta} \int_{t-h(t)}^{t-(k-1)\delta} \dot{x}^T(s) R_k \dot{x}(s)\,ds. \tag{13}
\]
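The Jensen-type bound of Lemma 2.4, used repeatedly to estimate the integral terms above, can be checked numerically on sampled trajectories; the test curves and the matrix X below are hypothetical, not data from the paper.

```python
import numpy as np

# Numerical check of the bound in Lemma 2.4 on smooth test trajectories:
#   -h * int_{-h}^{0} xdot(t+s)^T X xdot(t+s) ds
#     <= -[x(t) - x(t-h)]^T X [x(t) - x(t-h)],
# i.e. the quadratic form with the block matrix [[-X, X], [*, -X]].
rng = np.random.default_rng(1)
h, m = 0.7, 4000
s = np.linspace(-h, 0.0, m + 1)
G = rng.normal(size=(2, 2))
X = G @ G.T + np.eye(2)                               # X = X^T > 0
ok = True
for _ in range(50):
    c = rng.normal(size=(2, 3))
    x = lambda t: c @ np.array([t, np.sin(3 * t), np.cos(2 * t)])
    xd = lambda t: c @ np.array([1.0, 3 * np.cos(3 * t), -2 * np.sin(2 * t)])
    vals = np.array([xd(t) @ X @ xd(t) for t in s])
    integral = np.sum((vals[:-1] + vals[1:]) / 2) * (h / m)  # trapezoidal rule
    d = x(0.0) - x(-h)
    ok &= bool(-h * integral <= -d @ X @ d + 1e-6)
print(ok)  # True: the bound holds on every sampled trajectory
```

The inequality is an instance of the Cauchy–Schwarz argument behind Lemma 2.2: the difference $x(t)-x(t-h)$ is exactly $\int_{-h}^{0}\dot{x}(t+s)\,ds$, so the bound follows from Jensen's inequality applied to $\dot{x}$.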
Applying Lemma 2.4 to the last two integrals in (13) and after simple manipulations, we have

\[
-\delta e^{-\rho k\delta} \int_{t-k\delta}^{t-(k-1)\delta} \dot{x}^T(s) R_k \dot{x}(s)\,ds \le
\eta_1^T(t)
\begin{bmatrix}
-e^{-\rho k\delta} R_k & 0 & e^{-\rho k\delta} R_k \\
* & -e^{-\rho k\delta} R_k & e^{-\rho k\delta} R_k \\
* & * & -2 e^{-\rho k\delta} R_k
\end{bmatrix}
\eta_1(t), \tag{14}
\]

where $\eta_1(t) = [x^T(t-(k-1)\delta),\ x^T(t-k\delta),\ x^T(t-h(t))]^T$. For $j \ne k$, Lemma 2.4 also yields

\[
-\delta e^{-\rho j\delta} \int_{t-j\delta}^{t-(j-1)\delta} \dot{x}^T(s) R_j \dot{x}(s)\,ds \le
\begin{bmatrix} x(t-(j-1)\delta) \\ x(t-j\delta) \end{bmatrix}^T
\begin{bmatrix} -e^{-\rho j\delta} R_j & e^{-\rho j\delta} R_j \\ * & -e^{-\rho j\delta} R_j \end{bmatrix}
\begin{bmatrix} x(t-(j-1)\delta) \\ x(t-j\delta) \end{bmatrix}. \tag{15}
\]

Applying Lemmas 2.2 and 2.4, the integral terms $-e^{-\rho\tau_u}\tau_u\int_{t-\tau_u}^{t} g^T(x(s))Ug(x(s))\,ds$ and $-\hat{h}_i e^{-\rho h_i}\int_{t-h_i}^{t-h_{i-1}} \dot{x}^T(s)W_i\dot{x}(s)\,ds$ can be estimated, respectively, as

\[
-e^{-\rho\tau_u}\tau_u \int_{t-\tau_u}^{t} g^T(x(s))\, U\, g(x(s))\,ds \le
-e^{-\rho\tau_u} \left[\int_{t-\tau_u}^{t} g(x(s))\,ds\right]^T U \left[\int_{t-\tau_u}^{t} g(x(s))\,ds\right], \tag{16}
\]

\[
-\hat{h}_i e^{-\rho h_i} \int_{t-h_i}^{t-h_{i-1}} \dot{x}^T(s) W_i \dot{x}(s)\,ds \le
e^{-\rho h_i}
\begin{bmatrix} x(t-h_{i-1}) \\ x(t-h_i) \end{bmatrix}^T
\begin{bmatrix} -W_i & W_i \\ * & -W_i \end{bmatrix}
\begin{bmatrix} x(t-h_{i-1}) \\ x(t-h_i) \end{bmatrix}. \tag{17}
\]

Using (3), for any positive diagonal matrices $K_j$ $(j = 0, 1, \ldots, N)$, $K_t$ and $S_i$ $(i = 0, 1, \ldots, M)$ we have

\[
2x^T(t) L K_0\, g(x(t)) - 2g^T(x(t)) K_0\, g(x(t)) \ge 0, \quad \ldots, \quad
2x^T(t-N\delta) L K_N\, g(x(t-N\delta)) - 2g^T(x(t-N\delta)) K_N\, g(x(t-N\delta)) \ge 0, \tag{18}
\]

\[
2x^T(t-h_0) L S_0\, g(x(t-h_0)) - 2g^T(x(t-h_0)) S_0\, g(x(t-h_0)) \ge 0, \quad \ldots, \quad
2x^T(t-h_M) L S_M\, g(x(t-h_M)) - 2g^T(x(t-h_M)) S_M\, g(x(t-h_M)) \ge 0, \tag{19}
\]

and

\[
2x^T(t-h(t)) L K_t\, g(x(t-h(t))) - 2g^T(x(t-h(t))) K_t\, g(x(t-h(t))) \ge 0.
\]

By (2) and (3), $g(\cdot)$ is nondecreasing and $g_j(x_j)$ has the same sign as $x_j$, so

\[
2\sum_{j=1}^{n} p_{2j} \int_{0}^{x_j(t)} g_j(s)\,ds \le 2\sum_{j=1}^{n} p_{2j}\, |x_j(t)|\,|g_j(x_j(t))| = 2\sum_{j=1}^{n} p_{2j}\, x_j(t)\, g_j(x_j(t)) = 2x^T(t) P_2\, g(x(t)). \tag{20}
\]

From (13)–(20), we have

\[
\dot{V}(x_t) + \rho V(x_t) - 2y^T(t)u(t) \le
\eta^T(t) \left\{
\begin{bmatrix} \Delta + \Xi_1 & \Gamma \\ * & \Lambda \end{bmatrix}
+ \begin{bmatrix} \Theta_1 \\ \Theta_2 \end{bmatrix} R \begin{bmatrix} \Theta_1 \\ \Theta_2 \end{bmatrix}^T
\right\} \eta(t), \tag{21}
\]

where
\[
\eta(t) = \Big[x^T(t),\ x^T(t-\delta), \ldots, x^T(t-N\delta),\ x^T(t-h(t)),\ x^T(t-h_0), \ldots, x^T(t-h_M),\ g^T(x(t)),\ g^T(x(t-\delta)), \ldots,
\]
\[
\qquad g^T(x(t-N\delta)),\ g^T(x(t-h(t))),\ g^T(x(t-h_0)), \ldots, g^T(x(t-h_M)),\ \Big(\textstyle\int_{t-\tau(t)}^{t} g(x(s))\,ds\Big)^T,\ u^T(t)\Big]^T.
\]
Hence $\dot{V}(x_t) + \rho V(x_t) - 2y^T(t)u(t) \le 0$ holds provided

\[
\begin{bmatrix} \Delta + \Xi_1 & \Gamma \\ * & \Lambda \end{bmatrix}
+ \begin{bmatrix} \Theta_1 \\ \Theta_2 \end{bmatrix} R \begin{bmatrix} \Theta_1 \\ \Theta_2 \end{bmatrix}^T < 0. \tag{22}
\]

(b) Similarly, we disclose the interrelationship between $x(t-h(t))$ and $x(t-h_0), x(t-h_1), \ldots, x(t-h_M)$ by utilizing the integral terms in $\dot{V}_3(x_t)$. When the continuous function $h(t)$ satisfies $h_u/2 \le h(t) \le h_u$, there exists a positive integer $\kappa \in \{1, 2, \ldots, M\}$ such that $h(t) \in [h_{\kappa-1}, h_\kappa]$. Then we have

\[
-\hat{h}_\kappa e^{-\rho h_\kappa} \int_{t-h_\kappa}^{t-h_{\kappa-1}} \dot{x}^T(s) W_\kappa \dot{x}(s)\,ds
\le -[h_\kappa - h(t)]\, e^{-\rho h_\kappa} \int_{t-h_\kappa}^{t-h(t)} \dot{x}^T(s) W_\kappa \dot{x}(s)\,ds
- [h(t) - h_{\kappa-1}]\, e^{-\rho h_\kappa} \int_{t-h(t)}^{t-h_{\kappa-1}} \dot{x}^T(s) W_\kappa \dot{x}(s)\,ds. \tag{23}
\]

Applying Lemma 2.4 to the two integrals in (23), we obtain

\[
-\hat{h}_\kappa e^{-\rho h_\kappa} \int_{t-h_\kappa}^{t-h_{\kappa-1}} \dot{x}^T(s) W_\kappa \dot{x}(s)\,ds \le
\eta_2^T(t)
\begin{bmatrix}
-2e^{-\rho h_\kappa} W_\kappa & e^{-\rho h_\kappa} W_\kappa & e^{-\rho h_\kappa} W_\kappa \\
* & -e^{-\rho h_\kappa} W_\kappa & 0 \\
* & * & -e^{-\rho h_\kappa} W_\kappa
\end{bmatrix}
\eta_2(t), \tag{24}
\]

where $\eta_2(t) = [x^T(t-h(t)),\ x^T(t-h_{\kappa-1}),\ x^T(t-h_\kappa)]^T$. For $i \ne \kappa$, Lemma 2.4 again gives

\[
-\hat{h}_i e^{-\rho h_i} \int_{t-h_i}^{t-h_{i-1}} \dot{x}^T(s) W_i \dot{x}(s)\,ds \le
e^{-\rho h_i}
\begin{bmatrix} x(t-h_{i-1}) \\ x(t-h_i) \end{bmatrix}^T
\begin{bmatrix} -W_i & W_i \\ * & -W_i \end{bmatrix}
\begin{bmatrix} x(t-h_{i-1}) \\ x(t-h_i) \end{bmatrix}. \tag{25}
\]

From (15), (16), (18)–(20) and (23)–(25), we have

\[
\dot{V}(x_t) + \rho V(x_t) - 2y^T(t)u(t) \le
\eta^T(t) \left\{
\begin{bmatrix} \Delta + \Xi_2 & \Gamma \\ * & \Lambda \end{bmatrix}
+ \begin{bmatrix} \Theta_1 \\ \Theta_2 \end{bmatrix} R \begin{bmatrix} \Theta_1 \\ \Theta_2 \end{bmatrix}^T
\right\} \eta(t), \tag{26}
\]

so in this case $\dot{V}(x_t) + \rho V(x_t) - 2y^T(t)u(t) \le 0$ holds provided

\[
\begin{bmatrix} \Delta + \Xi_2 & \Gamma \\ * & \Lambda \end{bmatrix}
+ \begin{bmatrix} \Theta_1 \\ \Theta_2 \end{bmatrix} R \begin{bmatrix} \Theta_1 \\ \Theta_2 \end{bmatrix}^T < 0. \tag{27}
\]

By the Schur complement lemma, inequalities (22) and (27) are equivalent to the LMIs (5) and (6), respectively. Therefore, if the LMIs (5) and (6) are satisfied, the cellular neural network (1) is guaranteed to be exponentially passive. This completes the proof of the theorem. □

Remark 1. In fact, we decompose the delay interval $[-h_u, 0]$ into $[-h_u, -h_u/2]$ and $[-h_u/2, 0]$; the interval $[-h_u/2, 0]$ is divided into $N$ equidistant subintervals, while $[-h_u, -h_u/2]$ is divided into $M$ nonuniform subintervals. This way of decomposing the delay differs from the methods in the existing results. Through the numerical examples, we will show that Theorem 3.1 improves on the recent result in [11].
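The two-level decomposition described in Remark 1 can be sketched as follows; the values of $h_u$, $N$, $M$ and the quadratic nonuniform grid are assumed illustrative choices, not values used in the paper.

```python
import numpy as np

# Sketch of the delay decomposition in Remark 1 (illustrative values):
# [-hu/2, 0] is cut into N equidistant subintervals of width delta = hu/(2N);
# [-hu, -hu/2] is cut into M nonuniform subintervals via grid points
# hu/2 = h0 < h1 < ... < hM = hu (a quadratic grid is one possible choice).
hu, N, M = 1.0, 4, 3
delta = hu / (2 * N)
uniform = [(-j * delta, -(j - 1) * delta) for j in range(1, N + 1)]
h = hu / 2 + (hu / 2) * np.linspace(0.0, 1.0, M + 1) ** 2
nonuniform = [(-h[i], -h[i - 1]) for i in range(1, M + 1)]
print(uniform[-1][0], nonuniform[-1][0])  # -0.5 -1.0: the unions cover [-hu, 0]
```

Each subinterval then receives its own weighting matrices ($Q_j$, $M_j$, $R_j$ on the uniform part; $X_i$, $Y_i$, $Z_i$, $W_i$ on the nonuniform part), which is the source of the reduced conservatism reported in Table 1.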
For cellular neural networks with discrete and distributed time-varying delays and parameter uncertainties, we consider the system

\[
\begin{cases}
\dot{x}(t) = -(A + \Delta A(t))\, x(t) + (B_0 + \Delta B_0(t))\, g(x(t)) + (B_1 + \Delta B_1(t))\, g(x(t-h(t)))\\
\qquad\qquad + (B_2 + \Delta B_2(t)) \int_{t-\tau(t)}^{t} g(x(s))\,ds + u(t),\\
y(t) = g(x(t)) + g(x(t-h(t))) + u(t),
\end{cases} \tag{28}
\]

where $\Delta A$, $\Delta B_0$, $\Delta B_1$ and $\Delta B_2$ are unknown matrices denoting the uncertainties of the concerned system and satisfying

\[
[\Delta A\ \ \Delta B_0\ \ \Delta B_1\ \ \Delta B_2] = M F(t)\, [E_0\ \ E_1\ \ E_2\ \ E_3], \tag{29}
\]

where $M$, $E_0$, $E_1$, $E_2$ and $E_3$ are known matrices, and $F(t)$ is an unknown, real and possibly time-varying matrix with Lebesgue measurable elements satisfying

\[
F^T(t)\, F(t) \le I. \tag{30}
\]

Then we have the following theorem.

Theorem 3.2. For given scalars $h_u > 0$, $\tau_u > 0$, $\rho > 0$, $h_D < 1$ and $\varepsilon > 0$, the delayed uncertain cellular neural network (28) is exponentially passive if there exist a positive diagonal matrix $P_2$, positive definite matrices $P_1$, $U$, $Q_j$, $M_j$, $R_j$ $(j = 1, 2, \ldots, N)$, $\begin{bmatrix} X_i & Y_i \\ * & Z_i \end{bmatrix}$, $W_i$ $(i = 1, 2, \ldots, M)$, $N_1$, $N_2$, and positive diagonal matrices $K_j$ $(j = 0, 1, \ldots, N)$, $K_t$ and $S_i$ $(i = 0, 1, \ldots, M)$ with appropriate dimensions such that the following LMIs hold:

\[
\begin{bmatrix}
\Delta + \Xi_1 & \Gamma & \Theta_1 & \Upsilon_1^T \\
* & \Lambda & \Theta_2 & \Upsilon_2^T \\
* & * & -R & 0 \\
* & * & * & -\varepsilon I
\end{bmatrix} < 0, \tag{31}
\qquad
\begin{bmatrix}
\Delta + \Xi_2 & \Gamma & \Theta_1 & \Upsilon_1^T \\
* & \Lambda & \Theta_2 & \Upsilon_2^T \\
* & * & -R & 0 \\
* & * & * & -\varepsilon I
\end{bmatrix} < 0, \tag{32}
\]

where $\Xi_1$, $\Xi_2$, $R$, $\Theta_1$ and $\Theta_2$ are as defined in Theorem 3.1, and $\Delta$, $\Gamma$ and $\Lambda$ are as in Theorem 3.1 except for the following entries, which acquire uncertainty-dependent terms:

\[
\vartheta_{11} = -P_1 A - A P_1 + Q_1 - e^{-\rho\delta} R_1 + \rho P_1 + N_1 + \varepsilon E_0^T E_0,
\]
\[
\iota_{11} = P_1 B_0 - A^T P_2 + L K_0 + \rho P_2 - \varepsilon E_0^T E_1, \quad
\iota_{1,N+2} = P_1 B_1 - \varepsilon E_0^T E_2, \quad
\iota_{1,N+M+4} = P_1 B_2 - \varepsilon E_0^T E_3,
\]
\[
\lambda_{11} = M_1 + \tau_u^2 U + P_2 B_0 + B_0^T P_2 + N_2 - 2K_0 + \varepsilon E_1^T E_1, \quad
\lambda_{N+2,N+2} = -2K_t - (1-h_D) e^{-\rho h_u} N_2 + \varepsilon E_2^T E_2,
\]
\[
\lambda_{N+M+4,N+M+4} = -e^{-\rho\tau_u} U + \varepsilon E_3^T E_3, \quad
\lambda_{1,N+2} = P_2 B_1 + \varepsilon E_1^T E_2, \quad
\lambda_{1,N+M+4} = P_2 B_2 + \varepsilon E_1^T E_3, \quad
\lambda_{N+2,N+M+4} = \varepsilon E_2^T E_3,
\]

and

\[
\Upsilon_1 = [M^T P_1\ \ 0_{n\times(N+M+2)n}]^T, \quad
\Upsilon_2 = [M^T P_2\ \ 0_{n\times(N+M+4)n}]^T, \quad
\Upsilon = [\Upsilon_1^T\ \ \Upsilon_2^T]^T,
\]
\[
\Xi = [-E_0\ \ 0_{n\times(N+M+2)n}\ \ E_1\ \ 0_{n\times Nn}\ \ E_2\ \ 0_{n\times(M+1)n}\ \ E_3\ \ 0].
\]

Proof. Conditions (5) and (6) for the uncertain cellular neural network (28) are obtained by replacing $A$, $B_0$, $B_1$ and $B_2$ in (5) and (6) with $A + MF(t)E_0$, $B_0 + MF(t)E_1$, $B_1 + MF(t)E_2$ and $B_2 + MF(t)E_3$, respectively, which is equivalent to

\[
\Phi_1 + \Upsilon F(t) \Xi + \Xi^T F^T(t) \Upsilon^T < 0, \qquad
\Phi_2 + \Upsilon F(t) \Xi + \Xi^T F^T(t) \Upsilon^T < 0. \tag{33}
\]

By Lemma 2.5, a sufficient condition guaranteeing (33) is that there exists a scalar $\varepsilon > 0$ such that

\[
\Phi_1 + \varepsilon^{-1} \Upsilon \Upsilon^T + \varepsilon\, \Xi^T \Xi < 0, \tag{34}
\]
\[
\Phi_2 + \varepsilon^{-1} \Upsilon \Upsilon^T + \varepsilon\, \Xi^T \Xi < 0. \tag{35}
\]

Applying the Schur complement shows that (34) and (35) are equivalent to (31) and (32). By a proof similar to that of Theorem 3.1, we have

\[
\dot{V}(x(t)) + \rho V(x(t)) - 2y^T(t)u(t) \le 0. \tag{36}
\]

Table 1
Calculated upper bounds of $\rho$ for different $h_D$ and $h_u = 0.16$.

Method         | $h_D = 0.1$ | $h_D = 0.3$ | $h_D = 0.5$ | $h_D = 0.9$
[11]           | 5.4753      | 5.4121      | 5.3518      | 5.2864
$N=2$, $M=2$   | 5.6297      | 5.5032      | 5.3874      | 5.3306
$N=3$, $M=3$   | 6.0297      | 5.8124      | 5.7735      | 5.7294
Based on Definition 2.1, the uncertain cellular neural network (28) is exponentially passive. The proof is completed. □

Remark 2. In fact, it is almost impossible to obtain an exact mathematical model of a dynamical system, owing to modeling errors, measurement errors, linearization approximations and so on; this motivates the study of system (28), which contains the uncertainty terms $\Delta A(t)$, $\Delta B_0(t)$, $\Delta B_1(t)$ and $\Delta B_2(t)$ arising in practice.

4. Examples

In this section, we give two numerical examples to show the effectiveness of the proposed conditions.

Example 1. Consider the uncertain cellular neural network (28) with the following parameters [11]:

\[
A = \begin{bmatrix} 4 & 0 \\ 0 & 7 \end{bmatrix}, \quad
B_0 = \begin{bmatrix} 0 & 0.5 \\ 0.5 & 0 \end{bmatrix}, \quad
B_1 = \begin{bmatrix} 1 & 1 \\ 1 & 2 \end{bmatrix}, \quad
B_2 = \begin{bmatrix} 0.3 & 0 \\ 0 & 0.3 \end{bmatrix},
\]
\[
E_0 = \begin{bmatrix} 0.02 & 0.04 \\ 0.03 & 0.06 \end{bmatrix}, \quad
E_1 = \begin{bmatrix} 0.02 & 0.04 \\ 0.02 & 0.04 \end{bmatrix}, \quad
E_2 = \begin{bmatrix} 0.03 & 0.06 \\ 0.02 & 0.04 \end{bmatrix}, \quad
E_3 = \begin{bmatrix} 0.4 & 0 \\ 0 & 0.4 \end{bmatrix},
\]
\[
L = \mathrm{diag}\{1, 1\}, \qquad M = I.
\]

The maximum upper bounds on the allowable values of $\rho$ obtained from Theorem 3.2 are listed in Table 1 for different $h_D$ and $h_u = 0.16$. Clearly, the results obtained in this paper are less conservative than those of [11].

Example 2. Consider the neural network (28) with the following parameters:

\[
A = \begin{bmatrix} 4 & 0 \\ 0 & 4 \end{bmatrix}, \quad
B_0 = \begin{bmatrix} 0.6 & 0.4 \\ 0.5 & 0.4 \end{bmatrix}, \quad
B_1 = \begin{bmatrix} 0.9993 & 0.3554 \\ 0.0471 & 0.2137 \end{bmatrix},
\]
\[
E_0 = \begin{bmatrix} 0.2 & 0 \\ 0 & 0.2 \end{bmatrix}, \quad
E_1 = \begin{bmatrix} 0.6 & 0 \\ 0 & 0.6 \end{bmatrix}, \quad
E_2 = \begin{bmatrix} 0.3978 & 1.3337 \\ 0.2296 & 0.9361 \end{bmatrix}, \quad
L = \mathrm{diag}\{1, 1\}, \quad M = I,
\]

with the delay-decomposition subintervals $N = 2$, $M = 2$, the time-varying delays chosen as $h(t) = 1 + \sin(t)$ and $\tau(t) = 1.5 + \cos(t)$, and the activation function set as $f(x) = (|x+1| - |x-1|)/2$. Using the Matlab LMI toolbox, we solve the LMIs (31)–(32) of Theorem 3.2 and obtain feasible solutions, among them

\[
P_1 = \begin{bmatrix} 2.5791 & 0.0340 \\ 0.0340 & 4.6042 \end{bmatrix}, \quad
P_2 = \begin{bmatrix} 1.1456 & 0 \\ 0 & 3.2301 \end{bmatrix}, \quad
Q_1 = \begin{bmatrix} 2.0162 & 0.1040 \\ 0.1040 & 2.3753 \end{bmatrix}, \quad
Q_2 = \begin{bmatrix} 3.5723 & 1.5749 \\ 1.5749 & 5.8827 \end{bmatrix}.
\]

Fig. 1 shows the state curves of the CNN without input $u(t)$ under the initial condition $[x_1(t), x_2(t)]^T = [0.2, 0.1]^T$. It follows from Theorem 3.2 and Definition 2.1 that the cellular neural network (28) is exponentially passive, which confirms the effectiveness of the proposed result.

[Fig. 1. The state response of the system in Example 2.]
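The piecewise-linear activation used in Example 2 can be checked against the sector condition (3) directly; the sketch below verifies numerically that $0 \le f(x)/x \le 1$, i.e. that (3) holds with $l_j = 1$ and hence $L = \mathrm{diag}\{1,1\}$.

```python
import numpy as np

# The activation f(x) = (|x+1| - |x-1|)/2 of Example 2 satisfies the sector
# condition (3) with l_j = 1:  0 <= f(x)/x <= 1 for all x != 0.
f = lambda x: (np.abs(x + 1) - np.abs(x - 1)) / 2
x = np.linspace(-5.0, 5.0, 10001)
x = x[np.abs(x) > 1e-9]                 # exclude x = 0
ratio = f(x) / x
print(ratio.min() >= 0.0, ratio.max() <= 1.0 + 1e-12)  # True True
```

This is the standard cellular-neural-network output function: it is the identity on $[-1, 1]$ and saturates at $\pm 1$ outside, so the sector bound is tight at the linear segment.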
5. Conclusions

In this paper, the problem of delay-dependent exponential passivity for uncertain cellular neural networks with discrete and distributed time-varying delays has been investigated. Applying the delay decomposition approach, a new passivity criterion has been given in terms of LMIs, which depends on the size of the time delay. Comparison with recent results shows that the proposed passivity criteria are less conservative, which demonstrates the effectiveness of the proposed method.
Acknowledgments This work was supported by the National Natural Science Foundation of China (Grant no. 61273015).
Please cite this article as: Du Y, et al. Delay-dependent exponential passivity of uncertain cellular neural networks with discrete and distributed time-varying delays. ISA Transactions (2015), http://dx.doi.org/10.1016/j.isatra.2014.11.005i
Y. Du et al. / ISA Transactions ∎ (∎∎∎∎) ∎∎∎–∎∎∎
References

[1] Chua LO, Yang L. Cellular neural networks: theory. IEEE Trans Circuits Syst 1988;35(10):1257–72.
[2] Chua LO, Yang L. Cellular neural networks: applications. IEEE Trans Circuits Syst 1988;35(10):1273–90.
[3] Long SJ, Xu DY, Kwon O. Global exponential stability of non-autonomous cellular neural networks with impulses and time-varying delays. Commun Nonlinear Sci Numer Simul 2013;18:1463–72.
[4] Li XD, Rakkiyappan R, Balasubramaniam P. Existence and global stability analysis of equilibrium of fuzzy cellular neural networks with time delay in the leakage term under impulsive perturbations. J Frankl Inst 2011;348:135–55.
[5] Kwon OM, Park JH. Exponential stability for uncertain cellular neural networks with discrete and distributed time-varying delays. Appl Math Comput 2008;203:813–23.
[6] Kwon OM, Park JH. Delay-dependent stability for uncertain cellular neural networks with discrete and distributed time-varying delays. J Frankl Inst 2008;345:766–78.
[7] Yang HZ, Sheng L. Robust stability of uncertain stochastic fuzzy cellular neural networks. Neurocomputing 2009;73:133–8.
[8] Yuan K, Cao J, Deng J. Exponential stability and periodic solutions of fuzzy cellular neural networks with time-varying delays. Neurocomputing 2006;69:1619–27.
[9] Willems JC. The analysis of feedback systems. Cambridge, MA: MIT Press; 1971.
[10] Zhu S, Shen Y, Chen G. Exponential passivity of neural networks with time-varying delay and uncertainty. Phys Lett A 2010;375:136–42.
[11] Wu ZG, Park JH, Su H, Chu J. New results on exponential passivity of neural networks with time-varying delays. Nonlinear Anal: Real World Appl 2012;13:1593–9.
[12] Kwon OM, Park JuH, Lee SM, Cha EJ. A new augmented Lyapunov–Krasovskii functional approach to exponential passivity for neural networks with time-varying delays. Appl Math Comput 2011;217:10231–8.
[13] Zhang XM, Han QL. New Lyapunov–Krasovskii functionals for global asymptotic stability of delayed neural networks. IEEE Trans Neural Netw 2009;7:533–9.
[14] Boyd S, El Ghaoui L, Feron E, Balakrishnan V. Linear matrix inequalities in system and control theory. Philadelphia: SIAM; 1994.
[15] Sun J, Liu GP, Chen J. Delay-dependent stability and stabilization of neutral time-delay systems. Int J Robust Nonlinear Control 2009;19:1364–75.
[16] Li HJ, Ning ZJ, Yin YH, Tang Y. Synchronization and state estimation for singular complex dynamical networks with time-varying delays. Commun Nonlinear Sci Numer Simul 2013;18:194–208.
[17] Balasubramaniam P, Nagamani G. A delay decomposition approach to delay-dependent passivity analysis for interval neural networks with time-varying delay. Neurocomputing 2011;74:1635–46.
[18] Zhang YJ, Yue D, Tian EG. New stability criteria of neural networks with interval time-varying delay: a piecewise delay method. Appl Math Comput 2009;208:249–59.
[19] Balasubramaniam P, Krishnasamy R, Rakkiyappan R. Delay-dependent stability of neutral systems with time-varying delays using delay-decomposition approach. Appl Math Model 2012;36:2253–61.