Research Article

Global exponential stability of neutral high-order stochastic Hopfield neural networks with Markovian jump parameters and mixed time delays

Haiying Huang a,*, Qiaosheng Du b, Xibing Kang a

a Shijiazhuang Ordnance Engineering College, Shijiazhuang 050000, PR China
b Xi'an Research Institute of High Technology, Xi'an 710025, PR China

ISA Transactions; journal homepage: www.elsevier.com/locate/isatrans

Article history: Received 30 April 2013; received in revised form 15 July 2013; accepted 29 July 2013. This paper was recommended for publication by Dr. Jeff Pieper.

Keywords: Stability; Markovian jumping; Hopfield neural network; Mixed time delay; Linear matrix inequality (LMI)

Abstract: In this paper, a class of neutral high-order stochastic Hopfield neural networks with Markovian jump parameters and mixed time delays is investigated. The jumping parameters are modeled as a continuous-time finite-state Markov chain. First, the existence of an equilibrium point for the addressed neural networks is established. Then, by utilizing the Lyapunov stability theory, stochastic analysis theory and the linear matrix inequality (LMI) technique, new delay-dependent stability criteria are presented in terms of LMIs that guarantee the neural networks to be globally exponentially stable in the mean square. Numerical simulations are carried out to illustrate the main results. © 2013 ISA. Published by Elsevier Ltd. All rights reserved.

1. Introduction

The human brain is made up of a large number of cells called neurons and their interconnections. An artificial neural network is an information processing system that shares certain characteristics with biological neural networks. Since the pioneering work in [16,17], Hopfield-type neural networks have been intensively studied over the past three decades and have been applied to signal and image processing, pattern recognition, fault diagnosis, optimization problems and special problems of A/D converter design; see [9,11,13,19,20,29,31,32,34,38,43] and the references therein. However, such neural networks are known to have limitations, such as limited capacity when used in pattern recognition problems (see, e.g., [20]), and the class of optimization problems that can be solved using them is restricted. This has led many investigators to use neural networks with high-order connections. High-order neural networks allow high-order interactions between neurons, and therefore have stronger approximation properties, faster convergence rates, greater storage capacity, and higher fault tolerance than traditional first-order neural networks [35]. Recently, there has been

* Corresponding author. The Seventh Department, Shijiazhuang Ordnance Engineering College, 97 Heping West Road, Shijiazhuang 050000, Hebei Province, PR China. Tel.: +86 311 87994721. E-mail address: [email protected] (H. Huang).

considerable attention in the literature to high-order Hopfield-type neural networks (see, e.g., [25,40–42] and the references therein).

On the other hand, due to the complicated dynamic properties of neural cells in the real world, existing neural network models in many cases cannot precisely characterize the properties of a neural reaction process. It is natural and important that system models contain some information about the derivative of the past state, in order to further describe and model the dynamics of such complex neural reactions [28]. However, the stability analysis of neural networks of neutral type has been investigated by only a few researchers [10,23,27,28,46].

Markovian jump systems, introduced by Krasovskii and Lidskii [21], are hybrid systems with two components in the state. The first refers to the mode, described by a continuous-time finite-state Markovian process, and the second refers to the state, represented by a system of differential equations. Jump systems have the advantage of modeling dynamic systems subject to abrupt variations in their structures, such as component failures or repairs, sudden environmental disturbances, changing subsystem interconnections, and operation at different points of a nonlinear plant [26]. It should be pointed out that such jump systems have seldom been applied to neural networks due to the mathematical difficulty. However, in real life, neural networks often exhibit information latching. It is recognized that a way of dealing with this information-latching

0019-0578/$ - see front matter © 2013 ISA. Published by Elsevier Ltd. All rights reserved. http://dx.doi.org/10.1016/j.isatra.2013.07.016

Please cite this article as: Huang H, et al. Global exponential stability of neutral high-order stochastic Hopfield neural networks with Markovian jump parameters and mixed time delays. ISA Transactions (2013), http://dx.doi.org/10.1016/j.isatra.2013.07.016


problem is to extract finite-state representations (also called modes or clusters). In fact, a neural network with information latching may have finitely many modes, the modes may switch (or jump) from one to another at different times, and the switching (or jumping) between two arbitrarily different modes can be governed by a Markov chain. Hence, neural networks with Markovian jump parameters are of great significance in modeling a class of neural networks with finite modes [48], and they have received a great deal of attention. For instance, Balasubramaniam et al. [2] investigated the state estimation problem for a class of neural networks with Markovian jumping parameters; Zhang and Wang [44] discussed the problem of global asymptotic stabilization for a class of Markovian jumping stochastic Cohen–Grossberg neural networks with mixed delays including discrete delays and distributed delays; Zhu and Cao [47] studied the exponential stability problem for a class of Markovian jump impulsive stochastic Cohen–Grossberg neural networks with mixed time delays and known or unknown parameters; and so on. For details concerning the stability analysis of neural networks with Markovian jump parameters, please see [3,4,6,24,49] and the references cited therein.

In addition, noise disturbance is a major source of instability and can lead to poor performance in neural networks. In real nervous systems, synaptic transmission is a noisy process brought on by random fluctuations in the release of neurotransmitters and other probabilistic causes. It is also known that a neural network can be stabilized or destabilized by certain stochastic inputs [7]. Hence, great attention has been paid to the stability analysis of stochastic neural networks, and some initial results have been obtained (see, e.g., [18,33,36,37,45] and the references therein).
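The mode switching described above can be illustrated numerically. The sketch below is a minimal illustration (not the authors' code): it simulates a right-continuous Markov chain r(t) with a two-mode generator of the structure used later in Example 1, drawing exponential holding times from the diagonal rates of Π.

```python
import numpy as np

def simulate_ctmc(Pi, r0, T, rng):
    """Simulate a continuous-time Markov chain with generator Pi on [0, T].

    Returns the jump times and the mode held from each jump time onward.
    """
    times, modes = [0.0], [r0]
    t, r = 0.0, r0
    while True:
        rate = -Pi[r, r]                  # total exit rate of the current mode
        t += rng.exponential(1.0 / rate)  # exponential holding time
        if t >= T:
            break
        probs = Pi[r].copy()
        probs[r] = 0.0
        probs /= probs.sum()              # jump distribution over the other modes
        r = rng.choice(len(probs), p=probs)
        times.append(t)
        modes.append(r)
    return times, modes

rng = np.random.default_rng(0)
Pi = np.array([[-3.0, 3.0], [5.0, -5.0]])  # generator: off-diagonals >= 0, rows sum to 0
times, modes = simulate_ctmc(Pi, r0=0, T=20.0, rng=rng)
print(len(times), modes[:5])
```

For a two-state chain every jump necessarily moves to the other mode, so the simulated mode sequence alternates between 0 and 1.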
In addition to noise disturbance, time delay is also a major source of instability and poor performance in neural networks (see, e.g., [1,5,8]). It is well known that time delays exist in the information processing of neurons for various reasons. For example, time delays can be caused by the finite switching speed of amplifier circuits, or they may be deliberately introduced to achieve tasks dealing with motion-related problems, such as moving image processing. Time delays make the dynamic behaviors of neural networks more complicated, and may destabilize stable equilibria and admit periodic oscillations, bifurcation and chaos. Therefore, considerable attention has been paid to the study of delay systems in control theory, and a large body of work has been reported in the literature (see, e.g., [7,22,30,36,37] and the references therein).

To the best of our knowledge, the global exponential stability analysis problem for neutral high-order stochastic Hopfield neural networks with Markovian jump parameters and mixed time delays has not been studied thoroughly in the literature, yet it is important in both theory and applications, so there is room for further investigation. This situation motivates the present work. This paper is concerned with the global exponential stability analysis problem for a class of neutral high-order stochastic Hopfield neural networks with both Markovian jump parameters and mixed time delays, which simultaneously comprise constant, time-varying, and distributed delays. By utilizing the Lyapunov stability theory, stochastic analysis theory, and the linear matrix inequality (LMI) technique, some novel delay-dependent conditions are obtained which guarantee the exponential stability of the equilibrium point. The proposed LMI-based criteria are computationally efficient, as they can be easily checked using recently developed standard algorithms, such as interior-point methods [8], for solving LMIs. Finally, numerical simulations are provided to illustrate the effectiveness of the theoretical results.
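As a toy illustration of the kind of feasibility check an LMI solver performs, the sketch below (an assumed example unrelated to the specific LMIs of this paper) finds P > 0 satisfying the Lyapunov inequality AᵀP + PA < 0 for a Hurwitz A, by solving the Lyapunov equation AᵀP + PA = −Q through a Kronecker-product linear system and then verifying positive definiteness.

```python
import numpy as np

def solve_lyapunov(A, Q):
    """Solve A^T P + P A = -Q for P, using vec(M X N) = (N^T kron M) vec(X)."""
    n = A.shape[0]
    I = np.eye(n)
    # vec(A^T P) = (I kron A^T) vec(P);  vec(P A) = (A^T kron I) vec(P)
    M = np.kron(I, A.T) + np.kron(A.T, I)
    vecP = np.linalg.solve(M, -Q.flatten(order="F"))  # column-major vec
    return vecP.reshape((n, n), order="F")

A = np.array([[-2.0, 1.0], [0.0, -3.0]])  # assumed Hurwitz test matrix
Q = np.eye(2)
P = solve_lyapunov(A, Q)
# Feasibility check: P is symmetric positive definite and A^T P + P A < 0
assert np.allclose(A.T @ P + P @ A, -Q)
assert np.all(np.linalg.eigvalsh((P + P.T) / 2) > 0)
```

For the structured, multi-block LMIs of Theorem 3.1 one would instead use a dedicated semidefinite-programming solver, as the paper does with the Matlab LMI toolbox.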

The organization of this paper is as follows. In Section 2, we present the relevant notation, definitions and lemmas used later; moreover, the existence of an equilibrium point is proved. In Section 3, new delay-dependent criteria are established for the neutral high-order stochastic Hopfield neural networks with Markovian jump parameters and mixed time delays to be globally exponentially stable in the mean square. Numerical simulations are given in Section 4 to demonstrate the effectiveness of our results. Finally, conclusions are drawn in Section 5.

Notation: Throughout this paper, R^n and R^{n×m} denote the n-dimensional Euclidean space and the set of all n×m real matrices, respectively; the superscript "T" denotes matrix transposition, and the notation X ≥ Y (respectively, X > Y), where X and Y are symmetric matrices, means that X − Y is positive semidefinite (respectively, positive definite); λ_min(·) and λ_max(·) denote the minimum and maximum eigenvalues of a real symmetric matrix, respectively; I_n is the n×n identity matrix; C^{2,1}(R_+ × R^n × S; R_+) denotes the family of all nonnegative functions V(t, x(t), i) on R_+ × R^n × S that are continuously twice differentiable in x and once differentiable in t; (Ω, F, P) is a complete probability space, where Ω is the sample space, F is the σ-algebra of subsets of the sample space and P is the probability measure on F; L^2_{F_0}([−τ̄, 0]; R^n) denotes the family of all F_0-measurable C([−τ̄, 0]; R^n)-valued random variables ξ = {ξ(θ) : −τ̄ ≤ θ ≤ 0} such that sup_{−τ̄ ≤ θ ≤ 0} E|ξ(θ)|² < ∞, where E{·} stands for the mathematical expectation operator with respect to the given probability measure P; diag{⋯} denotes a (block) diagonal matrix; |·| is the Euclidean norm in R^n; I is the identity matrix of appropriate dimension; and the symmetric terms in a symmetric matrix are denoted by ∗.

2. Problem formulation

In this paper, the neutral high-order stochastic Hopfield neural network with mixed time delays is described by the following differential equation:

d[y_i(t) − k_i y_i(t − h)] = [ −c_i y_i(t) + Σ_{j=1}^{n} a_{ij} g_j(y_j(t)) + Σ_{j=1}^{n} b_{ij} g_j(y_j(t − τ_j(t))) + Σ_{j=1}^{n} Σ_{l=1}^{n} T_{ijl} g_j(y_j(t − τ_j(t))) g_l(y_l(t − τ_l(t))) + Σ_{j=1}^{n} d_{ij} ∫_{t−τ_j(t)}^{t} g_j(y_j(s)) ds + J_i ] dt + ρ_i(y_i(t), y_i(t − τ_i(t)), t) dω_i(t),   (2.1)

where i ∈ {1, 2, …, n} and t ≥ t_0; y_i(t) is the neural state of cell i at time t; c_i > 0 denotes the rate with which cell i resets its potential to the resting state; a_{ij}, b_{ij} and d_{ij} are the first-order synaptic weights of the neural network; T_{ijl} are the second-order synaptic weights; τ_j(t) (j = 1, 2, …, n) is the transmission delay of the ith unit along the axon of the jth unit at time t, with 0 < τ_j(t) ≤ τ and τ̇_j(t) ≤ ϱ < 1, where τ and ϱ are constants; the activation function g_j is continuous on R; and J_i is the external input. In general, the current state of a neuron may depend on several events having occurred in neighboring neurons at different times; time delays can therefore be introduced into neural network models by writing the first-order interaction term in the form Σ_{j=1}^{n} Σ_{m=1}^{l} T_{ijm} g_j(y_j(t − τ_{ijm})), where m is an index referring to past synaptic events and τ_{ijm} ≥ 0. Furthermore, k_i is the neutral delayed strength of connectivity; h > 0 is a constant neutral delay; ρ_i is the diffusion coefficient (noise intensity); and the stochastic disturbance ω(t) = [ω_1(t), ω_2(t), …, ω_n(t)]^T ∈ R^n is a Brownian motion


defined on (Ω, F, P), with

E{dω(t)} = 0,  E{dω²(t)} = dt.

Suppose that system (2.1) is supplemented with initial conditions of the form

y_i(θ) = φ_i(θ),  θ ∈ [−τ̄, 0], i = 1, 2, …, n,   (2.2)

where each φ_i(θ) (i = 1, 2, …, n) is continuous on [−τ̄, 0] × Ω and τ̄ = max{τ, h}. Throughout this paper, it is assumed that there exist positive constants L_i > 0 and χ_i, i = 1, 2, …, n, such that the activation function g_i satisfies

|g_i(u_i)| ≤ χ_i,  0 < (g_i(u_i) − g_i(v_i))/(u_i − v_i) ≤ L_i,

where u_i, v_i ∈ R, i = 1, 2, …, n.

For the deterministic system

d[y_i(t) − k_i y_i(t − h)] = [ −c_i y_i(t) + Σ_{j=1}^{n} a_{ij} g_j(y_j(t)) + Σ_{j=1}^{n} b_{ij} g_j(y_j(t − τ_j(t))) + Σ_{j=1}^{n} Σ_{l=1}^{n} T_{ijl} g_j(y_j(t − τ_j(t))) g_l(y_l(t − τ_l(t))) + Σ_{j=1}^{n} d_{ij} ∫_{t−τ_j(t)}^{t} g_j(y_j(s)) ds + J_i ] dt,   (2.3)

the result is given as follows.

Theorem 2.1. System (2.3) has at least one equilibrium point.

Proof. If y* = (y_1*, y_2*, …, y_n*)^T is an equilibrium point of system (2.3), then y* satisfies

−c_i y_i* + Σ_{j=1}^{n} a_{ij} g_j(y_j*) + Σ_{j=1}^{n} b_{ij} g_j(y_j*) + Σ_{j=1}^{n} Σ_{l=1}^{n} T_{ijl} g_j(y_j*) g_l(y_l*) + Σ_{j=1}^{n} d_{ij} ∫_{t−τ_j(t)}^{t} g_j(y_j*) ds + J_i = 0.

By the boundedness of the activation functions g_j and of the time-varying delays τ_j(t), it follows that

| Σ_{j=1}^{n} a_{ij} g_j(y_j*) + Σ_{j=1}^{n} b_{ij} g_j(y_j*) + Σ_{j=1}^{n} Σ_{l=1}^{n} T_{ijl} g_j(y_j*) g_l(y_l*) + Σ_{j=1}^{n} d_{ij} ∫_{t−τ_j(t)}^{t} g_j(y_j*) ds + J_i | ≤ Σ_{j=1}^{n} (|a_{ij}| + |b_{ij}|) χ_j + Σ_{j=1}^{n} Σ_{l=1}^{n} |T_{ijl}| χ_j χ_l + Σ_{j=1}^{n} |d_{ij}| τ χ_j + |J_i| ≜ s_i.

Define the continuous bounded mapping

φ_i(y_1, y_2, …, y_n) = (1/c_i) [ Σ_{j=1}^{n} a_{ij} g_j(y_j) + Σ_{j=1}^{n} b_{ij} g_j(y_j) + Σ_{j=1}^{n} Σ_{l=1}^{n} T_{ijl} g_j(y_j) g_l(y_l) + Σ_{j=1}^{n} d_{ij} ∫_{t−τ_j(t)}^{t} g_j(y_j(s)) ds + J_i ],

so that |φ_i| ≤ s_i / c_i ≜ θ_i. It follows that φ = (φ_1, φ_2, …, φ_n)^T maps Ω̄ = [−θ_1, θ_1] × [−θ_2, θ_2] × ⋯ × [−θ_n, θ_n] into itself. Therefore, by Brouwer's fixed point theorem, φ has at least one fixed point, which is an equilibrium point of system (2.3). This completes the proof. □

In order to guarantee that system (2.1) has an equilibrium point, it is assumed that, as the system approaches its equilibrium point, the stochastic noise contribution vanishes, i.e.

(H1) ρ_i(y_i*, y_i*, t) = 0, i = 1, 2, …, n.

Thus, system (2.1) admits the equilibrium point y* = (y_1*, y_2*, …, y_n*)^T. Based on assumption (H1), we study the global exponential stability in the mean square of system (2.1) at the equilibrium point y*. Let x_i(t) = y_i(t) − y_i*. By the Lagrange mean value theorem, system (2.1) can be written in the following form:

d[x_i(t) − k_i x_i(t − h)] = [ −c_i x_i(t) + Σ_{j=1}^{n} a_{ij} f_j(x_j(t)) + Σ_{j=1}^{n} b_{ij} f_j(x_j(t − τ_j(t))) + Σ_{j=1}^{n} Σ_{l=1}^{n} (T_{ijl} + T_{ilj}) ζ_l f_j(x_j(t − τ_j(t))) + Σ_{j=1}^{n} d_{ij} ∫_{t−τ_j(t)}^{t} f_j(x_j(s)) ds ] dt + s_i(x_i(t), x_i(t − τ_i(t)), t) dω_i(t),   (2.4)

where i = 1, 2, …, n, f_j(x_j(t)) = g_j(y_j(t)) − g_j(y_j*), f_j(x_j(t − τ_j(t))) = g_j(y_j(t − τ_j(t))) − g_j(y_j*), s_i(x_i(t), x_i(t − τ_i(t)), t) = ρ_i(y_i(t), y_i(t − τ_i(t)), t), and ζ_l lies between g_l(y_l(t − τ_l(t))) and g_l(y_l*). Apparently, for each j = 1, 2, …, n, direct calculation shows that

|f_j(z)| ≤ L_j |z|  for all z ∈ R.   (2.5)

Then system (2.4) can be rewritten as follows:

d[x(t) − K x(t − h)] = [ −C x(t) + A f(x(t)) + B f(x(t − τ(t))) + D ∫_{t−τ(t)}^{t} f(x(s)) ds ] dt + s(x(t), x(t − τ(t)), t) dω(t),   (2.6)

where

K = diag(k_1, k_2, …, k_n), C = diag(c_1, c_2, …, c_n), A = (a_{ij})_{n×n}, D = (d_{ij})_{n×n}, T_i = (T_{ijl})_{n×n}, T_H = (T_1 + T_1^T, T_2 + T_2^T, …, T_n + T_n^T)^T, ζ = (ζ_1, ζ_2, …, ζ_n), Γ = diag(ζ, ζ, …, ζ), B = (b_{ij})_{n×n} + Γ^T T_H,
x(t − τ(t)) = (x_1(t − τ_1(t)), x_2(t − τ_2(t)), …, x_n(t − τ_n(t)))^T,
f(x(t)) = (f_1(x_1(t)), f_2(x_2(t)), …, f_n(x_n(t)))^T,
f(x(t − τ(t))) = (f_1(x_1(t − τ_1(t))), f_2(x_2(t − τ_2(t))), …, f_n(x_n(t − τ_n(t))))^T,
s(x(t), x(t − τ(t)), t) = (s_1(x(t), x(t − τ(t)), t), s_2(x(t), x(t − τ(t)), t), …, s_n(x(t), x(t − τ(t)), t))^T.


Let {r(t), t ≥ 0} be a right-continuous Markov chain on a complete probability space (Ω, F, P) taking values in a finite state space S = {1, 2, …, N} with generator Π = (π_{ij})_{N×N} given by

P{r(t + Δt) = j | r(t) = i} = π_{ij} Δt + o(Δt) if i ≠ j,  1 + π_{ii} Δt + o(Δt) if i = j,

where Δt > 0 and lim_{Δt→0} o(Δt)/Δt = 0. Here π_{ij} ≥ 0 is the transition probability rate from i to j if i ≠ j, and π_{ii} = −Σ_{j=1, j≠i}^{N} π_{ij}.

Consider the following class of neutral Hopfield neural networks with stochastic noise disturbance, mixed time delays and Markovian jump parameters, which is a modification of (2.6):

d[x(t) − K(r(t)) x(t − h)] = [ −C(r(t)) x(t) + A(r(t)) f(x(t)) + B(r(t)) f(x(t − τ(t))) + D(r(t)) ∫_{t−τ(t)}^{t} f(x(s)) ds ] dt + s(x(t), x(t − τ(t)), t, r(t)) dω(t),   (2.7)

where h, τ(t), x(t) and f(x(t)) have the same meanings as in (2.6); s(x(t), x(t − τ(t)), t, r(t)) is the noise intensity function vector; and, for a fixed system mode, K(r(t)), A(r(t)), B(r(t)), C(r(t)) and D(r(t)) are known constant matrices with appropriate dimensions. For convenience, each possible value of r(t) is denoted by i, i ∈ S, in the sequel, and we write

K_i = K(r(t)), A_i = A(r(t)), B_i = B(r(t)), C_i = C(r(t)), D_i = D(r(t)),

where K_i, A_i, B_i, C_i and D_i for any i ∈ S are known constant matrices of appropriate dimensions. Assume that s : R^n × R^n × R_+ × S → R^n is locally Lipschitz continuous and satisfies the linear growth condition [14], and that all the eigenvalues of K lie inside the unit circle. Moreover, s satisfies

trace[s^T(x_1, x_2, t, i) s(x_1, x_2, t, i)] ≤ x_1^T R_{1i} x_1 + x_2^T R_{2i} x_2   (2.8)

for all x_1, x_2 ∈ R^n and r(t) = i, i ∈ S, where R_{1i} and R_{2i} are known positive definite constant matrices with appropriate dimensions.

Let x(t; ξ) denote the state trajectory from the initial data x(θ) = ξ(θ) on −τ̄ ≤ θ ≤ 0 in L^2_{F_0}([−τ̄, 0]; R^n). Clearly, system (2.7) admits a trivial solution x(t; 0) ≡ 0 corresponding to the initial data ξ = 0. For simplicity, we write x(t; ξ) = x(t).

Before ending this section, the notion of global exponential stability for the neural network (2.7) and some lemmas are introduced which will be essential in establishing the desired LMI-based stability criteria.

Definition 2.1. The equilibrium point of the neural network (2.7) is said to be globally exponentially stable in the mean square if, for any ξ ∈ L^2_{F_0}([−τ̄, 0]; R^n), there exist positive constants α and β such that

E{|x(t; ξ)|²} ≤ α e^{−βt} E‖ξ‖²,   (2.9)

where E‖ξ‖² = sup_{−τ̄ ≤ θ ≤ 0} E|ξ(θ)|².

Definition 2.2 (Shu et al. [39]). For a stochastic Lyapunov–Krasovskii functional V ∈ C^{2,1}(R_+ × R^n × S; R_+) of system (2.7), the weak infinitesimal generator L V of the random process, acting from R_+ × R^n × S to R, is defined by

L V(t, x(t), i) = lim_{Δt→0⁺} (1/Δt) [ E{V(t + Δt, x(t + Δt), r(t + Δt))} − V(t, x(t), i) ].   (2.10)

Lemma 2.1 (Boyd et al. [8]). For any real matrices X, Y and any positive definite matrix G, the following matrix inequality holds:

X^T Y + Y^T X ≤ X^T G^{−1} X + Y^T G Y.

Lemma 2.2 (Gu [12]). For any positive definite matrix G > 0, scalar τ_0 > 0, and vector function ω : [0, τ_0] → R^n such that the integrations concerned are well defined, the following inequality holds:

( ∫_0^{τ_0} ω(s) ds )^T G ( ∫_0^{τ_0} ω(s) ds ) ≤ τ_0 ∫_0^{τ_0} ω^T(s) G ω(s) ds.

Lemma 2.3 (Schur complement). Given constant matrices Ω_1, Ω_2 and Ω_3 with appropriate dimensions, where Ω_1 = Ω_1^T and 0 < Ω_2 = Ω_2^T, then Ω_1 + Ω_3^T Ω_2^{−1} Ω_3 < 0 if and only if

[ Ω_1   Ω_3^T ]          [ −Ω_2  Ω_3 ]
[ Ω_3   −Ω_2  ] < 0  or  [ Ω_3^T  Ω_1 ] < 0.

3. Main results

In this section, Lyapunov stability theory [15] and a unified linear matrix inequality (LMI) approach, which differs from the commonly used matrix-norm theories (such as the M-matrix method), are developed to establish sufficient conditions for the neural network (2.7) to be globally exponentially stable in the mean square. LMIs can be easily solved using the Matlab LMI toolbox, and no tuning of parameters is required [8].

Theorem 3.1. The neural network (2.7) is globally exponentially stable in the mean square if, for given β_i > 0 (i ∈ S), there exist symmetric positive definite matrices Q_1, Q_2, Q_3, P_i and G_i (i ∈ S), and positive scalars δ, γ and λ_i (i ∈ S), such that the following two linear matrix inequalities hold:

P_i < λ_i I,   (3.1)

      [ Ξ_1  −β_i P_i K_i + K_i P_i C_i  0    P_i A_i       P_i B_i       0             P_i D_i ]
      [ ∗    Ξ_2                         0    −K_i P_i A_i  −K_i P_i B_i  −K_i P_i D_i  0       ]
      [ ∗    ∗                           Ξ_3  0             0             0             0       ]
Σ_i = [ ∗    ∗                           ∗    −δI           0             0             0       ] < 0,   (3.2)
      [ ∗    ∗                           ∗    ∗             −γI           0             0       ]
      [ ∗    ∗                           ∗    ∗             ∗             Ξ_4           0       ]
      [ ∗    ∗                           ∗    ∗             ∗             ∗             −G_i    ]

where

Ξ_1 = β_i P_i − 2 P_i C_i + Σ_{j=1}^{N} π_{ij} P_j + Q_1 + Q_2 + τ Q_3 + λ_i R_{1i} + δ L^T L,
Ξ_2 = β_i K_i^T P_i K_i − e^{−β_i h} Q_2,
Ξ_3 = −(1 − ϱ) e^{−β_i τ} Q_1 + λ_i R_{2i} + γ L^T L,
Ξ_4 = −(1 − ϱ) τ^{−1} Q_3 + G_i.

Proof. Define a positive definite Lyapunov–Krasovskii functional V(t, x(t), i) ∈ C^{2,1}(R_+ × R^n × S; R_+) as follows:

V(t, x(t), i) = V_1(t, x(t), i) + V_2(t, x(t), i) + V_3(t, x(t), i) + V_4(t, x(t), i),   (3.3)

where

V_1(t, x(t), i) = e^{β_i t} [x(t) − K_i x(t − h)]^T P_i [x(t) − K_i x(t − h)],
V_2(t, x(t), i) = ∫_{t−τ(t)}^{t} e^{β_i s} x^T(s) Q_1 x(s) ds,


V_3(t, x(t), i) = ∫_{t−h}^{t} e^{β_i s} x^T(s) Q_2 x(s) ds,
V_4(t, x(t), i) = ∫_{−τ(t)}^{0} ∫_{t+s}^{t} e^{β_i η} x^T(η) Q_3 x(η) dη ds.

Then it follows from (2.7) and (2.10) that

L V(t, x(t), i) = β_i e^{β_i t} [x(t) − K_i x(t − h)]^T P_i [x(t) − K_i x(t − h)]
  + 2 e^{β_i t} [x(t) − K_i x(t − h)]^T P_i [ −C_i x(t) + A_i f(x(t)) + B_i f(x(t − τ(t))) + D_i ∫_{t−τ(t)}^{t} f(x(s)) ds ]
  + e^{β_i t} x^T(t) Σ_{j=1}^{N} π_{ij} P_j x(t) + e^{β_i t} x^T(t) Q_1 x(t) − (1 − τ̇(t)) e^{β_i (t−τ(t))} x^T(t − τ(t)) Q_1 x(t − τ(t))
  + e^{β_i t} x^T(t) Q_2 x(t) − e^{β_i (t−h)} x^T(t − h) Q_2 x(t − h)
  + τ(t) e^{β_i t} x^T(t) Q_3 x(t) − (1 − τ̇(t)) ∫_{t−τ(t)}^{t} e^{β_i s} x^T(s) Q_3 x(s) ds
  + tr[s^T(x(t), x(t − τ(t)), t, i) P_i s(x(t), x(t − τ(t)), t, i)]
≤ β_i e^{β_i t} [x(t) − K_i x(t − h)]^T P_i [x(t) − K_i x(t − h)]
  + 2 e^{β_i t} [x(t) − K_i x(t − h)]^T P_i [ −C_i x(t) + A_i f(x(t)) + B_i f(x(t − τ(t))) + D_i ∫_{t−τ(t)}^{t} f(x(s)) ds ]
  + e^{β_i t} x^T(t) Σ_{j=1}^{N} π_{ij} P_j x(t) + e^{β_i t} x^T(t) Q_1 x(t) − (1 − ϱ) e^{β_i (t−τ)} x^T(t − τ(t)) Q_1 x(t − τ(t))
  + e^{β_i t} x^T(t) Q_2 x(t) − e^{β_i (t−h)} x^T(t − h) Q_2 x(t − h) + τ e^{β_i t} x^T(t) Q_3 x(t)
  − (1 − ϱ) ∫_{t−τ(t)}^{t} e^{β_i s} x^T(s) Q_3 x(s) ds
  + tr[s^T(x(t), x(t − τ(t)), t, i) P_i s(x(t), x(t − τ(t)), t, i)].   (3.4)

On the other hand, by assumption (2.8) and condition (3.1), it follows that

tr[s^T(x(t), x(t − τ(t)), t, i) P_i s(x(t), x(t − τ(t)), t, i)] ≤ λ_max(P_i) tr[s^T(x(t), x(t − τ(t)), t, i) s(x(t), x(t − τ(t)), t, i)] ≤ λ_i [ x^T(t) R_{1i} x(t) + x^T(t − τ(t)) R_{2i} x(t − τ(t)) ].   (3.5)

By employing Lemmas 2.1 and 2.2, there exist positive definite matrices G_i such that

2 x^T(t) P_i D_i ∫_{t−τ(t)}^{t} f(x(s)) ds ≤ x^T(t) P_i D_i G_i^{−1} D_i^T P_i x(t) + ( ∫_{t−τ(t)}^{t} f(x(s)) ds )^T G_i ( ∫_{t−τ(t)}^{t} f(x(s)) ds ).   (3.6)

It follows from Lemma 2.2 that

−(1 − ϱ) ∫_{t−τ(t)}^{t} x^T(s) Q_3 x(s) ds ≤ −(1 − ϱ) τ^{−1} ( ∫_{t−τ(t)}^{t} x(s) ds )^T Q_3 ( ∫_{t−τ(t)}^{t} x(s) ds ).   (3.7)

From (2.5), the following inequalities can be obtained easily:

f^T(x(t)) f(x(t)) − x^T(t) L^T L x(t) ≤ 0,
f^T(x(t − τ(t))) f(x(t − τ(t))) − x^T(t − τ(t)) L^T L x(t − τ(t)) ≤ 0,   (3.8)

where L = diag{L_1, L_2, …, L_n}. Noticing that, for any positive scalars δ and γ,

δ [ x^T(t) L^T L x(t) − f^T(x(t)) f(x(t)) ] ≥ 0,
γ [ x^T(t − τ(t)) L^T L x(t − τ(t)) − f^T(x(t − τ(t))) f(x(t − τ(t))) ] ≥ 0.   (3.9)

By virtue of (3.2) and (3.5)–(3.9), it follows from (3.4) that

L V(t, x(t), i) ≤ e^{β_i t} ψ^T(t) Σ_i^* ψ(t),   (3.10)

where

ψ^T(t) = [ x^T(t), x^T(t − h), x^T(t − τ(t)), f^T(x(t)), f^T(x(t − τ(t))), ( ∫_{t−τ(t)}^{t} x(s) ds )^T ],

        [ Ξ_1^*  −β_i P_i K_i + K_i P_i C_i  0    P_i A_i       P_i B_i       0            ]
        [ ∗      Ξ_2                         0    −K_i P_i A_i  −K_i P_i B_i  −K_i P_i D_i ]
Σ_i^* = [ ∗      ∗                           Ξ_3  0             0             0            ]
        [ ∗      ∗                           ∗    −δI           0             0            ]
        [ ∗      ∗                           ∗    ∗             −γI           0            ]
        [ ∗      ∗                           ∗    ∗             ∗             Ξ_4          ]

with

Ξ_1^* = β_i P_i − 2 P_i C_i + P_i D_i G_i^{−1} D_i^T P_i + Σ_{j=1}^{N} π_{ij} P_j + Q_1 + Q_2 + τ Q_3 + λ_i R_{1i} + δ L^T L.

By Lemma 2.3 and condition (3.2), it is easy to see that Σ_i^* < 0 for all i ∈ S. Let γ̄ = min_{i∈S} λ_min(−Σ_i^*); then γ̄ > 0, and together with (3.10) this yields

L V(t, x(t), i) ≤ −γ̄ e^{β_i t} |x(t)|².   (3.11)

Furthermore, it follows from the definition of V(t, x(t), i) that

E V(t, x(t), i) ≥ λ_min(P_i) e^{β_i t} E{|x(t)|²}.   (3.12)

Note that

E V(0, x(0), r(0)) = x^T(0) P_i x(0) + ∫_{−τ(0)}^{0} e^{β_i s} x^T(s) Q_1 x(s) ds + ∫_{−h}^{0} e^{β_i s} x^T(s) Q_2 x(s) ds + ∫_{−τ(0)}^{0} ∫_{s}^{0} e^{β_i η} x^T(η) Q_3 x(η) dη ds
≤ λ_max(P_i) E‖ξ‖² + λ_max(Q_1) E{ ∫_{−τ(0)}^{0} e^{β_i s} x^T(s) x(s) ds } + λ_max(Q_2) E{ ∫_{−h}^{0} e^{β_i s} x^T(s) x(s) ds } + λ_max(Q_3) E{ ∫_{−τ(0)}^{0} ∫_{s}^{0} e^{β_i η} x^T(η) x(η) dη ds }
≤ { λ_max(P_i) + [ λ_max(Q_1) + λ_max(Q_2) ] (1 − e^{−β_i τ̄})/β_i + τ λ_max(Q_3) } E‖ξ‖².   (3.13)

By the generalized Itô formula, together with (3.11) and (3.13), it follows that

E V(t, x(t), i) = E V(0, x(0), r(0)) + E ∫_0^t L V(s, x(s), r(s)) ds


≤ { λ_max(P_i) + [ λ_max(Q_1) + λ_max(Q_2) ] (1 − e^{−β_i τ̄})/β_i + τ λ_max(Q_3) } E‖ξ‖² − E{ ∫_0^t γ̄ e^{β_i s} |x(s)|² ds }.   (3.14)

Let

α_i = { λ_max(P_i) + [ λ_max(Q_1) + λ_max(Q_2) ] (1 − e^{−β_i τ̄})/β_i + τ λ_max(Q_3) } / λ_min(P_i),
α = max_{i∈S} α_i,  β = min_{i∈S} β_i.

It follows from (3.12) and (3.14) that

E{|x(t)|²} ≤ α e^{−βt} E‖ξ‖².   (3.15)

Therefore, by Definition 2.1 and (3.15), the equilibrium point of the neural network (2.7) is globally exponentially stable in the mean square. This completes the proof. □

Remark 1. To the best of our knowledge, global exponential stability criteria for neutral high-order stochastic Hopfield neural networks with Markovian jump parameters and mixed time delays have not been fully discussed in the existing literature. This paper reports a new idea and some sufficient exponential stability conditions for neutral high-order Hopfield neural networks with stochastic noise disturbance, mixed time delays and Markovian jump parameters, which generalize and improve the results in [6,23,25,46,48].

Remark 2. Methods based on M-matrices and algebraic inequalities have been applied widely due to their simple structure. However, since the absolute value operation is usually used to deal with the connection weights, the sign information of the connection weight values has to be neglected, and the corresponding stability results are more conservative. Hence, the theoretical results (Theorem 3.1) obtained with the LMI approach are less conservative.

Remark 3. The criterion given in Theorem 3.1 is delay-dependent. It is well known that delay-dependent criteria are less conservative than delay-independent criteria, particularly when the delay is small.

Based on Theorem 3.1, the following results can be obtained easily.

Case 1. If the Markovian jump parameters are not taken into account, then the neural network (2.7) simplifies to

d[x(t) − K x(t − h)] = [ −C x(t) + A f(x(t)) + B f(x(t − τ(t))) + D ∫_{t−τ(t)}^{t} f(x(s)) ds ] dt + s(x(t), x(t − τ(t)), t) dω(t).   (3.16)

Corollary 3.1. The neural network (3.16) is globally exponentially stable in the mean square if, for given β > 0, there exist symmetric positive definite matrices Q_1, Q_2, Q_3, P and G, and positive scalars δ, γ and λ, such that the following two linear matrix inequalities hold:

P < λI,   (3.17)

    [ Ξ_1  −βPK + KPC  0    PA    PB    0     PD ]
    [ ∗    Ξ_2         0    −KPA  −KPB  −KPD  0  ]
    [ ∗    ∗           Ξ_3  0     0     0     0  ]
Σ = [ ∗    ∗           ∗    −δI   0     0     0  ] < 0,   (3.18)
    [ ∗    ∗           ∗    ∗     −γI   0     0  ]
    [ ∗    ∗           ∗    ∗     ∗     Ξ_4   0  ]
    [ ∗    ∗           ∗    ∗     ∗     ∗     −G ]

where

Ξ_1 = βP − 2PC + Q_1 + Q_2 + τQ_3 + λR_1 + δL^T L,
Ξ_2 = βK^T P K − e^{−βh} Q_2,
Ξ_3 = −(1 − ϱ) e^{−βτ} Q_1 + λR_2 + γL^T L,
Ξ_4 = −(1 − ϱ) τ^{−1} Q_3 + G.

Case 2. If there are no stochastic disturbances in system (2.7), then the neural network simplifies to

(d/dt)[x(t) − K(r(t)) x(t − h)] = −C(r(t)) x(t) + A(r(t)) f(x(t)) + B(r(t)) f(x(t − τ(t))) + D(r(t)) ∫_{t−τ(t)}^{t} f(x(s)) ds.   (3.19)

Corollary 3.2. The neural network (3.19) is globally exponentially stable if, for given β_i > 0 (i ∈ S), there exist symmetric positive definite matrices Q_1, Q_2, Q_3, P_i and G_i (i ∈ S), and positive scalars δ and γ, such that the following linear matrix inequality holds:

      [ Ξ_1  −β_i P_i K_i + K_i P_i C_i  0    P_i A_i       P_i B_i       0             P_i D_i ]
      [ ∗    Ξ_2                         0    −K_i P_i A_i  −K_i P_i B_i  −K_i P_i D_i  0       ]
      [ ∗    ∗                           Ξ_3  0             0             0             0       ]
Σ_i = [ ∗    ∗                           ∗    −δI           0             0             0       ] < 0,   (3.20)
      [ ∗    ∗                           ∗    ∗             −γI           0             0       ]
      [ ∗    ∗                           ∗    ∗             ∗             Ξ_4           0       ]
      [ ∗    ∗                           ∗    ∗             ∗             ∗             −G_i    ]

where

Ξ_1 = β_i P_i − 2P_i C_i + Σ_{j=1}^{N} π_{ij} P_j + Q_1 + Q_2 + τQ_3 + δL^T L,
Ξ_2 = β_i K_i^T P_i K_i − e^{−β_i h} Q_2,
Ξ_3 = −(1 − ϱ) e^{−β_i τ} Q_1 + γL^T L,
Ξ_4 = −(1 − ϱ) τ^{−1} Q_3 + G_i.

4. Numerical simulations

In this section, the following numerical simulations are given to illustrate the results above.

Example 1. Consider the 2-D stochastic neural network (2.7) with x(t) = (x_1(t), x_2(t))^T, where ω(t) is a 2-D Brownian motion and r(t) is a right-continuous Markov chain taking values in S = {1, 2} with generator

Π = [ −3   3
       5  −5 ].

For the two operating conditions (modes), the associated data are

A_1 = [ 0.2  0.2 ]   B_1 = [ 0.1  0.2 ]   C_1 = [ 0.8  0   ]
      [ 0.3  0.1 ],        [ 0.3  0.1 ],        [ 0    0.7 ],

D_1 = [ 0.2  0.2 ]   A_2 = [ 0.3  0.2 ]   B_2 = [ 0.2  0.1 ]
      [ 0.3  0.1 ],        [ 0.1  0.2 ],        [ 0.2  0.1 ],


C2 = [0.9 0; 0 0.7],  D2 = [0.1 0.1; 0.1 0.2],  K1 = [0.1 0; 0 0.1],  K2 = [0.2 0; 0 0.2],
h = 0.5,  τ(t) = 0.5,  f(x) = 0.1 tanh(x),

σ(x(t), x(t − τ(t)), t, 1) = diag(0.1x1(t) + 0.2x1(t − τ(t)),  0.1x2(t) + 0.2x2(t − τ(t))),
σ(x(t), x(t − τ(t)), t, 2) = diag(0.2x1(t) + 0.3x1(t − τ(t)),  0.2x2(t) + 0.3x2(t − τ(t))).

It is easy to see that L = 0.1I and τ = 0.5. In addition, let β1 = 0.5 and β2 = 0.6. Using Theorem 3.1 and the Matlab LMI control toolbox, it can be found that the neural network (2.7) is globally exponentially stable in the mean square, and the solutions of the LMIs (3.1) and (3.2) are given as follows:

P1 = [0.6820 0.0002; 0.0002 0.6354],  P2 = [0.6842 0.0019; 0.0019 0.6431],
Q1 = [0.2266 0.0009; 0.0009 0.1557],  Q2 = [0.1735 0.0012; 0.0012 0.0884],
Q3 = [0.1643 0.0002; 0.0002 0.1543],  G1 = [0.4973 0.0095; 0.0095 0.4197],
G2 = [0.4233 0.0068; 0.0068 0.4188],  δ = 0.4956,  γ = 0.4575,
λ1 = 1.0095,  λ2 = 0.9646.
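As a quick sanity check on reported solutions of this kind, the matrices can be tested numerically for symmetry and positive definiteness. The sketch below is purely illustrative (it assumes NumPy is available and reuses only P1 and Q1 from the list above); it relies on the fact that a symmetric matrix is positive definite if and only if it admits a Cholesky factorization.

```python
import numpy as np

# Reported LMI solutions from Example 1 (values as printed above).
P1 = np.array([[0.6820, 0.0002],
               [0.0002, 0.6354]])
Q1 = np.array([[0.2266, 0.0009],
               [0.0009, 0.1557]])

def is_spd(M, tol=1e-12):
    """True iff M is symmetric and positive definite.

    Symmetry is checked directly; positive definiteness via the
    existence of a Cholesky factorization."""
    if not np.allclose(M, M.T, atol=tol):
        return False
    try:
        np.linalg.cholesky(M)
        return True
    except np.linalg.LinAlgError:
        return False

print(is_spd(P1), is_spd(Q1))  # both matrices pass the check
```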

By using the Euler–Maruyama numerical scheme, the simulations are run with T = 20 and step size 0.01. Fig. 1 shows the state response of mode 1 (i.e., neural network (2.7) when r(t) = 1) with the initial conditions x1(θ) = 0.3, x2(θ) = 0.6 for −0.5 ≤ θ ≤ 0, and Fig. 2 shows the state response of mode 2 (i.e., neural network (2.7) when r(t) = 2) with the initial conditions x1(θ) = 0.5, x2(θ) = 0.8 for −0.5 ≤ θ ≤ 0. The simulation results imply that the neural network (2.7) is globally exponentially stable in the mean square.

Fig. 1. State response of mode 1 in neural network (2.7) with the initial conditions x1(θ) = 0.3, x2(θ) = 0.6 for −0.5 ≤ θ ≤ 0.

Fig. 2. State response of mode 2 in neural network (2.7) with the initial conditions x1(θ) = 0.5, x2(θ) = 0.8 for −0.5 ≤ θ ≤ 0.

Example 2. Consider a three-dimensional stochastic high-order neural network (2.7) of neutral type. Let the Markov process governing the mode switching have generator

Π = [−0.7 0.7; 0.3 −0.3].

For the two operating conditions (modes), the associated data are

A1 = [2.7 0.6 1/3; 0.5 1 1/6; 0.8 2 1],   B1 = [0.8 0.15 1/6; 0.5 1 0.25; 0.5 0.25 1],
C1 = [3.1 0 0; 0 3.5 0; 0 0 4],           D1 = [0.37 0.92 1.38; 0.18 0.09 0.13; 0.5 0.7 0.5],
A2 = [3 0.5 1/3; 0.5 1 0.2; 1 2 1],       K1 = [0.1 0 0; 0 0.2 0; 0 0 0.3],   h = 0.3,
B2 = [1 0.25 1/6; 0.5 1 3.5; 0.5 1/3 1],  C2 = [4.1 0 0; 0 2.9 0; 0 0 3.1],
D2 = [0.41 0.6 1.5; 0.2 0.1 0.3; 0.4 0.7 0.6],  K2 = [0.2 0 0; 0 0.3 0; 0 0 0.1],
τ(t) = 0.9 + 0.2 sin(t),

Fig. 3. State response of mode 1 in neural network (2.7) with the initial conditions x1(θ) = 0.8, x2(θ) = 0.6, x3(θ) = 0.5 for −1.1 ≤ θ ≤ 0.

Fig. 4. State response of mode 2 in neural network (2.7) with the initial conditions x1(θ) = 0.8, x2(θ) = 0.6, x3(θ) = 0.8 for −1.1 ≤ θ ≤ 0.
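An Euler–Maruyama scheme of the kind used for these simulations can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the mode matrices reuse the magnitudes printed for Example 1 (their signs are an assumption of this sketch), the Markov chain is advanced by a first-order approximation of the generator, and the constant initial history mirrors the initial conditions of Fig. 1.

```python
import numpy as np

rng = np.random.default_rng(0)

# Mode data in the style of Example 1 (signs are an assumption of this sketch).
C = {1: np.array([[0.8, 0.0], [0.0, 0.7]]), 2: np.array([[0.9, 0.0], [0.0, 0.7]])}
A = {1: np.array([[0.2, 0.2], [0.3, 0.1]]), 2: np.array([[0.3, 0.2], [0.1, 0.2]])}
B = {1: np.array([[0.1, 0.2], [0.3, 0.1]]), 2: np.array([[0.2, 0.1], [0.2, 0.1]])}
q = {1: 3.0, 2: 5.0}                    # leaving rates of the generator Pi
f = lambda x: 0.1 * np.tanh(x)          # activation function of Example 1

dt, T, tau = 0.01, 20.0, 0.5
lag, n = int(round(tau / dt)), int(round(T / dt))

x = np.zeros((n + 1, 2))
x[0] = [0.3, 0.6]                       # constant initial history on [-tau, 0]
mode = 1
for k in range(n):
    x_del = x[max(k - lag, 0)]          # delayed state x(t - tau)
    drift = -C[mode] @ x[k] + A[mode] @ f(x[k]) + B[mode] @ f(x_del)
    # Mode-dependent diagonal noise intensity, as in Example 1.
    sig = (0.1 * x[k] + 0.2 * x_del) if mode == 1 else (0.2 * x[k] + 0.3 * x_del)
    x[k + 1] = x[k] + drift * dt + sig * rng.normal(0.0, np.sqrt(dt), 2)
    if rng.random() < q[mode] * dt:     # jump with probability q_i * dt
        mode = 3 - mode                 # toggle between modes 1 and 2

print(float(np.abs(x[-1]).max()))       # the state decays toward the origin
```

Because the noise intensity here is multiplicative (it vanishes with the state), a stable realization drives x(t) to the origin, which is the behavior the figures report.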


In addition, the noise intensity matrices and the activation function are

σ(x(t), x(t − τ(t)), t, 1) = diag(0.4x1(t − τ(t)),  0.3x2(t − τ(t)),  0.2x3(t − τ(t))),
σ(x(t), x(t − τ(t)), t, 2) = diag(0.4x1(t − τ(t)),  0.5x2(t),  0.2x3(t) + 0.3x3(t − τ(t))),
f(x) = 0.1(|x + 1| − |x − 1|).
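The bounds that the delay-dependent criterion needs can be read off from τ(t) = 0.9 + 0.2 sin(t): τ(t) ∈ [0.7, 1.1] (which is why the initial history in Figs. 3 and 4 runs over −1.1 ≤ θ ≤ 0) and |τ̇(t)| ≤ 0.2 < 1. A small sampled check, purely illustrative:

```python
import math

tau = lambda t: 0.9 + 0.2 * math.sin(t)       # time-varying delay of Example 2
d_tau = lambda t: 0.2 * math.cos(t)           # its derivative

ts = [k * 1e-3 for k in range(20000)]         # sample t over [0, 20)
vals = [tau(t) for t in ts]
print(min(vals), max(vals))                   # close to 0.7 and 1.1
print(max(abs(d_tau(t)) for t in ts))         # at most 0.2, strictly below 1
```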

Similar to Example 1, the neural network (2.7) is globally exponentially stable with the above parameters, as shown in Figs. 3 and 4.

Remark 4. Two- or three-neuron networks have played an important role in the analysis of the dynamics of neural networks. Despite the low number of neurons, in many instances these networks display the same behavior as larger networks, and many techniques developed for them carry over to large-scale networks such as those used in practical applications. Therefore, a two- or three-neuron example can serve as a prototype to improve the understanding of our theoretical results.

Remark 5. The type of stochastic perturbation in Examples 1 and 2 can be regarded as resulting from internal errors that occur when the simulation circuits are constructed, such as inaccurate design of the coupling strength and of other important parameters.

Remark 6. Two kinds of continuously distributed delays generally appear in neural network models: finitely distributed delays and infinitely distributed delays. In this paper, the distributed delays are finite and time-varying. In fact, our theoretical results cannot be used to deal with the stability of neutral high-order Hopfield neural networks with Markovian jump parameters, mixed time-varying delays (both discrete time-varying and infinitely distributed delays), and stochastic disturbance; this is a problem that we should study in the future.

5. Conclusions

This paper has dealt with the problem of global exponential stability analysis for a class of neutral high-order stochastic Hopfield neural networks that involve both Markovian jump parameters and mixed time delays. The jumping parameters are modeled as a continuous-time finite-state Markov chain.
A linear matrix inequality (LMI) approach has been developed to derive the stability criteria, which can be tested easily using the Matlab LMI toolbox. Numerical simulations have been used to demonstrate the usefulness of the main results. The main contributions of this paper are as follows: (1) The model suggested in this paper is fairly comprehensive, since it simultaneously incorporates noise perturbations, high-order terms, Markovian jump parameters, and mixed delays. (2) Techniques such as Lyapunov stability theory, stochastic analysis, mathematical induction, and the linear matrix inequality approach have been applied to reduce the conservativeness of the results. (3) The obtained stability conditions are delay-dependent: they depend not only on the upper bounds of the time delays but also on the upper bounds of their derivatives, and are thus less conservative than delay-independent criteria, especially when the delays are small. There are still many interesting and challenging mathematical questions that need to be studied for system (2.1). For example, it

is well known that diffusion effects cannot be avoided in neural networks when electrons move in asymmetric electromagnetic fields, so the variation of the activations must be considered in space as well as in time. On the other hand, it is almost impossible to obtain an exact mathematical model of a dynamic system, owing to modeling errors, measurement errors, linearization approximations, and so on. It is therefore reasonable and practical to assume that the models of the neural networks to be controlled almost always contain parameter uncertainties. Establishing similar results for neutral high-order stochastic neural networks with Markovian jump parameters, mixed time delays, reaction diffusion, and parameter uncertainties may constitute a direction for future research.
