Periodicity and synchronization of coupled memristive neural networks with supremums

Neurocomputing journal homepage: www.elsevier.com/locate/neucom

Brief Papers

Ying Wan a, Jinde Cao a,b,*

a Department of Mathematics, Southeast University, Nanjing 210096, China
b Department of Mathematics, Faculty of Science, King Abdulaziz University, Jeddah 21589, Saudi Arabia

Article info

Abstract

Article history: Received 11 November 2014; Received in revised form 9 January 2015; Accepted 2 February 2015.

This paper investigates the periodicity and synchronization of coupled memristive neural networks with supremums and time-varying delays. By employing a novel ω-matrix measure approach together with classical Filippov theory for discontinuous systems, new sufficient conditions are derived that ensure the global exponential periodicity and stability of the memristive neural network. Furthermore, a synchronization condition for drive–response memristive neural networks under an error-feedback control scheme is also derived. Finally, numerical examples are provided to demonstrate the validity of the main results. © 2015 Elsevier B.V. All rights reserved.

Keywords: Coupled memristive neural networks; Matrix measure; Periodicity and stability; Global exponential synchronization

1. Introduction

The concept of the memristor (a contraction of "memory" and "resistor") was first proposed by Chua [1]. The memristance of a memristor is nonlinear, which distinguishes it from the other three basic circuit elements: the resistor, the capacitor, and the inductor. More specifically, the memristance depends not only on the magnitude and polarity of the voltage applied to it, but also on the length of time the voltage has been applied. In other words, the memristor memorizes the history of the applied voltage and thus behaves like the forgetting and memory processes in the human brain [2]. A physical prototype of the memristor was not realized until almost 40 years later, when HP laboratories published their experimental findings in Nature [3]. Since then, this electronic device has attracted unprecedented attention worldwide. By substituting memristors for the resistors in a primitive neural network, a memristive neural network is obtained. The investigation of memristor-based neural networks has been a hot topic in recent years. According to the works [4,5], the essence of

☆ This work was supported in part by the National Natural Science Foundation of China (Grant nos. 61272530 and 11072059), the Natural Science Foundation of Jiangsu Province of China (Grant no. BK2012741), and the Specialized Research Fund for the Doctoral Program of Higher Education (Grant nos. 20110092110017 and 20130092110017).
* Correspondence to: Research Center for Complex Systems and Network Science, and Department of Mathematics, Southeast University, Nanjing 210096, PR China. E-mail address: [email protected] (J. Cao).

such a memristive neural network is a state-dependent switching recurrent network with a discontinuous right-hand side. Global uniform asymptotic stability was investigated in [6]. Additionally, [7] studied the nonlinear dynamics of memristor oscillators, both periodic and nonperiodic. Stability and periodic oscillation are prerequisites for the design of neural networks and are of significant importance in applications such as linear programming and pattern recognition [8,9]. However, divergence or instability often occurs because of time delays, which are caused by the finite speed of information transmission between neurons [10,11]. Various criteria for global stability and periodicity have been obtained for neural networks with or without delays [12,13]. It is therefore of great interest to study the global exponential periodicity and stability of memristive neural networks with time delays.

Synchronization means that specific states of all neurons in a network converge to a common value. The study of synchronization has found wide application in numerous fields, such as genetic networks, signal processing, and food webs [14,15]. The synchronization of memristor-based neural networks has been investigated recently; see, e.g., [16,17]. We aim to analyze the synchronization problem from a distinct point of view and under more general conditions.

The states of neural networks are often subject to instantaneous noise and exhibit abrupt changes, which can be modeled by impulsive differential equations with supremums [18,19]. In such models, the influence of the maximum of a function over a specified past interval is significant. For instance, in the automatic control of various practical systems, it is noticed

http://dx.doi.org/10.1016/j.neucom.2015.02.007
0925-2312/© 2015 Elsevier B.V. All rights reserved.

Please cite this article as: Y. Wan, J. Cao, Periodicity and synchronization of coupled memristive neural networks with supremums, Neurocomputing (2015), http://dx.doi.org/10.1016/j.neucom.2015.02.007


that the regulation law depends on the maximum values of certain regulated state parameters over given time intervals, as pointed out in [20]. Such differential equations with supremums have been applied efficiently in many fields of science and technology. In [21], the authors studied the stability and a corresponding control scheme for CNNs with supremums. In this paper we focus on memristive neural networks in which the activation functions depend on the supremum of the states over a past interval.

Motivated by the above concerns, we study coupled memristive recurrent neural networks with supremums and bounded delays. Employing the theory of discontinuous systems [22], new criteria are derived that guarantee the periodicity and stability of the memristive neural networks. The approach is novel in that it rests mainly on the matrix measure and a generalized Halanay inequality rather than on constructing Lyapunov functions or functionals. Some criteria derived by Lyapunov methods involve the matrix norms of the systems' connection weight matrices [23,24]; however, matrix norms are always nonnegative, whereas matrix measures can be negative, since their values are closely tied to the signs of the connection weights. Thus, the matrix measure incorporates more information about the excitatory and inhibitory behaviors of the neurons in the network. Note also that many other results on the stability and synchronization of memristive neural networks are stated in the form of LMIs [16,17], which entail heavy computational loads when the scale of the network is large. In such cases, algebraic conditions are more convenient and efficient, though they may introduce some conservatism. Meanwhile, some previous studies have also been based on the matrix measures $\mu_1(\cdot)$, $\mu_2(\cdot)$, $\mu_\infty(\cdot)$ [25,26]. Compared with these results, the highlight of this paper is the introduction of a novel ω-norm and the corresponding ω-matrix measure. The ω-matrix measure can be viewed as a generalization of the 1-matrix measure, but it involves $n$ free-weight parameters $\omega_j$, $j = 1, 2, \ldots, n$, which makes the corresponding results less conservative.

The structure of the paper is as follows. In Section 2, the model of the memristive coupled neural network with supremums is formulated and some preliminaries are presented. Global exponential periodicity and stability are studied in Section 3. Section 4 provides the synchronization control scheme for the drive–response memristive neural networks. Finally, numerical examples are given to illustrate the effectiveness of the analytical results.
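The matrix measures in question are cheap to evaluate numerically. The following sketch implements the standard formulas recalled in Definition 3 below; the test matrix and the weight vector are illustrative choices only.

```python
import numpy as np

# Sketch of the matrix measures discussed above: mu_1, mu_2, mu_inf are
# the standard induced measures, and mu_omega is the weighted column
# variant with free parameters w_i > 0. The test matrix A and the
# weights are illustrative choices only.

def mu_1(A):
    # mu_1(A) = max_j { a_jj + sum_{i != j} |a_ij| }
    B = np.abs(A).astype(float)
    np.fill_diagonal(B, np.diag(A))
    return B.sum(axis=0).max()

def mu_inf(A):
    # row counterpart of mu_1
    return mu_1(np.asarray(A).T)

def mu_2(A):
    return np.linalg.eigvalsh((A + A.T) / 2).max()

def mu_omega(A, w):
    # mu_omega(A) = max_j { a_jj + sum_{i != j} (w_i / w_j) |a_ij| }
    w = np.asarray(w, dtype=float)
    B = np.abs(A) * np.outer(w, 1.0 / w)
    np.fill_diagonal(B, np.diag(A))
    return B.sum(axis=0).max()

A = np.array([[-5.0, 1.0], [1.0, -8.0]])
# equal weights recover mu_1 ...
assert np.isclose(mu_omega(A, [1.0, 1.0]), mu_1(A))
# ... while tuned weights give a strictly smaller value, and the measure
# is negative although the induced norm never is:
assert mu_omega(A, [2.0, 1.0]) < mu_1(A) < 0 < np.abs(A).sum(axis=0).max()
```

With $\omega = (2, 1)$ this reports $\mu_\omega = -4.5$ versus $\mu_1 = -4$, the same slack the free weights provide in Example 2 of Section 5.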

2. Model description and preliminaries

In this paper, we consider the following coupled memristive neural network with bounded time-varying delays and supremums:
$$
\frac{dx_i(t)}{dt} = -r_i(t)x_i(t) + \sum_{j=1}^{n} a_{ij}(t,x(t))f_j(x_j(t)) + \sum_{j=1}^{n} b_{ij}(t,x(t))f_j\Big(\sup_{s\in[t-\tau(t),t]}x_j(s)\Big) + \sum_{j=1}^{n} c_{ij}(t,x(t))g_j(x_j(t-\tau_j(t))) + \sum_{j=1}^{n} d_{ij}(t)x_j(t) + I_i(t), \qquad (1)
$$
with initial states
$$
x_i(s) = \varphi_i(s), \qquad s\in[-\tau_0, 0], \qquad (2)
$$
where $\tau_0 = \max_{1\le j\le n}\{\tau, \tau_j\}$ and, for $i = 1,2,\ldots,n$, $0 < \tau(t)\le\tau$ and $0 < \tau_j(t)\le\tau_j$. Here $x_i(t)$ represents the state of the $i$th neuron at time $t$; $f_j(x_j(t))$ and $g_j(x_j(t-\tau_j(t)))$ denote (possibly different) activation functions of the $j$th neuron at times $t$ and $t-\tau_j(t)$, respectively; $f_j(\sup_{s\in[t-\tau(t),t]}x_j(s))$ is the $j$th neuron's output, which depends on the maximum value of its own state over the interval $[t-\tau(t),t]$; $I_i(t)$ is a continuous external bias on the $i$th neuron with upper bound $I_i$; and $r_i(t)$ denotes the rate at which the $i$th neuron resets its potential to the resting state when isolated from the other neurons and external biases at time $t$. The matrix $D(t) = (d_{ij}(t))_{n\times n}$ represents the topological structure of the network: $d_{ij}(t) > 0$ if there is a link from neuron $j$ to neuron $i$ at time $t$, and $d_{ij}(t) = 0$ otherwise. The connection weights $a_{ij}(t,x(t))$, $b_{ij}(t,x(t))$, $c_{ij}(t,x(t))$ are given by
$$
a_{ij}(t,x(t)) = \begin{cases} \hat a_{ij}(t), & k_j(x) > \eta_j, \\ \check a_{ij}(t), & k_j(x) < \eta_j, \end{cases} \qquad (3)
$$
$$
b_{ij}(t,x(t)) = \begin{cases} \hat b_{ij}(t), & k_j(x) > \eta_j, \\ \check b_{ij}(t), & k_j(x) < \eta_j, \end{cases} \qquad (4)
$$
$$
c_{ij}(t,x(t)) = \begin{cases} \hat c_{ij}(t), & k_j(x) > \eta_j, \\ \check c_{ij}(t), & k_j(x) < \eta_j, \end{cases} \qquad (5)
$$
for $i,j = 1,2,\ldots,n$ and all $t\in\mathbb{R}$, where $k_j(x)$ is a threshold level function and $\eta_j$ is the corresponding threshold value.

Remark 1. This model generalizes the one defined in [4] by letting the parameters $\hat a_{ij}(t)$, $\check a_{ij}(t)$, $\hat b_{ij}(t)$, $\check b_{ij}(t)$, $\hat c_{ij}(t)$, $\check c_{ij}(t)$ ($i,j = 1,2,\ldots,n$) vary with time, since parameter fluctuation over time is also unavoidable in hardware implementations of neural networks. Additionally, the threshold level conditions $k_j(x) = \eta_j$ ($j = 1,2,\ldots,n$) may take general forms and are not restricted to those used in previous studies [16,17].

Remark 2. As in the models of some existing works (e.g., [4,5]), the parameters $a_{ij}(t,x(t))$, $b_{ij}(t,x(t))$, $c_{ij}(t,x(t))$ in (3)–(5) are undefined when $k_j(x) = \eta_j$; i.e., these parameters are discontinuous functions, which leads us to study the solutions of (1) in the Filippov sense.

Throughout the paper, we consider the neural network (1) under the following assumptions:

(A1) The activation functions $f_j, g_j$ ($j = 1,2,\ldots,n$) satisfy the Lipschitz condition; i.e., there exist positive numbers $l_j, h_j$ ($j = 1,2,\ldots,n$) such that
$$
|f_j(x)-f_j(y)| \le l_j|x-y|, \qquad |g_j(x)-g_j(y)| \le h_j|x-y|, \qquad \forall x,y\in\mathbb{R}.
$$

(A2) $\hat a_{ij}(t), \check a_{ij}(t), \hat b_{ij}(t), \check b_{ij}(t), \hat c_{ij}(t), \check c_{ij}(t)$ are bounded continuous functions with bounds
$$
\bar a_{ij} = \max\Big\{\sup_{t\in\mathbb{R}}|\hat a_{ij}(t)|,\ \sup_{t\in\mathbb{R}}|\check a_{ij}(t)|\Big\}, \qquad \bar b_{ij} = \max\Big\{\sup_{t\in\mathbb{R}}|\hat b_{ij}(t)|,\ \sup_{t\in\mathbb{R}}|\check b_{ij}(t)|\Big\}, \qquad \bar c_{ij} = \max\Big\{\sup_{t\in\mathbb{R}}|\hat c_{ij}(t)|,\ \sup_{t\in\mathbb{R}}|\check c_{ij}(t)|\Big\}.
$$

(A3) For each $k_j$ ($j = 1,2,\ldots,n$) there exists a nonempty subset $\Delta_j\subseteq\mathbb{R}$ such that, whenever $k_j(x)\le\eta_j\le k_j(y)$, a $\delta_j$ can be chosen from $\Delta_j$ with
$$
x_j \le \delta_j \le y_j \qquad \text{or} \qquad y_j \le \delta_j \le x_j,
$$
and $f_j(\delta_j) = g_j(\delta_j) = 0$.

Remark 3. In (A3), if we choose $k_j(x) = |x_j|$, then $\Delta_j = \{\pm\eta_j\}$. This condition is satisfied by many well-known activation functions, for instance $f_j(x_j(t)) = \tanh(|x_j(t)|-\eta_j)$. Furthermore, $k_j(x)$ can also be chosen as $x_j^2$ or other functions, which widens the applicability of the model.

Since the right-hand side of (1) is discontinuous, classical solutions may not always exist. To facilitate the stability analysis, the concepts of set-valued maps and Filippov solutions are introduced [22].
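Before the formal analysis, the switching dynamics of model (1) can be illustrated with a small simulation. The following is a hypothetical two-neuron sketch: parameters are illustrative stand-ins (loosely patterned on Example 1 of Section 5, not the paper's exact data), and for brevity a single switching weight is reused for the a-, b-, and c-terms.

```python
import numpy as np

# Hypothetical two-neuron instance of model (1), integrated by forward
# Euler. Threshold k_j(x) = |x_j| with eta_j = 1; one switching weight w
# is reused for the a-, b- and c-terms for brevity.

h = 0.001                                   # step size
n_sup, n_del = 10, 1000                     # sup window 0.01 and delay 1.0, in steps

def f(x):                                   # activation, Lipschitz with l_j = 1
    return np.tanh(x - 1.0)

def w(t, xj):                               # state-dependent switching, Eq. (3)
    return np.sin(t) if abs(xj) < 1.0 else 0.6 * np.cos(t)

R = np.array([7.0, 7.0])                    # reset rates r_i
D = np.array([[0.0, 1.0], [-1.0, 0.0]])     # coupling d_ij
I = lambda t: np.array([0.8 * np.sin(t), 0.6 * np.cos(t)])   # external bias

steps = 15000
x = np.zeros((steps + n_del + 1, 2))
x[: n_del + 1] = [0.5, -0.5]                # constant initial function, Eq. (2)

for k in range(n_del, n_del + steps):
    t = (k - n_del) * h
    cur = x[k]
    sup_x = x[k - n_sup : k + 1].max(axis=0)   # sup over [t - 0.01, t]
    delayed = x[k - n_del]                     # x(t - 1)
    dx = -R * cur + D @ cur + I(t)
    for i in range(2):
        for j in range(2):
            wij = w(t, cur[j])
            dx[i] += wij * (f(cur[j]) + f(sup_x[j]) + f(delayed[j]))
    x[k + 1] = cur + h * dx

assert np.all(np.isfinite(x)) and np.abs(x).max() < 10.0
```

Under a condition such as (16) below, trajectories of such a system should settle onto a common periodic orbit regardless of the initial function; the strong decay rates chosen here keep the Euler iteration well behaved.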


The set-valued map is defined as
$$
u[X](x) \triangleq \bigcap_{\varepsilon>0}\ \bigcap_{\mu(S)=0} \overline{\operatorname{co}}\,[X(B(x,\varepsilon))\setminus S],
$$
where $X$ is the original vector field, $u[X]$ is the Filippov set-valued map, $\overline{\operatorname{co}}(S)$ denotes the convex closure of a set $S$, and $\mu$ is the Lebesgue measure. The Filippov set-valued maps of the connection strength coefficients in (1) are therefore
$$
u(a_{ij}(t,x(t))) = \begin{cases} \hat a_{ij}(t), & k_j(x) > \eta_j, \\ \overline{\operatorname{co}}\{\hat a_{ij}(t), \check a_{ij}(t)\}, & k_j(x) = \eta_j, \\ \check a_{ij}(t), & k_j(x) < \eta_j, \end{cases} \qquad (6)
$$
$$
u(b_{ij}(t,x(t))) = \begin{cases} \hat b_{ij}(t), & k_j(x) > \eta_j, \\ \overline{\operatorname{co}}\{\hat b_{ij}(t), \check b_{ij}(t)\}, & k_j(x) = \eta_j, \\ \check b_{ij}(t), & k_j(x) < \eta_j, \end{cases} \qquad (7)
$$
$$
u(c_{ij}(t,x(t))) = \begin{cases} \hat c_{ij}(t), & k_j(x) > \eta_j, \\ \overline{\operatorname{co}}\{\hat c_{ij}(t), \check c_{ij}(t)\}, & k_j(x) = \eta_j, \\ \check c_{ij}(t), & k_j(x) < \eta_j. \end{cases} \qquad (8)
$$

Definition 1. The function $x(t)$ is said to be a solution of (1) with initial condition (2) if $x(t)$ is absolutely continuous on every compact interval and satisfies the differential inclusion
$$
\frac{dx_i(t)}{dt} \in -r_i(t)x_i(t) + \sum_{j=1}^{n} u(a_{ij}(t,x(t)))f_j(x_j(t)) + \sum_{j=1}^{n} u(b_{ij}(t,x(t)))f_j\Big(\sup_{s\in[t-\tau(t),t]}x_j(s)\Big) + \sum_{j=1}^{n} u(c_{ij}(t,x(t)))g_j(x_j(t-\tau_j(t))) + \sum_{j=1}^{n} d_{ij}(t)x_j(t) + I_i(t). \qquad (9)
$$

For simplicity, the memristive neural network (9) can be written as
$$
\dot x(t) \in (-R(t)+D(t))x(t) + U_a(t,x(t))f(x(t)) + U_b(t,x(t))f\Big(\sup_{s\in[t-\tau(t),t]}x(s)\Big) + U_c(t,x(t))g(x(t-\tau^*)) + I(t), \qquad (10)
$$
where $x(t) = (x_1(t),\ldots,x_n(t))^T$, $f(s) = (f_1(s),\ldots,f_n(s))^T$, $g(s) = (g_1(s),\ldots,g_n(s))^T$, $R(t) = \operatorname{diag}\{r_1(t),\ldots,r_n(t)\}$, $D(t) = (d_{ij}(t))_{n\times n}$, $U_a(t,x(t)) = (u(a_{ij}(t,x(t))))_{n\times n}$, $U_b(t,x(t)) = (u(b_{ij}(t,x(t))))_{n\times n}$, $U_c(t,x(t)) = (u(c_{ij}(t,x(t))))_{n\times n}$, $I(t) = (I_1(t),\ldots,I_n(t))^T$, and $g(x(t-\tau^*))$ abbreviates $(g_1(x_1(t-\tau_1(t))),\ldots,g_n(x_n(t-\tau_n(t))))^T$.

Lemma 1 (Filippov [22]). Consider the differential inclusion $\dot x(t)\in F(t,x)$. If, for all $(t,x)\in G$, the set $F(t,x)$ is nonempty, bounded, closed, and convex, and $F$ is upper semicontinuous in $(t,x)$, then for any point $(t_0,x_0)\in G$ there exists a solution of $\dot x\in F(t,x)$ with $x(t_0)=x_0$.

Remark 4. Combining Lemma 1 with Assumptions (A1)–(A3), one can infer that a local solution of (1) exists and can be extended to its maximal interval of existence in the Filippov sense.

Definition 2 (Vidyasagar [27], Cao [28]). For $x\in\mathbb{R}^n$, the vector norms are defined as
$$
\|x\|_1 = \sum_{i=1}^n |x_i|, \qquad \|x\|_2 = \Big(\sum_{i=1}^n x_i^2\Big)^{1/2}, \qquad \|x\|_\infty = \max_{1\le i\le n}|x_i|, \qquad \|x\|_\omega = \sum_{i=1}^n \omega_i |x_i|,
$$
where $\omega_i > 0$ ($i = 1,2,\ldots,n$) are positive constants. For a matrix $A = (a_{ij})_{n\times n}\in\mathbb{R}^{n\times n}$, the corresponding induced matrix norms are
$$
\|A\|_1 = \max_{1\le j\le n}\sum_{i=1}^n |a_{ij}|, \qquad \|A\|_2 = (\lambda_{\max}(A^TA))^{1/2}, \qquad \|A\|_\infty = \max_{1\le i\le n}\sum_{j=1}^n |a_{ij}|, \qquad \|A\|_\omega = \max_{1\le j\le n}\sum_{i=1}^n \frac{\omega_i}{\omega_j}|a_{ij}|,
$$
where $\lambda_{\max}(A^TA)$ denotes the maximum eigenvalue of $A^TA$.

Definition 3 (Vidyasagar [27], Cao [28]). Suppose $A = (a_{ij})_{n\times n}$ is a real matrix. The matrix measure of $A$ is defined as
$$
\mu_p(A) = \lim_{h\to 0^+} \frac{\|I + hA\|_p - 1}{h}, \qquad (11)
$$
where $I$ is the $n\times n$ identity matrix and $\|\cdot\|_p$ is the corresponding induced matrix norm. For $p = 1, 2, \infty, \omega$, the matrix measure can be calculated as
$$
\mu_1(A) = \max_{1\le j\le n}\Big\{a_{jj} + \sum_{i=1,\,i\ne j}^n |a_{ij}|\Big\}, \qquad \mu_2(A) = \lambda_{\max}\big((A+A^T)/2\big),
$$
$$
\mu_\infty(A) = \max_{1\le i\le n}\Big\{a_{ii} + \sum_{j=1,\,j\ne i}^n |a_{ij}|\Big\}, \qquad \mu_\omega(A) = \max_{1\le j\le n}\Big\{a_{jj} + \sum_{i=1,\,i\ne j}^n \frac{\omega_i}{\omega_j}|a_{ij}|\Big\}.
$$

Remark 5. According to the above definitions, for any real matrix the limit in (11) exists, so $\mu_p(\cdot)$ is well defined. Additionally, the value of $\mu_p(\cdot)$ may be negative, since it retains the sign information of the diagonal elements. It can be verified that $-\|A\|_p \le \mu_p(A) \le \|A\|_p$ for all $A\in\mathbb{R}^{n\times n}$ and that, in general, $\mu_p(A)\ne\mu_p(-A)$. Hence the matrix measure is an essentially different measurement of matrices compared with matrix norms.

Definition 4. The neural network (1) is said to be globally exponentially periodic if there exists a unique $T$-periodic solution $x^*(t,\phi)$ and every other solution $x(t,\varphi)$ of (1) converges to it exponentially; i.e., there exist constants $G\ge1$ and $\mu>0$ such that
$$
\|x(t,\varphi) - x^*(t,\phi)\|_p \le G\,\|\varphi-\phi\|_p\,e^{-\mu t}, \qquad t\ge0,
$$
where $p = 1, 2, \infty, \omega$.

Definition 5. The neural network (1) is said to be globally exponentially stable if its unique equilibrium point $x^* = (x_1^*, x_2^*, \ldots, x_n^*)^T$ is globally exponentially stable; i.e., there exist constants $G\ge1$ and $\mu>0$ such that
$$
\|x(t,\varphi) - x^*\|_p \le G\,\|\varphi - x^*\|_p\,e^{-\mu t}, \qquad t\ge0,
$$
where $p = 1, 2, \infty, \omega$.

Lemma 2. Under Assumptions (A1)–(A3), for any $A_x(t)\in U_a(t,x(t))$, $A_y(t)\in U_a(t,y(t))$, $B_x(t)\in U_b(t,x(t))$, $B_y(t)\in U_b(t,y(t))$, $C_x(t)\in U_c(t,x(t))$, $C_y(t)\in U_c(t,y(t))$, the following inequalities hold:
$$
\|A_x(t)f(x(t)) - A_y(t)f(y(t))\|_p \le \|\bar A\|_p\|L\|_p\|x(t)-y(t)\|_p, \qquad (12)
$$
$$
\Big\|B_x(t)f\Big(\sup_{s\in[t-\tau(t),t]}x(s)\Big) - B_y(t)f\Big(\sup_{s\in[t-\tau(t),t]}y(s)\Big)\Big\|_p \le \|\bar B\|_p\|L\|_p \sup_{s\in[t-\tau_0,t]}\|x(s)-y(s)\|_p, \qquad (13)
$$
$$
\|C_x(t)g(x(t-\tau^*)) - C_y(t)g(y(t-\tau^*))\|_p \le \|\bar C\|_p\|H\|_p \sup_{s\in[t-\tau_0,t]}\|x(s)-y(s)\|_p, \qquad (14)
$$
for $p = 1, \infty, \omega$. Here $\bar A = (\bar a_{ij})_{n\times n}$, $\bar B = (\bar b_{ij})_{n\times n}$, $\bar C = (\bar c_{ij})_{n\times n}$, $L = \operatorname{diag}\{l_1,\ldots,l_n\}$, $H = \operatorname{diag}\{h_1,\ldots,h_n\}$.

Proof. We first prove the inequality
$$
|a^*_{ij}(t)f_j(x_j(t)) - a^{**}_{ij}(t)f_j(y_j(t))| \le \bar a_{ij}\,l_j\,|x_j(t)-y_j(t)|, \qquad (15)
$$
where $a^*_{ij}(t)$ and $a^{**}_{ij}(t)$ denote the $(i,j)$ entries of $A_x(t)$ and $A_y(t)$, respectively.


Inequalities (13) and (14) can then be derived by similar methods; the details are omitted for brevity.

Case 1: When $k_j(x) < \eta_j$ and $k_j(y) < \eta_j$,
$$
|a^*_{ij}(t)f_j(x_j(t)) - a^{**}_{ij}(t)f_j(y_j(t))| = |\check a_{ij}(t)f_j(x_j(t)) - \check a_{ij}(t)f_j(y_j(t))| \le \bar a_{ij}l_j|x_j(t)-y_j(t)|.
$$

Case 2: When $k_j(x) > \eta_j$ and $k_j(y) > \eta_j$,
$$
|a^*_{ij}(t)f_j(x_j(t)) - a^{**}_{ij}(t)f_j(y_j(t))| = |\hat a_{ij}(t)f_j(x_j(t)) - \hat a_{ij}(t)f_j(y_j(t))| \le \bar a_{ij}l_j|x_j(t)-y_j(t)|.
$$

Case 3: When $k_j(x) < \eta_j < k_j(y)$, using Assumption (A3) with $f_j(\delta_j) = 0$, one has
$$
|a^*_{ij}(t)f_j(x_j(t)) - a^{**}_{ij}(t)f_j(y_j(t))| = |\check a_{ij}(t)f_j(x_j(t)) - \hat a_{ij}(t)f_j(y_j(t))| \le |\check a_{ij}(t)||f_j(x_j(t)) - f_j(\delta_j)| + |\hat a_{ij}(t)||f_j(y_j(t)) - f_j(\delta_j)| \le \bar a_{ij}l_j|x_j(t)-\delta_j| + \bar a_{ij}l_j|y_j(t)-\delta_j| = \bar a_{ij}l_j|x_j(t)-y_j(t)|,
$$
since $\delta_j$ lies between $x_j(t)$ and $y_j(t)$.

Case 4: When $k_j(x)\le\eta_j\le k_j(y)$ with at least one equality, assume without loss of generality $k_j(x) = \eta_j < k_j(y)$; then there exists $\tilde a_{ij}(t)\in\overline{\operatorname{co}}\{\hat a_{ij}(t),\check a_{ij}(t)\}$ such that $a^*_{ij}(t) = \tilde a_{ij}(t)$, and, as in Case 3, (15) also holds.

Next, denote $U = A_x(t)f(x(t)) - A_y(t)f(y(t))$. It follows from (15) that
$$
\|U\|_\infty \le \max_{1\le i\le n}\sum_{k=1}^n \bar a_{ik}l_k|x_k(t)-y_k(t)| \le \Big(\max_{1\le i\le n}\sum_{k=1}^n \bar a_{ik}\Big)\Big(\max_{1\le k\le n}l_k\Big)\max_{1\le k\le n}|x_k(t)-y_k(t)| = \|\bar A\|_\infty\|L\|_\infty\|x(t)-y(t)\|_\infty,
$$
and, for the weighted norm,
$$
\|U\|_\omega \le \sum_{j=1}^n\Big(\sum_{i=1}^n \frac{\omega_i}{\omega_j}\bar a_{ij}\Big)l_j\,\omega_j|x_j(t)-y_j(t)| \le \Big(\max_{1\le j\le n}\sum_{i=1}^n \frac{\omega_i}{\omega_j}\bar a_{ij}\Big)\Big(\max_{1\le j\le n}l_j\Big)\|x(t)-y(t)\|_\omega = \|\bar A\|_\omega\|L\|_\omega\|x(t)-y(t)\|_\omega.
$$
The case $p = 1$ is analogous, which gives (12). This completes the proof. □

Lemma 3 (Liu et al. [29]; generalized Halanay inequality). Suppose $u(t)\ge0$, $\alpha(t)\le0$, $\beta(t)\ge0$ for $t\in(-\infty,+\infty)$, and
$$
D^+u(t) \le \alpha(t)u(t) + \beta(t)\sup_{s\in[t-\tau(t),t]}u(s), \qquad t\ge t_0,
$$
with $u(t) = |\psi(t)|$ for $t\le t_0$, where $\psi(t)$ is bounded and continuous for $t\le t_0$. If there exists $\sigma>0$ such that
$$
\alpha(t)+\beta(t) \le -\sigma < 0, \qquad t\ge t_0,
$$
then there exist $G\ge1$ and $\mu^*$ such that
$$
u(t) \le G e^{-\mu^*(t-t_0)}, \qquad t\ge t_0,
$$
where
$$
\mu^* = \inf_{t\ge t_0}\{\mu(t) : \alpha(t)+\beta(t)e^{\mu(t)\tau(t)} = 0\},
$$
and the upper-right Dini derivative is defined as $D^+u(t) = \limsup_{h\to0^+}(u(t+h)-u(t))/h$.

Remark 6. Since the generalized Halanay inequality covers the case in which $\alpha(t)$ and $\beta(t)$ are general time-varying functions, it plays a significant role in the study of the stability and dissipativity of differential equations with nonconstant coefficients. Furthermore, it is the existence of $\mu^*>0$, where $\mu^* = \inf_{t\ge t_0}\{\mu(t) : \alpha(t)+\beta(t)e^{\mu(t)\tau(t)} = 0\}$, that determines the convergence rate of $u(t)$. However, one cannot conclude that $u(t)$ converges exponentially to zero when $\tau(t)$ is an unbounded delay, since the existence of $\mu^*$ depends heavily on the boundedness of $\tau(t)$. Generally, for unbounded $\tau(t)$ a positive $\mu^*$ does not exist unless $\alpha(t)$ and $\beta(t)$ vary exponentially.

3. Global exponential periodicity

Theorem 1. Suppose that $r_i(t)$, $d_{ij}(t)$, $\hat a_{ij}(t)$, $\check a_{ij}(t)$, $\hat b_{ij}(t)$, $\check b_{ij}(t)$, $\hat c_{ij}(t)$, $\check c_{ij}(t)$, and $I_i(t)$ are all continuous $T$-periodic functions. Then the memristive neural network (1) is globally exponentially periodic under Assumptions (A1)–(A3) if there exist a positive number $\sigma$ and a matrix measure $\mu_p(\cdot)$, $p = 1, \infty, \omega$, such that
$$
\mu_p(-R(t)+D(t)) + \|\bar A\|_p\|L\|_p + \|\bar B\|_p\|L\|_p + \|\bar C\|_p\|H\|_p \le -\sigma < 0, \qquad \forall t\ge0. \qquad (16)
$$

Proof. We first prove that every solution of (1) is asymptotically periodic, i.e., for any given $\varepsilon>0$ there exists $T^*$ such that
$$
\|x(t+T)-x(t)\|_p \le \varepsilon, \qquad \forall t\ge T^*.
$$
Let $z(t) = x(t+T) - x(t)$. Then
$$
\frac{dz(t)}{dt} \in (D(t+T)-R(t+T))x(t+T) - (D(t)-R(t))x(t) + U_a(t+T,x(t+T))f(x(t+T)) - U_a(t,x(t))f(x(t)) + U_b(t+T,x(t+T))f\Big(\sup_{s\in[t+T-\tau(t),\,t+T]}x(s)\Big) - U_b(t,x(t))f\Big(\sup_{s\in[t-\tau(t),\,t]}x(s)\Big) + U_c(t+T,x(t+T))g(x(t-\tau^*+T)) - U_c(t,x(t))g(x(t-\tau^*))
$$
$$
= (-R(t)+D(t))z(t) + U_a(t+T,x(t+T))f(x(t+T)) - U_a(t,x(t))f(x(t)) + \cdots
$$
by the $T$-periodicity of the parameters. Employing the definition of the matrix measure and Lemma 2,
$$
D^+\|z(t)\|_p = \lim_{h\to0^+}\frac{\|z(t+h)\|_p-\|z(t)\|_p}{h} = \lim_{h\to0^+}\frac{\|z(t)+h\dot z(t)+o(h)\|_p-\|z(t)\|_p}{h}
$$
$$
\le \big(\mu_p(-R(t)+D(t)) + \|\bar A\|_p\|L\|_p\big)\|z(t)\|_p + \big(\|\bar B\|_p\|L\|_p + \|\bar C\|_p\|H\|_p\big)\sup_{s\in[t-\tau_0,t]}\|z(s)\|_p \triangleq \alpha(t)\|z(t)\|_p + \beta(t)\sup_{s\in[t-\tau_0,t]}\|z(s)\|_p.
$$
Since $\alpha(t)\le0$ and, equivalently to (16), $\alpha(t)+\beta(t)\le-\sigma<0$ for all $t\ge0$, Lemma 3 yields
$$
\|z(t)\|_p \le Ge^{-\mu^* t}, \qquad t\ge0,
$$
where
$$
\mu^* = \inf_{t\ge0}\big\{\mu(t)\ \big|\ \mu_p(-R(t)+D(t)) + \|\bar A\|_p\|L\|_p + (\|\bar B\|_p\|L\|_p + \|\bar C\|_p\|H\|_p)e^{\mu(t)\tau_0} = 0\big\}. \qquad (17)
$$
Hence there exists at least one periodic solution of the neural network (1), so the existence of an asymptotically periodic solution $x^*(t)$ is obtained. By virtue of calculations similar to the above
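The convergence rate defined by the scalar equation in (17) can be computed numerically. For constant coefficients the root is available in closed form; the following sketch uses illustrative values only, with $\alpha$ standing for $\mu_p(-R+D)+\|\bar A\|_p\|L\|_p$ and $\beta$ for $\|\bar B\|_p\|L\|_p+\|\bar C\|_p\|H\|_p$.

```python
import math

# Numerical sketch of the convergence-rate formula (17). For constant
# alpha < 0 < beta with alpha + beta < 0 and a constant delay bound
# tau0 > 0, the defining equation
#   alpha + beta * exp(mu * tau0) = 0
# has the closed-form root mu = ln(-alpha / beta) / tau0. The sample
# values (alpha = -3.1, beta = 2.8, tau0 = 1) are illustrative only.

def halanay_rate(alpha, beta, tau0):
    assert alpha < 0 < beta and alpha + beta < 0 and tau0 > 0
    return math.log(-alpha / beta) / tau0

mu_star = halanay_rate(-3.1, 2.8, 1.0)
assert mu_star > 0
assert abs(-3.1 + 2.8 * math.exp(mu_star * 1.0)) < 1e-9   # root check
```

For time-varying coefficients one evaluates this root on a grid of $t$ and takes the infimum, mirroring (17).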


proof, the uniqueness of the periodic solution can be guaranteed. Thus, all solutions of (1) converge exponentially to the unique periodic solution $x^*(t)$ with convergence rate $\mu^*$ given by (17). This completes the proof. □

Remark 7. Note that a new kind of matrix measure, $\mu_\omega(\cdot)$, is introduced in addition to $\mu_1(\cdot)$, $\mu_2(\cdot)$, and $\mu_\infty(\cdot)$. $\|\cdot\|_\omega$ and $\mu_\omega(\cdot)$ can be viewed as generalizations of $\|\cdot\|_1$ and $\mu_1(\cdot)$, respectively, since $\mu_\omega(\cdot) = \mu_1(\cdot)$ whenever all the $\omega_i$ are equal (and $\|\cdot\|_\omega = \|\cdot\|_1$ when $\omega_i \equiv 1$). However, the ω-norm and ω-matrix measure take different values for different choices of the positive constants $\omega_i$. If there exist positive numbers $\omega_i$, $i = 1,2,\ldots,n$, such that the corresponding $\|\cdot\|_\omega$ and $\mu_\omega(\cdot)$ satisfy condition (16), then global exponential periodicity is achieved. The conservatism of the result is thus further reduced by the involvement of the free-weight parameters $\omega_i$, $i = 1,2,\ldots,n$.

Corollary 1. For the memristor-based neural network (1), suppose $\hat a_{ij}(t)\equiv\hat a_{ij}$, $\check a_{ij}(t)\equiv\check a_{ij}$, $\hat b_{ij}(t)\equiv\hat b_{ij}$, $\check b_{ij}(t)\equiv\check b_{ij}$, $\hat c_{ij}(t)\equiv\hat c_{ij}$, $\check c_{ij}(t)\equiv\check c_{ij}$, $d_{ij}(t)\equiv d_{ij}$. Then, under the corresponding Assumptions (A1)–(A3), if there exist a positive number $\lambda$ and a matrix measure $\mu_p(\cdot)$, $p = 1, \infty, \omega$, such that
$$
\mu_p(-R(t)+D) + \|\bar A\|_p\|L\|_p + \|\bar B\|_p\|L\|_p + \|\bar C\|_p\|H\|_p \le -\lambda < 0 \qquad (18)
$$
for all $t\in\mathbb{R}$, then the memristive neural network (1) is globally exponentially stable.

Remark 8. The criteria obtained here are not in the form of LMIs, which are mainly derived by constructing suitable Lyapunov functions or functionals; instead, a matrix measure strategy is employed to establish global exponential periodicity and stability. This is a novel and efficient approach, since the conditions are simple to compute; moreover, it is less conservative than norm-based conditions, because the matrix measure uses the information in the matrix elements more fully and balances the effects of positive and negative values.

4. Global exponential synchronization

In this section, the synchronization of drive–response memristive coupled neural networks under error-feedback controllers is investigated. The drive system is described as
$$
\dot x(t) \in (-R(t)+D(t))x(t) + U_a(t,x(t))f(x(t)) + U_b(t,x(t))f\Big(\sup_{s\in[t-\tau(t),t]}x(s)\Big) + U_c(t,x(t))g(x(t-\tau^*)) + I(t), \qquad (19)
$$
and the corresponding response system with control input $u(t)$ is given by
$$
\dot y(t) \in (-R(t)+D(t))y(t) + U_a(t,y(t))f(y(t)) + U_b(t,y(t))f\Big(\sup_{s\in[t-\tau(t),t]}y(s)\Big) + U_c(t,y(t))g(y(t-\tau^*)) + I(t) + u(t), \qquad (20)
$$
where $u(t) = (u_1(t),\ldots,u_n(t))^T$, $x(t) = (x_1(t),\ldots,x_n(t))^T$, $y(t) = (y_1(t),\ldots,y_n(t))^T$, and the remaining notation is as defined in Section 3. Supposing that the states of the drive and response systems are available instantaneously, we design the controller $u(t)$ in the error-feedback form with feedback gain $K$, i.e., $u(t) = K(y(t)-x(t))$.

Theorem 2. Under Assumptions (A1)–(A3), if there exist a positive number $\lambda$ and a matrix measure $\mu_p(\cdot)$, $p = 1, \infty, \omega$, such that
$$
\mu_p(-R(t)+D(t)+K) + \|\bar A\|_p\|L\|_p + \|\bar B\|_p\|L\|_p + \|\bar C\|_p\|H\|_p \le -\lambda < 0, \qquad \forall t\ge0, \qquad (21)
$$
then the response system exponentially synchronizes to the drive system under the control input $u(t) = K(y(t)-x(t))$.

Proof. Let $e(t) = y(t)-x(t)$. According to (19) and (20), the error system can be written as
$$
\frac{de(t)}{dt} \in (-R(t)+D(t)+K)e(t) + U_a(t,y(t))f(y(t)) - U_a(t,x(t))f(x(t)) + U_b(t,y(t))f\Big(\sup_{s\in[t-\tau(t),t]}y(s)\Big) - U_b(t,x(t))f\Big(\sup_{s\in[t-\tau(t),t]}x(s)\Big) + U_c(t,y(t))g(y(t-\tau^*)) - U_c(t,x(t))g(x(t-\tau^*)).
$$
Employing the definition of the matrix measure and Lemma 2, one obtains
$$
D^+\|e(t)\|_p \le \big(\mu_p(-R(t)+D(t)+K) + \|\bar A\|_p\|L\|_p\big)\|e(t)\|_p + \big(\|\bar B\|_p\|L\|_p + \|\bar C\|_p\|H\|_p\big)\sup_{s\in[t-\tau_0,t]}\|e(s)\|_p \triangleq \alpha(t)\|e(t)\|_p + \beta(t)\sup_{s\in[t-\tau_0,t]}\|e(s)\|_p.
$$
Using the generalized Halanay inequality (Lemma 3), if (21) is satisfied, then there exists a constant $G\ge1$ such that
$$
\|e(t)\|_p \le Ge^{-\mu^* t}, \qquad t\ge0,
$$
where
$$
\mu^* = \inf_{t\ge0}\big\{\mu(t)\ \big|\ \mu_p(-R(t)+D(t)+K) + \|\bar A\|_p\|L\|_p + (\|\bar B\|_p\|L\|_p + \|\bar C\|_p\|H\|_p)e^{\mu(t)\tau_0} = 0\big\}.
$$
Thus, the response neural network globally exponentially synchronizes to the drive system under the control $u(t) = K(y(t)-x(t))$ with convergence rate $\mu^*$. □

5. Numerical examples

In this section, two examples are presented to illustrate the effectiveness of our results.

Example 1. Consider the following memristive coupled neural network:
$$
\frac{dx_i(t)}{dt} = -r_i x_i(t) + a_{i1}(t,x(t))f_1(x_1(t)) + a_{i2}(t,x(t))f_2(x_2(t)) + b_{i1}(t,x(t))f_1\Big(\sup_{s\in[t-0.01,t]}x_1(s)\Big) + b_{i2}(t,x(t))f_2\Big(\sup_{s\in[t-0.01,t]}x_2(s)\Big) + c_{i1}(t,x(t))g_1(x_1(t-1)) + c_{i2}(t,x(t))g_2(x_2(t-1)) + d_{i1}(t)x_1(t) + d_{i2}(t)x_2(t) + I_i(t), \qquad i = 1,2, \qquad (22)
$$
where $f_i(x_i(t)) = g_i(x_i(t)) = \tanh(x_i(t)-1)$, $i = 1,2$, $I_1(t) = 0.8\sin(t)$, $I_2(t) = 0.6\cos(t)$, and
$$
a_{11} = b_{11} = c_{11} = \begin{cases}\sin(t), & |x_1|<1,\\ 0.6\cos(t), & |x_1|>1,\end{cases} \qquad a_{12} = b_{12} = c_{12} = \begin{cases}0.8\cos(t), & |x_2|<1,\\ 0.9\sin(t), & |x_2|>1,\end{cases}
$$
$$
a_{21} = b_{21} = c_{21} = \begin{cases}0.8\sin(t), & |x_1|<1,\\ 0.6\cos(t), & |x_1|>1,\end{cases} \qquad a_{22} = b_{22} = c_{22} = \begin{cases}\cos(t), & |x_2|<1,\\ 0.7\sin(t), & |x_2|>1,\end{cases}
$$
with
$$
D(t) = \begin{pmatrix}0 & 1+0.1\sin(t)\\ -1-0.1\sin(t) & 0\end{pmatrix}, \qquad R = \begin{pmatrix}7 & 0\\ 0 & 7\end{pmatrix},
$$
so that
$$
\bar A = \bar B = \bar C = \begin{pmatrix}1 & 0.9\\ 0.8 & 1\end{pmatrix}, \qquad L = H = \begin{pmatrix}1 & 0\\ 0 & 1\end{pmatrix}.
$$
By computing $\mu_1(-R+D(t)) = \max\{-7+|-1-0.1\sin(t)|,\ -7+|1+0.1\sin(t)|\} = -6+0.1\sin(t) \le -5.9$ and $\|\bar A\|_1\|L\|_1 + \|\bar B\|_1\|L\|_1 + \|\bar C\|_1\|H\|_1 = 1.9\times3 = 5.7$, and choosing $\lambda = 0.1$ and $p = 1$, one has
$$
\mu_1(-R+D(t)) + \|\bar A\|_1\|L\|_1 + \|\bar B\|_1\|L\|_1 + \|\bar C\|_1\|H\|_1 + \lambda \le -0.1 < 0, \qquad \forall t\in\mathbb{R}.
$$
It follows directly from Theorem 1 that the neural network (22) is globally exponentially periodic. This result is in accordance with Fig. 1.

Fig. 1. The state trajectories in Example 1.

Example 2. Consider a neural network of the form (22) in Example 1, but with the parameters
$$
a_{11} = b_{11} = c_{11} = \begin{cases}\sin(t), & |x_1|<1,\\ 0.6\cos(t), & |x_1|>1,\end{cases} \qquad a_{12} = b_{12} = c_{12} = \begin{cases}0.2\cos(t), & |x_2|<1,\\ 0.2\sin(t), & |x_2|>1,\end{cases}
$$
$$
a_{21} = b_{21} = c_{21} = \begin{cases}0.2\sin(t), & |x_1|<1,\\ 0.4\cos(t), & |x_1|>1,\end{cases} \qquad a_{22} = b_{22} = c_{22} = \begin{cases}\cos(t), & |x_2|<1,\\ 0.7\sin(t), & |x_2|>1,\end{cases}
$$
so that
$$
\bar A = \bar B = \bar C = \begin{pmatrix}1 & 0.2\\ 0.4 & 1\end{pmatrix}, \qquad L = H = \begin{pmatrix}1 & 0\\ 0 & 1\end{pmatrix},
$$
and with $R$ and $D$ chosen such that
$$
-R+D = \begin{pmatrix}-5 & 1\\ 1 & -8\end{pmatrix}.
$$
According to the definitions of $\mu_1$ and $\mu_\infty$, one gets $\mu_1(-R+D) = \mu_\infty(-R+D) = -4$ and $\|\bar A\|_1\|L\|_1 + \|\bar B\|_1\|L\|_1 + \|\bar C\|_1\|H\|_1 = \|\bar A\|_\infty\|L\|_\infty + \|\bar B\|_\infty\|L\|_\infty + \|\bar C\|_\infty\|H\|_\infty = 1.4\times3 = 4.2$, so that
$$
\mu_1(-R+D) + \|\bar A\|_1\|L\|_1 + \|\bar B\|_1\|L\|_1 + \|\bar C\|_1\|H\|_1 + \lambda = 0.2 + \lambda > 0,
$$
$$
\mu_\infty(-R+D) + \|\bar A\|_\infty\|L\|_\infty + \|\bar B\|_\infty\|L\|_\infty + \|\bar C\|_\infty\|H\|_\infty + \lambda = 0.2 + \lambda > 0,
$$
for any positive number $\lambda$. Consequently, the conditions of Theorem 1 with $p = 1, \infty$ cannot be satisfied, and one cannot conclude whether global exponential periodicity is achieved. If instead one chooses $\omega_1 = 2$, $\omega_2 = 1$, one obtains $\mu_\omega(-R+D) = \max\{-5+\omega_2/\omega_1,\ -8+\omega_1/\omega_2\} = -4.5$ and $\|\bar A\|_\omega\|L\|_\omega + \|\bar B\|_\omega\|L\|_\omega + \|\bar C\|_\omega\|H\|_\omega = 1.4\times3 = 4.2$; picking $\lambda = 0.1$ then gives
$$
\mu_\omega(-R+D) + \|\bar A\|_\omega\|L\|_\omega + \|\bar B\|_\omega\|L\|_\omega + \|\bar C\|_\omega\|H\|_\omega + \lambda = -0.2 < 0, \qquad \forall t\in\mathbb{R}.
$$
The conditions of Theorem 1 are thus satisfied, so global exponential periodicity is guaranteed. The results are in accordance with the simulations presented in Fig. 2.

Fig. 2. The state trajectories in Example 2.

Remark 9. Example 2 shows that $\mu_\omega(\cdot)$ can handle cases in which $\mu_1(\cdot)$ and $\mu_\infty(\cdot)$ fail. With the assistance of the free-weight parameters $\omega_i$, $i = 1,2,\ldots,n$, the conservatism of the obtained results can be reduced.

6. Conclusion

In this paper, a model of memristive coupled neural networks with supremums and time delays has been formulated. Based on a generalized Halanay inequality and the theory of differential equations with discontinuous right-hand sides, criteria in terms of matrix measures have been presented to ensure the global exponential periodicity and stability of the memristive coupled neural network. Synchronization of drive–response neural networks via an error-feedback control scheme has also been studied. Finally, numerical simulations have been given to show the effectiveness of the obtained results. Our future work may involve the analysis and comparison of the synchronization of drive–response memristive neural networks under different control schemes.

References

[1] L.O. Chua, Memristor—the missing circuit element, IEEE Trans. Circuit Theory 18 (5) (1971) 507–519.
[2] A. Thomas, Memristor-based neural networks, J. Phys. D: Appl. Phys. 46 (9) (2013) 093001.
[3] D.B. Strukov, G.S. Snider, D.R. Stewart, R.S. Williams, The missing memristor found, Nature 453 (2008) 80–83.
[4] A. Wu, Z. Zeng, Dynamic behaviors of memristor-based recurrent neural networks with time-varying delays, Neural Netw. 36 (2012) 1–10.
[5] X. Yang, J. Cao, W. Yu, Exponential synchronization of memristive Cohen–Grossberg neural networks with mixed delays, Cogn. Neurodyn. 8 (3) (2014) 239–249.
[6] J. Hu, J. Wang, Global uniform asymptotic stability of memristor-based recurrent neural networks with time delays, in: Proceedings of the 2010 International Joint Conference on Neural Networks, 2010, pp. 1–8.
[7] F. Corinto, A. Ascoli, M. Gilli, Nonlinear dynamics of memristor oscillators, IEEE Trans. Circuits Syst. I: Regul. Pap. 58 (6) (2011) 1323–1336.
[8] M.A. Cohen, S. Grossberg, Absolute stability of global pattern formation and parallel memory storage by competitive neural networks, IEEE Trans. Syst. Man Cybern. 13 (5) (1983) 815–826.

Please cite this article as: Y. Wan, J. Cao, Periodicity and synchronization of coupled memristive neural networks with supremums, Neurocomputing (2015), http://dx.doi.org/10.1016/j.neucom.2015.02.007i

[9] H.A. Rowley, S. Baluja, T. Kanade, Neural network-based face detection, IEEE Trans. Pattern Anal. Mach. Intell. 20 (1) (1998) 23–38.
[10] J. Cao, J. Wang, Global asymptotic and robust stability of recurrent neural networks with time delays, IEEE Trans. Circuits Syst. I. Regul. Pap. 52 (2) (2005) 417–426.
[11] J. Cao, K. Yuan, H. Li, Global asymptotical stability of recurrent neural networks with multiple discrete delays and distributed delays, IEEE Trans. Neural Netw. 17 (6) (2006) 1646–1651.
[12] Q. Liu, X. Liao, S. Guo, Y. Wu, Stability of bifurcating periodic solutions for a single delayed inertial neuron model under periodic excitation, Nonlinear Anal.: Real World Appl. 10 (4) (2009) 2384–2395.
[13] J. Cao, T. Chen, Globally exponentially robust stability and periodicity of delayed neural networks, Chaos Solitons Fractals 22 (4) (2004) 957–963.
[14] G. Chen, J. Zhou, Z. Liu, Global synchronization of coupled delayed neural networks and applications to chaotic CNN models, Int. J. Bifurc. Chaos Appl. Sci. Eng. 14 (7) (2004) 2229–2240.
[15] P. Li, J. Cao, Z. Wang, Robust impulsive synchronization of coupled delayed neural networks with uncertainties, Physica A: Stat. Mech. Appl. 373 (2007) 261–272.
[16] A. Wu, S. Wen, Z. Zeng, Synchronization control of a class of memristor-based recurrent neural networks, Inf. Sci. 183 (1) (2012) 106–116.
[17] W. Wang, L. Li, H. Peng, J. Xiao, Y. Yang, Synchronization control of memristor-based recurrent neural networks with perturbations, Neural Netw. 53 (2014) 8–14.
[18] I.M. Stamova, Lyapunov–Razumikhin method for impulsive differential equations with 'supremum', IMA J. Appl. Math. 76 (4) (2011) 573–581.
[19] J. Caballero, B. López, K. Sadarangani, On monotonic solutions of an integral equation of Volterra type with supremum, J. Math. Anal. Appl. 305 (1) (2005) 304–315.
[20] E.P. Popov, Automatic Regulation and Control, Nauka, Moscow, Russia, 1966.
[21] I.M. Stamova, T. Stamov, N. Simeonova, Impulsive control on global exponential stability for cellular neural networks with supremums, J. Vib. Control 19 (4) (2012) 483–490.
[22] A.F. Filippov, Differential Equations with Discontinuous Righthand Sides, Kluwer Academic Publishers, Boston, MA, 1988.
[23] W. He, J. Cao, Exponential synchronization of chaotic neural networks: a matrix measure approach, Nonlinear Dyn. 55 (1–2) (2009) 55–65.
[24] J. Cao, Y. Wan, Matrix measure strategies for stability and synchronization of inertial BAM neural network with time delays, Neural Netw. 53 (2014) 165–172.
[25] M. Chen, Synchronization in time-varying networks: a matrix measure approach, Phys. Rev. E 76 (1) (2007) 016104.
[26] Q. Liu, J. Cao, Improved global exponential stability criteria of cellular neural networks with time-varying delays, Math. Comput. Model. 43 (3–4) (2006) 423–432.
[27] M. Vidyasagar, Nonlinear System Analysis, Prentice Hall, Englewood Cliffs, NJ, 1993.
[28] J. Cao, An estimation of the domain of attraction and convergence rate for Hopfield continuous feedback neural networks, Phys. Lett. A 325 (5–6) (2004) 370–374.
[29] B. Liu, W. Lu, T. Chen, Generalized Halanay inequalities and their applications to neural networks with unbounded time-varying delays, IEEE Trans. Neural Netw. 22 (9) (2011) 1508–1513.

Ying Wan received the B.S. degree from Anhui Normal University, Wuhu, China. She is now working toward the Ph.D. degree at Southeast University, Nanjing, China. Her research interests include neural networks, multi-agent systems and control theory.

Jinde Cao received the B.S. degree from Anhui Normal University, Wuhu, China, the M.S. degree from Yunnan University, Kunming, China, and the Ph.D. degree from Sichuan University, Chengdu, China, all in mathematics/applied mathematics, in 1986, 1989, and 1998, respectively. From March 1989 to May 2000, he was with Yunnan University, where he was a Professor from 1996 to 2000. In May 2000, he joined the Department of Mathematics, Southeast University, Nanjing, China. From July 2001 to June 2002, he was a Postdoctoral Research Fellow at the Department of Automation and Computer-Aided Engineering, Chinese University of Hong Kong, Hong Kong. From 2006 to 2008, he was a Visiting Research Fellow and a Visiting Professor at the School of Information Systems, Computing and Mathematics, Brunel University, UK. In August 2014, he was a Visiting Professor at the School of Electrical and Computer Engineering, RMIT University, Australia. Currently, he is a distinguished professor and doctoral advisor at Southeast University and also a distinguished adjunct professor at King Abdulaziz University. He is the author or coauthor of more than 300 journal papers and five edited books. His research interests include nonlinear systems, neural networks, complex systems and complex networks, stability theory, and applied mathematics. Dr. Cao was an Associate Editor of the IEEE Transactions on Neural Networks, Journal of the Franklin Institute, and Neurocomputing. He is an Associate Editor of the IEEE Transactions on Cybernetics, Differential Equations and Dynamical Systems, Mathematics and Computers in Simulation, and Neural Networks. Dr. Cao is a Reviewer for Mathematical Reviews and Zentralblatt-Math. He is an ISI Highly Cited Researcher in Mathematics and Engineering listed by Thomson Reuters.