Implementation of circuit for reconfigurable memristive chaotic neural network and its application in associative memory

Tao Chen, Lidan Wang, Shukai Duan

Neurocomputing (2019). PII: S0925-2312(19)31543-7. DOI: https://doi.org/10.1016/j.neucom.2019.10.100
Communicated by Dr. Muhammet Uzuntarla
Received 18 July 2019; Revised 26 October 2019; Accepted 31 October 2019

Please cite this article as: Tao Chen, Lidan Wang, Shukai Duan, Implementation of circuit for reconfigurable memristive chaotic neural network and its application in associative memory, Neurocomputing (2019), doi: https://doi.org/10.1016/j.neucom.2019.10.100
Tao Chen^a, Lidan Wang^{a,b,*}, Shukai Duan^{b,c,d,e}

a College of Electronic and Information Engineering, Southwest University, Chongqing 400715, China
b Brain-inspired Computing & Intelligent Control of Chongqing Key Lab, Chongqing 400715, China
c College of Artificial Intelligence, Southwest University, Chongqing 400715, China
d National & Local Joint Engineering Laboratory of Intelligent Transmission and Control Technology, Chongqing 400715, China
e Chongqing Brain Science Collaborative Innovation Center, Chongqing 400715, China
Abstract

Chaotic neural networks are widely used in associative memory because of their rich chaotic behavior. Memristor bridge synaptic circuits have mostly been used in artificial neural networks because of their synapse-like, non-volatile properties, but the weight-addition circuit has a complicated structure, high power consumption and high network complexity, so associative-memory neural network circuits have rarely been implemented. In this paper, the memory characteristic of the threshold memristor is used to build the synaptic circuit. On the one hand, applying a continuous voltage to the memristor to alter its memristance realizes continuous synaptic weights from −1 to 1. The synaptic weight circuit has a simple structure and low energy consumption, and, owing to the reconfigurability of the threshold memristor, different weights can be obtained in the same circuit to achieve associative memory. On the other hand, simulation experiments show that the network can realize self-associative memory, hetero-associative memory, separation of superimposed patterns, many-to-many associative memory and an application to three-view drawings. Because of the nanoscale characteristics of the memristor, a hardware implementation of a large-scale chaotic neural network would have a simplified structure and be easy to integrate.

Keywords: Memristor; Associative memory; Reconfigurable; Chaotic neural network.

1. Introduction

Since the last century, scientists have focused on studying chaos. For instance, the American meteorologist Lorenz came up with the famous butterfly effect: a butterfly flapping its wings in the Amazon rainforest can generate a violent storm on the other side of the earth, which illustrates that the evolution of a system is sensitive to its initial conditions. Researchers then discovered chaotic behavior in the biological neurons of the human brain [1, 2, 3].
Aihara subsequently pointed out that chaotic behavior exists in single neurons of biological systems. As research progressed, chaotic neural networks were found to be broadly applicable in pattern recognition [4, 5], dynamic memory [6, 7], adaptive search, combinatorial optimization [8] and associative memory [9, 10]. In particular, the study of chaotic neural networks for associative memory has proved to be a hotspot [11, 12]. However, because the hardware implementation of chaotic neural networks has developed slowly, their practical applications still need to be explored in depth.

The memristor is the fourth passive fundamental circuit element after the resistor, capacitor and inductor; it was deduced by L. O. Chua in 1971 through the symmetry of circuit theory [13]. It was not until 2008 that the existence of the memristor was confirmed experimentally at HP Labs. The titanium-dioxide-film memristor made by HP Labs was the first memristor in the world, and it has been widely used in computer science and neural networks ever since [14]. In 2009, Chen et al. demonstrated spintronic memristors based on spin-torque-induced magnetization switching and magnetic-domain-wall motion [15, 16]. Itoh et al. implemented a chaotic oscillation circuit using a piecewise linear memristor model [17]. In 2010, Petras of the Slovak University of Technology proposed a mathematical model of a fractional-order memristor based on fractional-calculus analysis tools, and analyzed its stability and the chaos of the corresponding Chua's circuit in detail [18, 19]. In 2011, Chang et al. fabricated a WOx device with a forgetting effect, gave its mathematical model, and demonstrated the reliability of the model as a synapse [20, 21]. Pershin et al. proposed a threshold adaptive model and gave its SPICE implementation [22]. In 2013, Kvatinsky et al. proposed the threshold adaptive memristor (TEAM) model with a current threshold [23]. In 2014, TEAM was extended to the voltage threshold adaptive memristor (VTEAM) model with a voltage threshold [24].

The memristor has the following characteristics: nanometer size, nonlinearity, non-volatility, synaptic behavior, memory [25], a switching mechanism [26], and continuous input and output. Memristors therefore appear in many applications, such as chaotic circuits. The synapse is the connecting part between two neurons and the basis for transmitting information between biological neurons [27]. Snider et al. proposed a network structure based on a crossbar array, which connects presynaptic and postsynaptic neurons through memristors [28]. Dominguez-Castro et al. proposed a synaptic circuit based on CMOS transistors [29]. Kim et al. built bridge synapse circuits with four memristors and five memristors, respectively [30, 31]. Wang et al. used a combination of two memristors and two resistors to construct a bridge synapse circuit [32]. Memristors act as natural synapses [33, 34, 35]. The nanoscale size and energy efficiency of memristors make them suitable for neural networks and neuromorphic computation. In particular, combining memristors with chaotic neural networks can greatly simplify the circuit structure and improve information-processing capacity. The threshold memristor [36] is a kind of electronic synapse whose state can be varied continuously by adjusting the voltage or current across the device.

* Corresponding author. Email address: [email protected] (Lidan Wang)
Therefore, the memory characteristic of the memristor provides a new hardware-circuit design technique for building synaptic circuits with high precision and good adjustability. Based on the memristor, the chaotic neural network has the benefits of simple structure, low energy consumption and reconfigurability, which suit large-scale integrated networks. In this paper, on the basis of research on a variety of memristors, chaotic neural networks and associative memory, we propose a novel memristive synapse circuit, apply it to a chaotic neural network, and design a series of experiments to prove its associative memory ability.

The traditional chaotic neural network is introduced in Section 2. The proposed memristive chaotic neural network is presented in Section 3: first, we introduce the threshold memristor model we use; then, we propose a memristive synapse circuit that can achieve synaptic weights from −1 to 1; finally, we describe the characteristics of our memristive chaotic neural network by comparison with conventional chaotic neural networks. In Section 4, to examine the associative memory ability of the new chaotic neural network, we design five simulation experiments. The experimental results show that our chaotic neural network can realize self-associative memory, hetero-associative memory, separation of superimposed patterns, many-to-many associative memory and an application to artifacts. Finally, Section 5 gives the conclusion.

2. Chaotic neural network

The chaotic neural network [9] composed of the chaotic neuron model presented by Aihara is primarily used for associative memory. The dynamics of a chaotic neuron [37] can be written as:

z(t + 1) = b z(t) − β y(z(t)) + c    (1)

f(t + 1) = y(z(t + 1))    (2)

y(t) = 1 / (1 + exp(−t/ε))    (3)
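As a quick illustration, Eqs. (1)-(3) can be iterated directly. The sketch below is not from the paper; the parameter values b, β, c, ε and the initial state are illustrative assumptions chosen only to show the map in action:

```python
import numpy as np

def logistic(u, eps=0.02):
    """Eq. (3): logistic output function with steepness eps (argument clipped to avoid overflow)."""
    return 1.0 / (1.0 + np.exp(-np.clip(u / eps, -50, 50)))

def chaotic_neuron(steps=200, b=0.7, beta=1.0, c=0.2, eps=0.02, z0=0.1):
    """Iterate the single-neuron map of Eqs. (1)-(2):
       z(t+1) = b*z(t) - beta*y(z(t)) + c,  f(t+1) = y(z(t+1))."""
    z = z0
    outputs = []
    for _ in range(steps):
        z = b * z - beta * logistic(z, eps) + c   # Eq. (1): internal state update
        outputs.append(logistic(z, eps))          # Eq. (2): neuron output
    return np.array(outputs)

f_traj = chaotic_neuron()
print(f_traj[:5])  # output trajectory; values lie in [0, 1]
```

Because |z| stays bounded (|z| ≤ (β + c)/(1 − b) for 0 < b < 1), the logistic output remains in the unit interval for any number of steps.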
At time t, the internal state of the neuron is z(t) and the output of the neuron is f(t); y(t) is the logistic function with steepness ε; β is the refractory scaling parameter, and b is the attenuation coefficient. These chaotic neurons constitute the traditional chaotic neural network, whose structure is shown in Figure 1.
Figure 1: Chaotic neural network structure
The output of the i-th chaotic neuron in the chaotic neural network can be expressed as follows:

x_i(t + 1) = f(δ_i(t + 1) + η_i(t + 1) + ζ_i(t + 1))    (4)

δ_i(t + 1) = Σ_{j=1}^{N} V_ij A_j(t) + k_e δ_i(t) = Σ_{j=1}^{N} V_ij Σ_{d=0}^{t} k_e^d A_j(t − d)    (5)

η_i(t + 1) = Σ_{j=1}^{M} w_ij x_j(t) + k_f η_i(t) = Σ_{j=1}^{M} w_ij Σ_{d=0}^{t} k_f^d x_j(t − d)    (6)

ζ_i(t + 1) = −α x_i(t) − θ_i + k_r ζ_i(t) = −α Σ_{d=0}^{t} k_r^d x_i(t − d) − θ_i Σ_{d=0}^{t} k_r^d    (7)

At time t + 1, the output of the i-th neuron is x_i(t + 1); the internal state of the feedback input is η_i(t + 1); the refractoriness of the i-th chaotic neuron is ζ_i(t + 1); the external input term is δ_i(t + 1); the threshold of the i-th chaotic neuron is θ_i; the number of neurons in the chaotic network is M; w_ij and V_ij are the connection weights between chaotic neurons and between the external inputs and the chaotic neurons, respectively; A_j(t − d) is the external input; k_e, k_f and k_r are the attenuation constants of the external input, the feedback and the refractoriness; the parameter α is the neuron refractory scaling constant; and the activation function f(x) is a logistic function with steepness ε.

Through the Hebbian learning rule, we obtain the synaptic weights between chaotic neurons:

w_ij = (1/N) Σ_{p=1}^{N} (2x_i^p − 1)(2x_j^p − 1)    (8)

Here x_i^p is the i-th component of the p-th stored pattern and N is the total number of stored patterns. Since the neurons in the neural network have no self-feedback, w_ii = 0.

3. Threshold memristor and synapse circuit design

3.1. Threshold memristor model

The memristor is a kind of nonlinear resistor with memory capability; its resistance can be changed by controlling the current flowing through it. The threshold memristor model based on ion migration is a universal and important memristor model verified by relevant experiments [36]. The mathematical equation of this model is as follows:

dy(t)/dt =
  (u_v R_on / D^2) (i_off / (i(t) − i_0)) z(y(t)),   v(t) > V_T+ > 0
  0,                                                V_T− ≤ v(t) ≤ V_T+
  (u_v R_on / D^2) (i(t) / i_on) z(y(t)),            v(t) < V_T− < 0    (9)

where i_0, i_on and i_off are constants; y denotes the state variable of ion migration; R_on is the low-resistance state and R_off is the high-resistance state of the memristor; u_v is the ion migration rate; D represents the total thickness of the memristor; and V_T+ and V_T− are the positive and negative threshold voltages. When we apply a sinusoidal voltage to the memristor, its volt-ampere characteristic curve is as shown in Figure 2(a). When the input is a pulse voltage, the memristance increases or decreases nonlinearly under the action of the pulses, as shown in Figure 2(b). The parameter settings are as follows: u_v = 1.6 × 10^−12 m^2 s^−1 Ω^−1, D = 3 nm, i_on = 1 A, i_off = 1 × 10^−5 A, i_0 = 1 × 10^−3 A, R_on = 10 Ω, R_off = 420 Ω, R_int = 420 Ω, V_T+ = 0.37 V, V_T− = −0.19 V.

Figure 2: Threshold memristor. (a) v-i characteristic curve under a sinusoidal voltage. (b) Change of the memristance under a pulse voltage.

3.2. Memristive synapse circuit

From Figure 3 we can see that the memristive synapse circuit is composed of a proportional adder and an inverter, and can realize positive, negative and zero synaptic weights.

Figure 3: Memristive synapse circuit

In the memristive synapse circuit, if R1 = R2, then

V_out = −(R_f/M_1) v_in − (R_f/M_2)(−v_in) = −R_f (G_1 − G_2) v_in    (10)

V_out = w · v_in    (11)

w = −R_f (G_1 − G_2)    (12)

If the memristances satisfy R_on = 200 Ω and R_off = 200 kΩ, then

G ∈ (1/R_off, 1/R_on) = 5 × 10^−3 × (0.001, 1) S    (13)

If R_f = 200 Ω, then R_f G ∈ (0.001, 1) ≈ (0, 1). That is, when M_1 = R_off, G_1 ≈ 0 and w = R_f G_2 ∈ (0, 1); when M_2 = R_off, G_2 ≈ 0 and w = −R_f G_1 ∈ (−1, 0). If we first set the states of the two threshold memristors, the circuit achieves the corresponding weight. Owing to the threshold and memory characteristics of the threshold memristor, this synapse circuit can realize synaptic weights from −1 to 1.

3.3. Memristive Chaotic Neural Network

We propose a memristive chaotic neural network. Compared with the traditional chaotic neural network [33, 9], the new network has the following characteristics: (1) the external input does not change with time, and continuous input replaces one-time input; (2) V_ij, the weight coefficient between the external input and the chaotic neuron, can be represented by a constant; (3) the partial spatio-temporal sum replaces the total spatio-temporal sum; (4) the weight values can be represented by the synaptic circuit composed of memristors and operational amplifiers. Because of the various characteristics of the memristor, such as its nanoscale size and synaptic behavior, the new chaotic neural network has more potential for hardware implementation. In this paper, the new memristive chaotic neural network is a fully interconnected network similar to the Hopfield network [38, 39]. The total sum over all time and space (that is, the sum between 0 and t) is replaced by a partial sum (that is, the sum between t − t_0 and t). The dynamic equation of the i-th chaotic neuron in the α layer is:

(1) When t > t_0:

x_i^α(t + 1) = f( Σ_{β=1,β≠α}^{L} Σ_{j=1}^{N} w_ij^{αβ} Σ_{d=t−t_0}^{t} k_m^d h_j(x_j(t − d)) + Σ_{j=1}^{M} w_ij^{αα} Σ_{d=t−t_0}^{t} k_m^d h_j(x_j(t − d)) + v Σ_{d=t−t_0}^{t} k_s^d I_i(t − d) − α Σ_{d=t−t_0}^{t} k_r^d g_i(x_i(t − d)) − θ_i^α )    (14)

(2) When t ≤ t_0, replace d = t − t_0 in the above formula by d = 0.

In the above formula, at discrete time t + 1, the output of the i-th chaotic neuron in the α layer is x_i^α(t + 1); L is the number of layers of the network; N is the number of chaotic neurons in the β layer; and M is the number of chaotic neurons in the α layer. At time t, the external input of the i-th chaotic neuron in the α layer is I_i(t − d); the synaptic weight between the i-th chaotic neuron in the α layer and the j-th chaotic neuron in the β layer is w_ij^{αβ}; and the threshold of the i-th neuron in the α layer is θ_i^α. For simplicity, we take g(x) = x and h(x) = x. We use the Hebbian learning rule to learn the connection weights between neurons. As an example, part of a learned synaptic weight matrix contains entries such as 1/15, 1/5, 1/3 and their negatives, and the corresponding memristance matrices M_1 and M_2 take values from 0.6 kΩ, 1 kΩ, 3 kΩ and 199 kΩ (a weight magnitude of 1/3 corresponds to 0.6 kΩ, 1/5 to 1 kΩ, 1/15 to 3 kΩ, and a weight of approximately 0 to 199 kΩ).

For the weight w_11 = 1/15: memristor M_1 receives no input voltage, so its resistance remains unchanged at M_1 = 199 kΩ; memristor M_2 is programmed at a given time to M_2 = 3 kΩ, so the synaptic weight reaches the expected value, w_11 = R_f (1/M_2 − 1/M_1) = 200 × (1/3000 − 1/199000) ≈ 1/15.

4. Simulation results

In this part we analyze the associative memory ability of the memristor-based chaotic neural network and show its associative memory characteristics with five examples.

4.1. Self-Associative Memory

First, we define self-associative memory: there are M samples {Y^i}, i = 1, 2, …, M, stored in the associative neural network. If the input to the associative memory network is Y′ = Y^i + R, where Y^i is the i-th sample and R is a deviation term (which may represent noise, defects, distortion, etc.), then Y^i can be recalled through the associative memory network. Based on the memristive chaotic neural network, we design a self-associative memory simulation for binary images. We assume that the chaotic neural network remembers the training set (T, K). The new chaotic neural network we designed includes 49 chaotic neurons and can realize self-associative memory. When a broken "T" is continuously input into the network, the network can recall the complete "T". In Figure 4(b), from step 5 to the end, "T" is recalled by the network.
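The recall experiments above combine Hebbian weights (Eq. (8)) with chaotic neuron dynamics. The following self-contained sketch is not the authors' circuit-level implementation: it is a simplified Adachi-Aihara-style iteration in which the decay constants, the refractory and input scaling, the steepness, and the two 9-pixel stand-in patterns are all illustrative assumptions:

```python
import numpy as np

def hebbian_weights(patterns):
    """Eq. (8): w_ij = (1/N) * sum_p (2*x_i^p - 1)(2*x_j^p - 1), with w_ii = 0."""
    X = 2.0 * np.asarray(patterns, dtype=float) - 1.0   # binary {0,1} -> bipolar {-1,+1}
    W = X.T @ X / len(patterns)
    np.fill_diagonal(W, 0.0)                            # no self-feedback
    return W

def run_network(W, a, steps=60, kf=0.2, kr=0.9, alpha=10.0, eps=0.015):
    """Simplified chaotic-network iteration: feedback term eta (cf. Eq. (6)),
    refractoriness zeta (cf. Eq. (7)), and a constant external input a
    (continuous input, as in the proposed network)."""
    n = W.shape[0]
    x, eta, zeta = np.zeros(n), np.zeros(n), np.zeros(n)
    history = []
    for _ in range(steps):
        eta = kf * eta + W @ x             # decaying recurrent feedback
        zeta = kr * zeta - alpha * x + a   # decaying refractoriness plus external drive
        x = 1.0 / (1.0 + np.exp(-np.clip((eta + zeta) / eps, -50, 50)))
        history.append(x.copy())
    return np.array(history)

# Two tiny 9-pixel patterns standing in for the stored "T" and "K"
T = np.array([1, 1, 1, 0, 1, 0, 0, 1, 0])
K = np.array([1, 0, 1, 1, 1, 0, 1, 0, 1])
W = hebbian_weights([T, K])
broken_T = T.copy(); broken_T[0] = 0       # damaged version of the stored pattern
hist = run_network(W, a=2.0 * broken_T)    # output trajectory of all neurons
print(hist.shape)
```

The output trajectory wanders chaotically; in the paper's 49-neuron network the stored pattern reappears among the visited states, which is the recall behavior reported in Figure 4(b).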
Figure 4: Self-associative memory. (a) Input to the network and training set. (b) Network output.

4.2. Hetero-associative memory

We define hetero-associative memory: suppose there is a certain correspondence between two sets of patterns, X^1 → Y^1 (such as someone's photo → that person's name). If we input X′ = X^M + V to the associative memory network, where, for instance, X′ is a broken photo of someone, we can output that person's name through the associative memory network. Based on the memristive chaotic neural network, we design a hetero-associative memory simulation for binary images. The new chaotic neural network we designed includes 49 chaotic neurons and can achieve hetero-associative memory. We assume that the training set (T, K) is remembered by the chaotic neural network. When an "I" is continuously input into the network, the network performs a chaotic search around this input, so "T" is recalled at different times. As shown in Figure 5(b), from step 2 to the end, "T" is recalled by the network.

Figure 5: Hetero-associative memory. (a) The patterns remembered by the network. (b) The output of the network.

4.3. Separation of superimposed patterns

It is assumed that the training set (C, T, K) is remembered by the proposed network, which contains 49 chaotic neurons. When "C" is continuously input into the network, "C" is recalled by searching around this pattern. When the superimposed pattern "C + T" is input to the network, both "C" and "T" are searched by the network. Since the chaotic neurons change their characteristics through chaos, "C" and "T" are recalled at different times: the pattern "C" is recalled at steps 9, 12, 15, 26 and 29, and the pattern "T" at steps 10, 13, 16, 27 and 30, as shown in Figure 6(b).
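The synaptic weights used in these simulations can, per Section 3.2, be realized by pairs of threshold memristors. A brief sketch of the mapping of Eq. (12) between a weight and a memristance pair follows; R_f = 200 Ω and the near-zero resistance of 199 kΩ are taken from the worked example in the text, while the inverse-mapping helper is our own illustrative assumption, not a procedure given in the paper:

```python
# Weight realized by the memristive synapse circuit of Section 3.2.
# Assumed values: Rf = 200 ohm; a memristor parked at 199 kohm contributes a weight of ~0.

def synapse_weight(M1, M2, Rf=200.0):
    """Eq. (12): w = -Rf*(G1 - G2) = Rf*(1/M2 - 1/M1)."""
    return Rf * (1.0 / M2 - 1.0 / M1)

def memristances_for_weight(w, Rf=200.0, Roff=199e3):
    """Hypothetical inverse mapping: choose (M1, M2) so the circuit realizes weight w.
    Positive w: park M1 at the high-resistance state; negative w: park M2 there."""
    if w >= 0:
        return Roff, Rf / (w + Rf / Roff)
    return Rf / (-w + Rf / Roff), Roff

w = synapse_weight(M1=199e3, M2=3e3)
print(round(w, 3))  # approximately 1/15, as in the worked example of Section 3.3
```

Because the second term Rf/Roff is carried through the inverse mapping, round-tripping a weight through `memristances_for_weight` and back through `synapse_weight` is exact, which is the sense in which the same circuit is reconfigurable for any weight in (−1, 1).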
Figure 6: Separation of superimposed patterns. (a) The patterns remembered by the network. (b) The output of the network.

4.4. Many-to-many associative memory

Many-to-many associative memory takes the training set (A1, B1, C1), (A1, B2, C2), (A3, B3, C3), which shares a common term, as the memory patterns of the network. If A1 is used as the input to the first layer of the network, then as the network runs, (A1, B1, C1) and (A1, B2, C2), which contain the common term A1, are recalled at different times. If A1 acts as the first-layer input to the network, the following formula determines the internal state of the second layer of neurons:

I^β = A1 W^{αβ} = A1 A1^T B1 + A1 A1^T B2 + A1 A3^T B3 = A1 A1^T (B1 + B2) + A1 A3^T B3    (15)
In the above formula, W^{αβ} is the weight matrix between the first and second layers; A1 A3^T B3 is the noise term, while B1 + B2 is the superimposed pattern caused by the common term A1. Similarly, the superimposed pattern (C1 + C2) appears in the third layer. Through chaotic iterations, the network searches around B1 and B2 in the second layer and around C1 and C2 in the third layer. The patterns (A1, B1, C1) and (A1, B2, C2) can then be recalled at different times, achieving many-to-many associative memory [7]. The network is a three-layer chaotic neural network with 49 chaotic neurons in each layer. The network has remembered (C, T, K), (C, I, M) and (Z, W, S); the common patterns "C" and "Z" are remembered by the first layer; "T", "I" and "W" by the second layer; and "K", "M" and "S" by the third layer. Figure 7(a) shows the training set, and Figure 7(b) shows the output of the network when "C" is continuously input. We can see that (C, T, K) was successfully recalled 5 times and (C, I, M) was successfully recalled 5 times. The success of this experiment means that our new chaotic neural network can achieve many-to-many associative memory.
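Eq. (15) can be checked numerically. In the sketch below the layer size and the random bipolar patterns are illustrative assumptions; the Hebbian inter-layer matrix is built from outer products of the stored pairs, so the second-layer input to a shared first-layer pattern decomposes exactly into a superimposed term plus a noise term:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 49  # neurons per layer, as in the paper's simulations

# Bipolar (+1/-1) patterns; A1 is the common first-layer term of Eq. (15)
A1, A3 = rng.choice([-1.0, 1.0], size=(2, n))
B1, B2, B3 = rng.choice([-1.0, 1.0], size=(3, n))

# Hebbian inter-layer weights for the stored pairs (A1,B1), (A1,B2), (A3,B3)
W = np.outer(A1, B1) + np.outer(A1, B2) + np.outer(A3, B3)

# Eq. (15): second-layer internal state when A1 is the first-layer input
I_beta = A1 @ W
expected = (A1 @ A1) * (B1 + B2) + (A1 @ A3) * B3   # n*(B1+B2) plus the noise term
assert np.allclose(I_beta, expected)
```

Since A1 @ A1 = n while the cross-term A1 @ A3 is small for roughly orthogonal patterns, the superposition B1 + B2 dominates the second-layer state, which is what the chaotic search then separates over time.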
Figure 7: Many-to-many associative memory. (a) The patterns remembered by the network. (b) The output of the network.
4.5. Application to artifacts

In engineering drawing, three-view reading is a basic and important process. The three views of an artifact are the front view, the end view and the vertical view; that is, a spatial object is transformed into planes for reasoning. Here we design a three-view drawing simulation experiment. Suppose our three-layer chaotic neural network remembers the three views of three artifacts. Among the three artifacts, the front views of two artifacts are the same, but their end and vertical views are different. The common front view is used as the input to the network. At different moments, the front, end and vertical views of the two artifacts are recalled by the network. The chaotic neural network designed for the three views of the artifacts has three layers, each containing 81 chaotic neurons, and the network remembers the three views of the three artifacts: the front views on the first layer, the end views on the second layer, and the vertical views on the third layer. Figure 8(a) shows the three views of the three artifacts remembered by the network. We use the common front view as the continuous input of the network. According to the experience of many-to-many associative memory, we obtain the three views of the first and second artifacts at different times. From Figure 8(b), we can see that the three views of the first artifact were successfully recalled three times and the three views of the second artifact twice.
Figure 8: Application to three views. (a) The patterns remembered by the network. (b) The output of the network.

5. Conclusion

In this paper, we have studied the circuit of a reconfigurable memristive chaotic neural network, which replaces all spatio-temporal sums with partial spatio-temporal sums and replaces one-time input with continuous input. We build the circuit on the memory characteristic of the threshold memristor, whose memristance can be changed by applying a continuous voltage. The conductance value can realize continuous synaptic weights from −1 to 1; the structure of the whole circuit is simple and its power consumption low; and, owing to the reconfigurability of the threshold memristor, different weights can be realized in the same circuit. This is closer to the behavior of biological neurons, and good results have been achieved in associative memory. The simulation results show that the new chaotic neural network we put forward can realize self-associative memory and hetero-associative memory, separate superimposed patterns, perform many-to-many associative memory, and support an application to the three views of artifacts. At the same time, because of the nanometer size of the memristor, the proposed memristive synapse circuit is promising for future physical implementation and has potential for large-scale integrated-circuit chips. In future research, we will consider storing grayscale and color images in large-scale chaotic neural networks and then implementing associative memory for them. In addition, we may combine our work with style transfer in deep learning to give a variety of different styles to an image.

Acknowledgements

The work was supported by the National Key R&D Program of China (Grant No. 2018YFB1306600), the National Natural Science Foundation of China (Grant Nos. 61571372, 61672436, 61601376), the Fundamental Science and Advanced Technology Research Foundation of Chongqing (cstc2017jcyjBX0050, cstc2016jcyjA0547), and the Fundamental Research Funds for the Central Universities (Grant Nos. XDJK2016A001, XDJK2017A005).

References

[1] Shin Ishi, Kenji Fukumizu, and Sumio Watanabe. A network of chaotic elements for information processing. Neural Networks, 9(1):25-40, 1996.
[2] Yuko Osana and Masafumi Hagiwara. Separation of superimposed pattern and many-to-many associations by chaotic neural networks. In 1998 IEEE International Joint Conference on Neural Networks Proceedings, volume 1, pages 514-519. IEEE, 1998.
[3] Dmitri B. Strukov and R. Stanley Williams. Exponential ionic drift: fast switching and low volatility of thin-film memristors. Applied Physics A, 94(3):515-519, 2009.
[4] Ke Li, Luxi Yang, Ju Liu, and Zhenya He. Associative memory of gray-scale images based on chaotic neural networks. In Proceedings of IEEE Region 10 Conference (TENCON 99), volume 2, pages 1375-1378. IEEE, 1999.
[5] Yoshihiko Horio, Kazuyuki Aihara, and O. Yamamoto. Neuron-synapse IC chip-set for large-scale chaotic neural networks. IEEE Transactions on Neural Networks, 14(5):1393-1404, 2003.
[6] Masaki Kobayashi, Akihiro Nakajima, and Michimasa Kitahara. Multidirectional associative memory with two hidden layers. IEEJ Transactions on Electrical and Electronic Engineering, 8(3):299-300, 2013.
[7] Motonobu Hattori and Masafumi Hagiwara. Multimodule associative memory for many-to-many associations. Neurocomputing, 19(1-3):99-119, 1998.
[8] Guoguang He, Manish Dev Shrimali, and Kazuyuki Aihara. Threshold control of chaotic neural network. Neural Networks, 21(2-3):114-121, 2008.
[9] Kazuyuki Aihara, T. Takabe, and Masashi Toyoda. Chaotic neural networks. Physics Letters A, 144(6-7):333-340, 1990.
[10] Kaibo Shi, Jun Wang, Yuanyan Tang, and Shouming Zhong. Reliable asynchronous sampled-data filtering of T-S fuzzy uncertain delayed neural networks with stochastic switched topologies. Fuzzy Sets and Systems, 2018.
[11] Yuko Osana, Motonobu Hattori, and Masafumi Hagiwara. Chaotic bidirectional associative memory. IEEJ Transactions on Electronics, Information and Systems, 116(7):741-747, 1996.
[12] Satoshi Kosuge and Yuko Osana. Chaotic associative memory using distributed patterns for image retrieval by shape information. In 2004 IEEE International Joint Conference on Neural Networks, volume 2, pages 903-908. IEEE, 2004.
[13] Leon Chua. Memristor: the missing circuit element. IEEE Transactions on Circuit Theory, 18(5):507-519, 1971.
[14] Dmitri B. Strukov, Gregory S. Snider, Duncan R. Stewart, and R. Stanley Williams. The missing memristor found. Nature, 453(7191):80, 2008.
[15] Xiaobin Wang, Yiran Chen, Haiwen Xi, Hai Li, and Dimitar Dimitrov. Spintronic memristor through spin-torque-induced magnetization motion. IEEE Electron Device Letters, 30(3):294-297, 2009.
[16] Yiran Chen and Xiaobin Wang. Compact modeling and corner analysis of spintronic memristor. In Proceedings of the 2009 IEEE/ACM International Symposium on Nanoscale Architectures, pages 7-12. IEEE Computer Society, 2009.
[17] Makoto Itoh and Leon O. Chua. Memristor oscillators. International Journal of Bifurcation and Chaos, 18(11):3183-3206, 2008.
[18] Ivo Petras. Fractional-order memristor-based Chua's circuit. IEEE Transactions on Circuits and Systems II: Express Briefs, 57(12):975-979, 2010.
[19] Hai-Bo Bao and Jin-De Cao. Projective synchronization of fractional-order memristor-based neural networks. Neural Networks, 63:1-9, 2015.
[20] Ting Chang, Sung-Hyun Jo, Kuk-Hwan Kim, Patrick Sheridan, Siddharth Gaba, and Wei Lu. Synaptic behaviors and modeling of a metal oxide memristive device. Applied Physics A, 102(4):857-863, 2011.
[21] Ling Chen, Chuandong Li, Tingwen Huang, Yiran Chen, Shiping Wen, and Jiangtao Qi. A synapse memristor model with forgetting effect. Physics Letters A, 377(45-48):3260-3265, 2013.
[22] Yuriy V. Pershin and Massimiliano Di Ventra. SPICE model of memristive devices with threshold. arXiv preprint arXiv:1204.2600, 2012.
[23] Shahar Kvatinsky, Eby G. Friedman, Avinoam Kolodny, and Uri C. Weiser. TEAM: Threshold adaptive memristor model. IEEE Transactions on Circuits and Systems I: Regular Papers, 60(1):211-221, 2012.
[24] Shahar Kvatinsky, Misbah Ramadan, Eby G. Friedman, and Avinoam Kolodny. VTEAM: A general model for voltage-controlled memristors. IEEE Transactions on Circuits and Systems II: Express Briefs, 62(8):786-790, 2015.
[25] Shukai Duan, Yi Zhang, Xiaofang Hu, Lidan Wang, and Chuandong Li. Memristor-based chaotic neural networks for associative memory. Neural Computing and Applications, 25(6):1437-1445, 2014.
[26] Xiaofang Hu, Shukai Duan, Lidan Wang, and Xiaofeng Liao. Memristive crossbar array with applications in image processing. Science China Information Sciences, 55(2):461-472, 2012.
[27] Ed Bullmore and Olaf Sporns. The economy of brain network organization. Nature Reviews Neuroscience, 13(5):336, 2012.
[28] Greg S. Snider. Cortical computing with memristive nanodevices. SciDAC Review, 10:58-65, 2008.
[29] R. Dominguez-Castro et al. A 0.8-µm CMOS two-dimensional programmable mixed-signal focal-plane array processor with on-chip binary imaging and instructions storage. IEEE Journal of Solid-State Circuits, 32(7), 1997.
[30] Hyongsuk Kim, Maheshwar Pd. Sah, Changju Yang, Tamás Roska, and Leon O. Chua. Memristor bridge synapses. Proceedings of the IEEE, 100(6):2061-2070, 2011.
[31] Hyongsuk Kim, Maheshwar Pd. Sah, Changju Yang, Tamás Roska, and Leon O. Chua. Neural synaptic weighting with a pulse-based memristor circuit. IEEE Transactions on Circuits and Systems I: Regular Papers, 59(1):148-158, 2011.
[32] Lidan Wang, Xiaodong Wang, Shukai Duan, and Huifang Li. A spintronic memristor bridge synapse circuit and the application in memristive cellular automata. Neurocomputing, 167:346-351, 2015.
[33] Masaharu Adachi and Kazuyuki Aihara. Associative dynamics in a chaotic neural network. Neural Networks, 10(1):83-98, 1997.
[34] Kazuyuki Aihara. Chaos engineering and its application to parallel distributed processing with chaotic neural networks. Proceedings of the IEEE, 90(5):919-930, 2002.
[35] Lidan Wang, Emmanuel Drakakis, Shukai Duan, Pengfei He, and Xiaofeng Liao. Memristor model and its application for chaos generation. International Journal of Bifurcation and Chaos, 22(08):1250205, 2012.
[36] Yang Zhang, Xiaoping Wang, Yi Li, and Eby G. Friedman. Memristive model for synaptic circuits. IEEE Transactions on Circuits and Systems II: Express Briefs, 64(7):767-771, 2016.
[37] Masaki Kobayashi. Chaotic pseudo-orthogonalized Hopfield associative memory. Neurocomputing, 241:147-151, 2017.
[38] Victor M. Eguiluz, Dante R. Chialvo, Guillermo A. Cecchi, Marwan Baliki, and A. Vania Apkarian. Scale-free brain functional networks. Physical Review Letters, 94(1):018102, 2005.
[39] S. G. Hu, Y. Liu, Z. Liu, T. P. Chen, J. J. Wang, Q. Yu, L. J. Deng, Y. Yin, and Sumio Hosaka. Associative memory realized by a reconfigurable memristive Hopfield neural network. Nature Communications, 6:7522, 2015.
Biography of the authors

I. Biography of Tao Chen:
Tao Chen received the B.E. degree in electronic information science and technology from the College of Electronic and Information Engineering, Yangtze Normal University, Chongqing, China, in 2017. He is currently pursuing the M.E. degree in signal and information processing with the College of Electronic and Information Engineering, Southwest University, Chongqing, China.
His current research interests include artificial neural networks, memristive systems, and brain-inspired computing.
II. Biography of Lidan Wang:
Lidan Wang received the B.E. degree in automatic control from Nanjing University of Science and Technology, China, in 1999 and the Ph.D. degree in Computer Software and Theory from Chongqing University, China, in 2008. She was a visiting scholar at the Biological Engineering Institute, Imperial College London, from 2010 to 2011, and she is an IEEE member. From October 2016 to January 2017, she was an invited visiting professor at Nanyang Technological University in Singapore. Currently, she is a professor in the School of Electronics and Information Engineering, Southwest University, Chongqing, China. Her research interests include nonlinear systems and chaotic circuits, artificial neural networks and FPGA technology, and memristors and memristive systems. She has published more than 20 academic papers in the fields of chaos in electronic circuits and chaotic communications.
III. Biography of Shukai Duan: Shukai Duan received the Ph.D. degree in computer science from Chongqing University, China, in 2006. Since 1996, he has been with the College of Electronic and Information Engineering, Southwest University, Chongqing, China, where he has served as a Professor since 2010. He was a Visiting Professor with the University of Michigan in 2010, the University of Windsor in 2013, and Texas A&M University at Qatar in 2014. He has published four books and more than 100 papers in refereed journals and conferences. His research interests include memristor devices and memristive systems, nonlinear circuits and systems, artificial neural networks, chaos and chaotic circuits, and intelligent signal processing.
Dr. Duan has taken charge of more than ten national, provincial or ministry level research projects, including National Natural Science Foundation of China (NSFC) and the Program for New Century Excellent Talents from Ministry of Education. He serves as an Associate Editor of the Neurocomputing and the IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS.
Declaration of interests The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.