Neurocomputing 38–40 (2001) 351–358
Spike-frequency adaptation as a mechanism for dynamic coding in V1

Lars Schwabe*, Péter Adorján, Klaus Obermayer

Department of Computer Science and Electrical Engineering, FR2-1, Technische Universität Berlin, Franklinstr. 28/29, D-10587 Berlin, Germany
Abstract

We investigate the representation of visual stimuli and the short-term dynamics of activity within primary visual cortex in a 'free-viewing' scenario with 'saccading eye movements' modeled as a series of visual stimuli that are flashed onto the retina for the duration of a fixation period (200–300 ms). We assume that the entire activity pattern from the beginning of fixation until time t constitutes the neural code. Given a noisy (Poissonian) representation it follows that the signal-to-noise ratio increases with time, because more spikes become available for representation. Here, we show that for achieving an optimal stimulus representation in any increasing time-window beginning with stimulus onset, the processing strategy of the network should be dynamic in the sense that an initially high recurrent cortical competition between orientation selective cells attenuates with time, mediated by the intrinsic property of spike-frequency adaptation of pyramidal cells. © 2001 Elsevier Science B.V. All rights reserved.

Keywords: Primary visual cortex; Neural coding; Flashed stimuli; Neuromodulation
* Corresponding author.
E-mail addresses: [email protected] (L. Schwabe), [email protected] (P. Adorján), [email protected] (K. Obermayer).

0925-2312/01/$ - see front matter © 2001 Elsevier Science B.V. All rights reserved.
PII: S0925-2312(01)00436-2

1. Introduction

Visual cortical neurons recorded in vivo exhibit variable responses to the repeated presentation of the same static stimulus [1]. This observation indicates that neurons are, at least to some degree, noisy encoding units. Under the assumption of a continuous readout with a memory longer than the average inter-spike interval, the level of neuronal noise is non-stationary: as the time-window beginning with stimulus onset increases, more spikes become available for coding. Therefore, the signal-to-noise ratio (SNR) increases. We suggest that maximizing the mutual information between static stimuli and their cortical representations in any increasing time-window beginning with the stimulus onset is evolutionarily advantageous, since higher processing areas can base their computations at any time on the maximal amount of information encodable in that time window. This would allow both for reaction times faster than the average fixation period and for detailed investigations of a visual scene. If we assume that the neural code is indeed optimized w.r.t. information transfer [2], then the dynamic SNR implies a dynamic change of the coding strategy, a fact that has been neglected in previous studies. Here we propose that modulating the strength of recurrent excitation in primary visual cortex is a plausible way to achieve this goal, and we show that spike-frequency adaptation of cortical pyramidal neurons is a promising candidate for the underlying mechanism.

This paper is organized as follows: (i) Within an abstract framework (Section 2) we show that decreasing an initially high value of recurrent competition between cortical 'feature detectors' (e.g. orientation selective cells) indeed increases the information transfer for any time-window beginning with the stimulus onset, compared with the case where recurrent cortical competition does not adapt. In a free-viewing scenario the stimulus onset corresponds to the beginning of the fixation period. (ii) We then present simulation results (Section 4) of a detailed computational model (Section 3) of a cortical hypercolumn in order to demonstrate that the strategy of modulating recurrent competition can be implemented in V1. Finally, we discuss the biological relevance of our findings (Section 5). The predicted decrease of competition can be viewed as an adaptation of the neural system to its own internal states by modifying the coding strategy. We call this new concept 'dynamic coding'.
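The claim that the SNR grows with the length of the read-out window can be checked directly for Poisson spike counts: the mean count grows like λT while its standard deviation grows like √(λT), so the SNR is √(λT). A minimal sketch (the firing rate and window lengths are arbitrary illustrative values, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

rate = 20.0                            # sp/s, assumed constant mean rate
windows = np.array([0.05, 0.2, 0.8])   # read-out windows in seconds

# 20000 independent trials per window length
counts = rng.poisson(rate * windows, size=(20000, 3))
snrs = counts.mean(axis=0) / counts.std(axis=0)
print(snrs)   # approximately sqrt(rate * window), i.e. 1, 2, 4
```

Quadrupling the read-out window thus doubles the SNR, which is why longer fixation times make finer stimulus distinctions decodable.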
2. The principle of dynamic coding

2.1. Stimulus prior and competitive transfer function

First, we model the competitive mapping of a recurrent cortical microcircuit (see Fig. 1A) by a simple competitive transfer function. Within an orientation hypercolumn a competitive mapping arises due to the strong recurrent selective excitation, leading to the amplification of only the most salient inputs, whereas weaker inputs are suppressed due to the unspecific inhibition. The inputs are modeled as vectors x, whose components x_i correspond to the intensities of features present in a stimulus, e.g. the intensities of edges in a natural environment. The time dependent output of our abstract model network is given by y(x, t) = g(x, t) + η(x, t). The input–output transformation of the recurrent microcircuit is given by the softmax function

g_i(x, t) = λ(t) exp(β(t) x_i) / Σ_j exp(β(t) x_j),

where we model the neurons as Poisson spiking units with mean values given by g_i(x, t). For β → 0 the mapping becomes less competitive, and for β → ∞ it is a winner-takes-all mapping. The mean spike count λ(t) in the whole network is proportional to the length t of the time window for encoding the stimulus. Note that with increasing number of spikes the SNR also increases. In the next section we demonstrate that the optimal recurrent competition depends strongly on this dynamic output noise.

Fig. 1. (A) Illustration of the network architecture. (B, left) The optimal competition parameter β for different mean spike counts λ and input distributions, where α determines the sparsity. (B, right) Information transmission for α = 2 for dynamic competition and for static large and static small competition.

We model the distribution of elementary features, i.e. edges, in natural scenes by a factorizing input distribution p(x) = (1/Z) Π_i exp(-x_i^α / σ) for x_i ≥ 0, where α determines the sparsity of the pdf, Z is a normalizing constant, and σ determines the variance. If α = 2 the input density is the positive half of a multivariate Gaussian distribution. For α > 2 the distribution becomes sub-Gaussian, and for α < 2 it becomes super-Gaussian.

2.2. Estimation of optimal competition

Let H[·] denote the entropy of a stochastic variable. For a given mean spike count dynamics λ(t) we aim to maximize the functional I[Y_{β,t}, X] = H[Y_{β,t}] - H[Y_{β,t} | X] w.r.t. the dynamics β(t) of the competition parameter. For all t' with 0 ≤ t' ≤ t we drop the time dependence of β(t') and calculate the optimal β via stochastic gradient ascent on the estimated mutual information Î[Y_β, X] = Ĥ[Y_β] - H[Y_β | X]. The conditional entropy H[Y_β | X] can be calculated analytically for the Poisson noise [5].

We observe (see Fig. 1B) that for small mean spike counts the network obtains maximal information transfer if the competition is high. In contrast, for large mean spike counts a small competition maximizes the mutual information. In other words, given only a few spikes to estimate the stimulus, a competitive mapping is efficient; if many spikes are available, weak competition is superior. Interpreting our results for a fixation period, we conclude that after saccades weakening the initially high recurrent competition is optimal in terms of maximizing the mutual information between static stimuli and their representation.
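The abstract model is easy to simulate. The following sketch (a minimal numpy implementation for illustration; it does not reproduce the gradient ascent on the estimated mutual information) draws stimuli from the α = 2 case of the prior, passes them through the softmax transfer function with competition parameter β and mean spike count λ, and samples Poisson counts; it shows how a large β concentrates the spikes on the winning feature channel:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax_rates(x, beta, lam):
    # mean spike counts g_i(x) = lam * exp(beta * x_i) / sum_j exp(beta * x_j)
    e = np.exp(beta * (x - x.max()))   # subtract the max for numerical stability
    return lam * e / e.sum()

# Stimuli from the alpha = 2 case of the prior: positive half of a Gaussian.
n_features, n_trials = 10, 2000
X = np.abs(rng.standard_normal((n_trials, n_features)))

def spike_fraction_at_winner(beta, lam):
    # average fraction of spikes emitted by the channel with the largest input
    frac = []
    for x in X:
        counts = rng.poisson(softmax_rates(x, beta, lam))
        if counts.sum() > 0:
            frac.append(counts[np.argmax(x)] / counts.sum())
    return float(np.mean(frac))

high = spike_fraction_at_winner(beta=8.0, lam=20.0)   # strong competition
low = spike_fraction_at_winner(beta=0.5, lam=20.0)    # weak competition
print(high, low)
```

With strong competition almost all of the limited spike budget signals the most salient feature, which is the efficient regime when only a few spikes are available; with weak competition the spikes are spread over all channels, preserving the relative intensities.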
3. Computational model

We now consider a biophysically plausible realization of the above scenario by simulating a network of conductance-based model neurons. In this section we describe the model, and in the next section we demonstrate that spike-frequency adaptation of pyramidal neurons could be the mechanism underlying a cortical dynamic competition strategy.

3.1. Network architecture

We simulate a network of N = 1800 conductance-based model neurons, with N_E = 0.8N excitatory pyramidal cells (PY) and N_I = 0.2N inhibitory interneurons (IN). Each neuron i is labeled by its preferred orientation θ_i, and depending on the type α ∈ {PY, IN} of the neuron it is connected to a fixed number of presynaptic neurons of type β. Let M^{αβ} denote the number of neurons of type β projecting to a postsynaptic neuron of type α. We used M^{PY,IN} = 105, M^{PY,PY} = 60, M^{IN,PY} = 84 and M^{IN,IN} = 48. The presynaptic neurons are drawn randomly according to a uniform distribution. Only the pyramid–pyramid projections are structured, by drawing the presynaptic neurons from a Gaussian distribution w.r.t. the difference between the preferred orientations; for the standard deviation we use σ = 7.5°. The neurons are driven by a strong orientation-unspecific noisy input, and the balance between excitation and inhibition keeps the spontaneous activity low (≈3 sp/s for PY and ≈10 sp/s for IN).

3.2. Single cell and synapse model

We describe the dynamics of the membrane potential of excitatory pyramidal and inhibitory interneurons using a simplified Hodgkin–Huxley-like formalism, which was proposed recently [6] as a model for mammalian neocortical neurons [3]. The model describes the fast sodium and potassium currents responsible for spike generation, a voltage dependent calcium current, and a slow calcium dependent potassium current for the spike-frequency adaptation. The membrane potential dynamics of a model neuron i with preferred orientation -90° ≤ θ_i < 90° are described by

C (d/dt) V_i = -m_∞(V_i)(V_i - E_Na) - g_K R_i (V_i - E_K) - g_Ca S_Ca,i (V_i - E_Ca) - g_AHP S_AHP,i (V_i - E_K) + I_i

(E_Na, E_K and E_Ca denote the reversal potentials of the respective currents) with

m_∞(V) = 17.8 + 47.6 V + 33.8 V²,
τ_R (d/dt) R = -R + R_∞(V),   R_∞(V) = 1.24 + 3.7 V + 3.2 V²,
(d/dt) S_Ca = (14 ms)⁻¹ (-S_Ca + S_Ca,∞(V)),   S_Ca,∞(V) = 8 (V + 0.725)²,
(d/dt) S_AHP = (45 ms)⁻¹ (-S_AHP + 3 S_Ca),
where the external current I_i = I_i^s + I_i^n + I_i^lat is composed of the stimulus-induced current I_i^s, the noise current I_i^n, and the lateral synaptic input I_i^lat = Σ_β I_i^{αβ}. The noise current I_i^n approximates orientation-unspecific input, e.g. from V1 layer 6 neurons. These currents are given by

I_i^s = I_0 Σ_{j∈S} c_j f(θ_i, θ_j),   I_i^{αβ} = -Σ_{j∈P_i^β} g^{αβ} s_j (V_i - E^β),   τ_n (d/dt) I_i^n = -I_i^n + ξ.

The c_j denote the intensities (local contrasts) of the stimulus components with orientations θ_j present in the stimulus S, and

f(θ_i, θ_j) = exp(-d(θ_i, θ_j)² / σ_f²)   with   d(θ_i, θ_j) = min(|θ_i - θ_j|, 180° - |θ_i - θ_j|)

is a monotonically decreasing function of the difference between the preferred orientation θ_i and the jth stimulus orientation θ_j; I_0 = 1 nA is a scaling factor. For the noise we use τ_n = 0.5 ms, and ξ is drawn randomly in each timestep Δt = 0.5 ms from a zero-mean Gaussian distribution with unit variance. P_i^β denotes the set of neurons of type β which project to neuron i. PY projections are mediated by AMPA and IN projections by GABA_A receptors. The synaptic gating variables s_j are modeled as an instantaneous jump of magnitude 1 when a presynaptic spike occurs, followed by an exponential decay with time constants 2 ms and 10 ms for AMPA and GABA_A, respectively.

4. Dynamic competition with spike-frequency adaptation

We now present our simulation results. Depending on the calcium dependent potassium conductance, the computational model either predicts a pure winner-takes-all mapping (g_AHP = 0.5, corresponding to a high β in the abstract model) or an only initially highly competitive mapping followed by a reduction of recurrent excitation (g_AHP = 5). For small g_AHP the PY neurons do not show spike-frequency adaptation, allowing continuously very high firing rates.

Fig. 2 shows the results of numerical simulations for small and large g_AHP. The responses are visualized as time averages over the last 50 ms, corresponding to a running estimate as assumed for a continuous 'readout' mechanism with memory. The stimulus (Fig. 2A) is composed of a strong (0.8) and a weak (0.5) intensity orientation and was presented for 50 ms ≤ t ≤ 250 ms. We observe that for small g_AHP only the strong orientation is signalled (Fig. 2B), whereas with large g_AHP first the salient and then the weaker orientation is signalled (Fig. 2C). The responses of the model network to the weak intensity orientation within the complex stimulus are compared with the responses when a stimulus with only the weak intensity orientation is presented, without being masked by the strong intensity stimulus component (Fig. 2D).
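The effect of g_AHP can be reproduced in a single-cell sketch of the model of Section 3.2 (forward-Euler integration; V in units of 100 mV, time in ms). The reversal potentials, g_K, g_Ca, C and τ_R are not stated in this excerpt and are taken from Wilson's model [6] as assumptions; the injected current and the spike-detection threshold are likewise illustrative:

```python
def simulate(g_ahp, I=1.0, t_max=300.0, dt=0.005):
    """Forward-Euler integration of the Wilson-type model cell.

    Assumed values (not given in the text): C = 1, g_K = 26, g_Ca = 0.1,
    E_Na = 0.5, E_K = -0.95, E_Ca = 1.2 (units of 100 mV), tau_R = 4.2 ms.
    Returns the number of upward crossings of an illustrative spike
    threshold (V = -0.2, i.e. -20 mV)."""
    C, g_K, g_Ca = 1.0, 26.0, 0.1
    E_Na, E_K, E_Ca = 0.5, -0.95, 1.2
    tau_R, tau_Ca, tau_AHP = 4.2, 14.0, 45.0
    V, R, S_Ca, S_AHP = -0.75, 0.25, 0.0, 0.0   # start near rest
    spikes, above = 0, False
    for _ in range(int(t_max / dt)):
        m_inf = 17.8 + 47.6 * V + 33.8 * V * V
        R_inf = 1.24 + 3.7 * V + 3.2 * V * V
        S_inf = 8.0 * (V + 0.725) ** 2
        dV = (-m_inf * (V - E_Na) - g_K * R * (V - E_K)
              - g_Ca * S_Ca * (V - E_Ca)
              - g_ahp * S_AHP * (V - E_K) + I) / C
        R += dt * (R_inf - R) / tau_R
        S_Ca += dt * (S_inf - S_Ca) / tau_Ca
        S_AHP += dt * (3.0 * S_Ca - S_AHP) / tau_AHP
        V += dt * dV
        if V > -0.2 and not above:   # count upward threshold crossings
            spikes += 1
        above = V > -0.2
    return spikes

print(simulate(g_ahp=0.5), simulate(g_ahp=5.0))
```

With the larger g_AHP the slow S_AHP conductance accumulates and the spike count over the same interval drops; this single-cell adaptation is the substrate of the network-level reduction in recurrent excitation described below.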
Fig. 2. Numerical simulation results. (A) Input to the network. The simulated responses are shown for the network without (B) and with (C) spike-frequency adaptation. (D) Delayed response to the weak intensity stimulus component within the complex stimulus (solid line) compared with the response when the weak intensity stimulus is presented alone (dotted line). (E) Difference between the two curves in (D).

The induced nonlinearity, i.e. the difference between the two stimulus conditions for the weak intensity orientation, is shown in Fig. 2E.

The simulation results show an initially high competition between orientation columns followed by a less competitive processing. In our model the underlying mechanism of this decreased competition is the decreasing recurrent excitation (due to spike-frequency adaptation) of the orientation unspecific inhibition, leading to a decrease in cross-orientation inhibition (or competition between feature detectors). Thus, the functional role of spike-frequency adaptation is to adapt the strength of recurrent excitation, leading to adaptation of competition at the network level, a strategy we have shown to be efficient for dynamic coding. Since spike-frequency adaptation is a prominent feature of cortical PY neurons, it is likely that cortical circuits implement this processing strategy.
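The network-level effect can be caricatured in a toy firing-rate model, a deliberately simplified stand-in for the conductance-based network with all parameters chosen for illustration: two orientation channels drive a shared unspecific inhibition, and the excitatory channels slowly adapt. Initially the strong channel suppresses the weak one; as it adapts, the shared inhibition weakens and the weak channel's response emerges with a delay, as in Fig. 2C/D:

```python
import numpy as np

def simulate(t_max=600.0, dt=0.5):
    """Two orientation channels with recurrent self-excitation, shared
    unspecific inhibition and slow subtractive adaptation. All parameters
    are illustrative assumptions, not taken from the cortical model."""
    I = np.array([0.8, 0.5])            # strong and weak stimulus component
    w_exc, w_inh, g_adapt = 0.5, 0.9, 0.5
    tau_r, tau_a = 10.0, 200.0          # ms; adaptation is the slow process
    r = np.zeros(2)                     # firing rates of the two channels
    a = np.zeros(2)                     # adaptation variables (track rates)
    ts, rs = [], []
    for step in range(int(t_max / dt)):
        drive = I + w_exc * r - w_inh * r.sum() - g_adapt * a
        r = r + dt * (np.maximum(drive, 0.0) - r) / tau_r
        a = a + dt * (r - a) / tau_a
        ts.append(step * dt)
        rs.append(r.copy())
    return np.array(ts), np.array(rs)

ts, rs = simulate()
i_early, i_late = np.searchsorted(ts, [100.0, 500.0])
print(rs[i_early], rs[i_late])  # weak channel recovers as the strong one adapts
```

The design choice mirrors the mechanism in the text: adaptation acts only on the excitatory channels, and the weak channel is released because the *shared* inhibition it receives is driven by the adapting strong channel.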
5. Discussion

We have modeled a single hypercolumn in order to demonstrate a possible biophysical implementation of dynamic competition. Although we consider a free-viewing scenario, we can make a clear prediction for a simple electrophysiological experiment:
When presenting an asymmetric compound stimulus composed of multiple orientations, e.g. a distorted plaid or a cross, the response to the weak orientation is delayed w.r.t. the presentation of the weak stimulus alone. Although the concrete responses may differ between stimulus paradigms (plaid vs. cross), the principle of first the strong, then the weaker feature should be observable in experiments.

The biological relevance of this work is that we (i) assign a functional role to spike-frequency adaptation (a local single-cell mechanism) in the context of efficient information transfer at the network level, (ii) make explicit the temporal aspect of sensory coding and provide a processing strategy for fast and reliable encoding with noisy neurons, and (iii) present a model providing a compromise between 'linear' feedforward processing of geniculocortical inputs and non-linear amplification (see [4] for a review).

References

[1] G.R. Holt, W.R. Softky, C. Koch, R.J. Douglas, A comparison of discharge variability in vitro and in vivo in cat visual cortex neurons, J. Neurophysiol. 75 (1996) 1806–1814.
[2] R. Linsker, Self-organization in a perceptual network, Computer 21 (1988) 105–117.
[3] N.M. Lorenzon, R.C. Foehring, Relationship between repetitive firing and afterhyperpolarizations in human neocortical neurons, J. Neurophysiol. 67 (2) (1992) 350–363.
[4] H. Sompolinsky, R. Shapley, New perspectives on the mechanisms for orientation selectivity, Current Opinion Neurobiol. 7 (1997) 514–522.
[5] R.B. Stein, The information capacity of nerve cells using a frequency code, Biophys. J. 7 (1967) 797–826.
[6] H.R. Wilson, Simplified dynamics of human and mammalian neocortical neurons, J. Theor. Biol. 200 (1999) 375–388.
Lars Schwabe was born in Berlin, Germany, in 1974. He received his Diploma degree in Computer Science from the Technical University Berlin. Since 1999 he has been a member of the GK 'Signal Chains in Living Systems', sponsored by the German Science Foundation, and he is working towards a Ph.D. in the Neural Information Processing Group at the Technical University Berlin.
Péter Adorján is a data mining scientist at the start-up company Epigenomics. He received his M.Sc. in Computer Science from the Eötvös University, Budapest, Hungary. He just completed his graduate studies at the Neural Information Processing Group at the Technical University of Berlin and is expecting to receive his Ph.D. He has been studying dynamics and representation in the primary visual cortex.
Klaus Obermayer is a Professor for Neural Information Processing at the Department of Computer Science at the Technical University Berlin. He received his M.Sc. in physics in 1987 from the University of Stuttgart and his Ph.D. in 1992 from the Technical University of Munich, Germany. His research interests cover the areas of computational neuroscience, the theory of artificial neural networks, and applications in image and signal processing.