Bulletin of Mathematical Biology Vol. 51, No. 3, pp. 347-358, 1989. Printed in Great Britain.
0092-8240/89 $3.00 + 0.00 Pergamon Press plc © 1989 Society for Mathematical Biology

AN INFORMATIONAL APPROACH TO REACTION TIMES

K. H. NORWICH, C. N. L. SEBURN and E. AXELRAD
Institute of Biomedical Engineering, and Department of Physiology, University of Toronto, Toronto, Ontario, Canada

Simple reaction time is the minimum time required to respond to a signal such as a steady light or tone. Such a reaction time is taken to be the time required for transmission of a fixed quantity of information, ΔH, from stimulus to subject. That is, information summation replaces energy summation. This information is calculated from consideration of the quantum nature of the stimulus. The theoretically derived equation for reaction time is fitted to experimental data. Piéron's empirical law for reaction time is obtained as an approximation from a proposed informational equation. The exponent in Piéron's law is found to be the same as the exponent in the power law of sensation. Threshold appears to be the smallest stimulus capable of transmitting the quantity of information ΔH.
Introduction. Reaction time is defined as the time between the onset of a stimulus and the beginning of an overt response (Coren et al., 1984). Simple reaction time involves a subject's pressing a key or button immediately upon detection of a stimulus such as a flash of light or a tone. It has been known for more than a century (Cattell, 1886) that simple reaction time grows shorter with increasing stimulus intensity; for example, a subject requires less time to react to a bright light than to a dim light. However, the reason why this is so has not been apparent. The most common way of accounting for this relationship between simple reaction time and stimulus intensity is by invoking a principle of energy summation or integration; that is, a subject may react only when the integral of stimulus power with time (energy) reaches a critical level. Therefore, a stimulus of lower intensity must be applied for a longer time than a stimulus of higher intensity before a subject can react. Speaking very approximately, the product of stimulus intensity with reaction time is constant. The study of reaction time is not the only one where the concept of energy integration has been invoked. For brief flashes of light near threshold, the product of intensity with duration of flash is constant for detection of the flash. That is, a subject can detect a bright flash applied for a brief time period, or a dim flash applied for a longer time period. Mathematically:

I t ≈ constant,
(1)
where I is the stimulus intensity and t is time. This law is known as Bloch's law or the Bloch-Charpentier law. The law does not hold precisely, but is an approximation (Williams and Allen, 1971). Because of the similarity between the two phenomena, threshold detection and reaction times, it might be expected that Bloch's law would also hold for reaction times; however, this is not quite the case. As Piéron demonstrated (1952), an empirical rule that better describes the relationship between reaction time and intensity is the power law:
t_r - t_rmin = C I^(-n),
(2)
where t_r and t_rmin are reaction time and minimum observable reaction time respectively, and C and n are constants greater than zero. It has been shown by a number of investigators (e.g. Ueno, 1976; Doma and Hallett, 1988) that if t_rmin is small compared to t_r, and if the constant, n, is equal to unity, then Piéron's law becomes:
I t_r = C,
(3)
which is identical to Bloch's law. Therefore, the suggestion is, perhaps both phenomena are examples of energy integration: Bloch's law for near-threshold intensities, Piéron's law for intensities above threshold. However, there are theoretical problems with the energy integration interpretation, particularly above threshold (Boynton, 1961), and there are numerical problems as well: t_rmin is not always negligible in comparison with t_r, and the exponent n is usually of the order of 0.3 rather than 1.0 for both audition and vision. Therefore, perhaps it is not energy which is being integrated or summated but some other quantity. We shall argue here that the quantity summated, both near and above threshold, is not energy but information. In the case of simple reaction time, the subject cannot react until he/she has received a critical quantity of information, which we shall represent by ΔH [bits or natural units]. The advantages of this approach are several. It is not necessary for us to construct ad hoc an informational theory for application to reaction times. We can utilize an existing informational theory of neural coding that has accounted for many other neurophysiological and psychophysical phenomena, and adapt this theory to the study of reaction times. From this theory we can readily derive an equation that gives the observed reciprocal relationship between stimulus intensity and reaction time. The derived equation yields Piéron's law as an approximation. The exponent n emerges as the same exponent that appears in the psychophysical power law, or the law of sensation ("Stevens' exponent"). That is, n takes on a value near 0.3 rather than near unity. However, before
proceeding with the derivation, we shall review the principles of the informational theory.
The Entropic or Informational Theory of Perception. A model of perception based on information theoretical principles has been proposed (Norwich, 1977; 1981b). It has been suggested that the operations of observation or of sensation can be viewed as a process of sampling of the sensory environment. For simplicity, we considered only stimuli which are of the intensity type, such as the intensity of light or the concentration of a solution. The sensory receptors which detect these stimuli may, therefore, be regarded as sampling a gas of photons or molecules at discrete instants in time. The density of such a gas is, of course, a fluctuating quantity. The function of these receptors is, then, to determine as accurately as possible the mean density of particles (photons, molecules) after, say, m samplings of the stimulus have been made. As progressively more samplings are made, the receptor's uncertainty about the mean particle density decreases. This process of diminishing uncertainty can be represented as a decreasing stimulus informational entropy. The physical assumptions involved in the development of the theory can be summarized briefly. These assumptions are introduced here to review with the reader the physical constraints of the theory; the full mathematical derivations are given in the preceding papers. The process of sensation is assumed to take place within a background of noise. These noise or "reference" signals are taken to be identically and independently distributed Gaussian random variables with variance σ_r². When an external stimulus is present, the discrete samples are identically and independently distributed random variables with unspecified probability density, but with variance σ_s². The means of these samples tend, by the central limit theorem, to a normal distribution with variance σ_s²/m, where m is the number of samplings made.
The differential entropy of a probability density function, p(x), is given by -∫ p(x) ln p(x) dx; hence the differential entropy of a normally distributed variable with variance σ² is (1/2) ln(2πeσ²). The mean of the external signal or stimulus combines with the reference signal (convolution) to produce a normally distributed random variable with variance σ_s²/m + σ_r². The
absolute entropy is the difference between differential entropies. The absolute entropy, H, of a sensory signal is equal to the difference between the differential entropies before and after the signal is applied:

H = (1/2) ln[2πe(σ_s²/m + σ_r²)] - (1/2) ln[2πeσ_r²].
(4)
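Written out, the difference in equation (4) collapses, since the 2πe factors cancel:

```latex
H = \tfrac{1}{2}\ln\bigl[2\pi e\,(\sigma_s^2/m + \sigma_r^2)\bigr]
  - \tfrac{1}{2}\ln\bigl[2\pi e\,\sigma_r^2\bigr]
  = \tfrac{1}{2}\ln\!\Bigl(1 + \frac{\sigma_s^2}{m\,\sigma_r^2}\Bigr).
```

With m taken proportional to t and σ_s² proportional to I^n, the proportionality constants combine into a single parameter, β.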
Take the number of samplings, m, proportional to time, t, and the signal variance, σ_s², proportional to I^n, where I is the mean signal intensity and n a constant > 0. We then obtain:

H = (1/2) ln(1 + βI^n/t),    β = const > 0.
(5)
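Equation (5) is easy to explore numerically. In the sketch below the parameter values β = 1 and n = 0.3 are arbitrary, chosen only for illustration; the two assertions express the qualitative behaviour described in the text: uncertainty falls with sampling time, and a more intense stimulus transmits more information over the same interval.

```python
import math

def entropy(I, t, beta=1.0, n=0.3):
    """Equation (5): H(I, t) = (1/2) ln(1 + beta * I**n / t), in natural units."""
    return 0.5 * math.log(1.0 + beta * I**n / t)

# Uncertainty about the mean particle density shrinks with sampling time:
assert entropy(10.0, t=1.0) < entropy(10.0, t=0.1)

# Over the same interval, a more intense stimulus yields a larger drop in
# entropy, i.e. transmits more information:
gain = lambda I: entropy(I, t=0.1) - entropy(I, t=1.0)
assert gain(100.0) > gain(1.0)
```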
H(I, t) is, then, an entropy function which describes the drop in uncertainty as a sensory receptor "observes" a stimulus of mean intensity, I, for a duration of time, t. H is measured here in natural units; H/ln 2 will give uncertainty in bits. It was shown (Norwich, 1984; 1987) that from H, as defined in equation (5), we can derive rather a large number of the empirical equations of sensory physiology and psychophysics: Fechner's logarithmic law, Stevens' power law, the principle of sensory adaptation, the Weber fraction, and many others. Some of these derivations are illustrated in the tree-structure of Fig. 1. All of the equations derived for a single modality (such as audition at a given frequency) utilize the same 3 or 4 parameter values. One of the features of the tree is the ubiquity of the "Stevens' exponent", n; it occurs not only in the "power law of sensation" [equation (12), below] but throughout most of the equations of sensation and perception. We shall now show that with one additional assumption, the entropy equation can be used to derive the required equation for reaction time.

An Informational Principle of Reaction Time. Hick (1952) showed that for choice reaction time, where a subject has to select one from among m choices (for example, by pressing one lit key from among 10 possible keys), the mean choice reaction time was proportional to log(m + 1), or approximately to the information required to press the correct key. This gave rise to Hick's law (Welford, 1980). It has also been shown (e.g. Hyman, 1953; Hellyer, 1963) that the time required for a subject to react to a complex task (e.g. by pressing a button at the completion of the task) is a linear function of the number of bits of information involved in the task. We propose to extend this principle to govern reaction to a single, steady stimulus. For such a stimulus, the drop in uncertainty (gain in information) with time is given by equation (5).
Suppose that ΔH natural units of information are required before a subject can react to a stimulus. Then:

ΔH = H(t_0) - H(t_r).
(6)
Figure 1. Showing the scope of the information equation, H = (1/2) ln(1 + βI^n/t), in deriving the equations of sensory physiology and psychophysics (the rule of adaptation, Fechner's law, Stevens' law, the Weber fraction, the Teghtsoonian-Poulton law, the reaction-time equation with Piéron's equation as an approximation, and the Blondel-Rey law for vision and Hughes' law for audition). The full derivations are given in the preceding publications.

Here the stimulus is given at t = 0, t_0 is the value of t for which H is maximum, and t_r is the time required to transmit ΔH natural units of information. Then:
ΔH = (1/2) ln(1 + βI^n/t_0) - (1/2) ln(1 + βI^n/t_r).
(7)
Solving for reaction time:

t_r = [e^(-2ΔH)/t_0 - (1 - e^(-2ΔH))/(βI^n)]^(-1).
(8)
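Equation (8) can be evaluated directly. In the sketch below the parameter values (t_0 = 0.01 s, ΔH = 0.5 natural units, β = 1, n = 0.3) are purely illustrative, not fitted to any data; the assertion checks the qualitative prediction that reaction time falls as intensity rises.

```python
import math

def reaction_time(I, t0=0.01, dH=0.5, beta=1.0, n=0.3):
    """Equation (8): t_r = [exp(-2 dH)/t0 - (1 - exp(-2 dH))/(beta * I**n)]**(-1).
    Finite only for I above I_min, where the bracketed quantity is positive."""
    bracket = math.exp(-2.0 * dH) / t0 - (1.0 - math.exp(-2.0 * dH)) / (beta * I**n)
    if bracket <= 0.0:
        raise ValueError("I at or below I_min: information never reaches dH")
    return 1.0 / bracket

# Reaction time decreases monotonically with stimulus intensity:
assert reaction_time(100.0) < reaction_time(10.0) < reaction_time(2.0)
```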
It is, however, not clear whether we can associate t_r directly with reaction time to a simple stimulus. It is possible that the time, t_r, required to transmit information ΔH should be augmented by a constant time, t_1, which is the time required to produce the motor operation that constitutes the reaction to the stimulus. That is, in complete generality we should possibly write:

t_R = t_r + t_1,
(9)
or, total reaction time equals time for transmission of ΔH units of information plus a lag time for motor response. This will be discussed in more detail below. It can be seen from equations (8) and (9) that as stimulus intensity, I, becomes greater, t_r and t_R diminish, as discussed in the introduction. There are two interesting limits obtainable from equation (8). The maximum value of t_r occurs for the smallest value of the quantity in brackets. That is:
I = I_min = [t_0(e^(2ΔH) - 1)/β]^(1/n).
(10)
The minimum value of t_r occurs for the largest value of I. That is:

t_rmin = lim_(I→∞) t_r = t_0 e^(2ΔH).

(11)
These two limits will be found useful below.
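Both limits can be confirmed numerically from equation (8); the parameter values below are again illustrative only.

```python
import math

t0, dH, beta, n = 0.01, 0.5, 1.0, 0.3  # illustrative parameters

def t_r(I):
    """Equation (8) for the parameters above."""
    return 1.0 / (math.exp(-2.0*dH)/t0 - (1.0 - math.exp(-2.0*dH))/(beta * I**n))

t_r_min = t0 * math.exp(2.0 * dH)                          # equation (11)
I_min = (t0 * (math.exp(2.0 * dH) - 1.0) / beta)**(1.0/n)  # equation (10)

# t_r tends to t0 * exp(2 dH) for very intense stimuli:
assert abs(t_r(1e9) - t_r_min) / t_r_min < 0.01
# and grows without bound as I falls toward I_min:
assert t_r(I_min * 1.001) > 100.0 * t_r_min
```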
Piéron's Law. Piéron's empirical equation relating reaction time to stimulus intensity for visual stimuli is:

t_r - t_rmin = C I^(-n).
(2)
Equation (2) was observed to conform to experimental reaction time data for both visual and auditory stimuli. It was found by curve fitting equation (2) to reaction time data that n ≈ 0.3 for both vision and audition (Vaughan et al., 1966; Mansfield, 1973; Marks, 1974). However, the Stevens exponent is also 0.3 for both vision and audition. This is the exponent that relates subjective (ψ) to physical (I) magnitude of a stimulus according to the psychophysical relation:
ψ = k I^n.
(12)
The reason why the Stevens exponent, n, apparently makes its appearance in both of these empirical equations has not been understood. We shall see the explanation for this as we derive Piéron's law from the entropy equation. Let us write equation (8) in the form:
t_r = [1/(t_0 e^(2ΔH)) - (t_0(e^(2ΔH) - 1)/β) · 1/(t_0 e^(2ΔH) I^n)]^(-1).

(13)
Introducing equations (10) and (11):

t_r = [1/t_rmin - (1/t_rmin)(I_min/I)^n]^(-1),

(14)
or simply:

t_r = t_rmin/[1 - (I_min/I)^n].

(15)
Equation (15) is, then, mathematically equivalent to equation (8). Although equation (8) contains 4 parameters, there are only 3 parameters that can be estimated by curve fitting to reaction time data. These correspond to t_rmin, I_min and n. Let us now introduce into equation (15) the approximation:
(I_min/I)^n ≪ 1.
(16)
If we expand the denominator in equation (15) in a binomial expansion, retaining only the first-order term, as a consequence of inequality (16):

[1 - (I_min/I)^n]^(-1) ≈ 1 + (I_min/I)^n.
(17)
Hence equation (15) becomes t_r = t_rmin[1 + (I_min/I)^n], or:

t_r - t_rmin = t_rmin · I_min^n · I^(-n),
(18)
which is Piéron's law derived. Equations (2) and (18) are formally identical, and we can identify the constant C with the product t_rmin · I_min^n. We observe that equation (18) is expected to be valid only for values of I which are considerably greater than I_min. It transpires that the exponent, n, is precisely the Stevens exponent that appears in the law of sensation, equation (12), as it is derived theoretically from the informational theory (Fig. 1 and Norwich, 1977; 1981b). With slight adjustments in the values of the parameters, both the information equation (8) or (15) and Piéron's equation (2) fit the data well.
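The quality of the binomial approximation is easy to check numerically. The sketch below uses the parameter values reported for Chocholle's 1000 Hz data in Table I below (t_rmin = 0.117 s, I_min = 0.449, n = 0.439): the exact equation (15) and the Piéron approximation (18) agree closely for I ≫ I_min and diverge near I_min.

```python
TR_MIN, I_MIN, N = 0.117, 0.449, 0.439  # fitted values from Table I

def exact_tr(I):
    """Equation (15): t_r = tr_min / (1 - (I_min/I)**n)."""
    return TR_MIN / (1.0 - (I_MIN / I)**N)

def pieron_tr(I):
    """Equation (18), Pieron's approximation: t_r = tr_min * (1 + (I_min/I)**n)."""
    return TR_MIN * (1.0 + (I_MIN / I)**N)

err = lambda I: abs(exact_tr(I) - pieron_tr(I)) / exact_tr(I)
assert err(1.0e4) < 1e-3   # far above I_min: approximation is excellent
assert err(1.0) > 0.1      # near I_min: approximation breaks down
```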
Experimental Evaluation of the Reaction Time Equation. Parameters of the reaction time equation were estimated for audition from the experiments of Chocholle (1940), and for vision from the experiments of Doma and Hallett (1988). Audition. The subjects in Chocholle's experiments were required to press a button as soon as possible after a tone was sounded. The experiments were carried out for various subjects over a complete range of auditory frequencies.
For each frequency, intensity was varied over the physiological range and reaction time was tabulated. t_r was found to decrease with increasing stimulus intensity. Chocholle's data, (I, t_r), were fitted by means of a simplex optimization routine using the least squares criterion, to equation (8), to obtain estimates for the three parameters t_0 e^(2ΔH), t_0(e^(2ΔH) - 1)/β, and n. The experimental data were well described by equation (8). This is equivalent to fitting the data to equation (15), and obtaining values for the parameters t_rmin, I_min and n. We do not have enough information to evaluate β and ΔH uniquely. Experimental data, theoretical values, and parameter values for 1000 Hz tones are given in Table I.

TABLE I
Equations (8) and (15) Fitted to the Data of Chocholle, Subject I, 1000 Hz Tone

Sound pressure    Response time (s) (measured)    Response time (s) (calculated)
1.00 x 10^5       0.110                           0.117
3.20 x 10^4       0.110                           0.118
1.00 x 10^4       0.112                           0.118
3.20 x 10^3       0.118                           0.119
1.00 x 10^3       0.124                           0.121
320               0.129                           0.124
100               0.139                           0.129
31.6              0.148                           0.138
10.0              0.161                           0.157
3.16              0.192                           0.203
2.51              0.218                           0.220
2.00              0.248                           0.243
1.58              0.276                           0.275
1.26              0.312                           0.320
1.00              0.398                           0.394

Stevens exponent (sound pressure) = 0.439.
t_rmin = t_0 e^(2ΔH) = 0.117 sec.
I_min = [t_0(e^(2ΔH) - 1)/β]^(1/n) = (0.703)^(1/0.439) = 0.449.
t_r = 1/(8.55 - 6.01 I^(-0.439)).
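The "calculated" column of Table I can be reproduced from the fitted equation; the check below uses the sound pressures and fitted parameters exactly as printed in the table.

```python
# Sound pressures and calculated reaction times transcribed from Table I:
pressures = [1.00e5, 3.20e4, 1.00e4, 3.20e3, 1.00e3, 320.0, 100.0,
             31.6, 10.0, 3.16, 2.51, 2.00, 1.58, 1.26, 1.00]
calculated = [0.117, 0.118, 0.118, 0.119, 0.121, 0.124, 0.129,
              0.138, 0.157, 0.203, 0.220, 0.243, 0.275, 0.320, 0.394]

def fitted_tr(I):
    """Equation (8) with the parameter estimates of Table I."""
    return 1.0 / (8.55 - 6.01 * I**(-0.439))

# Each table entry is recovered to within rounding:
for I, t_calc in zip(pressures, calculated):
    assert abs(fitted_tr(I) - t_calc) < 0.005
```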
Vision. The subjects in the experiments of Doma and Hallett were required to track a target visually. There was a latency between the time the target moved and the time the eye moved. Subjects were tested for a set of wavelengths over a range of light intensities. Data were again fitted to equation (8) and the results are given in Table II.
TABLE II
Equations (8) and (15) Fitted to the Data of Doma and Hallett, Yellow-Green Light 564 nm

Intensity    Response time (s) (measured)    Response time (s) (calculated)
100          0.163                           0.165
31.6         0.173                           0.173
10.0         0.185                           0.184
3.16         0.208                           0.203
1.00         0.239                           0.238
0.794        0.244                           0.248
0.631        0.262                           0.260
0.501        0.277                           0.274
0.398        0.291                           0.291
0.316        0.305                           0.312
0.251        0.335                           0.337
0.200        0.364                           0.369
0.158        0.414                           0.411
0.126        0.475                           0.467
0.100        0.543                           0.547

Stevens exponent = 0.288.
t_rmin = t_0 e^(2ΔH) = 0.149 sec.
I_min = [t_0(e^(2ΔH) - 1)/β]^(1/n) = (0.375)^(1/0.288) = 0.0332.
t_r = 1/(6.71 - 2.52 I^(-0.288)).
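As with the auditory data, the "calculated" column of Table II and the quoted I_min follow from the printed parameter estimates; the check below uses the intensities and fitted parameters exactly as they appear in the table.

```python
# Intensities and calculated reaction times transcribed from Table II:
intensities = [100.0, 31.6, 10.0, 3.16, 1.00, 0.794, 0.631, 0.501,
               0.398, 0.316, 0.251, 0.200, 0.158, 0.126, 0.100]
calculated = [0.165, 0.173, 0.184, 0.203, 0.238, 0.248, 0.260, 0.274,
              0.291, 0.312, 0.337, 0.369, 0.411, 0.467, 0.547]

def fitted_tr(I):
    """Equation (8) with the parameter estimates of Table II."""
    return 1.0 / (6.71 - 2.52 * I**(-0.288))

# Each table entry is recovered to within rounding:
for I, t_calc in zip(intensities, calculated):
    assert abs(fitted_tr(I) - t_calc) < 0.005

# The quoted I_min follows from the quoted intermediate value:
assert abs(0.375**(1.0/0.288) - 0.0332) < 0.0005
```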
Discussion. We have proposed an informational theory of simple reaction times to a stimulus in the form of a step function. It proceeds from a general entropic or informational theory of sensation and perception, with the additional assumption that a minimum quantity of information, ΔH, must be transmitted to the sensory receptor before reaction can take place. The transmitted information is not semantic in nature, but rather issues from a reduction in uncertainty about the microscopic or quantum structure of the stimulus signal. Where many receptors exist for a given modality, such as rods and cones in the eye, it is assumed that in the perception of stimulus intensity they act in parallel and without mutual interaction. Although calculation of the reaction time, t_r, issues from a probabilistic treatment of the process of sensation, it nonetheless provides a single estimate for t_r, rather than a probability distribution for reaction times as given, for example, by Luce (1986). We do recognize, however, that as the theory is extended, it must come to allow for the intrinsic randomness in measured reaction times. This theory of reaction time utilizes what might be called information integration or summation, in contrast
with energy summation. The accumulation of a critical quantity of information, rather than energy, enables the subject to react. The matter of the lag time, t_1, is uncertain. It might seem intuitive that one should associate the time t_r of receipt of ΔH units of information with the time of first conscious awareness of the stimulus, and the time t_1 with the cortex-to-muscle conduction time. However, the work of Libet (1985) casts this process in doubt. "Readiness potentials", which are brain-generated potentials that precede a voluntary motor act, were found to occur about 550 ms before the act, while the time of conscious intention to act was measured about 200 ms before the act. Therefore, the simple addition of t_r + t_1 to give total reaction time is probably not correct. It is interesting that when we left t_1 as an additional parameter to be estimated from the data, we obtained inconsistent results among subjects. Our most consistent results were obtained when we set t_1 equal to zero and associated t_r, the information receipt time, with reaction time. This is not totally satisfactory, and is the subject of current research. We can now understand why reaction time is smaller for stimuli of higher intensity. Information is transmitted more rapidly (uncertainty vanishes more quickly) from a more intense signal. This may be seen directly by taking the derivative ∂H/∂t from equation (5). Since we react when a "quantum" of information, ΔH, is received, we can, therefore, react more quickly to a more intense stimulus. The theoretical equations for reaction time, equation (8) or (15), were fitted to experimental data for response to auditory stimuli (Chocholle, 1940), and for response to visual stimuli (Doma and Hallett, 1988). The correspondence between theory and experiment was quite close.
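The derivative invoked in the preceding paragraph can be written out explicitly from equation (5):

```latex
\frac{\partial H}{\partial t}
  = \frac{\partial}{\partial t}\,\tfrac{1}{2}\ln\!\Bigl(1 + \frac{\beta I^{n}}{t}\Bigr)
  = -\,\frac{\beta I^{n}}{2\,t\,(t + \beta I^{n})}.
```

The rate at which uncertainty vanishes, |∂H/∂t|, therefore increases monotonically with I at any fixed t, which is the sense in which information is transmitted more rapidly from a more intense signal.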
The theory shows clearly why the Stevens exponent appears in reaction time equations: the same exponent, n, appears in the theoretical equation for the power law of psychophysics and in the theoretical equation for reaction time. The values obtained for this exponent from the process of curve fitting to reaction time data were quite compatible with those values obtained by subjective magnitude estimations. We have not dealt with the effect of set, or ready-signal, on reaction time (e.g. Kohfeld, 1969). We are led quite naturally to an informational interpretation of the concept of threshold. One cannot react until a basic "quantum" of information is received. Therefore, in the early stages of perception of a stimulus, although information pours rapidly into the sensory receptor, the perceiver is unaware of the existence of the stimulus. When the quantum of information has been received, "threshold" is reached, and the perceiver becomes aware and is capable of reacting to the stimulus. The exact ordering of events (awareness and reaction) is not known. It should be emphasized that this theory was not formulated ad hoc to account for reaction times. Rather it followed largely from a general theory of
sensation and perception (Norwich, 1981a; 1983) based on informational concepts, and with the added supposition of constant ΔH. This general theory is capable of accounting for many of the empirical rules and equations of sensory physiology and psychophysics (Fig. 1; see Norwich, 1977; 1981b; 1984; 1987 for derivations). The applicability of this theory to reaction times and threshold phenomena is further support for an entropic or "uncertainty" principle of perception. While this manuscript was being reviewed, a paper was presented by Ward and Davidson (1988) showing additional experimental evidence that the value of the exponent, n, measured psychophysically using curve fits to equation (12) was, indeed, very close to the value of n measured from reaction times to tones using equations (8) or (15). The investigators used a wide range of auditory frequencies. This work has been supported by an operating grant from the Natural Sciences and Engineering Research Council of Canada. We are grateful to I. Nizami and K. Valter for their help and criticisms.
LITERATURE

Boynton, R. M. 1961. "Some Temporal Factors in Vision." Sensory Communication, W. A. Rosenblith (Ed.), pp. 739-756. New York: M.I.T. Press/Wiley.
Cattell, J. M. 1886. "The Influence of the Intensity of the Stimulus on the Length of the Reaction Time." Brain 8, 512-515.
Chocholle, R. 1940. "Variations des Temps de Réaction Auditifs en Fonction de l'Intensité à Diverses Fréquences." Année Psychol. 41, 65-124.
Coren, S., C. Porac and L. Ward. 1984. Sensation and Perception (2nd Edn). Orlando, FL: Academic Press.
Doma, H. and P. E. Hallett. 1988. "Rod-Cone Dependence of Saccadic Eye-Movement Latency in a Foveating Task." Vision Res. 28, 899-913.
Hellyer, S. 1963. "Stimulus-Response Coding and Amount of Information as Determinants of Reaction Time." J. Exp. Psychol. 65, 521-522.
Hick, W. E. 1952. "On the Rate of Gain of Information." Q. J. Exp. Psychol. 4, 11-26.
Hyman, R. 1953. "Stimulus Information as a Determinant of Reaction Time." J. Exp. Psychol. 45, 188-196.
Kohfeld, D. L. 1969. "Effects of Ready-Signal Intensity and Intensity of Preceding Response Signal on Simple Reaction Time." Am. J. Psychol. 82, 104-110.
Libet, B. 1985. "Unconscious Cerebral Initiative and the Role of Conscious Will in Voluntary Action." Behav. Brain Sci. 8, 529-566.
Luce, R. D. 1986. Response Times and their Role in Inferring Elementary Mental Organization. New York: Oxford University Press.
Mansfield, R. J. W. 1973. "Latency Functions in Human Vision." Vision Res. 13, 2219-2234.
Marks, L. 1974. "On Scales of Sensation." Percept. Psychophys. 16, 358-376.
Norwich, K. H. 1977. "On the Information Received by Sensory Receptors." Bull. math. Biol. 39, 453-461.
——. 1981a. "Uncertainty in Physiology and Physics." Bull. math. Biol. 43, 141-149.
——. 1981b. "The Magical Number Seven: Making a 'Bit' of 'Sense'." Percept. Psychophys. 29, 409-422.
——. 1983. "To Perceive is to Doubt: The Relativity of Perception." J. theor. Biol. 102, 175-190.
——. 1984. "The Psychophysics of Taste from the Entropy of the Stimulus." Percept. Psychophys. 35, 269-278.
——. 1987. "On the Theory of Weber Fractions." Percept. Psychophys. 42, 286-298.
Piéron, H. 1952. The Sensations. New Haven: Yale University Press.
Ueno, T. 1976. "Luminance-Duration Relation in Reaction Time to Spectral Stimuli." Vision Res. 16, 721-725.
Vaughan, H. G., Jr., L. D. Costa and L. Gilden. 1966. "The Functional Relation of Visual Evoked Response and Reaction Time to Stimulus Intensity." Vision Res. 6, 645-656.
Ward, L. M. and K. Davidson. 1988. "'n'tropy is Everywhere." 29th Annual Meeting of the Psychonomic Society, Abstr. No. 225.
Welford, A. T. 1980. "Choice Reaction Time: Basic Concepts." Reaction Times, A. T. Welford (Ed.). London: Academic Press.
Williams, D. H. and T. M. Allen. 1971. "Absolute Thresholds as a Function of Pulse Length and Null Period." The Perception and Application of Flashing Lights, pp. 43-54. University of Toronto Press.
Received 7 March 1988
Revised 16 November 1988