INDIVIDUAL DIFFERENCES IN SPEED OF MENTAL PROCESSING AND HUMAN COGNITIVE ABILITIES: TOWARD A TAXONOMIC MODEL

RICHARD D. ROBERTS AND LAZAR STANKOV
UNIVERSITY OF SYDNEY

ABSTRACT: Extensive research within the field of learning and individual differences focuses upon the relationship between general intelligence and process measures derived from elementary cognitive tasks (ECTs). This emphasis has ignored data indicating that cognitive abilities are best described by three levels (or strata). It has also been suggested that mental speed is a unitary construct, though it is more likely to have a complex structure. To address shortcomings evident in this literature, a multivariate investigation (N = 179) was conducted. Factor analysis of 25 psychometric indices gave seven factors postulated under the theory of fluid (Gf) and crystallized (Gc) intelligence. Correlations between cognitive abilities and parameters related to processing speed derived from 11 ECTs indicated that Gf (alone) was related to processing speed. This relationship is seemingly dependent upon experimental manipulations of task complexity. Regarding the factorial structure of mental speed, the results were unequivocal: Broad second-order factors may be derived from both ECTs and psychometric tests. These constructs are independent from abilities defined by accuracy scores and collectively define a general cognitive speed factor. It would appear that mental speed is more intricate than proposed, and that cognitive complexity (reflected in stimulus-response compatibility effects) plays a crucial role in models linking intelligence to its ontogenesis. In addition, several explanatory frameworks of processing speed are untenable. It is likely that the search for a basic process of intelligence by means of mental speed (alone) is misguided. Implications of these findings are discussed.

Direct all correspondence to: Richard D. Roberts, Department of Psychology, University of Sydney, Sydney, NSW 2006, Australia. E-mail:

Learning and Individual Differences, Volume 11, Number 1, 1999, pages 1-120. All rights of reproduction in any form reserved.
Copyright © 1999 by Elsevier Science Inc. ISSN: 1041-6080


Recently, within the field of individual differences, there has been “an explosion of experimental studies into the speed of mental processes” (H.J. Eysenck 1995, p. 225). Various tasks, ranging from those paradigms assessing simple, psychomotor movements and on up through to measures of complex problem solving and psychometric test performance, have been employed (Stankov & Roberts 1997). The present study was designed to explore speed of processing constructs within a structural model of human cognitive abilities. Utilizing the evidence presented in Carroll’s (1993) extensive reanalysis of the main data sets collected within the psychometric discipline this century, the structural model of cognitive abilities adopted is that known as the theory of fluid and crystallized intelligence (see Horn & Noll 1994; Stankov et al. 1995). In contrast, the mental speed measures selected for investigation in this study were chosen on the basis of both experimental and psychometric findings that rely on disparate accounts (e.g., information theory). Notably, mental speed constructs are not presently encapsulated within a single unifying model. Another major aim of the present study was to redress this imbalance by establishing a rapprochement between conceptual models of mental speed and human cognitive abilities.

INTRODUCTION

ORIGINS: GALTON, SPEARMAN, AND THE CONCEPT OF MENTAL SPEED

Historically, the psychological investigation of performance speed has played a pivotal role in providing an understanding of human cognition (Rabbitt 1996). For instance, Galton (1883, 1908) hypothesized that differences in Reaction Time (RT) would serve as the most appropriate measure of “intelligence” (see H.J. Eysenck 1987a). Earlier this century, Spearman (1904, 1927) proposed that “mental speed” should be divided into “cognitive speed” (the speed at which a person performs specific cognitive processes) and “personal tempo” (the hypothesized speed at which a person tends to perform various daily activities). However, Spearman (1927) did not acknowledge speed of performance to be a separate component of intelligence, instead treating it in much the same manner as accuracy scores (i.e., as a dependent measure reflecting an aspect of cognitive performance) (see Berger 1982). Although interest in performance speed is thus an integral part of scientific attempts to understand the nature and structure of human abilities, some commentators have sought to differentiate between Galton’s and Spearman’s more generic approach to the study of individual differences (e.g., Carlson & Widaman 1987). In the former, measures such as RT to sensory stimuli are used so that each observation is made on a highly differentiated metric scale. In the latter, the structure of human abilities is derived from consideration of patterns of relationship between more complex cognitive tasks that are scored according to a set protocol. In general, the data gathered from these two approaches have been examined us-
ing correlational (e.g., Pearson 1901; Wissler 1901) and factor analytic techniques respectively (e.g., Spearman 1927). Although distinctions between these two approaches have often been acknowledged, only recently has this dichotomy led to an apparent impasse in learning and individual differences research. On the one hand, some investigators focus on a few quite simple psychological tasks and correlate parameters with a small number of psychometric measures. On the other hand, interest is directed toward larger numbers of more complex measures to determine how these fit within the factor structure of human cognitive abilities. In principle, distinctions between the “Galtonian” and “Spearmanian” approaches are largely arbitrary. There is nothing inherently wrong with an attempt to assign speed of performance measures as correlates of, and structures within, models of human cognitive ability. In addition, the Galtonian approach primarily centers upon the speed with which individuals perform tasks, whereas the Spearmanian approach focuses upon number correct (i.e., accuracy) measures (see Carlson & Widaman 1987; H.J. Eysenck 1987a). Arguably, given that both frameworks can involve each aspect of performance, a better understanding of cognitive abilities may be achieved by considering an individual’s attainment within each response domain. This is especially true of measures obtained within the Spearmanian approach. However, in simple tasks that assess latency (and are thus representative of the Galtonian framework), individual differences in accuracy scores become difficult to detect owing to the fact these measures are generally subject to ceiling effects. REACTION TIME AND INTELLIGENCE In 1952, W.E. Hick, on the basis of his own data and that obtained by Merkel (1885), adopted information theory to account for results obtained on several chronometric tasks. Using the principles of uncertainty, entropy, and information transmission that are embodied within this theoretical framework (Shannon & Weaver 1949), Hick reported a linear relationship between choice RT and task difficulty measured in bits of information.2 These results were variously replicated by several researchers using different experimental methodologies (e.g., Bricker 1955a, 1955b; Crossman 1953; Hyman 1953; Welford 1968). This finding has come to be known as Hick’s law. With the demonstrated robustness of Hick’s law and the emergence of cognitive models of human intelligence, Roth (1964) examined the relationship between RT parameters and psychometric test performance. His finding of significant negative correlations between choice RT measures and cognitive test performance precipitated a revival of interest in the early notion that mental speed and general ability are empirically related (e.g., J. McKeen Cattell 1890; Galton 1883). Subsequently, many researchers have employed the Hick paradigm (or some other, closely related, cognitive model) to link speed of information-processing parameters to general intelligence (e.g., Agrawal & Kumar 1993; Bates & H.J. Eysenck 1993; Bowling & Mackenzie 1996; Carlson & C.M. Jensen 1982; Carlson et al. 1983; Cohn et al. 1985; H.J. Eysenck 1987b; A.R. Jensen 1980, 1982a, 1982b, 1984a, 1984b, 1987a, 1987b; A.R. Jensen et al. 1988; A.R. Jensen & Munro 1979;

Kane et al. 1997; Kranzler et al. 1992; Larson & Saccuzzo 1989; Matthews & Dorn 1989; Nettelbeck & Kirby 1983; Nettelbeck & Lally 1976; Neubauer 1990a, 1990b, 1991; Neubauer et al. 1992; Ruchalla et al. 1985; Schweizer 1993a, 1993b; G.A. Smith & Stanley 1980, 1983; Vernon 1981, 1983, 1987; Vernon & A.R. Jensen 1984). A.R. Jensen (1982a), a major proponent of this approach, has interpreted this research to indicate that individuals differ in the skill called for by tests of general intelligence “because they differ in the rates with which they process (and hence acquire) the information offered by the environment” (p. 98). Models implicating both cognitive and biological mechanisms have been offered to account more fully for this empirical relationship (e.g., A.R. Jensen 1987a, 1992a; E.M. Miller 1994; Salthouse 1995; Vernon 1990; see also Stankov & Roberts 1997 for a critique of this research program). Contemporary research within the Hick paradigm features tasks that allow for the independent assessment of Decision Time (DT, the time required to determine and initiate an appropriate response to a stimulus [or stimuli]) and Movement Time (MT, the time associated with sensory and motor control of movement) (e.g., A.R. Jensen 1979, 1987a, 1993a, 1993b; Roberts 1999a). The element of choice is introduced by varying the number of response alternatives, usually from 1 to 8, which are re-scaled into binary digits (i.e., bits). Because a large number of trials are required to obtain reliable assessment of central tendency, measures of intraindividual variability in both DT and MT (at each bit level) are also obtained (e.g., A.R. Jensen 1992a; Larson & Alderton 1990). Further, because data under all conditions are collected from each participant, measures of the intercept and gradient of DT and MT for each individual may be calculated as well. Thus, from one task a number of measures, reflecting essentially different psychological processes, can be derived (Roberts 1999a). A further feature of this approach is an attempt to identify the psychological processes that are most central to the proposed correlation between speed of information processing and intelligence measures. For example, slope of DT is taken to reflect an individual’s rate of information processing. Various parameters have been found consistently to share moderate correlation with psychometric indices (see H.J. Eysenck 1987a; A.R. Jensen 1987a; Vernon 1987, for various reviews and extensive meta-analyses). It is assumed that even low correlations with psychometric measures are interesting from a theoretical standpoint because “the simple tasks in which RT is measured have so exceedingly little resemblance to conventional psychometric g-loaded tests” (A.R. Jensen 1992b, p. 279). However, in a series of important critiques, Longstreth (1984, 1986) argues that much of the research conducted within the Hick paradigm introduces a variety of experimental confounds. For instance, a common methodological problem is that the tasks employed to measure RT often involve “unequal practice at all set sizes, and cumulative effects from smaller set sizes to larger set sizes” (Longstreth 1984, p. 144). Longstreth also asserts that both visual attention effects (produced by the spatial layout of visual RT tasks) and response bias effects (introduced by the multi-choice nature of these paradigms) may account for variations in subjective response.
Such effects pose serious problems to the interpretation of the results obtained from RT studies within a meaningful theoretical framework (e.g., Carlson & Widaman 1987). Longstreth’s criticisms have provoked a number of research papers that specifically address these empirical issues (e.g., A.R. Jensen & Vernon 1986; Kranzler et al. 1988; G.A. Smith 1989; Widaman & Carlson 1989). However, the results of each experiment tend not to be without their own problems of substantive interpretation (A.R. Jensen 1987a; A.R. Jensen & Vernon 1986; Larson & Saccuzzo 1986; Welford 1986). As a consequence of the inconclusive nature of studies reporting both “negative” and “positive” results, research examining the speed with which individuals process information continues largely unabated (see H.J. Eysenck 1995).
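As general background to the Hick paradigm just described (a schematic illustration rather than a reproduction of any particular study's procedure), Hick's law posits DT = a + b·H, with task difficulty H = log2(n) bits for n equally likely alternatives (log2(n + 1) in Hick's original formulation). The sketch below shows, under that assumption and with purely hypothetical trial data, how the parameters commonly extracted from such a task (median DT per set size, intraindividual variability, and the intercept and slope of DT over bits) might be computed; medians and ordinary least squares are used here simply because they are the estimators most often reported in this literature.

    # A minimal sketch with hypothetical data: deriving the usual Hick-paradigm indices.
    import math
    from statistics import median, stdev

    # Hypothetical decision times (msec) for one participant, keyed by set size n.
    dt_trials = {
        1: [285, 300, 292, 310, 288],
        2: [330, 345, 338, 352, 341],
        4: [395, 410, 388, 402, 399],
        8: [455, 470, 462, 449, 458],
    }

    bits = [math.log2(n) for n in dt_trials]              # task difficulty: 0, 1, 2, 3 bits
    dt_median = [median(v) for v in dt_trials.values()]   # central tendency per condition
    dt_sd = [stdev(v) for v in dt_trials.values()]        # intraindividual variability

    # Least-squares fit of median DT on bits: the slope is interpreted as the rate of
    # information processing (msec per bit); the intercept as DT for a simple reaction.
    k = len(bits)
    mx, my = sum(bits) / k, sum(dt_median) / k
    slope = (sum((x - mx) * (y - my) for x, y in zip(bits, dt_median))
             / sum((x - mx) ** 2 for x in bits))
    intercept = my - slope * mx

    print(f"DT slope = {slope:.1f} msec/bit; intercept = {intercept:.1f} msec")
    print("Intraindividual SD per condition:", [round(s, 1) for s in dt_sd])

Movement Time would be treated analogously, and in practice many more trials per condition are collected before such indices are entered into correlational or factor analyses.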

UNDERSTANDING HUMAN ABILITIES: Gf/Gc THEORY AND THE COGNITIVE CORRELATES APPROACH

In light of the research activity engendered by the speed of processing approach over the past two decades, it is somewhat surprising that those working within this framework seldom give detailed attention to the structure of human cognitive abilities. One critic of this position suggests that: Classical or traditional views of human intelligence are rarely questioned in these studies, and most investigators limit themselves to a unitary conception of psychometric intelligence. This unitary conception has yet to be accepted in all circles. (Juhel 1991, p. 75) Juhel (1991) goes on to note the existence of hierarchical models that cast doubt on the importance of a single factor of intelligence (see also Detterman 1982). Prominent among these models is the theory of fluid and crystallized intelligence originally advanced by R.B. Cattell and very much elaborated by Horn, Hakstian, and their associates (e.g., Boyle et al. 1995; R.B. Cattell 1963, 1971, 1982, 1987; R.B. Cattell & Horn 1978; Hakstian & R.B. Cattell 1974, 1978; Horn 1979, 1980, 1986, 1987, 1989; Horn & R.B. Cattell 1966; Horn & Donaldson 1980; Horn & Hofer 1992; Horn & Noll 1994; Horn & Stankov 1982; Stankov & Horn 1980).

Gf/Gc Theory. The theory of fluid and crystallized intelligence suggests that there is enough structure among primary mental abilities to define a number of distinct types of broad cognitive ability. The theory derives its name from the two broad intellective functions most extensively studied. The main distinguishing feature between fluid and crystallized intelligence pertains to the amount of formal education and acculturation that is present either in the content of, or operations required during, tests measuring these abilities. It is well established that fluid intelligence (Gf) depends to a much smaller extent on formal education experiences than does crystallized intelligence (Gc) (e.g., Horn 1987; Horn & Hofer 1992; Stankov et al. 1995). Gf/Gc theory incorporates a number of factors in addition to the ones from which it derives its name. Some, such as broad auditory function (Ga) and broad visualization (Gv), are related to perceptual processes. Further factors, including short-term acquisition and retrieval (SAR) and tertiary storage and retrieval (TSR), are related to memory processes; others, such as clerical-perceptual speed (Gs), reflect speed in performing tasks of relatively trivial difficulty. Each of these
factors is assumed to share differential relationship with external measures (such as age) and each is postulated to arise from the workings of different cognitive and neurophysiological functions.

Reaction Time and Broad Cognitive Abilities. In reviewing and reanalyzing most of the psychometric data sets collected this century Carroll (1993) has proposed a three-stratum theory of intelligence that shares a number of similarities with the preceding model (see Roberts et al. 1999a). Within Carroll’s theory, abilities are classified as narrow, broad, and general with respect to factors sampled over the total psychometric domain. According to Carroll (1993): The three-stratum theory provides a framework within which correlations between psychometric variables and information-processing variables are to be interpreted. The cognitive correlates approach that has sometimes been adopted in studying such correlations is likely to be unsuccessful or at least misleading if its results are not properly referred to the three-stratum theory. (p. 654) Researchers utilizing the Hick paradigm have relied almost exclusively upon single-factor models of intelligence, alluding constantly to Spearman’s concept of psychometric g. Elsewhere, Stankov et al. (1995) have argued that at the conceptual level there is a misplaced faith in the principle of parsimony guiding the concept of psychometric g. Such an overly simplistic view of intelligence often attributes a greater than deserved role to lower-order cognitive processes, especially in those cases when nodes within the causal path leading to a hierarchical structure of abilities have been ignored (see also Stankov & Roberts 1997). Against this background it must be emphasized that within the speed of processing approach, intelligence has been assessed about 90% of the time by a single test, Raven’s Progressive Matrices (Juhel 1991). As a consequence, the relationship that choice RT parameters share with broad factors underlying hierarchical models of intelligence, in general, and the theory of fluid and crystallized ability, in particular, remains largely unspecified. Those few studies that have considered the relationship between RT and broad cognitive factors either employ young children as participants (e.g., Jenkinson 1983; G.A. Smith & Stanley 1983) or else use psychometric tests such as the Wechsler Adult Intelligence Scale (WAIS; Wechsler 1981) (e.g., Vernon 1983). In both cases there is reason to be cautious of the results that have been obtained. With young children it is not clear whether cognitive abilities are clearly differentiated (e.g., Bayley 1949; Burt 1954; Deary et al. 1996; Garrett 1946; Hofstaetter 1954; McCall et al. 1973; Stankov 1978; see also Horn & Hofer 1992 for an alternative view). Further, no study in the available literature demonstrates that cognitive speed differentiates from other abilities before the age of 13 years. In the case of the WAIS, studies have indicated that the scales typically employed are factorially impure (e.g., McArdle & Horn 1983). Studies using the WAIS do not provide definitive answers to questions concerning the correlation between RT and broad cognitive abilities and thus should be seen as suggestive, at best. To date, no study incorporating the Hick paradigm has explicitly examined the various empirical relationships that choice RT measures share with the many
broad factors associated with Gf/Gc theory.3 This shortcoming is curious, especially as some researchers (e.g., Carroll 1987; Horn 1985) have attempted to account for correlations between choice RT and psychometric g by specifying that they are a function of biasing the psychometric properties of cognitive tests toward specific broad abilities. To this end, one factor that has been suggested as biasing the results is Gv, as a number of processes (e.g., degree of retinal displacement suggested by Longstreth 1986) may enter into an individual’s response to visually presented RT displays. It would also appear that the memory factors short-term acquisition and retrieval (SAR) and tertiary storage and retrieval (TSR) contain an important speeded component: the speed of retrieval from either short-term or tertiary storage (see L.T. Miller & Vernon 1992). Quite clearly there is a need to redress this imbalance. Specifically, it would seem necessary to focus upon a large subset of the broad factors of intelligence identified throughout the literature to determine the relationship that each shares with RT measures.

The Cognitive Correlates Approach. The cognitive correlates approach involves the investigation of factorially simple tasks that, it was thought initially, might lead to a definition of intelligence that is both precise and explanatory (Fogarty 1984; Hunt 1978; Roberts 1999a). Although this idea is by no means novel, it is differentiated from earlier research by being guided in task selection by theory-based experimental paradigms. Proponents thus select performance parameters from an extensive range of (so-called) elementary cognitive tasks (ECTs) and, on the basis of existing substantive theory, predict relationships between cognitive and psychometric measures (Carroll 1978,1981,1993, Chapter 11; Hunt 1980). It is common practice in the cognitive correlates approach to employ a single chronometric task that is examined in relation to psychometric performance. However, it may be argued that this practice leaves the substantive meaning of obtained correlations largely equivocal. Problems of interpretation are exacerbated. The RT tasks appear to implicate differential degrees of the construct of “cognitive complexity” suggesting that “more complex RT tasks show greater correlations with intelligence” (Larson & Saccuzzo 1989, p. 7). Complexity and many other such theoretical explanations may only be rigorously evaluated providing multiple ECTs are implemented within a study’s design. Note also that a comprehensive investigation of the relationship between speed of information processing and cognitive abilities emerges since convergent and discriminant validity may be demonstrated for correlations with ability scores. At present such information is largely unavailable.

THE MEASUREMENT OF PERFORMANCE SPEED: CONCEPTS, PROSPECTS, AND LIMITATIONS

Scientific interest in performance speed and the factor structure of cognitive abilities has led to questioning of the Spearmanian approach to “intelligence.” In his review of the mental-speed literature, Carroll (1993, Chapter 11) notes problems that are implicit in any attempt to uncover the factor structure of human cognitive abilities. In particular, distinctions between speed of performance and accuracy measures are often blurred, a problem exacerbated by the fact that psychometric tests have usually been administered within what are undoubtedly arbitrary time limits.

[T]he degree to which timed tests are speeded is not usually evident, even when information is available on the time-limit and the number of items. This is true of most tests that have been used in factorial studies, and it is therefore difficult to assess the extent to which cognitive factors are determined by dimensions of speed of performance. In effect, speed is an influential and undesirable confound in many factor-analytic studies. (Carroll 1993, p. 444 [see also Berger 1982; Furneaux 1960])

Speed and Level. In tracing the historical antecedents of the above controversy, Berger (1982) has highlighted some of the tensions apparent between researchers who attempted either to understand intelligence (e.g., Spearman 1904) or who, for pragmatic reasons, sought to measure this construct by devising “psychometric” tests (e.g., Binet & T. Simon 1905a, 1905b, 1916/1983 [see Ackerman 1996 for a commentary on Binet and Simon’s “psychological method”]). Acknowledging limitations in the latter approach, E.L. Thorndike (1921; Thorndike et al. 1926) located three sources of deficiency: The tests were ambiguous in terms of their content, arbitrary in their choice of units, and of equivocal significance (Berger 1982). As a prescription against these deficiencies, Thorndike suggested cognitive ability measures be analyzed into three separate components: level, speed, and range. By level is meant number correct, thus differentiating this concept from the speed of performance, whereas range is taken to mean the aggregate of tasks that may be performed at a given level of difficulty (see Carroll 1993, p. 442). Within speed of performance, further distinctions are possible because, in principle, the measurement of speed involves a “mixture of (1) the time spent in doing some tasks correctly, (2) the time spent in doing other tasks incorrectly, and (3) the times spent in inspecting other tasks and deciding not to attempt them” (Thorndike et al. 1926, p. 33; see also Berger 1982; Furneaux 1960). E.L. Thorndike’s thoughtful critique suggested the need for detailed conceptual analyses of item solution so as to ensure a more scientific approach to the study of individual differences. Despite this evocation, E.L. Thorndike never addressed this important concern himself (H.J. Eysenck 1973). Instead, an attempt along these lines was made initially by L.L. Thurstone (1937) and later by Furneaux (1960). In both Thurstone’s and Furneaux’s models, although the terminology is largely different, there is a common attempt to provide a detailed description of test-taking behavior (see H.J. Eysenck 1967). Although Furneaux’s (1952, 1960) work has subsequently spawned several techniques that enter speed of response into an item-scoring formula (e.g., Roskam 1987; Thissen 1983; Van der Ven 1974; White 1982), criticisms have been offered of these models, with methods difficult to realize in practice (Berger 1982). Moreover, as Carroll (1993) notes, “these models have not been applied to a sufficient variety of mental test performances to permit making generalizations about how speed factors operate in different domains of cognitive ability” (p. 450).
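Purely as an illustrative formalization (this equation does not appear in Thorndike et al.), the total time recorded on a conventional timed test can be written as the three-part mixture just quoted,

    T = \sum_{i \in \text{correct}} t_i + \sum_{j \in \text{incorrect}} t_j + \sum_{k \in \text{inspected but not attempted}} t_k ,

which makes explicit why a single time limit confounds conceptually distinct sources of individual differences in speed.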

Mental Speed: Evidence for Independent Factors. In developing his model, Furneaux (1960) points to some of the problems of interpretation that accompany factorial solutions that do not take into account the proposed distinction between level and speed. Of particular interest, he speculates that primary mental abilities may in fact be artifacts of the measures employed (Berger 1982). Thus, after critically evaluating H.J. Eysenck’s (1939) reanalysis of Thurstone’s (1938) “Primary Mental Abilities,” Furneaux (1960) speculates that a variety of interpretations are possible. “The evidence could...be interpreted as supporting the hypothesis that at least a part of the apparent differentiation of Visuo-Spatial and Arithmetical tests is not due to differences in content at all, but to differences in the extent to which they measure speed as opposed to accuracy” (Furneaux 1960, p. 170). There would appear to be a number of shortcomings in Furneaux’s critique. In particular, there is a conspicuous failure to address factors that may, in principle, represent a mixture of level and speed (e.g., Spatial Visualization). Notwithstanding, Carroll (1993) has been persuaded by the general tenor of Furneaux’s arguments to offer a further suggestion: [I]t is conceivable that correlations among tests could be due only to common elements of speed. Of course, the almost overwhelming body of evidence goes counter to such an implication. Most first-order factors show clear differences in content. But what about correlations among factors--correlations that are often used to support the notion of a higher-order general factor? If tests of all first-order factors are characteristically speeded (given with a fairly severe time limit), could not the second-order general factor reflect mainly differences in speed abilities rather than the operation of a factor reflecting differences in levels of cognitive functioning that individuals can attain? This is a possibility that must be entertained, because it is by no means ruled out by the evidence presently at our disposal. (p. 461)

Given the above arguments it would seem judicious to differentiate many of the primary mental abilities found in the literature on the basis of whether these constructs appear to implicate dimensions of either level or speed. Empirical evidence for cognitive speed factors appears to be meager, so that in examining each cognitive ability there is a need to remain cautious in making definitive statements. Nevertheless, the possibility that a number of speed factors are parallel to level abilities should be acknowledged (e.g., Reasoning Speed ability).

The Factor Structure of Cognitive Speed. A review of the literature suggests that only a few psychometric speed factors have acceptable empirical status. These factors include the primary mental abilities of Number Facility and Perceptual Speed obtained from tests that are generally quite easy. Importantly, Carroll (1993, p. 613ff.) has reported evidence indicating that several of these abilities define a second-stratum ability. This second-stratum factor would appear to correspond closely to the Gs factor hypothesized under the framework of Gf/Gc theory (see, e.g., Horn & Hofer 1992; Horn & Noll 1994; Stankov et al. 1995). Elsewhere, Berger (1982) has argued that rate of performance measures derived from psychometric tasks of nontrivial difficulty tend not to have been treated with the necessary caution required to differentiate conceptually meaningful dimensions of mental speed. The reluctance of theoreticians to classify further dimensions of psychometric speed may in part reflect this. However, Horn and Hofer (1992) claim that, providing certain prescriptions are met, there would appear evidence for a broad second-order factor, Correct Decision Speed (CDS). This factor is thought to be independent of the Gs factor within Gf/Gc theory. Evidence from life-span developmental studies would seem to be supportive of this distinction. This broad second-order factor shares some parallels with the results presented in Stankov, Roberts, and Spilsbury (1994), who found a general speed of test-taking factor that is similarly independent of Gs and unable to account for any of the variance in the decline of Gf with age. Indeed, the proliferation of studies employing the cognitive correlates approach over the past two decades prompted Carroll (1993, p. 478) to include many studies employing chronometric tasks in his reanalysis of the cognitive abilities domain. These analyses indicated separate DT and MT factors as well as evidence for a second-stratum Processing Speed factor. Because of their common occurrence in the contemporary individual differences literature and their postulated theoretical importance, these speed factors are considered to form part of the previously enunciated three-stratum model of cognitive abilities that Carroll (1993, p. 626) has developed. Notwithstanding, the available literature suggests the need for more comprehensive research into the factor structure of various performance measures derived from response speed. In the past these indices have received insufficient treatment, such that information on cognitive speed is slight. For example, to date, no study has investigated the intercorrelations that RT, Gs, and speed of test-taking measures might share. Clearly, by using factor analysis in a single multivariate investigation, a number of important issues pertaining to the structure of cognitive speed might be addressed. The relative paucity of research in this general area necessitates bringing to the reader’s attention a number of further issues that require empirical clarification. A subset of these research issues includes each of the following:

1. The speed with which individuals perform psychometric tests provides additional information to that obtained from scores that are based on accuracy measures. While properties of test-taking speed parameters have been investigated in some studies (see Spilsbury 1992; Spilsbury et al. 1990; Stankov 1994; Stankov & Crawford 1993; Stankov & Cregan 1993; Stankov et al. 1994), pivotal questions remain that these investigators have proposed but failed to address. For instance, do speed scores from different tests tend to define one factor or behave in a fashion analogous to accuracy scores (i.e., define several factors)? Are speed scores factorially independent of level measures? In particular, what is the relationship between speed of test-taking and the broad speed factor, encompassed by Gf/Gc theory, Gs?

2. After examining several data sets in which RT is analyzed into its two “basic” components (i.e., DT and MT), Carroll (1993, p. 484ff.) has argued that these factors are orthogonal. However, if consideration is given to Fitts’ law (a principle relating movement to task difficulty [see Fitts 1954]; its standard formulation is given at the end of this section), the conceptual status of the MT factor thus far examined in the contemporary individual differences literature is open to question (see Roberts 1997a, 1999a).


Moreover, as factors derived from available chronometric studies are seldom overdetermined, it would appear that their interpretation is equivocal. In light of this feature of previous empirical research, the results presented in Carroll’s (1993) reanalysis of mental speed constructs undoubtedly require more systematic evaluation.

3. If cognitive speed factors are shown to exist as broad abilities, is it possible to extract from them a higher-stratum mental-speed factor? Or are speed measures derived from psychometric performance orthogonal to processing speed measures as some analyses suggest (see in particular, Carroll 1993, Chapter 15, especially Figure 15.1, p. 627)?

The importance of the above undertakings should not be underestimated. The extent to which information concerning the structure of intelligence is utilized by educationalists, developmental psychologists, clinicians, and even social-policy makers is substantial (Carroll 1993; Naglieri 1997; Neisser 1997; Neisser et al. 1996; Pallier et al. 1999; Spearitt 1996; Woodcock & Johnson 1989; Yee 1997). Plausibly, failure to address many of the issues surrounding mental speed has rendered previous models of human intelligence less efficacious than might otherwise be the case.
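For reference, the formulation of Fitts' law alluded to in point 2 above is, in its conventional textbook form (stated here as general background rather than as a formula reproduced from the studies under review):

    MT = a + b \,\mathrm{ID}, \qquad \mathrm{ID} = \log_2\!\left(\frac{2A}{W}\right)

where A is the amplitude (distance) of the required movement, W is the target width, and ID is the index of difficulty expressed in bits. Because MT, like DT under Hick's law, increases with an information-theoretic measure of task difficulty, treating MT as a purely motor variable is open to the question raised above.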

THE AIMS AND THEORETICAL RATIONALES OF THE MULTIVARIATE INVESTIGATION

The present study involves a comprehensive investigation of mental processing-speed measures within the structure of human cognitive abilities. The intention is to clarify some contentious features found in the contemporary literature and in so doing enable differential psychologists to assess more fully the efficacy of this research program. A major aim of the present study was to determine the relationship that speed of information-processing measures share with (a) cognitive abilities defined by level, and (b) cognitive abilities defined by speed. In a related vein, the study was designed to obtain information on the factor structure of mental speed. As such, the project involves investigation of a number of issues central to current theory and experimentation in the psychology of individual differences.

THE PSYCHOMETRIC FACTORS WITH WHICH PROCESSING-SPEED MEASURES ARE CORRELATED ARE POORLY DEFINED: THE IMPORTANCE OF SAMPLING FROM A BROAD RANGE OF COGNITIVE ABILITIES

Almost all of the studies of RT conducted within the individual-differences framework have focused on a definition of intelligence that assumes a unitary, overriding dimension (e.g., A.R. Jensen 1993a). The aim has been to derive a single measure of intelligence and then apply the cognitive correlates approach with the desire of showing how speed of information processing is basic (i.e., fundamental) to general intelligence (see Stankov & Roberts 1997). The imprudence of
such views has been aptly highlighted by Carroll (1993, p. 647ff.) in his reanalysis of a study examining the information-processing correlates of reading ability (see Palmer et al. 1985). Because the authors of that report failed to address the structural implications of their psychometric data, they arrived at a series of fallacious conclusions. Addressing this (and related) issue(s) satisfactorily requires an extensive battery of cognitive tests. Ideally, this would involve three or four marker tests of identified primary mental abilities, with the three or four measures of primary mental abilities defining a single, second-stratum, ability. In order to go to the next order, where the clearest evidence for a general psychometric factor (g) exists, a further three or four measures of second-stratum abilities are required. It is at this point that researchers interested in speed of information processing claim linkage between RT and intelligence occurs. However, as emphasized at several points in the current report, few studies employ more than a handful of psychometric tests.

Rationale. The preceding account would appear to explain why there is a dearth of information on the relationship between clearly defined broad cognitive ability factors and speed of information-processing measures. As this renders the meaning of mental processing speed and its relation to intelligence ambiguous, the present study was designed to address this shortcoming. The present authors have, in various articles, developed a series of arguments that show multivariate studies should be guided by the theory of fluid and crystallized intelligence (see, e.g., Roberts et al. 1999a; Roberts et al. 1999b; Stankov et al. 1995). Accordingly, the relevance of a higher-order factor remains something of a moot point (see also Detterman 1982; Horn 1985, 1998). Therefore, it was deemed necessary to try to sample as many of the second-order factors from this theory as possible. Recently, nine have been postulated (Horn & Hofer 1992), and for the sake of resources, some discretion was necessary. As they are the most well established in the theory of fluid and crystallized intelligence, the following six broad abilities were selected for investigation: Gf, Gc, SAR, Gv, Ga, and Gs. In the hope of circumventing the logistical problem of employing at least (3 x 3 x 6) = 54 psychometric tests to define these abilities, the present study employed a “higher-stratum design” (Carroll 1995). To this end, the battery of variables was selected in such a way that at least some of what were ostensibly first-order factors constituted second-stratum abilities. Caution was exercised in selecting these tests to represent a factor adequately. Recently, much of the literature involving speed of information processing has been interpreted as involving linkage with fluid intelligence. Accordingly, this factor was overdetermined, with six tests of various primary mental abilities hypothesized to underlie Gf employed in the current design.

THE DOMAIN OF COGNITIVE ABILITIES IS NOT STATIC: THE EMERGING IMPORTANCE OF MENTAL SPEED

Horn and Noll (1994) have argued that the capabilities indicating human intelligence are changing over time as a function, in particular, of technological and
cultural evolution. Within this context, developments in information technology have opened up a means of investigating the factor structure of mental speed as defined by the rate at which an individual solves psychometric test problems. An interest in mental speed so defined is not new, although a means of investigating this accurately, computerized presentation, has become practicable in psychometric assessment only recently (see Kyllonen 1991, 1998). As mentioned previously, there may be as many different abilities underlying cognitive speed as there are primary (and broad) mental abilities emanating from measures of level. Because tests of first-order factors are characteristically speeded, the existence of second- (and third-) stratum abilities might reflect differences in mental processing speed rather than differences in levels of cognitive functioning (see Carroll 1993, p. 460-461; Draycott & Kline 1994a). Extending this argument, a further possibility cannot be ruled out: Measures of speed of information processing share a correlation with cognitive abilities (defined by level) precisely for this reason rather than any of the various theoretical explanations advocated in the literature.

Rationale. Cognitive (or mental) speed divides into two major conceptualizations within this study: (a) the rate at which an individual performs complex psychometric tests, and (b) the speed of performance in which complex cognitive capabilities are only minimally involved. The terms speed of test-taking and speed of information processing are reserved (respectively) to differentiate between these constructs where necessary.

Speed of Test-taking. An aim of the study was to explore the factor structure of psychometric tests as defined by the speed (or rate) at which participants perform these tasks. In the factor analytic literature this issue is seldom addressed. Within Gf/Gc theory, where arguably more attention has been devoted to this construct than in other psychometric models, there would appear to be two factors defined by speed of test-taking: Gs and CDS (Horn & Hofer 1992). However, it seems unlikely that these constructs are the only psychometric factors comprising the domain of psychometric speed. For example, broad visualization function (Gv) is often defined by tests given within strict time limits. Would measures of test-taking time in identified marker tests of such broad constructs define second-stratum speed abilities? The question also remains as to whether or not, in extracting separate measures of level and speed from tests defining various broad abilities, these indices subsequently show up as distinct factors or as a single latent trait.

Speed of Information Processing. Although speed of information-processing measures employed in this study serve several purposes, of relevance to the issue currently raised is the factorial structure of parameters extracted from ECTs. Carroll’s (1993) reanalysis would seem to establish the existence of separate speed factors tied specifically to movement and decision processes. Arguably, the existence of such factors owes as much to the prevailing Zeitgeist as it does to technological advances. Although the existence of two speed of information-processing factors may be taken as established, the place of these factors within the structure of human cognitive abilities remains largely unexplored.


Adequate attention to the factor structure of processing-speed measures requires a wider representation of chronometric tasks than that provided in the current literature since very few studies have employed more than a handful of measuring instruments (see Kranzler & A.R. Jensen 1991a, 1991b; Nettelbeck & Rabbitt 1992; Saccuzzo et al. 1986; Tomer & Cunningham 1993 for notable exceptions). Problems in interpreting the factor structure of speed of information-processing constructs are thus likely to be manifest in the majority of studies. Equally, researchers investigating ECTs make use of three or four dependent variables selected from within the one task without addressing the issue of experimental dependence. In attempting to remedy these weaknesses, this study employed 11 chronometric tasks from which an attempt was made to isolate the most valid and reliable parameters. In short, two aims of this investigation that were formulated in order to address each of the issues raised above were: 1. To relate speed of information-processing measures to the emerging factor structure of cognitive speed. Notably, information on the relationships shared among rate of test-taking, Decision Time (DT), and Movement Time (MT) factors is virtually nonexistent. 2. To determine the role (and relevance) of the factor structure of cognitive speed in relationship to broad abilities defined by level.

SPEED OF INFORMATION PROCESSING: INCONSISTENCIES EMANATING FROM THE JUXTAPOSITION OF THE EXPERIMENTAL AND INDIVIDUAL DIFFERENCES APPROACHES

Most published studies examining the relationship between speed of information processing and cognitive abilities employ both a single ECT and a single psychometric measure. The quantitative nature of the task difficulty manipulation underlying these “experimental tasks” is often questionable. While this is not crucial per se to the design of such studies, it would seem to leave limited scope for manipulating these tasks in various fashions in order to implicate higher levels of cognitive complexity. For this reason, the chronometric tasks of the present design were generally amenable to analysis within an information-theory framework (see Attneave 1959; Garner 1962). The application of information theory in studies of RT has led to experimental designs incorporating light-key apparatus, card-sorting measures, light-voice, vibrotactile stimuli, and so forth (for reviews see E.E. Smith 1968; Teichner & Krebs 1974; Welford 1968). This is advantageous from several perspectives, including the fact that speed of information-processing tasks subscribing to information theory may be designed:
(A) In different sensory modalities.
(B) In different formats (noting that, with computer presentation in particular, older participants may suffer undue bias in their performance because of unfamiliarity with equipment [see Detterman 1987]).
(C) To involve conditions of divided attention in which task difficulty may be manipulated in both primary and secondary conditions to both equal and varying extent (see Roberts et al. 1988; 1991a).
(D) Such that, despite the diversity implied by the above, comparable intraindividual performance parameters may (in theory) still be obtained from each task. The most commonly employed information-theory model in individual differences research would appear the Hick paradigm (Roberts 1995,1999a). This paradigm is assumed to be free of higher mental processing, yet various parameters extracted from this model exhibit moderate correlation with psychometric test measures (e.g., A.R. Jensen & Vernon 1986). These variables are the major speed of information-processing parameters focused upon in the present investigation.

A Summary of Previous Limitations/Weaknesses Identified in the Literature. H.J. Eysenck (1987b, see also H.J. Eysenck 1984) has noted certain differences in the expectations and purposes of those working from within an individual differences perspective compared to those psychologists who subscribe more closely to the experimental framework. The psychometrist is concerned with a relationship postulated by theory, such as that between reaction time and intelligence...[so] inevitably needs a relatively large number of subjects.... This inevitably means that the psychometrist must choose, on the basis of as much information as is available, and otherwise on the basis of guesswork, the parameters of the experimental variables which best suit his purpose, and give him maximum information about the relationship in which he is interested. (p. 309) While there can be no doubting the distinction Eysenck proposes, to what extent this account justifies ignoring certain problems inherent in the RT literature constitutes a point of contention (see also Rabbitt 1996). For instance, a number of claims are made concerning the status of intraindividual variability measures of the Hick paradigm (A.R. Jensen 1992a; Roberts 1999a). It passes unnoticed that this variable behaves rather unlawfully in its correlation with intelligence as a function of task difficulty (A.R. Jensen 1987a). Nor does it seem of consequence that while measures of central tendency exhibit a simplex structure (largely as a natural consequence of the linear relationship of this measure over bits), there exists no study that reports evidence on the presence of simplex in intraindividual variability measures, even though this increases with set size in the same linear fashion. Studies conducted within the Hick paradigm would also appear not to have bothered to examine whether or not the proposed lawfulness of individual participants to the model is an empirical phenomenon. Thus, it should not go unnoticed that individual studies of RT generally include only four data points from which this inference is drawn (Roberts 1999a). Likewise, a large proportion of research reported in the individual differences literature seems oblivious to findings, presented by experimentalists, which have demonstrated (a) the distinctiveness of stimulus-response codes (e.g., Teichner & Krebs 1974; Wang & Proctor 1996); (b) the importance of stimulus-response compatibility (e.g., Fitts & Deininger 1954; Fitts & Seeger 1953; Fitts & Switzer 1962; Kornblum et al. 1990; Kornblum & Lee 1995); and (c) sometimes even the inappropriateness of the Hick model (e.g., Kornblum 1967, 1968; Longstreth et al. 1985; Welford 1968). Seemingly a hiatus exists
wherein dialogue between the psychometrician and experimentalist is seldom either encouraged or established. It is now approaching two decades since A.R. Jensen’s (1979) provocative report first appeared extolling the “scientific virtues” of the Hick paradigm and its importance in uncovering the underlying physiological structure of intelligence, and still no reconciliation has been achieved. Further, there appears little concern in the published literature for validating parameters obtained from the Hick paradigm according to minimally accepted standards. Instead, mean structure is generally reported, perhaps reliability, and then a potpourri of parameters depending on whichever construct would currently appear in fashion (Stankov & Roberts 1997). Superficially, this may appear something of an exaggeration. Nonetheless, consider the following: Initial interest in choice RT derived from Roth’s (1964) findings that measures of slope DT were related to intelligence. Subsequent studies have revealed the importance of intraindividual variability in DT (e.g., A.R. Jensen 1987a, 1992a), whereas more recently, median MT would appear paramount, enticing more than passing interest (e.g., Buckhalt et al. 1990; Houlihan et al. 1994). Equally compelling (at least if the results presented in A.R. Jensen’s [1987a] meta-analysis are a meaningful guide), only one study out of 26 given in Table 25 (p. 158-159) of the Jensen investigation reported correlations between all variables of the Hick paradigm and intelligence measures.

Rationale. Another aim of this study was to isolate those parameters of the Hick paradigm (and two other information-theory paradigms) having construct validity. A number of criteria given in the literature were examined for each parameter of some 11 ECTs, in as rigorous and consistent a fashion as possible. As an additional feature, an attempt was made to relate results to the experimental (as opposed to individual differences) literature whenever (and wherever) this seemed appropriate. This is a particularly complex undertaking involving a large number of conceptual and statistical analyses. Consequently, these findings have been reported elsewhere in the literature. (See Roberts [1999a], who summarizes the findings from the present battery of ECTs while simultaneously detailing their conceptual relevance, and also Roberts’ [1999b] WWW document [available also as a technical paper], which gives all pertinent statistical analyses, commentary, and references.) Nevertheless, because these findings are pivotal to how other data are to be interpreted in the present article, the major outcomes will be summarized in an appropriate section (along with descriptive statistics that are of relevance to issues raised throughout the current investigation).

COGNITIVE CORRELATES AND SECOND-STRATUM ABILITIES: A MODIFIED APPROACH

It appears that the cognitive correlates approach will remain flawed as long as it continues to attempt to establish meaningful correlations between general intelligence and chronometric performance by employing a handful of psychological tests. This approach is problematic largely because the specific source of correlation (particularly the stratum of cognitive abilities from which correlations derive) needs to be established in a rigorous manner before compelling conclusions can be drawn. However, there would appear nothing fundamentally wrong with
this approach per se, providing a broad range of cognitive abilities are sampled within a given study and speed of information-processing measures are (subsequently) referenced to this wider domain.

Rationale. The aim was to demonstrate the presence (or absence) of correlation between ECTs and second-stratum abilities as current information would appear to be cursory at best. For example, Carroll’s (1993, p. 508) summary of research involving RT and human abilities remained speculative. This tentative tone reflects an acknowledgment of specific limitations inherent in many cognitive correlates designs. In essence, these studies simply do not sample widely enough across the psychometric domain. In focusing upon a variety of chronometric measures, each of which subscribes to information-theory principles, convergent and discriminant validity with human cognitive abilities sampled in the present investigation may be established. This framework addresses various theoretical explanations proposed in the literature (e.g., the status of the limited capacity model as an explanation of intelligence, the role of task complexity, and so forth) and hence may be differentiated somewhat from concerns surrounding the delineation of the factor structure of mental-speed constructs.

INTRODUCTION TO THE STUDY

The various issues raised in the preceding sections suggest that a systematic approach to an understanding of mental processing speed is required. To this end, the present investigation first drew from a diverse array of cognitive-ability constructs. Each psychometric measure was carefully selected and analyzed so as to ensure that cognitive-ability constructs were analogous to those factors most often replicated in the individual differences literature. Second, the construct validity of the various mental-speed measures was examined in a more rigorous fashion than has been common in the past. Having met these prerequisites, measures both of speed of test-taking and speed of information processing were (subsequently) correlated with ability constructs so as to derive an explanatory model of (certain) intelligence factors. Finally, because it would seem pertinent that a better understanding of the factorial structure of speed of test-taking and speed of information-processing measures be obtained, these two types of speed were examined. In conducting this undertaking a “new” taxonomic model of mental-processing speed should result that, in turn, may be integrated with structural models of human intelligence.

METHOD

PARTICIPANTS

A total of 179 participants were involved in the study. The majority of participants (82%) were first-year psychology students at the University of Sydney, Australia.

The remaining participants were drawn from the adult population of western Sydney. The age of participants ranged from 17 to 50 years, with a mean of 21.58 years (SD = 6.18 years). One hundred and ten of the participants were women. It should be noted that the population drawn from outside the university was generally well educated, holding bachelor’s degrees or higher.

PSYCHOMETRIC MEASURES

The design was intended to provide a framework for systematically investigating speed of test-taking and speed of information-processing measures in relation to each of the issues raised in the preceding section. In all, the test battery consisted of 36 tasks. Of these tests, 25 were used to obtain six broad cognitive ability factors hypothesized under the framework of Gf/Gc theory. Each of these tests is listed in Table 1 along with the factors that they are postulated to define. As can be observed from Table 1, there were eight markers of Gf (Tests 1-8), four markers of Gc (Tests 9-12), two markers of SAR (Tests 13, 14), four markers of Gv (Tests 15-18), three markers of Ga (Tests 19-21), and four markers of Gs (Tests 22-25). Except for those marker tests hypothetically defining the clerical-perceptual speed (i.e., Gs) factor, the dependent variable was number-correct (i.e., level, see Carroll 1993, Chapter 11).

SPEED OF MENTAL-PROCESSING MEASURES

The computerized testing of broad cognitive abilities allowed measures of test-taking speed to also be collected from each psychometric task administered in this fashion (i.e., the tests printed in italics in Table 1). Previous research has shown such measures may have a factor structure distinct from number-correct scores, although the extent and nature of these differences remain largely unspecified at present (see, in particular, Stankov et al. 1994). In addition to these psychometric tests, all participants completed 11 speed-of-processing tasks (Tests 26-36). For each of these ECTs the dependent variable was time per response measured in milliseconds.4 Almost all of these paradigms also involved the manipulation of task difficulty (which could be scaled into bits using principles derived from information theory; an illustrative sketch of this scaling follows the task descriptions below). These tasks are outlined briefly in the passages that follow. For the frequently used tasks, the standard procedures were implemented and the original sources can be consulted for comparison purposes.

(26) Fitts’ Movement Task. In this task, participants were required to tap a small metal probe between two targets as quickly and accurately as possible. Following Fitts’ (1954) pioneering work, task difficulty was manipulated by changing target width. Using the formula underlying Fitts’ law, values of task difficulty were subsequently scaled into information units. Five conditions were selected for investigation: 2.88, 3.34, 4.05, 4.62, and 5.66 bits.

(27) Joystick Reaction Task. In this task (see Myors 1985), participants were presented with a central fixation point on the computer screen in addition to varying numbers of lines emanating from this at 45 degree increments from the horizontal. The number of lines ranged from 1 to 8. At the end of each line was a small open circle.


TABLE 1
Cognitive Ability Measures and Their Hypothetical Variable Structure

Test                                        Hypothesized factor
Level Measures (number correct)
 01. Progressive Matrices (RM)              Gf
 02. Letter Counting (LC)                   Gf
 03. Letter Sets (SL)                       Gf
 04. Number Series-Single (NSS)             Gf
 05. Number Series-Competing (NSC)          Gf
 06. Letter Series-Single (LSS)             Gf
 07. Letter Series-Competing (LSC)          Gf
 08. Water Jars (WJ)                        Gf
 09. Scrambled Words (SW)                   Gc
 10. General Information (GI)               Gc
 11. Vocabulary Multichoice (VM)            Gc
 12. Esoteric Analogies (EA)                Gc
 13. Digit Span Forwards (SF)               SAR
 14. Digit Span Backwards (SB)              SAR
 15. Card Rotations (CR)                    Gv
 16. Computer Form Boards (CFB)             Gv
 17. Hidden Figures-Single (HFS)            Gv
 18. Hidden Figures-Competing (HFC)         Gv
 19. Tonal Memory-Single (TMS)              Ga
 20. Tonal Memory-Competing (TMC)           Ga
 21. Speech Distortion (SD)                 Ga
Speed Measures (msec)
 22. Number Comparison (NCT)                Gs
 23. Stroop Color (SCT)                     Gs
 24. String Search (SST)                    Gs
 25. Digit Symbol (DST)                     Gs
Note: Tests administered by computer provide information on both the speed and accuracy of performance (see text for details).

open circle. The participant's task was to move a hand-held joystick from its resting position (a point corresponding to the central fixation point) in the direction of any circle that became illuminated. Reaction time (RT) was determined from the initiation of the signal to the termination of the response by the participant's movement of the joystick. There were 10 trials for each condition.

(28) Single Response Choice Reaction Task. This task was akin in design and format to the choice RT paradigm that has been employed extensively during the last two decades to examine the relationship between speed of processing and stimulus information (e.g., Jensen 1982a, 1987a). The set sizes (n) manipulated were 1, 2, 4, 6, and 8, with 10 trials per condition. A slight modification, introduced to reduce angular displacement effects, involved presenting the stimuli along a horizontal (rather than semicircular) array. Because the design of this test utilized a home key, both MT and DT were independently assessed.

(29) Tachistoscopic Choice Reaction Task. In this paradigm, two, four, or eight parallel vertical lines were exposed to each participant for periods of time ranging (in equal increments of 40 msec) between 40 and 480 msec. There were 10 trials


per condition. In all there were 360 trials covering each crossing of task difficulty and exposure duration. Within any trial condition a single line was smaller by 2% than all other stimuli. Participants were required to lift their finger from a home button and move it to a response key indicating the serial position of the perceived shortest line. This task, although following procedures underlying the Inspection Time (IT) methodology, differed from recent studies of IT in that (a) neither a forward nor a backward mask was employed, (b) conditions extending beyond a simple binary decision were included, and (c) both MT and DT were recorded in addition to the conventional measure of accuracy (see Nettelbeck 1987).

(30) Complex Choice Reaction Task. In this task, the stimuli, method of presentation, and mode of response were analogous to those employed in the Single Response Choice Reaction Task. A simple procedural difference was, however, introduced. Instead of one stimulus target becoming illuminated, several did so simultaneously. The participants' task was to press each number key corresponding to the filled-in targets. The number of targets employed was 2 for set sizes of 4, 6, and 8; 3 for set sizes of 6 and 8; and 4 for an array size of 8. By implementing a mathematical extension of Hick's law, it was again possible to determine stimulus values (see Beh et al. 1994). These ranged from 2.58 to 6.12 bits of information (i.e., stimulus values generally well beyond those normally investigated); a worked example of this scaling is given after the task descriptions.

(31) Binary Reaction Task. This task incorporated methodological aspects from both Test 28 and Test 30. In it, participants were given a simple rule requiring a binary decision and response; for example, if the number-eight light becomes illuminated press 'Yes,' otherwise press the 'No' key. Note that, in many respects, this task parallels the widely employed "odd-man-out" paradigm (e.g., Frearson et al. 1988; Frearson & H.J. Eysenck 1986; Reed & A.R. Jensen 1993). It is envisaged that this task, by "increasing the complexity of the discrimination and forcing participants to make judgments about relationships among elements of a stimulus array increases the correlation with IQ" (Diascro & Brody 1994, p. 92).

(32) Single Card-Sorting Task. This task, modeled after Crossman (1953), utilized the informational properties of a simple deck of playing cards. In it, participants were required to perform four subtasks in various random orders. These included sorting into alternate piles (0 bits); sorting according to the color of the cards (1 bit); sorting into suits (2 bits); and sorting according to number and suits (3 bits). Following the rationale detailed in Roberts et al. (1991a), this task (as well as Tests 33-35) was administered within a 60-sec time interval. This variable was subsequently transformed into a speed measure scaled in milliseconds so as to make it comparable to the other chronometric variables of the current investigation.

(33) Multitask Card-Sorting. There were three tasks that collectively defined this task structure. Each involved instructions emphasizing different attentional requirements for the simultaneous presentation of two information-theory tasks: the card-sorting paradigm described previously and the word-classification tasks to be described shortly (see Roberts et al. 1988, 1991a). In the first of these (the competing task condition) participants were required to divide their attention equally between the two tasks.
In a second version, participants were required to attend principally to the cards. In a third and final version, participants were directed to focus attention mainly upon the words (see Stankov 1987, 1989).


(34) Single Word-Classification. Four conditions were constructed for this task, corresponding to 0, 1, 2, and 3 bits of stimulus information. Each consisted of a list of 32 words defined by prearranged categories, which the experimenter read aloud. The participant was required to state (orally) the category to which a given word belonged (e.g., 'robin' = 'bird'). Upon correct classification of this stimulus the next word was immediately presented. Words were selected to represent each category name on the basis of Rosch's (1975, 1978) research on prototypes.

(35) Multitask Word-Classification. This was the word-classification component of the pairing of cards and words under the three multitask conditions.

(36) Information Storage Measures. This task consisted of two subtests thought to assess different parameters of information processing: the speed of basic information processing (BIP), denoted CK, and the duration of presence of information in short-term storage, denoted TR. To obtain the former, a Letter Reading task was used. This subtest was constructed using guidelines set down by Lehrl and Fischer (1988, 1990; see also Draycott & Kline 1994b). In elaboration of these procedures, the measurement of CK was conducted using a card on which there was a boldly printed series of 20 letters from the English alphabet (e.g., E H A Z G C T L V M I B A R U F P O Q D). Participants read each out loud as quickly as possible, correcting any errors made in the process. This procedure was repeated four times with different cards and was timed to the nearest 10 msec. To assess the latter parameter, TR, a Memory Span subtest was employed. This task actually involved two memory paradigms: digit span forward and letter span forward. Each participant provided an oral response to these task types. The string length of both letters and digits varied from 3 to 11, with two trials per string length, each worth one-half mark. To take account of possible "chunking" by participants, scores on digit span forward were corrected thus: 5 digits to 4.7; 5.5 to 4.9; 6.0 to 5.1. For participants recalling more than 6.0 digits, 0.9 was deducted from their scores. The duration-of-presence measure, TR, was obtained by averaging the letter-forward score and the corrected digit-forward score for each participant.5
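The information-theoretic scaling referred to throughout the task descriptions above (and flagged for Test 30) can be illustrated with a short sketch. This is a minimal illustration of the standard Fitts (1954) and Hick-type formulations: the target amplitude used below is a hypothetical example rather than the actual apparatus dimensions, and treating the multi-target conditions as the logarithm of the number of possible target combinations is an assumption that happens to reproduce the 2.58- and 6.12-bit values quoted for the Complex Choice Reaction Task.

```python
# Illustrative sketch of the information-theoretic difficulty scalings used by
# these ECTs. Amplitude/width values are hypothetical; only the formulas matter.
from math import comb, log2


def fitts_id(amplitude: float, width: float) -> float:
    """Fitts' index of difficulty, ID = log2(2A / W), in bits."""
    return log2(2 * amplitude / width)


def hick_bits(n_alternatives: int) -> float:
    """Stimulus information for an n-alternative choice, H = log2(n)."""
    return log2(n_alternatives)


def multi_target_bits(set_size: int, n_targets: int) -> float:
    """Assumed extension for several simultaneous targets:
    H = log2(number of possible target combinations)."""
    return log2(comb(set_size, n_targets))


if __name__ == "__main__":
    # Single Response Choice Reaction Task: set sizes 1, 2, 4, 6, 8
    print([round(hick_bits(n), 2) for n in (1, 2, 4, 6, 8)])   # [0.0, 1.0, 2.0, 2.58, 3.0]
    # Complex Choice Reaction Task: 2 of 4 targets and 4 of 8 targets
    print(round(multi_target_bits(4, 2), 2),                    # 2.58
          round(multi_target_bits(8, 4), 2))                    # 6.13
    # Fitts' Movement Task: progressively narrower targets at a fixed amplitude
    print([round(fitts_id(120, w), 2) for w in (32, 16, 8, 4)])
```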

PROCEDURE

Total testing time per participant ranged between 8 and 9 hours, split over four experimental sessions. However, because some of the tests were self-paced, the time varied between individuals. Rest pauses of 10 minutes were given to all participants at the end of 50 minutes of work, with short rest pauses also given at the completion of each task. Paper-and-pencil and computerized tasks were administered in groups of 4 to 5 participants and lasted about 4 hours, split over the first two sessions. The computerized tasks were programmed on either an Amiga or a Commodore 64 computer, with slightly modified keyboard consoles so as to ensure ease of response in the various paradigms administered in this fashion. Because some of the participants had little prior experience with these computers, time was provided during the first session for familiarization with the equipment. The remaining experimental tasks were administered on an individual basis and covered the penultimate and final 2-hour sessions. During these tasks an AMF Accusplit stopwatch


was used to assess latency. All results were recorded onto the mainframe computer for later statistical analysis. The principal statistical packages used were SPSS (Norusis 1990) and Cricketgraph (Cricket Software Inc. 1991). It should be noted that the order in which tasks were given in any test session was randomized, as were the conditions comprising each speed-of-processing task. While arguments persist in the literature concerning this aspect of design as it relates to individual-differences psychology (see, in particular, A.R. Jensen & Vernon 1986), this was envisaged as going some way toward meeting the methodological concerns explicit in Longstreth's (1984, 1986) critique.

RESULTS

ANALYSES INVOLVING BROAD COGNITIVE ABILITIES

The tasks initially analyzed are those presented in Table 1. For Tests 1-21 the number-correct (i.e., level)6 scores obtained for each participant are included in these analyses.7 For Tests 22-25 (the Gs marker tests), the dependent variable of interest was time (in msec) to complete each item. Note that all of the constructs examined in this section are well replicated in the individual-differences literature (see, e.g., Stankov et al. 1995).

Summary Statistics for Psychometric Variables Identifying Broad Cognitive Abilities. Means and standard deviations calculated for each test are presented in Table 2, which also contains (for comparison purposes) information pertaining to the number of items contained in each psychometric test. In general, the means and standard deviations obtained on the respective tests are analogous to those obtained previously with student populations, both in Australia (e.g., Davies et al. 1998; Fogarty & Stankov 1988; Roberts et al. 1991a, 1997; Stankov 1988a) and abroad (e.g., Horn 1988; Horn & Noll 1994; Horn & Stankov 1982).

Exploratory Factor Analysis. Correlations were obtained between all psychometric variables listed in Table 2 (i.e., Tests 1-21: number correct; Tests 22-25: time per item). These results are presented in Table A.1 of the Appendix.8 To determine the structure underlying the variables in question, this 25 × 25 correlation matrix was subjected to maximum likelihood analysis.9 This technique provides a principal factor (as opposed to principal components) solution and offers a statistical test of goodness of fit (Lawley 1940; Lawley & Maxwell 1963; McDonald 1985). A solution employing the root-one criterion yielded seven factors. Although the scree plot (see R.B. Cattell 1966) indicates that up to 12 factors may be acceptable for these data, there is a dip after the seventh latent root, suggesting that seven factors are likely to produce a reasonable fit. With these seven factors, the goodness-of-fit chi-square test was satisfactory (chi-square = 144.43; df = 146; p = 0.521). These seven factors were then rotated to an oblique (i.e., oblimin) solution. The ±.10 hyperplane count of this solution was 61.1%, suggesting adequate attainment of simple structure (see Boyle et al. 1995).
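As a companion to the analysis just described, the following sketch shows how a comparable maximum likelihood extraction with oblimin rotation and a ±.10 hyperplane count could be computed. It assumes the participants × tests score matrix is available as a pandas DataFrame and uses the open-source factor_analyzer package; the published analyses were carried out in SPSS, so this is an illustration of the procedure rather than the original code.

```python
# Sketch of the reported procedure: maximum likelihood factor extraction with
# oblimin rotation, eigenvalues for the root-one/scree decision, and a +/-.10
# hyperplane count as an index of simple structure. Assumes `scores` holds a
# 179 x 25 matrix of test scores (library and variable names are assumptions).
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer


def exploratory_fa(scores: pd.DataFrame, n_factors: int = 7) -> pd.DataFrame:
    # Eigenvalues of the correlation matrix guide the number-of-factors decision.
    eigenvalues = np.linalg.eigvalsh(scores.corr().to_numpy())[::-1]
    print("eigenvalues >= 1:", int((eigenvalues >= 1).sum()))

    fa = FactorAnalyzer(n_factors=n_factors, method="ml", rotation="oblimin")
    fa.fit(scores)

    pattern = pd.DataFrame(
        fa.loadings_,
        index=scores.columns,
        columns=[f"F{i + 1}" for i in range(n_factors)],
    )

    # Hyperplane count: proportion of pattern loadings within +/-.10 of zero.
    hyperplane = float(np.mean(np.abs(pattern.to_numpy()) <= 0.10))
    print(f"+/-.10 hyperplane count: {hyperplane:.1%}")
    return pattern.round(2)
```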


TABLE 2
Means and Standard Deviations of Cognitive Ability Variables

Test                                      Mean        SD     No. of Items
Level Measures (number correct)
 01. Progressive Matrices (RM)            50.04      5.79        60
 02. Letter Counting (LC)                  6.90      3.83        15
 03. Letter Sets (SL)                     10.99      2.46        15
 04. Number Series Single (NSS)           19.38      3.77        24
 05. Number Series Competing (NSC)        11.07      3.78        30
 06. Letter Series Single (LSS)           13.85      4.03        24
 07. Letter Series Competing (LSC)         8.32      3.00        30
 08. Water Jars (WJ)                      38.94     14.23        15
 09. Scrambled Words (SW)                  7.30      4.90        25
 10. General Information (GI)             10.03      3.85        20
 11. Vocabulary Multichoice (VM)          10.31      3.14        18
 12. Esoteric Analogies (EA)              15.42      3.82        24
 13. Digit Span Forwards (SF)              9.80      2.10        14
 14. Digit Span Backwards (SB)             9.18      2.38        14
 15. Card Rotations (CR)                  51.31     13.41        80
 16. Computer Form Boards (CFB)           10.44      3.52        20
 17. Hidden Figures Single (HFS)          13.18      3.96        20
 18. Hidden Figures Competing (HFC)       14.63      3.99        20
 19. Tonal Memory Single (TMS)            13.74      3.39        20
 20. Tonal Memory Competing (TMC)         13.16      3.65        20
 21. Speech Distortion (SD)               18.50      1.70        24
Speed Measures (msec)
 22. Number Comparison Time (NCT)       3000.97    903.87        48
 23. Stroop (Color) Time (SCT)          1724.92    604.45        80
 24. String Search Time (SST)           1165.94    333.18        90
 25. Digit Symbol Time (DST)            1351.99    222.43        90
Note: For Tests 1 to 21, the dependent variable is number correct from all items in the test, whether attempted or not. The one exception is Test 8, which was scored as the total number of steps performed by the participant in solving the last five items of the Water Jars Test. The minimum number of steps is 25; the closer an individual's score to this minimum, the better the performance. Thus, for later analyses (particularly as an aid to factor interpretation) Test 8 was transformed into a negative number by multiplying this score by a constant of negative 1. For Tests 22 to 25 the dependent variable is average time per item. For Test 25 this was obtained by dividing the time limit of the task (i.e., 90,000 msec) by the number of items correct.
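The scoring conventions described in the note above can be made concrete with a couple of small helpers; the example inputs are hypothetical and simply mirror the rules stated in the note.

```python
# Helpers mirroring the scoring rules in the note to Table 2 (inputs are
# hypothetical; only the arithmetic is taken from the note).

def time_per_item_ms(n_correct: int, time_limit_ms: int = 90_000) -> float:
    """Average time per item for a test scored under a fixed time limit,
    as used for the Digit Symbol Test (Test 25)."""
    return time_limit_ms / n_correct


def water_jars_score(total_steps: int) -> int:
    """Water Jars (Test 8): fewer steps is better, so the step count is
    reflected (multiplied by -1) before factor analysis."""
    return -total_steps


print(time_per_item_ms(67))   # e.g., 67 items correct -> about 1343 msec per item
print(water_jars_score(39))   # e.g., 39 steps -> -39
```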


TABLE 3
Oblimin Factor Pattern Matrix of Psychometric Tests

[Factor pattern loadings of Tests 1-25 on F1 (Gf), F2 (Gc), F3 (SAR), F4 (Gv), F5 (Ga), F6 (Gs), and F7 (IR), together with communalities (h2), are not reproduced here.]
Note: All loadings above 0.20 are in boldface. To provide some idea of the correspondence between the hypothetical and obtained factor structure, loadings in line with expectations are underlined.

Interpretation of Factor Loadings. The oblimin factor pattern solution is presented in Table 3. As for the majority of factor analyses reported in the present report, loadings above 0.20 are in boldface. To provide some idea of the correspondence between the hypothetical and obtained factor structure, loadings in line with expectations are underscored. The last column of Table 3 contains communalities (h2). Although this solution is close to the predicted structure of the battery of psychometric tests given in Table 1, it is necessary (largely because of the major aims of the present study) to consider the interpretation of these factors in some detail.

Factor 1: Fluid Intelligence (Gf). The Gf factor is easy to interpret because, as expected from a wide body of literature (e.g., Carroll 1993, p. 696), the Raven's Progressive Matrices Test has the highest loading on this factor.

In addition, as predicted from Horn and Stankov (1982) and Ekstrom et al. (1976, 1979), both Letter Counting and Letter Sets share salient loadings on the Gf factor. There are, however, two features of this factor that require additional comment. The first is that the Gv marker test, Form Boards, has a loading on this factor. This is perhaps partly because, in an attempt to construct both easy and difficult versions of this test for computer presentation, its factorial structure may have been altered somewhat. Furthermore, researchers sometimes experience difficulties distinguishing between Gf and Gv abilities, especially where the relationships among the visual patterns are not clearly manifest (e.g., Horn 1986; Humphreys 1962). The second interesting feature of this factor is the lack of salient loadings from any of the four series-completion tasks, all of which have previously been implicated as markers of Gf (e.g., Horn 1976; Horn & R.B. Cattell 1967; Myors et al. 1989; Stankov 1988b; Stankov & Myors 1990). There are two possible explanations for this outcome. Specifically, as all of these tasks share salient loadings on Factor 7, it is possible that Gf has split into two broad factors. In this account Factor 1 represents one type of fluid ability (i.e., a composite of Cognition of Figural Relations and Temporal Tracking) and Factor 7 another. Alternatively, Factor 7 may be taken to define a lower-order factor (i.e., Inductive Reasoning). Resolution of this issue is addressed in the discussion that surrounds the interpretation of Factor 7.

Factor 2: Crystallized Intelligence (Gc). The four hypothesized markers of Gc (i.e., General Information, Vocabulary, Scrambled Words, and Esoteric Analogies) all have salient loadings on Factor 2. Because no other test from the battery of measures currently analyzed shares a substantial loading on this factor, it is taken to represent breadth and depth of acquired knowledge (e.g., R.B. Cattell 1971; Horn & Hofer 1992).

Factor 3: Short-Term Acquisition and Retrieval Function (SAR). The most pronounced loadings on the SAR factor come from the two short-term memory-span tests of the WAIS-R (i.e., Digit Span Forward and Digit Span Backward). The low salient loadings of three other tasks on this factor (i.e., Scrambled Words, Raven's Progressive Matrices, and Speech Distortion) may, however, be taken to suggest that this represents a more general aspect of memory (see Carroll 1993, p. 605ff.). Accordingly, Scrambled Words would appear to implicate Tertiary Storage and Retrieval functions; Raven's Progressive Matrices, Working Memory; and Speech Distortion (like the Digit Span Tests), Short-Term Acquisition and Retrieval. However, several lines of evidence fail to add weight to a more general interpretation of this factor. For instance, none of the tests involving series completion share loadings on Factor 3, which is true also of Tonal Memory. Elsewhere, each of these tasks has been shown to involve processes that are dependent on important memory functions (e.g., Kotovsky & Simon 1973; Stankov & Horn 1980).

Factor 4: Broad Visualization Ability (Gv). As both single and competing versions of Hidden Figures have salient loadings on this factor (along with Card Rotations and Form Boards), its interpretation would appear straightforward. All of these


tasks require a complex set of abilities of the kind proposed by Horn and Hofer (1992) as defining Gv: fluent visual scanning, mind's-eye rotation, seeing reversals, visual constancy, and memory for designs and spatial events (see also Lohman 1979; Lohman et al. 1987). The salient loading of the competing task version of the Ga marker test, Tonal Memory, on this factor may be attributed to the fact that it was paired with Hidden Figures in its competing presentation. Under such conditions it is quite lawful to find two tests sharing some common variance (see Fogarty & Stankov 1982, 1988; Stankov 1983a).

Factor 5: Broad Auditory Function (Ga). The tests with the highest loadings on this factor are the single and competing task presentations of Tonal Memory. Because two other aurally presented tasks (i.e., Digit Span Forward and Letter Series [Competing]) share salient loadings here, interpretation of Factor 5 as Ga (rather than the primary mental ability, Tonal Memory; see Stankov & Horn 1980) would appear less problematic than might otherwise be the case. The salient negative loading of Stroop Time on this factor may be taken as additional evidence for the interpretation of this construct as Ga. In the experimental literature, interference mechanisms of a semantic kind are often postulated to account for decrement on this cognitive task (see, in particular, Cohen et al. 1990; MacLeod 1991). Even so, one noticeable difficulty with the interpretation of Factor 5 is the lack of a salient loading from Speech Distortion on this factor. However, it should be noted that Carroll (1993, p. 372) has suggested that this test might (under certain circumstances) reflect more directly on short-term acquisition and retrieval (SAR) abilities, and that, in the context of the present investigation, it is with the SAR factor that Speech Distortion shares its only salient loading.

Factor 6: Clerical/Perceptual Speed (Gs). The three tasks having the highest loadings on this factor (i.e., Number Comparison, String Search, and Stroop) have been found in previous research to define a Search (or Clerical/Perceptual Speed) factor. For example, Stankov (1988b) obtained a similar pattern of results in a factor analysis involving these three tasks. As in that study, the Stroop Test's loading on this factor lends support to the notion that there is an important speeded quality to performance on this task. Because the Digit Symbol Test (which was initially determined as an output score) also loads on Factor 6, it would appear that the extracted factor does not differ from the Gs construct obtained using number-correct scores under timed conditions (see Stankov 1988b). This finding also argues for the generality of this factor across different methodologies (i.e., paper-and-pencil versus computerized administration) for extracting the underlying trait (i.e., it is not merely a "method" factor). Consideration of the low (negative) salient loading of the Card Rotations Test on this factor lends still further support to this factor's interpretation, as this test involves a large number of items (i.e., 80) that must be correctly solved within a limited period of time (i.e., 180 sec).

Factor 7: Inductive Reasoning (IR). While five tasks have substantial loadings on this factor (i.e., Number Series [Single], Number Series [Competing], Letter Series [Single], Letter Series [Competing], and Water Jars), interpretation of this ability is the most problem-


atic among all cognitive factors reported in the present investigation. As the most salient loadings on this factor emanate from tasks defining the primary mental ability Inductive Reasoning (IR), this would appear the most compelling interpretation for this factor. Even so, Letter Sets (which according to Ekstrom et al. [1976, 1979] also serves as a marker for IR) has a near-zero loading here, and Water Jars is considered a marker of Attentional Flexibility at the first order rather than of IR (Stankov 1988b). It would be remiss not to suggest another interpretation for this factor, given that all of these tasks were hypothesized to load on Factor 1 and that, unlike the other constructs identified in the present analyses, IR is considered a first-stratum ability. Significantly, the three tasks having the highest loadings on Factor 1 (Gf) are all paper-and-pencil tests, whereas the five tasks having salient loadings on Factor 7 are computer-administered tests. In this interpretation, differences between these two factors are purely a function of method: there is Gf (General) and Gf (Computerized). Note that this interpretation might be preferred were it not the case that Form Boards (a computerized test) shares salient loadings on Factor 1, but not on the present factor. A further interpretation of Factor 7 as Working Memory runs into similar problems. Thus, while H.A. Simon and Kotovsky (1963) and Kotovsky and Simon (1973) have made use of working memory placekeeper (WMP) measures in series-completion tasks, it would appear necessary for Digit Span Backwards to share salient loadings on Factor 7 and for Water Jars to have a nonsalient loading here to justify an interpretation of this factor as Working Memory.10 However, for both of these tasks the present result is the opposite of the outcome required for this interpretation to have merit (i.e., Digit Span Backwards has a low loading, Water Jars a salient loading).

Factor Intercorrelations. Gf/Gc theory predicts certain relationships between second-order factors. Factor intercorrelations thus constitute a useful means of corroborating the interpretation of each of the seven (second-stratum) factors. These intercorrelations are presented in the upper diagonal of Table 4. The lower diagonal of Table 4 contains correlations between factor scores calculated for each participant for each ability using the Bartlett method implemented in SPSS (see Norusis 1990). In addition to allowing comparisons between these two indices, Table 4 provides a means of checking the validity of the factor scores that are employed in further analyses of this study. Although factor intercorrelations are almost always higher in magnitude than factor score intercorrelations, there is considerable correspondence in the rank ordering of coefficients contained in the upper and lower diagonals of Table 4. For example, the largest factor intercorrelation (i.e., between Gf and IR) happens also to provide the highest factor score intercorrelation, and the smallest factor intercorrelation (i.e., between Gc and SAR) provides the lowest factor score intercorrelation. Indeed, the Pearson correlation between the coefficients presented in the upper and lower triangles is 0.96. Whether consideration is given to factor (or factor score) intercorrelations, the trait designated Gf shares moderate to high positive correlation with four broad ability measures: SAR, Gv, Ga, and IR.
The magnitudes of these correlations, while somewhat lower than those generally found in


TABLE 4
Factor Intercorrelation Matrix (Upper Diagonal) and Correlations Among Factor Scores (Lower Diagonal)

Factor     Gf      Gc      SAR     Gv      Ga      Gs      IR
Gf          .      .21     .25     .24     .37    -.18     .41
Gc         .16      .      .03     .13     .18     .04     .28
SAR        .30     .00      .      .12     .21    -.17     .24
Gv         .20     .10     .12      .      .30    -.16     .23
Ga         .30     .14     .07     .19      .     -.09     .28
Gs        -.17     .04    -.16    -.14    -.03      .     -.15
IR         .30     .23     .11     .14     .20    -.10      .

empirical research, are as predicted from Gf/Gc theory (i.e., they indicate the relative independence of the respective factors). The factor score intercorrelation between Gf and Gc was low and nonsignificant, a finding at odds with that generally reported in the literature (but within the lower range of some studies; see, e.g., Davies et al. 1998; Roberts et al. 1997; Horn & R.B. Cattell 1967). Moreover, it should be noted that Gf is often defined by tests of primary mental ability that require participants to understand relations, comprehend implications, and draw inferences through inductive reasoning (Horn & Hofer 1992). In the present study these tasks (e.g., Letter Series) define a separate factor, IR. This same factor shares significant correlation with both Gc and Gf. Indeed, its substantial factor intercorrelation with Gf, and its similar pattern of intercorrelations with the other factors to that of Gf, add weight to its interpretation. In other words, both Gf and IR would appear to be measuring a similar latent trait. The interpretation of the factors Gv and Ga is enhanced by the moderate positive correlation that these two factors share with each other since, according to the tenets underlying Gf/Gc theory, both abilities involve perceptual organization to an important extent (Horn 1985, 1987). The fact that these two abilities are more highly intercorrelated with Gf than with Gc is also consistent with findings reported in the literature. This outcome would appear an empirical corollary of the theoretical postulation that Gf, Gv, and Ga represent abilities that are vulnerable to the influences of aging, whereas acculturated processes such as Gc are maintained over the life-span (see Horn & Hofer 1992). In terms of the overall objectives of the present study it is important to single out Gs for special consideration. It should be noted that this factor shares nonsignificant negative correlation with all factors excepting Gc, with which it shares near-zero positive correlation. It would appear that, at least for the present battery of tests, simply because a task is timed (as were a portion of the tests, e.g., Scrambled Words and Letter Sets) does not mean the extracted factor will share an empirical relationship with broad clerical/perceptual speed.
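For readers who want the factor scores referred to above in closed form, the Bartlett estimator has the standard weighted least squares expression given below; this is the textbook formula rather than anything reported in the original SPSS output.

$$\hat{\mathbf{f}} \;=\; \left(\boldsymbol{\Lambda}^{\top}\boldsymbol{\Psi}^{-1}\boldsymbol{\Lambda}\right)^{-1}\boldsymbol{\Lambda}^{\top}\boldsymbol{\Psi}^{-1}\,\mathbf{z},$$

where Λ is the matrix of factor loadings, Ψ the diagonal matrix of unique variances, and z a participant's vector of standardized test scores.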

Hierarchical Factor Solution: Do the Data Support the Existence of a General Factor?

A feature of the present study is the attempt to define not only broad cognitive factors but also a hierarchical factor (or factors). This undertaking assumes that such


a construct is supported by the intercorrelations between second-stratum factors. The data presented in Table 4 indicate rather low correlations among the extracted factors. This result suggests that a hierarchical solution with the present cognitive test battery may be neither appropriate nor necessary. Nevertheless, to explore this possibility further, a higher-order factor analysis with cognitive variables was conducted. A maximum likelihood factor analysis was performed on the seven first-order (i.e., second-stratum) factor scores provided in Table 3. The goodness-of-fit chi-square test for one factor was satisfactory (chi-square = 14.05, df = 14, p = 0.45), such that one factor was retained at the highest order. However, this factor is very weak in that it accounts for a very small percentage (i.e., 15.5%) of the common variance among the second-stratum factors. Note also that the first principal component of the correlations between test scores accounts for only 24% of the total variance. Clearly, these two different means of gauging the importance of the general factor indicate that it is not strong in the present test battery. Results of the preceding factor analysis are presented in Table 5. Factor loadings range from .71 for Gf down to -.23 for Gs. Because all second-stratum factors have salient loadings on this factor it is tempting (despite the small amount of common variance accounted for by the solution presented in Table 5) to interpret this factor as the general intelligence construct of the kind postulated by Spearman (1927). However, inspection of the relative magnitudes of the factor loadings presented in Table 5 suggests that the hierarchical factor extracted from the present battery of tests is undoubtedly biased toward Gf. In support of this assertion, the interpretation of IR and the high salient loadings of competing tasks on each of the perceptual factors should be recalled (see, in particular, Fogarty 1987; Fogarty & Stankov 1988; Stankov 1983a, 1983b, 1988c).11 This result may be interpreted as offering some confirmation of Gustafsson's (1984, 1988) claims that the hierarchical g and Gf are relatively indistinct. Notwithstanding, the outcome of any hierarchical solution depends critically on the design of the study and, in particular, on the selection of tests to be included in the analysis (see Horn 1985, 1988; Horn & Noll 1994; Roberts et al. 1997, 1998a, 1998b, 1999b). Within this context it should be noted that the lowest loadings of this hierarchical factor are for the clerical-perceptual speed factor (Gs) and the ed-

TABLE 5
General Factor Loadings From Seven Second-Order Factor Scores

Factor      F1 (GF/g?)      h2
Gf              .71         .51
Gc              .25         .06
SAR             .36         .13
Gv              .33         .11
Ga              .41         .17
Gs             -.23         .06
IR              .44         .20
Note: Each factor score corresponds to the cognitive abilities identified in Table 3.


ucationally biased Gc factor, which relative to Gf have fewer marker tests. The present result is not in agreement with various analyses of the WAIS (e.g., Matarazzo 1972) that have indicated the central role of a Gc measure (i.e., Vocabulary) in the general factor (see also A.R. Jensen 1980; Robinson 1999). An alternative interpretation of the pattern of results given in Table 5 is the absence of a general factor. Viewed thus, this factor may be taken to be an "inflated" (or bloated) Gf factor (hereafter denoted GF). To ascertain the appropriateness of this interpretation, a Schmid-Leiman transformation of the two orders of analysis was obtained (Schmid & Leiman 1957). (See Carroll 1995; A.R. Jensen & Weng 1994 for a theoretical justification of this technique as it relates to higher-order factors.) Results of this transformation are presented in Table 6. Table 6 reveals that the highest salient loadings on this factor are for Gf marker tests or for perceptual markers given as competing tasks. Except for Water Jars (which was scored in a distinct fashion), all loadings from this group of tasks were above .30. In addition, there are a number of Gc and Gs marker tests that have nonsalient loadings on Factor O2:F1. Moreover, the markers of these two factors that do have salient loadings (i.e., Scrambled Words, Esoteric Analogies, and

TABLE 6
Factor Loadings on Highest-Order Factor Obtained from a Schmid-Leiman Transformation

[Loadings of Tests 1-25 on the highest-order factor O2:F1 (GF) and on the residualized first-order factors O1:F1 (Gf), O1:F2 (Gc), O1:F3 (SAR), O1:F4 (Gv), O1:F5 (Ga), O1:F6 (Gs), and O1:F7 (IR) are not reproduced here.]
Note: Salient loadings are underlined.


the Stroop Task), have been shown elsewhere to possess high factorial complexity and thus to share loadings on several factors (i.e., TSR, Gf, and Gs [or Ga], respectively; see, e.g., Ekstrom et al. 1979; Stankov 1988b). Bearing this in mind, the pattern and magnitude of all loadings given by the Schmid-Leiman transformation suggest interpretation of this factor as GF rather than as the general factor, psychometric g.12

INFORMATION DERIVED FROM TIMED MEASURES OF COGNITIVE ABILITIES

Measures of speed of test-taking have attracted comparatively little interest in the literature relative to those abilities derived from accuracy scores (see Carroll 1993, Chapter 11; Stankov et al. 1995). Recall, for example, that within Gf/Gc theory only two speed factors (Gs and CDS) have been postulated.13 However, compelling empirical evidence has not been mounted in support of a distinction between these two speed of test-taking factors. Alternatively, speed has been incorporated with level to indicate power (e.g., Frearson et al. 1990; Furneaux 1960) or mental efficiency (e.g., Spilsbury 1992; Spilsbury et al. 1990). However, relatively few (if any) studies have been designed to examine the factor structure of speed of test-taking per se, although the evidence reviewed in the introduction to this article suggests that such an undertaking should prove profitable.

Summary Statistics for Psychometric Variables Identifying Additional Test-Taking Abilities. To address these issues, the average time (in msec) to complete each item on the computerized "level" tests (i.e., Tests 4-7 and 16-20) was obtained for each participant.14 For the three marker tests of Gs given in the same format (i.e., Tests 22-24), the number of items correctly solved was analyzed, largely because these measures had been defined by speed of response in previous analyses.15 The means and standard deviations calculated for each test are presented in Table 7, as is the number of items contained in each test. It may be noted from Table 7 that the average time to do Form Boards appears substantially longer than for any other test. This outcome may be explained by reference to a qualitative difference between this task and all other computerized measures. Form Boards was the only task in which items were presented directly for solution, all other stimuli being presented in sequential fashion with time of response recorded from the last stimulus presented until key press. Thus, solution time on all other tasks would be considerably higher (by a constant amount) if total processing time were measured (as in Form Boards). Table 7 also shows that means and standard deviations vary considerably among the psychometric tests.

Factor Analysis of Additional Test-Taking Abilities. Correlations between the variables given in Table 7 were analyzed using a maximum likelihood extraction procedure and oblimin rotation. A solution employing the root-one criterion yielded three factors. With these three factors, the goodness-of-fit chi-square test was satisfactory (chi-square = 44.12; df = 33; p = 0.09). Results of this factor analysis are given in Table 8.

Interpretation of Factor Loadings. It will be noted from the solution depicted in Table 8 that the variables possess a definite structure. Thus, there are two factors defined by test-taking speed parameters and a single factor that is defined by


TABLE 7
Means and Standard Deviations of Timed Measures of Level Abilities and Level Measures of Timed Abilities

Test                                        Mean          SD       No. of Items
Timed Measures (msec)
 04. Number Series Single (NSTS)          6,768.18     1,973.80        24
 05. Number Series Competing (NSTC)       7,468.24     2,368.97        30
 06. Letter Series Single (LSTS)          7,716.57     3,302.44        24
 07. Letter Series Competing (LSTC)       6,074.74     2,028.18        30
 16. Computer Form Boards (CFBT)         44,555.68    18,486.48        20
 17. Hidden Figures Single (HFTS)         2,154.90       643.68        20
 18. Hidden Figures Competing (HFTC)      2,935.70       926.66        20
 19. Tonal Memory Single (TMTS)           1,804.65       687.31        20
 20. Tonal Memory Competing (TMTC)        2,581.04     1,006.95        20
Level Measures (number correct)
 22. Number Comparison (NCNC)                42.93         4.14        48
 23. Stroop (Color) (SCNC)                   71.98         6.56        80
 24. String Search (SSNC)                    84.83         4.09        90

measures of accuracy in tasks of relatively trivial item difficulty. Because these outcomes were not anticipated from previous studies, consideration is given to an interpretation of each of the factors presented in Table 8.

Factor 1: Visual/Auditory (Perceptual) Test-Taking Speed (Tv/a). Five tasks have salient loadings on Factor 1 (Form Boards and the single and competing task manipulations of Hidden Figures and Tonal Memory). In sum, while accuracy scores for these tasks define two separate broad abilities (i.e., Gv and Ga), they appear to define a single latent trait for timed measures of performance. Thus, interpretation of this factor as visual/auditory test-taking speed would seem justified. However, it should be noted that the competing versions of the Hidden Figures and Tonal Memory Tests have the highest loadings on this factor, plausibly sharing common variance because of the nature of task presentation. In mounting a case for a general-ability construct, Stankov (1983a) notes a similar result with accuracy scores: competing tasks have higher salient loadings than do single tasks on the general factor. Even so, Factor 1 is taken to represent Visual/Auditory (Perceptual) Test-Taking Speed, as weighing against generality is the observation that none of the other speeded tasks share a salient loading on this factor. The fact that timed measures have a distinct factorial structure suggests that a nomenclature for these factors (paralleling that of level abilities) be developed. It is proposed that this factor be labeled Tv/a. The notation T is preferred over S both to accentuate the timed quality of this factor and to differentiate it from Spearman's specific ability(s). The subscript v/a, in contrast, indicates the dimensions of level with which this chronometric ability corresponds (i.e., visual and auditory).16

Factor 2: Clerical/Perceptual Accuracy (CPA). Three tasks have high salient loadings on this factor (i.e., Number Comparison,

Stroop, and String Search). In previous


TABLE 8
Factor Analysis of Timed Measures (of Previously Identified Level Abilities) and Level Measures (of Previously Identified Timed Abilities)

Timed Measures (msec): 04. Number Series Single (NSTS), 05. Number Series Competing (NSTC), 06. Letter Series Single (LSTS), 07. Letter Series Competing (LSTC), 16. Computer Form Boards (CFBT), 17. Hidden Figures Single (HFTS), 18. Hidden Figures Competing (HFTC), 19. Tonal Memory Single (TMTS), 20. Tonal Memory Competing (TMTC). Level Measures (number correct): 22. Number Comparison (NCNC), 23. Stroop (Color) (SCNC), 24. String Search (SSNC).
[Loadings on F1 (Tv/a), F2 (CPA), and F3 (Tir), together with communalities (h2), are not reproduced here.]
Note: Figures in bold type represent salient loadings (i.e., loadings above .20).

psychometric studies, the dependent variable assessed in these tasks has been the number of items correctly solved within a limited period of time. In turn, by noting the speeded quality of these tasks, loadings have been interpreted to reflect the second-stratum Gs factor. In theory, each of these tests represents performance requirements having minimal cognitive complexity. Assuming that the participant is cautious enough, accuracy of performance on any of these (so-called) clerical tests should approach 100%. That these tests presently define a single factor (by virtue of the fact that the number of correct responses was recorded under potentially limitless time constraints) is somewhat surprising. Apparently there are important individual differences in the speed-accuracy trade-off that may be captured by a single latent trait. To the extent that, with proper care, any individual can do these tasks without recording errors, this factor probably reflects aspects of carefulness in situations requiring close attentional deployment (French 1957; see also Carroll 1993, p. 550ff.; Stankov 1983a). However, to avoid possible unwanted connotations with dimensions of personality, it is presently interpreted as Clerical/Perceptual Accuracy (CPA).

Factor 3: Induction Speed (Tir). In the present study, the four tasks that load on this factor (i.e., the single and competing task versions of Number Series and Letter Series) define an Inductive Reasoning primary mental ability (IR) when measured by level. Presently, factor interpretation is dependent on the fact that loadings are derived from speed of test-taking measures. With these qualifications in mind, and observing that there are no salient loadings from any other variable on this factor, interpretation of Factor 3 as Induction Speed (Tir) would seem straightforward.17

Factor Score Intercorrelations of Additional Test-Taking Abilities. To validate the interpretation of the above three test-taking abilities, it is necessary to consider a number of criteria, including factor (and factor score) intercorrelations. These re-


sults are presented in Table 9, which demonstrates close correspondence in the magnitude of the correlations given in the upper (factor) and lower (factor score) triangles of this matrix. In each triangle, there is moderate intercorrelation between Tv/a and Tir, with each of these time-dependent factors sharing near-zero factor (and factor score) intercorrelation with CPA. These results indicate the relative independence of the speed factors from CPA. They are also suggestive of a more general mental speed factor.

Correlations of Additional Test-Taking Abilities with "Traditional" Broad Cognitive Factors. The preceding analyses have established that measures of speed of performance in psychometric tests possess a factorial structure that is not unlike that observed for level abilities. In an attempt to understand these constructs more fully it is necessary to examine the empirical relationships that "speed" measures share with "level" variables. For this purpose, the three "timed" factor scores were correlated with the factor scores corresponding to each of the seven broad cognitive abilities given in Table 3. These correlation coefficients are presented in Table 10. Regarding the issue of cognitive speed, it should be noted from Table 10 that both Tv/a and Tir share their highest correlation with Gs, suggesting that a general speed factor may circumscribe the measures defined by test-taking time. The potential generality of this factor is highlighted by the range of complexity inherent in performance of the tasks defining this construct. Thus, they extend from the trivially simple (e.g., Number Comparison) to tests that are extremely complex (e.g., Letter Series [Competing]). The significant positive correlation between Gs and CPA lends support to interpretation of the latter factor as being related to carefulness. However, CPA shares different patterns of correlation than either Tv/a or Tir with the other cognitive abilities, indicating a difference in the processes captured by the CPA and T factors. It can also be observed in Table 10 that the factor Tv/a shares significant negative correlation with all but two level factors, Gc and IR. These findings suggest that speed of perceptual organization may be germane to a variety of human cognitive abilities. Equally, it would not appear to be related to processes that are potentially the product of acculturation. Conversely, the factor CPA shares significant correlation with Gf, SAR, Ga, and IR, suggesting that working carefully through an easy test is an advantage when the nature of the task itself is somewhat novel and/or attention demanding. Note that this finding would appear to link CPA quite closely to those abilities that are vulnerable to the effects of aging. While this result is theoretically predicated in some of the research by Horn and

TABLE 9
Factor Intercorrelation Matrix (Upper Diagonal) and Correlations Among Factor Scores (Lower Diagonal) of the Solution Given in Table 8

Factors: Tv/a, CPA, Tir.
[The individual coefficients could not be recovered reliably from this copy; as described in the text, Tv/a and Tir correlate moderately with each other and near zero with CPA.]


TABLE 10
Correlation of "Speed" (Factor) Scores with Psychometric Variables Identifying Broad Second-Order Factors

Variable      Tv/a      CPA      Tir
Gf             .31      .23     -.07
Gc             .02      .03     -.02
SAR           -.22      .23     -.07
Gv            -.39      .13     -.10
Ga            -.18      .16     -.19
Gs             .54      .21      .22
IR            -.10      .24      .13
GF            -.41      .28     -.03
Note: Significant correlations (p < 0.01) are in boldface.

collaborators (e.g., Horn & Hofer 1992), it finds ready empirical confirmation, apparently for the first time, in the present data set. Given the processes required in tasks defining IR, it is somewhat surprising that this factor shares nonsignificant correlation with Tir. It might be presumed that taking too long to solve an inductive-reasoning problem should cause decay of the information needed to solve such problems. Indeed, faster solution time would seem a prerequisite to successful performance given the working-memory constraints imposed in tasks of this nature. However, the data clearly contradict this supposition. A similar phenomenon has been observed in the literature with the correlation between accuracy and test-taking time on Matrices Tests, which has come to be known as the test-speed paradox (e.g., Horn et al. 1981; White 1973). A related observation is the absence of correlation between age and speed of working through difficult items. An explanation as to why these phenomena pertain only to the most complex of tests is no doubt required. Obviously this finding does not hold true for all types of cognitive ability, as evidenced in Table 10 (i.e., the correlation between Tv/a and both Gv and Ga is significant). One entertaining possibility is that level and speed scores on a given psychometric test fan out in their correlation as cognitive complexity increases, or as the type of learning required for successful solution shifts from a perceptual to a more cognitive kind (see Roberts et al. 1991a). In fact, given ceiling effects with accuracy measures obtained from psychomotor paradigms, it is tempting to conclude that the correlation between speed and level within tasks follows an inverted U-function as tests shift along a mode-of-processing continuum (see Lindley et al. 1995).18 Overall, evidence from factor-score intercorrelations indicates that each of the speed factors identified in this study (i.e., Tv/a, Tir, and Gs) is structurally independent, having different magnitudes and patterns of correlation with each other and with broad cognitive factors defined by level. Similarly, the evidence suggests CPA is independent from each of the broad factors currently sampled and is most notably not simply an artifact derived from the measurement of Gs.

Validation of the Factorial Structure of Timed Performance. To this point, the present study has reported two data sets dealing with cognitive abilities. The first


focused upon psychometric variables assessed in the usual fashion (i.e., number correct); the other involved analysis of timed measures as an indicator of cognitive performance. In sum, the evidence suggests that "traditional" variables behave in a fashion oftentimes replicated in the literature. There are several well-defined second-stratum factors that define, at the next order of analysis, a general (albeit weak) factor. The evidence from speed measures suggests a factor structure that is similar in the sense that there are not one but three discrete factors. Of these three factors, two (CPA and Tir) were defined by tasks that provided measures of Gs and IR in the initial analysis. The third factor (Tv/a) is defined by tasks that provided two separate factors when assessed by level (i.e., Gv and Ga). However, questions concerning the underlying model remain. In particular:

1. Are speed factors merely an artifact of method or perhaps of the types of statistical analyses employed?
2. How independent are speed constructs from abilities defined by level?
3. Are psychometric speed factors as broad as level factors?
4. At what stratum do linkage(s) between speed and level abilities occur? That is, do they occur at the second-, third-, or a higher-order of analysis?
5. What is the status of the additional speed factors in relation to the well-replicated Gs factor?

Analyses Incorporating Level and Speed Measures of All Cognitive Ability Tests. The analyses reported in the passages that follow were conducted in order to resolve some of the above-mentioned issues. To this end, the 24 tests of the battery in which accuracy scores were obtained were included in a factor analysis with the 13 tests of the battery in which solution times were recorded. The descriptive statistics underlying performance on each of these tasks were provided earlier in Tables 2 and 7. A maximum likelihood analysis was performed on the 37 variables comprising this data set. While the root-one criterion suggested 11 factors, for the objective of validating abilities already identified, a 10-factor solution was preferred. The goodness-of-fit test for this solution was satisfactory (chi-square = 413.71, df = 341; p = 0.01). The ±.10 hyperplane count for the oblimin solution is 64.6%, a value indicative of adequate attainment of simple structure. The oblimin factor pattern matrix of this solution is presented in Table 11. Interpretation of each of these factors is straightforward given parallels to the results presented in Tables 3 and 8. There are seven factors defined by number-correct scores and three by cognitive speed measures. The seven level factors may be interpreted respectively as: Gf (Factor 1), Gc (Factor 2), SAR (Factor 3), Gv (Factor 4), Ga (Factor 5), IR (Factor 7), and CPA (Factor 9). The three speed factors that may clearly be identified are: Gs (Factor 6), Tv/a (Factor 8), and Tir (Factor 10). Most importantly, within this solution, measures that are experimentally dependent (e.g., Letter Series [Time] and Letter Series [Number Correct]) do not band together to define task-specific factors, but instead define the factors previously identified as being much broader (i.e., in this example, Tir and IR, respectively). Factor intercorrelations are presented in Table 12. There are several interesting features of this solution. First, close correspondence exists between the coeffi-


TABLE 11
Oblimin Factor Pattern Matrix of All Speed and Number Correct Measures That Are Able to Be Derived from the Cognitive-Ability Tests of the Present Battery (i.e., Tests 1-25)

[Pattern loadings and communalities (h2) of the 24 number-correct (level) measures and 13 solution-time (speed) measures on ten factors: F1 (Gf), F2 (Gc), F3 (SAR), F4 (Gv), F5 (Ga), F6 (Gs), F7 (IR), F8 (Tv/a), F9 (CPA), and F10 (Tic).]

Note: Numbers in boldface represent salient loadings (i.e., loadings above .20).


There are several interesting features of this solution. First, close correspondence exists between the coefficients reported in Table 12 and the factor-score intercorrelations presented in three separate analyses of the present study (i.e., results presented in the upper diagonals of Tables 4 and 9 and throughout Table 10). The Pearson correlation between values reported in these various tables is high (r = 0.870). Second, almost without exception, negative correlations exist between factors defined by speed and factors defined by level. Third, intercorrelations between factors defined by speed are moderate and positive, as are the intercorrelations between factors defined by level. Although the magnitude of these various coefficients is sufficient to consider a hierarchical solution, this analysis will be reserved for the complete data set in which time measures were obtained.

INFORMATION DERIVED FROM SPEED OF INFORMATION-PROCESSING MEASURES

Differentiating between Lawful and Problematic Parameters of ECTs. The tasks discussed in this section are the various processing-speed paradigms whose conditions each, in principle, subscribe to a particular aspect of the information theory model (i.e., Tests 26-36). Commonly within contemporary individual differences studies, a number of parameters are extracted from each elementary cognitive task (ECT). This outcome follows from the fact that both MT and DT may be obtained for the majority of ECTs across a number of different treatment conditions. Table 13 lists all measures that were initially obtained in the present multivariate investigation, along with the various paradigms where this assessment was theoretically (or pragmatically) possible. It should be noted from Table 13 that a series of fairly disparate measures could be derived from each chronometric task. Elsewhere, A.R. Jensen (1987a) has presented a range of procedures by which such parameters should be validated. For example, the median DT variable alone should exhibit the following properties as a function of the levels of stimulus information (i.e., bits) comprising an RT paradigm:

1. Adherence to Hick's law in both (a) group and (b) individual data sets.
2. Acceptance of the Hick model over alternative frameworks (e.g., Hick's correction for temporal uncertainty) when examined in terms of fit statistics.

TABLE 12
Factor Intercorrelation Matrix of the Oblimin Pattern Matrix Solution Given in Table 11

[Intercorrelations among the ten factors of Table 11: Gf, Gc, SAR, Gv, Ga, Gs, IR, Tv/a, CPA, and Tic.]


TABLE 13
Frequently Measured Parameters of ECTs

Individual Differences Variable | Description | Symbol | Tests
Median DT | Median DT over all trials for a given number of bits. | DT0, DT1, etc. | 28-35
Mean Median DT | Mean of all the median DTs over bits. | DTx | 28-35
Intraindividual Variability of DT | The average standard deviation (SD) of DTs over trials at each number of bits. | sdDT0, sdDT1, etc. | 28, 30, 31
Mean Intraindividual Variability of DT | The mean of the average standard deviation (SD) of DTs obtained at each number of bits. | sdDTx | 28, 30, 31
Median MT | Median MT over all trials for a given number of bits. | MT0, MT1, etc. | 26, 28-35
Mean Median MT | Mean of all the median MTs over bits. | MTx | 26, 28-35
Intraindividual Variability of MT | The average SD of the MTs at each number of bits. | sdMT0, sdMT1, etc. | 28, 30, 31
Mean Intraindividual Variability of MT | The mean of the average SD of MTs obtained at each number of bits. | sdMTx | 28, 30, 31
Intercept of MT | Intercept of the regression of mean MTs on bits. | MTa | 26
Slope of MT | Slope of the regression of mean MT on bits. | MTb | 26
Fit of MT | Index of fit determined from the regression of MT on bits (Pearson product moment coefficient). | MTr | 26
Median RT | (Median DT + Median MT) over all trials for a given number of bits. | RT0, RT1, etc. | 27-35
Mean Median RT | Mean of all the median RTs (as above) obtained at each number of bits. | RTx | 27-35
Intraindividual Variability of RT | Average SD of the RTs (i.e., sdDT + sdMT) at each number of bits. | sdRT0, sdRT1, etc. | 27
Mean Intraindividual Variability of RT | The mean of the average standard deviation (SD) of RTs obtained at each number of bits. | sdRTx | 27
Intercept of RT | Intercept of the regression of median RTs on bits. | RTa | 27-30, 32-35
Slope of RT | Slope of the regression of median RTs on bits. | RTb | 27-30, 32-35
Fit of RT | Index of fit determined from the regression of RT on bits (Pearson product moment coefficient). | RTr | 27-30, 32-35

Note: After A.R. Jensen, 1987a, Table 1, p. 111. The final column of this table lists ECTs of the present study assessing each construct.

3. Significant correlations at each set size that respectively exhibit simplex structure (i.e., a pattern of correlation coefficients in which values close to the main diagonal are large and taper off toward the bottom left-hand corner of the obtained matrix).
4. Reliabilities exceeding 0.6 at each bit level.
5. A well-replicated series of empirical relationships with other parameters obtained from the same task (e.g., sdDT).

Note also that additional parameters (e.g., slope RT) are required to satisfy analogous and/or further requirements.


Table 13 also contains a number of parameters presently investigated that are not well represented in the literature. For instance, slope MT has so far escaped even the most tentative of explorations, largely because studies in the individual differences domain have ignored the information theory law proposed for psychomotor movement (Fitts 1954). For unrelated reasons, intraindividual regression parameters derived from RT (i.e., DT + MT) have been disregarded in favor of the "parallel" information provided by DT measures. However, G.A. Smith (1989) notes a fundamental flaw with any approach that focuses on the latter (arguably less informative) index: some participants may use a "hovering strategy," which in turn makes assessment of this aspect of participants' performance highly problematic (see also Welford 1986). Similarly, the fit statistic for intraindividual regression lines (i.e., RTr) has not elsewhere been investigated. Its inclusion in Table 13 is prompted by two observations: (a) some participants do not fit the Hick model well (Barrett et al. 1986); and (b) the extent to which this occurs may represent an individual differences phenomenon in its own right, a claim that until the present data set had escaped empirical evaluation (H.J. Eysenck 1987a).

However, it turns out that many of the parameters listed in Table 13 have poor construct validity. Thus, in a series of correlational and regression analyses involving all variables given in Table 13, Roberts (1995, 1999a, 1999b) has demonstrated that measures of central tendency (in DT, MT, and RT) alone adhere to lawful principles. In contrast, both intraindividual variability measures and intraindividual regression parameters lack construct validity. For example, the sdDT parameter fails to exhibit simplex structure in any of the ECTs of the battery (i.e., Tests 27-28, 30, 31) despite the fact that this outcome is clearly predicted by information theoretic principles. Intraindividual variability in DT also tends to share so high a correlation with median DT performance as to render the measure redundant. Intraindividual regression parameters are equally problematic in that a large number of individuals do not adhere to the underlying model (e.g., Test 28, where some 40% of participants fail to conform to Hick's law), nor do the measures correlate highly (e.g., the average r for the slope of RT is 0.11). These results cannot simply be attributed to methodological artifacts or sampling practices because, when each data set is reduced to dimensions similar to those more frequently reported in the literature (e.g., only four levels of task difficulty are examined rather than five), the results are in general agreement.

Regarding measures of central tendency, it should be noted that for the present battery of ECTs, DT, MT, and RT parameters were consistently shown to be valid. To give the reader an impression both of the nature and degree to which construct validity was exhibited in these measures, the main findings are summarized below (see also Roberts 1999a, 1999b).

1. DT, MT, and RT measures were highly reliable. Cronbach alphas ranged between 0.55 (MT3.00 in Test 28) and 0.99 (DT3.00 in Test 29) within bit levels, with overall reliability exceeding 0.90 for the DTx, MTx, and RTx parameters of each ECT.

2. The mean results obtained with DT, MT, and RT in each ECT share a good deal of correspondence with findings reported in the experimental literature. In the case of MT, for example, the rate of transmission in Test 26 (calculated as the inverse of the slope constant) was 12.7 bits/sec. This value is very close to that obtained by Fitts (Fitts 1954; Fitts & Petersen 1964) and other researchers who have employed this paradigm (e.g., Welford 1968; Welford et al. 1969).


Similarly, in the case of RT, the rate of transmission in Test 32 was 2.69 bits/sec, a value that is close to that reported by a number of investigators employing the card-sorting methodology (e.g., Crossman 1953; Roberts et al. 1988, 1991a). Note also that the stimulus-response (S-R) codes (e.g., light-key [Tests 28, 30, 31] vs. word-voice [Tests 34, 35]) manipulated across ECTs had significant effects on RT, such that differences in slope and intercept functions between tasks were as predicted in the literature (e.g., Brainard et al. 1962). Parenthetically, the nature of the S-R code, while often acknowledged for its importance in cognitive models of RT (e.g., Teichner & Krebs 1974, who devote much of their review of choice RT studies to this factor), has received scant attention in the individual differences domain.

3. Group conformity to underlying information theoretic principles was also highly adequate. For example, while each of the tasks clearly involved different cognitive requirements, model fit to the Hick-Hyman law (CRT = SRT + k log2 n, where CRT is choice RT, SRT is simple RT, k is the slope constant, and n is set size) exceeded 0.96 for all ECTs in which a relationship between DT and bits was hypothesized. Note, however, that in three ECTs where a high degree of stimulus-response (S-R) compatibility was prevalent (Tests 27, 34, 35), a more accurate fit was provided by the Power Function (CRT = SRT + k[1 - n^-1]). This too was as predicted by the available literature (see, in particular, Longstreth et al. 1985; and also the reanalysis of several data sets pertinent to this point by Roberts 1995, 1999b). A sketch of how these two functions may be fit and compared is given after this list.

4. Each parameter exhibited robust measurement properties. For instance, application of conjoint measurement procedures (e.g., Luce & Tukey 1964; Michell 1990; Stankov & Cregan 1993) within a selection of ECTs (in particular, Tests 32-35) revealed that MT, DT, and RT variables consistently met the conditions for the assumption of quantitative structure.

5. Simplex patterns of intercorrelation (Guttman 1955, 1965) were exhibited in many of the data sets. To account for the lawfulness of this phenomenon, A.R. Jensen (1987a) adopted an "overlap model" based on the idea of common elements between variables. This model predicts the actual magnitude of correlation between different set sizes. In the present study, conformity to the "overlap model" was adequate across the majority of data sets (r's generally exceeded 0.70).

6. Measures correlated highly enough among themselves, indicating that they do indeed measure a valid construct. In particular, the average correlation between the MTx of any two ECTs of the battery was 0.38, whereas the average correlation between the DTx of any two ECTs was 0.36. Of critical importance, MTx also shared a low average correlation of 0.16 with DTx across the battery of ECTs. It would thus appear that these two constructs are independent.

The theoretical implications of these findings are discussed in detail by Roberts (1999a). However, for present purposes there is an important (practical) corollary underlying the finding of poor construct validity in intraindividual regression and variability parameters. In particular, it would appear unnecessary to report the correlation that each of the variables listed in Table 13 shares with broad cognitive-ability constructs. This is no small point when it comes to interpreting the relationship that speed of mental-processing measures share with both "level" and "speed" factors derived from psychometric performance.
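As flagged under point 3, the contrast between the Hick-Hyman law and the Power Function reduces to two linear regressions on different transformations of set size. The sketch below is illustrative only and uses hypothetical mean RTs; it is not the fitting routine used in the present study.

```python
# Minimal sketch (not the authors' code): fitting the Hick-Hyman law,
# CRT = SRT + k*log2(n), and the Power Function, CRT = SRT + k*(1 - 1/n),
# to mean choice RT over set size n. RT values are hypothetical placeholders;
# the index of fit is the Pearson r between observed and fitted values.
import numpy as np

n = np.array([1.0, 2.0, 4.0, 8.0])            # set sizes (0-3 bits)
rt = np.array([420.0, 505.0, 600.0, 688.0])   # hypothetical mean RTs (msec)

def fit_linear(predictor, y):
    X = np.column_stack([np.ones_like(predictor), predictor])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    fitted = X @ coef
    return coef, np.corrcoef(fitted, y)[0, 1]

(hick_srt, hick_k), hick_r = fit_linear(np.log2(n), rt)
(pow_srt, pow_k), pow_r = fit_linear(1.0 - 1.0 / n, rt)

print(f"Hick-Hyman:     SRT={hick_srt:.0f} k={hick_k:.1f} r={hick_r:.3f}")
print(f"Power Function: SRT={pow_srt:.0f} k={pow_k:.1f} r={pow_r:.3f}")

# The rate of information transmission (bits/sec) is the reciprocal of the
# Hick slope expressed in seconds per bit.
rate_bits_per_sec = 1.0 / (hick_k / 1000.0)
```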


To give due attention to these measures would require that hundreds of correlation coefficients be considered for each cognitive-ability factor, a particularly complex and arduous undertaking. However, even allowing for this fact, there remained 273 measures of median (or mean) performance, as well as various mean median (or mean) parameters that could be calculated over conditions and throughout an entire task. Because of the disparate aims of the investigation, it was decided to focus upon a measure (or measures) that would retain the essential features of the various experimental manipulations while making interpretation as simple as possible. Based on various rationales (e.g., the fact that the most reliable parameter extracted from any ECT was performance averaged over conditions), it was decided to employ the MTx, DTx, and RTx parameters from each ECT in further analyses (Roberts 1999b). In the passages that follow, the descriptive statistics for the MTx, DTx, and RTx parameters are reported for each ECT, along with some specific comments on the interpretation of performance in these tasks.

Summary Statistics of Movement Time (MTx) for each ECT. MTx was obtained in each of the information theory tasks by summing MT across experimental conditions and then dividing by the number of such conditions. These results are presented in Table 14, which indicates a particularly interesting feature: low standard deviations across each measure. As expected, because only psychomotor movement was involved, the shortest mean MT was obtained in Fitts's Movement Task. The next shortest MTx (providing a value very close to this outcome) was in the Binary Reaction Task. This result was also not unexpected given aspects of its design. In short, participants had to indicate their response on either the left- or right-hand side of the keyboard, with the ballistic movements likely to become highly automatic over time (see Roberts 1999b) (i.e., the participants did not have to search the array to match their response to the target, as is frequently the case in RT paradigms). At the other end of this performance continuum, the longest mean MT was obtained for the Complex Reaction Task (Test 30) when not corrected for the number of target responses. (In this ECT, participants were required to make multiple responses. Variable 30A represents the time required to make all responses, while Variable 30B is the same measure divided by the number of responses [see Beh et al. 1994].) Although the correction made to this measure would appear to make this aspect of performance more consistent with the other measures of MT obtained in many of the present tasks, as will be demonstrated shortly, this procedure is questionable. Apart from this finding, the Tachistoscopic Reaction Task alone (among the computerized paradigms) would seem to lead to excessive MT relative to that generally obtained in this component of performance.

Indeed, with one or two major exceptions, it might be argued that there is actually relatively little difference in the mean values of MT parameters obtained across these various ECTs. It is possible to offer a particularly compelling reason for this close correspondence. Accordingly, Fitts's Movement Task was the only experimental paradigm in which psychomotor movement was actually manipulated. In this paradigm, task requirements dictated that aimed ballistic movement was the only component of performance.
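Concretely, the averaging just described can be expressed in a few lines. The sketch below is illustrative only; the trial values and the two-condition layout are hypothetical rather than data from the present battery.

```python
# Minimal sketch (not the study's scoring code): deriving MTx, DTx, and RTx
# for one ECT as the mean of per-condition medians. Trial values are
# hypothetical placeholders in msec.
import numpy as np

mt_trials = {0: [362, 371, 358, 366], 1: [401, 395, 412, 399]}   # by condition
dt_trials = {0: [408, 402, 399, 415], 1: [522, 540, 531, 528]}

mt_x = np.mean([np.median(v) for v in mt_trials.values()])
dt_x = np.mean([np.median(v) for v in dt_trials.values()])
rt_x = mt_x + dt_x        # RT treated as the composite of DT and MT (Table 13)

print(mt_x, dt_x, rt_x)
```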


TABLE 14
Summary Statistics of MT Measures (Averaged Over All Conditions) for Each of the ECTs of the Present Battery (in msec)

Test Variable                     Mean    SD
26. Fitts's Movement MT            367    48
28. Single-Response MT             411    80
29. Tachistoscopic MT              609   165
30A. Complex Choice MT            1225   260
30B. Complex Choice MT             454    92
31. Binary Reaction MT             371    86
32. Cards Single MT                463    86
33. Cards Multitask MT             482    96
34. Word Single MT                 819    48
35. Word Multitask MT              895    78

Note: MT data were not obtained in one of these ECTs: Joystick Reaction (Test 27).

The amount of information transmitted over each condition may be obtained from the information theoretic model for aimed ballistic movement: Fitts's law (i.e., MT = k log2[A/W + 0.5], where k is the slope constant, A is target distance, and W is target width). The present value represents the MT averaged over the five conditions and as such corresponds to 4.32 bits. As it turns out, this value coincides closely with the amount of information contained in the MT phase of a number of other ECTs. Thus, Fitts's law may be viewed in light of the physical dynamics of the MT component of each computerized task. As noted by Roberts (1995; see also Roberts 1999b), each response key, measuring 1.00 cm in diameter, was set some 10.00 cm from the home key. During this component of performance, participants were required to lift their finger from the home key and press the response button. This movement phase may be quantified (under the feasible assumption that target width is half the diameter of the response key) by applying Fitts's law. If so calculated, at least for the majority of computerized ECTs, the value obtained is 4.11 bits. The qualification "majority" is important here, especially since in the Complex Choice Reaction Task, on average, the second, third, or fourth response was very short (i.e., 2 to 3 cm). Hence, the average amount of stimulus information contained in the MT phase of Test 30 is considerably less than in any other computerized task if an adjustment based on number of targets is employed.

Similar interpretation of MT in the two card-sorting paradigms is possible, although it must be mentioned that the movements required here were more complex. (The participant had to turn the card up before placing it on a template, and movement "errors" were not taken into account [i.e., cards lying marginally outside the designated area were not counted as incorrect].) In regard to the word-classification "MTs," it must be mentioned that these were qualitatively distinct from all others. The task involved the experimenter reading aloud one of 32 words (e.g., carrot) that the participant knew in advance belonged to the same semantic category (i.e., vegetable), to which they made the vocal response "veggie." As such, the "MT" of this task plausibly represented some combination of "auditory recognition time" and the speed with which participants articulated their response.
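For readers wishing to reproduce this style of calculation, the Welford form of Fitts's law used above reduces to a one-line function. The sketch below is illustrative only; the amplitude and width passed to it are hypothetical values rather than the exact task geometry.

```python
# Minimal sketch (not the authors' code): index of difficulty (in bits) under
# the Welford form of Fitts's law, ID = log2(A/W + 0.5), with MT modeled as
# k * ID. Amplitude and target width below are hypothetical, in centimeters.
import math

def fitts_index_of_difficulty(amplitude_cm: float, width_cm: float) -> float:
    return math.log2(amplitude_cm / width_cm + 0.5)

print(fitts_index_of_difficulty(12.0, 1.0))   # roughly 3.6 bits for this geometry
```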


Moreover, this aspect of performance comprised a number of still further components: the reading of the word, the participants' acknowledgment of the stimulus and their response, and the experimenter's acknowledgment of the participants' response in order to present the next stimulus. However, the term MT is retained both for ease of exposition and because no single term adequately encapsulates the essential features of this measure.

A Comment on the Independence of DT from MT in the Current ECTs. Despite the preceding commentary, insofar as any of the ECTs shows a significant difference in MT from Fitts's Movement Task (Test 26), it might be claimed that some aspect of the DT component had entered into the MT phase of performance. As a consequence, with the exception of the Binary Reaction Task (where MTx is statistically insignificant relative to the MTx of Test 26) and the card-sorting and word-classification paradigms (which are indeterminate), intertask comparisons reinforce several intratask analyses of ECTs (Roberts 1999a, 1999b). These analyses suggested that some decision processes might enter the MT phase, such that RT should also be evaluated when considering the cognitive correlates of processing speed (see G.A. Smith 1989; G.A. Smith & Stanley 1983). The intrusion of DT into MT would appear to be particularly pronounced in the Tachistoscopic and Complex Choice Reaction Tasks.

Summary Statistics of Decision Time (DTx) for each ECT. It is possible to use the DT measure to obtain an indication of the effects of complexity on task performance. To this end, DTx was obtained by averaging across the experimental conditions of each ECT. These results are presented in Table 15, which is almost ordered according to an "intuitive" analysis provided by Roberts (1995, 1999b). To some extent the reversal occurring between the Binary Reaction Task and the Complex Reaction Task is probably a function of design limitations in the former. In particular, stimuli were not presented in equal proportion in Test 31; one can only speculate that the hypothesized ordering of task complexity might have been realized were this not the case. Furthermore, as predicted from the literature, choice RT to verbal stimuli requiring a spoken response (Test 34) leads to a faster processing rate than any of the spatial tasks requiring a nonverbal motoric response (Pollack 1963; Teichner & Krebs 1974; Welford 1968, p. 82).

The particularly strong influence of divided attention upon the DT component of performance is also evident in Table 15. Although the "verbal" DT of word-classification is faster than each of the visual ECTs when presented under single-task conditions, it is among the slowest when averaged across the various multitask conditions. The influence of symbolic coding on "motoric" DT should also not be lost on the reader. As argued elsewhere (see Roberts 1999b), the Binary Reaction Task is, in terms of information transmitted, the simplest of all the paradigms currently investigated. However, because participants were required to transform the stimuli into a yes or no response, the DT obtained in this ECT is of a considerable magnitude. Similarly, manipulations of symbolic-pictorial coding in the card-sorting paradigm lead to particularly long DTs even in its single-task presentation.


TABLE 15
Summary Statistics of DT Measures (Averaged Over All Conditions) for Each of the ECTs of the Present Battery (in msec)

Test Variable                     Mean    SD
28. Single Response DT             419    93
29. Tachistoscopic DT              509   124
30. Complex Choice DT              768   260
31. Binary Reaction DT             558   141
32. Cards Single DT                707   110
33. Cards Multitask DT             912   185
34. Word Single DT                 328    78
35. Word Multitask DT              594   145

Note: DT data were not obtained in two ECTs: Fitts's Movement (Test 26) and Joystick Reaction (Test 27).

Summary Statistics of RTx for each ECT. Having noted that the MTs across ECTs were quite close, and that RTx represents a composite of MTx and DTx, there is little that needs to be said concerning the outcomes reported in Table 16. In short, means and standard deviations resemble those presented for the DTx parameter. Nevertheless, because this transformed variable may share higher correlation with cognitive ability measures, the RT variable remains of interest.

THE COGNITIVE CORRELATES APPROACH: ESTABLISHING LINKAGE BETWEEN PROCESSING SPEED AND (RESPECTIVE) BROAD COGNITIVE ABILITY FACTORS

The resultant analyses are intended to establish the "cognitive correlates" of the broad ability constructs that were identified earlier in the present investigation. This approach involves determining the magnitude of correlation that mean DT, MT, and RT parameters (extracted from each of the ECTs) share with factors that have been linked to both "level" and "speed" abilities. Because it is generally the case that speed of information-processing measures have not been examined in their relationship to second-order factors, predictions must be derived on the basis of rather limited evidence. Nevertheless, for the theoretical significance of these analyses to be understood fully, an attempt is made herein to reference results extensively to previous research findings. To make sense of the resulting body of evidence, some reinterpretation of research that appears outside the present empirical focus must also be undertaken.

Cognitive Correlates Approach. The cognitive correlates approach presently adopted involves correlating processing-speed variables obtained from the ECTs with the majority of level abilities derived from the factor analysis of psychometric performance. The aim will be to demonstrate the presence or absence of correlation within particular second-stratum abilities, largely because current information is cursory at best. Thus, Carroll's (1993) summary statement of research involving RT and human cognitive abilities remained speculative: "The linkage between reaction times and level abilities is more likely to occur at a higher level of analysis, that is, with reference to broad abilities such as fluid intelligence, or general intelligence, rather than with reference to highly specific 'primary' abilities" (p. 508, emphasis ours).


TABLE 16
Summary Statistics of RT Measures (Averaged Over All Conditions) for Each of the ECTs of the Present Battery (in msec)

Test Variable                     Mean    SD
27. Joystick RT                    456    93
28. Single Response RT             830   126
29. Tachistoscopic RT             1118   165
30A. Complex Choice RT            1993   400
30B. Complex Choice RT            1222   292
31. Binary Reaction RT             929   177
32. Cards Single RT               1170   161
33. Cards Multitask RT            1394   234
34. Word Single RT                1147    94
35. Word Multitask RT             1489   162

Note: RT data were not obtained in one of these ECTs: Fitts's Movement (Test 26).

The degree of tentativeness in Carroll's summation would appear to be a consequence of apparent limitations in many previous research designs. Although the present design is by no means completely representative of the domain circumscribing human cognitive abilities, it is sufficiently broad to address several key issues. For example, employing the various versions of the Raven's Progressive Matrices Test in RT studies might result in a series of biases toward visualization abilities (see Barratt 1953, 1956; Carroll 1993, p. 696). Visual RT tasks contain a sequence of responses that must be performed within a spatial array; thus, the degree of perceptual overlap between the two tasks (i.e., the ECT and the psychometric test) may account for any correlation obtained between processing speed and intelligence (Carroll 1987). The influence exerted by this confound may be greatly exacerbated with the development of "new" processing-speed tasks (e.g., "Odd-man-out" [see Frearson & Eysenck 1986; Frearson et al. 1988], CRT-LOW [Neubauer 1991], and card-sorting [Roberts et al. 1988, 1991a]) that require still more complex visual discriminations and often report higher correlation with Raven's Progressive Matrices score.

In the multivariate design of the present study there are a number of compelling reasons that allow issues of this type to be addressed in a fairly unequivocal manner. For one, the factor on which the Raven's Progressive Matrices loads saliently in Table 4 is Gf, which in turn has high salient loadings from Letter Counting (an aurally presented task) and Letter Sets (a generously timed test containing printed letters). Second, Gv has been clearly defined by tasks having salient loadings on this factor (i.e., Hidden Figures [Single and Competing], Form Boards, and Card Rotations). Additionally, each of the visual RT tasks contains degrees of spatial content. For example, in card-sorting, participants must first encode complex visual stimuli (such as the symbol for clubs) and then reproduce this information within the context of some fairly elaborate (spatial) array.

Each of the correlations between visual RT and Gv should be of significant magnitude to give the "visual confound explanation" credibility, with low to near zero coefficients occurring on the two verbal RT tasks. Accordingly, both convergent and discriminant validity may be established (see Keating & MacLean 1987). Further, if correlations with RT parameters are obtained on both Gf and Gv factors, it remains possible to partial out the effects of one factor on the other to determine which is the more influential (see Stankov 1998b). This is but one issue that may be addressed with respect to the correlation between the ECTs and broad ability factors identified in this study; it serves, nonetheless, as an exemplar for a cognitive correlates approach that has been "modified" to accommodate the emerging structure of human abilities.

Speed of Information Processing and Cognitive Abilities: Are There A Priori Predictions That May Be Formulated? It is difficult to find a statement in the literature that would assert that processing-speed measures are limited in their relationship to one or another of the second-stratum abilities identified in this study. This shortcoming stems from considering intelligence as a unitary construct and would appear true whether the ECT measured is based on the Hick paradigm, the Inspection Time (IT) paradigm, or related alternatives. Although for many researchers this incomplete knowledge is not crucial, others have hinted at frustration. Indeed, some investigators take the purported degree of convergence between cognitive ability measures as prima facie evidence for the generality of RT measures (e.g., Jenkinson 1983; Vernon et al. 1985). From this perspective, the failure to obtain correlations between RT and a specific cognitive ability may be seen as a shortcoming of the test employed or the sample obtained. It is noteworthy that within a strict model of psychometric g, the generality of RT would seem at least a necessary condition. If crystallized measures share differential relationships with speed of processing relative to fluid intelligence, does this mean that RT or intelligence (or both) are not as general as proposed?

These features of contemporary individual differences research would seem to preclude the possibility of making strict a priori predictions concerning the relationship that ECTs share with the majority of second-stratum abilities identified in the present investigation. This problem is exacerbated further by noting that each of the many variables obtained within the Hick paradigm has variously been given importance in the psychometric literature (Roberts 1999a). This problematic state of affairs even occurs in the present instance, where only construct-valid parameters are examined. In particular, considerably greater lip service has been paid to the MT variable in recent times than was initially the case. Thus, various processing accounts have been offered as an explanation for the correlations that MT (sometimes) shares with intelligence measures (Roberts 1997a).

Fluid Intelligence (Gf) and Speed of Information Processing: Overview. Carroll's (1993) reanalysis of many ECTs led him to conclude that a linkage between RT and Gf would seem most plausible. Roberts et al. (1991a) reached a similar conclusion when noting moderate correlation between Gf and card-sorting performance. This was interpreted in relationship to Gf/Gc theory, where advantages would clearly be associated with the hypothesized nonincidental learning aspect of the Gf factor rather than with learning acquired through acculturation. The following is offered as additional evidence suggesting that Gf will share correlation with RT parameters.


1. There is a burgeoning literature showing that one of the most salient aspects of RT may be the manipulation of task complexity, both within and across task levels (e.g., Frearson & Eysenck 1986; Jackson & McClelland 1979; A.R. Jensen 1982a; Larson et al. 1988; Payne et al. 1984). Researchers interested in these issues from a perspective that is not tied to processing-speed paradigms have tended to find relationships between complexity and fluid intelligence rather than with many (or sometimes any) of the other broad cognitive ability factors (e.g., Crawford 1988, 1991; Spilsbury 1992; Stankov & Crawford 1993; Stankov & Cregan 1993).

2. A.R. Jensen (1982a) has, on the odd occasion, qualified his notion of the linkage between RT and psychometric g to include only "that part of g which can be conceived of as 'biological intelligence'" (p. 99). Later in his writings, he notes "RT slows with age in later maturity, mirroring the decline in scores on psychometric tests of fluid g" (A.R. Jensen 1993a, p. 54). This concession has no doubt been prompted by developmentalists' recent interest in processing speed and the fact that they consider empirical relationships almost exclusively with respect to fluid abilities (e.g., Fry & Hale 1996). However, working with women aged 65 and over, Anstey (1997) reports no correlation between simple RT or low-level CRT and the cognitive abilities of Gf, Gc, and Gv. In accordance with point 1 above, the only significant correlation is with the most complex CRTs.

3. The Raven's Progressive Matrices Tests are the instruments most often used in studies of speed of information processing (Juhel 1991). Carroll's (1993, p. 696) reanalysis of the available psychometric literature indicates that this test remains a particularly good marker of the second-stratum Gf factor. Note, however, an important qualification: the Raven's Progressive Matrices Tests have also been found to be factorially complex (see Wiedl & Carlson 1976), with simpler items involving a considerable spatial component (see Hunt 1974; Jacobs & Vanderventer 1968).

4. Speed of information processing would seem to be a candidate for the "Anlage" functions, which Horn (1980) has argued underlie Gf (see H.J. Eysenck 1987b; Jenkinson 1983).

5. R.B. Cattell (1971) argued that mental speed is not unitary, identifying at least seven kinds of speed-related factors within the literature. One of these factors, which he associated with Furneaux's (1960) "intellectual speed," is assumed to have loadings with Gf. Because little is known about the mechanisms underlying "intellectual speed," it remains plausible that this construct is influenced directly by the speed at which individuals process transmitted information.

The Processing Speed Correlates of Gf. Table 17 includes the correlations of MTx, DTx, and RTx with Gf for each of the 10 ECTs. The most striking feature of Table 17 is the difference in correlation coefficients found between the MTx and DTx (or RTx) measures. Accordingly, many of the correlations between MTx and Gf are near or approaching zero. Exceptions to this outcome are rare, especially when salient characteristics of each task are taken into consideration. Alternatively, correlations between DTx and Gf are generally moderate to high, such that several of these coefficients even exceed the .30 barrier suggested for ECTs (see Hunt 1980).


Because the distinctive patterns for MTx and DTx have interesting consequences for theoretical postulations made in the extant literature, more detailed analyses and discussion of the present findings are divided into two main sections.

Correlations between Gf and MTx. The two tasks in which psychomotor movement was most clearly established are Fitts's Movement Task (Test 26) and the Binary Reaction Task (Test 31) (see Roberts 1999b). Because these ECTs likewise demonstrate low correlation between MTx and Gf, the present findings would seem to affirm initial propositions that psychomotor movement has little or no relationship with complex psychometric performance, or at least not with the second-stratum factor considered in this analysis (see A.R. Jensen 1979). Indeed, the "purest" measure of movement speed, Test 26, has correlations ranging from -.07 to .06 within its various conditions (Roberts 1997a). Additionally, given the significant linkage consistently established for DT measures in Table 17, at least one of the two significant correlations reported on the MT variable is not entirely unexpected. In particular, analysis of the microstructure of the Complex Choice Reaction Task (Test 30) led Roberts (1995, 1999b) to conclude that MT to multiple targets contained some additional cognitive processing (directly related to DT) within this phase of participants' response (see also Beh et al. 1994).

A plausible assumption is that much of the research previously conducted on the relationship between MT and intelligence implicates Gf by virtue of the psychometric tests employed in the experimental designs. Given that several "positive" outcomes have been reported in this literature, the large number of near zero correlations obtained with the current battery of ECTs might appear surprising. Buckhalt et al. (1990), in obtaining significant correlations between MT and intelligence measures, summarized a number of explanatory hypotheses offered in the literature to account for this relationship. These theoretical explanations, which are listed below, may each be evaluated further in light of the present findings.

(i). Motor Response Programming and Execution. A.R. Jensen (1982a) has argued that MT and intelligence correlations may be accounted for by incomplete programming of the ballistic response in lower-intelligence participants as they leave the home button of the "Roth-Jensen" apparatus.20

TABLE 17
Correlations Between Gf and the Variables (MTx, DTx, and RTx) Obtained From Each ECT

Test Variable                        MTx    DTx    RTx
26. Fitts's Movement                 .00    ***    ***
27. Joystick Reaction                ***    ***   -.18
28. Single Response Choice          -.02   -.32   -.25
29. Tachistoscopic Choice           -.08   -.12   -.15
30. Complex Choice                  -.29   -.31   -.39
31. Binary Reaction                 -.06   -.42   -.36
32. Single Card-Sorting             -.21   -.39   -.36
33. Multitask Card-Sorting          -.18   -.46   -.42
34. Single Word-Classification      -.30   -.22   -.35
35. Multitask Word-Classification   -.25   -.23   -.34


This explanation seems most unlikely given the low positive correlation obtained with Test 26 and Gf at the higher levels of task difficulty (see above). In particular, for the 5.88 bit condition, where programming of the ballistic response is most crucial to fast performance, slower speed is (weakly) associated with higher Gf factor score.

(ii). Hovering Strategies. G.A. Smith and Carew (1987) maintain that some participants leave the home button prior to decision and thus confound the measurement of MT (and DT) in many chronometric tasks. Analyses of microstructure in the present study provided tentative support for this proposition, at least in some tasks (see Roberts 1995, 1999b). Current findings suggest that this strategy was not uniformly adopted by a large percentage of high-performing participants. That is, whether or not a hovering strategy is utilized does not appear to be related to an individual's Gf.

(iii). An Interaction with Level of Arousal. Buckhalt et al. (1990) cite a paper by Lindley, Smith, and Thomas (1988) that showed brighter participants were more motivated to respond quickly. This proposition would seem inconsistent with the present findings. For example, Test 26 was particularly monotonous (participants merely had to tap a probe back and forth between two targets), yet no significant correlations were established for any of its conditions.

(iv). Developmental Phenomenon. The possibility that the relationship between MT and intelligence is a direct function of aging remains a plausible hypothesis, especially in light of studies conducted with young children that have shown fairly substantial correlations between MT and Gf marker tests (Beh et al. 1991; see also Telzrow 1983). It would also appear that psychomotor performance correlates more highly with intelligence when older participants (aged 70 and above) are included in the sample (see Anstey 1997; Bors & Forrin 1995). Consistent with each of these propositions, the present sample was very much restricted with respect to the age variable.

(v). Automatization of Ballistic Response. Widaman and Carlson (1989) have addressed the issue of automatization in their investigations of the Hick paradigm and procedural effects. Based on experiments conducted to examine practice effects, they predict that correlations of DT and MT with intelligence will be reduced (or nonexistent) when sufficient practice allows for the full automatization of a participant's response. Accordingly, in the case of typical experiments conducted using the "Roth-Jensen" apparatus, participants may not have performed a sufficient number of practice trials to automatize the MT component. Moreover, the different movements required in each trial block arguably exacerbate this effect (Buckhalt et al. 1990). In the present experiment, participants were randomly assigned to one of four tasks (i.e., Tests 28-31), each of which required similar types of psychomotor movement, involving upward of 600 trials in which participants were required to make an aimed ballistic movement from a home button. It seems plausible that the MT component has, across the period of testing, become more automatized than is generally the case in individual differences studies.


This explanation perhaps accounts for the correlations observed with word-classification "MT" (Tests 34, 35). This task was somewhat novel and certainly qualitatively different from the choice conditions with which other MT conditions were matched. (Recall from an earlier exposition that participants knew in advance that a word belonging to the semantic category "vegetable" was to be presented, but that they should wait until the experimenter had completed saying the word before making their own vocal response.) Equally, the task that is most easily automatized (or already automatic), Test 26, shows the lowest correlation with Gf (see also Roberts 1997a).

(vi). Neurophysiological Differences. Buckhalt et al. (1990) conclude their study by noting that evidence from their research is consistent with data demonstrating that nerve conduction velocity is related both to speed of information processing and general intelligence. Their argument entails that “intelligent” individuals are able to process information and move faster owing to central underlying physiological differences affecting DT and MT. However, none of the present findings would appear consistent with this explanation. Even if one does allow that Gf is a broad ability underlying general intelligence, it should be recalled that the tasks demarcating this factor often have the highest loadings on the first principal component (e.g., Marshalek et al. 1983).

(vii). Timed Measures of Intelligence Account for Observed Relationships with MT. Although no formal explanation of this type is offered in the literature, it is quite conceivable that correlations with MT parameters occur only in studies that employ strictly timed psychometric tests. For example, Buckhalt and colleagues (1990) used a fairly diverse battery of psychometric indices, which nonetheless involved tests given under strict time requirements (e.g., the Matrices and Speed of Information Processing subtests from the British Ability Scales [Short Form]). They then based their correlational analyses upon the composite obtained from these various tests, thereby not ruling out this possibility as an explanation for the obtained results. Equally, Roberts, Stankov, and Walker (1991b) found significant correlations between Raven's (Standard) Progressive Matrices performance and Fitts's Movement Task when the former was given within the 20-minute time period recommended for this test. In attempting to replicate this finding with a larger battery containing a mixture of both time-limited and nonspeeded psychometric tests, most of these correlations were found to be disappointingly low (i.e., approaching zero). This interpretation is consistent with the findings obtained in the present study. Accordingly, tests sharing salient loadings on Gf are not heavily biased toward speed of responding: only a small percentage of participants failed to complete all items of the tests given in this study that serve as markers for this second-stratum ability. It is worth noting that, with regard to the other tests serving as markers for (traditional) psychometric factors of this study, only those involving clerical/perceptual speed are strictly linked to timed performance. Thus, if this explanation is to have merit beyond this isolated instance, significant correlations between MT and Gs are expected.


Correlations between Gf and DTx. Table 17 indicates a number of features of the DTx and RTx variables in their relationship with Gf, several of which were not totally expected. Accordingly, it has been proposed that including a composite of the DT and MT variables (i.e., RT) will result in higher correlation with psychometric indices (G.A. Smith 1989). However, across the present set of tasks, correlations between Gf and RT were often lower, especially in the visually presented ECTs. Higher correlations obtained in Test 30 single out processes occurring during the DT phase as mediating the relationship between ECTs and Gf. Given the above, and the higher reliability generally evidenced for DTx measures (Roberts 1999b; see also Jensen 1987a), it would appear more sound to consider this variable in drawing out the relationship between speed of processing and Gf (excluding Test 30 and Test 27, in which RT is considered for substantive and practical reasons respectively).

An explanation should first be given for the low correlations obtained with the Tachistoscopic Choice Reaction Task (Test 29). It should be noted that the task design led participants to focus attention on the accuracy of their response, especially at lower ranges of exposure duration. The correlation between number correct and Gf is thus of a magnitude more in keeping with the other ECTs (r = 0.25). This result is suggestive of the role of meta-components in influencing relationships between psychometric and chronometric measures; that is, participants having high Gf are capable of sacrificing speed for accuracy in those situations where the latter is required (see Marr & Sternberg 1987). Even combining speed and accuracy to form an efficiency measure (see Spilsbury et al. 1990) does not make this correlation (i.e., r = -0.23) as high as might have been expected given the demonstrated complexity of Test 29 (see Roberts 1999b). One plausible explanation for this outcome lies in the large number of trials (420, including practice) that participants performed in undertaking this task. During this time some degree of automatization may have occurred, although this explanation needs to be qualified by the fact that participants seldom reached ceiling performance with respect to their accuracy score.21

Returning to DT and its correlation with Gf across each of the ECTs, it would appear the most informative means of addressing issues of relevance to the present results is to adopt a similar approach to that used for the MT parameter. Thus, the respective correlation coefficients were examined in relation to several theoretical explanations offered in the extant literature.

(i). Confounding the Number of Bits with the Amount of Practice Received. Longstreth (1984, 1986) has argued that standard procedures adopted with the traditional "Roth-Jensen" apparatus have confounded the number of bits of stimulus uncertainty with the amount of practice that participants are given. To this end, participants have traditionally been tested in an ascending order of presentation, from 0 to 3 bits of stimulus information (see A.R. Jensen 1987a). As a consequence, either individual differences in speed of processing or differential practice effects could conceivably produce correlations between DT and intelligence measures. This procedure was not adopted in any of the RT tasks of the present study, and it would not appear to influence the often-found increasingly strong correlation across bits of stimulus information.


For example, in the case of the card-sorting task presented in the noncompeting conditions (which was in fact randomized over three separate test presentations), the correlations between Gf and DTx were, from 1 to 3 bits: -0.13, -0.26, and -0.41, respectively.

(ii). Response Bias Effects. Longstreth (1984, 1986) has also suggested that movements required by the various positions of response keys may involve differential preparation time in the "Roth-Jensen" apparatus. This possible confound exists only under those situations in which specific responses to choice stimuli are not all equally probable. The card-sorting paradigms of this study controlled for this aspect by the use of templates that required an approximately equal number of left- and right-hand movements even under the zero bit condition. In addition, there was a proportionally greater number of movements in the same direction for the Binary Reaction Task, which shares higher correlation with Gf than tasks where response bias effects might occur (e.g., Single Response Choice Reaction). In short, no support is found for this explanation in the current study.

(iii). Retinal Displacement. Longstreth (1984, 1986) further suggests that visual attention effects may contribute to correlations between processing speed and psychometric measures. In particular, the magnitude of the visual field varies with the amount of information contained in the array of the "Roth-Jensen" paradigm, such that this feature of the stimulus may account for the often-observed increase in correlation with increasing set size. A.R. Jensen and Vernon (1986), however, asserted that this would more than likely attenuate the correlations with intelligence, although a study by Bors, MacLeod, and Forrin (1993) leaves this issue open to still further debate (see also Neubauer 1991). Inasmuch as visual attention effects were evident in the present group of tasks (in some, this effect was highly variable [i.e., Joystick Reaction], in others highly controlled [i.e., the card-sorting paradigms], and in still others nonexistent [i.e., the word-classification tasks]), it is doubtful whether this effect can provide even a partial explanation for the relationships observed between DT and Gf measures.

(iv). Miscellaneous Accounts: Detterman's Task and Subject Factors. In noting that choice RT is not a simple paradigm, Detterman (1987) has suggested some 10 task and subject factors that might explain the correlation that processing-speed measures share with intelligence. While it is not possible to address all of the issues raised by Detterman using the present data set, several factors that he discusses are considered below.

1. Understanding Instructions. Detterman (1987) has argued that even slight differences in understanding instructions may impact considerably upon the results obtained with ECTs. Although this issue will always remain something of an intractable problem (not only with respect to the RT paradigms employed but also in any given cognitive ability test that might be used), it should be noted that the continual references to speed and accuracy across all tasks of this study were understood so well by the participants that they often prefaced performing the tasks with this instructional aspect themselves.


Equally, Detterman (1987) has noted that participants "may come to very different conclusions about aspects of the task not included in the instructions" (p. 193). As much as possible, this potential confound was restricted in the present series of tasks. For example, the instructions accompanying the card-sorting tasks suggested to the participant how to hold the deck, how to turn over the cards, and so forth. In contrast to the outcome that would be expected on the basis of Detterman's argument, correlations between Gf and card-sorting performance were highly significant.

2. Familiarity with Equipment. Detterman (1987) has suggested that participants familiar with, for example, a typewriter keyboard may have an advantage over the inexperienced participant performing an ECT. Note that in the present study several different types of apparatus were employed. Given the fact that the correlations presented in Table 17 are not systematically tied to any particular apparatus, it seems unlikely that this factor can satisfactorily account for the correlations obtained between DT and Gf.

3. Motivation. Detterman (1987) has also argued that little attention has been given to the effects of motivation on individual differences in RT. Whereas A.R. Jensen (1980) argues against a motivational hypothesis on the grounds that DT cannot be reliably faked and would appear outside of the individual's conscious awareness of the stimuli, these arguments are not compelling. Marr and Sternberg (1987) have noted that motivation may play a more indirect role, since it is:

[M]ediated by the allocation of attentional resources to the task. As set size and task complexity increase, so do the attentional requirements of the task. The unmotivated participant may be more likely to divide his or her attention between the reaction-time task and other features of the experimental environment, and this divided attention may produce slower reaction times. (p. 275)

The results obtained from the present series of tasks do not rule out the influence of motivational states. As noted by Roberts (1999b), one of the salient features of the card-sorting tasks was the fact that the experimenter was always on hand to monitor responses. Although at no point was feedback given, the audience effect is known to be arousing. In addition, findings obtained with the Tachistoscopic Choice Reaction Task (Test 29) were qualitatively different from many others obtained in this study, possibly because of the relatively monotonous nature of its cognitive demands. Notwithstanding, findings obtained in Test 29 suggest that motivation may attenuate correlations between DT measures and Gf, rather than provide a plausible explanation for the relationships that are empirically manifested (see A.R. Jensen 1987a).

4. Memory and Search Factors. Detterman (1987) notes a number of ways in which both memory factors and search strategies could affect performance on a choice RT task: "[D]ifferences in memory could affect the criterion [that] subjects set, or how willing they are to risk an error" (p. 195). It should be pointed out that while the present series of results, as is, does not allow this issue to be addressed, the inclusion of short-term acquisition and retrieval (SAR, i.e., memory) and Gs (i.e., search) marker tests provides conditions that would seem to allow a stringent test of this proposition. Preempting the results given for both the Gs and SAR factors, the following should be noted.


tween SAR and RT parameters across some tasks, partialling out these measures does not reduce the correlations between DT and Gf observed in Table 17. The results of this analysis (i.e., correlations between Gf and DT measures with Gs and SAR partialled out) are presented in Table 18.

(v). Task Complexity. The notion of task complexity being the most likely candidate mediating the relationship between processing-speed measures and intelligence would appear attractive. As a general rule, complex ECTs share higher correlation with measures of intelligence (Cohn et al. 1985; Larson et al. 1988). As those tasks having longer mean latency (i.e., Tests 30-33) in the current study (see Table 15) share higher correlation with the Gf factor, the present results would appear consistent with this general finding. These tasks, in requiring greater mental manipulation of item content, depend less on simple encoding and transmission of information and more on active properties of memory, response preparation, and the like (see Larson et al. 1988). Present findings would seem consistent with a growing literature (e.g., Brody 1992, Chapter 3; Detterman 1987; Longstreth 1984; Marr & Sternberg 1987; Roberts et al. 1988) that questions those models suggesting that speed of information processing per se is fundamental to intelligence (e.g., H.J. Eysenck 1986, 1987a; A.R. Jensen & Vernon 1986). However, this conclusion, although highlighting the importance of task complexity, derives (as do most others in the literature) from post hoc analyses of tasks purportedly measuring the construct of "cognitive complexity." Consequently, the notion of task complexity currently adopted depends critically on the mean latency of RT after the event, as do all others (e.g., Cohn et al. 1985). According to established principles of scientific methodology, this account would appear conceptually flawed. Thus, an attempt is needed to derive a series of complex tasks prior to undertaking any analysis, so as to ensure the phenomenon under investigation is not merely the product of some self-fulfilling prophecy. A step in that direction is provided by the existence of single and multitask card-sorting conditions (Variables 32 and 33). Clearly, the correlation between Gf and DT is higher

TABLE 18
Correlations Between Gf and the DTx Variable for Each ECT, with Gs and SAR Partialled

Test Variable                          DTx by Gf          DTx by Gf           DTx by Gf
                                       (Gs partialled)    (SAR partialled)    (Gs and SAR partialled)
27. Joystick Reaction                  -.14               -.21                -.18
28. Single Response Choice             -.28               -.32                -.30
29. Tachistoscopic Choice              -.06               -.10                -.06
30. Complex Choice                     -.36               -.36                -.35
31. Binary Reaction                    -.39               -.41                -.40
32. Single Card-Sorting                -.35               -.34                -.31
33. Multitask Card-Sorting             -.43               -.42                -.40
34. Single Word-Classification         -.20               -.18                -.16
35. Multitask Word-Classification      -.20               -.18                -.16

Note: Correlations reported in the table are based on DTx measures, with the exception of Test 27 and Test 30, which are based on RTx.


for the more complex multitask card-sorting conditions in Tables 17 and 18. Nevertheless, as will be shown shortly, a hitherto neglected construct in the individual differences literature may allow a still better opportunity for understanding the effects of task complexity. In turn, this account provides a more definitive test of the complexity hypothesis, which (despite its shortcomings) continues to remain a plausible account of the body of findings.
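For readers who wish to see how the partialled coefficients in Table 18 are obtained from zero-order correlations, a minimal sketch is given below. The correlation values in the matrix are hypothetical placeholders rather than the study's data; the function simply applies the standard formula for a partial correlation computed from the inverse of a correlation matrix (here, Gf with DT, controlling for Gs, SAR, or both).

import numpy as np

def partial_corr(R, labels, x, y, given):
    """Partial correlation of x and y controlling for the variables in `given`,
    computed from the inverse of the correlation matrix R."""
    idx = [labels.index(v) for v in (x, y, *given)]
    P = np.linalg.inv(R[np.ix_(idx, idx)])      # precision (inverse correlation) matrix
    return -P[0, 1] / np.sqrt(P[0, 0] * P[1, 1])

# Hypothetical zero-order correlations among Gf, DT, Gs, and SAR (not the study's values).
labels = ["Gf", "DT", "Gs", "SAR"]
R = np.array([
    [1.00, -.40,  .30,  .35],   # Gf
    [-.40, 1.00,  .45, -.15],   # DT (decision time; negative r with Gf)
    [ .30,  .45, 1.00,  .20],   # Gs (clerical/perceptual speed)
    [ .35, -.15,  .20, 1.00],   # SAR (short-term acquisition and retrieval)
])

print(partial_corr(R, labels, "Gf", "DT", ["Gs"]))          # Gs partialled
print(partial_corr(R, labels, "Gf", "DT", ["SAR"]))         # SAR partialled
print(partial_corr(R, labels, "Gf", "DT", ["Gs", "SAR"]))   # Gs and SAR partialled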

(vi). Stimulus-Response (S-R) Compatibility and the Role of Attentional Mechanisms. It is worth noting that the three tasks (i.e., Binary Reaction Task and the two cardsorting paradigms) presented in Table 17 that clearly involve discrimination (and by extension low S-R compatibility) share quite substantial correlation with Gf factor scores. Alternatively, three tasks (i.e., Joystick Reaction Task and the two word-classification paradigms) demonstrating high S-R compatibility by a number of external criteria (see Roberts 1999b) manifest low correlation with Gf. With the exception of Neubauer (1991) very little attention has been given to this psychological construct in the individual differences literature. This state of affairs would appear most surprising given the importance of SR compatibility effects in contemporary cognitive psychology (Roberts 1999a).z Witness, for example, a recent edition of Psychological Research, the entire issue of which was devoted to research involving the Simon-effect-a Stroop-like phenomenon arising from the presence of conditions involving low S-R compatibility (see Kornblum 1994; Umilta 1994). It is proposed that the influence exerted by various S-R mappings serves as a hitherto neglected theoretical explanation of the higher correlations in many chronometric tasks. This account, in turn, implicates the importance of attentional mechanisms in accounting for correlations between DT and Gf. Accordingly, in the Joystick Reaction Task, where S-R compatibility is high, correlations with Gf are low. In fact, all but one of these coefficients fails to reach statistical significance, with a particularly low correlation (r = -0.12) obtained in the condition of this task involving the highest level of (intra-) task difficulty. Clearly, if speed of information processing per se were the key feature of a “bottom-up” biological process that contributes to the correlation between DT and intelligence, this finding would not be expected (see Neubauer 1991). It is worth digressing to several relevant experiments and conceptual issues to highlight some important features of S-R compatibility, features that go to some extent toward unifying certain accounts proposed in the psychological literature. This research suggests that the “Roth-Jensen” paradigm is not as simple as has been proposed. It also demonstrates that when S-R compatibility is made particularly high, the correlations between RT and (at least some) cognitive abilities may approach zero. The latter literature will be considered first so as to highlight the importance of S-R compatibility effects to understanding human cognitive abilities. Thereafter, some experiments are reported that cannot easily be reconciled to the reductionist perspective that views speed of processing parameters as reflecting intrinsic biological differences. Throughout there will be an attempt to relate previous research to findings obtained with the present battery of tasks. 1. The Bors et al. (2993) Experiment: Evidence for the Importance of S-R Compatibility. A telling example of the influence exerted by S-R compatibility effects to cor-

relations with intelligence measures is a study conducted by Bors et al. (1993). Although the researchers claimed to demonstrate the importance of visual attention effects, it is perhaps more pertinent to consider the fact that they altered the relationship between stimulus and response in two of three experiments reported. Bors et al. (1993) note that in the second of these experiments (relative to the first) correlations between DT and Ravens Progressive Matrices score approach zero. In their first experiment, Bors et al. (1993) had participants give a verbal response to stimuli (squares) that changed in color and response location. In the second of these experiments, the stimuli were presented in the same location but with choice manipulated only through the color of the stimulus. Bors et al. (1993) report that the second of these experiments revealed a significant quadratic trend and lower correlation with Raven’s Progressive Matrices performance. Reanalysis of their Experiment II data set indicates support for Longstreth et al.‘s (1985) Power Function (r = 0.992) over Hick’s law (r = 0.951), while the first clearly supports Hick’s law (r = 1.000) over the Power Function (r = 0.992). Elsewhere, Roberts (1999b) has argued that the Power Function will consistently model RT data better than Hick’s law in instances of high S-R compatibility. Moreover, by recourse to experiments conducted within the framework of S-R compatibility, the findings of the second experiment are not surprising. There is in the Bors et al. (1993) task a clear population stereotype (see Fitts & Seeger 1953)-naming a color verbally presented in the same location. In the first experiment, however, there is an extra physical dimension (i.e., a position code), which in being irrelevant (or incongruent with the most simple response) is capable of diverting attention (Kornblum et al. 1990). The Bors et al. (1993) finding shares a number of similarities with those obtained with the Joystick Reaction Task of the present study. In addition to being modeled more adequately by Longstreth et al.‘s (1985) Power Function (Roberts 1999b), Test 27 shares low correlation with the Gf factor. Similarly, although angle of displacement or response bias effects clearly do not operate in the word-classification paradigms, the correlations observed between DT and Gf are low. The high S-R compatibility evidenced in this verbal task is assumed to be a consequence of dimensional overlap-the idea that this protocol contains stimuli and responses that form natural categories (recall Tests 34 and 35 were formed on the basis of Roschian prototypes) and hence are well automatized (Komblum et al. 1990). 2. Neubauer’s (2992) Modified RT task: Reinterpretation. Neubauer’s experiments represent one of the few attempts by psychometricians to manipulate experimentally the level of task complexity by changing the degree of S-R compatibility.23 Results were taken by Neubauer to suggest that the “bottom-up” processing model was correct-essentially because correlations in a high S-R compatibility task were no different from those reported by A.R. Jensen (1987a). A certain doubt attaches itself to this interpretation owing to the fact that the psychometric test employed in this study (the Raven’s Advanced Progressive Matrices Test) was timed. Even so, obtained correlations with the Raven’s score were highest in the low S-R compatibility condition across each information level. 
This outcome occurred despite the fact that this condition always followed the high S-R compatibility condition in presentation order.
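The model comparisons cited in the reanalyses above (Hick's law versus Longstreth et al.'s Power Function) can be reproduced in outline as follows. This is an illustrative sketch only: the mean RTs are invented, Hick's law is fitted as a linear function of stimulus information in bits, and a simple two-parameter power form, RT = a * n^b, is used as a stand-in because the exact parameterization of the Power Function employed in those reanalyses is not reproduced here.

import numpy as np

# Hypothetical mean RTs (ms) at set sizes n = 1, 2, 4, 8 (0-3 bits); not the study's data.
n    = np.array([1.0, 2.0, 4.0, 8.0])
rt   = np.array([310.0, 352.0, 401.0, 458.0])
bits = np.log2(n)

# Hick's law: RT = a + b * bits, fitted by ordinary least squares.
b_hick, a_hick = np.polyfit(bits, rt, 1)
pred_hick = a_hick + b_hick * bits

# Power-function stand-in: RT = a * n**b, fitted linearly in log-log coordinates.
b_pow, log_a = np.polyfit(np.log(n), np.log(rt), 1)
pred_pow = np.exp(log_a) * n ** b_pow

def model_r(obs, pred):
    """Pearson r between observed and model-predicted RTs (the fit index quoted above)."""
    return np.corrcoef(obs, pred)[0, 1]

print("Hick  r =", round(model_r(rt, pred_hick), 3))
print("Power r =", round(model_r(rt, pred_pow), 3))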


In the present study, two of the computerized ECTs (Tests 28 and 31) may similarly be contrasted. However, in so doing it should be recalled that these tasks were presented in a random order, thereby ruling out influences vitiated by differential degrees of practice. Seemingly, there is a good deal of correspondence with Neubauer’s result. Thus, in the more compatible task (Test 28) the correlation is low, whereas in the less compatible ECT the correlation is among the highest given in Table 17. Note the assertions linking these tasks to different magnitudes of S-R compatibility effects are not atheoretical. Each task may be analyzed on the basis of Kornblum et al’s (1990) dimensional overlap model and placed on a continuum of S-R compatibility. Test 28 is a light-digit task indicating the compatibility is actually moderate relative to say a digit-digit task (which is highly compatible). Test 31, in contrast, requires an extra translation process relative to Test 28; the coding of response into a yes or no key press (see Roberts, 1999b). 3. The Odd-Man-Out Paradigm: The Processes Underlying Discrimination. It would seem that the odd-man-out paradigm (e.g., Diascro & Brody 1994; Frearson & H.J. Eysenck 1986; Frearson et al. 1988) (which Test 31 resembles) is an excellent exemplar of a chronometric task having low S-R compatibility. Consistent with the current interpretation, studies involving it report higher correlation with intelligence measures than those coefficients obtained with the traditional “Roth-Jensen” apparatus. As a feature of its lower compatibility, the odd-man-out paradigm implicates what Kornblum et al. (1990) term “incongruent mapping”: the activated and required response do not coincide. Figure 1 represents the information-processing operations hypothesized by Kornblum and co-workers to occur in a variety of different stimulus-response (S-R) situations. In the case of incongruent mapping, response identification takes longer, producing a delay in the verification process-one that is of considerably greater duration than if the stimulus and response were more congruent as in the “Roth-Jensen” paradigm. (The SR code for the “Roth-Jensen” paradigm would be represented predominantly by psychological processes listed in the upper portion of Figure 1, the “odd-man-out” by processes initially in the bottom before proceeding to the top.) In addition, for incongruent mapping: Since the activated, pre-programmed, congruent response and the correct response differ, the activated response must be disposed of, lest in conflict with the correct response at the time of execution. The abort process that does this (see Figure 1) constitutes a second source of delay. (Kornblum et al. 1990) The model discussed by Kornblum et al. (1990), while only briefly sketched here, couches the underlying mechanisms of the odd-man-out paradigm in far greater explanatory terms than simply referring to this situation as involving discrimination. A similar type of “process” analysis may be achieved with each of the current tasks. For example, the Joystick Reaction Task involves congruent mapping, hence a simple identity program; the Binary Reaction Task involves incongruent mapping, hence response identification, an abort process, and so on until the correct solution is acquired. The Kornblum et al. (1990) model also considers the interaction of set size with S-R compatibility-arbitrary symbols such as those used in the higher levels of card-sorting (e.g., suits) are hypothesized to

[Figure 1 appears here in the original; only its caption is reproduced.]

FIGURE 1. Block diagram of the major information-processing operations required in conditions involving low stimulus-response (S-R) compatibility (solid lines) and high S-R compatibility (dotted lines). The top branch of the solid path illustrates operations involved in the automatic activation of a congruent response. The bottom branch of the solid path illustrates operations involved in the identification of the correct response (after Kornblum, Hasbroucq, and Osman 1990, Figure 3, p. 257).


lead to larger increases with number of alternatives than a light-key task. As those correlations obtained with card-sorting and Gf at the three-bit level (for either single or multitask conditions) are among the highest found here (see Roberts 1995), the importance of S-R compatibility both across and within tasks should not be lost on the reader.24

4. Lindley et al.'s (1988) Paper-and-Pencil "Processing Speed" Tests: Further Evidence Tying S-R Compatibility to Intelligence. Arguing that standardized substitution tests were "contaminated" by a variety of nonintellective factors (e.g., paired-associate learning), Lindley et al. (1988) devised a series of substitution tests that quite clearly implicate changes in S-R compatibility. In the first of three conditions, participants were required simply to copy alphanumeric symbols. In the second condition, participants were instructed to code forward (i.e., write the next letter or number in the series instead of the printed item). In the final condition participants were required to code backwards (i.e., write the preceding letter or number in the series instead of the given item). The highest correlation obtained with two psychometric measures was for the least compatible condition (i.e., coding backwards). While preferring to attribute this outcome to both biological and acculturation mechanisms, Lindley et al. do suggest the possibility that there are two types of speed, the speed of "seeing relationships" and the speed of highly learned, automated responses, noting that only with the former does there appear to be a relationship with psychometric intelligence (see also A.R. Jensen 1986). Although a similar descriptive statement could be made for the results obtained with the current battery of tasks, it would seem more efficacious to couch the results in terms of an explanatory model [such as that presently proposed] and relate findings to S-R compatibility effects.

5. More on Kornblum et al.'s (1990) Dimensional Overlap Model: Biological Mechanisms. The preceding discussion included a rather rudimentary sketch of the Kornblum et al. (1990) model. Nonetheless, this account serves to provide the cognitive basis of S-R compatibility effects while simultaneously detailing a framework for synthesizing disparate findings reported in the individual differences literature. It is worth acknowledging further features of this model that may be of interest to those working within the field of human cognitive abilities. An interesting possibility emanates from a study that Kornblum et al. consider relevant to the efficacy of their model:

Georgopoulos, Lurito, Petrides, Schwartz, and Massey (1989) had a rhesus monkey move a handle either toward (congruent mapping) or in a direction perpendicular to (incongruent mapping) a stimulus light. The RT for the incongruent mapping was 260 msec, which was approximately 80 msec longer than for the congruent mapping. During this test, the experimenters also recorded the activity of cells in the motor cortex and found that the neuronal population vector, which is a weighted sum of contributions of directionally tuned neurons, pointed in the direction of the movement in congruent trials and in the direction of the stimulus (i.e., the congruent movement) at the start of the incongruent trials, with a subsequent rotation to the direction of the required movement. We interpret these data to be consistent with the automatic activation-identification-abort mechanism postulated by our model. (Kornblum et al. 1990, p. 261)


This biological subtheory suggests that compatible tasks require different neurophysiological pathways for optimal performance than do incompatible tasks. Indeed, several important studies with humans (e.g., DeJong et al. 1994; Eimer 1995) also support this distinction. While direct studies of the dimensional-overlap model remain to be conducted by psychologists interested in understanding individual differences in intelligence, this biological evidence lends itself to a provocative question: To what extent is it justifiable to choose highly compatible ECTs (such as those often employed in intelligence research) and then lay claim to the fact that significant biological concomitants have been isolated? Indeed, it seems most plausible that if RT paradigms having low S-R compatibility share increasingly stronger correlation with fluid ability measures, the postulated neurophysiological mechanisms accompanying these tasks may be of considerably greater importance than those accompanying instances where congruent mapping takes place.

6. S-R Compatibility Effects and Fluid Abilities: Concluding Comments. There have been a number of disparate theories proposed in the literature to account for the relationship between speed of information processing and psychometric measures. Both the analyses of the present chronometric tasks, and of several studies in the literature, question explanations that are linked to models which emphasize "bottom-up" processes. The emphasis currently placed on S-R compatibility is consistent with this view. However, where task complexity is often poorly defined across tasks, and cognitive processes underlying certain ECTs remain little understood, the model of S-R compatibility currently proposed attempts to address each process in a cogent fashion. Moreover, these processes are assumed to be closely related both to attention (not necessarily limited capacity) and automatization. As Kornblum et al. (1990) argue:

Unlike other definitions of automaticity that treat attention and automaticity as separate and independent entities, we believe, with Kahneman and Treisman (1984), that the two may be closely related. According to this view, an automatic process could under some conditions be attenuated or enhanced. However, under no conditions could it be ignored or bypassed. Participants in a properly designed experiment, whether instructed to use or suppress an automatized process, would therefore produce evidence of its operation in their performance. S-R compatibility effects may be viewed as reflecting such evidence. (pp. 261-262)

Interestingly, despite changing terminology (e.g., working memory capacity, attentional resources, limited capacity, and the like), researchers interested in providing an explanatory model of human cognitive abilities have remained committed to the importance of attentional mechanisms (e.g., Hunt & Lansman 1982; Kyllonen & Christal 1990; Lansman et al. 1982; Lansman & Hunt 1982; Myors et al. 1989; Necka 1992; Roberts et al. 1991a; Stankov 1983a, 1988b). Furthermore, the dimensional-overlap model provides an attractive alternative to many "homespun" theories (or otherwise questionable postulations) that are seemingly prevalent in the cognitive correlates approach to intelligence. In elaboration of this assertion, it must be remembered that the majority of research paradigms currently


employed by individual differences psychologists have a long (and somewhat checkered) history in the experimental literature. In part they are still employed by experimentalists, but the models underlying them have generally gained in breadth and sophistication beyond the rather simple (though undoubtedly important) information-theoretic principles that Hick, Hyman, and a host of others utilized during the zenith of that conceptual framework (see Neisser 1967). Finally, because the dimensional-overlap model incorporates biological mechanisms under its framework, it might plausibly replace the primitive "neuromythological models" (Brody 1992, p. 60) that have so far dominated the cognitive correlates approach to intelligence.25

To avoid any misunderstandings, it is necessary to emphasize that S-R compatibility effects are useful primarily as an account of empirical findings involving complexity manipulations with speed measures. Accuracy measures from certain cognitive tasks can also show sensitivity to changes in complexity. Although in some instances the interpretation of findings with accuracy measures in terms of S-R compatibility effects may be plausible, in other cases it may be strained or even unsatisfactory (see Myors et al. 1989; Stankov 1983b, 1994; Stankov & Crawford 1992; Stankov & Cregan 1993).

Crystallized Intelligence (Gc) and Speed of Information Processing: Overview. The reader might assume that the relationship between RT and psychometric performance, if linked to the general factor (i.e., psychometric g), would result in moderate correlation being demonstrated between speed of information-processing measures and Gc. In particular, examination of Marshalek et al.'s (1983; see also Guttman 1954, 1965; Snow 1980) radex model indicates that almost all of the Gc tasks are close to the center of the circumplex. Within Gf/Gc theory, the general factor of a common, diverse battery of psychometric tests may similarly be interpreted as primarily some combination of Gf and Gc, with typically smaller contributions from dimensions such as Gv, SAR, and the like. There are a number of experiments in the literature utilizing the Hick paradigm that would appear to support this proposition. Experiments employing the WAIS (e.g., Barrett et al. 1986; Vernon 1983) have reported correlations between various RT parameters and Verbal IQ that are often as high as those reported for the Performance IQ scales (see A.R. Jensen 1987a, Table 25, pp. 158-159). Correlations between reading comprehension and RT parameters in two studies (Carlson & C.M. Jensen 1982; Carlson et al. 1983) exceed -0.50 for the DTx variable. In an investigation involving 162 school-aged children, A.R. Jensen (1982a) reports that the highest correlations between RT parameters and a variety of psychometric instruments were obtained for the Lorge-Thorndike Intelligence Test (a Verbal IQ measure). Results obtained when the Terman Concept Mastery Test is employed in processing-speed studies tend to be inconclusive (see A.R. Jensen 1979, 1987a). G.A. Smith and Stanley's (1987) reanalysis of their own data set is of particular interest in the present context (see G.A. Smith & Stanley 1983). These authors attempted to show that the relationship between RT and cognitive ability tests was a consequence of the tests sharing common variance with psychometric g rather than with one or more specific factors. In using regression analysis they found


some evidence for RT measures being related more to fluid than crystallized intelligence, qualifying this outcome with the following: "the effect is slight and would need replication before acceptance (e.g., this would predict residuals for Vocabulary, a measure of crystallized intelligence, should be positive, which is only weakly supported)" (G.A. Smith & Stanley 1987, p. 298). Despite these studies, Jenkinson's (1983) findings are most often cited by proponents of the speed of information-processing paradigm as indicating the generality of cognitive speed across fluid and crystallized intelligence factors (e.g., H.J. Eysenck 1987b; Vernon et al. 1985). In this study, Jenkinson demonstrated that a series of ECTs share comparable correlation with Gf and Gc measures, although curiously the Hick paradigm was not among the ECTs investigated. Unfortunately, a doubt attaches to this study's claims regarding the psychometric factors assessed. As it turns out, Gc was determined solely by the Mill Hill Vocabulary Test and Gf solely by the Raven's (Standard) Progressive Matrices Test.

At the conceptual level, if a constraining feature of the relationship between chronometric performance and psychometric abilities is cognitive complexity, then it becomes conceivable that processing speed will share low correlations with Gc. Crawford (1988) has argued that task complexity is more accurately defined in relation to fluid abilities alone. This argument, in turn, rests on the assumption that task complexity is defined in relation to mental processing at the time of testing, such that markers of Gc are relatively low in complexity. Three different aspects of the individual differences literature support this view. First, many theories of mental processes underlying cognitive abilities (e.g., working memory) suggest mechanisms that are plausible accounts of the types of reasoning and problem-solving tasks serving as markers for Gf, but are not plausible descriptions of mental processing that would occur during performance on tests of acquired knowledge (see Ackerman 1996). Second, under states of high arousal, recall of acquired information such as is typical in Gc tasks is less impaired than, say, in the case of Backward Digit Span (M.W. Eysenck 1982). Finally, neuropsychologists often use Gc measures to estimate an individual's "premorbid ability level" (i.e., their general mental ability prior to disease states [e.g., Lezak 1983; Nelson & O'Connell 1978]). It must be noted that exploration of the relationship between performance measures obtained in processing-speed paradigms and a clearly identified second-stratum Gc factor is largely remiss, particularly with adult samples (Juhel 1991). As a qualification to this assertion, those few studies employing the WAIS (e.g., Barrett et al. 1986) are difficult to interpret because Gc (e.g., Vocabulary) and SAR (e.g., Digit Span) marker tests sum together to form the Verbal IQ composite of this battery (see Carroll 1993, p. 702).

The Processing-Speed Correlates of Gc. Table 19 displays the correlations between MTx, DTx, and RTx with Gc for each of the ten ECTs. The results presented in Table 19 are unequivocal: with the exception of Test 34, none of these correlation coefficients approach significance, with the majority quite clearly on or approaching zero.
As Gc is quite clearly an important factor of cognitive abilities, and certainly relevant in most psychometric accounts of g, it would seem pertinent to ask why this "negative" result is so strongly evidenced in the cognitive correlates analysis


of the present group of RT tasks. The most plausible hypothesis is that Gc is neither related to cognitive complexity nor influenced by attentional deployment (e.g., Stankov 1983a). Roberts et al. (1991a) note a similar finding in a study of card-sorting performance, offering the following as an explanation:

Gc abilities are generally acquired via repetition or through the overlearning of cognitive strategies. Ability to do Gc tasks is often an all-or-none phenomenon. Consequently, being able to process stimuli quickly and efficiently does not necessarily predispose individuals to perform better on tasks containing this component. (Roberts et al. 1991a, p. 454)
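Statements above about which coefficients "approach significance" can be checked against the critical value of a Pearson correlation for the present sample size. The following sketch assumes N = 179 and a two-tailed test on a zero-order correlation; it is offered only as a convenience for the reader.

from scipy.stats import t

def critical_r(n, alpha=0.05):
    """Two-tailed critical value of a Pearson correlation for a sample of size n."""
    df = n - 2
    t_crit = t.ppf(1 - alpha / 2, df)
    return t_crit / (t_crit ** 2 + df) ** 0.5

for alpha in (0.05, 0.01):
    print(f"alpha = {alpha}: |r| must exceed {critical_r(179, alpha):.3f}")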

Word-Classification and Gc: Evidence for Prior Learning Influencing Verbal RT. A brief explanation should be given for the correlation obtained in the Single Word-Classification Task, as this was the only correlation between DT and Gc to reach significance across the 10 ECTs. Accordingly, some of the semantic category items constituting this task would appear subject to a certain degree of learning and acculturation. Some individuals did not recognize, for example, that "cricket" belonged to the semantic category "sport" (although well defined in Australian culture as a prototype of this concept), presumably because they were not socialized in Australia. Consistent with this interpretation, significant correlations were observed only in the latter two conditions of the word-classification task, where not all items were drawn from Rosch's (1978) work with prototypes. The correlations with Gc in this paradigm were -0.27 and -0.24 for two and three bits (respectively). In the one-bit condition (in which all items were drawn from Roschian prototypes) the correlation was notably less substantial (i.e., r = -0.10).

On the Lack of Relationship between Speed of Processing and Gc: Comments. It has been claimed that correlations between RT task measures "are so ubiquitous that it seems very likely that most RT measures are related to general intelligence" (G.A. Smith & Stanley 1987, p. 291). If "general intelligence" means the first principal factor extracted from a large and diverse battery of psychometric tests, on which tasks marking Gc typically share high salient loadings, the present results would seem to suggest that much caution should be exercised in making this assertion.

TABLE 19
Correlations Between Gc and the Variables (MTx, DTx, and RTx) Obtained From Each ECT

Test Variable                          MTx       DTx       RTx
26. Fitts's Movement                   -.06      ***       ***
27. Joystick Reaction                  ***       ***       -.04
28. Single Response Choice             -.11      .02       -.05
29. Tachistoscopic Choice              .13       -.17      -.00
30. Complex Choice                     -.08      -.05      -.10
31. Binary Reaction                    -.05      -.03      -.05
32. Single Card-Sorting                .07       -.04      .02
33. Multitask Card-Sorting             .03       -.20      -.07
34. Single Word-Classification         -.04      -.25      -.21
35. Multitask Word-Classification      -.10      -.15      -.18


Carroll (1993) has warned of problems in the cognitive correlates approach if results are not referenced to a three-stratum theory. The results currently presented for the Gc factor substantiate this claim. Whatever cognitive process(es) mediate the relationship between speed of information processing and intelligence (of which cognitive complexity and S-R compatibility would seem likely candidates), these processes would appear to share nothing in common with the second-stratum Gc factor.

Broad Visualization (Gv) and Speed of Information Processing: Overview. The relationship between speed of information processing and Gv is interesting from a perspective that Carroll (1987) initiated in a critique of the Hick paradigm. He notes that one tangible problem with processing-speed measures is that they might conceivably be linked to the processes required in visualization tasks. This hypothesis was formulated on the assumption that the Raven's Progressive Matrices Test contains a visual component, as does the spatial layout of the "Roth-Jensen" apparatus. While Carroll (1993) has subsequently tempered his views (his latest assertion being that Gv is probably not responsible for the observed relationship between intelligence and RT measures), there is reason to believe this issue is far from resolved. This is because Carroll's latest assertion derives from a single isolated study in which both verbal and spatial measures showed similar patterns of correlation (G.A. Smith & Stanley 1983). However, several doubts can be attached to the findings reported by G.A. Smith and Stanley (1983). First, the highest loadings obtained on the spatial factor were for Block Design, Picture Completion, and Cattell's Culture Fair Test. The interpretation of this result as indicative of a spatial factor is questionable given the salient loadings on what others have considered exemplars of highly loaded Gf tasks. Second, the use of particularly young participants (G.A. Smith & Stanley's [1983] sample comprised 12-14-year-olds) poses something of a problem in interpretation. Evidence in favor of cognitive differentiation before adulthood (certainly in the processes underlying both Gf and Gc) is equivocal. Finally, the use of time-limited psychometric measures throughout G.A. Smith and Stanley's study prompts a criticism noted earlier. The present study employed a number of psychometric tests identified in the literature as markers for the Gv factor (see Table 1). These tests were administered to adult participants largely by means of microcomputer, so that each individual was free to take as long as needed to complete each task. Salient features of the current design would appear to go some way toward resolving the relationship between speed of information processing and Gv.

The Processing-Speed Correlates of Gv. Table 20 shows the correlations between MTx, DTx, and RTx with Gv for each of the 10 ECTs. Table 20 indicates that only the MT and RT parameters of Test 30 share significant correlation with the Gv factor. More detailed consideration will be given to aspects of this task shortly. These results indicate that the relationship between speed of information processing and Gv is a particularly weak one. Certainly it goes no way toward accounting for the correlation observed previously between RT measures and Gf. Paradoxically, these correlation coefficients are substantially lower than those obtained by G.A. Smith and Stanley (1983) with a spatial factor, although the conclusion drawn is


similar. It should be noted that there is a remarkable degree of consistency in the direction of the relationship between Gv and all parameters.

The Role of Task Difficulty in Visual RT. In reanalyzing 30 data sets involving spatial measures, Carroll (1993) notes that broad visualization is:

[R]eadily interpretable as measuring a general ability to deal with visual forms, particularly those that would be generally characterized as figural or geometric, and particularly those whose perception or mental manipulation is complex and difficult. Presumably, high status on this factor would signal an ability to perceive and deal with such forms accurately. It is not clear to what extent the factor involves speed of perception. (p. 609)

While the analyses involving correlations of MTx, DTx, and RTx parameters with Gv would suggest that speed of information processing is largely peripheral to this ability, the results obtained with Test 30 are interesting from the perspective of the above quote. The spatial arrangement of this task was more complex (in the sense used by Carroll [1993]) than several others of the ECT battery. The participant was required to make a series of judgments concerning two or more individual elements within the array as quickly and accurately as possible. While Gv shares low correlation with each average performance parameter, it remains plausible that a tendency of increasing correlation will be exhibited within tasks as the number of visual stimuli to be processed (or ignored) is made greater. As a test of this proposition, consideration is given to both Test 28 and Test 30. These two tasks involve manipulations both in the number of stimuli and in the number of targets. Both manipulations seem to implicate changes in the difficulty of the visual stimulus that must be perceived. Although correlations with Gv remain low, there is generally a systematic tendency for coefficients to increase with task difficulty. That is, speed of perception appears to become a more important component of performance as the visual patterns that an individual inspects are made increasingly complex.

Broad Auditory Function (Ga) and Speed of Information Processing: Overview. The relationship that processing speed shares with Ga cannot readily be deduced from the available literature. This shortcoming would seem to be a consequence of a

TABLE 20
Correlations Between Gv and the Variables (MTx, DTx, and RTx) Obtained From Each ECT

Test Variable                          MTx       DTx       RTx
26. Fitts's Movement                   -.03      ***       ***
27. Joystick Reaction                  ***       ***       -.08
28. Single Response Choice             -.10      -.21      -.22
29. Tachistoscopic Choice              -.14      -.09      -.18
30. Complex Choice                     -.23      -.15      -.25
31. Binary Reaction                    -.02      -.20      -.17
32. Single Card-Sorting                -.02      -.17      -.12
33. Multitask Card-Sorting             -.04      -.20      -.16
34. Single Word-Classification         -.10      -.10      -.13
35. Multitask Word-Classification      -.16      -.06      -.15


concern expressed most recently by Carroll (1993): "Evidence on the higher-order structure of auditory abilities is very meager because of the little attention that has been given to this domain, at least from the standpoint of individual differences and factor analysis" (Carroll 1993, p. 609). In the present study, a more specific rationale for examining Ga in relation to speed of information processing is given by noting that the two word-classification tasks were presented in the auditory modality. Extending the logic underlying the inclusion of Gv measures, it would seem pertinent to consider whether individual differences in the Ga factor and in classification speed measures are related.

The Processing-Speed Correlates of Ga. Table 21 includes the correlations between MTx, DTx, and RTx with Ga for each of the 10 ECTs. These results are more difficult to interpret than any others of the present investigation. Some of the correlations between MT and Ga are significant (i.e., Test 28, Test 35, and Test 30 [which also implicates DT]), as is one correlation with the DT measures (i.e., Test 31). While these coefficients are all negative in sign, there is no degree of systematicity across identified levels of cognitive complexity. For example, the correlation between Ga and RT obtained from the Joystick Reaction Task is moderate, whereas the correlation with the card-sorting paradigms is quite low. Notwithstanding, the obtained correlation with MT measures cannot be explained in light of the tests defining Ga: Tonal Memory, in particular, was given under conditions that clearly de-emphasized speed. Despite each of these observations, it should be noted that the correlations between Ga and the DT of each of the aurally presented word-classification tasks are not significant. A possible explanation for the correlations observed in Table 21 is given below.

Personal Tempo as a Factor Underlying Ga and Speed of Information Processing. Stankov and Horn (1980; see also Stankov 1983b) have noted the presence of a tempo factor (Maintaining and Judging Rhythm) in their studies of individual differences in Ga. Similarly, it may be suggested that speed of thinking, as indexed by information-processing measures, is to some extent constrained by an individual's natural rate of performing daily activities, such as walking, talking, and so on (see A.R. Jensen 1980). Elsewhere, this construct has been termed natural (or sometimes personal) tempo, although evidence for the existence of this "ability" as an individual differences factor is mixed (see Harrison 1941; Mangan 1959; Rimoldi 1951). Arguably, similarities between ostensibly different meanings of tempo are more imagined than real. In particular, Tempo A and Tempo B of Drake's (1954) Musical Aptitude Test have intuitive appeal as measures of personal tempo. In this test, the participant's task is to count according to a metronome-induced beat. The participant's score is the absolute difference between the count and the actual beat of the metronome (e.g., Horn & Stankov 1982; Stankov & Horn 1980). As pointed out by Stankov (1983b), a different scoring formula (i.e., scoring that does not take the absolute value but keeps the sign of the difference between the two counts) may provide separate "overestimation" and "underestimation" scores relative to the actual beat. These scores may, in turn, be interpreted as indications that an individual's personal tempo is faster or slower than the metronome's beat.


TABLE 21
Correlations Between Ga and the Variables (MTx, DTx, and RTx) Obtained From Each ECT

Test Variable                          MTx       DTx       RTx
26. Fitts's Movement                   -.12      ***       ***
27. Joystick Reaction                  ***       ***       -.23
28. Single Response Choice             -.24      -.12      -.24
29. Tachistoscopic Choice              -.09      -.10      -.15
30. Complex Choice                     -.30      -.14      -.28
31. Binary Reaction                    -.16      -.25      -.28
32. Single Card-Sorting                -.17      -.10      -.16
33. Multitask Card-Sorting             -.15      -.10      -.15
34. Single Word-Classification         -.19      -.16      -.24
35. Multitask Word-Classification      -.24      -.09      -.21

Although it is known that Drake's measures of Musical Tempo (along with several other marker tests) tend to define the broad auditory (i.e., Ga) factor, the correlations presently observed may reflect the fact that both broad auditory ability and speed of information processing are constrained by personal tempo. In the past, tests that have been used to assess personal tempo include motor activities such as tapping, clapping, and the like. Some degree of support for the above proposition may thus be noted in the correlation between Ga and Fitts's Movement Task, which is moderate at the lower levels of task difficulty (e.g., r = -0.17 for MT3.34 and Ga, which is the highest correlation this task shares with any "level" ability). Moreover, the "MT" of the word-classification tasks would seem to resemble a tempo measure in some important respects: participants were required to say the same word ("veggie") as soon as possible after the experimenter had presented the test item. In turn, this would appear to explain why this parameter shares moderate correlation with level abilities such as the present one.

Clerical/Perceptual Speed (Gs) and Speed of Information Processing: Overview. It is somewhat surprising that few studies have examined the relationship between clerical/perceptual speed measures and chronometric performance. Tests loading on this factor have also been interpreted to reflect the attentional factor of search (Stankov 1988b). Elsewhere, Detterman (1987) has suggested this process may be an important determinant of RT, one underlying its relationship with psychometric intelligence. Notwithstanding, a quote from A.R. Jensen (1987a) serves to preempt the analyses that follow. It should be mentioned that this assertion is not supported by any data that have (to present knowledge) been reported in the available literature.

[H]ighly speeded tasks in which the task requirements per se are quite simple, such as clerical checking, letter cancellation, and the like, are among the poorest psychometric correlates of IQ or g, and they also show the weakest correlations with RT. (p. 417, underline ours)

The Cognitive Correlates of Gs. Table 22 includes the correlations between MTx, DTx, and RTx with Gs for each of the 10 ECTs. Because both chronometric indices


and Gs were obtained as speed measures, the correlations between these parameters are positive. What is most striking, in light of the quote given earlier from A.R. Jensen (1987a), are the substantial correlations of MTx, DTx, and RTx with Gs for each of the information-theory tasks presented in Table 22. With the exception of the DT component of the word-classification task given under single-task conditions, each and every correlation is significant. Although there would appear to be no systematic relationship between Gs and cognitive complexity (e.g., the correlation between Test 26 and Gs is higher than the correlation between Test 27 and Gs), the highest correlations most frequently occur with the RT measure from each ECT. This casts considerable doubt on claims that speed of information-processing measures and other measures of cognitive speed do not load on (and hence define) a common factor (e.g., A.R. Jensen 1987a, 1992b; Vernon & A.R. Jensen 1984; Vernon & Kantor 1986; Vernon et al. 1985). As more conclusive evidence regarding this issue may be obtained by examining the other speed of test-taking measure obtained in this study, more detailed comments are reserved for later discussion.

The majority of correlation coefficients presented in Table 22 are significant. Because the theoretical implications of this result are somewhat different for the various processing-speed parameters, further consideration is given to these findings within the conditions comprising each ECT. Consideration will be given to the MT variable first and then to the DT measures.

Correlations between MT and Gs. In analyzing the correlations between Gf and MT, it was suggested that movement speed might sometimes be correlated with intelligence measures because psychometric measures are frequently administered under strict time limits. The present results reinforce this view. In each and every task in which MT was assessed, MTx shares significant correlation with Gs. Notably, each of the psychometric tests demarcating Gs was, in turn, time dependent. At an intuitive level of analysis this result would appear to make sense. Under time restrictions, advantages in performing a single psychomotor act are multiplied by the number of such acts required. This leaves more time available for the participant to devote to additional cognitive requirements such as error checking, or otherwise advantages the participant required to respond quickly in the final minute(s) of a speeded test.

TABLE 22
Correlations Between Gs and the Variables (MTx, DTx, and RTx) Obtained From Each ECT

Test Variable                          MTx       DTx       RTx
26. Fitts's Movement                   .33       ***       ***
27. Joystick Reaction                  ***       ***       .28
28. Single Response Choice             .29       .39       .48
29. Tachistoscopic Choice              .34       .38       .54
30. Complex Choice                     .47       .41       .57
31. Binary Reaction                    .29       .49       .53
32. Single Card-Sorting                .26       .43       .42
33. Multitask Card-Sorting             .29       .35       .40
34. Single Word-Classification         .29       .18       .32
35. Multitask Word-Classification      .41       .23       .43


Consistent with this interpretation, Fitts's Movement Task, which has the most minimal of cognitive content, shares significant correlation with Gs. This task is the only one of the present battery in which ballistic information was manipulated within its conditions. Using this task, an important question can be posed: Do correlations between MT and Gs increase with task difficulty? The results relevant to resolving this issue are presented in Table 23. Interpretation of Table 23 is straightforward: There is no systematic relationship between the task difficulty established in Fitts's Movement Task (see Roberts 1997a, 1999b) and Gs. Indeed, in none of the information-theory tasks where MT is clearly measured is there a tendency for this parameter to share systematic correlations with Gs across task conditions. The qualification that MT was clearly measured is important here because, for reasons to be described shortly, the correlations between MT and Gs in Test 30 do, in fact, increase in a systematic manner.

Correlations between DT and Gs. Detterman (1987) has argued that within the context of the traditional choice RT paradigm, the way in which "alternatives are searched for stimulus onset could have a significant impact on decision time" (p. 195). Although some of the cognitive literature addresses the issue of search strategies (e.g., Welford 1968, 1980), Detterman also notes that information concerning this in relation to individual differences in DT is largely remiss. Many of the tests defining the Gs factor here are assumed to be measures of search (see Cornelius et al. 1983; Stankov 1983a, 1988b; Wittenborn 1943). Analyses of the relationship between DT and Gs within each respective task might provide some direct evidence for the influence of search strategies. Note that because there is no evidence of a systematic relationship across ECTs, search strategies would not appear to be allied to cognitive complexity per se. To address this issue, a series of summary statements may be made before highlighting these claims with one example. Thus, with respect to the correlations between DT and Gs with increasing set size, there is a good deal of consistency in results obtained across tasks. These coefficients generally rise in a regular (albeit slight) fashion with increasing stimulus information. However, this outcome needs to be qualified because in those two paradigms where S-R compatibility was high, none of the relationships observed were, in fact, systematic across levels of task difficulty. Correlations between DT and Gs for the card-sorting task are

TABLE 23
Correlations Between Measures Derived From Fitts's Movement Task and the Gs Factor Score

Parameter            r with Gs
MT 2.00 bits            .30
MT 3.34 bits            .35
MT 4.05 bits            .30
MT (...) bits           .32
MT (...) bits           .23
Mean MT                 .33

Note: The values given in subscripts refer to the bit level manipulated in the Fitts's Movement Task. Means and standard deviations for each of these conditions are given in Roberts (1999b; see also Roberts 1995).
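The bit levels referred to in the note to Table 23 follow from Fitts's law, under which the index of difficulty of a discrete aiming movement is ID = log2(2A/W) bits for target amplitude A and target width W. The sketch below merely illustrates this computation; the amplitude and width pairs are hypothetical and are not the values actually used in the Fitts's Movement Task (those are reported in Roberts 1999b).

import math

def index_of_difficulty(amplitude_mm, width_mm):
    """Fitts's law index of difficulty (in bits): ID = log2(2A / W)."""
    return math.log2(2 * amplitude_mm / width_mm)

# Hypothetical amplitude/width pairs (mm), chosen only to show how bit levels arise.
for a, w in [(50, 25), (100, 20), (160, 10), (200, 5)]:
    print(f"A = {a:3d} mm, W = {w:2d} mm -> ID = {index_of_difficulty(a, w):.2f} bits")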

chosen to highlight the more common finding, although (with one or two relatively minor perturbations) any of the other data sets would have served this purpose. The results concerning Tests 32 and 33 are presented in Table 24. Within the present context these results indicate that individual differences in search strategies are related to speed of information-processing measures. That is, the relationship between DT and Gs would appear to be directly influenced by the degrees of choice to which participants must respond. However, these findings cannot, in turn, account for the obtained correlations between Gf and parameters of the Hick paradigm. Recall that partialling out the Gs factor does not substantially alter correlations between Gf and speed of information processing (see Table 18).

Higher-Order Factor (GF) and Speed of Information Processing: Overview. The third-stratum factor extracted from the factor analysis of broad abilities in this study was interpreted as an "inflated" Gf, and subsequently designated GF. Results concerning this factor are presented for completeness. No particular divergence from the results given for the second-stratum Gf is expected, with the possible qualification that these coefficients may be still larger in magnitude.

The Processing-Speed Correlates of GF. Table 25 includes the correlations between MTx, DTx, and RTx with GF for each of the 10 chronometric tasks. Consistent with the assertion that the relationship between processing speed and cognitive abilities occurs at a higher stratum of the taxonomy circumscribing intelligence, these coefficients are among the highest obtained for any psychometric factor extracted in the investigation.

Intraindividual Parameters in Relation to Cognitive Ability Measures: More Questions than Answers? For reasons made explicit in the section summarizing the microstructure of the ECTs, there has been no attempt to include correlations between psychometric factors and the various intraindividual parameters that might be extracted from each of the ECTs. Calculating intraindividual variability measures

TABLE 24
Correlations Between DT Measures Derived From Averaged Single and Multitask Card-Sorting and the Gs Factor Score

                  Single-Task    Classifying      Classifying     Classifying      Classifying      Classifying
Card-Sorting      Condition      Words (0 bits)   Words (1 bit)   Words (2 bits)   Words (3 bits)   Words (Average)
1 bit             .16            .16              ...             .14              .15              .14
2 bits            .30            .35              .25             .28              .30              .35
3 bits            .44            .41              .32             .34              .32              .36
Average           .43            .41              .29             .33              .32              .35

Notes: In these two ECTs, DT could not be extracted for 0 bits (see Crossman 1953; Roberts 1995, 1999b; Roberts et al. 1988 for the rationale surrounding this facet of the task). The first column corresponds to the single-task presentation of card-sorting (Test 32). Each of the other columns represents the word-classification task with which card-sorting was paired in a variety of multitask manipulations, averaged over all such manipulations within a given bit level. Means and standard deviations for each of these conditions are given in Roberts (1999b; see also Roberts 1995).


TABLE 25
Correlations Between GF and the Variables (MTx, DTx, and RTx) Obtained From Each ECT

Test Variable                          MTx       DTx       RTx
26. Fitts's Movement                   -.08      ***       ***
27. Joystick Reaction                  ***       ***       -.24
28. Single Response Choice             -.13      -.36      -.35
29. Tachistoscopic Choice              -.14      -.23      -.28
30. Complex Choice                     -.41      -.37      -.51
31. Binary Reaction                    -.09      -.47      -.42
32. Single Card-Sorting                -.23      -.45      -.41
33. Multitask Card-Sorting             -.22      -.50      -.46
34. Single Word-Classification         -.32      -.32      -.44
35. Multitask Word-Classification      -.34      -.26      -.41

in the fashion which dominates the literature-that is, obtaining these indices from average performance over all intratask conditions and then correlating this parameter with a general factor (in the present case, GF)-reveals a good deal of similarity with previous findings. These correlations range between -0.24 (sdRT, in the Joystick Reaction Task by GF) and -0.46 (sdDT, in Test 30 by GF). For Test 30, where Decision Time processes are implicated, the correlation between sdMT, and GF is moderate (r = -0.36). Based on this, a fairly compelling argument might be mounted in support of the proposition that intraindividual variability (and theoretical accounts derived from this construct) are central to general intelligence (e.g., A.R. Jensen 1992a). However, more careful consideration of intratask correlation of sdDT with each of the broad cognitive factors extracted in this study paints a different picture. Of 250 correlations obtained between sdDT (or sdRT) and the seven “level” and three “speed” “second-stratum” abilities, only 29 exceed the 0.01 level of significance. Of these, 17 are tied to the two speed-related abilities, G, and T,,a. In the case of level abilities, five significant correlations were obtained between Gf and sdDT measures, five between Gf and sdDT and two between G, and sdDT. Moreover, in this case, the range of correlations tends to be relatively large. For example, correlations between sdDT and Gf range from -0.01 to -tO.43,26 with a mean correlation of only -0.17. One may therefore question the alleged importance of intraindividual variability in decision processes and also the extent to which this research focus has contributed toward a better understanding of the cognitive (or biological) mechanisms underlying individual differences in psychometric ability (see, e.g., A.R. Jensen 1993a). In the case of intraindividual regression parameters a similar case can be mounted. For example, the intercept measures of several tasks share moderate correlation with GF (in Test 26, r = -0.24; in Test 28, r = -0.26, and so on). These correlations are improved slightly if data are culled according to fit criteria (see Barrett et al. 1986). However, examining these parameters in relation to the seven “level” second-stratum abilities, only three (from a possible 63) correlations are significant at p < 0.01, with not one of these correlations occurring within the same broad ability factor.


THE FACTOR STRUCTURE OF MENTAL SPEED

Carroll (1993) has alluded to the possibility that "if any broad taxonomic classification of cognitive ability were to be formulated, in fact, it might be based on the distinction between level and speed" (p. 644). Given the consistently high correlations between various chronometric tasks and broad speed factors in the preceding analyses, there are justifiable grounds for exploring certain implications of this proposition. Evidence presented in, for example, Table 7 indicates that there are three cognitive speed factors (two linked to perceptual abilities, particularly visualization, and one that derives from tasks of considerable cognitive complexity). These mental-speed abilities implicate a still higher-order general speed factor. Questions remain as to (a) the structure exhibited by the DT and MT measures and (b) whether DT and MT form independent mental-speed factors, or perhaps share loadings with speed of test-taking constructs (given that they appear more "elementary").

The Factor Structure of Speed of Information-Processing Variables: Overview. As the concerns of most researchers investigating the Hick paradigm (or other ECTs, for that matter) center on the cognitive correlates approach, there have been very few attempts to delineate the factor structure of the experimental variables that have been derived. Carroll's (1993) reanalysis of such studies provides one of the few attempts to define clearly identified chronometric factors. The following two features are apparent (Carroll 1993, p. 484ff):

1. MT and DT factors would appear to be independent of each other. That is, they generally load on different second-order factors. Both of these factors are, in turn, interpretable.

2. Intraindividual measures (i.e., both intraindividual regression and variability parameters) tend to share loadings on the same factors as corresponding means (and in the same direction). These loadings also tend to be lower.

However, as Carroll (1993) notes, both conclusions are drawn from a small number of studies whose designs (while relatively adequate) still generate substantial inconsistencies. Several further questions are pertinent. For example, is there any evidence for a general speed factor encompassing these (seemingly qualitatively distinct) domains of performance speed? Alternatively, would the DT and MT factors remain independent or share loadings on Gs, Tv/a, or Tir? Note that this latter possibility cannot be ruled out on the basis of intuitive analysis, largely because the elementary processes involved in making a choice decision (or executing movements to indicate a response) seem to be prerequisites for the tests defining each of the psychometric speed factors.27

Summary Statistics and Correlations between all Measures of Performance Speed obtained in the Present Study. The variables selected for this analysis were (a) speed of response in the psychometric tests that defined Gs in Table 3 (i.e., Tests 22-25); (b) speed of response in the psychometric tests that defined Tv/a and Tir in Table 8 (i.e., Tests 4-7 and 16-20); (c) each of the MT and DT measures obtained from the information-theory test battery (i.e., Tests 26 and 28-35, and Tests 28-35, respectively); and (d) RT in Test 27 and the CK and TR measures from Test 36. Correlations between the ECT measures (i.e., those listed in [c] and [d] above) are presented in Table A.2 in the Appendix.
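For readers wishing to retrace this step, the fragment below sketches how the 33 speed scores might be assembled and intercorrelated. The array and file name are hypothetical placeholders; only the ordering of the variables follows the description above.

```python
import numpy as np

# Hypothetical (participants x 33) array of speed scores, columns ordered as
# described above: Gs markers (Tests 22-25), Tv/a and Tir markers (Tests 4-7,
# 16-20), MT and DT from the information-theory battery (Tests 26, 28-35 and
# 28-35), RT from Test 27, and the CK and TR scores from Test 36.
speed_scores = np.loadtxt("speed_scores.csv", delimiter=",")

corr = np.corrcoef(speed_scores, rowvar=False)   # 33 x 33 correlation matrix
print(np.round(corr, 2))
```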

Factor Analysis of Speed-of-Performance Measures. Exploratory factor analytic procedures were employed to determine the factor structure of the 33 variables comprising this data set. A maximum likelihood solution based on the root-one criterion yielded nine factors, with the tenth and eleventh roots close to unity. Because of this feature of the data set and the large number of variables employed, an 11-factor solution was preferred. With these 11 factors, the goodness-of-fit chi-square test proved satisfactory (chi-square = 156.63, df = 85; p = 0.05), with each of the factors readily interpretable on the basis of previous rationale. The factor intercorrelation matrix also suggested that a second-order solution should be attempted. This resulted in a three-factor solution, with general psychomotor, general decision, and general (psychometric) test-taking speed factors clearly identifiable. Because factor intercorrelations were still of sufficient magnitude, a third-order factor analysis (of factor scores extracted from this solution) was conducted, with only one factor retained. Loadings on this factor, interpreted as General Timed Performance (and henceforth designated Gt), were of a magnitude sufficient to indicate its generality: 0.37, 0.30, and 0.25 for general psychomotor, general decision, and general psychometric test-taking speed, respectively. Results of the Schmid-Leiman transformation of this hierarchical factor analysis are presented in Table 26. The resulting factors can be interpreted as follows.

Order 1: Factor 1 - Speed of Limb Movement (MTLM). Five parameters have salient loadings on this factor (MT obtained from Fitts's Movement Task and MT derived from each of the computerized ECTs). The highest salient loading derives from MT assessed in the Binary Reaction Task, which (given analysis of mean structure involving this parameter and the salient loading of Test 26) supports an interpretation of this factor as a pure psychomotor speed "ability." Note that the parameters having the highest loadings on this factor each involve tasks requiring the same subjective response: moving the index finger from a home button to press a response key located 10 cm from this point. Elsewhere, there is an extensive literature summarizing the dimensions of psychomotor abilities (e.g., Fleishman 1954, 1964, 1972; Fleishman & Quaintance 1984; Peterson & Bownas 1982). Carroll's (1993, Chapter 13) review of this literature suggests that the hand movements required in RT paradigms that allow for the separate measurement of MT and DT implicate the Speed of Limb Movement factor (henceforth designated MTLM) in the psychomotor component of performance. Factor 1 bears close correspondence to this, such that:

This ability involves the speed with which discrete movements of the arms or legs can be made. The ability deals with the speed with which the movement can be carried out after it has been initiated; it is not concerned with the speed of initiation of the movement. In addition, the precision, accuracy, and coordination of the movement is not considered under this ability. (Carroll 1993, p. 533)
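As an aside for readers unfamiliar with the procedure, the following is a minimal numerical sketch of one Schmid-Leiman orthogonalization step of the kind described above (the original analysis was presumably carried out with standard factor-analysis software; the function below only illustrates the arithmetic). Given an oblique lower-order pattern matrix and the pattern of those factors on the factor(s) at the next order, it returns the direct loadings of the variables on the higher-order factor(s) together with the residualized lower-order loadings. Applying the step twice (first-order factors against the three second-order factors, then second-order factors against Gt) yields an orthogonalized matrix of the form reported in Table 26.

```python
import numpy as np

def schmid_leiman(pattern_lower, pattern_higher):
    """One Schmid-Leiman orthogonalization step.

    pattern_lower  : (p, k1) oblique pattern of p variables on k1 lower-order factors.
    pattern_higher : (k1, k2) pattern of those k1 factors on the k2 higher-order
                     factor(s), obtained by factoring their intercorrelation matrix.

    Returns the direct loadings of the p variables on the higher-order factor(s)
    and the residualized (orthogonalized) loadings on the lower-order factors.
    """
    # uniqueness of each lower-order factor in the higher-order solution
    uniqueness = 1.0 - np.sum(pattern_higher ** 2, axis=1)
    general_loadings = pattern_lower @ pattern_higher          # p x k2
    residual_loadings = pattern_lower * np.sqrt(uniqueness)    # p x k1
    return general_loadings, residual_loadings
```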

TABLE 26
Factor Loadings of All Speed Measures Obtained From a Schmid-Leiman Transformation (Orthogonalized)

[The loading matrix of Table 26 is not legible in this copy. Its rows comprise the MT measures (Tests 26 and 28-35), the DT measures (Tests 28-35), the speed scores from the psychometric tests (Tests 4-7, 16-20, and 22-25), Joystick RT (Test 27), and the Lehrl Duration and BIP measures (Test 36). Its columns comprise the eleven orthogonalized first-order factors (MTLM, MTMC, MTR/A, DTL/K, DTP/M, DTS/V, Tv/a, CPA, Tir, and two minor factors), the three second-order factors (MTG, DTG, and PTG), and the third-order factor Gt.]

Order 1: Factor 2 - Multi-limb Coordination (MTMC). Two parameters have high salient loadings on this factor (i.e., MT assessed in the two card-sorting paradigms), while a third measure (i.e., Fitts's Movement Task) shares a moderate loading. The loadings from the two card-sorting tasks may be taken to suggest that this factor is analogous to the Multi-limb Coordination factor identified in Carroll's (1993, Chapter 13) review of the psychomotor literature. This factor is defined as:

[T]he ability to coordinate the movements of two or more limbs (e.g., two legs, two hands, one leg and one hand). The ability does not apply to tasks in which trunk movements must be integrated with limb movements. It is most common to tasks where the body is at rest (e.g., seated or standing) while two or more limbs are in motion. (Carroll 1993, p. 534)

Coordination of the two hands is clearly pivotal to performance in the zero-bit condition of card-sorting: participants must hold the pack of playing cards with one hand and use the other to turn these over, while synchronizing both for successful performance. The interpretation of Factor 2 as Multi-limb Coordination (MTMC) is reinforced by a study conducted by Paterson et al. (1930), which found psychomotor movements obtained from a card-sorting task loading substantially on this factor.

Order 1: Factor 3 - Recognition/Articulation Speed (MTR/A). Two parameters have high salient loadings on this factor (i.e., word-classification under single and multitask conditions involving no choice), while a third parameter (i.e., Joystick RT) shares a moderate loading. Elsewhere, these three tests have been suggested to involve high S-R compatibility. Nonetheless, any interpretation of this factor is problematic given (a) features of the measurement of "MT" in the word-classification paradigm mentioned previously (see also Roberts 1999b), and (b) the lack of sufficiently detailed research dealing with psychomotor aspects of speech. The closest psychomotor factor this would appear to resemble is one identified by Carroll (1993) as Speed of Articulation. This factor appears chiefly in "measures of the speed of performing fast articulations with the speech musculature . . . [where] in speech and hearing research, such movements are termed diadochokinetic (from Greek words meaning 'successive movements')" (Carroll 1993, p. 536). Any individual differences in performance on the word-classification tasks (remembering that participants are told the response category is constant) conceivably originate from participants' ability to hear successive stimuli and articulate the response as quickly as possible. This factor is thus tentatively interpreted as Recognition/Articulation Speed (MTR/A).

Order 1: Factor 4 - Decision Time to a Light-Key S-R Code (DTL/K). The five parameters loading on this factor (i.e., DT in Tests 28-31 and RT in Test 27) are all obtained from computerized assessment of an individual's speed of information processing. However, the higher loadings obtained from DT in the three tasks involving the same stimuli (i.e., lights) and the importance of stimulus-response codes in RT paradigms (indicated in the sections detailing the cognitive correlates of Gf [see also Kornblum et al. 1990; Teichner & Krebs 1974]) suggest that this factor should be interpreted in relation to these two aspects of subjective response.


That is, this factor would appear to represent participants' ability to make choice decisions on the home key of a computer console to a specific set of stimuli. Thus, Factor 4 is interpreted as Decision Time to a Light-Key S-R Code (hereafter denoted DTL/K). The fact that the Joystick Reaction and Tachistoscopic Reaction Tasks share lower (yet salient) loadings on Factor 4 supports an interpretation in terms of the requisite S-R code. Thus, in the former, the stimuli were lights but the response was movement of a joystick; whereas in the latter, the stimuli were spatial cues but the response was DT assessed in terms of how quickly an individual's index finger leaves a home button. Plausibly, with sufficient markers of either light-joystick or spatial-key S-R mappings, these measures would have loaded on distinct factors.

Order 1: Factor 5 - Decision Time to a Pictorial/Symbolic-Motor S-R Code (DTP/M). Two parameters have salient loadings on this factor (i.e., DT assessed in the two card-sorting paradigms). This makes interpretation of this factor somewhat more problematic than might otherwise be the case. Nevertheless, the different stimulus-response code suggested by this factor (pictorial-motor) relative to Factor 4, and near-zero loadings from all other parameters, suggest an interpretation of this factor as Decision Time to a Pictorial/Symbolic-Motor S-R Code (DTP/M). Note that in no speed-of-information-processing paradigm of this study, other than card-sorting, is this S-R mapping seemingly involved.

Order 1: Factor 6 - Decision Time to a Semantic-Verbal S-R Code (DTS/V). Four parameters share salient loadings on this factor: DT in Tests 34 and 35, TR, and CK. The highest loadings derive from the two word-classification paradigms. In the interpretation of the two preceding DT factors, the nature of the S-R code was suggested as pivotal. The salient loadings, particularly from TR and CK, are consistent with this view. (Loadings are negative on these two variables because these measures were derived from an output score.) Thus, the features remaining consistent throughout the measures defining this factor are that (a) the stimuli were semantic (i.e., either words or letters), and (b) responses were verbal. Of interest in interpreting each DT construct, tests defining the present factor were administered across more diverse modalities than those tasks that defined previously identified factors. Thus, although several parameters loading on Factor 6 were given aurally, this is certainly not true of the BIP (i.e., CK) measure. Taking each of these considerations into account, Factor 6 is interpreted as Decision Time to a Semantic-Verbal S-R Code (DTS/V).

Order 1: Factor 7 - Visual/Auditory (Perceptual) Test-taking Speed (Tv/a). This is the first factor in Table 8.

Order 1: Factor 8 - Clerical/Perceptual Accuracy (CPA). This is the second factor in Table 8.

Order 1: Factor 9 - Induction Speed (Tir). This is the third factor in Table 8.

Order 1: Factors 10 and 11. These two factors are difficult to interpret, appearing, however, to share salient loadings only on psychometric measures that assess slightly different aspects of visual processing.

Order 2: Factor 1 - General Psychomotor Speed (MTG). This factor has high loadings from all but one of the ECTs in which MT was assessed (i.e., Tachistoscopic Reaction), and even that task has a near-salient loading. Because no other measure shares loadings on Order 2: Factor 1, its interpretation as General Psychomotor Speed (MTG) would appear unambiguous.

Order 2: Factor 2 - General Decision Speed (DTG). Seven ECTs in which DT was assessed share salient loadings on this factor. The only ECT that might be expected to share a loading but does not is the Complex Choice Reaction Task. Note, however, that this task has moderate loadings on both its DT and MT components. Given that in this ECT participants made extra decisions during the MT phase, the fact that it has moderate loadings on both components reinforces the robustness of the present factor's interpretation. In sum, as with General Psychomotor Speed, interpretation of this factor as General Decision Speed (hereafter denoted DTG) would appear unequivocal.

Order 2: Factor 3 - Psychometric Speed (PTG). This factor shares loadings from psychometric tests given under either unlimited time (i.e., computerized cognitive tests traditionally defining level abilities) or limited time (i.e., clerical/perceptual speed markers), testifying to its breadth. In short, the interpretation of this construct as a general Psychometric Speed factor (i.e., PTG) is similarly unequivocal.

Order 3: Factor 1 - Response Speed (Gt). Finally, with respect to the third order of this orthogonalized matrix, it may be observed that measures of MT, DT, and psychometric speed each share salient loadings on a general Response Speed construct (i.e., Gt). Three additional features of this third-order (general) factor should also be noted from Table 26: (a) only measures of induction speed fail to exhibit salient loadings on Gt; (b) the highest loadings on the general factor are observed for DT measures, and in particular those ECTs that involve low S-R compatibility; and (c) in a context relevant to the preceding proposition, Digit Symbol, Stroop, and the two divided-attention manipulations of psychometric tests (which would appear similarly to implicate lower levels of S-R compatibility) share high factor loadings on Gt.

Toward a Taxonomic Model of Cognitive Speed. The preceding section has addressed concerns that find no ready parallels in the extant literature. For example, in reviewing a large number of studies employing ECTs, it becomes apparent that past researchers have included neither speed of test-taking measures nor indices of clerical/perceptual speediness (i.e., Gs) in their designs. Alternatively, more traditional factor-analytic approaches to uncovering the structure of human cognitive abilities have simply ignored response speed derived from relatively simple laboratory tasks. Thus, in detailing a higher-stratum model of human cognitive abilities, Carroll (1993, p. 613ff) makes no definitive statement regarding the linkage between the two broad speed factors identified in his study. (These factors are Cognitive Speediness [which is akin to the current PTG factor] and Processing Speed [which is analogous to the present DTG factor].)28 Note, however, that several substantive problems arise if, in the present study, these two factors are held to exist on the same stratum (as Carroll's three-stratum model suggests). Arguments supporting the notion that PTG and DTG exist on different strata include the following:

1. Viewing PTG and DTG as existing on the same stratum is contrary to the initial methodology of the study. Thus, each one of the "level" factors was selected within the context of a higher-stratum design, which clearly was not part of the experimental protocol underlying the speed of information-processing measures. In short, it would appear contradictory to suggest that abilities are narrow for speed of performance but broad with respect to accuracy (i.e., level) in psychometric tests.

2. Several of the psychometric speed factors appear to be particularly broad. In particular, Tv/a cuts across the putative domains of broad visualization and broad auditory function. Recall also that this factor is largely independent of another rate-of-test-taking measure, Tir. Further, it is clear that some researchers treat measures obtained from rate-of-work tasks as providing a broad factor, CDS, that is distinct from Gs (e.g., Horn & Hofer 1992). As mentioned in the introduction, both of these factors appear to reside on the same stratum as do measures of fluid and crystallized abilities.

3. Assuming that PTG and DTG exist on the same stratum would simultaneously imply that induction speed, clerical/perceptual speed, DTL/K, MTLM, and so forth have equal status as primary mental-speed factors. With respect to the relative specificity of items comprising factors derived from ECT parameters (and the generality of items comprising psychometrically based factors), this proposition would seem highly counterintuitive.

4. Based on the arguments listed above and the results presented in Tables 8 and 26 (in particular), the stratum model of cognitive speed sketched below (which is given more detailed attention in the Discussion section of this report) is proposed:

[Diagram: the proposed stratum model of cognitive speed, showing the second-stratum factors Tv/a, Gs, Tir, DTG, and MTG beneath a general speed factor (Gt) at the third stratum; see text.]

Below each of the MTG and DTG factors of this model lie the lower-stratum abilities identified in the current study: (a) MTLM, MTMC, and MTR/A for MTG; and (b) DTL/K, DTP/M, and DTS/V for DTG. The PTG factor is held to be (as has been observed with "traditional" level abilities) "in limbo" somewhere between second- and third-stratum mental-processing-speed constructs (see Carroll 1993). Of course, the results leave several questions unanswered. For example, it still needs to be determined whether or not "primary" mental-speed abilities may be identified for each of the speed of test-taking measures, although this need not be a necessary and sufficient condition of such a model. (In fact, one might construct an argument to suggest that the Tir factor is one such instantiation of a primary factor linked to performance speed in fluid intelligence tests.) In turn, this stratum model of cognitive-speed measures suggests that it may be efficacious to develop a broad taxonomic classification of all cognitive-ability factors based on the distinction (and interactions) between level and speed measures (see Carroll 1993, p. 644). As such, the present investigation may be construed as a first step in outlining the structure of cognitive-speed measures that might be embedded within such a classification.
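For readers who prefer a compact summary, the hierarchy just described can be written out as a simple nested structure. This is purely illustrative: the labels follow the text, and PTG is deliberately annotated rather than forced onto a single stratum.

```python
# Proposed taxonomic model of cognitive speed (illustrative only).
cognitive_speed_model = {
    "Gt": {                                  # third stratum: general response speed
        "MTG":  ["MTLM", "MTMC", "MTR/A"],   # psychomotor speed and its primaries
        "DTG":  ["DTL/K", "DTP/M", "DTS/V"], # decision speed and its primaries
        "Tv/a": [],                          # second-stratum test-taking speed factors;
        "Tir":  [],                          # PTG is held to lie somewhere between
        "Gs":   [],                          # these factors and Gt
    }
}
```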

Cognitive Speed within the Structure of Human Abilities. The preceding model suggests that either (a) each of the second-stratum speed constructs is as broad as the various second-stratum level abilities; or (b) only the third-stratum speed factor shares comparable status with broad-level abilities. It should be mentioned that evidence in favor of (a) might be taken to suggest that Gt is as general as a third-stratum level ability, leaving open the question of a still higher stratum of human cognitive abilities. That is, psychometric g would actually exist on a fourth stratum, with a general level factor and a general response speed factor below it. To examine these issues further, two data sets were analyzed using exploratory factor-analytic techniques:

1. The seven level abilities (i.e., Gf, Gc, SAR, Gv, Ga, IR, and CPA) and the five second-stratum speed abilities discussed in the model above (i.e., Tv/a, Gs, Tir, DTG, and MTG).

2. The seven level abilities identified in this study and the general factor extracted from the mental speed measures (i.e., Gt).

Factor Analysis involving Broad-Level Abilities and Second-Stratum Speed Factors. To gain an understanding of the relationship between second-stratum level abilities and second-stratum speed constructs, factor scores were obtained from the factor analyses given in: (A) Table 3, in which the factors Gf, Gc, SAR, Gv, Ga, IR, and Gs were derived; (B) Table 8, where evidence for the factors Tv/a, Tir, and CPA was found; and (C) the maximum likelihood analyses preceding the Schmid-Leiman transformation presented in Table 26, in which DTG and MTG were derived.

The maximum likelihood extraction procedure with oblimin rotation was used for the factor analysis of the 12 variables comprising the above-mentioned data set. A solution employing the root-one criterion yielded four factors. However, this solution is highly problematic in that two of the factors are defined by single variables and each constitutes a Heywood case. A three-factor solution results in one factor being defined by a singlet, with this similarly resulting in a Heywood case. For these reasons a two-factor solution was implemented. For these two factors, the goodness-of-fit chi-square test, although less than completely satisfactory (chi-square = 103.41, df = 43; p < 0.001), needs to be considered in light of the solutions involving three and four factors. These two factors account for 40.2% of the common variance among second-stratum level and speed abilities. Results of this factor analysis are presented in Table 27.

The solution presented in Table 27 shows each of the speed variables loading on Factor 1. Salient loadings from both speed of test-taking and speed of information-processing measures indicate the generality of this factor. Consequently, it may be interpreted unequivocally as Gt. Factor 2, in contrast, has high salient loadings especially from Gf, IR, Ga, and DTG. It would appear therefore to resemble the GF factor found in the hierarchical analysis of level abilities represented in Table 6. The factor intercorrelation between Gt and GF is relatively low (-0.16). This analysis suggests little or no linkage between speed and level when viewed from what appear to be comparable strata. Based on this finding, the possibility cannot be ruled out that speed and level measures form hierarchical structures that are largely independent.
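The analysis just described can be approximated with any EFA routine offering maximum-likelihood extraction and oblimin rotation. The sketch below uses the third-party factor_analyzer package purely as an illustration (an assumption on our part, not the software used in the original study), applied to a hypothetical array of the 12 factor scores.

```python
import numpy as np
from factor_analyzer import FactorAnalyzer  # assumed third-party package

# Hypothetical (179 x 12) array of factor scores, columns ordered
# MTG, DTG, Gs, Tv/a, Tir, Gf, Gc, SAR, Gv, Ga, CPA, IR.
scores = np.load("second_stratum_scores.npy")   # placeholder input

fa = FactorAnalyzer(n_factors=2, method="ml", rotation="oblimin")
fa.fit(scores)

print(np.round(fa.loadings_, 2))              # oblimin pattern matrix (cf. Table 27)
print(np.round(fa.get_communalities(), 2))    # communalities (h2 column)
```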

TABLE 27
Oblimin Factor Pattern Matrix of Second-Stratum Speed and Level Factor Scores

Factor      Gt      GF      h2
MTG        .50    -.26     .35
DTG        .29    -.40     .29
Gs         .79     .02     .62
Tv/a       .71    -.16     .57
Tir        .42     .21     .19
Gf        -.18     .60     .43
Gc         .07     .26     .06
SAR       -.16     .34     .16
Gv        -.23     .28     .15
Ga        -.02     .43     .18
CPA        .26     .53     .30
IR         .02     .50     .25

Nevertheless, the possibility also remains of linkage between level ability and speed ability, especially if one examines the latter as though it constituted a second-stratum ability.

Factor Analysis involving Broad-Level Abilities and Gt. No factor analysis of the present study has actually been conducted with 'level' cognitive abilities that does not also include some measures of speeded response. Although each of the seven level abilities seems highly replicable across data sets, examining their relationship with Gt requires that factor scores be derived only from measures involving accuracy of response. This leaves 24 psychometric tests (i.e., Tests 1-24) to be reanalyzed, since in the Digit Symbol Test (Test 25) rate alone was assessed. To maintain a focus on the issues currently raised, this factor analysis is not reported here (more especially since the seven factors extracted are readily interpreted as Gf, Gc, SAR, Gv, Ga, IR, and CPA).

A maximum likelihood solution was obtained for the factor analysis of the eight variables comprising this data set. A solution employing the root-one criterion suggested two factors. In this solution, Factor 1 has salient loadings only from SAR, with loadings from all other measures on Factor 2. Accordingly, a one-factor solution was also obtained. With this factor, the goodness-of-fit chi-square test proved satisfactory (chi-square = 14.19, df = 20; p = 0.82). This factor accounts for 29.4% of the common variance among the abilities currently examined. Results of this factor analysis are presented in Table 28.

This solution might be taken to suggest that Gt exists on the same stratum as each of the broad-level abilities comprising the current investigation. Note also that the high salient loading of Gt on this general factor now challenges the notion that cognitive speed is orthogonal (or only minimally correlated) to level (see Carroll 1993, p. 495). The two preceding data analyses leave open the question as to whether or not it is more prudent to view cognitive speed independently of level. However, the linkage between second-stratum level factors and a third-stratum speed factor does indicate some interaction between these two structures. Further investigation is required to reach a satisfactory resolution of these conceptual issues.

TABLE 28
Factor Pattern Matrix of Second-Stratum Level Factor Scores and the Third-Stratum Response Speed Factor Score

Factor    General GF (g?)     h2
Gf              .63          .27
Gc              .40          .12
Gv              .26          .07
Ga              .43          .14
CPA             .34          .10
IR              .45          .15
SAR             .31          .11
Gt             -.62          .24

Nonetheless, it is clear that mental-processing-speed constructs should play a more critical role in structural models of human intelligence than has been accorded them in the past.
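As a quick arithmetic check on the fit statistics quoted above, the p-values follow directly from the chi-square distribution. The snippet below (using scipy, purely as an illustration) reproduces the reported p = 0.82 for the one-factor solution and confirms that the two-factor solution's statistic falls well below the .001 level.

```python
from scipy import stats

# Goodness-of-fit statistics reported in the text.
fits = [("two-factor solution (level + speed factor scores)", 103.41, 43),
        ("one-factor solution (level abilities + Gt)", 14.19, 20)]

for label, chi2, df in fits:
    p = stats.chi2.sf(chi2, df)   # upper-tail probability
    print(f"{label}: chi-square({df}) = {chi2}, p = {p:.3f}")
# Output: p < .001 for the first solution and p of approximately 0.82 for the second.
```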

DISCUSSION

MENTAL SPEED AND THE "BIG PICTURE"

A major finding of the present investigation is that each one of the broad cognitive abilities shares differential relationships with the main factors of mental speed. The consistency with which relationships manifest themselves across particular abilities may be contrasted with previous studies conducted within the cognitive correlates framework that employ a single experimental paradigm and/or a single measure of "general intelligence." Present findings question the extent to which speed of information processing is related to psychometric g per se. Negative findings with certain parameters also question the relevance of many theoretical explanations that have been offered in the literature as processing accounts of intelligent functioning.

There is a zeitgeist in contemporary individual differences psychology that views the status of psychometric g as paramount and all other cognitive ability factors as inconsequential (e.g., Brand 1987, 1996a; A.R. Jensen 1998; A.R. Jensen & Weng 1994; Ree & Earles 1991, 1993; R.L. Thorndike 1986). This approach has permeated through to the industrial-organizational context (e.g., Ree & Earles 1992; Ree et al. 1993; Schmidt & Hunter 1992; Schmidt et al. 1992; R.L. Thorndike 1985) and has been most influential in fueling debate about racial differences in intelligent functioning (e.g., Brand 1996b; Herrnstein & Murray 1994; A.R. Jensen 1985, 1992c, 1993b; Lynn & Holmshaw 1990; Lynn & Owen 1994; Rushton 1995). Because processing speed offers a "natural bridge" between computer metaphors of mind and biological mechanisms, it is not incidental that many proponents of general intelligence place considerable emphasis on the relationship that it shares with psychometric g (e.g., Brand 1996b; A.R. Jensen 1992b, 1993b; Lynn et al. 1990; Nettelbeck 1990; Neubauer et al. 1997; Ree & Carretta 1996). And yet, across a number of cognitive abilities defined by level in this investigation, the only meaningful correlations consistently established with any of the ECTs of this study were with Gf. With regard to factors such as Gc and Gv, which are assumed to have salient loadings on the general (psychometric) factor, correlations often approach zero for the majority of chronometric measures. This finding is not easy to reconcile with the view that all other factors are incidental to the general factor; neither is it clear whether attempts to strengthen arguments pertaining to psychometric g by recourse to processing speed are anything other than specious.

Indeed, like the individual's "general intelligence" assessed by tests of "level," the present study suggests that "mental processing speed" serves as a rather imprecise term for a broad class of constructs. Thus, within the present investigation, there is clearly evidence for narrow, broad, and general factors of mental processing speed. Whether one factor is more or less important than another would appear a moot point: all factors are clearly crucial in the development of a comprehensive taxonomic model (see Carroll 1995). Plausibly (as has been argued for level factors), without taking into account the strata at which mental-processing-speed constructs occur, the relationships between psychometric variables and information-processing variables will often be misinterpreted.

This cautionary note constitutes no minor point. Several developmental theories have arisen in recent times on the basis of observed relationships between a general speed factor and aging-related decline in intellectual abilities (e.g., Bors & Forrin 1995; Case 1995; Fry & Hale 1996; Hale & Jansen 1994; Kail 1995; Salthouse 1994, 1996; Sliwinski et al. 1994). However, the general speed factor extracted in the majority of these studies has been based on a small number of paradigms that appear to represent narrower speed factors than any current developmental model suggests. Consistent with this proposition, the exploratory factor analyses of speed measures conducted throughout this investigation indicate that individuals' rates of performing various psychometric tests define a number of dependable individual differences factors (i.e., the cognitive speed factors labeled Tv/a, Tir, and Gs). In a similar vein, measures of speed of information processing yield reliable first- and second-order factors. In the case of the MT component extracted from chronometric tasks, these can readily be referenced to psychomotor factors that have previously been identified in the literature (e.g., Carroll 1993, Chapter 13; Fleishman 1954; Fleishman & Quaintance 1984). Without any clear precedent in the individual differences domain, the DT component of ECTs, in contrast, would appear to be defined by the nature of the S-R mappings that the individual must engage in to complete tasks of this type successfully. This latter finding is consistent with the way in which RT measures have been treated within the experimental literature (e.g., Guadagnoli & Reeve 1994; Kornblum et al. 1990; Morin & Grant 1955; Morin et al. 1965; R.W. Proctor et al. 1993; J.R. Simon et al. 1981; Teichner & Krebs 1974; Umilta & Nicoletti 1990; Wang & R.W.
Proctor 1996), although somewhat curiously, the present investigation represents the first individual differences study supporting the influence of S-R connections on RT.

In the passages that follow, the implications of findings from the present investigation are considered in detail. It is argued that while findings involving the cognitive-correlates analysis of broad-level factors indicate mental speed is an important component of intellectual functioning, it should not be used to support reductionist arguments that assert speed is a basic (biological) property of the human organism (see Stankov & Roberts 1997). Similarly, consideration of the various factor analyses conducted throughout this study calls attention to a "new" taxonomic model of mental processing speed. Although this model has been sketched in an earlier passage, its relationship to contemporary structural models of human cognitive abilities needs to be delineated. Because these two concerns converge to provide an explanatory model of (certain) intellectual functions, a detailed analysis of several theoretical frameworks is also provided. Finally, it is argued that whereas this series of outcomes suggests mental speed to be of major importance in investigating human intelligence, it also indicates that this construct should be viewed from a more balanced perspective than is presently the case.

THE COGNITIVE CORRELATES APPROACH: MENTAL SPEED AS AN IMPORTANT (BUT NOT BASIC) PROCESS OF INTELLIGENCE

Movement Time and Psychometric Performance. Interest in MT (Movement Time) as a correlate of psychometric performance has intensified during the 1990s (e.g., Buckhalt et al. 1990; Houlihan et al. 1994). Systematic examination of studies focusing on this construct raises questions concerning the psychometric measures employed, sample sizes attained, sample characteristics, and tasks examined. In the present study, any significant correlation that MT shares with cognitive abilities tends to be limited to psychometric tests in which performance speed would appear crucial (e.g., measures of clerical/perceptual speed). The low correlations observed between MT and level abilities appear to be the consequence of two features of the design of the current investigation. These features were (a) the substantial number of psychometric tests that participants were allotted (in theory) unlimited time to complete; and (b) the plausible hypothesis that the MT component became highly automatized during the chronometric tasks that were performed. If this second explanation is accepted (even in part), it concurrently lends considerably greater weight to correlations between cognitive abilities and DT assessed in the various ECTs of this investigation. In short, it has been argued that correlations between processing speed and intelligence are ephemeral because the tasks used to assess the former have generally been performed without taking into account differential susceptibility to practice (Bittner et al. 1986). Under this interpretation, adaptation to an ECT is the critical variable rather than any underlying cognitive process.

Decision Time and Psychometric Performance. The Decision Time (DT) variable is undoubtedly the most interesting mental speed measure examined in this investigation (not least because it appears to be both construct valid and readily interpretable [see also Roberts 1999a, 1999b]). Importantly, DT appears susceptible to compatibility effects, which, in turn, provide for a more ready understanding of the present empirical findings than do the majority of explanations proffered in the research literature. We reserve discussion of S-R compatibility effects (and, in particular, the dimensional-overlap model) for a later section of this paper.

The place of speed of information-processing parameters within any model of intelligence (including that of Gf/Gc theory) has seemingly not been well established. Extrapolation of previous findings to the Gf/Gc model was made difficult by the fact that previous information-processing studies were based largely on a single psychometric test. However, if (as claimed) speed of information processing is highly correlated with g, then it should share correlations with any (and almost all) broad second-order factors. Indeed, as a preface to each cognitive-correlates analysis undertaken in the present investigation, tentative evidence was provided in support of this proposition for almost all broad cognitive-ability constructs. Problems in building a compelling case for each relationship should be acknowledged, not only because past research employed an insufficient number of cognitive-ability tests but also because of (a) difficulties in interpreting studies that employ a single ECT; (b) the quality of samples previously investigated; and (c) reliance (in the case of attempts to study several factors) on tests such as the WAIS, which appear to be more factorially complex than the test constructor originally intended.

In attempting to circumvent these limitations, a major finding of this study was that across five second-stratum "level" cognitive abilities (Gf, Gc, SAR, Gv, and Ga) and two first-stratum factors (IR and CPA), the only empirically meaningful correlations consistently established with ECTs were with Gf. This poses a difficult problem for those researchers who have sought to implicate speed of processing as a basic mechanism underlying intelligence (see Brody 1992, p. 49ff). It would require, in particular, that verbal abilities (Gc), which often share the highest loading on the general factor (see Matarazzo 1972), not be considered part of intelligence in order for processing speed to retain its status as basic (Stankov & Roberts 1997). Indeed, present findings involving the cognitive-correlates approach consistently cast doubt on the essentially "bottom-up" model of intelligence proposed by A.R. Jensen and his associates (e.g., A.R. Jensen 1987a, 1992b, 1993a; A.R. Jensen & Vernon 1986; also Neubauer & Freudenthaler 1994). For instance, to support the proposition that speed is a basic biological property of the organism (see H.J. Eysenck 1987a, 1987b), it has been claimed that RT (Reaction Time) tasks are relatively simple in terms of cognitive demand. Thus, A.R. Jensen and Vernon (1986) have asserted that the relationship between speed of information-processing measures and the general factor extracted from psychometric tests is of "major theoretical interest, because the Hick paradigm involves no knowledge content, no problem solving, no 'higher mental processes' . . . [having] about as little resemblance to conventional unspeeded psychometric tests as one could possibly imagine" (p. 156). Detterman (1987), in contrast, has argued that choice RT is far more complicated than it appears on the surface.
Many results presented in the current study are understood only in light of the latter interpretation of the task and subject factors underlying choice RT paradigms. Little or no evidence is found to support the former argument.


Problems in Viewing the Relationship between Processing Speed and Psychometric g as Basic: Implications from Gf/Gc Theory. It is customary for those subscribing to single-factor models of intelligence to seek one or more of a small subset of basic information-processing skills. However, the existence of such processes causes tension within the theory of fluid and crystallized intelligence. Although it is plausible to conceive of a basic process if concentration is focused upon tasks that have low loadings (only) on the general factor (as proponents of the single-intelligence view tend to do), the existence of such processes appears less acceptable to those who study tasks with higher g loadings and more complex processes of intelligence (see Horn & Noll 1994).

The faceted theory of intelligence proposed by Guttman (1992) and supported by the work of Snow (1989) and his collaborators parallels that of Gf/Gc theory (Stankov et al. 1995). Studies conducted within this framework have shown that manipulations of complexity within, for example, the quantitative domain differ from those within either the verbal or spatial domains. It is difficult to envisage the importance of basic processes under such conditions. Those processes that make one test a better measure of fluid intelligence should be different from the process (or processes) that make(s) another test a better measure of crystallized intelligence. These propositions would seem to be reinforced by the findings of the present study, wherein it appears that (a) complexity of processing is more important to chronometric tasks sharing correlation with Gf than is speed of information processing per se (see also Larson & Saccuzzo 1989; Roberts et al. 1988; Stankov & Crawford 1993; Stankov & Cregan 1993; Stanton & Keats 1986; Vernon 1990) and (b) these processes share nothing in common with several other cognitive factors, in particular with Gc.

The Gf/Gc theory is consistent with the view that there exist many elementary cognitive processes (akin to the bonds of Godfrey Thomson's [1939/1948] model) and that these processes are organized into subpools. More complex tasks tend to draw from larger and more diverse features of these subpools (e.g., Stankov & Crawford 1993; Stankov & Cregan 1993; Wickens 1980). The purpose of experimental manipulations of task characteristics is to increase or decrease the strength of relationships between given cognitive measures. The search is for processes that can reliably affect these relationships. Current findings indicate that one such process may be the nature of the S-R code in speed of information-processing tasks. It is unlikely that this mechanism involves any purely "bottom-up" process, implicating as it does both attention and automaticity. Indeed, it would seem pointless to single out the basic process(es) of intelligence. The major problem for anyone wishing to make such a selection lies in deciding which processes to choose. In sum, while current findings indicate the importance of the processes captured by RT tasks to Gf, they also question the central role played by processing-speed measures in some contemporary intelligence research (Stankov & Roberts 1997).

THE FACTOR STRUCTURE OF COGNITIVE SPEED

For all intents and purposes, information on the factor structure of performance-speed measures is meager (Carroll 1993). In the present study consideration was given to (a) factors that may be derived from examining speed rather than level in psychometric tests; (b) factors that result from the independent assessment of MT and DT in ECTs; (c) the relationship of these two "types" of "elementary" speed within a proposed taxonomy of cognitive speed; and (d) the issue of how these speed factors are empirically related to psychometric level factors.

Evidence for Additional Psychometric Speed Factors. Historically, the earliest formulations of Gf/Gc theory contained a broad speediness factor, Gs. Until recently, this remained one of the least understood and most poorly defined broad factors of this substantive theory. Although claims have frequently been made in the literature that speed factors may be extracted from measures designed to assess level, by and large these constructs have attracted relatively little empirical research. Using exploratory factor analysis, two speed factors (in addition to Gs) were found in the present study. These factors were interpreted as Tv/a (the speed of doing tasks that mark Gv and Ga when number correct is scored) and Tir (the speed of doing induction tasks). Thus, contrary to a literature that might indicate otherwise, speed measures appear to behave in a fashion similar to number-correct scores; that is, they define several distinct abilities rather than one or perhaps two speed-related factors (cf. Horn & Hofer 1992). Importantly, Gs, Tir, and Tv/a were demonstrated to have different patterns of intercorrelation with 'level' cognitive abilities (e.g., Gf, Gc), testifying to their factorial independence. Consideration of a number of different data sets comprising the present study confirmed that each one of these psychometric speed factors is neither an artifact of design nor of analysis. For example, each of these speed and level factors appeared as a clearly differentiated cognitive ability when included in a factor analysis involving all psychometric measures of the current study. Moreover, when the strict prescriptions for the measurement of speed of test-taking outlined in Horn and Hofer (1992) are observed (e.g., splitting a given test into two parallel forms, with one used to assess number correct and the other rate of test-taking), the determination and interpretation of speed of test-taking abilities remains consistent (see Roberts 1995).
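A minimal sketch of the parallel-forms scoring logic mentioned above is given below. The data and the odd/even split are hypothetical; the point is simply that one half of a test can be scored for number correct (level) while the other half is scored for time (rate of test-taking).

```python
import numpy as np

def split_level_and_speed(correct, times):
    """Illustrative parallel-forms scoring for one test.

    correct : (participants x items) matrix of 0/1 accuracy scores (hypothetical).
    times   : matching matrix of item response times in seconds (hypothetical).

    Odd-numbered items form one half, scored for number correct (level);
    even-numbered items form the other half, scored for mean time per item
    (rate of test-taking, lower = faster)."""
    n_items = correct.shape[1]
    odd = np.arange(0, n_items, 2)    # items 1, 3, 5, ... (0-based indexing)
    even = np.arange(1, n_items, 2)   # items 2, 4, 6, ...
    level_score = correct[:, odd].sum(axis=1)
    speed_score = times[:, even].mean(axis=1)
    return level_score, speed_score
```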

The Factor Structure of Speed of Information-Processing Measures. By virtue of the large number of information-theory tasks employed in the present study, consideration was also given to the factor structure of Movement Time and Decision Time. Factor analysis of 11 ECTs indicated the existence of three first-order MT and three first-order DT factors. In the case of the former, the results could be clearly referenced to well-established psychomotor factors: Speed of Limb Movement (MTLM), Multi-limb Coordination (MTMC), and a somewhat more tentatively interpreted Recognition/Articulation Speed (MTR/A). In the case of DT, it must be acknowledged that few attempts have been made to examine the factor structure of this measure, and seemingly none that have used ECTs based on a single underlying (information-theoretic) model. Evidence accumulated in this study suggests that DT factors may be derived according to the explicit S-R mapping manipulated in a given RT paradigm. These factors were interpreted as DT to a light-key code (DTL/K), DT to a pictorial-motor code (DTP/M), and DT to a semantic-verbal code (DTS/V). Interpretation of the DT factors along these lines was supported by (a) experimental research noting that RTs to different S-R codes have qualitatively distinct properties (see, e.g., Teichner & Krebs 1974); (b) vanishing loadings from tasks sharing only one of the coding features; and (c) tasks that were presented in different modalities but shared loadings ostensibly because of the S-R mapping involved.

Because these findings have no ready parallel in the individual differences literature, a wider sampling of RT paradigms involving a number of differing S-R mappings (see Brainard et al. 1962) is perhaps required, as is replication of this factor structure. Nevertheless, the obtained factor loadings and their interpretation in terms of the requisite S-R mappings underlying RT paradigms provide a ready explanation for a number of inconsistencies that appear in the individual differences literature. For example, it has remained unclear why inspection time (which paradigm the Tachistoscopic Choice Reaction Time Task resembles) and choice RT tend to correlate only moderately. Given the importance of S-R codes in defining meaningful individual differences factors, it should be pointed out that the former involves a pictorial-key S-R code, whereas the latter has been assessed using the light-key RT paradigm. Similarly, in reanalyzing data presented in Kranzler (1990), Carroll (1993, p. 485) found evidence for a number of first-order DT factors, each of which might be reinterpreted within the theoretical framework currently suggested rather than according to (seemingly) intuitive analyses.

The Factor Structure of Cognitive Speed. When each of the speed of test-taking and speed of information-processing measures are combined in the one data set, factors remain well defined at the first order. Accordingly, there is little or no evidence of overlap among Tir, Tv/a, Gs, MTLM, MTMC, MTR/A, DTL/K, DTP/M, and DTS/V. Importantly, from a Schmid-Leiman transformation, evidence for three second-order factors was obtained: MTG, DTG, and a General Psychometric Test-Taking Time (PTG) factor. The first two factors share loadings as previously detailed, whereas the third factor had salient loadings from each of the speed of test-taking abilities. However, because each one of the speed of test-taking measures appears to represent second-stratum abilities at the first order, it is likely that the PTG factor lies somewhere between a second and third stratum of cognitive speed. Nevertheless, each of the three second-order factors combines to define a third-order general speed factor (Gt).

Traditional views concerning the hierarchical organization of cognitive abilities place sensory processes at the bottom of a hierarchy, perceptual and memory processes within the middle rungs, and thinking processes at the top (e.g., Horn 1987). Based on the assumptions underlying this model, it might be predicted that the factor structure of cognitive speed would resemble this hierarchy. Given five broad factors of mental-processing speed, these constructs should theoretically be ordered with respect to apparent task complexity. Thus, either Tir or Tv/a may be viewed as the most complex cognitive speed factor, reflecting the global time needed for a person to work through typically demanding tests of intelligence. At the middle rung is DTG, reflecting aspects of simple and choice response to stimulus information. Finally, at the bottom of the hierarchy is MTG, a psychomotor rather than cognitive speed factor. It seems logical to conclude that Decision Time is an aspect of both search time and a component of the test-taking time defining any of the psychometric speed factors. This would be analogous to suggesting that the perceptual processes of vision are a part of the thinking processes involved in solving, for example, the problems of a Raven's Progressive Matrices test.

This ordering of broad speed factors is not supported empirically in the present investigation. The postulation would require that psychometric speed measures share higher loadings on the general response speed factor than do measures of Decision or Movement Speed. Instead, the results obtained indicate that decision times involving a pictorial-motor response code share the highest loading on the general speed factor. This finding indicates that intuitive analyses are perhaps flawed and that early notions of the complexity reflected in mental-speed measures require reconceptualization (Stankov & Roberts 1997; see also Tomer & Cunningham 1993). These results also suggest that the Decision Time factor acts in a similar way to Gf in traditional (level) studies, where it is sometimes difficult to distinguish a general factor from Gf (Gustafsson 1984, 1992a, 1992b).

These findings raise several interesting issues. For example, is it possible to identify further broad speed factors, such as natural tempo, coincidence timing, inspection time, and the like? How many lower-order factors of speed of test-taking or speed of information processing can be identified? Is it best to conceive of speed measures as forming a separate taxonomy of human cognition, or to try to integrate these constructs with level abilities? To what extent has insufficient knowledge of the complex structure underlying mental-processing speed impoverished attempts to understand learning and individual differences? These questions seem to be of little import to many contemporary researchers interested in finding the basic processing skills of intelligence through speed-of-performance measures. The answers to such questions, however, influence the way mental speed is conceptualized, the status of speed as a basic process, and ultimately the way in which intelligence should be assessed. For example, the complex nature of cognitive speed presently uncovered implies that some speed factors may be important in one area of human cognition while others may be important elsewhere. Therefore, research should not focus on a simplistic notion of mental speed determined from a few experimental paradigms; rather, attempts should be made to provide for a wider representation of this domain. In turn, this of course suggests urgent reconceptualization of many models presented in the psychological literature that rely on correlations between mental-speed constructs and the mechanism that it is thought to elucidate (e.g., cognitive development).

Cognitive Speed within the Wider Structure of Human Abilities. The structural model of mental processing speed discussed in the preceding section was also referenced to cognitive abilities defined by level. Factor analysis, first with the five second-stratum speed factors (Tv/a, Tir, Gs, DTG, and MTG) and the seven second-stratum level abilities (Gf, Gc, Gv, Ga, SAR, IR, and CPA), suggested one general factor linked to speed and one general factor linked to level, with these two minimally correlated. By contrast, when the third-stratum general speed ability (Gt) was included in a factor analysis with the seven second-stratum level abilities, all factors shared high salient loadings on a general cognitive ability factor.

At present these results should be viewed with some degree of tentativeness, requiring perhaps both a wider array of speed and level measures to define first-order factors than was presently attempted, and modeling with confirmatory factor-analytic techniques. Nevertheless, the findings involving measures of cognitive speed can be referenced to the type of stratum model that Carroll (1993) proposes in the following manner. Assuming for the moment that there is a general factor that exists at a higher order, each one of the second-stratum abilities (i.e., Gf, Gc, and so forth) takes up its respective position alongside the general factor of cognitive speed (i.e., Gt) identified in the current study. Below this are the various primary factors defining the second-stratum "level" factors, together with the broader speed factors of DTG, MTG, Tv/a, Tir, and Gs. Initially, this might appear somewhat problematic: where does this leave the primary mental-speed factors? However, consideration of still lower "strata" of performance suggested by both psychometric tests and chronometric tasks allows a plausible account of these measures within the structure of human cognitive abilities. Thus, with regard to level abilities, under the primary mental abilities are the actual tests that define each first-order factor. These tests would share equal status with the primary mental-speed factors. This model suggests that a given ECT has comparable status to a psychometric test item, a highly feasible proposition given that the processing involved in either of these behavioral acts is driven by similar constraints. Features of this model are given in Figure 2, which contains an outline of one level ability (Gf) in relation to each of the speed factors identified in the present study, as well as the lower-"level" processes that form the more speculative aspects of this model.

The proposed theory suggests that the tradition of acknowledging perceptual speed (and, more recently, both processing speed and rate of test-taking [i.e., CDS]) as second-order factors is largely incorrect. However, although the account given above offers much in terms of understanding the broad speed factors defined by elementary cognitive tasks (i.e., DT and MT), it leaves open the question as to why complex tasks (such as those defining the Tv/a and Tir factors) share common variance at any level. One engaging explanation is that all such speed-dependent abilities are constrained by an individual's natural (or personal) tempo.

RELATIONSHIP BETWEEN MENTAL SPEED AND HUMAN COGNITIVE ABILITIES: AN EVALUATION OF EXPLANATORY MODELS

Cognitive models that have been proposed as accounts of the relationship between mental speed and intelligence are critically dependent on the choice of intraindividual parameters to be used as measures of performance. This study is unequivocal in pointing to the usefulness of intraindividual measures of central tendency over a series of trials (see also Roberts 1999a). Results also suggest that other types of intraindividual parameter are suspect. Consequently, explanatory models that attempt to account for the correlation between such parameters and intelligence are questionable.

FIGURE 2. A three-stratum model of cognitive abilities that takes into account the hierarchical structure of mental-processing speed. [Figure not reproduced; its labels include Primary Mental Abilities (see Carroll, 1993, p. 626), DT, MT, and Test.]
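The higher-order factoring that Figure 2 summarizes can be illustrated with a minimal sketch. The correlation matrix and factor labels below are invented for illustration only (they are not the study's results), and extracting the first principal component of the factor intercorrelations is used merely as a crude stand-in for a full hierarchical (e.g., Schmid-Leiman) solution.

```python
# Illustrative sketch with made-up numbers: when oblique second-stratum factors
# are themselves correlated, their intercorrelation matrix can be factored again;
# loadings on the first principal component give a rough higher-order general factor.
import numpy as np

factors = ["Gf", "Gc", "Gv", "SAR", "Gt"]   # labels for illustration only
Phi = np.array([                            # hypothetical factor intercorrelations
    [1.00, 0.45, 0.50, 0.40, 0.35],
    [0.45, 1.00, 0.35, 0.30, 0.15],
    [0.50, 0.35, 1.00, 0.35, 0.30],
    [0.40, 0.30, 0.35, 1.00, 0.20],
    [0.35, 0.15, 0.30, 0.20, 1.00],
])

eigvals, eigvecs = np.linalg.eigh(Phi)      # eigh: eigenvalues in ascending order
v = eigvecs[:, -1]                          # eigenvector of the largest eigenvalue
v = np.sign(v.sum()) * v                    # orient so that loadings come out positive
loadings = np.sqrt(eigvals[-1]) * v         # principal-component loadings

for name, loading in zip(factors, loadings):
    print(f"{name:>4}: {loading:.2f}")
```

With intercorrelations of this size, every factor in the hypothetical set, including the speed factor, obtains a salient loading on the single higher-order component, which mirrors the pattern of results described in the text.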

Intraindividual Regression Parameters: The Problematic Nature of Explanatory Models that Postulate a Relationship between Rate of Information Processing and Intelligence. Interest in the slope of DT (which originally spawned interest in the Hick paradigm [see Roth 1964]) has waxed and waned considerably over the past two decades. An attempt to provide a better understanding of this construct was made by Barrett et al. (1986), who examined regression parameters as a function of differential degrees of intraindividual model fit. Those results were suggestive of the need to cull individual participants not fitting the Hick model, as these simply add noise to the data, thereby attenuating theoretically meaningful correlations (G.A. Smith 1989). On the basis of findings obtained with the present ECTs (reported more extensively in Roberts [1999a, 1999b]), culling would seem unnecessary, largely because intraindividual regression parameters have uncertain empirical status (see note 29). Thus, analyses indicated that the slope of RT is largely an artifact of selecting too few data points to permit definitive statements about intraindividual fit to the Hick model. Note that no past studies examining this relationship had employed five or more levels of stimulus information. In the present study, when five data points were examined in paradigms that, in principle, should have provided high subject conformity, the results were surprisingly poor. This outcome could not be attributed to methodological inadequacies or sample characteristics: by selecting the same number of data points as most frequently reported in the literature, the percentage of participants fitting the model was shown to be high. The failure of individual participants to conform to underlying information-processing models is probably common to many ECTs and would certainly seem to require more careful attention than has been given in the past (see Lohman 1994). More critically, perhaps, the consistent failure of individual participants to fit the Hick model questions those theoretical explanations that suggest individual differences in rate of processing are responsible for individual differences in intelligence (e.g., A.R. Jensen 1982a).
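To make the intraindividual fitting procedure concrete, the sketch below fits Hick's law, DT = a + b·log2(n), separately for each simulated participant and reports the slope and R². All numbers (set sizes, trial counts, noise levels) are arbitrary assumptions chosen for illustration, not the authors' procedure; the point is simply that, with only a handful of condition means per person and realistic trial-to-trial noise, the R² used to judge intraindividual fit varies considerably even when every simulated participant is generated from the Hick model.

```python
# Illustrative sketch (simulated data, not the authors' procedure): per-participant
# fits of Hick's law, DT = a + b * bits, and the R^2 used to judge individual fit.
import numpy as np

rng = np.random.default_rng(0)
bits = np.array([0.0, 1.0, 2.0, 2.58, 3.0])   # log2 of 1, 2, 4, 6, and 8 alternatives

def fit_hick(mean_dt):
    """Least-squares fit of DT = a + b*bits; returns intercept, slope, and R^2."""
    b, a = np.polyfit(bits, mean_dt, 1)        # polyfit returns [slope, intercept]
    pred = a + b * bits
    ss_res = np.sum((mean_dt - pred) ** 2)
    ss_tot = np.sum((mean_dt - mean_dt.mean()) ** 2)
    return a, b, 1.0 - ss_res / ss_tot

# Twenty hypothetical participants generated *from* the Hick model, with
# trial-to-trial noise of a size comparable to the slope effect itself.
for person in range(20):
    true_a = 300 + rng.normal(0, 30)           # intercept in ms
    true_b = 25 + rng.normal(0, 10)            # slope in ms per bit
    trials = true_a + true_b * bits[:, None] + rng.normal(0, 60, size=(len(bits), 30))
    a, b, r2 = fit_hick(trials.mean(axis=1))
    print(f"P{person:02d}: intercept = {a:6.1f} ms, slope = {b:5.1f} ms/bit, R^2 = {r2:4.2f}")
```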

Intraindividual Variability Parameters: The Problematic Nature of Two Explanatory Models. Measures of intraindividual variability and their correlations with intelligence appear to have contributed greatly to the renewed interest expressed by psychometricians in RT (A.R. Jensen 1992a, 1993a). However, in the present investigation, intraindividual variability measures are shown in a particularly unfavorable light. Indeed, sdDT fails to exhibit a simplex pattern, while sdDT measures are neither independent from the more generic DT parameters nor adequately defined by their "subcomponents" (see also Roberts 1999a, 1999b). Moreover, correlations with intelligence measures seem less robust than would be required of an efficacious model. Indeed, these parameters generate substantial anomalies that are difficult to reconcile with their purported status. Elsewhere, A.R. Jensen (1987a) has claimed that:

Since sdDT increases linearly as a function of set size, with increasing set size one should predict a greater opportunity for individual differences in the oscillation process hypothesized to underlie both sdDT and g to be increasingly manifested, thereby making for a monotonically increasing (negative) correlation between sdDT and g as a function of set size. This prediction [has] not been borne out in the least. . . This is the only really substantial anomaly for the oscillation model. (p. 166)

Findings in the present study constitute further anomalies for the "neural oscillation" model that A.R. Jensen (1987a) has proposed. Indeed, within the individual differences literature two theories (the "neural oscillation" model and a theory that posits errors in neural transmission) have been based on the correlation that sdDT parameters share with intelligence indices.

Neural Oscillation. This model involves the notion that "neural oscillation" may be an important physiological process underlying intelligence (A.R. Jensen 1979, 1992a). Importantly, several well-known physiologists subscribe to a similar concept, including Francis Crick (1993). However, it remains unclear why neural oscillation needs to be connected to measures of intraindividual variability in processes linked to speed. It seems equally plausible that neural oscillation be associated with measures of speech production or some other qualitative aspect of cognitive performance. Drawing an analogy, "faster" computers perform more operations at greater speed but are nonetheless prone to make errors. Good thinking implies making fewer errors, and speed cannot always be beneficial in this regard. Perhaps better image resolution or finer sound reproduction can be achieved with faster computers; speed of doing computations is just one consequence of faster neural oscillation, and this type of mechanism is unlikely to be the only function affected by oscillations. The operation of a cognitive system depends both on the speed of neural oscillation and on the wiring of its various subcomponents (Rabbitt 1994; Stankov & Roberts 1997). In and of themselves, neural oscillations do not determine the efficiency of a given cognitive system; conceivably, performance efficiency also depends critically on many hardware and software features (see Sternberg 1986).

Errors in Neural Transmission. This model states that "noise" within the information-processing system (due to errors in synaptic transmission [H.J. Eysenck 1987a]) is the critical component contributing to individual differences in intelligence. Because good thinking depends on being able to make as few mistakes as possible, this explanation of the link between measures of intraindividual variability and intelligence has some intuitive appeal. However, in a recent study (Stankov et al. 1994) and in reanalyses conducted by Carroll (1993, pp. 481-485), the correlation between measures of central tendency and intraindividual variability in performance was found to be high. This was also evident in the present data set. As a result, it would seem injudicious to claim that these processes are in any way different. Findings of the present study reinforce this view (see also Brody 1992, pp. 52-53). The failure to establish correlations between sdMT and psychometric measures should also cause some concern for the proponents of this theoretical position.
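The prediction quoted above is straightforward to operationalize. The sketch below, using simulated data (the set sizes, trial counts, and generating model are arbitrary assumptions, and N = 179 is borrowed only as a sample size), computes each participant's sdDT at every set size and prints its correlation with an ability score; this is the series of correlations that the oscillation account expects to become monotonically more negative.

```python
# Illustrative sketch (simulated data, not the study's): the series of correlations
# r(sdDT, g) across set sizes that the oscillation model predicts should grow
# monotonically more negative.
import numpy as np

rng = np.random.default_rng(1)
n_people, n_trials = 179, 60
set_sizes = [1, 2, 4, 8]                      # hypothetical ECT conditions

g = rng.normal(0, 1, n_people)                # stand-in ability score
for n_alt in set_sizes:
    # Hypothetical generating model: each person's trial-to-trial variability grows
    # with set size and is weakly (negatively) related to ability.
    sd_true = np.clip(40 + 10 * np.log2(n_alt) - 8 * g + rng.normal(0, 15, n_people), 5, None)
    trials = rng.normal(0, sd_true[:, None], size=(n_people, n_trials))
    sd_dt = trials.std(axis=1, ddof=1)        # intraindividual sdDT for this condition
    print(f"set size {n_alt}: r(sdDT, g) = {np.corrcoef(sd_dt, g)[0, 1]:+.2f}")
```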

Explanatory Models Relevant for Measures of Intraindividual Differences in Central Tendency.

It has been observed in previous research that measures of median DT are reliable (e.g., Widaman & Carlson 1989). The extent to which this was evidenced across the variety of ECTs examined in this study is similarly impressive (see Roberts 1999b). Note also that each measure of central tendency conformed to various lawful principles, such that correlations generally increased across the conditions of each ECT and there was a high degree of conformity to Hick's law in those instances where group data were analyzed. Furthermore, consideration of the factor structure of processing-speed factors in relation to "traditional" cognitive abilities suggests that, aside from DT, most performance-speed factors obtained in this study are independent of level abilities. In this section, attention is given to two seemingly viable explanations that are evaluated in light of the present results.

Decision Time (DT) as a "Biological" Correlate of Gf. As noted earlier, part of the resurgent interest in speed-of-information-processing measures has emanated from the assumption that they reflect a biological property of the organism (e.g., H.J. Eysenck 1987a). Current findings might be construed as complementary to this account. Of all the cognitive abilities sampled in this study, Gf would appear to be most intimately linked to the biological "hardware" of the individual, and it alone shares a substantial correlation with DT. Within this context, acknowledgment should be given to the distinction made between "vulnerable" and "maintained" abilities within Gf/Gc theory (Horn & Hofer 1992). Vulnerable abilities, such as Gf, decline first and foremost with age in adulthood and are also most affected by damage to the central nervous system. These abilities stand in contrast to maintained abilities, such as Gc, which may initially be affected by brain damage but tend to return to premorbid levels over the passage of time. However, it appears that speed of information processing per se cannot account for the increase in correlation with Gf observed in tasks involving differential degrees of cognitive complexity. Rather, the complexity of a chronometric task derives (at least in part) from the degree of compatibility evident in the S-R mapping. Kornblum et al. (1990) provide a model accounting for variable response speeds under differing degrees of compatibility. In this dimensional-overlap model they implicate attention, automaticity, and biological processes. Based on available evidence, the biological mechanisms underlying more complex tasks appear to be different from those involved in "simple" RT paradigms (see Georgopoulos et al. 1989). If correlations with intelligence (i.e., Gf) are obtained in increasingly complex tasks, the question must be asked as to whether (or not) attempting to formulate a link with biology purely through processing speed is justified.

The above also suggests a possible reason for the inconclusive results obtained in research linking nerve conduction velocity (NCV) to intelligence (e.g., Reed & A.R. Jensen 1991, 1993). Within this research paradigm, psychologists remain puzzled as to why there is no relationship between RT and NCV, although there are moderate correlations between RT and intelligence and between NCV and intelligence. Plausibly, correlations between RT and NCV might be higher if RT paradigms were utilized that involved lower S-R compatibility.
Note, however, that even this explanation is not entirely satisfactory, given that NCV appears much the same in monkeys (and still lower animals) as it does in humans, making it difficult to envisage it sharing a correlation with cognitive abilities within Homo sapiens (Stankov & Roberts, 1997).

Attention, DT, and Gf. An aspect of this study mentioned only briefly is that it contained within its design conditions that allowed for a rigorous investigation of divided attention and its relationship to broad cognitive abilities (Roberts 1995). For this purpose, two paradigms (card-sorting and word-classification) were presented under single and competing task conditions and tested for quantitative structure. It was expected that the competing condition would show a higher correlation with measures of intelligence than the single-task condition. This expectation was partially supported in the present study: card-sorting (competing) DT had a higher correlation with Gf than did card-sorting (single) DT (see Tables 17 and 18). It was also envisaged that, having demonstrated the empirically testable quantitative property of this variable (see Stankov & Cregan 1993), correlations with cognitive abilities might increase systematically with attentional load. However, correlations across established quantitative dimensions of attentional deployment (although moderate) did not increase in the lawful way observed previously by Roberts et al. (1988). One possible explanation for this resides in the demonstrated high S-R compatibility of the word-classification task. Elsewhere, Kornblum et al. (1990) note the importance of both automaticity and attention in tasks involving high S-R compatibility (i.e., dimensional overlap), where automaticity occurs within and between stages:

Automaticity within stages means that the recoding or transformations inside any stage occur immediately and in a preset way without any interference or intervention by monitoring or controlling processes. Automaticity between stages means that the output of any one stage is directly transmitted to and received by the subsequent stages without interference or intervention. When there is no dimensional overlap, of course, no congruent response is defined and the automatic process is nonexistent. When there is dimensional overlap, however, the activation process is automatically brought into play with the final level of readiness, or activation, of the congruent response postulated to vary with the degree of dimensional overlap. (p. 262)

In turn, Kornblum et al.'s (1990) dimensional-overlap model links each of the RT tasks of the present study to differential degrees of automaticity within and between the processing stages of response identification, selection, and programming (see Figure 1). Although this might ultimately provide a theoretically rigorous explanation of the higher correlation found between DT and Gf (for example, in the card-sorting paradigm [low S-R compatibility] relative to the Joystick Reaction Task [high S-R compatibility]), it should be noted that the present study was not designed explicitly for this purpose. A study relating the nine types of S-R ensemble provided by Kornblum (1994) to Gf/Gc theory would clearly be judicious before making definitive statements. Notwithstanding, a recent investigation by Roberts (1997b), in which S-R compatibility was manipulated by having participants respond to arrows pointing up, down, left, or right, provides preliminary evidence (N = 151) for the dimensional-overlap model and its explanatory potential for understanding human cognitive abilities.

In a compatible condition, participants were required to respond with letters corresponding to the direction of an arrowhead (i.e., U for up, D for down, R for right, and L for left). In an incompatible condition, participants were required to respond with the letter corresponding to the opposite direction of each arrowhead (i.e., D for up, U for down, L for right, and R for left). The correlations with markers of fluid intelligence were near zero (r = 0.01) in the compatible condition and quite high (r = 0.31) in the incompatible condition. Moreover, while the incompatible task shared salient loadings on a Gf factor, the compatible task shared loadings on a factor interpreted as Gs.
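The paper does not report a direct test of the difference between these two coefficients, but a rough sense of it can be had from a standard Fisher r-to-z comparison, sketched below. Treating the two correlations as if they came from independent samples is a simplification: both tasks were completed by the same N = 151 participants, so a dependent-correlations test (e.g., Steiger's) would need the intercorrelation of the two task measures, which is not given here.

```python
# Illustrative sketch only: Fisher r-to-z comparison of r = .31 versus r = .01
# (both with N = 151), under the simplifying assumption of independent samples.
import math

def fisher_z(r):
    """Fisher r-to-z transform."""
    return 0.5 * math.log((1 + r) / (1 - r))

def compare_independent_rs(r1, n1, r2, n2):
    """z statistic for the difference between two independent correlations."""
    se = math.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))
    return (fisher_z(r1) - fisher_z(r2)) / se

z = compare_independent_rs(0.31, 151, 0.01, 151)
print(f"z = {z:.2f}")   # about 2.7, i.e., p < .01 (two-tailed) under this approximation
```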

A WORKING MODEL FOR SPEED-ABILITY RELATIONSHIPS

Biological accounts suggesting that the correlation between speed and ability can be attributed to such causes as neural oscillation, errors in neural transmission, or nerve conduction velocity clearly need to be rejected. In their place, it is proposed that a major role be given to a psychological theory that combines the attentional features of automaticity and control and has a biological link (i.e., to S-R compatibility effects). The status of these constructs within a complete model of human abilities can be understood in terms of the following analogy:

Cognitive Complexity : Gf :: S-R Compatibility : Gt.

This analogy illustrates an important relationship between theoretical constructs (cognitive complexity and S-R compatibility) on the one hand, and individual-differences constructs (accuracy-based Gf and time-based Gt) on the other. The analogy does not yet constitute a theory; it is best described as a working model, and further empirical data will need to be collected before it can approach the status of theory. What relationships within the analogy are well established? What relationships require further empirical evidence?

The binary link on the left-hand side is well established: intelligence has been defined in terms of dealing with the complexity of stimulus patterns (see R.B. Cattell 1950; Guttman 1992; Thomson 1939/1948). Some of the more recent demonstrations of the role of the complexity of mental operations in intelligence are provided by Stankov and his associates (see Stankov 1994; Stankov & Crawford 1993; Stankov & Cregan 1993; Stankov & Raykov 1995). Within this program of research, the complexity of a task depends on the number (and diversity) of elementary cognitive operations that are needed to arrive at a successful solution.

The relationship on the right-hand side has also gathered some empirical support: timed measures of performance are not only sensitive to variations in S-R compatibility, in the sense that more incompatible tasks take longer (Kornblum et al. 1990), but also display a simplex pattern across levels of compatibility (Roberts 1997b). Thus, individual differences in mental speed are influenced by S-R compatibility manipulations. Additional research, examining the properties of the tasks developed under Kornblum's (1994) dimensional-overlap model, is necessary to understand this relationship more fully.

What is the relationship between cognitive complexity and S-R compatibility? It would appear that changes in S-R compatibility represent a complexity manipulation so far ignored in the literature.

Again, further empirical research is needed to examine this relationship more closely. Note, however, that viewing dimensional overlap as implicating differential degrees of complexity (a) allows insight into the cognitive processes underlying individual differences and (b) provides a taxonomic model to which the complexity of a given RT task may be referred. This latter aspect of the model obviates the largely atheoretical approach to the complexity construct that previously prevailed when chronometric tasks were employed. It is crucial for the reader to keep in mind that S-R compatibility effects represent a complexity manipulation relevant for aspects of mental speed, and DT in particular. It is just one of several instances of a complexity manipulation that may or may not be related to some other kind of complexity manipulation. Thus, the Swaps test, used initially by Stankov and Crawford (1993), is based on increasing the number of permutations of three letters at different levels of task difficulty. Accuracy scores from consecutively more difficult levels also exhibit increasingly greater correlation with measures of fluid intelligence (i.e., they are more complex). Interestingly, timed measures from this task (i.e., timed measures of level abilities analogous to those presented in Table 7 of this article) usually do not result in increasing correlation with external measures of intelligence. It seems reasonable to assume that S-R compatibility effects and the complexity evident in the Swaps test represent different types of the same underlying construct, namely cognitive complexity. By limiting S-R compatibility effects to mental speed, or a subset thereof (e.g., DT), it is possible, at least in principle, to account for (a) the existence of a separate Gt factor and (b) some of the reported correlations between timed measures and fluid intelligence (Gf). Conceivably, some types of complexity manipulation that are clearly relevant to Gf may be closely linked to S-R compatibility effects whereas others may show no such relationship at all.
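The simplex claim invoked above for timed measures is easy to check mechanically: in a simplex, correlations among the ordered conditions fall off as the conditions lie further apart. The sketch below applies such a check to a made-up correlation matrix; the matrix, the four compatibility levels, and the tolerance parameter are illustrative assumptions, not data from Roberts (1997b).

```python
# Illustrative sketch: testing whether a correlation matrix of timed measures,
# ordered by (hypothetical) level of S-R compatibility, shows a simplex pattern,
# i.e., correlations never increase as conditions lie further apart in the ordering.
import numpy as np

R = np.array([                 # made-up 4 x 4 correlation matrix, for illustration only
    [1.00, 0.72, 0.55, 0.41],
    [0.72, 1.00, 0.70, 0.56],
    [0.55, 0.70, 1.00, 0.73],
    [0.41, 0.56, 0.73, 1.00],
])

def is_simplex(R, tol=0.0):
    """True if, for every row, nearer conditions never correlate lower than farther ones."""
    k = R.shape[0]
    for i in range(k):
        for j in range(k):
            for m in range(k):
                if abs(i - j) < abs(i - m) and R[i, j] + tol < R[i, m]:
                    return False
    return True

print(is_simplex(R))           # True for the matrix above
```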

CONCLUSION

In a recent article, Carroll (1995) has attempted to establish the frames of reference for methodology in the study of individual differences. Carroll argues that the goal of this subdiscipline is the exploration of "the diversity of intellect in the people of this planet – the many forms of cognitive processes and operations, mental performances, and creations of knowledge and art" (p. 429). To achieve this end, Carroll goes on to argue, untapped domains of mental activity must be discovered and encapsulated within a comprehensive model. It is clear from the preceding survey of the literature that insufficient attention has been given to delineating a taxonomy of mental speed, although "it is a basic fact of mental life that much cognitive processing is temporally limited" (Kail 1995, p. 79). The taxonomic model of mental-processing speed currently proposed should be considered a step toward redressing this imbalance.

Evidence presented throughout this report also suggests that it is time for individual differences research to take stock of its attempts to derive explanatory models of human intelligence. Arguably, the future success of this pivotal undertaking, and perhaps of the field as a whole, rests on two complementary directions. First, it would appear important to relate cognitive models to the various first- and second-order abilities rather than to the questionable construct of psychometric g. Second, rather than capitalize on cognitive models that are worn and outdated, the individual differences researcher should invest wisely in current cognitive theory. Examining the dimensional-overlap model and its relationship to fluid intelligence constitutes one such endeavor meeting these two criteria. It is to be hoped that other (still newer) cognitive models will be applied not only to this factor but also to the many others embedded within Gf/Gc theory (see Ackerman 1996).

NOTES

1. Part of this research was based on the doctoral dissertation of the first author under the supervision of the second author at the University of Sydney, Sydney, Australia. However, a significant portion of this study was written while the first author was working under the auspices of the National Research Council (NRC) and the USAF Air Force Research Laboratory at Brooks AFB, Texas. Thus, we wish to acknowledge the support of all three institutions. We would also like to thank a number of individuals (in particular, Scott Chaiken, Earl Hunt, Martin Ippel, Nick Karadimis, Patrick Kyllonen, Austin Prichard-Levy, Brett Myors, Klaus Oberauer, Gerry Pallier, Vanessa Danthiir, Agnes Petocz, and Steve Watson) either for help with test design and construction or for thoughtful comments on an earlier draft of the manuscript.

2. It would appear a little-known fact that a year earlier Hick (1951) attempted to apply information-theory principles to model the concept of intelligence directly.

3. Even allowing that several studies have attempted to assess factors other than psychometric g, it would appear that this research has been limited to fluid and crystallized intelligence constructs, not the full list of broad cognitive factors.

4. Although accuracy measures were obtained on all of these ECTs, very few participants made more than a handful of errors in any task (i.e., accuracy rates approached 99% across all conditions of most ECTs). Thus, these data are not presently considered. One exception to this rule (the Tachistoscopic Choice Reaction Task) requires a brief comment. Here accuracy rates were not subject to ceiling effects because of the experimental manipulation that was undertaken. In fact, in this task number correct was found to obey Hick's law (Roberts 1995). Because this outcome is peripheral to the aims of the present study, the results obtained with this parameter are not reported (even) for this ECT (see, however, Roberts 1999b).

5. Roberts, Pallier, and Stankov (1996) provide a particularly detailed commentary and analysis of the subtests comprising Test 36 and their relationship to all other variables assessed in the present study. As a result, the parameters obtained from Test 36 are not generally discussed in the current study. Note, however, that it is included in the design because both parameters were included in various factor analyses of mental-speed constructs. The reader interested in understanding this ECT's microstructure and/or relationship to broad cognitive ability factors is referred to Roberts et al. (1996).

6. Carroll (1993, p. 441ff.) has, in making the distinction between number-correct and time measures, referred to the former as a measure of "level" and the latter as a measure of "speed." This distinction, which is borrowed from E.L. Thorndike, Bregman, Cobb, and Woodyard (1926), is acknowledged here, as it distinguishes these sets of analyses from those presented in subsequent analyses of the current investigation. Even so, measures of Gs have been included in the current data set because of their frequent presence in the literature alongside various level abilities.

7. The scoring of Test 8 (i.e., Water Jars) is somewhat unconventional in that the measure obtained is intended to reflect participants' ability to rid themselves of an induced mental set. If this is spontaneous, the minimum number of steps the participant would perform is 25; the closer to this value, the better the performance. Summary statistics for the Water Jars Test presented in Table 2 are to be interpreted relative to this qualification: low scores indicate better performance.

8. Throughout all present analyses, based on a sample size of N = 179, a correlation of r = ±0.173 corresponds to a probability of p = 0.01 and a correlation of r = ±0.229 corresponds to a probability of p = 0.001.

9. The decision to use exploratory over confirmatory factor-analytic techniques was motivated by the desire to provide data analogous to those provided in Carroll (1993). Note that while there are good reasons for using confirmatory solutions in contemporary data analysis, Carroll's writings provide a number of rationales for exploratory factor-analytic procedures.

10. In regard to this particular interpretation it should also be noted that within Carroll's (1993) extensive meta-analysis of the psychometric literature "no data set examined . . . yielded any factor that could be identified as a factor of 'working memory capacity'" (p. 303).

11. Elsewhere, several commentators have argued that increased complexity arising through competing tasks implicates processes that are central to fluid intelligence alone (see, e.g., Roberts et al. 1991a; Stankov 1983a; Stankov & Crawford 1993).

12. In outlining a three-stratum theory of intelligence, Carroll (1993, p. 647ff.) cautions researchers against making specious claims on the basis of insufficient knowledge of hierarchical factors (i.e., Stratum III within his model). To highlight the problems inherent in this approach he reanalyzed a data set provided by Palmer et al. (1985), who argued for a general factor when it was clearly not warranted. The point of Carroll's warning is well taken in the present analysis, but even more so in light of chronometric studies that define g as the first principal component extracted from a large and diverse battery of cognitive-ability tests (e.g., A.R. Jensen 1979, 1991, 1993a). Such a factor, when extracted from the 25 variables of the present study, while having somewhat higher loadings (as expected from principal components analysis), resembles in its factor pattern the results presented in Table 6. Indeed, with regard to this "general" factor, individuals' regression factor scores (obtained from the first principal components analysis) correlate 0.93 with their Bartlett factor scores (extracted from hierarchical factoring of the seven second-order abilities). In light of the evidence presented in these analyses, however, one would be guarded in making substantive claims pertaining to the cognitive correlates of the 'g' extracted in the present study.
This of course begs the question: "How general is the g extracted from study to study?" (see also Horn 1985, 1998; Horn & Noll 1994; Roberts et al. 1999b).

13. It should not go unnoticed that until recently the status of CDS in this model was somewhat speculative. Moreover, even now the distinction between CDS and Gs is largely arbitrary, being related to the difficulty of the task, such that Gs is "best indicated by simple tasks in which almost all people would get the right answer if the task were not highly speeded" (Horn & Hofer 1992, p. 64), whereas CDS is measured as "quickness in providing answers in tasks that require one to think" (Horn & Noll 1994, p. 13).

14. It should be noted that times for the Water Jars Test were not included in these analyses because the output measure obtained in this psychometric test does not correspond to what might strictly be referred to as a level measure (see note 7).

15. As mentioned earlier, the measures of Gs in the present study (average time per item) represent a change in the way this factor has "traditionally" been assessed. In the present case, participants potentially have unlimited time to solve each problem.

16. Within this schema the previously identified Gs factor should probably be redefined as Ts, with the s now standing for "search." However, we retain its earlier nomenclature because of the extensive literature upon which it has previously been based.

17. This factor shares some parallels with a factor that Carroll (1993) has tentatively interpreted as Speed of Reasoning (p. 463), which, in turn, is thought to constitute a primary factor of fluid intelligence (see p. 626). However, as will be demonstrated shortly, the present factor is minimally correlated with Gf.

18. Within this context it should also be pointed out that this result is completely the reverse of that obtained with RT tasks; they typically share higher correlations with level measures as they increase in complexity. One viable explanation for this outcome is that the complexity of difficult RTs is analogous to the complexity of simple speed measures from psychometric tasks. Aligned on a complexity continuum for all speed measures, they represent qualitatively distinct tasks of similar intermediate difficulty. This proposition is tested in Roberts (1995). It entailed observing significant positive correlations between complex RTs and "speed" measures such as Gs and the simpler test-taking times, and insignificant correlations between these speed measures and both simple RT tasks and more complex forms of speed of test-taking.

19. ECTs administered on a "face-to-face" basis resulted in more global measures of processing speed. For example, the card-sorting parameters were derived from the average number of cards sorted per unit of time. As a consequence, these processing-speed measures were based upon the mean (rather than the median) index of performance.

20. We borrow this term from several authors who have employed it to describe the chronometric apparatus most frequently used in individual differences research. This consists of a home button around which eight lights are arranged in a semi-circular array. It should be noted, however, that Roth's (1964) principal contribution was to use information-theory principles (i.e., Hick's law) in intelligence research. The actual apparatus he used was extremely crude, even by the standards of the day (see A.R. Jensen 1982a, p. 100, for a brief description). A.R. Jensen's contribution, in contrast, would appear to be the more precise and technically sophisticated development of the apparatus used to measure parameters of RT. However, an identical task consisting of "a home key and a semi-circle of lights with response buttons directly beneath them" (Kaufman & Levy 1966, p. 967) was developed in America as early as 1965 (see Lamb & Kaufman 1965). This fact is not acknowledged in any of the contemporary individual differences literature.

21. On the basis of this account, a motivational hypothesis cannot be ignored. Perhaps in having to perform this task on so many occasions, some participants simply became bored, in turn attenuating the obtained correlations.
22. A.R. Jensen (1987a) discusses this construct in his meta-analysis, concluding that the apparatus he has most frequently used has high S-R compatibility. Given studies that have demonstrated (a) that this paradigm is open to practice effects (e.g., Widaman & Carlson 1989), (b) that it is possible to develop tasks having considerably higher compatibility (e.g., A.R. Jensen 1987a), and (c) that tasks having high S-R compatibility conform to Longstreth et al.'s power law (rather than Hick's law) (see Roberts 1999b), there would appear to be no grounds for this assumption per se.

23. Matthews and Dorn (1989) label one of the tasks used in their study an incompatible response task, qualifying this claim later by noting that "this is a somewhat atypical form of S-R compatibility, since it depends on habit reversal rather than on any intrinsically incompatible S-R mapping" (p. 305).

24. Several authors have commented on the interaction between S-R compatibility and the number of stimulus alternatives, all of which indicates the importance of the former feature in the current tasks. For example, Broadbent (1971) notes that "the degree of naturalness or obviousness of the response appropriate to a particular signal alters the effect of the number of alternatives" (p. 282). Equally, Theios (1975) states that "if the response code is similar to the name code (or highly practiced) then the response time determination is small and relatively independent of . . . the number of alternatives" (p. 242).

25. We do not believe this criticism should be reserved simply for information-theory approaches to ECTs, as it appears to be also true of disparate theories that make use of the Saul Sternberg scanning paradigm, the Stroop paradigm, and many others. The dimensional-overlap model explicated within these passages would appear to provide a unifying taxonomy and theoretical model for each of these types of effect (see Kornblum 1994; Kornblum et al. 1990).

26. These outcomes actually occur within the same task (i.e., Test 30), where the lower correlation emanates from the one-bit condition, the higher from the three-bit condition. All other correlations within this task are closer to the lower range except the zero-bit condition, where the correlation is moderate (r = -0.36). This poses a further problem in interpreting the intraindividual variability construct.

27. Even so, some caution should be exercised in interpreting the results that follow, largely because in the present study psychometric measures were originally selected within the framework of a higher-stratum design. It is likely that psychometric speed factors are broader at the first order of analysis than would be the case for either the DT or MT factors.

28. Horn and Hofer's (1992) account of the speed factors within Gf/Gc theory suggests two broad speed factors tied to psychometric tasks: CDS (akin to the two test-taking time factors extracted in this study) and Gs.

29. The interpretation of culled data is, irrespective of the present outcome, more problematic than its advocates would have readers believe. The problem rests with the conclusions that may be drawn. Any inferences become limited to a subsample of the population (i.e., culling restricts the population with respect to intelligence measures). This state of affairs represents something of a trade-off between theoretical explanation and generality. In other words, theories should apply to the intelligence of the people, not to a subset who fit a theoretical model.

REFERENCES

Ackerman, P.L. (1996). "A theory of adult intellectual development: Process, personality, interests, and knowledge." Intelligence, 22, 227-257. Agrawal, R. & A. Kumar. (1993). "The relationship between intelligence and reaction time as a function of task and person variables." Personality and Individual Differences, 14, 287-288. Anstey, K. (1997). Predictors of cognitive performance in old age: The role of contextual, cognitive, sensorimotor and physiological variables. Unpublished Ph.D. thesis, The University of Queensland, Brisbane, Australia. Attneave, F. (1959). Applications of information theory to psychology. New York: Holt, Rinehart & Winston.

Barratt, P.E.H. (1953). “Imagery and thinking.” Australian Journal ofPsy~hology, 5,154-164. Barratt, P.E.H. (1956). “The role of factors in ability theory.” Australian Journal OfPsych0@/, 8,93-105. Barrett, P., H.J. Eysenck, & S. Lucking. (1986). “Reaction time and intelligence: A replicated study.“ Intelligence, 10,940. Bates, T.C., & H.J. Eysenck. (1993). “Intelligence, inspection time, and decision time.” Intelligence, 17,523-531. Bayley, N. (1949). “Consistency and variability in the growth of intelligence from birth to eighteen years.” Journal of Genetic l+yckology, 75,165-196. Beh, H.C., R.D. Roberts, & T. Pearse. (1491). Processing speed and intelligence: A developmental perspective. Paper presented at the 18th Annual Experimental Psychology Conference, Flinders University, South Australia, September. Beh, H.C., R.D. Roberts, & A. Prichard-Levy. (1994). “The relationship between intelligence and choice reaction time within the framework of an extended model of Hick’s law: A preliminary report.“ Personality and Individual Diflerences, 16,891-897. Berger, M. (1982). “The ‘scientific approach’ to intelligence: An overview of its history with special reference to mental speed.” Pp. 1343 in A modelfor intelligence, edited by H.J. Eysenck. New York: Springer-Verlag. Binet, A., & T. Simon. (1905a). “Sur la necessite d’etablir un diagnostic scientifique des etats inferieurs de l’intelligence.” L’Annee Psyckologique, 21,163-190. Binet, A., & T. Simon. (1905b). “Methodes nouvelles pour le diagnostic du niveau intellectuel des anormaux.” L’Annee Psyckologique, 12,191-244. Binet, A., & T. Simon. (1983). The development of intelligence in children. Salem, NH: Clyer. (Original work published 1916) Bittner, A.C. Jr., R.C. Carter, R.S. Kennedy, M.M. Harbeson, & M. Krause. (1986). “Performance evaluation tests for environmental research (PETER): Evaluation of 114 measures.” Perceptual and Motor Skills, 63,683-708. Bors, D.A., & B. Forrin. (1995). “Age, speed of information processing, recall and fluid intelligence.” Intelligence, 20, 229-248. Bors, D.A., C.M. MacLeod, & B. Forrin. (1993). “Eliminating the IQ-RT correlation by elirninating an experimental confound.” Intelligence, 17,475-500. Bowling, A.C., & B.D. Mackenzie. (1996). “The relationship between speed of information processing and cognitive ability.“ Personality and individual Differences, 20,775-800. Boyle, G.J., L. Stankov, & R.B. Cattell. (1995). “Measurement and statistical models in the study of personality and intelligence.” Pp. 417446 in International handbook of personality 6 intelligence, edited by D. Saklofske & M. Zeidner. New York: Plenum Publishing. Brainard, R.W., T.S. Irby, P.M. Fitts, & E.A. Alluisi. (1962). “Some variables influencing the rate of gain of information.” journal of Experimental Psychology, 63,105-110. Brand, C.R. (1987). “The importance of general intelligence.” Pp. 251-265 in Arthur Jensen: Consensus and controversy, edited by S. Modgil & C. Modgil. London: Falmers Press. Brand, C.R. (1996a). “Doing something about g. Intelligence, 22,311-326. Brand, C.R. (199613). The g factor. Unpublished manuscript. Bricker, P.D. (1955a). “The identification of redundant stimulus patterns.” Journal of Experimental Psychology, 49,73-81. Bricker, P.D. (1955b). “Information measurement and choice time: A review.” Pp. 160-172 in Information theory in psychology, edited by H. Quastler. Glencoe, IL: The Free Press. Broadbent, D.E. (1971). Decision and stress. New York: Academic Press. Brody, N. (1992). Intelligence (2nd ed.). 
New York: Academic Press. Buckhalt, J.A., T.G. Reeve, & L.A. Dornier. (1990). "Correlations of movement time and intelligence: Effects of simplifying response requirements." Intelligence, 14, 481-491.

Burt, CR. (1954). “Differentiation of intellectual ability.” British ]ournal of Educational Psychology, 24, 76-90. Carlson, J.S., & C.M. Jensen. (1982). “Reaction time, movement time and intelligence: A replication and extension.” Infelligence, 6,265-274. Carlson, J.S., C.M. Jensen, & K.F. Widaman. (1983). “Reaction time, intelligence and attention.“ Intelligence, 7, 329-344. Carlson, J.S., & K.F. Widaman. (1987). “Elementary cognitive correlates of g: Progress and prospects.“ Pp. 69-100 in Speed of information-processing and intelligence, edited by P.A. Vernon. Norwood, NJ: Ablex. Carroll, J.B. (1978). “How shall we study individual differences in cognitive abilities?Methodological and theoretical perspectives.” Intelligence, 2,87-115. Carroll, J.B. (1981). “Ability and task difficulty in cognitive psychology.” Educational Researcher, 2 0, 1 l-21. Carroll, J.B. (1987). “Jensen’s mental chronometry: Some comments and questions. With a reply to Professor Eysenck.” Pp. 297-307 in Arthur lensen: Consensus and controversy, edited by S. Modgil & C. Modgil. London: Falmers Press. Carroll, J.B. (1993). Human cognitive abilities: A survey of factor-analytic studies. New York: Cambridge University Press. Carroll, J.B. (1995). “On methodology in the study of cognitive abilities.” Multivariate Behavioral Research, 30,429452. Case, R. (1995). “Capacity-based explanations of working memory growth.” Pp. 2344 in Memory performance and competencies: Issues in growth and development, edited by F.E. Weinert & W. Scheider. Mahwah, NJ: Erlbaum. Cattell, J. McKeen (1890). “Mental tests and measurements.” Mind, 15,373-380. Cattell, R.B. (1950). Personality: A systematic andfactual study. New York: McGraw-Hill. Cattell, R.B. (1963). “Theory of fluid and crystallized inteligence: A critical experiment.” Journal of Educational Psychology, 54, l-22. Cattell, R.B. (1966). “The screen test for the number of factors.” Multivariate Behavioral Research, 1,245-276. Cattell, R.B. (1971). Abilities: Their structure, growth and measurement. Boston: Houghton Mifflin. Cattell, R.B. (1982). The inheritance of personality and ability. New York: Academic Press. Cattell, R.B. (1987). Intelligence: Its structure, growth and uction. Amsterdam: North-Holland. Cattell, R.B., & J.L. Horn. (1978). “A check on the theory of fluid and crystallized intelligence with description of new test designs.“ lournal of Educational Measurement, 15, 139-164. Cohen, J.D., K. Dunbar, & J.L. McClelland. (1990). “On the control of automatic processes: A parallel distributed processing account of the Stroop effect.” Psychologicul Review, 97,332-361. Cohn, G.J., J.S. Carlson, & A.R. Jensen. (1985). “Speed of information processing in academically gifted youths.” Personality and Individual Differences, 6,621-629. Cornelius, SW., S.L. Willis, J.R. Nesselroade, & P.B. Baltes. (1983). “Convergence between attention variables and factors of psychometric intelligence in older adults.” Infelligence, 7,253-269. Crawford, J.D. (1988). Intelligence, task complexity and tests of sustained attention. Unpublished Ph.D. dissertation. University of New South Wales, Sydney, Australia. Crawford, J.D. (1991). “The relationship between tests of sustained attention and fluid intelligence.” Personality and Individual Differences, 12,599-611. Crick, F. (1993). Astonishing hypothesis: The scientific search for the soul. New York: Maxwell Macmillan. Cricket Software Inc. (1991). Cricket graph [Versoin 2.3.11. Malvem, PA: Cricket Software.

Crossman, E.R.F.W. (1953). “Entropy and choice reaction time: The effect of frequency imbalance on choice-response.” The Quarterly journal of Experimental Psychology, 5,41-51. Davies, M., L. Stankov, & R.D. Roberts. (1998). “‘Emotional Intelligence’: In search of an elusive construct.” Journal of Personality and Social Psychology. In press. Deary, I.J., V. Egan, G.J. Gibson, E.J. Austin, C.R. Brand, T. Kellaghan. (1996). “Intelligence and the differentiation hypothesis.” Intelligence, 23,105-132. DeJong, R., C.-C. Liang, & E. Lauber. (1994). “Conditional and unconditional automaticity: A dual-process model of effects of spatial-response correspondence.” Journal of Experimental Psychology: Human Perception and Performance, 20,731-750. Detterman, D.K. (1982). “Does g exist?” Intelligence, 6,99-108. Detterman, D.K. (1987). “What does reaction time tell us about intelligence?” Pp. 177-200 in Speed of information-processing and intelligence, edited by P.A. Vernon. Norwood, NJ: Ablex. Diascro, M.N., & N. Brody. (1994). “Odd-man-out and intelligence.” Intelligence, 29,79-92. Drake, R.M. (1954). Manual for Drake musical aptitude tests. Chicago: University of Chicago Press. Draycott, S.G., & I’. Kline. (1994a). “Speed and ability: A research note.“ Personality and Individual Difierences, 17,763-768. Draycott, S.G., & I’. Kline. (1994b). “Further investigations into the nature of the BIP: A factor analysis of the BIP with primary mental abilities.” Personality and lndividual Differences, 17,201-209. Eimer, M. (1995). “Stimulus-response compatibility and automatic response activation: Evidence from psychophysiological studies.” Journal of Experimental Psychology: Human Perception and Performance, 21,837-854. Ekstrom, R.B., J.W. French, & H.H. Harman, with D. Dermen (1976). Manual for Kit ofFactor-Referenced Cognitive Tests, 2976. Princeton, NJ: Educational Testing Service. Ekstrom, R.B., J.W. French, & H.H. Harman. (1979). “Cognitive factors: Their identification and replication.“ Multivariate Behavioral Research Monographs, No. 2. Eysenck, H.J. (1939). “Review of Thurstone’s primary mental abilities, 1938.” British Journal of Educational Psychology, 9,270-275. Eysenck, H.J. (1967). “Intelligence assessment: A theoretical and experimental approach.” British Journal of Educational Psychology, 37,81-98. Eysenck, H.J. (Ed.) (1973). The measurement ofintelligence. Baltimore: Williams & Wilkins. Eysenck, H.J. (1984). “The place of individual differences in a scientific psychology.” Pp. 233-285 in Annals of theoretical psychology, edited by J.R. Royce & L.P. Mos. New York: Plenum Press. Eysenck, H.J. (1986). “Toward a new model of intelligence.” Personality and individual Differences, 7,731-736. Eysenck, H.J. (1987a). “Speed of information processing, reaction time, and the theory of intelligence.“ Pp. 21-68 in Speed of information-processing and intelligence, edited by P.A. Vernon. Norwood, NJ: Ablex. Eysenck, H.J. (1987b). “Intelligence and reaction time: The contribution of Arthur Jensen. With a reply to Professor Carroll.” Pp. 285-295 in Arthur Jensen: Consensus and controversy, edited by S. Modgil & C. Modgil. London: Falmers Press. Eysenck, H.J. (1995). “Can we study intelligence using the experimental method?” Intelligence, 20,217-228. Eysenck, M.W. (1982). Attention and arousal: Cognition and performance. Berlin Springer-Verlag. Fitts, P.M. (1954). “The information capacity of the human motor system in controlling the amplitude of movement.” Journal of Experimental Psychology, 47,381-391. 
Fitts, P.M., & R.L. Deininger. (1954). “S-R compatibility: Correspondence among paired el-

ements within stimulus and response codes.” Journal of Experimental Psychology, 48, 483491. Fitts, P.M., & J.R. Peterson. (1964). “Information capacity of discrete motor responses.” Journal of Experimental Psychology, 67,103-112. Fitts, P.M., & C.M. Seeger. (1953). “S-R compatibility: Spatial characteristics of stimulus and response codes.“ Journal of Experimental Psychology, 46,199-210. Fitts, P.M., & G. Switzer. (1962). Cognitive aspects of information processing: I. The familiarity of S-R sets and subsets.“ Journal of Experimental Psychology, 63,312-329. Fleishman, E.A. (1954). “Dimensional analysis of psychomotor abilities.” Journal of Experimental Psychology, 48,437454. Fleishman, E.A. (1964). The structure and measurement of physical fitness. Englewood Cliffs, NJ: Prentice-Hall. Fleishman, E.A. (1972). “Structure and measurement of psychomotor abilities.” Pp. 78-106 in The psychomotor domain: Movement behaviors, edited by R.N. Singer. Philadelphia: Lea and Febiger. Fleishman, E.A., & M.K. Quaintance. (1984). Taxonomies of human performance: The description ofhuman tasks. Orlando, FL: Academic Press. Fogarty, G. (1984). The structure of abilities underlying performance on competing tasks. Unpublished Ph.D. dissertation, University of Sydney, Sydney, Australia. Fogarty, G. (1987). “Time sharing in relation to broad ability domains.” Intelligence, 21,207-231. Fogarty, G., & L. Stankov. (1982). “Competing tasks as an index of intelligence.” Personality and Individual Dzzerences, 3,407422. Fogarty, G., & L. Stankov. (1988). “Abilities involved in performance on competing tasks.” Personality and lndividual Differences, 9,3549. Frearson, W.M., P.T. Barrett, & H.J. Eysenck. (1988). “Intelligence, reaction time and the effects of smoking.“ Personality and Individual Differences, 9,497-517. Frearson, W.M., & H.J. Eysenck. (1986). “Intelligence, reaction time (RT) and a new “oddman-out” RT paradigm.“ Personality and Individual Differences, 7,808-817. Frearson, W.M., H.J. Eysenck, & P.T. Barrett. (1990). “The Furneaux model of human problem solving: Its relationship to reaction time and intelligence.” Personality and Individual Differences, 2 1,239-257. French, J.W. (1957). “The factorial invariance of pure-factor tests.” Journal of Educational Psychology, 48,93-109. Fry, A.F., & S. Hale. (1996). “Processing speed, working memory, and fluid intelligence: Evidence for a developmental cascade.” Psychological Science, 7, 237-241. Fumeaux, D. (1952). “Some speed, error and difficulty relationships within a problemsolving situation.“ Nature, 170,37-38. Fumeaux, D. (1960). “Intellectual abilities and problem solving behaviour.” Pp. 167-193 in Handbook of abnormal psychology edited by H.J. Eysenck. New York: Basic Books. Galton, F. (1883). Inquiries into humanfaculty and its development. London: Macmillan. Galton, F. (1908). Memories of my life. London: Methuen. Garner, W.R. (1962). Uncertainty and structure as psychological concepts. New York: Wiley. Garrett, H.E. (1946). “A developmental theory of intelligence.” American Psychologist, 1,372-378. Georgopoulos, AI’., J.T. Luruito, M. Petrides, A.B. Schwartz, & J.T. Massey. (1989). “Mental rotation of the neuronal population vector.“ Science, 243,234-236. Guadagnoli, M. A., & T.G. Reeve. (1994). “Stimulus-response compatibility and motor-programming effects: A test of theoretical accounts of compatibility.” Human Performance, 7,291304. Gustaffson, J-E. (1984). “A unifying model for the structure of intellectual abilities.” Intelligence, 8, 179-203.

Gustafsson, J-E. (1988). “Hierachical models of individual differences in cognitive abilities.” Pp. 35-71 in Advances in the psychology ofhuman intelligence (Vol. 4), edited by R.J. Stemberg. Hillsdale, NJ: Erlbaum. Gustafsson, J-E. (1992a). “The relevance of factor analysis for the study of group differences.” Multivariate Behavioral Research, 27,239-248. Gustafsson, J-E. (1992b). “The ‘Spearman Hypothesis’ is false.” Multivariate Behavioral Research, 27‘265-267. Guttman, L. (1954). “A new approach to factor analysis: The radex.” Pp. 258-348 in Mathematical thinking in fhe social sciences, edited by P.F. Lazarsfeld. Glencoe, IL: Free Press. Guttman, L. (1955). “A generalized simplex for factor analysis and a faceted definition of intelligence.” Psychometrika, 20,173-192. Guttman, L. (1965). “The structure of the interrelations among intelligence tests.” In Proceedings of the 1964 invitational Conference on Testing Problems. Princeton, NJ: Educational Testing Service. Guttman, L. (1992). “The irrelevance of factor analysis for the study of group differences.” Multivariate Behavioral Research, 27,175-204. Hakstian, A. R., & R. B. Cattell. (1974). “The checking of primary mental ability structure on a broader basis of performances.“ British Journal of Educational Psychology, 44, 140154. Hakstian, A.R., & R.B. Cattell. (1978). “Higher-stratum ability structures on a basis of twenty primary abilities.“ Journal of Educational Psychology, 70,657-669. Hale, S., & J. Jansen. (1994). “Global processing-time coefficients characterize individual and group differences in cognitive speed.“ Psychological Science, 5,384-389. Harrison, R. (1941). “Personal tempo and the interrelationships of voluntary and maximal rates of movements.“ Journal of General Psychology, 24,343-379. Hermstein, R.J., & C. Murray. (1994). The bell curve: Intelligence and class structure in American life. New York: Free Press. Hick, W.E. (1951). “Information theory and intelligence tests.” British Journal of Psychology, 4,157-164. Hick, W.E. (1952). “On the rate of gain of information.” Quarterly Journal of Experimental Psychology, 4,11-26. Hofstaetter, P.R. (1954). “The changing composition of intelligence: A study in T-technique.” Journal of Genetic Psychology, 85,159-164. Horn, J.L. (1976). “Human abilities: A review of research and theory in the early 1970s.” Annual Review of Psychology, 27,437485. Horn, J.L. (1979). “The rise and fall of human abilities.” ]ournal of Research and Development in Education, 12,59-79. Horn, J.L. (1980). “Concepts of intellect in relation to learning and adult development.” Zntelligence, 4, 285-317. Horn, J.L. (1985). “Remodelling old models of intelligence.” Pp. 267-300 in Handbook of intelligence: Theories, measurements and applications, edited by B.B. Wolman. New York: Wiley. Horn, J.L. (1986). “Intellectual ability concepts.” Pp. 201-261 in Advances in the psychology of human intelligence, edited by R.J. Stemberg. Hillsdale, NJ: Erlbaum. Horn, J.L. (1987). “A context for understanding information processing studies of human abilities.“ Pp. 201-238 in Speed of information-processing and intelligence, edited by P.A. Vernon. Norwood, NJ: Ablex. Horn, J.L. (1988). “Thinking about human abilities.” Pp. 645-648 in Handbook of multivariate experimental psychology, edited by J.R. Nesselroade & R.B. Cattell. New York: Plenum Press.


Horn, J.L. (1989). "Cognitive diversity: A framework of learning." Pp. 61-116 in Learning and individual differences: Advances in theory and research. A series of books in psychology, edited by P.L. Ackerman, R.J. Sternberg, & R. Glaser. New York: W.H. Freeman.
Horn, J.L. (1998). "A basis for research on age differences in cognitive capabilities." Pp. 57-91 in Human cognitive abilities in theory and practice, edited by J.J. McArdle & R.W. Woodcock. Chicago, IL: Riverside.
Horn, J.L., & R.B. Cattell. (1966). "Refinement of the theory of fluid and crystallized general intelligences." Journal of Educational Psychology, 57, 253-270.
Horn, J.L., & R.B. Cattell. (1967). "Age differences in fluid and crystallized intelligence." Acta Psychologica, 26, 107-129.
Horn, J.L., & G. Donaldson. (1980). "Cognitive development: II. Adulthood development of human abilities." Pp. 445-529 in Constancy and change in human development: A volume of review essays, edited by O.G. Brim & J. Kagan. Cambridge, MA: Harvard University Press.
Horn, J.L., G. Donaldson, & R. Engstrom. (1981). "Apprehension, memory and fluid intelligence decline in adulthood." Research on Aging, 3, 33-84.
Horn, J.L., & S.M. Hofer. (1992). "Major abilities and development in the adult period." Pp. 44-99 in Intellectual development, edited by R.J. Sternberg & C. Berg. New York: Cambridge University Press.
Horn, J.L., & J. Noll. (1994). "A system for understanding cognitive capabilities: A theory and the evidence on which it is based." Pp. 151-203 in Current topics in human intelligence: Volume 4, edited by D.K. Detterman. New York: Springer-Verlag.
Horn, J.L., & L. Stankov. (1982). "Auditory and visual factors of intelligence." Intelligence, 6, 165-185.
Houlihan, M., K. Campbell, & R.M. Stelmack. (1994). "Reaction time and movement time as measures of stimulus evaluation and response processes." Intelligence, 18, 289-307.
Humphreys, L.G. (1962). "The organization of human abilities." American Psychologist, 17, 475-483.
Hunt, E.B. (1974). "Quote the Raven? Nevermore." Pp. 129-157 in Knowledge and cognition, edited by L.W. Gregg. Potomac, MD: Erlbaum.
Hunt, E.B. (1978). "Mechanics of verbal ability." Psychological Review, 85, 109-130.
Hunt, E.B. (1980). "Intelligence as an information-processing concept." British Journal of Psychology, 71, 449-474.
Hunt, E.B., & M. Lansman. (1982). "Individual differences in attention." Pp. 207-254 in Advances in the psychology of human intelligence, edited by R.J. Sternberg. Hillsdale, NJ: Erlbaum.
Hyman, R. (1953). "Stimulus information as a determinant of reaction time." Journal of Experimental Psychology, 45, 188-196.
Jackson, M.D., & J.L. McClelland. (1979). "Processing determinants of reading speed." Journal of Experimental Psychology: General, 108, 151-181.
Jacobs, P.I., & M. Vandeventer. (1968). "Progressive matrices: An experimental, developmental, nonfactorial analysis." Perceptual and Motor Skills, 27, 759-766.
Jenkinson, J.C. (1983). "Is speed of information processing related to fluid or crystallized intelligence?" Intelligence, 7, 91-106.
Jensen, A.R. (1979). "g: Outmoded theory or unconquered frontier?" Creative Science and Technology, 2, 16-29.
Jensen, A.R. (1980). Bias in mental testing. New York: Free Press.
Jensen, A.R. (1982a). "Reaction time and psychometric g." Pp. 93-132 in A model for intelligence, edited by H.J. Eysenck. New York: Springer-Verlag.
Jensen, A.R. (1982b). "The chronometry of intelligence." Pp. 255-310 in Advances in the psychology of human intelligence (Vol. 1), edited by R.J. Sternberg. Hillsdale, NJ: Erlbaum.


Jensen, A.R. (1984a). Individual and group differences in intelligence and speed of information processing. Unpublished manuscript, University of California, Berkeley.
Jensen, A.R. (1984b). "Alternative models of g." Invited Address, Gatlinburg Conference.
Jensen, A.R. (1985). "The nature of black-white differences on various psychometric tests: Spearman's hypothesis." The Behavioral and Brain Sciences, 8, 193-263.
Jensen, A.R. (1986). "g: Artifact or reality?" Journal of Vocational Behavior, 29, 301-331.
Jensen, A.R. (1987a). "Individual differences in the Hick paradigm." Pp. 101-176 in Speed of information-processing and intelligence, edited by P.A. Vernon. Norwood, NJ: Ablex.
Jensen, A.R. (1987b). "Process differences and individual differences in some cognitive tasks." Intelligence, 11, 107-136.
Jensen, A.R. (1991). "Speed of elementary cognitive processes: A chronometric anchor for psychometric tests of g." Psychological Test Bulletin, 4, 59-70.
Jensen, A.R. (1992a). "The importance of intraindividual variation in reaction time." Personality and Individual Differences, 13, 869-891.
Jensen, A.R. (1992b). "Understanding g in terms of information processing." Educational Psychology Review, 4, 271-308.
Jensen, A.R. (1992c). "Spearman's hypothesis: Methodology and evidence." Multivariate Behavioral Research, 27, 225-234.
Jensen, A.R. (1993a). "Why is reaction time correlated with psychometric g?" Current Directions in Psychological Science, 2, 53-56.
Jensen, A.R. (1993b). "Spearman's hypothesis tested with chronometric information processing tasks." Intelligence, 17, 47-77.
Jensen, A.R. (1996). The g factor. Unpublished manuscript.
Jensen, A.R., G.E. Larson, & S.M. Paul. (1988). "Psychometric g and mental processing speed on a semantic verification test." Personality and Individual Differences, 9, 243-255.
Jensen, A.R., & E. Munro. (1979). "Reaction time, movement time and intelligence." Intelligence, 3, 121-126.
Jensen, A.R., & P.A. Vernon. (1986). "Jensen's reaction time studies: A reply to Longstreth." Intelligence, 10, 153-179.
Jensen, A.R., & L-J. Weng. (1994). "What is a good g?" Intelligence, 18, 231-258.
Juhel, J. (1991). "Relationships between psychometric intelligence and information-processing speed indexes." European Bulletin of Cognitive Psychology, 12(1), 73-105.
Kahneman, D., & A. Treisman. (1984). "Changing views of attention and automaticity." Pp. 29-61 in Varieties of attention, edited by R. Parasuraman & D.R. Davies. Orlando, FL: Academic Press.
Kail, R. (1995). "Processing speed, memory, and cognition." Pp. 71-88 in Memory performance and competencies: Issues in growth and development, edited by F.E. Weinert & W. Schneider. Mahwah, NJ: Erlbaum.
Kane, H.D., B.E. Proctor, & J.H. Kranzler. (1997). "Reliability and validity of a non-verbal measure of the speed and efficiency of long-term memory retrieval." Personality and Individual Differences, 22, 127-130.
Kaufman, H., & R.M. Levy. (1966). "A further test of Hick's Law with unequally likely alternatives." Perceptual and Motor Skills, 22, 967-990.
Keating, D.P., & D.J. MacLean. (1987). "Cognitive processing, cognitive ability, and development: A reconsideration." Pp. 239-270 in Speed of information-processing and intelligence, edited by P.A. Vernon. Norwood, NJ: Ablex.
Kornblum, S. (1967). "Choice reaction time for repetitions and non-repetitions: A re-examination of the information hypothesis." Pp. 178-187 in Attention and Performance I, edited by A.F. Sanders. Amsterdam: North-Holland.
Kornblum, S. (1968). "Serial-choice reaction time: Inadequacies of the information hypothesis." Science, 159, 432-434.


Kornblum, S. (1994). "The way irrelevant dimensions are processed depends on what they overlap with: The case of Stroop- and Simon-like stimuli." Psychological Research, 56, 130-135.
Kornblum, S., T. Hasbroucq, & A. Osman. (1990). "Dimensional overlap: Cognitive basis for stimulus-response compatibility - a model and taxonomy." Psychological Review, 97, 253-270.
Kornblum, S., & J.-W. Lee. (1995). "Stimulus-response compatibility with relevant and irrelevant stimulus dimensions that do and do not overlap with the response." Journal of Experimental Psychology: Human Perception and Performance, 21, 855-875.
Kotovsky, K., & H.A. Simon. (1973). "Empirical tests of a theory of human acquisition of concepts for sequential patterns." Cognitive Psychology, 4, 399-424.
Kranzler, J.H. (1990). The nature of intelligence: A unitary process or a number of independent processes? Unpublished doctoral dissertation, University of California at Berkeley.
Kranzler, J.H. (1992). "A test of Larson and Alderton's (1990) worst performance rule of reaction time variability." Personality and Individual Differences, 13, 255-261.
Kranzler, J.H., & A.R. Jensen. (1991a). "The nature of psychometric g: Unitary process or a number of independent processes?" Intelligence, 15, 397-422.
Kranzler, J.H., & A.R. Jensen. (1991b). "Unitary g: Unquestioned postulate or empirical fact?" Intelligence, 15, 437-448.
Kranzler, J.H., P.A. Whang, & A.R. Jensen. (1988). "Jensen's use of the Hick paradigm: Visual attention and order effects." Intelligence, 12, 379-391.
Kyllonen, P.C. (1991). "Principles for creating a computerized test battery." Intelligence, 15, 1-15.
Kyllonen, P.C. (1998). Smart testing. Manuscript submitted for publication.
Kyllonen, P.C., & R.E. Christal. (1990). "Reasoning ability is (little more than) working memory capacity?!" Intelligence, 14, 389-433.
Lamb, J.C., & H. Kaufman. (1965). "Information transmission with equally likely alternatives." Perceptual and Motor Skills, 21, 255-259.
Lansman, M., G. Donaldson, E.B. Hunt, & S. Yantis. (1982). "Ability factors and cognitive processes." Intelligence, 6, 347-386.
Lansman, M., & E.B. Hunt. (1982). "Individual differences in secondary task performance." Memory and Cognition, 10, 10-24.
Larson, G.E., & D.L. Alderton. (1990). "Reaction time variability and intelligence: A 'worst performance' analysis of individual differences." Intelligence, 14, 309-325.
Larson, G.E., C.R. Merritt, & S.E. Williams. (1988). "Information processing and intelligence: Some implications of task complexity." Intelligence, 12, 131-147.
Larson, G.E., & D.P. Saccuzzo. (1986). "Jensen's reaction-time experiments: Another look." Intelligence, 10, 231-238.
Larson, G.E., & D.P. Saccuzzo. (1989). "Cognitive correlates of general intelligence: Toward a process theory of g." Intelligence, 13, 5-31.
Lawley, D.N. (1940). "The estimation of factor loadings by the method of maximum likelihood." Proceedings of the Royal Society of Edinburgh, 60, 64-82.
Lawley, D.N., & A.E. Maxwell. (1963). Factor analysis as a statistical method. London: Butterworths.
Lehrl, S., & B. Fischer. (1988). "The basic parameters of human information processing: Their role in the determination of intelligence." Personality and Individual Differences, 9, 883-896.
Lehrl, S., & B. Fischer. (1990). "A basic information psychological parameter (BIP) for the reconstruction of concepts of intelligence." European Journal of Personality, 4, 259-286.
Lezak, M.D. (1983). Neuropsychological assessment. New York: Oxford University Press.


Lindley, R.H., W.R. Smith, & T.J. Thomas. (1988). "The relationship between speed of information processing as measured by timed paper-and-pencil tests and psychometric intelligence." Intelligence, 12, 17-25.
Lindley, R.H., S.M. Wilson, W.R. Smith, & K. Bathurst. (1995). "Reaction time (RT) and the shape of the task complexity function." Personality and Individual Differences, 18, 339-345.
Lohman, D.F. (1979). Spatial ability: A review and reanalysis of the correlational literature (Technical Report No. 8). Stanford, CA: Aptitude Research Project, School of Education, Stanford University.
Lohman, D.F. (1994). "Component scores as residual variation (or why the intercept correlates best)." Intelligence, 19, 1-11.
Lohman, D.F., J.W. Pellegrino, D.L. Alderton, & J.W. Regian. (1987). "Dimensions and components of individual differences in spatial abilities." Pp. 253-312 in Intelligence and cognition: Contemporary frames of reference, edited by S.H. Irvine & S.E. Newstead. Dordrecht: Martinus Nijhoff.
Longstreth, L.E. (1984). "Jensen's reaction time investigations of intelligence: A critique." Intelligence, 8, 139-160.
Longstreth, L.E. (1986). "The real and the unreal: A reply to Jensen and Vernon." Intelligence, 10, 181-191.
Longstreth, L.E., N. El-Zahhar, & M.B. Alcorn. (1985). "Exceptions to Hick's Law: Explorations with a response duration measure." Journal of Experimental Psychology: General, 114, 417-434.
Luce, R.D., & J.W. Tukey. (1964). "Simultaneous conjoint measurement: A new type of fundamental measurement." Journal of Mathematical Psychology, 1, 1-27.
Lynn, R., C. Cooper, & S. Topping. (1990). "Reaction times and intelligence." Current Psychology: Research and Reviews, 9, 264-276.
Lynn, R., & M. Holmshaw. (1990). "Black-White differences in reaction times and intelligence." Social Behavior and Personality, 18, 299-308.
Lynn, R., & K. Owen. (1994). "Spearman's hypothesis and test score differences between Whites, Indians and Blacks in South Africa." Journal of General Psychology, 121, 27-36.
McArdle, J.J., & J.L. Horn. (1983). Validation by systems modeling of WAIS abilities. Baltimore: National Institute of Aging.
McCall, R.B., M.I. Appelbaum, & P.S. Hogarty. (1973). "Developmental changes in mental performance." Monographs of the Society for Research in Child Development, 38, 1-83.
McDonald, R.P. (1985). Factor analysis and related methods. Hillsdale, NJ: Erlbaum.
MacLeod, C.M. (1991). "Half a century of research on the Stroop effect: An integrative review." Psychological Bulletin, 109, 163-203.
Mangan, G.L. (1959). "A factorial study of speed, power and related temperament variables." British Journal of Educational Psychology, 29, 144-154.
Marr, D.B., & R.J. Sternberg. (1987). "The role of mental speed in intelligence: A triarchic perspective." Pp. 271-294 in Speed of information-processing and intelligence, edited by P.A. Vernon. Norwood, NJ: Ablex.
Marshalek, B., D.F. Lohman, & R.E. Snow. (1983). "The complexity continuum in the radex and hierarchical models of intelligence." Intelligence, 7, 107-127.
Matarazzo, J.D. (1972). Wechsler's measurement and appraisal of adult intelligence (5th ed.). Baltimore: Williams & Wilkins.
Matthews, G., & L. Dorn. (1989). "IQ and choice reaction time: An information processing analysis." Intelligence, 13, 299-317.
Merkel, J. (1885). "Die zeitlichen Verhältnisse der Willensthätigkeit." Philosophische Studien, 2, 73-127.
Michell, J. (1990). An introduction to the logic of psychological measurement. Hillsdale, NJ: Erlbaum.


Miller, E.M. (1994). "Intelligence and brain myelination: A hypothesis." Personality and Individual Differences, 17, 803-832.
Miller, L.T., & P.A. Vernon. (1992). "The general factor in short-term memory, intelligence, and reaction time." Intelligence, 16, 5-29.
Morin, R.E., & D.A. Grant. (1955). "Learning and performance on a key-pressing task as a function of the degree of spatial stimulus-response correspondence." Journal of Experimental Psychology, 49, 39-47.
Morin, R.E., A. Konick, N. Troxell, & S. McPherson. (1965). "Information and reaction time for 'naming' responses." Journal of Experimental Psychology, 70, 309-314.
Myors, B. (1985). Hick's Law and intelligence: An investigation involving an RT task constructed for microcomputer presentation. Unpublished manuscript. Available at the University of Sydney.
Myors, B., L. Stankov, & G.W. Oliphant. (1989). "Competing tasks, working memory and intelligence." Australian Journal of Psychology, 41, 1-16.
Naglieri, J.A. (1997). "IQ: Knowns and unknowns, hits and misses." American Psychologist, 52, 75-76.
Necka, E. (1992). "Cognitive analysis of intelligence: The significance of working memory processes." Personality and Individual Differences, 13, 1031-1046.
Neisser, U. (1967). Cognitive psychology. New York: Appleton-Century-Crofts.
Neisser, U. (1997). "Never a dull moment." American Psychologist, 52, 79-81.
Neisser, U., G. Boodoo, T.J. Bouchard, Jr., A.W. Boykin, N. Brody, S.J. Ceci, D.F. Halpern, J.C. Loehlin, R. Perloff, R.J. Sternberg, & S. Urbina. (1996). "Intelligence: Knowns and unknowns." American Psychologist, 51, 77-101.
Nelson, H.E., & A. O'Connell. (1978). "Dementia: The estimation of premorbid intelligence levels using the New Adult Reading Test." Cortex, 14, 234-244.
Nettelbeck, T. (1987). "Inspection time and intelligence." Pp. 295-346 in Speed of information-processing and intelligence, edited by P.A. Vernon. Norwood, NJ: Ablex.
Nettelbeck, T. (1990). "Intelligence does exist: A rejoinder to M.J.A. Howe." The Psychologist, 3, 494-497.
Nettelbeck, T., & N.H. Kirby. (1983). "Measures of timed performance and intelligence." Intelligence, 7, 39-52.
Nettelbeck, T., & M. Lally. (1976). "Inspection time and measured intelligence." British Journal of Psychology, 67, 17-22.
Nettelbeck, T., & P.M.A. Rabbitt. (1992). "Aging, cognitive performance and mental speed." Intelligence, 16, 189-205.
Neubauer, A.C. (1990a). "Speed of information processing in the Hick paradigm and response latencies in a psychometric intelligence test." Personality and Individual Differences, 11, 147-152.
Neubauer, A.C. (1990b). "Selective reaction times and intelligence." Intelligence, 14, 79-96.
Neubauer, A.C. (1991). "Intelligence and RT: A modified Hick paradigm and a new RT paradigm." Intelligence, 15, 175-192.
Neubauer, A.C., C. Bauer, & G. Holler. (1992). "Intelligence, attention, motivation and speed-accuracy trade-off in the Hick paradigm." Personality and Individual Differences, 13, 1325-1332.
Neubauer, A.C., & H.H. Freudenthaler. (1994). "Reaction times in a sentence-picture verification test and intelligence: Individual strategies and effects of extended practice." Intelligence, 19, 193-218.
Neubauer, A.C., R. Riemann, R. Mayer, & A. Angleitner. (1997). "Intelligence and reaction times in the Hick, Sternberg and Posner paradigms." Personality and Individual Differences, 22, 885-894.
Norusis, M.J. (1990). SPSS reference guide. Chicago, IL: SPSS Inc.
Pallier, G., R.D. Roberts, & L. Stankov. (1998). Biological versus psychometric intelligence: Halstead (1947) revisited. Archives of Clinical Neuropsychology. In press.
Palmer, J., C.M. MacLeod, E. Hunt, & J.E. Davidson. (1985). "Information-processing correlates of reading." Journal of Memory and Language, 24, 59-88.
Paterson, D.G., R.M. Elliott, L.D. Anderson, H.A. Toops, & E. Heidbreder. (1930). Minnesota mechanical ability tests. Minneapolis, MN: University of Minnesota Press.
Payne, D.L., R.E. Christal, & P.C. Kyllonen. (1984). "Individual differences in learning abilities." Advances in Reading/Learning Research, 4, 27-36.
Pearson, K. (1901). "On lines and planes of closest fit to systems of points in space." Philosophical Magazine, 2, 559-572.
Peterson, N.G., & D.A. Bownas. (1982). "Skill, task structure, and performance acquisition." Pp. 49-105 in Human performance and productivity (Vol. 2): Human capability assessment, edited by M.D. Dunnette & E.A. Fleishman. Hillsdale, NJ: Erlbaum.
Pollack, I. (1963). "Speed of classification of words into superordinate categories." Journal of Verbal Learning and Verbal Behavior, 2, 159-165.
Posner, M.I. (1964). "Information reduction in the analysis of sequential tasks." Psychological Review, 72, 491-504.
Proctor, R.W., T. Van Zandt, C.-H. Lu, & D.J. Weeks. (1993). "Stimulus-response compatibility for moving stimuli: Perception of affordances or directional coding?" Journal of Experimental Psychology: Human Perception and Performance, 19, 81-91.
Rabbitt, P.M.A. (1994). "Crystal quest: A search for the basis of maintenance of practiced skills into old age." Pp. 188-230 in Attention, selection, awareness and control: A tribute to Donald Broadbent, edited by A. Baddeley & L. Weiskrantz. London: Clarendon Press.
Rabbitt, P.M.A. (1996). "Do individual differences in speed reflect 'global' or 'local' differences in mental abilities?" Intelligence, 22, 69-88.
Rabbitt, P.M.A., N. Banerji, & A. Szymanski. (1989). "Space Fortress as an IQ test? Predictions of learning and of practiced performance in a complex interactive video-game." Acta Psychologica, 71, 243-257.
Ree, M.J., & T. Carretta. (1996). "The central role of g in military pilot selection." The International Journal of Aviation Psychology, 6, 111-123.
Ree, M.J., & J.A. Earles. (1991). "The stability of g across different methods of estimation." Intelligence, 15, 271-278.
Ree, M.J., & J.A. Earles. (1992). "Intelligence is the best predictor of job performance." Current Directions in Psychological Science, 1, 86-89.
Ree, M.J., & J.A. Earles. (1993). "g is to psychology what carbon is to chemistry: A reply to Sternberg and Wagner, McClelland and Calfee." Current Directions in Psychological Science, 2, 11-12.
Ree, M.J., J.A. Earles, & M.S. Teachout. (1993). "Predicting job performance: Not much more than g." Journal of Applied Psychology, 79, 518-524.
Reed, T.R., & A.R. Jensen. (1991). "Arm nerve conduction velocity (NCV), brain NCV, reaction time and intelligence." Intelligence, 15, 33-47.
Reed, T.R., & A.R. Jensen. (1993). "Choice reaction time and visual pathway nerve conduction velocity both correlate with intelligence but appear not to correlate with each other: Implications for information processing." Intelligence, 17, 191-203.
Rimoldi, H.J.A. (1951). "Personal tempo." Journal of Abnormal and Social Psychology, 46, 283-303.
Roberts, R.D. (1985). An investigation of dual task performance within an information-theory framework. Unpublished B.A. (Hons.) dissertation, University of Sydney, Australia.
Roberts, R.D. (1995). Speed of processing within the structure of human cognitive abilities. Unpublished Ph.D. dissertation, University of Sydney, Australia.


Roberts, R.D. (1997a). "Fitts' Law, movement time and intelligence." Personality and Individual Differences, 23, 227-246.
Roberts, R.D. (1997b). Processing speed, stimulus-response compatibility and intelligence. Paper presented at the VIIIth Meeting of the International Society for the Study of Individual Differences, Aarhus, Denmark.
Roberts, R.D. (1998a). Individual differences in performance on elementary cognitive tasks (ECTs): Lawful vs. problematic parameters. Manuscript submitted for publication.
Roberts, R.D. (1999b). "A description and empirical evaluation of ten elementary cognitive tasks (ECTs) subscribing to information theoretic principles." Technical Report Number 2. Sydney: University of Sydney.
Roberts, R.D., H.C. Beh, G. Spilsbury, & L. Stankov. (1991a). "Evidence for an attentional model of human intelligence using the competing task paradigm." Personality and Individual Differences, 12, 445-455.
Roberts, R.D., H.C. Beh, & L. Stankov. (1988). "Hick's law, competing tasks and intelligence." Intelligence, 12, 111-131.
Roberts, R.D., G.N. Goff, & P.C. Kyllonen. (1999a). ASVAB: Little more than acculturated learning (Gc)? Manuscript submitted for publication.
Roberts, R.D., G. Pallier, & G.N. Goff. (1999b). "Sensory processes within the structure of human cognitive abilities." Pp. 339-368 in Learning and individual differences research: Processes, traits, and content determinants, edited by P.L. Ackerman, P.C. Kyllonen, & R.D. Roberts. Washington, DC: American Psychological Association. In press.
Roberts, R.D., G. Pallier, & L. Stankov. (1996). "The basic information processing (BIP) unit, mental speed and human cognitive abilities: Should the BIP R.I.P.?" Intelligence, 23, 133-155.
Roberts, R.D., L. Stankov, G. Pallier, & B. Dolph. (1997). "Charting the cognitive sphere: Tactile/kinesthetic performance within the structure of intelligence." Intelligence, 25, 111-148.
Roberts, R.D., L. Stankov, & M.B. Walker. (1991b). Fitts' law, competing task performance and intelligence. Unpublished manuscript. Available at the University of Sydney.
Robinson, D.L. (1999). "The 'IQ' factor: Implications for intelligence theory and measurement." Personality and Individual Differences, 27, 715-735.
Rosch, E. (1975). "Cognitive representations of semantic categories." Journal of Experimental Psychology: General, 104, 192-233.
Rosch, E. (1978). "Principles of categorization." Pp. 27-48 in Cognition and categorization, edited by E. Rosch & B.B. Lloyd. Hillsdale, NJ: Erlbaum.
Roskam, E.E. (1987). "Towards a psychometric theory of intelligence." Pp. 151-174 in Progress in mathematical psychology, edited by E.E. Roskam & R. Suck. North-Holland: Elsevier Science.
Roth, E. (1964). "Die Geschwindigkeit der Verarbeitung von Information und ihr Zusammenhang mit Intelligenz." Zeitschrift für Experimentelle und Angewandte Psychologie, 11, 616-623.
Ruchalla, E., E. Schalt, & F. Vogel. (1985). "Relations between mental performance and reaction time: New aspects of an old problem." Intelligence, 9, 189-205.
Rushton, J.P. (1995). Race, evolution and behavior. New Brunswick, NJ: Transaction.
Saccuzzo, D.P., G.E. Larson, & B. Rimland. (1986). "Visual, auditory, and reaction time approaches to the measurement of speed of information processing and individual differences in intelligence." Personality and Individual Differences, 7, 659-668.
Salthouse, T.A. (1994). "The nature of the influence of speed on adult age differences in cognition." Developmental Psychology, 30, 240-259.
Salthouse, T.A. (1995). "Selective influences of age and speed on associative memory." American Journal of Psychology, 108, 381-396.
Salthouse, T.A. (1996). "The processing-speed theory of adult age differences in cognition." Psychological Review, 103, 403-428.


Schmid, J., & J.M. Leiman. (1957). "The development of hierarchical factor solutions." Psychometrika, 22, 53-61.
Schmidt, F.L., & J.E. Hunter. (1992). "Development of a causal model of processes determining job performance." Current Directions in Psychological Science, 1, 89-92.
Schmidt, F.L., D.S. Ones, & J.E. Hunter. (1992). "Personnel selection." Annual Review of Psychology, 43, 627-670.
Schweizer, K. (1993a). "The contribution of access to external information, stimulus complexity, and variability to cognitive abilities." Personality and Individual Differences, 14, 87-95.
Schweizer, K. (1993b). "The effect of two information-processing skills on the speed-ability relationship." Personality and Individual Differences, 14, 713-722.
Shannon, C.E., & W. Weaver. (1949). The mathematical theory of communication. Urbana: University of Illinois Press.
Simon, H.A., & K. Kotovsky. (1963). "Human acquisition of concepts for sequential patterns." Psychological Review, 70, 534-546.
Simon, J.R., P.E. Sly, & S. Vilapakkam. (1981). "Effect of compatibility of S-R mapping on reactions toward the stimulus source." Acta Psychologica, 47, 63-81.
Sliwinski, M., H. Buschke, G. Kuslansky, G. Senior, & D. Scarisbrick. (1994). "Proportional slowing and addition speed in old and young adults." Psychology and Aging, 9(1), 72-80.
Smith, E.E. (1968). "Choice reaction time: An analysis of the major theoretical positions." Psychological Bulletin, 69, 77-110.
Smith, G.A. (1989). "Strategies and procedures affecting the accuracy of reaction time parameters and their correlations with intelligence." Personality and Individual Differences, 10, 829-835.
Smith, G.A., & M. Carew. (1987). "Decision time unmasked: Individuals adopt different strategies." Australian Journal of Psychology, 39, 337-349.
Smith, G.A., & G. Stanley. (1980). "Relations between measures of intelligence and choice reaction time." Bulletin of the Psychonomic Society, 16, 8-10.
Smith, G.A., & G. Stanley. (1983). "Clocking g: Relating intelligence and measures of timed performance." Intelligence, 7, 353-368.
Smith, G.A., & G. Stanley. (1987). "Comparing subtest profiles of g loadings and correlations with RT measures." Intelligence, 11, 291-298.
Snow, R.E. (1989). "Aptitude processes." Pp. 27-63 in Aptitude, learning, and instruction: Vol. 2. Cognitive process analyses of aptitude, edited by R.E. Snow, P.-A. Federico, & W.E. Montague. Hillsdale, NJ: Erlbaum.
Spearritt, D. (1996). "Carroll's model of cognitive abilities: Educational implications." International Journal of Educational Research, 25, 107-198.
Spearman, C. (1904). "General intelligence, objectively determined and measured." American Journal of Psychology, 15, 201-293.
Spearman, C. (1927). The abilities of man. New York: Macmillan.
Spilsbury, G. (1992). "Complexity as a reflection of the dimensionality of a task." Intelligence, 16, 31-45.
Spilsbury, G., L. Stankov, & R.D. Roberts. (1990). "The effects of a test's difficulty on its correlation with intelligence." Personality and Individual Differences, 11, 1069-1077.
Stankov, L. (1978). "Fluid and crystallized intelligence and broad perceptual factors among the 11-12-year-olds." Journal of Educational Psychology, 70, 324-334.
Stankov, L. (1983a). "Attention and intelligence." Journal of Educational Psychology, 75, 471-490.
Stankov, L. (1983b). "The role of competition in human abilities revealed through auditory tests." Multivariate Behavioral Research Monographs, No. 83-2.
Stankov, L. (1987). "Competing tasks and attentional resources: Exploring the limits of the primary-secondary paradigm." Australian Journal of Psychology, 39, 123-137.


Stankov, L. (1988a). "Single tests, competing tasks and their relationship to broad factors of intelligence." Personality and Individual Differences, 9, 25-33.
Stankov, L. (1988b). "Aging, attention and intelligence." Psychology and Aging, 3, 59-74.
Stankov, L. (1988c). "Understanding intelligence: Auditory abilities, competing tasks, and attentional resources." Pp. 98-110 in Intelligence: Controversy and change, edited by A. Watson. Melbourne: ACER.
Stankov, L. (1989). "Attentional resources and intelligence: A disappearing link." Personality and Individual Differences, 10, 957-968.
Stankov, L. (1994). "The complexity effect phenomenon is an epiphenomenon of age-related fluid intelligence decline." Personality and Individual Differences, 16, 265-288.
Stankov, L., & J.D. Crawford. (1993). "Ingredients of complexity in fluid intelligence." Learning and Individual Differences, 5, 73-111.
Stankov, L., & A. Cregan. (1993). "Quantitative and qualitative properties of an intelligence test: Series completion." Learning and Individual Differences, 5, 137-169.
Stankov, L., & J.L. Horn. (1980). "Human abilities revealed through auditory tests." Journal of Educational Psychology, 72, 21-44.
Stankov, L., & B. Myors. (1990). "The relationship between working memory and intelligence: Regression and COSAN analysis." Personality and Individual Differences, 11, 1059-1068.
Stankov, L., & T. Raykov. (1995). "Modeling complexity and difficulty in measures of fluid intelligence." Structural Equation Modeling, 2, 335-366.
Stankov, L., & R.D. Roberts. (1997). "Mental speed is not the basic process of intelligence." Personality and Individual Differences, 22, 69-84.
Stankov, L., G.J. Boyle, & R.B. Cattell. (1995). "Models and paradigms in intelligence research." Pp. 15-43 in International handbook of personality and intelligence, edited by D. Saklofske & M. Zeidner. New York: Plenum Press.
Stankov, L., R.D. Roberts, & G. Spilsbury. (1994). "Attention and speed of test-taking in intelligence and aging." Personality and Individual Differences, 17, 273-284.
Stanton, W.R., & J.A. Keats. (1986). "Intelligence and ordered task complexity." Australian Journal of Psychology, 38, 125-131.
Sternberg, R.J. (1986). "Haste makes waste versus a stitch in time? A reply to Vernon, Nador and Kantor." Intelligence, 10, 265-270.
Teichner, W.H., & M.J. Krebs. (1974). "Laws of visual choice reaction time." Psychological Review, 81, 75-98.
Telzrow, C.F. (1983). "Making child neuropsychological appraisal appropriate for children: Alternatives to downward extension of adult batteries." Clinical Neuropsychology, 5, 136-141.
Theios, J. (1975). "The components of response latency in simple human information processing tasks." Pp. 418-440 in Attention and Performance V, edited by P.M.A. Rabbitt & S. Dornic. London: Academic Press.
Thissen, D. (1983). "Timed testing: An approach using item-response theory." Pp. 178-203 in New horizons in testing: Latent trait test theory and computerized adaptive testing, edited by D.J. Weiss. New York: Academic Press.
Thomson, G.H. (1948). The factorial analysis of human ability (3rd ed.). Boston: Houghton Mifflin.
Thorndike, E.L. (1921). "Intelligence and its measurement: A symposium." Journal of Educational Psychology, 12, 124-127.
Thorndike, E.L., E.O. Bregman, M.V. Cobb, & E. Woodyard. (1926). The measurement of intelligence. New York: Bureau of Publications, Teachers College, Columbia University.
Thorndike, R.L. (1985). "The central role of general ability in prediction." Multivariate Behavioral Research, 20, 241-254.
Thorndike, R.L. (1986). "The role of general ability in prediction." Journal of Vocational Behavior, 29, 332-339.
Thurstone, L.L. (1937). "Ability, motivation, and speed." Psychometrika, 2, 249-254.


Thurstone, L.L. (1938). Primary mental abilities. Chicago: University of Chicago Press.
Tomer, A., & W.R. Cunningham. (1993). "The structure of cognitive speed measures in old and young adults." Multivariate Behavioral Research, 28, 1-24.
Umilta, C. (1994). "The Simon effect: Introductory remarks." Psychological Research, 56, 127-129.
Umilta, C., & R. Nicoletti. (1990). "Spatial stimulus-response compatibility." Pp. 89-116 in Stimulus-response compatibility: An integrated perspective, edited by R.W. Proctor & T.G. Reeve. Amsterdam: North-Holland.
Van der Ven, A.H.G.S. (1974). "The correlation between speed and precision in time-limit tests." Nederlands Tijdschrift voor de Psychologie, 29, 447-456.
Vernon, P.A. (1981). "Reaction time and intelligence in the mentally retarded." Intelligence, 5, 345-355.
Vernon, P.A. (1983). "Speed of information processing and general intelligence." Intelligence, 7, 53-70.
Vernon, P.A. (1986). "The g-loading of intelligence tests and their relationship with reaction times: A comment on Ruchalla et al." Intelligence, 10, 93-100.
Vernon, P.A. (1987). "New developments in reaction time research." Pp. 1-20 in Speed of information-processing and intelligence, edited by P.A. Vernon. Norwood, NJ: Ablex.
Vernon, P.A. (1990). "The use of biological measures to estimate behavioral intelligence." Educational Psychologist, 25, 293-304.
Vernon, P.A., & A.R. Jensen. (1984). "Individual and group differences in intelligence and speed of information-processing." Personality and Individual Differences, 5, 411-423.
Vernon, P.A., & L. Kantor. (1986). "Reaction time correlations with intelligence test scores obtained under either timed or untimed conditions." Intelligence, 10, 315-330.
Vernon, P.A., S. Nador, & L. Kantor. (1985). "Reaction time and speed of processing: Their relationship to timed and untimed measures of intelligence." Intelligence, 9, 357-374.
Wang, H., & R.W. Proctor. (1996). "Stimulus-response compatibility as a function of stimulus-code and response modality." Journal of Experimental Psychology: Human Perception and Performance, 22, 1201-1217.
Wechsler, D. (1981). Manual for the Wechsler Adult Intelligence Scale-Revised. New York: The Psychological Corporation.
Welford, A.T. (1968). Fundamentals of skill. London: Methuen.
Welford, A.T. (1980). Reaction times. London: Academic Press.
Welford, A.T. (1986). "Longstreth versus Jensen and Vernon on reaction time and IQ: An outsider's view." Intelligence, 10, 193-195.
Welford, A.T., A.H. Norris, & N.W. Shock. (1969). "Speed and accuracy of movement and their changes with age." Acta Psychologica, 30, 3-15.
White, P.O. (1973). "Individual differences in speed, accuracy and persistence: A mathematical model for problem solving." Pp. 246-260 in The measurement of intelligence, edited by H.J. Eysenck. Baltimore: Williams & Wilkins.
White, P.O. (1982). "Some major components in general intelligence." Pp. 44-90 in A model for intelligence, edited by H.J. Eysenck. Berlin: Springer-Verlag.
Wickens, C.D. (1980). "The structure of attentional resources." Pp. 239-257 in Attention and performance (Vol. 8), edited by R. Nickerson. Hillsdale, NJ: Erlbaum.
Widaman, K.F., & J.S. Carlson. (1989). "Procedural effects on performance on the Hick paradigm: Bias in reaction time and movement time parameters." Intelligence, 13, 63-85.
Wiedl, K.H., & J.S. Carlson. (1976). "The factorial structure of the Raven Colored Progressive Matrices Test." Educational and Psychological Measurement, 36, 409-413.
Wissler, C. (1901). "The correlation of mental and physical tests." Psychological Review Monographs, 26, 1-62.
Wittenborn, J.R. (1943). "Factorial equations for tests of attention." Psychometrika, 8, 19-35.
Woodcock, R.W., & M.B. Johnson. (1989). Woodcock-Johnson tests of cognitive abilities: Standard and supplemental batteries. Chicago: Riverside.
Yee, A.H. (1997). "Evading the controversy." American Psychologist, 52, 70-71.

APPENDIX

TABLE A.1
Correlations Among All Cognitive-Ability Variables Given in Table 2

Tests: 01. Progressive Matrices (RM); 02. Letter Counting (LC); 03. Letter Sets (SL); 04. Number Series Single (NSS); 05. Number Series Competing (NSC); 06. Letter Series Single (LSS); 07. Letter Series Competing (LSC); 08. Water Jars (WJ); 09. Scrambled Words (SW); 10. General Information (GI); 11. Vocabulary Multichoice (VM); 12. Esoteric Analogies (EA); 13. Digit Span Forwards (SF); 14. Digit Span Backwards (SB); 15. Card Rotations (CR); 16. Computer Form Boards (CFB); 17. Hidden Figures Single (HFS); 18. Hidden Figures Competing (HFC); 19. Tonal Memory Single (TMS); 20. Tonal Memory Competing (TMC); 21. Speech Distortion (SD); 22. Number Comparison (NCT); 23. Stroop Color (SCT); 24. String Search (SST); 25. Digit Symbol (DST).
Note: Decimal points omitted from numbers. [The body of the correlation matrix is not legible in this copy.]

TABLE A.2
Correlations Among ECT Measures

Variables: 26. Fitts's Movement (MT26); 27. Joystick (RT27); 28. Single Response (DT28, MT28); 29. Tachistoscopic (DT29, MT29); 30. Complex Choice (DT30, MT30); 31. Binary Reaction (DT31, MT31); 32. Cards Single (DT32, MT32); 33. Cards Multitask (DT33, MT33); 34. Word Single (DT34, MT34); 35. Word Multitask (DT35, MT35); 36. Lehrl-Duration (Ta); 36. Lehrl-BIP (CK).
Notes: For the variables listed, DT = Decision Time; MT = Movement Time; RT = Reaction Time; TR = short-term store; CK = duration of presence. Numbers correspond to those given to each ECT in the Method section. [The body of the correlation matrix is not legible in this copy.]