Learning and Individual Differences 13 (2003) 259 – 272
Technical students’ metacognitive skills: relating general vs. specific metacognitive skills to study success

Marcel V.J. Veenman*, Joke Verheij

Department of Developmental and Educational Psychology, Leiden University, Wassenaarseweg 52, 2333 AK Leiden, The Netherlands

Received 2 January 2002; received in revised form 7 December 2002; accepted 9 December 2002
Abstract

The first objective of this study was to determine whether metacognitive skillfulness is entirely part of intelligence as a predictor of learning or not. Furthermore, the generality vs. domain-specificity of metacognitive skillfulness was investigated. Sixteen technical university students participated in the experiment. They performed two tasks while thinking aloud: a model construction task that was part of their curriculum and an unfamiliar discovery-learning task representing a fictitious domain. Both the participants’ metacognitive skillfulness and learning performance were assessed for each domain. Furthermore, exam grades and study credits were collected. Results support the generality of metacognitive skills across tasks and domains. Results further show that metacognitive skillfulness contributed to learning results (partly) independently of intellectual ability. Implications for metacognitive skill training are discussed.

© 2002 Elsevier Science Inc. All rights reserved.

Keywords: Metacognitive skills; Generality; Domain-specificity; Learning performance
* Corresponding author. Tel.: +31-71-527-3463. E-mail address: [email protected] (M.V.J. Veenman).

1041-6080/02/$ – see front matter © 2002 Elsevier Science Inc. All rights reserved.
doi:10.1016/S1041-6080(02)00094-8
1. Introduction

The transition from secondary education to university in general, and to technical university in particular, is a difficult one for many students (Taconis, Ferguson-Hessler, & Verkerk, 1997). It often results in slowed study progress or even in study discontinuation, which is a problem many teachers and supervisors are faced with. The students concerned are at risk. For teachers and supervisors, the need exists to trace those students at an early stage of their study in order to help them with their transitional problems. Among others (motivation and attributional processes, persistency, study support), relevant predictors of study outcomes appear to be intellectual ability (Brody, 1992; Minnaert & Janssen, 1999; Sternberg, 1996) and metacognitive skillfulness (Brown, 1978; Wang, Haertel, & Walberg, 1990). On several occasions, Veenman and Elshout (1991, 1995, 1999) and Veenman, Elshout, and Meijer (1997) have raised the issue of whether study progress results from a high level of intelligence, a high level of metacognitive skillfulness, or a combination of both student characteristics. In other words, research establishing the extent to which metacognitive skillfulness is entirely part of intellectual ability or an independent predictor of novice learning may yield more precise information about what lack of skills characterizes students at risk.

1.1. Intellectual ability

Intelligence may be perceived as the magnitude and quality of the human cognitive toolbox, which contains basic cognitive operations (Elshout, 1983). The content and quality of this toolbox are not only determined by the biological substratum (e.g., hereditary factors or brain damage), but increasingly by the opportunities one seeks and by what the environment offers for acquiring useful cognitive strategies (e.g., at home or in educational settings).
In the same vein, Humphreys (1968, 1989), Snow (1989), and Snow and Lohman (1984) regard intelligence as the acquired repertoire of intellectual or cognitive skills that is available to a person at a particular point in time. An intelligence test samples this repertoire.

1.2. Metacognitive skillfulness

Though not beyond discussion, metacognitive skillfulness is often distinguished from metacognitive knowledge (Alexander, Carr, & Schwanenflugel, 1995; Baker, 1994; Kuhn, 1999; Schraw & Moshman, 1995; Veenman & Elshout, 1999). The latter concerns the declarative knowledge one has about the interplay among person characteristics, task characteristics, and the available strategies in a learning situation (Flavell, 1979). Such knowledge does not automatically lead to the appropriate task behavior. For instance, a student may know that monitoring one’s activities is necessary and yet refrain from doing so for various reasons. The task may be uninteresting or too difficult, or the student may lack the necessary skills. Metacognitive skills, on the other hand, concern the procedural knowledge that is required for the actual regulation of and control over one’s learning activities. Task
orientation, planning, monitoring, checking, and reflection are manifestations of such skills (Brown, 1978; Veenman & Elshout, 1999; Wang et al., 1990).

1.3. Relation between intellectual and metacognitive skills

The main question here is whether metacognitive skills are entirely part of the intellectual repertoire, or, as Slife, Weiss, and Bell (1985) adequately formulated: ‘‘. . .whether metacognition can be reduced to cognition.’’ There are three alternative models for describing the relation between intellectual ability and metacognitive skillfulness as predictors of novice learning (Veenman, 1993; Veenman, Elshout et al., 1997; Veenman & Elshout, 1991, 1999). The first model regards effective metacognitive activity as a manifestation of intellectual ability. Sternberg (1990) and Davidson, Deuser, and Sternberg (1994), for instance, regard metacognitive skill as a core process component in Sternberg’s triarchic theory of intelligence. According to the ‘‘intelligence model,’’ metacognitive skills cannot have a predictive value for learning, independent of intellectual ability. Support for this model was obtained by Elshout and Veenman (1992) in an experiment with first year psychology students working in a computer-simulated calorimetry environment. Several other researchers reported significant differences in the usage of metacognitive strategies between intellectually gifted and average students (Shore & Dover, 1987; Span & Overtoom-Corsmit, 1986; Zimmerman & Martinez-Pons, 1990). These differences were, however, rather small and they were not consistently obtained for all metacognitive strategies reported (cf. Alexander et al., 1995). In a second, contrasting model, intellectual ability and metacognitive skillfulness are assumed to be entirely independent predictors of novice learning. Some support for this ‘‘independency model’’ was gathered by Allon, Gutkin, and Bruning (1994) with secondary school children, by Slife et al.
(1985) comparing learning disabled with regular elementary school children, by Swanson (1990) with elementary school children, and less profoundly by Minnaert and Janssen (1999) with university students. The last model is a mixed one. According to the ‘‘mixed model,’’ metacognitive skillfulness is related to intellectual ability to a certain extent, but it also has a surplus value on top of intellectual ability for the prediction of novice learning. This mixed model has been corroborated by several studies with first year psychology students performing learning tasks either with computer simulations in the domains of calorimetry, electricity, statistics, behavioral psychology, and a fictitious domain represented by Deton Lab (Elshout & Veenman, 1992; Veenman, 1993; Veenman, Elshout et al., 1997; Veenman & Elshout, 1991; Veenman, Elshout, & Busato, 1994), with a learning-by-doing environment for solving problems in the domain of thermodynamics (Veenman & Elshout, 1999) or with studying texts on law and geography (Veenman, Beishuizen, & Niewold, 1997). Van der Heijden (1989) obtained strong support for the mixed model with 145 second-grade students solving arithmetic problems. Research results of Maqsud (1997) with high school students performing math and reading tasks also fitted the mixed model. Finally, the mixed model was endorsed by research comparing learning disabled with mentally retarded children (Berger & Reid, 1989). Some of the aforementioned research reports, however, cannot fully support either one of the three alternative models. Either reports lack a complete set of data (Allon et al., 1994;
Shore & Dover, 1987; Span & Overtoom-Corsmit, 1986) or intellectual ability and metacognitive skillfulness were assessed as orthogonal factors (Slife et al., 1985; Swanson, 1990). In order to distinguish among those three models, all correlations among intellectual ability, metacognitive skillfulness, and (learning) performance as continuous variables should be available (see Veenman & Elshout, 1991). Most of the research reports with complete data sets supported the mixed model. Some of these reports, however, may lack ecological validity as results exclusively pertained to discovery learning in simulation environments or to first year psychology students. Therefore, a first objective of the present study is to generalize the mixed model to technical university students with tasks that are either part of their curriculum or not.

1.4. Generality vs. domain-specificity of metacognitive skillfulness

A related research issue concerns the generality vs. domain-specificity of metacognitive skills. If those skills are mainly domain-specific, they cannot be entirely part of intelligence. In that case, metacognitive skills are most likely acquired along with expertise in a domain (Berger & Reid, 1989; Glaser, 1990). If they represent a general disposition, on the other hand, they may develop either along with intelligence or as a separable repertoire of acquired skills. Research of De Jong (1992) and Glaser, Schauble, Raghavan, and Zeitz (1992) has shown that the number and nature of metacognitive activities deployed varied across tasks and domains. They concluded that metacognitive skills are domain-specific. Veenman, Elshout et al. (1997), however, argued that, despite this variety across tasks and domains, metacognitive activity might still have represented general metacognitive skillfulness.
For instance, overt orienting behavior might differ for text studying (e.g., scanning headings and subheadings, getting a grasp of the theme and the overall text structure) relative to math problem solving (e.g., comprehension of the problem statement, making a sketch representing the problem, selecting relevant data and goals). Such activities might even vary within individuals over tasks. The inclination to analyze the task and its requirements prior to taking action, however, might be a person-related characteristic; perhaps not so much a trait, but rather acquired behavior that proved to be effective for task performance in general (Veenman, 1993). Veenman, Elshout et al. (1997) adopted the experimental design of Glaser et al. (1992). Their 14 participants passed through simulation environments representing three different domains (calorimetry, statistics, and an artificial one). Results revealed that the metacognitive skillfulness of novices represented a general student characteristic across domains, rather than being domain-specific. Moreover, metacognitive activity drew on a source of general skills that was not identical to intellectual ability, much in line with the mixed model. However, the scope of the Veenman et al. study was limited to one type of task (i.e., simulations) while varying the domains of subject matter. Therefore, a second objective of the present study is to broaden this scope by investigating the generality vs. domain-specificity of metacognitive skills deployed during performance on tasks that differ in both task format and domain of subject matter. Task format will be manipulated by contrasting learning in a simulation environment with learning from a paper-and-pencil task. The nature
of the learning processes involved with task performance, on the other hand, will not be varied for reasons of experimental control. Both tasks concern the construction of mathematical models or formulas through inductive reasoning.

The ultimate objective of this study concerns the identification of (technical) students at risk. It is hypothesized that study success can be predicted by intellectual ability and metacognitive skillfulness according to the mixed model. Moreover, metacognitive skillfulness is hypothesized to be general by nature, surpassing task formats and domains. From a theoretical perspective, the nature of the relation between intellectual ability and metacognitive skillfulness is an intricate one. From a practical perspective, investigating both hypotheses may shed light on the necessity of diagnosing and remedying metacognitive deficiencies at an early stage.
2. Method

2.1. Participants

Sixteen first year male students at the Technical University of Delft in the Netherlands voluntarily participated in the study. Eleven participants studied computer science and five participants studied electrical engineering. Because of the time-consuming method of protocol analysis (see below), only a relatively small number of students could participate in the study. They were awarded a small amount of money for their participation.

2.2. Intellectual ability

Some weeks prior to the experiment, the participants’ intellectual ability (IA) was assessed by a series of ability tests, representing five primary intelligence factors (inductive reasoning, quantitative reasoning, verbal ability, closure flexibility, and sequential reasoning) in the reanalyses of factor-analytic studies by Carroll (1993). The test battery included tests for vocabulary, verbal analogies, linear syllogisms, number series, number speed, and embedded figures. The unweighted mean of the scores on these six tests may be regarded as an ‘‘IQ equivalent’’ (Veenman & Elshout, 1999).

2.3. Tasks

The first task involved the computer-simulated Deton Lab that was used by Veenman, Elshout et al. (1997). In this fictitious simulation environment, participants had to examine the explosive power of four unknown materials from an alien planet. Specified amounts of each material could be poured into a container. The content of this container could be detonated in an isolated compartment of the lab and the explosive power could be read off a meter. Each of the four materials affected the explosive power differently. Material E contributed strongly to the explosive power, but was tempered by the presence of another material (material S). Material K contributed steadily without being affected by other
materials, whereas material G was neutral. By systematically combining different quantities of the materials, both the separate and the aggregate explosive power of the materials could be unraveled. Participants were asked to examine the explosive power of each material separately and in combination with other materials. In fact, they were asked to construct a formula for determining the explosive power of any mixture of these four materials. Deton Lab represented an inductive learning task in a fictitious domain that did not call upon any prior domain-specific knowledge.

The second task, a paper-and-pencil task, concerned the construction of mathematical models for two given problems. The first problem pertained to the outbreak of a contagious disease on an island. A mathematical model of the population distribution of susceptible, infected, and immune islanders over time was to be formulated, given a number of constraints (e.g., no population growth and the infection emanating from one single islander). The second problem dealt with the phenomenon of osmosis. Participants had to form a differential equation for the rise of the saline water level in a tube, which was vertically placed in a water reservoir, with a semipermeable membrane. Once more, the problem description contained certain model constraints (e.g., the water level in the tube initially matches the water surface in the reservoir). By mentally generating the effects of conditions or constraints on the dependent variables over time, participants could induce rules for inclusion into the mathematical model. Although participants were familiar with basic principles of model construction, an essential part of both first year curricula in computer science and electrical engineering, solving both problems relied on complex inductive reasoning far beyond those basic principles.

2.4. Metacognitive skillfulness

During the performance of all tasks, participants were requested to think aloud.
The think-aloud protocols were transcribed and analyzed on metacognitive skillfulness according to the judgmental procedure of Veenman (1993), Veenman and Elshout (1991, 1995), and Veenman, Elshout et al. (1994, 1997). Metacognitive skillfulness was rated 0–4 points on five subscales: orientation activities, systematical orderliness, accuracy, evaluation, and elaboration activities. Orientation activities were judged on indications of analyzing the problem statement, determining the independent and dependent variables, building a mental model of the task, and generating hypotheses and predictions. Judgements of systematical orderliness were based on the quality of planning activities, the systematical execution of plans, completing an orderly sequence of actions, and the avoidance of unsystematic events (such as varying two variables at the same time or generating one formula after another by trial and error). The criteria for accuracy were the following: precision in calculation, correct usage of quantities, tidiness and completeness of note taking, and avoidance of negligent mistakes. Evaluation activities were judged on monitoring and checking, both on the local level (e.g., detecting errors and checking calculations) as well as on the global level of keeping track of progress being made (e.g., verifying whether the obtained results provide an answer to the problem statement). Finally, judgments of elaboration were based on indications of recapitulating, drawing conclusions beyond the information given, relating these conclusions to the subject
matter, generating explanations, and reflecting on the learning process. It must be emphasized that judgments were based on the quality of performing metacognitive activities, not on the quality of information these activities produced. For instance, participants generating a well-considered action plan scored high on systematical orderliness, regardless of the outcome of such a plan. Similarly, participants drawing elaborated, yet incorrect conclusions scored high on elaboration. For more details concerning this method of rating metacognitive skillfulness, see Veenman, Elshout et al. (1997).

Two judges, who received no prior information about the participants’ intelligence, independently rated the protocols of Deton Lab and the contagious disease problem. In order to obtain two independent measures of metacognitive skillfulness, Deton Lab protocols were divided into two coherent segments for separate analyses. Adequate interrater reliabilities were obtained for protocol ratings of Deton Lab (alpha = .89) and the contagious disease problem (alpha = .81). These reliabilities allowed for the analyses of the osmosis problem protocols by a single judge, whose ratings of all protocols were used for further analyses. Mean scores of metacognitive skillfulness were calculated over the five subscales. Hence, for each participant, two separate metacognition scores were obtained from the Deton Lab protocols, as well as two separate metacognition scores from the model construction tasks.

2.5. Task performance

Task performance in Deton Lab was measured by the Post-D test, which was administered subsequent to working in Deton Lab. Post-D consisted of 25 multiple-choice items and three open-ended questions. Some multiple-choice items concerned the positive, neutral, or negative contribution of a specific material to the explosive power, whereas other items concerned the extent to which materials interactively contributed to the explosive power of a mixture of materials.
For instance, a multiple-choice item was: ‘‘If the amount of material K is increased tenfold in a mixture, containing also material S and E, then the explosive power will: a) be the same, b) decrease by tenfold, c) increase but less than tenfold, d) increase by tenfold, or e) increase by more than tenfold.’’ For 9 items, a choice between three alternatives had to be made, while 16 items presented five alternatives. Each correct answer on the multiple-choice items added one point to the participant’s score on Post-D. An example of an open-ended question was: ‘‘Which material has a mitigating effect on the explosive power of another material? Specify exactly how the explosive power is affected by a mixture of both materials.’’ Two judges rated the answers to the open-ended questions, one item at a time for all participants. Scores for each item were given on different scales, depending on the number of relations between materials that were to be mentioned and explained by the participant (scales ranged from 0 to 3 for two items and from 0 to 2 for the other item). Hence, the maximum score on Post-D was 33 points. Post-D appeared to be fairly internally consistent (alpha = .85).

The participants’ final solutions to both model construction problems were analyzed and scored with respect to the number of correctly formulated model components. For instance, the equation X(t + 1) = Xt + bXt contains three such components. Each correct component was
awarded one point. Reformulated components that represented a similar equation structure, such as Xt(1 + b), were awarded accordingly. For the contagious disease problem, seven essential model components were identified, while the solution to the osmosis problem contained five components. Solution components were scored from the protocols and the participants’ notes prior to, and separately from, the protocol analyses of metacognitive skillfulness.

2.6. Study success

Individual study credits and mean exam grades were used as measures of study success. Although these measures were obtained from students of two disciplines, inspection of both measures did not reveal differences between disciplines [t(14) < 0.65]. Both disciplines have several courses in common. It was verified that their study success scores were representative for both disciplines.
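Both model construction problems ask for difference or differential equations induced from stated constraints. Purely as an illustration of the kind of model involved (the paper does not reproduce the actual seven-component disease model, so the rates and variable names below are hypothetical), a discrete-time sketch of the contagious disease problem might look as follows:

```python
# Illustrative sketch only: a minimal discrete-time model of the contagious
# disease problem. The infection and recovery rates are invented; the study's
# actual seven-component solution model is not given in the paper.

def simulate_outbreak(population=1000, infection_rate=0.3,
                      recovery_rate=0.1, steps=50):
    """Iterate difference equations for susceptible (S), infected (X), and
    immune (R) islanders, under the stated constraints: no population
    growth, and the infection emanating from one single islander."""
    S, X, R = population - 1.0, 1.0, 0.0
    history = [(S, X, R)]
    for _ in range(steps):
        new_infections = infection_rate * X * S / population
        new_recoveries = recovery_rate * X
        S -= new_infections
        X += new_infections - new_recoveries  # cf. X(t + 1) = Xt + bXt in Section 2.5
        R += new_recoveries
        history.append((S, X, R))
    return history

history = simulate_outbreak()
```

Mentally generating such effects over time, as the participants had to, amounts to tracing these update rules step by step; each correctly induced term (infection term, recovery term, conservation constraint) would count as one model component under the scoring described in Section 2.5.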
3. Results

3.1. Generality vs. task specificity of metacognitive skillfulness

A principal component analysis (PCA) was performed on the metacognitive skillfulness scores for both types of tasks. In order to allow for the extraction of domain- or task-specific components, at least two measures of metacognitive skillfulness were available for each task type. PCA yielded a first component with an eigenvalue of 3.33 and a variance proportion of .83. The second component extracted had an eigenvalue of 0.48 with a variance proportion of .12. Factor loadings for both components are depicted in Table 1. All measures of metacognitive skillfulness loaded heavily on the first component. Consequently, this component may be regarded to represent general metacognitive skillfulness across both tasks. The second component contrasted metacognitive skillfulness for the osmosis problem with metacognition measures for Deton Lab. The low eigenvalue, variance proportion, and factor loadings, however, indicated that this component was rather insignificant. Subsequently, the metacognition scores for Deton Lab were aggregated, as well as the metacognition scores for the model construction tasks.

Table 1
Unrotated component matrix for metacognitive skillfulness

                                Component 1   Component 2
Meta-D (first segment)              .93          −.33
Meta-D (second segment)             .94          −.30
Meta-MC (contagious disease)        .93           .19
Meta-MC (osmosis)                   .85           .49

Meta-D = metacognitive skillfulness for Deton Lab; Meta-MC = metacognitive skillfulness for the model construction tasks.
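The PCA procedure of Section 3.1 can be reproduced in outline. The sketch below runs a principal component analysis on four metacognition measures via an eigendecomposition of their correlation matrix; the scores are randomly generated stand-ins (the study's raw data are not published), so only the procedure, not the exact eigenvalues of Table 1, is mirrored.

```python
import numpy as np

# Sketch of the PCA of Section 3.1. The scores below are random stand-ins for
# the four metacognition measures (two Deton Lab segments, two model
# construction tasks); the study's raw data are not published.
rng = np.random.default_rng(0)
general = rng.normal(size=16)                  # shared "general skill" factor
scores = np.column_stack([general + 0.4 * rng.normal(size=16)
                          for _ in range(4)])  # 16 participants x 4 measures

R = np.corrcoef(scores, rowvar=False)          # 4 x 4 correlation matrix
eigvals, eigvecs = np.linalg.eigh(R)           # eigh returns ascending order
order = np.argsort(eigvals)[::-1]              # sort components by eigenvalue
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

variance_proportions = eigvals / eigvals.sum()
loadings = eigvecs * np.sqrt(eigvals)          # unrotated component loadings

# With one strong shared factor, the first component dominates, as in Table 1.
print(np.round(variance_proportions, 2))
print(np.round(loadings[:, 0], 2))
```

A single dominant component with uniformly high loadings, as produced here, is exactly the pattern the authors interpret as general metacognitive skillfulness.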
Table 2
Correlations among intellectual ability, metacognitive skillfulness, task performance, and study success

           Meta-D   Meta-MC   Post-D   Post-MC   Grades   Credits
Meta-MC     .77†
Post-D      .81†     .77†
Post-MC     .52*     .81†      .68†
Grades      .65†     .80†      .60†     .68†
Credits     .59†     .66†      .68†     .57†      .86†
IA          .22      .28       .18      .31       .51*     .46*

Meta-D = metacognitive skillfulness for Deton Lab; Meta-MC = metacognitive skillfulness for the model construction tasks; Post-D = task performance for Deton Lab; Post-MC = task performance for the model construction tasks; Grades = exam grades; Credits = study credits; IA = intellectual ability.
† P < .01.
* P < .05.
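The semipartial correlations reported in Section 3.3 follow directly from the zero-order correlations in Table 2. A short sketch of the computation, using the standard part-correlation formula (cf. Nunnally, 1967), recovers the reported values:

```python
from math import sqrt

def semipartial(r_xy, r_xz, r_yz):
    """Part (semipartial) correlation of y with x, partialling z out of x
    only: sr = (r_xy - r_yz * r_xz) / sqrt(1 - r_xz**2)."""
    return (r_xy - r_yz * r_xz) / sqrt(1 - r_xz ** 2)

# Zero-order correlations from Table 2 (x = Meta-D, z = IA).
sr_postD = semipartial(r_xy=.81, r_xz=.22, r_yz=.18)   # y = Post-D
sr_grades = semipartial(r_xy=.65, r_xz=.22, r_yz=.51)  # y = exam grades

print(round(sr_postD, 2), round(sr_grades, 2))  # .79 and .55, as reported in the text
```

Because the partialled values barely drop below the zero-order ones (.81 to .79, .65 to .55), intellectual ability removes little of the metacognition-performance relation, which is the paper's central point.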
3.2. Descriptive statistics

The following descriptives were obtained: metacognitive skillfulness in Deton Lab: mean = 1.78, S.D. = 1.11; metacognitive skillfulness of the model construction tasks: mean = 1.38, S.D. = 0.84; performance on the posttest of Deton Lab: mean = 25.94, S.D. = 5.45; performance on the model construction tasks: mean = 7.62, S.D. = 3.69; exam grades: mean = 6.72, S.D. = 1.10; study credits: mean = 32.69, S.D. = 10.82. Scores on intellectual ability were standardized.

3.3. Correlational analyses

Correlations were calculated among intellectual ability, metacognitive skillfulness, task performance, and study success (see Table 2). Correlations of intellectual ability with metacognitive skillfulness and with task performance were rather low for both Deton Lab and the model construction tasks. In order to determine the independent contribution of metacognitive skillfulness to performance, intellectual ability was partialled from the correlations between metacognitive skillfulness and task performance. Partialling out intellectual ability, however, did not substantially reduce these correlations. Semipartial correlations (Nunnally, 1967) were .79 (P < .01) for Deton Lab and .75 (P < .01) for the model construction tasks. Intellectual ability correlated significantly with both exam grades and study credits (see Table 2). Furthermore, both measures of study success were significantly related to metacognitive skillfulness.1 Partialling intellectual ability from the correlations between
1. Correlations were also calculated on factor scores obtained for both metacognition components. Correlations of scores on the first, general metacognition component with intellectual ability and learning performance were similar to those of the raw metacognition scores. Correlations of scores on the second component with intellectual ability and learning performance were low.
Table 3
Proportions of variance accounted for in study success and task performance

                   IA unique   Meta unique   IA + Meta shared   Total
Meta-D
  Exam grades         .14          .30             .12           .56
  Study credits       .12          .25             .09           .46
  Post-D              .00          .62             .04           .66
Meta-MC
  Exam grades         .09          .46             .17           .72
  Study credits       .08          .30             .13           .51
  Post-MC             .01          .56             .09           .66

Meta-D = metacognitive skillfulness for Deton Lab; Meta-MC = metacognitive skillfulness for the model construction tasks; Post-D = task performance for Deton Lab; Post-MC = task performance for the model construction tasks; IA unique = variance proportion accounted for uniquely by intellectual ability; Meta unique = variance proportion accounted for uniquely by metacognitive skillfulness; IA + Meta shared = variance proportion shared by intellectual ability and metacognitive skillfulness; Total = accumulated proportion of variance accounted for by intellectual ability and metacognitive skillfulness.
metacognitive skillfulness in Deton Lab and study success resulted in semipartial correlations of .55 (P < .01) for exam grades and .50 (P < .05) for study credits. Similarly, semipartial correlations associated with metacognitive skillfulness of the model construction tasks were .68 (P < .01) for exam grades and .55 (P < .01) for study credits.

A regression analytic technique was used to distribute the variance in both measures of study success to various sources (Pedhazur, 1982; Veenman, 1993). For instance, the squared multiple correlation of intellectual ability and metacognitive skillfulness in Deton Lab (meta-D) predicting exam grades was calculated from the correlations presented above (R² = .56). Taken together, intellectual ability and meta-D accounted for 56% of the variance in exam grades. Both the semipartial correlation between meta-D and exam grades with intellectual ability partialled from meta-D (.55, see above) and the semipartial correlation between intellectual ability and exam grades with meta-D partialled from intellectual ability (.37, P < .05) were calculated. Consequently, it was estimated that 14% of the total variance in exam grades was accounted for uniquely by intellectual ability, 30% was accounted for uniquely by meta-D, whereas 12% of the variance was shared by intellectual ability and meta-D (see Table 3). Finally, 44% of the total variance in exam grades was not accounted for by either of the two predictors. The same procedure was repeated for intellectual ability and meta-D as predictors of study credits and for intellectual ability and metacognitive skillfulness for the model construction task as predictors of either exam grades or study credits. Results are depicted in Table 3.
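The variance partitioning described above can be reproduced from the zero-order correlations of Table 2 alone: unique contributions are squared semipartial correlations, and the shared component is what remains of the squared multiple correlation. A sketch for the exam-grades/meta-D case:

```python
from math import sqrt

def decompose(r_y1, r_y2, r_12):
    """Partition R^2 for two correlated predictors (1 and 2) of criterion y
    into unique and shared variance components (cf. Pedhazur, 1982)."""
    r2 = (r_y1 ** 2 + r_y2 ** 2 - 2 * r_y1 * r_y2 * r_12) / (1 - r_12 ** 2)
    unique1 = ((r_y1 - r_y2 * r_12) / sqrt(1 - r_12 ** 2)) ** 2  # squared semipartial
    unique2 = ((r_y2 - r_y1 * r_12) / sqrt(1 - r_12 ** 2)) ** 2
    shared = r2 - unique1 - unique2
    return r2, unique1, unique2, shared

# Zero-order correlations from Table 2: IA-grades, meta-D-grades, IA-meta-D.
r2, ia_unique, meta_unique, shared = decompose(.51, .65, .22)
# Rounded, this reproduces Table 3's first row: .56 total, .14 IA unique,
# .30 meta-D unique, .12 shared.
```

Note that because meta-D and IA are only weakly correlated (.22), most of the explained variance separates cleanly into the two unique components.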
4. Discussion

Metacognitive skillfulness on both tasks appeared to be a strong predictor of both task performance and study success. Moreover, PCA showed that metacognitive skillfulness was
a general, person-related characteristic, rather than a domain- or task-specific feature. Therefore, the general nature of metacognitive skillfulness might be extended from surpassing domains to surpassing different task formats as well. The relevance of this general metacognition component was further emphasized by its predictive value for study success. This conclusion on the generality of metacognitive skills is in line with the results of Schraw, Dunkle, Bendixen, and Roedel (1995), Schraw and Nietfeld (1998), and Veenman, Elshout et al. (1997) obtained from nontechnical students performing a variety of tasks.

Surprisingly, intellectual ability correlated weakly with both task performance in Deton Lab and performance on the model construction tasks. Due to these low correlations, one cannot decide between the three models representing alternative relations of intellectual and metacognitive skills. All models assume at least a moderate correlation between intellectual ability and task performance. The low correlations were not due to restriction of range in the intelligence scores of technical students. Although the technical students significantly outperformed the psychology students from the Veenman, Elshout et al. (1997) study on five out of six intelligence tests, the standard deviations of their test scores were comparable to those of the psychology student population. Perhaps technical students were not entirely novices with respect to the mathematical reasoning aspects of task performance on these inductive learning tasks. Math had been a relevant part of their secondary school education. Both Elshout (1987) and Raaheim (1988) postulated that the impact of intelligence on performance declines with increasing expertise or familiarity with the task. Indeed, Veenman and Elshout (1999) obtained much lower correlations between intellectual ability and task performance for advanced participants in physics relative to novices.
In the same vein, prior math knowledge might have reduced the impact of intellectual ability on task performance in the present study, despite the knowledge-leanness of Deton Lab and the complexity of the model construction tasks. Significant correlations, on the other hand, were obtained between intellectual ability and study success, much in line with the results of Minnaert and Janssen (1999).

Here, the low magnitude of the correlations between intellectual ability and metacognitive skillfulness allowed for rejection of the intelligence model. Moreover, the unique contribution of metacognitive skillfulness to study success equaled or exceeded the total variance accounted for by intellectual ability, that is, the unique contribution of intellectual ability plus the variance shared by both predictors of study success. The latter two variance components were less influential, but they contributed to study success nevertheless.

On deciding between the independency and the mixed model, one might argue that the glass is either half full or half empty. Indeed, the correlations between intellectual ability and metacognitive skillfulness were not statistically significant. Admittedly, the limited number of participants, a consequence of the time-consuming method of protocol analysis, may have reduced statistical power. The correlations, however, did not significantly deviate (Fisher z ratios ≤ .57) from the correlation of .37 obtained by Veenman (1999) for 184 novices working in different domains and on different task formats. Furthermore, the variance shared by both predictors was on a par with the unique contribution of intellectual ability. Consequently, the present results on study success might be regarded as supportive of the mixed model rather than the independency model. Using another intelligence test, one suited to assessing higher levels of cognitive ability, might further corroborate the mixed model.
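Two steps in the statistical argument above can be made concrete in code: the Fisher r-to-z test used to compare correlations across samples, and the partitioning of explained variance into unique and shared components. The sketch below is illustrative only, not the authors' analysis: the value r = .20 and the simulated data are hypothetical, while n = 184 and r = .37 come from the Veenman (1999) comparison mentioned in the text.

```python
import math
import numpy as np

def fisher_z_ratio(r1, n1, r2, n2):
    """z ratio for the difference between two independent correlations."""
    diff = math.atanh(r1) - math.atanh(r2)        # Fisher r-to-z transforms
    se = math.sqrt(1 / (n1 - 3) + 1 / (n2 - 3))   # SE of the z difference
    return diff / se

# A hypothetical r = .20 with n = 16 against r = .37 with n = 184
# (Veenman, 1999): |z| < 1.96 means no significant difference at alpha = .05.
z = fisher_z_ratio(0.20, 16, 0.37, 184)

def unique_and_shared_r2(y, x1, x2):
    """Partition the R^2 of a two-predictor regression into the unique
    contribution of each predictor and the variance they share."""
    def r2(*xs):
        X = np.column_stack((np.ones(len(y)),) + xs)
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        total = y - y.mean()
        return 1 - (resid @ resid) / (total @ total)

    full = r2(x1, x2)
    u1 = full - r2(x2)   # unique to x1: drop in R^2 when x1 is removed
    u2 = full - r2(x1)   # unique to x2
    return u1, u2, full - u1 - u2
```

The decomposition mirrors the commonality logic of the paragraph: the unique part of each predictor is the drop in R² when that predictor is removed from the regression, and whatever remains of the full R² is the variance the two predictors share.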
Results support the notion that metacognitive skillfulness should be defined as partly independent of, or even superordinate to, cognitive processes (cf. Nelson, 1999). To identify (technical) students at risk, the quality of their metacognitive skillfulness, deployed during task performance, should be regarded as a relevant student characteristic. The general nature of metacognitive skillfulness allows for its assessment during performance on a knowledge-lean task, such as Deton Lab, which is not even part of the student's curriculum. Remarkably, metacognitive skillfulness on such an extraneous task appears to be a relevant predictor of study success.

These findings may encourage the development of diagnostic instruments for the assessment of metacognitive skills. For instance, in order to circumvent the laborious analysis of thinking-aloud protocols, we have recently developed a procedure for scoring metacognitive skillfulness from computer log files of Deton Lab. These log files contain information about the sequence of student activities in Deton Lab, which can be scored by computer on the completeness of experimentation, the adequacy of sequencing activities, and the avoidance of omissions. Preliminary results show 87% convergence between metacognition scores obtained through log-file and protocol analyses. More research, though, is required to fully validate such a log-file measure, in particular for anticipating study problems and identifying (technical) students at risk on a larger scale.

Another implication of the present research results concerns the enhancement of metacognitive skills in students at risk, once identified. Results show that metacognitive skillfulness is not entirely part of intellectual ability. The unique contribution of metacognitive skillfulness to study success may indicate that metacognition is a malleable rather than a fixed student characteristic.
Indeed, it has been established that metacognitive skill training programs may benefit students of both high and low intelligence (Alexander et al., 1995; Brown & Palincsar, 1989; Campione, Brown, & Ferrara, 1982; Veenman, Elshout et al., 1994). Therefore, the mixed model may offer new prospects for remedying the study problems of students at risk.
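The log-file scoring procedure described above can be illustrated with a toy scorer. The actual Deton Lab log format and scoring rules are not published here, so every activity name, variable name, and rule below is a hypothetical stand-in for the three criteria named in the text (completeness of experimentation, adequacy of sequencing, and avoidance of omissions):

```python
# Hypothetical log vocabulary: "vary:<variable>" marks an experiment
# manipulating that variable; other entries are regulation activities.
VARIABLES = {"temperature", "pressure", "catalyst"}
REQUIRED = ["read_instructions", "state_hypothesis", "draw_conclusion"]

def score_log(activities):
    """Return three sub-scores in [0, 1] mirroring the criteria in the
    text: completeness of experimentation, adequacy of sequencing,
    and avoidance of omissions."""
    # Completeness: proportion of variables manipulated at least once.
    varied = {a.split(":", 1)[1] for a in activities if a.startswith("vary:")}
    completeness = len(varied & VARIABLES) / len(VARIABLES)

    # Sequencing: required regulation steps that do occur should appear
    # in their canonical order.
    idx = [activities.index(s) for s in REQUIRED if s in activities]
    pairs = max(len(idx) - 1, 1)
    sequencing = sum(a < b for a, b in zip(idx, idx[1:])) / pairs

    # Omission avoidance: proportion of required steps performed at all.
    omissions_avoided = sum(s in activities for s in REQUIRED) / len(REQUIRED)

    return completeness, sequencing, omissions_avoided

# A short hypothetical session: two of three variables varied, all
# required steps present and in order.
log = ["read_instructions", "state_hypothesis",
       "vary:temperature", "vary:pressure", "draw_conclusion"]
completeness, sequencing, omissions_avoided = score_log(log)
```

A validated instrument would of course need empirically grounded rules and weights; the point of the sketch is only that such log sequences can be scored mechanically, without protocol analysis.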
References

Alexander, J. M., Carr, M., & Schwanenflugel, P. J. (1995). Development of metacognition in gifted children: directions for future research. Developmental Review, 15, 1–37.
Allon, M., Gutkin, T. B., & Bruning, R. (1994). The relationship between metacognition and intelligence in normal adolescents: some tentative but surprising findings. Psychology in the Schools, 31, 93–97.
Baker, L. (1994). Fostering metacognitive development. In H. W. Reese (Ed.), Advances in child development and behavior, vol. 25 (pp. 201–239). San Diego: Academic Press.
Berger, R. S., & Reid, D. K. (1989). Differences that make a difference: comparisons of metacomponential functioning and knowledge base among groups of high and low IQ learning disabled, mildly mentally retarded, and normally achieving adults. Journal of Learning Disabilities, 22, 422–429.
Brody, E. B. (1992). Intelligence: nature, determinants, and consequences (2nd ed.). New York: Academic Press.
Brown, A. L. (1978). Knowing when, where, and how to remember: a problem of metacognition. In R. Glaser (Ed.), Advances in instructional psychology, vol. 1 (pp. 77–165). Hillsdale, NJ: Erlbaum.
Brown, A. L., & Palincsar, A. S. (1989). Guided, cooperative learning and individual knowledge acquisition. In L. B. Resnick (Ed.), Knowing, learning, and instruction: essays in honor of Robert Glaser (pp. 393–451). Hillsdale, NJ: Erlbaum.
Campione, J. C., Brown, A. L., & Ferrara, R. A. (1982). Mental retardation and intelligence. In R. J. Sternberg (Ed.), Handbook of human intelligence (pp. 392–490). Cambridge: Cambridge University Press.
Carroll, J. B. (1993). Human cognitive abilities: a survey of factor-analytic studies. Cambridge: Cambridge University Press.
Davidson, J. E., Deuser, R., & Sternberg, R. J. (1994). The role of metacognition in problem solving. In J. Metcalfe, & A. P. Shimamura (Eds.), Metacognition (pp. 207–226). Cambridge: MIT Press.
De Jong, F. P. C. M. (1992). Zelfstandig leren. Regulatie van het leerproces en leren reguleren: een procesbenadering [Independent learning. Regulation of the learning process and learning to regulate: a process approach]. Dissertation, Katholieke Universiteit Brabant, Tilburg.
Elshout, J. J. (1983). Is measuring intelligence still useful? In S. B. Anderson, & J. S. Helmick (Eds.), On educational testing (pp. 45–56). San Francisco: Jossey-Bass.
Elshout, J. J. (1987). Problem solving and education. In E. de Corte, H. Lodewijks, R. Parmentier, & P. Span (Eds.), Learning and instruction (pp. 259–273). Oxford: Pergamon/Leuven: University Press.
Elshout, J. J., & Veenman, M. V. J. (1992). Relation between intellectual ability and working method as predictors of learning. Journal of Educational Research, 85, 134–143.
Flavell, J. H. (1979). Metacognition and cognitive monitoring: a new area of cognitive–developmental inquiry. American Psychologist, 34, 906–911.
Glaser, R. (1990). The reemergence of learning theory within instructional research. American Psychologist, 45, 29–39.
Glaser, R., Schauble, L., Raghavan, K., & Zeitz, C. (1992). Scientific reasoning across different domains. In E. de Corte, M. C. Linn, H. Mandl, & L. Verschaffel (Eds.), Computer-based learning environments and problem solving. NATO ASI Series F, vol. 84 (pp. 345–371). Heidelberg: Springer-Verlag.
Humphreys, L. G. (1968). The fleeting nature of the prediction of college academic success. Journal of Educational Psychology, 59, 375–380.
Humphreys, L. G. (1989). Intelligence: three kinds of instability and their consequences for policy. In R. L. Linn (Ed.), Intelligence (pp. 193–216). Urbana, IL: University of Illinois Press.
Kuhn, D. (1999). Metacognitive development. In L. Balter, & C. S. Tamis-LeMonda (Eds.), Child psychology: a handbook of contemporary issues (pp. 259–286). Philadelphia, PA: Psychology Press.
Maqsud, M. (1997). Effects of metacognitive skills and nonverbal ability on academic achievement of high school pupils. Educational Psychology, 17, 387–397.
Minnaert, A., & Janssen, P. J. (1999). The additive effect of regulatory activities on top of intelligence in relation to academic performance in higher education. Learning and Instruction, 9, 77–91.
Nelson, T. O. (1999). Cognition versus metacognition. In R. J. Sternberg (Ed.), The nature of cognition (pp. 625–641). Cambridge, MA: MIT Press.
Nunnally, J. C. (1967). Psychometric theory. New York: McGraw-Hill.
Pedhazur, E. J. (1982). Multiple regression in behavioral research (2nd ed.). New York: Holt, Rinehart and Winston.
Raaheim, K. (1988). Intelligence and task novelty. In R. J. Sternberg (Ed.), Advances in the psychology of human intelligence, vol. 4 (pp. 73–97). Hillsdale: Erlbaum.
Schraw, G., Dunkle, M. E., Bendixen, L. D., & Roedel, T. D. (1995). Does a general monitoring skill exist? Journal of Educational Psychology, 87, 433–444.
Schraw, G., & Moshman, D. (1995). Metacognitive theories. Educational Psychology Review, 7, 351–371.
Schraw, G., & Nietfeld, J. (1998). A further test of the general monitoring skill hypothesis. Journal of Educational Psychology, 90, 236–248.
Shore, B. M., & Dover, A. C. (1987). Metacognition, intelligence and giftedness. Gifted Child Quarterly, 31, 37–39.
Slife, B. D., Weiss, J., & Bell, T. (1985). Separability of metacognition and cognition: problem solving in learning disabled and regular students. Journal of Educational Psychology, 77, 437–445.
Snow, R. E. (1989). Aptitude–treatment interaction as a framework for research on individual differences in learning. In P. L. Ackerman, R. J. Sternberg, & R. Glaser (Eds.), Learning and individual differences (pp. 13–59). New York: Freeman.
Snow, R. E., & Lohman, D. F. (1984). Toward a theory of cognitive aptitude for learning from instruction. Journal of Educational Psychology, 76, 347–376.
Span, P., & Overtoom-Corsmit, R. (1986). Information processing by intellectually gifted pupils solving mathematical problems. Educational Studies in Mathematics, 17, 273–295.
Sternberg, R. J. (1990). Metaphors of the mind: conceptions of the nature of intelligence. Cambridge: Cambridge University Press.
Sternberg, R. J. (1996). Intelligence: knowns and unknowns. American Psychologist, 51, 77–101.
Swanson, H. L. (1990). Influence of metacognitive knowledge and aptitude on problem solving. Journal of Educational Psychology, 82, 306–314.
Taconis, R., Ferguson-Hessler, M., & Verkerk, G. (1997). Physics problem solving and the transition from general secondary education to higher education. Tijdschrift voor Onderwijsresearch, 22, 123–144.
Van der Heijden, M. K. (1989). Veranderingsdiagnostiek bij handig rekenen [Diagnostics of change in efficient arithmetic]. Tijdschrift voor Onderwijsresearch, 14, 103–110.
Veenman, M. V. J. (1993). Intellectual ability and metacognitive skill: determinants of discovery learning in computerized learning environments. Amsterdam: University of Amsterdam.
Veenman, M. V. J. (1999). It takes two to tango: over het samenspel van intellectuele en metacognitieve vaardigheden [It takes two to tango: on the interplay of intellectual and metacognitive skills]. In R. Hamel, M. Elshout-Mohr, & M. Milikowski (Eds.), Meesterschap: zestien stukken over intelligentie, leren, denken en probleemoplossen voor Jan J. Elshout [Mastery: sixteen papers on intelligence, learning, thinking and problem solving in honour of Jan J. Elshout] (pp. 187–196). Amsterdam: Vossiuspers AUP.
Veenman, M. V. J., Beishuizen, J. J., & Niewold, P. (1997). Intellectual ability and metacognitive skills during text studying. In S. Vosniadou, E. Matsagouras, K. Maridaki-Kassotaki, & S. Kotsanis (Eds.), 7th European Conference for Research on Learning and Instruction (pp. 145–146). Athens: Gutenberg University Publications.
Veenman, M. V. J., & Elshout, J. J. (1991). Intellectual ability and working method as predictors of novice learning. Learning and Instruction, 1, 303–317.
Veenman, M. V. J., & Elshout, J. J. (1995). Differential effects of instructional support on learning in simulation environments. Instructional Science, 22, 363–383.
Veenman, M. V. J., & Elshout, J. J. (1999). Changes in the relation between cognitive and metacognitive skills during the acquisition of expertise. European Journal of Psychology of Education, 14, 509–523.
Veenman, M. V. J., Elshout, J. J., & Busato, V. V. (1994). Metacognitive mediation in learning with computer-based simulations. Computers in Human Behavior, 10, 93–106.
Veenman, M. V. J., Elshout, J. J., & Meijer, J. (1997). The generality vs. domain-specificity of metacognitive skills in novice learning across domains. Learning and Instruction, 7, 187–209.
Wang, M. C., Haertel, G. D., & Walberg, H. J. (1990). What influences learning? A content analysis of review literature. Journal of Educational Research, 84, 30–43.
Zimmerman, B. J., & Martinez-Pons, M. (1990). Student differences in self-regulated learning: relating grade, sex, and giftedness to self-efficacy and strategy use. Journal of Educational Psychology, 82, 51–59.