Learning and Individual Differences 42 (2015) 36–43
Learning strategies explaining differences in reading proficiency. Findings of Nordic and Baltic countries in PISA 2009

Ülle Säälik a,⁎, Kari Nissinen b, Antero Malin b
a Faculty of Social Sciences and Education, University of Tartu, Salme 1a, Tartu 50103, Estonia
b Finnish Institute for Educational Research, University of Jyväskylä, PO Box 35, FI-40014, Finland
⁎ Corresponding author. E-mail addresses: [email protected] (Ü. Säälik), kari.nissinen@jyu.fi (K. Nissinen), antero.malin@jyu.fi (A. Malin).
http://dx.doi.org/10.1016/j.lindif.2015.08.025
Article history: Received 4 January 2015; received in revised form 15 June 2015; accepted 16 August 2015

Keywords: Learning strategies; Reading literacy; Multilevel modeling; Individual differences; Group effect
Abstract

Certain metacognitive learning strategies improve learning results significantly, and students can be trained to use them to achieve a higher level of proficiency in different academic domains, including reading. The current study aimed to discover how student awareness and use of learning strategies explain differences in reading literacy test results, using PISA (the Program for International Student Assessment) 2009 data from three Nordic and three Baltic countries. The student-level differences appeared to be partly due to differences between schools, more so in the Baltic countries than in the Nordic countries, which is a concern for countries that proclaim the equality of compulsory education. Student awareness of metacognitive learning strategies showed the highest explanatory power at both the student and school levels. In the Baltic countries students reported using traditional ways of learning such as memorization more often, but this had no explanatory power. © 2015 Elsevier Inc. All rights reserved.
1. Introduction

How students learn is obviously related to their academic achievement. Contemporary large-scale educational surveys offer a variety of representative data for investigating how different influencing factors work. There are several reading literacy surveys, such as PIRLS (Progress in International Reading Literacy Study) and PISA (Program for International Student Assessment), but only PISA 2009 included variables describing students' awareness and choice of learning strategies. The current paper considers how students' reading proficiency is explained by their awareness of different learning strategies and their ability to recognize the useful ones. With multilevel modeling, both the student and school levels were examined to detect whether the variation in students' reading comprehension depends on differences between the schools they attend. If these individual differences are even partly due to a group effect (schools causing the differentiation), the situation requires educators' attention, as the standpoint of many countries concerning equal educational opportunities would be compromised. Educational equity, understood as equal possibilities for obtaining comprehensive education, has been and will remain one of the main goals in education, mentioned in several official documents in Finland, Estonia and Sweden (Brink, Nissinen, & Vettenranta, 2013; Education and Research 2011–2016, 2012; Eesti tegevuskava “Haridus kõigile”, 2004; Haridus- ja Teadusministeeriumi…, 2014; The Swedish National Agency for Education, 2009).

Not only are schools within one country often expected to be similar; in many aspects of everyday life neighboring countries also tend to show similarities, whether in education or in the beliefs and attitudes prevailing in society. To learn from the best practices, well-performing Finland was chosen for the analysis. To widen the analysis and to see whether any effect patterns appear, five more countries were included. The two other Nordic countries were expected to show traits similar to Finland, given their common idea of the state's duty to provide equality of opportunity in education, as “the Nordic countries have generally followed the same course in different tempos with Sweden being the main source of inspiration” (Telhaug, Medias, & Aasen, 2006, p. 245). The three Baltic countries are close neighbors of Finland and Sweden, and during their two decades of regained independence there has been quite close cooperation in education and sharing of practices, which might have assimilated them to some extent. The Baltic countries might be expected to differ from the Nordic countries, being geographically and historically separated from them, but they do not necessarily form an opposite pole.
1.1. Reading literacy and reading comprehension

There are many different fields and subjects in compulsory education, but reading literacy is considered an essential skill for coping with later life, whether in further studies, in working life or simply as an active citizen (Linnakylä, Välijärvi, & Arffman, 2007). Therefore, how students perform in reading deserves attention, in order to discover shortcomings as well as enhancing factors.
Reading literacy can be defined narrowly as reading acquisition, that is, learning to read, whereas the broader notion of reading comprehension refers to how a person manages with printed or written information (Perfetti & Marron, 1998). “Adolescent literacy is understood as the ability to read, write, understand and interpret, and discuss multiple texts across multiple contexts” (International Reading Association, 2012). In the PISA 2009 study, reading literacy was defined as the student's ability to understand, use and reflect on written texts; students demonstrated their proficiency in accessing and retrieving information, forming a general understanding of the text, interpreting it and reflecting on its contents (OECD, 2010a).
1.2. Learning strategies, metacognition and their effect on achievement

In the current study, the student's awareness of learning strategies and ability to choose the best strategy for a certain task are in the spotlight, as successful students are said to be able to generate strategies that meet their learning needs and then to implement the useful ones (Hacker, Dunlosky, & Graesser, 2009). Learning strategies are defined as a set of one or more procedures that an individual acquires to facilitate performance on a learning task (Riding & Rayner, 2000, p. 80). First, the learner becomes aware of strategies and decision-making processes; strategic use appears when the learner is able to select the best strategy for a given task (Fisher & Williams, 2002; Jones, 2007). Metacognitive learning skills appear to be strong predictors of academic performance (Delclos & Harrington, 1991; Pennequin, Sorel, Nanty, & Fontaine, 2010); they have a highly positive effect on learning results in different academic domains, including reading (van der Stel & Veenman, 2008, 2010). Metacognitive skillfulness can be developed through training at quite a young age, partly independently of the student's intellectual ability, compensating somewhat for low ability and insufficient prior knowledge (Jones, 2007; Roebers, Krebs, & Roderer, 2014; van der Stel & Veenman, 2008, 2010). When teachers continuously taught both cognitive and metacognitive strategies, encouraging reasoning about learning and thinking aloud, this was successful and helped everyone become a more fluent reader (Steklàcs, 2010). Young students can thus be trained to recognize and use metacognitive learning strategies, which can help them overcome academic difficulties; teaching such metacognitive skills should be provided to all pupils, and especially consistently to those struggling with learning.

1.3. Reading literacy and learning strategies in PISA studies

The PISA study is designed to monitor performance in reading, mathematics and science among 15-year-olds all over the world. In 2009, reading literacy was the main assessment component, and the study included questions on student awareness and preferred use of different reading or learning strategies (OECD, 2010a, 2010b). The students were given a reading task and a choice of options for how to approach it, and they had to decide how useful each approach would be for that particular task. The options had been tested and analyzed earlier, and experts had determined the effectiveness of each option. Indices describing students' awareness of metacognitive learning strategies, as well as their probable use of other learning strategies, were calculated (OECD, 2012, p. 282). These data about students' awareness and choice of learning skills give us an opportunity to analyze the effect of those skills on reading test results.

The analysis of the PISA 2009 Estonian data showed that low-performing students tend to report a lower awareness level and less probable use of metacognitive skills, preferring more traditional methods such as memorizing or control strategies (Mikk, Kitsing, Must, Säälik, & Täht, 2012). Almost one third of the between-school variation in student reading scores in Estonia was explained by students' awareness of useful metacognitive strategies (Säälik, Malin, & Nissinen, 2013). In most PISA countries, a great share of the variation in student performance has been attributable to differences between schools (Malin, 2005; OECD, 2010c), and hence this might be the case in the Baltic and Nordic countries as well. When a systematic variation in academic performance is discovered, it helps to track the “sources” of this variation (Malin, 2005, p. 22). If differences between schools appear to be explained by students' preference for certain learning strategies, we can assume that this indicates a school contribution to students' learning habits, as a result of teaching practices in schools.

1.4. Learning environment and background factors

The influence of the surroundings plays a role in students' success. Among the many important contributors to student academic performance, school climate and classroom climate have been pointed out (Scheerens, Glas, & Thomas, 2007; Wang, Haertel, & Walberg, 1994). Poor classroom discipline has been found to be related to low achievement (Lee & Bryk, 1989; Ma & Willms, 2004), and a disruptive disciplinary climate and poor teacher-student relations have been found to be related to struggling readers (Garbe, Holle, Weinhold, Meyer–Hamme, & Barton, 2010; OECD, 2010b). In this paper the effect of the environment was not the main focus, but the presence of such indirect influence was acknowledged. The student, with his or her results, abilities and skills, is no longer seen as an isolated item in education but rather in a wider context; therefore, several influencing factors should be combined in the analyses. Educationally disadvantaged students are seen as a risk for the future labor market and for society in general (Brink et al., 2013). Struggling readers tend to have certain background characteristics, such as male gender and low socio-economic status (Garbe et al., 2010; OECD, 2010b). Yet low socio-economic status does not necessarily result in poor performance (OECD, 2010a); as Marks (2010) found, the effect of socio-economic status disappeared when the school's academic context was taken into account. Estonia, Finland and Norway showed a weaker relationship between socio-economic background and performance in PISA 2009 (OECD, 2010c). Although the PISA tests and questionnaires are translated into all the main languages spoken in a country, schools or regions of minority languages often show lower results. In the PISA 2009 study in Estonia, students in Russian-speaking schools performed at a statistically significantly lower level in reading compared to their peers in Estonian-speaking schools (Mikk et al., 2012). The school language factor explained about 12% of the between-school variance in reading literacy performance in PISA 2009 in Estonia (Säälik et al., 2013, p. 75). In Finland, where Swedish is the second official language, protected by laws and regulations, students in the minority Swedish-speaking schools still obtained lower reading literacy scores, and there were more disadvantaged students unable to reach the average performance level in Swedish-speaking schools (Brink et al., 2013; Harju-Luukkainen & Nissinen, 2011). Several reading literacy assessments have confirmed an existing gender gap in favor of girls (Mullis, Martin, Foy, & Drucker, 2012; OECD, 2010b, 2011). The OECD analyses implied that learning strategies tend to mediate the gender gap in reading performance: if boys had the same level of awareness of metacognitive skills as girls, their results could improve by 15 points (OECD, 2010b). When the gender variable is included in the multilevel model together with the learning strategies, the joint part of the effect of those variables is eliminated and the specific effect of each appears. Although the current study is focused on learning strategies, the background factors should not be ignored; they were included in the analysis to show the proportion of variation they explain relative to the learning strategies.

The aim of the current paper was to discover how student awareness and use of learning strategies explain the variation of reading literacy test scores at the school and student levels in three Nordic and three Baltic countries, when background and learning environment factors are controlled for.
2. Method
2.1. Sample

In the PISA 2009 study, a two-stage stratified sample design was used. In the first stage, individual schools with 15-year-old students were sampled from a comprehensive national list of schools, with probabilities proportional to a measure of size — the number of PISA-eligible 15-year-old students enrolled in the school. In the second stage, students were randomly sampled within each sampled school (OECD, 2012). The national sample sizes of the considered countries are given in Table 1.

Table 1
Numbers of participating schools and students in PISA 2009.

Country      N of schools   N of students
Estonia      175            4727
Latvia       184            4502
Lithuania    196            4528
Finland      203            5810
Sweden       189            4567
Norway       197            4660

2.2. Data — test scores and indices

As a part of the PISA study, each student filled in a pencil-and-paper reading literacy test booklet with simple or complex multiple-choice and closed or open constructed-response tasks. After the test the students answered a questionnaire about their personal background, their learning habits and their choice of learning strategies (OECD, 2010a).

The test scores in the PISA data have been derived from student responses using item-response methodology. In determining the scores, PISA uses the methodology known as plausible values, reported to possess favorable properties for estimating population-level statistics (OECD, 2009). The plausible values calculated for students take the role of a 'test score' and are used as response variables in statistical analyses. The scale of the plausible values was originally set in the first PISA study in 2000 to have an international (OECD) average of 500 and a standard deviation of 100 (OECD, 2010a, 2012).

The current analyses were carried out with plausible value 1 of the PISA 2009 reading literacy score. In PISA five plausible values are computed, but since the plausible values of a student are highly correlated (in PISA 2009 the correlations of the reading literacy plausible values among OECD countries were all above 0.92), an approximate analysis can be obtained by selecting just one of the five plausible values and performing the analysis with it. It can be seen in Table 2 that the national plausible value 1 means and standard deviations are very similar to the official country means and standard deviations. Thus, the simpler approach with one plausible value is justified.

Table 2
Country mean scores in reading performance (a), plausible value 1 mean scores (used in the current analysis) and standard deviations in PISA 2009.

Area     Country     Mean score   SD   Plausible value 1 mean   SD
Baltic   Estonia     501          83   502                      82
Baltic   Latvia      484          76   488                      79
Baltic   Lithuania   468          86   467                      86
Nordic   Finland     536          86   531                      86
Nordic   Sweden      497          99   498                      98
Nordic   Norway      503          91   503                      91

(a) OECD, 2010d, p. 196.
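To illustrate the justification for working with a single plausible value, the following minimal sketch checks the correlations among the five reading plausible values and compares the weighted mean of plausible value 1 with the mean over all five, as reported in Table 2. This is an assumption-laden illustration, not the procedure used in the paper: the file name is hypothetical, the standard PISA variable names PV1READ–PV5READ and the final student weight W_FSTUWT are assumed to be present, and no replicate-weight machinery is applied.

import pandas as pd

# Hypothetical extract of the PISA 2009 student file for one country.
df = pd.read_csv("pisa2009_students.csv")

pv_cols = ["PV1READ", "PV2READ", "PV3READ", "PV4READ", "PV5READ"]
w = df["W_FSTUWT"]  # final student weight (assumed column name)

# Pairwise correlations of the five reading plausible values;
# in PISA 2009 these were all above 0.92 across OECD countries.
print(df[pv_cols].corr().round(2))

# Weighted mean of plausible value 1 versus the average of the weighted means
# over all five plausible values (cf. the two mean columns of Table 2).
pv1_mean = (df["PV1READ"] * w).sum() / w.sum()
all_pv_mean = sum((df[c] * w).sum() / w.sum() for c in pv_cols) / len(pv_cols)
print(round(pv1_mean, 1), round(all_pv_mean, 1))

If the correlations are high and the two means nearly coincide, the single-plausible-value shortcut described above should give a close approximation of the full procedure.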
The indices used in this analysis, as provided by the OECD, were constructed through the scaling of items from the student questionnaires and then standardized so that the mean of each index for the OECD student population was 0 and the standard deviation was 1, with countries given equal weight in the standardization process (OECD, 2012, p. 280). The internal consistency of the scales was high in the considered countries (OECD, 2012, p. 16).

The metacognitive learning strategy indices (summarizing, and understanding and remembering) were derived from students' reports on the usefulness of the strategies. Several different learning approaches were listed, and the indices were scored using a rater-scoring system: through a variety of trial activities, both with reading experts and national centers, a preferred ordering of the strategies according to their effectiveness in achieving the intended goal was agreed.

The understanding and remembering strategy (UNDREM) had six options: A) I concentrate on the parts of the text that are easy to understand; B) I quickly read through the text twice; C) After reading the text, I discuss its content with other people; D) I underline important parts of the text; E) I summarize the text in my own words; and F) I read the text aloud to another person. The experts' agreed ordering of the six items constituting this index is C, D, E > A, B, F (OECD, 2010b, p. 113).

The summarizing strategy (METASUM) was derived from students' reports on the usefulness of the following strategies for writing a summary of a long and rather difficult two-page text about fluctuations in the water levels of a lake in Africa: A) I write a summary. Then I check that each paragraph is covered in the summary, because the content of each paragraph should be included; B) I try to copy out accurately as many sentences as possible; C) Before writing the summary, I read the text as many times as possible; D) I carefully check whether the most important facts in the text are represented in the summary; and E) I read through the text, underlining the most important sentences, then I write them in my own words as a summary. The experts' agreed ordering of the five items constituting this index is D, E > A, C > B (OECD, 2010b, p. 113).

The index of memorization (MEMOR) was derived from the frequency with which students did the following when they were studying: i) try to memorize everything that is covered in the text; ii) try to memorize as many details as possible; iii) read the text so many times that they can recite it; and iv) read the text over and over again. The index of elaboration (ELAB) was derived from the frequency with which students did the following when they were studying: i) try to relate new information to prior knowledge acquired in other subjects; ii) figure out how the information might be useful outside school; iii) try to understand the material better by relating it to their own experiences; and iv) figure out how the text information fits in with what happens in real life. The index of control strategies (CSTRAT) was derived from students' reports on how often they did the following: i) when I study, I start by figuring out what exactly I need to learn; ii) when I study, I check if I understand what I have read; iii) when I study, I try to figure out which concepts I still haven't really understood; iv) when I study, I make sure that I remember the most important points in the text; and v) when I study and I don't understand something, I look for additional information to clarify this (OECD, 2010b, p. 112). A complete overview of the indices used in the analysis is given in Appendix A.

2.3. Statistical analyses

In school studies, single student observations are usually not completely independent: the data are hierarchically structured with two levels (students are nested within schools), and students in the same school tend to perform more similarly, which is often referred to as a 'group effect' (Hox, 2010). The statistical analyses in the current study were therefore conducted using two-level modeling, which allows statistical inference for regression-type analyses under a hierarchical data structure and in which these dependencies (individuals possibly being influenced by the group they belong to) are explicitly taken into account (e.g. Goldstein, 2011; Hox, 2010). In school data, this dependence may arise because students are exposed to similar teaching, so that individual characteristics may be more or less affected by the practices and
principles common in those schools (Malin, 2005). To see whether there is any between-school variation in a country, a null model with no explanatory variables was built and the total variance was divided into two variance components: the variation between students within schools (the student or individual level) and the variation between schools (the school level). Dividing the between-school variance component by the total variance produces the intra-class correlation coefficient (ICC), which shows the homogeneity of students within schools and quantifies the students' dependence with regard to the considered response variable (here, reading literacy). By adding the explanatory variables to the model and observing the reduction of unexplained variation, we can trace the sources of the variation; the proportional reduction in the variance components (Snijders & Bosker, 2002) was used as the measure of explained variation. The statistical analyses were conducted using MLwiN software (Rasbash, Browne, Healy, Cameron, & Charlton, 2013). Student weights were used in modeling, and separate analyses were conducted for each country's data.
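As a rough illustration of this modeling approach, the sketch below estimates the null model, its two variance components and the ICC. It is a minimal sketch only, not the paper's procedure: the analyses reported here were run in MLwiN with student weights, whereas this example uses Python's statsmodels without weights, and the file name and the column names pv1read and school_id are assumptions.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical student-level data for one country: one row per student,
# with the school identifier and plausible value 1 of the reading score.
df = pd.read_csv("pisa2009_students.csv")

# Null model: intercept only, with a random intercept for each school.
null_model = smf.mixedlm("pv1read ~ 1", data=df, groups=df["school_id"]).fit()

between_var = float(null_model.cov_re.iloc[0, 0])  # between-school variance component
within_var = float(null_model.scale)               # within-school (student-level) variance

# Intra-class correlation: share of the total variance lying between schools (cf. Table 4).
icc = between_var / (between_var + within_var)
print(round(icc, 3))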
3. Results

3.1. Descriptive statistical results

Mean scores in reading performance (Table 2) show that students in Sweden, Estonia and Norway performed close to the OECD average level of 500 points. Finland obtained the highest mean score (536) and Lithuania the lowest (468), the difference between the highest and the lowest being 68 points. On the whole, the Nordic countries performed at a higher level. The standard deviations indicate slightly more homogeneous reading performance in Latvia and, on the other hand, larger differences in reading performance in Sweden.

The means and standard deviations of the indices used in the current analysis are presented in Table 3. The metacognitive learning strategy indices (UNDREM and METASUM) were 0.25 and 0.16 points above the OECD average in Estonia, around the OECD average in Finland (−0.03 and 0.05, respectively), and slightly below the OECD average in Latvia, Lithuania and Sweden. The level of memorization use (MEMOR) was 0.19 in Lithuania and 0.18 in Sweden, whereas in Finland and Norway students reported less use of this traditional learning method (−0.20 and −0.44, respectively). Elaboration strategies (ELAB) appeared above the OECD average in
the Baltic countries and below it in the Nordic countries. The index of control strategies (CSTRAT) was below the OECD mean in all six countries. Students in Lithuania and Sweden perceived the teacher-student relationship (STUDREL) more positively than students in the OECD countries on average (0.14 and 0.15, respectively); in Latvia and Estonia this index was slightly below the OECD average (−0.05), and in Finland and Norway even lower (−0.14 and −0.17, respectively). In the Baltic countries, the disciplinary climate (DISCLIMA) was systematically rated more positively than in the OECD countries on average (0.05–0.30), while the students of the Nordic countries assessed the learning climate below the OECD average (from −0.03 to −0.27). The index of economic, social and cultural status (ESCS) was above the OECD average in all six countries.
3.2. Explaining variance at the school and student level

The null model variances and the ICC coefficients are shown in Table 4. The ICC shows how much of the total variance in reading literacy scores was due to differences between schools (for Estonia, for example, 1408/6772 ≈ 0.21). A positive ICC, indicating some group effect, was detected in all of the countries. In the Nordic countries the ICC was smaller than in the Baltic countries (9–16% and 21–32%, respectively), which means that in the Baltic countries students' academic results were more dependent on the schools they attended; the Nordic countries, especially Finland and Norway, seem to be closer to their equality claim.

Tables 5 and 6 present how much of the variation in reading literacy scores between schools and between students within schools each variable explained, and how much the variables explained together (the full model). The metacognition variables had the strongest explanatory power in every country, at both the school and student levels: summarizing explained 25–42% of the differences between schools (Table 5) and 14–25% of the differences within schools (Table 6). Control strategies played some role in explaining the between-school differences, especially in the Nordic countries; at the student level their contribution was lower, particularly in the Baltic countries. In all the countries, the contribution of memorization strategies to the explained variance was very low. Elaboration strategies explained some variation in the Nordic countries, especially at the school level, where the share of explained between-school variance was around 7–10%.
Table 3
Means and standard deviations of the explanatory variables in the countries in PISA 2009. Cells show Mean (SD). Indices were scaled so that OECD mean = 0 and SD = 1; for more specific explanations of the variables see Appendix A.

Index                                                  Estonia        Latvia         Lithuania      Finland        Sweden         Norway
Economic, social and cultural status (ESCS)            0.18 (0.79)    0.12 (0.91)    0.11 (1.02)    0.35 (0.82)    0.16 (0.86)    0.25 (0.78)
Metacognition: summarizing (METASUM)                   0.16 (0.89)    −0.17 (0.98)   −0.17 (0.95)   0.05 (0.99)    −0.14 (1.05)   0.12 (0.96)
Metacognition: understanding and remembering (UNDREM)  0.25 (0.95)    −0.18 (0.97)   −0.15 (0.96)   −0.03 (1.02)   −0.17 (1.03)   −0.29 (1.02)
Memorization (MEMOR)                                   0.08 (0.83)    0.14 (0.81)    0.19 (0.78)    −0.20 (0.88)   0.18 (0.94)    −0.44 (0.98)
Elaboration (ELAB)                                     0.09 (0.85)    0.14 (0.81)    0.18 (0.83)    −0.18 (0.96)   −0.09 (0.99)   −0.08 (1.01)
Control strategies (CSTRAT)                            −0.12 (0.82)   −0.14 (0.76)   −0.07 (0.84)   −0.37 (0.95)   −0.09 (0.93)   −0.42 (0.96)
Disciplinary climate (DISCLIMA)                        0.05 (0.96)    0.22 (0.90)    0.30 (0.95)    −0.27 (0.94)   −0.03 (0.90)   −0.25 (0.93)
Teacher-student relations (STUDREL)                    −0.05 (0.85)   −0.05 (0.85)   0.14 (1.09)    −0.14 (0.87)   0.15 (1.02)    −0.17 (1.03)
Table 4
Variances in null models and intra-class correlations (student weights in use).

                               Estonia   Latvia   Lithuania   Finland   Sweden   Norway
School level variance          1408      1470     2352        651       1550     863
Individual level variance      5384      4963     5072        6920      8219     7475
Total variance                 6772      6433     7424        7571      9769     8338
Intra-class correlation (ICC)  0.207     0.228    0.316       0.085     0.158    0.103
Table 5
Between-school variation in reading literacy explained by each explanatory variable and by the full model with all variables together (%).

Explanatory variable                             Estonia   Latvia    Lithuania   Finland   Sweden   Norway
Background variables
  Gender (a)                                     4.4       6.1       9.8         0.3       0        0.1
  School language (b)                            11.5      n/a (c)   n/a         25.2      n/a      n/a
  ESCS (d)                                       24.6      31.2      28.8        12.3      44.2     26.4
Learning strategies
  Metacognition: summarizing                     32.7      25.2      24.8        41.6      38.8     35.5
  Metacognition: understanding and remembering   29.3      24.5      22.7        36.9      39.3     31.6
  Memorizing                                     1.5       1.8       1.4         0         3.8      5.0
  Elaboration                                    3.1       2.4       1.1         10.0      7.1      10.4
  Control strategies                             8.7       7.4       8.1         18.3      14.6     18.2
Learning environment
  Disciplinary climate                           3.1       2.0       5.8         2.6       7.8      9.6
  Teacher-student relations                      5.0       0         0           1.7       3.9      14.9
Full model                                       62.9      55.7      58.3        67.3      66.3     55.9

(a) Male gender is the reference. (b) Main language in the country (Estonian, Finnish) is the reference. (c) Data not available. (d) Economic, social and cultural status.
At the student level, the learning environment variables showed little explanatory power, with the possible exception of Norway: both disciplinary climate and teacher-student relations explained a rather small part of the between-student variation (DISCLIMA about 0–4%, STUDREL about 3–8%). At the school level, 2–10% of the differences between schools could be attributed to disciplinary climate: 2–3% in Latvia, Finland and Estonia, and 6–10% in Lithuania, Sweden and Norway. The teacher-student relationship explained practically nothing in Latvia or Lithuania; only in Norway did it explain a notable share of the school differences in reading scores (15% of the school-level variation). As Table 4 shows, the individual level variances were larger in the Nordic countries. The ESCS appeared to be an important predictor, but mainly at the school level. Since there were no gender-homogeneous schools in the study, the ratio of boys to girls should be close to constant over all schools, and it was therefore expected that gender would not explain any of the between-school differences. This held for the Nordic countries, but in the Baltic countries gender explained 4–10%; it appeared that in the Baltic countries the distribution of boys across the selected schools was less uniform, with many schools having more boys in the sample. The school language explained about 12% of the between-school differences in Estonia and about 25% in Finland, but these figures should not be compared directly, as the between-school variance itself was small in Finland, so the variation to be explained was easier to cover. All in all, the full models with all the variables showed that about one third of the variance between students and more than half of the between-school variance could be attributed to the background, learning strategy, disciplinary climate and teacher-student relationship variables.
Table 6
Student level variation in reading literacy explained by each explanatory variable and by the full model with all the variables together (%).

Explanatory variable                             Estonia   Latvia    Lithuania   Finland   Sweden   Norway
Background variables
  Gender (a)                                     7.5       8.7       12.6        10.9      6.4      7.6
  School language (b)                            0         n/a (c)   n/a         0.1       n/a      n/a
  ESCS (d)                                       3.3       3.1       4.1         7.6       11.1     7.6
Learning strategies
  Metacognition: summarizing                     16.4      17.6      13.6        25.2      23.2     21.3
  Metacognition: understanding and remembering   13.1      12.7      11.9        18.6      20.5     15.8
  Memorizing                                     0.4       0.6       0.9         0.7       4.9      1.6
  Elaboration                                    1.0       0.5       0.8         1.8       5.3      5.1
  Control strategies                             2.0       2.5       4.1         7.9       9.5      8.1
Learning environment
  Disciplinary climate                           1.6       1.7       2.0         0.6       4.1      2.4
  Teacher-student relations                      3.7       2.7       3.0         4.8       7.1      8.1
Full model                                       29.3      31.7      33.6        37.7      37.4     35.8

(a) Male gender is the reference. (b) Main language in the country (Estonian, Finnish) is the reference. (c) Data not available. (d) Economic, social and cultural status.
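For readers who wish to reproduce figures of the kind reported in Tables 5 and 6, the sketch below computes the proportional reduction in the two variance components (Snijders & Bosker, 2002) when a single explanatory variable, and then a set of variables, is added to the null model. It is a rough illustration under the same assumptions as the earlier example: Python's statsmodels instead of MLwiN, no student weights, hypothetical file and column names, and the school language indicator (used only for Estonia and Finland) omitted.

import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("pisa2009_students.csv")  # hypothetical student file for one country

def variance_components(formula):
    """Fit a two-level random-intercept model and return (between, within) variances."""
    result = smf.mixedlm(formula, data=df, groups=df["school_id"]).fit()
    return float(result.cov_re.iloc[0, 0]), float(result.scale)

between_0, within_0 = variance_components("pv1read ~ 1")  # null model

# Explained shares (%) for one explanatory variable, e.g. METASUM, and for a full
# model with all explanatory variables (column names are assumed, not the PISA file's).
formulas = [
    "pv1read ~ METASUM",
    "pv1read ~ gender + ESCS + METASUM + UNDREM + MEMOR + ELAB + CSTRAT + DISCLIMA + STUDREL",
]
for formula in formulas:
    between, within = variance_components(formula)
    print(formula,
          round(100 * (1 - between / between_0), 1),   # between-school variation explained
          round(100 * (1 - within / within_0), 1))     # within-school variation explained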
The extent of the explained variance appeared rather similar in all the countries. At the student level the full model explained 30% of the variance in Estonia, 32% in Latvia, 34% in Lithuania, 38% in Finland, 37% in Sweden and 36% in Norway. At the school level the explained share of variance was 63% in Estonia, 56% in Latvia, 58% in Lithuania, 67% in Finland, 66% in Sweden and 56% in Norway. This means that at the individual level there were probably many other factors affecting the results and causing differences, whereas at the school level a large part of the differences could be influenced by teachers creating a better climate in the classroom and maintaining good relations with students, and especially by developing and training students' learning skills.

4. Discussion

It became evident that the metacognitive strategies explained a great part of both the between-school and within-school differences in reading literacy in every country, indicating how closely such learning skills relate to reading proficiency, as earlier studies have also shown (Hattie, 2009; Pennequin et al., 2010; Wang et al., 1994). The explanatory power of the summarizing strategy appeared strikingly high, and the assumption that student awareness of metacognitive strategies enhances reading comprehension is supported by the results of previous longitudinal studies (Jones, 2007; Chiu, Chow, & Mcbride-Chang, 2007; Roebers et al., 2014; van der Stel & Veenman, 2008, 2010). Yet too much emphasis should not be put directly on the numbers or on comparisons between different learning strategies, as the constructs used to ask about them were different, as described in the method section, and the nature of the analysis was rather approximate, showing tendencies only.

The situation of compulsory education in the Nordic and Baltic countries is rather similar, as all of the countries have compulsory education up to the age of 15–16. This stage of education is available and equal for everyone, and obtaining this level of education is the norm in these countries (Klapp Lekholm, 2011; The Swedish National Agency for Education, 2009, 2010; OECD, 2011). Selecting students into lower secondary schools is not a common practice in these countries; the school systems are characterized by a low level of differentiation in selecting and grouping students, and the classrooms tend to be heterogeneous (OECD, 2010c). Therefore, student selection cannot explain the school differences. In the Baltic countries, the schools appeared more similar with regard to students' learning strategies; possibly, students in the Baltic countries were taught (or had adopted) learning strategies almost equally much, or equally little. Although Swedish compulsory schools are usually equitable with only small between-school variation, the differences in outcomes between schools have increased. Results from both Swedish and international research have shown that the impact of socio-economic background is significantly stronger at the school level than at the individual level (The Swedish National Agency for Education, 2009), and that was the case in our study, with the variable of economic, social and cultural status showing very high explanatory power (44%) for the between-school variation in Sweden. There were also some indications of patterns in the study, in that the effects of the variables appeared similar in different countries.
The students of the Baltic countries reported greater use of traditional learning methods such as memorization, which does not contribute to success in reading tasks but rather tends to lower reading scores (Chiu et al., 2007; Säälik, 2015). The Baltic countries, with their 50-year Soviet period and only two decades of regained independence, might still carry some influence from the formal, strict and traditional teaching practice of the Soviet time. In the Soviet Union the main (or even only) source of educational models and resources for the Soviet republics was the central political power in Russia (Takala & Piattoeva, 2012), so educators were shaped by a similar traditional understanding of teaching. Several pedagogical materials from the Soviet period emphasize learning alone, learning by heart and memorizing facts, the importance of order and silence, repetition, and so on (Koemets, 1979; Kulko
& Cehmistrova, 1983; Bardin, 1987). Many teachers in Russia have continued to rely on traditional practices and values despite the Western influence of more contemporary teaching practices (Elliott & Tudge, 2007), and the same could be true of former Soviet republics such as Estonia, Latvia and Lithuania. However, although educators all over the world wish to find causes for success or failure, such regression-type analyses do not indicate firm causal effects; they only provide reasonable assumptions, and causality must not be taken for granted. With the support of other theoretical sources, these assumptions could give a basis for further studies. The reverse effect of student competence (i.e., reading proficiency) on learning strategies would also need further study. Although metacognition as a tool for improving learning has been in focus for about 30 years, its importance, or how to instruct students in it, may not have been acknowledged by all educators. As the current study showed, students' awareness of learning strategies plays an important role in explaining the variance in reading, but not all students have acquired the skill of recognizing the useful strategies. Studies in Norway (OECD, 2011) pointed out that teachers there might need help in adopting more student-oriented practices and in involving students in the planning of their work (just as is needed for developing metacognitive skills). The meeting point of the student's, the teacher's and the school's possibilities to contribute to academic achievement seems to lie in the intensive promotion of students' active learning and the development of more advanced thinking abilities in a friendly and free learning climate.

Acknowledgments

This research and the article were supported by the European Social Fund and the Internationalization Program DoRa, carried out by Foundation Archimedes (grant number 30.1-9.3/1668).

Appendix A. Description of the student background questionnaire indices from the PISA study used in the paper
Economic, social and cultural status (ESCS): The index is calculated from the highest occupational status of the parents (HISEI), the highest educational level of the parents in years of education (PARED), and home possessions (HOMEPOS).

Metacognition: summarizing (METASUM): "You have just read a long and rather difficult two-page text about fluctuations in the water level of a lake in Africa. You have to write a summary. How do you rate the usefulness of the following strategies for writing a summary of this two-page text?" (answers on a six-point scale) A) I write a summary. Then I check that each paragraph is covered in the summary, because the content of each paragraph should be included; B) I try to copy out accurately as many sentences as possible; C) Before writing the summary, I read the text as many times as possible; D) I carefully check whether the most important facts in the text are represented in the summary; E) I read through the text, underlining the most important sentences, then I write them in my own words as a summary.

Metacognition: understanding and remembering (UNDREM): "Reading task: You have to understand and remember the information in a text. How do you rate the usefulness of the following strategies for understanding and memorizing the text?" (answers on a six-point scale) A) I concentrate on the parts of the text that are easy to understand; B) I quickly read through the text twice; C) After reading the text, I discuss its content with other people; D) I underline important parts of the text; E) I summarize the text in my own words; F) I read the text aloud to another person.

Memorization strategies (MEMOR): "When you are studying, how often do you do the following?" (answers on a four-point scale) When I study, I try to memorize everything that is covered in the text; When I study, I read the text so many times that I can recite it; When I study, I try to memorize as many details as possible; When I study, I read the text over and over again.

Elaboration strategies (ELAB): "When you are studying, how often do you do the following?" (answers on a four-point scale) When I study, I try to relate new information to prior knowledge acquired in other subjects; When I study, I figure out how the text information fits in with what happens in real life; When I study, I try to understand the material better by relating it to my own experiences; When I study, I figure out how the information might be useful outside school.

Control strategies (CSTRAT): "When you are studying, how often do you do the following?" (answers on a four-point scale) When I study, I start by figuring out what exactly I need to learn; When I study, I check if I understand what I have read; When I study, I make sure that I remember the most important points in the text; When I study, I try to figure out which concepts I still haven't really understood; When I study and I don't understand something, I look for additional information to clarify this.

Disciplinary climate (DISCLIMA): "How often do these things happen in your test language lessons?" (answers on a four-point scale) Students don't listen to what the teacher says; There is noise and disorder; The teacher has to wait a long time for the students to quieten down; Students cannot work well; Students don't start working for a long time after the lesson begins.

Teacher-student relations (STUDREL): "To what extent do you agree or disagree with the following statements?" (answers on a four-point scale) I get along well with most of my teachers; Most of my teachers are interested in my well-being; Most of my teachers really listen to what I have to say; If I need extra help, I will receive it from my teachers; Most of my teachers treat me fairly.
References

Bardin, K. V. (1987). Kak naučit detei učitsa. Kniga dlja učitelja. [How to teach children to learn. A book for a teacher]. Moscow: Prosveščenije (in Russian).
Brink, S., Nissinen, K., & Vettenranta, J. (2013). Equity and excellence. Evidence for policy formulation to reduce the difference in PISA performance between Swedish speaking and Finnish speaking students in Finland. Jyväskylä: Finnish Institute for Educational research (https://ktl.jyu.fi/en/publications/g047.pdf). Chiu, M. M., Chow, B. W., & Mcbride-Chang, C. (2007). Universals and specifics in learning strategies: Explaining adolescent mathematics, science, and reading achievement across 34 countries. Learning and Individual Differences, 17(4), 344–365. http://dx. doi.org/10.1016/j.lindif.2007.03.007. Delclos, V. R., & Harrington, C. (1991). Effects of strategy monitoring and proactive instruction on children's problem-solving performance. Journal of Educational Psychology, 83, 35–42. Education and Research 2011–2016 (2012). A development plan. Reports of the Ministry of Education and Culture, Finland 2012:3. Ministry of Education and Culture (http://www. minedu.fi/export/sites/default/OPM/Julkaisut/2012/liitteet/okm03.pdf?lang=en). Eesti tegevuskava”Haridus kõigile” [Education for All, Estonian development plan]. http:// www.unesco.ee/public/documents/Haridus_koigile_arengukava_2004.pdf(2004). Elliott, J., & Tudge, J. (2007). The impact of the west on post-Soviet Russian education: Change and resistance to change. Comparative Education, 43(1), 93–112. http://dx. doi.org/10.1080/03050060601162420. Fisher, R., & Williams, M. (2002). Unlocking writing. London: David Fulton. Garbe, C., Holle, K., Weinhold, S., Meyer–Hamme, A., & Barton, A. (2010). Characteristics of adolescent struggling readers. In C. Garbe, K. Holle, & S. Weinhold (Eds.), ADORE– Teaching Struggling Adolescent Readers in European Countries. Key Elements of Good Practice (pp. 1–44). Frankfurt am Main: Peter Lang. Goldstein, H. (2011). Multilevel statistical models (4th ed ). Chichester: Wiley. Hacker, D. J., Dunlosky, J., & Graesser, A. C. (2009). A growing sense of “agency”. In J. D. Hacker, J. Dunlosky, & A. C. Graesser (Eds.), Handbook of Metacognition in Education (pp. 1–4). NY: Routledge. Haridus- ja Teadusministeeriumi valitsemisala arengukava „TARK ja TEGUS RAHVAS” 2015–2018 [The development plan in the field of Ministry of Education and Research” Smart and effective people” 2015–2018]. (note: in Estonian) http:// www.hm.ee/sites/default/files/tark_ja_tegus_rahvas_2015_2018_final.pdf(2014). Harju-Luukkainen, H., & Nissinen, K. (2011). Finlandssvenska 15–åriga elevers resultatnivå I PISA 2009 –undersökningen. Jyväskylä: Finnish Institute for Educational research (note: in Swedish). Hattie, J. A. C. (2009). Visible learning: A synthesis of meta-analyses relating to achievement. New York: Routledge (http://visible-learning.org/hattie-ranking-influences-effectsizes-learning-achievement/). Hox, J. J. (2010). Multilevel analysis — Techniques and applications. NY: Routledge. International Reading Association (2012). Adolescent literacy (Position statement, Rev. 2012 Ed.). (Newark, DE) http://www.reading.org/Libraries/resources/ps1079_ adolescentliteracy_rev2012.pdf Jones, D. (2007). Speaking, listening, planning and assessing: The teacher's role in developing metacognitive awareness. Early Child Development and Care, 177(6 & 7), 569–579. http://dx.doi.org/10.1080/03004430701378977. Klapp Lekholm, A. (2011). Effects of school characteristics on grades in compulsory school. Scandinavian Journal of Educational Research, 55(6), 587–608. http://dx.doi. org/10.1080/00313831.2011.555923. Koemets, E. (1979). Kuidas õppida. 
[how to learn]. Tallinn: Valgus (in Estonian). Kulko, V. A., & Cehmistrova, T. D. (1983). Formirovanije u učaščihsja umenij učitsa. Posobije dlja učitelja. [Forming in learners the skillfulness of learning. Auxiliary material for teachers]. Moscow: Prosveščenije (note: in Russian). Lee, V. E., & Bryk, A. S. (1989). A multilevel model of the social distribution of high school achievement. Sociology of Education, 62, 172–192. http://dx.doi.org/10.2307/ 2112866. Linnakylä, P., Välijärvi, J., & Arffman, I. (2007). Reading literacy — High quality by means of equity. In P. Linnakylä, & I. Arffman (Eds.), Finnish Reading Literacy. When Quality and Equity Meet (pp. 127–138). Jyväskylä: Jyväskylä University Press. Ma, X., & Willms, J. D. (2004). School disciplinary climate: Characteristics and effects on eighth grade achievement. Canadian Journal of Education, 25, 41–44. Malin, A. (2005). School differences and inequities in educational outcomes. PISA 2000 results of reading literacy in Finland. (Doctoral dissertation) University of Jyväskylä. Marks, G. N. (2010). What aspects of schooling are important? School effects on tertiary entrance performance. School Effectiveness and School Improvement, 21, 267–287. http://dx.doi.org/10.1080/09243451003694364. Mikk, J., Kitsing, M., Must, O., Säälik, Ü., & Täht, K. (2012). Eesti PISA 2009 kontekstis: tugevused ja probleemid. Programmi Eduko uuringutoetuse kasutamise lepingu aruanne [Estonia in PISA 2009 context: Strengths and issues. The Eduko programme grant report]. http://www.hm.ee/sites/default/files/eesti_pisa_2009_kontekstis.pdf (note: in Estonian) Mullis, I. V. S., Martin, M. O., Foy, P., & Drucker, K. T. (2012). The PIRLS 2011 International Results in Reading. Chestnut Hill, MA: TIMSS & PIRLS International Study Center (http://timss.bc.edu/pirls2011/downloads/P11_IR_FullBook.pdf). OECD (2009). PISA Data Analysis Manual: SAS. 2nd ed.: OECD Publishing. OECD (2010a). PISA 2009 Assessment Framework: Key competencies in reading, mathematics and science. http://www.oecd.org/dataoecd/11/40/44455820.pdf OECD (2010b). PISA 2009 results: learning to learn — Student engagement, strategies and practices (Vol III) http://dx.doi.org/10.178/9789264083943–en. OECD (2010c). PISA 2009 results: what makes a school successful? — Resources, policies and practices (Vol IV) http://dx.doi.org/10.1787/9789264091559–en. OECD (2010d). PISA 2009 results: what students know and can do – Student performance in reading, mathematics and science (Vol I). http://dx.doi.org/10.1787/9789264091450en. OECD (2011). Improving lower secondary schools in Norway. OECD Publishing. http://dx. doi.org/10.1787/9789264114579-en. OECD (2012). PISA 2009 Technical Reporthttp://dx.doi.org/10.1787/9789264167872–en.
Pennequin, V., Sorel, O., Nanty, I., & Fontaine, R. (2010). Metacognition and low achievement in mathematics: The effect of training in the use of metacognitive skills to solve mathematical word problems. Thinking & Reasoning, 16(3), 198–220. http://dx.doi.org/10.1080/13546783.2010.509052. Perfetti, C. A., & Marron, M. A. (1998). Learning to read: Literacy acquisition by children and adults. In D. A. Wagner (Ed.), Advances in Adult Literacy Research and Development (pp. 1–41). Hampton Press (http://www.pitt.edu/~perfetti/PDF/Learning%20to%20read,%20literacy%20acq%20by%20children%20and%20adults-%20Marron.pdf). Rasbash, J., Browne, W. J., Healy, M., Cameron, B., & Charlton, C. (2013). MLwiN Version 2.29. Centre for Multilevel Modelling, University of Bristol (http://www.bristol.ac.uk/cmm/software/mlwin/ordering/). Riding, R., & Rayner, S. (2000). Cognitive styles and learning strategies. Understanding style differences in learning and behaviour. London: David Fulton Publishers. Roebers, C. M., Krebs, S. S., & Roderer, T. (2014). Metacognitive monitoring and control in elementary school children: Their interrelations and their role for test performance. Learning and Individual Differences, 29, 141–149. http://dx.doi.org/10.1016/j.lindif.2012.12.003. Säälik, Ü. (2015). Reading performance, learning strategies, gender and school language as related issues — PISA 2009 findings in Finland and Estonia. International Journal of Teaching & Education, III(2), 16–29. http://www.iises.net/international-journal-of-teaching-education/publication-detail-117. Säälik, Ü., Malin, A., & Nissinen, K. (2013). The role of learning strategies in PISA 2009 in Estonia: Metacognitive skillfulness giving readers a head start. In J. Mikk, M. Veisson, & P. Luik (Eds.), Change in Teaching and Learning (pp. 65–82). Frankfurt am Main: Peter Lang (https://www.researchgate.net/publication/263516567_The_Role_of_Learning_Strategies_in_PISA_2009_in_Estonia_Metacognitive_Skilfulness_Giving_Readers_a_Head_Start). Scheerens, J., Glas, C., & Thomas, S. M. (2007). Educational evaluation, assessment and monitoring: A systematic approach. London, New York: Taylor & Francis. Snijders, T., & Bosker, R. (2002). Multilevel analysis. An introduction to basic and advanced multilevel modeling. London: Sage Publications.
Steklàcs, J. (2010). Key element No. 6: Teaching cognitive and meta-cognitive reading strategies. In C. Garbe, K. Holle, & S. Weinhold (Eds.), ADORE — Teaching Struggling Adolescent Readers in European Countries. Key Elements of Good Practice (pp. 123–132). Frankfurt am Main: Peter Lang. van der Stel, M., & Veenman, M. V. J. (2008). Relation Between Intellectual Ability and Metacognitive Skillfulness as Predictors of Learning Performance of Young Students Performing Tasks in Different Domains. Learning and Individual Differences, 18. (pp. 128–134). Elsevier. http://dx.doi.org/10.1016/j.lindif.2007.08.003. van der Stel, M., & Veenman, M. V. J. (2010). Development of metacognitive skillfulness: A longitudinal study. Learning and Individual Differences, 20. (pp. 220–224). Elsevier. http://dx.doi.org/10.1016/j.lindif.2009.11.005. Takala, T., & Piattoeva, N. (2012). Changing conceptions of development assistance to education in the international discourse on post-Soviet countries. International Journal of Educational Development, 32. http://dx.doi.org/10.1016/j.jedudev.2010.10.003. Telhaug, A. O., Medias, O. A., & Aasen, P. (2006). The Nordic model in education: education as part of the political system in the last 50 years. Scandinavian Journal of Educational Research, 50(3), 245–283. http://dx.doi.org/10.1080/00313830600743274. The Swedish National Agency for Education (2009). What influences educational achievement in Swedish schools? A systematic review and summary analysis. http://www. skolverket.se/publikationer?id=2318 The Swedish National Agency for Education (2010). The Swedish National Agency for Education supervises and provides support for better schooling. (presentation). http:// www.skolverket.se/publikationer?id=2492 Wang, M. C., Haertel, G. D., & Walberg, H. J. (1994). What helps students learn? Educational Leadership, 51/4, 74–79 (http://www.ascd.org/publications/educationalleadership/dec93/vol51/num04/Synthesis-of-Research-~-What-Helps-StudentsLearn%C2%A2.aspx).