Procedia - Social and Behavioral Sciences 177 (2015) 295–299
Global Conference on Contemporary Issues in Education, GLOBE-EDU 2014, 12-14 July 2014, Las Vegas, USA
Upper Secondary School Students’ Progression in Operational Scientific Skills – A Comparison between Grades 10 and 12

Regina Soobard a,*, Miia Rannikmäe a, Priit Reiska b

a Centre for Science Education, University of Tartu, Vanemuise 46-226, Tartu 51014, Estonia
b Institute of Mathematics and Natural Sciences, Tallinn University, Narva Rd 25, Tallinn 10120, Estonia
Abstract

The goal of this study is to investigate upper secondary students’ learning progression in operational scientific skills over a 3-year period. Operational scientific skills are taken to mean utilizing science knowledge and skills, particularly with relevance to creative problem solving and making reasoned decisions in real-life situations. An interdisciplinary, contextualized instrument based on real-life related items following the SOLO taxonomy was used. Results (grade 10: N=1128; grade 12: N=764) show no expected shift in operational skills. Changes are needed in upper secondary science education to ensure students give more appropriate responses to problem-solving and decision-making items.

© 2015 The Authors. Published by Elsevier Ltd. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
Peer-review under responsibility of the Scientific Committee of GLOBE-EDU 2014.

Keywords: Operational scientific skills, SOLO taxonomy
1. Introduction

The goal of science education is to promote the development of scientific literacy (e.g. Roberts, 2007; Soobard & Rannikmäe, 2011; Choi et al., 2011). However, the cognitive component of this term relates to multiple operational skills, which lead to the development of higher levels of scientific literacy, if acquired and demonstrated by students (e.g. OECD, 2007; EURYDICE, 2011). Based on this and to investigate the development of the cognitive component of scientific literacy, this study breaks down multiple operational skills to give a better overview of
* Regina Soobard. Tel.: +4-345-434-111. E-mail address: [email protected]
doi:10.1016/j.sbspro.2015.02.342
students’ progress. Operational scientific skills are defined in relation to the following definition of scientific literacy – utilizing science knowledge and skills, particularly with relevance to creative problem solving and making reasoned decisions in real-life situations (Holbrook & Rannikmäe, 2009) – and cover the following: giving scientific explanations, solving scientific problems and making socio-scientific decisions. These operational skills are also included in the Estonian national science curriculum for upper secondary school students, as part of scientific literacy (Estonian Government, 2011). It is expected that students undergo a learning progression that enhances their scientific abilities during science studies (Krajcik, 2011).

For assessing students’ progress, this study utilizes the SOLO taxonomy for item development because, besides testing progression, it allows analysis and evaluation of both the quantity (uni- and multi-structural levels) and the quality (relational and extended abstract levels) of students’ responses; it is therefore not only a guide for developing test items but also allows a range of student learning to be analysed (Biggs, 1996). This taxonomy is also more student-centered in that students are required to demonstrate multiple operational skills while giving responses at different levels, rather than focusing only on the degree of correctness of more complex answers (Biggs, 1996).

Assessment of operational scientific literacy skills should be in a student-relevant context (Roberts, 2007; 2011). It thus seems desirable for instruments measuring progress in scientific literacy to be connected with real-life issues, e.g. context-based (Bennett et al., 2007), and not to focus simply on science content itself (OECD, 2007). This can be expected to enable students to appreciate the relevance of the question and stimulate higher-level responses. On the other hand, Sadler (2004) and Zeidler et al. (2005) have indicated that using socio-scientific issues in science subjects is more demanding and difficult for students, because such issues are usually ill-defined and more complex. This outcome is also reported by Cavagnetto (2010). Nevertheless, learning, and hence assessment, in science subjects should provide students with challenges to help them progress.

The following research questions are posed:
1. How do grade 10 and 12 students demonstrate their competence in operational scientific literacy skills when responding to context-based situations?
2. What changes in operational scientific literacy skills occur over a 3-year period at upper secondary level?
2. Methodology

2.1. Sample

All grade 10 (N=1128) and grade 12 (N=764) students came from a representative sample of schools (N=44) in Estonia. Schools were chosen based on location (the capital; towns with at least two gymnasiums; rural areas), location being taken into account to ensure that all schools had an equal possibility of being selected. The data gathering periods were November 2011 (grade 10) and January 2014 (grade 12). In Estonia, all students were required to study all four science subjects (Biology, Chemistry, Physics, and Geography), and therefore it was assumed that students in this sample had received instruction in these subjects.

2.2. Instrument

The test instrument used a contextualized, extraordinary phenomenon-related situation (the Dead Sea). It required students to transfer science knowledge and skills to a new context, giving indicators of scientific literacy identified through measures of scientific explanation, problem solving and the manner in which decisions made were justified. Tasks in the test instrument were based on levels derived from the SOLO taxonomy to determine learning progression (Biggs, 1996). Students’ responses were coded using a three-point scale (1 = minimal or incorrect response to 3 = maximal response). A minimal response indicated that the student gave an incorrect or insufficiently specific answer and
was considered minimal (in the case, for example, of decision making or giving a scientific explanation). A maximal response indicated a correct answer and also a well-reasoned decision.

The test instrument was composed of 8 items. Item 1 (uni-structural) and item 2 (multi-structural) required students to choose a correct scientific explanation (item 1 required one correct and item 2 two correct scientific explanations); these first two items were of the multiple-choice type. Items 3–5 (relational) focused on one scientific problem-solving situation related to the solubility of salts (item 3 required analyzing and interpreting the solubility graph; items 4–5 required giving scientific explanations related to the previous analysis of the graph but with a slightly changed temperature situation). Items 6–8 (extended abstract) focused on decision making (item 6 required choosing correct arguments or evidence to support the claim; item 7 required listing as many arguments as possible based on the given task; and item 8 required making a reasoned decision). These items were taken to relate to the extended abstract level of the SOLO taxonomy, as they required further science knowledge.

The instrument was validated using the expert opinions of four school science teachers and two university science staff members. Based on their recommendations, the instrument was modified to make it more suitable for both the upper secondary level and expectations at the university level. Reliability was calculated using Cronbach’s alpha (0.62), which was considered acceptable for this instrument. A Principal Component Analysis was also conducted to ensure that the division of items between SOLO taxonomy levels was suitable. A four-factor solution was found, describing 63% of the variance, and the same factors were found in both grades, indicating that the item division was suitable.

2.3. Data analysis

Data were analyzed using frequency distributions, and differences between the two grades were calculated using the Mann-Whitney U-test (Cohen et al., 2007) in IBM SPSS Statistics 20.

3. Results and discussion

An overview of the results is presented in Table 1. Grade 10 results show that, based on the percentage of maximal responses, item 5 is more difficult for these students than the other items (1.8% of all students give a maximal response). This item requires giving a scientific explanation for salt solubility at night, when the water temperature drops from 35 °C to 10 °C. In item 6, grade 10 students are generally able to select appropriate arguments to support the given claim. At the same time, their competence to make a reasoned decision (item 8) is not high (only 5.3% of students give a maximal response). Grade 12 results indicate that the most difficult item was item 3 (Table 1). In this item, students are required to choose the three salts that most likely give the highest precipitation, based on analyzing both the graph and the problem stimulus text. Similarly to grade 10 students, item 6 results indicate that this item is not difficult (67.6% selected correct arguments to support the claim).

Table 1. Grade 10 and 12 frequency distributions at the different SOLO levels and with respect to completeness of responses.
SOLO level          Item   Grade 10 (n = 1128)        Grade 12 (n = 764)         U           p
                           % response (total = 100)   % response (total = 100)
                           1      2      3            1      2      3
Uni-structural      1      21.1   28.4   50.5         25.1   25.2   49.7         408995.0    >0.05
Multi-structural    2      12.6   70.0   17.4         12.2   70.4   17.4         420482.5    >0.05
Relational          3      35.5   62.6    1.9         46.0   52.4    1.6         343169.5    <0.001
                    4      56.4   34.4    9.2         53.0   31.2   15.8         260598.5    <0.05
                    5      84.4   13.8    1.8         76.7   20.4    2.9         304285.0    <0.001
Extended abstract   6       3.7   35.7   60.6          2.4   30.0   67.6         381384.0    <0.05
                    7      65.0   26.9    8.1         44.8   37.4   17.8         215664.0    <0.001
                    8      65.6   29.1    5.3         40.7   47.9   11.4         261804.0    <0.001
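For readers who want to see the shape of this analysis in code, the sketch below shows how the reliability estimate (Cronbach’s alpha), a four-component check and the per-item grade comparisons in Table 1 (frequency distributions plus Mann-Whitney U tests) could be reproduced. This is only an illustration: the original analysis was run in IBM SPSS Statistics 20, whereas the Python libraries used here (numpy, pandas, scipy, scikit-learn), the simulate_grade() helper and the randomly generated response codes are assumptions for demonstration, not the authors’ data or workflow.

```python
# Illustrative sketch only (the original analysis was done in IBM SPSS Statistics 20):
# how the reliability check, the four-factor check and the grade comparisons reported
# above could be reproduced in Python. The simulate_grade() helper and the randomly
# generated 1-3 codes are hypothetical stand-ins for the real coded responses.
import numpy as np
import pandas as pd
from scipy.stats import mannwhitneyu
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

def simulate_grade(n_students: int) -> pd.DataFrame:
    """Hypothetical coded responses (1 = minimal, 2 = partial, 3 = maximal) for items 1-8."""
    return pd.DataFrame(rng.integers(1, 4, size=(n_students, 8)),
                        columns=[f"item{i}" for i in range(1, 9)])

grade10 = simulate_grade(1128)   # sample sizes as reported in the paper
grade12 = simulate_grade(764)
all_students = pd.concat([grade10, grade12], ignore_index=True)

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = items.shape[1]
    return k / (k - 1) * (1 - items.var(ddof=1).sum() / items.sum(axis=1).var(ddof=1))

print("Cronbach's alpha:", round(cronbach_alpha(all_students), 2))

# Unrotated PCA as a rough analogue of the four-factor check (SPSS offers rotation
# options that plain sklearn PCA does not apply).
pca = PCA(n_components=4).fit(all_students)
print("Variance described by four components:", pca.explained_variance_ratio_.sum().round(2))

# Per-item frequency distributions (the layout of Table 1) and Mann-Whitney U comparisons.
for item in grade10.columns:
    pct10 = (grade10[item].value_counts(normalize=True).sort_index() * 100).round(1)
    pct12 = (grade12[item].value_counts(normalize=True).sort_index() * 100).round(1)
    u, p = mannwhitneyu(grade10[item], grade12[item], alternative="two-sided")
    print(f"{item}: grade 10 {pct10.to_dict()}  grade 12 {pct12.to_dict()}  U={u:.1f}  p={p:.3f}")
```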
Items 1 and 2, focusing on the growth of knowledge based on the SOLO taxonomy (Biggs, 1996), indicated little shift in students’ responses from grade 10 to 12 (no statistically significant difference was found). A similar percentage of students gave a maximal response in both grades despite three years of additional learning, indicating minimal change in knowledge and skills among upper secondary students.

In the items related to problem solving (relational level), grade 12 students gave more maximal responses than grade 10 students, although in items 4 and 5 the main shift was from minimal to medium responses (Table 1). Of concern is that in item 3, grade 10 students did better than grade 12 students (at least at a partially complete level) and this difference was statistically significant. This is an important reminder about learning progression, because grade 10 students lacked the additional learning experienced by the grade 12 students.

In the items related to reasoned decision making (items 6–8), it was easier for students in both grades to select appropriate arguments to support the claim (item 6), but more difficult to list as many arguments as they could related to a given problem (item 7), and more difficult still to make a reasoned decision (item 8). These outcomes were supported by the percentages of maximal responses. The percentage of students giving a maximal response in item 8 (making a reasoned decision) was higher in grade 12, although it was still only 11.4% of students. Research has found that socio-scientific decision making with reasoning is more difficult for students than applying science content knowledge (Sadler, 2004; Zeidler et al., 2005; Cavagnetto, 2010; Soobard & Rannikmäe, 2011).

Based on these outcomes, it can be said that there is a change in operational science skills in terms of giving scientific explanations, solving problems and making reasoned decisions, but the main change is a shift in the percentage of students’ answers from the minimal to the medium level. Nevertheless, the overall pattern of students’ responses in both grades remains similar: at the extended abstract level, students in both grades generally achieve best on item 6, then item 7, and least well on item 8 (based on percentages of maximal responses). A similar pattern occurs in the problem-solving items (relational level). This suggests that, over a three-year period, some operational skills are promoted more than others. Moreover, students in both grades give more maximal responses to the first two items (items related to the growth of knowledge based on the SOLO taxonomy) than to the problem-solving and decision-making items.

Based on the outcomes of this study, there is a need to focus more on progression in the development of operational scientific skills. This could lead to a situation where more students are capable of solving scientific problems and making reasoned decisions in everyday life situations. The education offered needs to be seen as promoting the range of education goals (e.g. operational science skills) linked to the development of levels of scientific literacy (Bybee, 1997; Soobard & Rannikmäe, 2011) and the demands of the future workforce (Bybee & McCrae, 2011).
4. Conclusions

This study was conducted to investigate how students progress in operational science skills, defined through cognitive components of scientific literacy (applying interdisciplinary knowledge, giving scientific explanations, solving problems and making decisions), over a three-year period.

In response to the first research question, it was found that grade 10 and grade 12 students performed in a similar manner on uni- and multi-structural level items; no statistically significant differences were found. In the problem-solving and decision-making items, both grade 10 and grade 12 students did better on items focusing on knowledge and had more difficulty with the items (related to problem solving and decision making) focusing on the quality of responses based on the SOLO taxonomy.

In response to the second research question, in general, grade 12 students achieve better on items requiring problem solving (relational) and decision making (extended abstract), but the percentage of students giving a maximal response was low for school leavers in grade 12. Based on this, there is a need to ensure better progression in
gaining competences in operational scientific skills, which are part of the general goal in science education of developing scientific literacy.

Acknowledgements

This study has been supported by the European Social Fund programme EDUKO grant LoTeGüm and Estonian Science Foundation grant GLOLO8219.

References

Bennett, J., Lubben, F., & Hogarth, S. (2007). Bringing science to life: A synthesis of the research evidence on the effects of context-based and STS approaches to science teaching. Science Education, 91(3), 347–370.
Biggs, J. (1996). Constructing learning and what it is to understand. In J. Biggs (Ed.), Testing: To educate or to select? Education in Hong Kong at the crossroads (pp. 46–84). Hong Kong: Hong Kong Educational Publishing Co.
Bybee, R. (1997). Toward an understanding of scientific literacy. In W. Gräber & C. Bolte (Eds.), Scientific literacy: An international symposium (pp. 37–68). Kiel, Germany: IPN.
Bybee, R., & McCrae, B. (2011). Scientific literacy and student attitudes: Perspectives from PISA 2006 science. International Journal of Science Education, 33(1), 7–26.
Cavagnetto, A. R. (2010). Argument to foster scientific literacy: A review of argument interventions in K-12 science contexts. Review of Educational Research, 80(3), 336–371.
Choi, K., Lee, H., Shin, N., Kim, S.-W., & Krajcik, J. (2011). Re-conceptualization of scientific literacy in South Korea for the 21st century. Journal of Research in Science Teaching, 48(6), 670–697.
Cohen, L., Manion, L., & Morrison, K. (2007). Research methods in education. London: Routledge.
Estonian Government (2011). Gümnaasiumi riiklik õppekava (National curriculum for gymnasium). Regulation of the Government of the Republic of Estonia, No. 2. Tallinn.
EURYDICE (2011). Science education in Europe: National policies, practices and research. Brussels: EURYDICE.
Holbrook, J., & Rannikmäe, M. (2009). The meaning of scientific literacy. International Journal of Environmental & Science Education, 4(3), 275–288.
Krajcik, J. (2011). Learning progressions provide road maps for the development and validity of assessments and curriculum materials. Measurement, 9, 155–158.
OECD (2007). PISA 2006. Science competencies for tomorrow’s world. Volume I: Analysis. Paris: OECD.
Roberts, D. A. (2007). Scientific literacy/science literacy. In S. K. Abell & N. G. Lederman (Eds.), Handbook of research on science education (pp. 729–780). USA: Lawrence Erlbaum Associates, Inc.
Roberts, D. A. (2011). Competing visions of scientific literacy: The influence of a science curriculum policy image. In C. Linder, L. Östman, D. Roberts, P. Wickman, G. Erickson, & A. MacKinnon (Eds.), Exploring the landscape of scientific literacy (pp. 11–27). New York: Routledge.
Sadler, T. D. (2004). Informal reasoning regarding socio-scientific issues: A critical review of the literature. Journal of Research in Science Teaching, 41(4), 513–536.
Soobard, R., & Rannikmäe, M. (2011). Assessing student’s level of scientific literacy using interdisciplinary scenarios. Science Education International, 22(2), 133–144.
Zeidler, D. L., Sadler, T. D., Simmons, M. L., & Howes, E. V. (2005). A research based framework for socio-scientific issues education. Science Education, 89(3), 357–377.