Available online at www.sciencedirect.com

Procedia Social and Behavioral Sciences 1 (2009) 829–833

World Conference on Educational Sciences 2009

Poster sessions as an authentic assessment approach in an open-ended university general chemistry laboratory

Alev Doğan a, Osman Nafız Kaya b,*

a Gazi University, Gazi Faculty of Education, Department of Science Education, Ankara-06500, Turkey
b Firat University, Faculty of Education, Department of Science Education, Elazig-23169, Turkey

Received October 22, 2008; revised December 16, 2008; accepted January 4, 2009

Abstract

The purpose of this study is to explore the views of Pre-service Science Teachers (PSTs) on their posters as an authentic assessment tool in an open-ended chemistry laboratory course. Forty-three PSTs set up their research questions, designed and performed laboratory investigations, obtained the results of their experiments, and prepared posters as an alternative to traditional laboratory reports. They engaged in the assessment process of the course as both self and peer evaluators. At the end of each laboratory session, each group of PSTs presented a poster of their laboratory investigation to their peers and instructors. Analysis of the PSTs' interviews indicated that most PSTs identified many benefits of presenting laboratory investigations with posters in comparison to traditional laboratory reports.

© 2009 Elsevier Ltd. All rights reserved.

Keywords: Pre-service teachers; science teacher education; science laboratory; authentic assessment; poster session.

1. Introduction

In higher education, instructors have traditionally decided what should be assessed and determined what students have learned: they choose the assessment methods, set the criteria, and carry out the assessment. In this process, students hold no power and do not participate at all in decision-making about their learning progression (Giles et al., 2004; Heron, 1981; Reynolds & Trehan, 2000). In response to such concerns, over the past decade there have been proposals for more participative approaches to assessment, such as self, peer, collaborative, or consultative assessment, especially in higher education (Reynolds & Trehan, 2000). Unfortunately, however, most assessment in university courses is still designed and conducted by instructors, and students are rarely given central responsibility for the entire process of planning, designing, and implementing an assessment (Giles et al., 2004).

In science education, the assessment tools and approaches often used by science educators over the past decade (e.g., multiple-choice and open-ended questions) have also been questioned because of the lack of students' active participation in the assessment process and because they tend to reflect rote learning (Zoller et al., 1999).

* Nafız Kaya. Tel.: +90-505-866-8902; fax: +90-424-236-5064. E-mail address: [email protected].
1877-0428/$ - see front matter © 2009 Elsevier Ltd. All rights reserved.
doi:10.1016/j.sbspro.2009.01.148

In particular, assessment in science laboratories has persistently been carried out using traditional laboratory reports and quizzes. Twenty-two years after their first literature review of science laboratories in 1982, Hofstein and Lunetta (2004) examined, in a new era of reform in science education, what has changed in science laboratories. They noted that knowledge about new laboratory teaching approaches and strategies, such as problem-based, discovery, and inquiry-type laboratories, has increased substantially since 1982, yet most assessment of students' understanding in the science laboratory remains confined to conventional, usually objective, paper-and-pencil measures (Hofstein & Lunetta, 2004). For example, Yung (2001) reported that many teachers lack experience with methods that would enable them to assess their students' understanding and performance in the science laboratory. Zoller and his colleagues (1997, 1998, 1999) have called for research to develop and implement authentic assessment approaches that attempt to capture more complex aspects of learning in science classrooms and laboratories, since very few assessment methods or tools that push beyond tasks of information retrieval, re-labeling, or recognition, or that focus beyond basic skills in the subject matter, are currently in use from the elementary to the college level (Zoller et al., 1999, p. 112).

In this connection, assessment of the conceptual understanding of Pre-service Science Teachers (PSTs) in general chemistry laboratories at education faculties in Turkey has generally been carried out through traditional methods such as quizzes, laboratory reports, and a final exam. The purpose of these tests is to differentiate among the teacher candidates and to rank them according to their achievement in the chemistry laboratory. Domin (1999a, b) states that this kind of learning environment in general chemistry laboratories is also common worldwide. For example, Abraham et al. (1997) investigated how general chemistry laboratory courses were taught and managed, and what practices, including assessment, were being used, in 203 randomly selected U.S. colleges and universities with chemistry programs approved by the American Chemical Society. Most respondents reported that laboratory reports and quizzes or exams were the major contributors to the laboratory grade at their institutions.

The purpose of this study was to investigate the views of PSTs on their posters as an authentic assessment tool in an open-ended general chemistry laboratory course. The study focuses on PSTs' posters as an authentic assessment tool within a student-centered approach to evaluating their understanding of general chemistry laboratory investigations. In this paper, "authentic assessment" refers to assessment that is used to enhance learning and teaching and in which PSTs have central responsibility for the entire process of planning, designing, and implementing the assessment.

2. Method

2.1. Sample

Forty-three science teacher candidates (26 females and 17 males, ages 18 to 20) were randomly selected from a total of 603 teacher candidates enrolled in a university general chemistry laboratory course in the Gazi Faculty of Education, Gazi University, Ankara, Turkey. The PSTs were placed in two sections taught by the same instructors. Within these two sections, students were placed in small groups of 3-4 so that they could communicate with one another and seek assistance primarily from their peers before, during, and after the laboratory investigations.

2.2. Research design and procedure

In this study, the general chemistry laboratory course for the PSTs ran for 13 weeks in each of two semesters. The PSTs met once each week for a laboratory session of three hours. During the first semester, the PSTs performed experiments using an expository laboratory approach with traditional laboratory reports; during the second semester, they carried out their laboratory investigations in an open-ended laboratory environment accompanied by poster presentations as an alternative to traditional laboratory reports. At the end of the study, the PSTs were individually interviewed to identify their views on the posters as an assessment tool. Each interview, conducted by one of the researchers, lasted about 25-30 minutes. All interviews were carried out in Turkish, audio-recorded, and transcribed verbatim.

2.3. Data analysis

The PSTs' interview transcripts were read and interpreted by the researchers. An inductive process was first used to identify themes and patterns describing the teacher candidates' views concerning the new forms of assessment experienced in the laboratory course, and the student responses were coded. Then the researchers and one external expert independently analyzed the interview responses using the same transcripts and coding sheet. When discrepancies arose across the individual analyses, the researchers and the expert reviewed the interview transcripts together with the same coding sheet, discussed the discrepancies in coding, and reached consensus on the students' views regarding the assessment approach of the laboratory course.
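To illustrate how coded responses can be tallied by theme, a minimal Python sketch follows; the theme labels, student identifiers, and coded entries in it are hypothetical, since the actual analysis was carried out by hand on the shared coding sheet.

from collections import Counter

# Hypothetical coding sheet: each entry pairs a fictitious PST identifier
# with the set of theme codes assigned to that interview transcript.
coded_transcripts = [
    ("PST01", {"reduced test anxiety", "increased motivation"}),
    ("PST02", {"creativity", "communication skills"}),
    ("PST03", {"reduced test anxiety", "communication skills"}),
]

# Count how many PSTs were coded under each theme.
theme_counts = Counter()
for _pst, themes in coded_transcripts:
    theme_counts.update(themes)

for theme, count in theme_counts.most_common():
    print(f"{theme}: {count} of {len(coded_transcripts)} PSTs")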

2.4. Program development and context

During the first semester, instruction in the general chemistry laboratory was carried out through expository approaches. The PSTs spent more time trying to obtain the correct results from the laboratory manual than thinking about how the chemical concepts were related to the laboratory experiment. In this kind of learning environment, students strive to collect records of their experiments, transform these records into graphs or diagrams, and draw conclusions, often without knowing the reasons. Assessment of the PSTs' understanding during the first semester was done primarily through short quizzes and traditional laboratory reports for each laboratory session and a final exam. During this semester, the PSTs learned and practiced how to carry out laboratory experiments using step-by-step instructions from the laboratory guide, prepare laboratory reports, and take quizzes or exams for their laboratory grades.

During the second semester, the same 43 PSTs met once each week for the laboratory session and prepared posters as an alternative to traditional laboratory reports for their laboratory grades. The course was structured as an open-ended laboratory, and the PSTs were not provided with a laboratory manual for their investigations. At the beginning of the semester, the PSTs were given only the chemical topic of each week (e.g., acids-bases, reaction rate, and chemical equilibrium) on which they would perform their laboratory investigations. They were not given research questions, step-by-step experimental procedures, or the possible results of their experiments. The instructors communicated with the students primarily through e-mails via an online communication group that the students created, and met with the PSTs regularly during 12 hours of sessions per week, especially to help them shape their own laboratory investigations. During these pre-lab meetings, 15-20 minutes were spent with each group of PSTs. Before these meetings, the PSTs were asked to determine their research questions and the design of their laboratory investigations, including the necessary chemicals and materials. In the meetings, the students were also expected to provide a rationale for their choice of research questions, so that the instructors could judge whether the questions dealt with chemical concepts covered by the objectives of the general chemistry laboratory course and could provide the necessary chemicals, materials, and a safe laboratory environment for the PSTs.

2.5. Poster presentations

At the beginning of the second semester, the PSTs were given detailed information about what a scientific poster is, its purpose, and the way it is prepared and presented. The researchers then distributed, as examples, some of the posters they had presented at scientific conferences. Finally, the errors that frequently occur in poster presentations were discussed with the PSTs. For example, they were asked to prepare their posters with tables, graphs, and pictures that would summarize their laboratory research in the most appropriate and appealing manner. The posters were prepared in the general format described in the literature (e.g., Huddle, 2000; Sisak, 1997). As in a scientific poster, every poster started with the title, the names and universities of the presenters, and an abstract. The PSTs were told to keep the text as short as possible and to ensure that the text and schemes were readable from a distance of 3 m. They were also advised to use different colors and symbols to highlight the important points. At the end of this training session, a rubric reflecting the specific structure and goals of the course was developed by the researchers and students to assess the posters. The rubric set the standards for student work products and consisted of three parts.

The first part covered the content, organization, and design of the poster. The second part focused on the individual assessment of the students: students were individually assessed on their ability to respond to visitors' questions during the poster sessions. Three criteria were used for the individual grading: the number of questions, the difficulty level of the questions, and the accuracy of the responses. For the third part, students were expected to provide written narrative feedback: they were asked to give comments and feedback on how their laboratory research could be improved. The active participation of the students was assessed based on the number and quality of the questions they asked of their peers during the poster presentations. We concluded that the poster presentations would show the quality of the students' laboratory work. Accordingly, grading of the course was based on the quality of, and the students' performance in, their own poster presentations and on their active participation in the other poster presentations.

Each group started to prepare its poster immediately after completing its laboratory investigation. The group members first worked together to summarize the theoretical and experimental data from their laboratory investigation into 10-12 pages. At the end of each laboratory session, each group of PSTs presented the poster of its laboratory investigation to peers and instructors. The PSTs engaged in the assessment process of the course as both self and peer evaluators: in self assessment, students judge their own work, while in peer assessment they judge the work of their peers. The last 10-15 minutes of each laboratory session were spent assessing the poster of each group using the rubric. The evaluation of the course consisted of self, peer, and instructor assessments, and the final grade of the course was calculated as the average of the scores obtained from the self, peer, and instructor assessments (a minimal sketch of this calculation is given at the end of this section).

In addition, in the final week of the course a poster presentation including all of the posters was held in an exhibition hall of the faculty in the presence of the dean, assistant deans, faculty, staff, and other teacher candidates. The whole faculty of education and its teacher candidates were informed of the time and place of the poster presentations by e-mail and written announcements. The PSTs answered the visitors' questions in the spirit of true scientists, and utmost care was taken to make the poster presentation resemble a real scientific conference.
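To make the final-grade calculation described above concrete, a minimal Python sketch is given below; the 100-point scale and the equal weighting of the three rubric parts within each assessment are illustrative assumptions only, since the final grade is specified above simply as the average of the self, peer, and instructor assessments.

# Each assessment (self, peer, instructor) scores the three rubric parts:
# (1) poster content/organization/design, (2) individual responses to
# visitors' questions, (3) written narrative feedback. A 100-point scale
# and equal weighting of the parts are assumed here for illustration only.

def rubric_score(poster_part, individual_part, feedback_part):
    # Average of the three rubric parts for one assessor.
    return (poster_part + individual_part + feedback_part) / 3

def final_grade(self_scores, peer_scores, instructor_scores):
    # Average of the self, peer, and instructor assessments, as described above.
    return (rubric_score(*self_scores)
            + rubric_score(*peer_scores)
            + rubric_score(*instructor_scores)) / 3

# Example: one group's hypothetical scores from the three assessors.
print(final_grade((85, 80, 90), (88, 82, 86), (90, 85, 88)))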

3. Results

Analysis of the interviews showed that most of the PSTs identified many benefits of presenting their laboratory investigations with posters compared to traditional laboratory reports. Almost all of the teacher candidates reported that the poster sessions, as an assessment approach, were an exciting and interesting way of sharing the information obtained from their laboratory investigations. Thirty-six PSTs stated that the posters dramatically decreased their test anxiety and so increased their motivation toward laboratory work. Most of the students also reported that the poster sessions and their active role in assessment improved their attitudes toward chemistry laboratory work. Some excerpts from the students' interview responses include:

- The lab reports we previously prepared were almost identical to each other. But that is not the case here. Everybody's posters are different. I think it is a very different way to share your study with others. I would rather present posters than write a formal laboratory report.

- I really enjoyed preparing and assessing our posters. I never felt that somebody was assessing my knowledge or investigation in the chemistry laboratory during this process.

- I also used to feel insecure in the lab, especially because of the exams. However, I feel much more relaxed in the chemistry lab now, because I was involved in the whole process of poster assessment as both a self and a peer evaluator. This application decreased my stress and increased my self-confidence in lab work, because everything about the assessment was very clear and I am also part of it. I think this approach will provide a meaningful learning process and improve the attitudes of many of my friends towards the lab in a positive way.

- By this application, chemistry labs are no longer an obligation to me. I now regard the lab as a hobby garden which grows my scientific capacity. I don't realize how the time passes.

Thirty-nine PSTs expressed that the poster sessions as an assessment approach encouraged creativity in their laboratory work, increased their investigative capability and their argumentation and communication skills in science, and helped them better understand their laboratory investigations, for example by evaluating the quality and validity of their data and by attempting to find alternative explanations for their findings based on evidence and logic.

Some excerpts taken from the interviews include:

- I used to memorize the topics before the exams and forget everything after the exam. But this application enabled me to learn much more easily, and it stays in my head more permanently. Now we are discussing scientific concepts within a scientific context through our poster assessments.

- In our poster assessments, we learned how to communicate with each other scientifically and how to analyze our lab investigations critically. This is very important because it led us to work much harder in the chemistry laboratory.

- During our poster sessions, we gradually discovered what kinds of data should be reported, how we can link our data and claims using scientific reasons, and how we can make counterarguments to criticisms. This approach, of course, improved our communication skills and encouraged us to learn more and more in the chemistry laboratory.

- When we assessed each other through our posters, we discussed a lot around the concepts, principles, theories, and results related to our experiments, and how the design and procedures could be improved. Sometimes we arrived at one conclusion, and often we did not. Accordingly, we understood that discussion is one of the most important elements of scientific progress.

In brief, student response was generally positive, and positive comments were in the majority. Student comments on the assessment were usually along the lines of "…poster sessions help to break up the chemistry lab stress…" and "…the poster sessions were a good way of assessment. I think they helped put each lab into scientific perspective." Some students commented, "…I would rather present posters than write a formal laboratory report," and "…I am sure that this assessment approach is a near-perfect way to teach students how scientific knowledge is formed and presented. That is why I very much want to use it when I start teaching myself."

References

Abraham, M. R., Cracolice, M. S., Graves, A. P., Aladamash, A. H., Kihega, J. G., Palma Gil, J. G., & Varghese, V. (1997). The nature and state of general chemistry laboratory courses offered by colleges and universities in the United States. Journal of Chemical Education, 74, 591-594.

Domin, D. S. (1999a). A review of laboratory instruction styles. Journal of Chemical Education, 76, 543-547.

Domin, D. S. (1999b). A content analysis of general chemistry laboratory manuals for evidence of higher order cognitive tasks. Journal of Chemical Education, 76, 109-111.

Giles, A., Martin, S. C., Bryce, D., & Hendry, G. D. (2004). Students as partners in evaluation: student and teacher perspectives. Assessment and Evaluation in Higher Education, 29, 681-685.

Heron, J. (1981). Self and peer assessment. In T. Boydell & M. Pedler (Eds.), Management Self Development, Concepts and Practices (pp. 111-128). Farnborough: Gower.

Hofstein, A., & Lunetta, V. N. (1982). The role of the laboratory in science teaching: Neglected aspects of research. Review of Educational Research, 52, 201-217.

Hofstein, A., & Lunetta, V. N. (2004). The laboratory in science education: Foundations for the twenty-first century. Science Education, 88, 28-54.

Huddle, P. A. (2000). How to present a paper or poster. Journal of Chemical Education, 77, 1152.

Reynolds, M., & Trehan, K. (2000). Assessment: a critical perspective. Studies in Higher Education, 25, 267-278.

Sisak, M. E. (1997). Poster sessions as a learning technique. Journal of Chemical Education, 74, 1065-1067.

Yung, B. H. W. (2001). Three views of fairness in a school-based assessment scheme of practical work in biology. International Journal of Science Education, 23, 985-1005.

Zoller, U., Ben-Chaim, D., & Kamm, S. D. (1997). Examination type preferences of college science students and their faculty in Israel and USA: A comparative study. School Science and Mathematics, 97, 3-12.

Zoller, U., & Ben-Chaim, D. (1998). Student self-assessment in HOCS science examinations: Is there a problem? Journal of Science Education and Technology, 7, 135-147.

Zoller, U., Fastow, M., Lubezky, A., & Tsaparlis, G. (1999). Students' self-assessment in chemistry examinations requiring higher- and lower-order cognitive skills. Journal of Chemical Education, 76, 112-113.