Computers & Education 50 (2008) 338–356

Using a voting system in conjunction with interactive whiteboard technology to enhance learning in the English language classroom

Euline Cutrim Schmid*

University of Education (Pädagogische Hochschule), Heidelberg, Germany

Received 20 February 2006; received in revised form 3 July 2006; accepted 6 July 2006

Abstract

This study discusses the pedagogical potential of an interactive voting system used in conjunction with interactive whiteboard technology. The data discussed here are drawn from a qualitative study carried out in the context of a British university pre-sessional programme in English for Academic Purposes and Study Skills for international students in the summers of 2003 and 2004. Research data were collected via a variety of ethnographic research instruments, namely classroom observations and feedback from critical colleagues, teacher's field notes, video recordings of classes, semi-structured interviews with students, and pre- and post-course student questionnaires. The findings indicate that the electronic voting system was seen to increase considerably the scope of interactivity during the lessons by helping students to develop into active participants. However, the data have also indicated that the levels of interactivity in the approaches adopted in the context investigated could still be considered relatively ''shallow'', and some suggestions have been provided to improve this aspect of technology use.

© 2006 Elsevier Ltd. All rights reserved.

Keywords: Evaluation of CAL systems; Improving classroom teaching; Interactive learning environments; Pedagogical issues

1. Introduction

In the UK, there has been considerable investment in the installation of Interactive Whiteboards (IWBs) in schools. The percentage of primary schools with IWBs increased from 48% in 2003 to 63% in 2004, and that of secondary schools from 82% in 2003 to 92% in 2004 (DfES, 2004). However, because the IWB is a relatively new technology in education, the available academic literature is limited. Nevertheless, there are a number of reports of small-scale research projects undertaken by individual teachers, schools and higher education institutions all over the world, which are available on the Internet. There are also a number of descriptions of good practice published in newspapers and magazines.

* Present address: English Department, University of Education (Pädagogische Hochschule), Keplerstraße 87, D-69120 Heidelberg, Germany. Tel.: +49 6221 477252. E-mail addresses: [email protected], [email protected]



Fig. 1. This is the way the questions and answers are presented to the students. They then have to choose one option and vote by using their voting keypads.

Fig. 2. The results can be shown in terms of percentages of right and wrong answers for each question. This option is usually used when the teacher wants to know the performance of the group as a whole.

The IWB is a touch-sensitive presentation device that is used in connection with a computer and a digital projector. The computer images are displayed on the board by the digital projector, where they can be seen and manipulated by touching the board, either with a finger or with an electronic pen/stylus. The Promethean™ system (the brand of whiteboard technology used in this research) has also developed a whole suite of software and peripheral hardware to complement the use of an interactive whiteboard, such as the ''ACTIVstudio'' software, the ''ACTIVslate''1 and the ''ACTIVote''. This article focuses specifically on the voting component of Promethean IWB technology.

1 The ACTIVslate is an A5 graphic tablet which operates remotely with the ACTIVboard, enabling teachers and students to take control of the IWB from anywhere in the class.

Fig. 3. The teacher can also opt for showing the results in terms of who answered what. If the teacher knows which device (voting keypad number) each student is using, it is then possible for her/him to know who got it wrong and who got it right.

Fig. 4. In the end the teacher can also show the students' overall scores. In this way, they are able to know their overall performance in the activity and check their position in relation to the whole group.

The ACTIVote system is a wireless response system enabling students to respond to assessment and other questions. Students are given an ACTIVote keypad (which can be registered with the IWB so that each student has a unique ID), and can respond to teachers' questions. Results can then be displayed immediately on the ACTIVboard in graphical format and can be exported to a spreadsheet. See Figs. 1–4 for more information on how this system works.
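The three result views described above and in the figure captions (per-question percentages, ''who answered what'', and overall scores that can be exported to a spreadsheet) can be made concrete with a short sketch. The Python code below is purely illustrative: the data structures, keypad IDs and CSV layout are assumptions made for the example, not the ACTIVstudio software or its export format.

# Illustrative sketch: tallying hypothetical keypad responses and producing the
# three result views referred to above. Not the ACTIVstudio software.
import csv
from collections import Counter

answer_key = {"Q1": "B", "Q2": "A", "Q3": "C"}   # correct option per question
responses = {                                     # keypad ID -> chosen options
    "keypad01": {"Q1": "B", "Q2": "A", "Q3": "A"},
    "keypad02": {"Q1": "C", "Q2": "A", "Q3": "C"},
    "keypad03": {"Q1": "B", "Q2": "D", "Q3": "C"},
}

# View 1 (cf. Fig. 2): percentage of right and wrong answers for each question.
for q, correct in answer_key.items():
    votes = [r[q] for r in responses.values()]
    right = sum(v == correct for v in votes)
    print(f"{q}: {100 * right / len(votes):.0f}% correct, choices: {dict(Counter(votes))}")

# View 2 (cf. Fig. 3): "who answered what" - meaningful if keypads are registered.
for keypad, answers in responses.items():
    print(keypad, {q: ("right" if a == answer_key[q] else "wrong") for q, a in answers.items()})

# View 3 (cf. Fig. 4): overall scores, exported to a spreadsheet-readable CSV file.
with open("activote_scores.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["keypad", "score"])
    for keypad, answers in responses.items():
        writer.writerow([keypad, sum(a == answer_key[q] for q, a in answers.items())])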


The study from which the data are drawn was carried out in the context of a British university pre-sessional programme in English for Academic Purposes (EAP) and Study Skills for international students in the summers of 2003 and 2004. The research attempted to understand the impact of the use of IWB technology (and peripheral hardware used in conjunction with it) in the research situation by analysing classroom interaction2 data and the views of a range of participants. In order to address the purposes of such a study, two main research questions were formulated. The first (RQ 1) concerns the classroom teaching/learning processes generated by the use of IWB technology, and the second (RQ 2) the perceptions of the main participants concerning these processes. RQ 1 was, in turn, subdivided in order to focus (in RQ 1a) on types of classroom interaction and (in RQ 1b) on pedagogical goals, i.e., the major kinds of learning being facilitated by the use of the IWB technology. The following broad research questions were thus chosen:

1. How is the classroom teaching/learning process affected by the use of IWB technology?
(a) What kinds of interactions are produced when the technology is implemented?
(b) What kinds of pedagogical goals (e.g., to enhance collaboration) may the technology help to achieve?
2. How do teachers, learners and researchers perceive the introduction of IWB technology in the language classroom in terms of teaching-learning processes?

In general, research findings have indicated that the IWB technology has the potential to extend teaching/learning opportunities both at the whole class and individual learning levels. This article, however, focuses only upon the ACTIVote component of interactive whiteboard technology. It presents and analyses data which indicate that the use of a voting system in conjunction with IWB technology increases the scope of interactivity in the language classroom. The article also examines the notion of interactivity as it pertains to learning (Aldrich, Rogers, & Scaife, 1998) and evaluates the levels of interactivity that teacher and learners experienced in the context investigated. It concludes with some suggestions for exploiting the potential of the voting system for facilitating interactivities which support effective learning in the language classroom.

2 The theory of learning that underlies this work is that learning is socio-historically constructed. I adopt, therefore, a Vygotskian perspective in which learning is characterized as being essentially a process that is enabled by social interaction. Learning is here defined as a joint interactive action towards the development of common knowledge (Bruner, 1985; Vygotsky, 1978). Hence, the analysis of interaction is considered essential to the understanding of the processes of knowledge construction in the classroom.

3 ACTIVslate and ACTIVote.

4 It is important to bear in mind that most of the literature on IWBs in schools deals with their use for whole class teaching without interactive voting systems. However, these findings are still relevant as it can be argued that the key issues of interactivity remain the same for IWBs on their own or for voting systems used in conjunction with IWBs.

2. Literature review

2.1. The promise of enhanced interactivity

IWB technology and the remote devices3 that are used in connection with it have been mainly advertised with a promise of enhanced ''interactivity'' in the classroom.4 As Smith, Higgins, Wall, and Miller (2005, p. 94) observe, ''One of the major advantages claimed with regard to IWBs as a teaching tool is that they are 'interactive'''. In the literature reviewed, several authors have emphasised the potential of IWB technology for facilitating more interactive lessons. For these authors (e.g., Bell, 2000; Glover & Miller, 2001; Greiffenhagen, 2000) an ideal class using interactive whiteboard technology would feature students and


teachers working together to construct the content of the lesson by using the resources made available by the technology and relying on the expertise of the whole class. They argue that teachers should adopt a more interactive approach to teaching if they want the IWB technology to become a transformative device to enhance learning. These authors emphasise the interactive feature of the technology, i.e., the idea that the electronic whiteboard should be used as an important tool to improve the levels of interaction in the classroom and promote collaborative learning.

Greiffenhagen's (2000) findings, for instance, suggest that in order to achieve a real educational benefit with the use of this technology, it is important to focus on the possibilities of enhancing the communication and interaction of the students. He mentioned the use of electronic tablets, such as the ACTIVslate, as a way of encouraging students to share their work with others, making sure in this way that more text written by students (rather than the teacher) might be displayed on the board. The availability of electronic slates is also mentioned by other authors (e.g., Glover & Miller, 2002) as an important feature of the technology that can be used to promote higher levels of classroom interaction. They point out that these devices can enable pupils to work in small groups or individually and then transmit their work to the board for whole class discussion.

Bell (2000) also emphasised the potential of IWB technology for promoting higher levels of interactivity. She conducted a quasi-experimental study over a six-week period in order to explore the effect of the IWB as a shared collaborative workspace on writing achievement, writing attitudes, and computer attitudes among eighth grade students in a school in Texas. Her findings suggest that the interactive quality of the board lends it to a degree of group participation not offered by other presentation devices, thus making it an especially valuable teaching tool. In the research reported by Levy (2002), the teachers also highlighted that the IWB can be considered an ''effective stimulus for teacher–student interaction since the students can contribute to generating learning resources''. Moreover, they saw the board as an effective means of enabling students to present and discuss work.

It is important to consider, however, that many of the claims about ''enhanced interactivity'' refer to ''technical or physical interactivity'' between students and computer or whiteboard, for instance through the use of remote devices (e.g., Greiffenhagen, 2000). Little has been said about the potential of the technology for encouraging ''cognitive interactivity'', i.e., the learning activities which are supported when interacting directly with the technology (Aldrich et al., 1998), and ''socio-cognitive'' interactivity, associated with learning processes involving co-construction of knowledge between teacher and learners (Smith et al., 2005).

Aldrich et al. (1998) draw attention to the fact that the term ''interactivity'' has become ubiquitous, especially in association with the use of new technologies: we constantly hear about the alleged benefits of interactive TV, interactive multimedia, interactive video and so on. Through the analysis of CD-ROM packages and reviews, they developed a method for assessing the ''added'' value of educational CD-ROMs compared with traditional materials by examining the notion of interactivity as it pertains to learning.
The method they propose is aimed at moving the emphasis away from the level of physical interactivity at the interface (i.e., button presses and mouse clicks) to a consideration of cognitive interactivity (i.e., the learning activities which are supported when interacting with the software). The authors concluded that a key element in putting the interactivity potentially provided by information technology to good use is designing activities that are engaging but also enable the learner to understand concepts and to reflect on and integrate different kinds of knowledge. They therefore highlight that it is important to identify the interactivities that support effective learning compared with those that are largely superficial.

What, then, does ''interactivity'' really mean? And how has it been implemented in contexts in which IWB technology is used? In order to answer these questions, several authors (e.g., Burns & Myhill, 2004; English, Hargreaves, & Hislam, 2002; Hargreaves, 2003; Mroz, Smith, & Hardman, 2000; Smith, Hardman, Wall, & Mroz, 2004) have examined teachers' understandings of ''whole class interactive teaching'' and analysed classroom interaction in literacy and numeracy lessons, which are contexts in which IWB technology is widely used. They analysed classroom interaction, interviewed teachers and learners, and reached the conclusion that the term interaction was used by the teachers investigated to refer to a range of differing levels of communication. An interesting finding was that interaction is usually seen as question-dependent, and that teachers' utterances that are not questions are not regarded as interactive: the ''higher the ratio of question to statements, the more


interactive the teaching'' (English et al., 2002, p. 22). Teachers seemed to be interpreting ''most interactive'' as intense ''question and answer'' sessions, and most examples of ''interaction'' therefore comprised mainly quick-fire question and answer work. Feedback follow-up, which would encourage thinking and learning in the IRF sequence (Sinclair & Coulthard, 1975), was not in evidence in the lessons investigated, and the pattern of control of content was fully in the teachers' hands. The authors saw this as a major constraint on interaction to promote thinking. In these studies, some teachers recognized that they did not always listen to students' responses. The main focus was on recitation questioning, which seeks predictable correct answers, and teachers' questions were rarely used to assist pupils towards more complete or elaborated ideas (Mroz et al., 2000, p. 2). The authors therefore concluded that interaction in whole class settings is, for most of the teachers investigated, equated with discourse patterns where pupils take part mainly in response to the teachers' questions and invitations to respond, with little extension of opportunities to use talk to work actively on their own thinking and learning experiences.

Although these authors have not focused specifically on the uses of IWB technology, they have analysed classroom interaction in settings (UK primary schools) in which this technology is widely used, and some of them make reference to the technology and the so-called ''technical forms'' of interaction enabled by it. For instance, when discussing ''surface'' forms of interactive teaching found in their investigation, Hargreaves (2003) gives the example of ''gimmicky'' techniques such as ''the use of phoneme fans, various games or whiteboards''.

The findings of this body of research have been supported by further research at primary school level, namely Goodison (2003), Hall and Higgins (2005) and Smith et al. (2006), who have focused on the analysis of pupils' perceptions regarding the use of IWB technology and on classroom interaction supported by IWB technology. Goodison's work (2003) is part of a wider investigation into how IWB technology is being used in UK primary schools. It focused on the analysis of 20 lessons in a school where interactive whiteboards had been in use in all subjects, and at all grades, for over a year. Through a detailed analysis of two lessons, she illustrated how IWB technology can be used for completely different purposes, depending on teachers' views of what learning is and how knowledge should be constructed in the classroom. She concluded that teachers' own understandings of ''interactivity'' defined to a great extent their ways of exploiting the potential of the technology for teaching/learning. She observes that, while in the science lesson the technology was used by the teacher as an important ally in the process of co-construction of knowledge through interaction, in the history lesson the technology contributed very little to the learning process and perhaps even impeded it, since a teacher-centred approach was used, with ''interaction'' being employed mainly to fulfil the teacher's agenda. Hall and Higgins (2005) analysed pupils' perceptions of IWB use and reached the conclusion that students' involvement in lessons where the IWB was used (in the context investigated) was mainly limited to fulfilling the teacher's objectives for the lesson, which they considered a rather limited form of interaction.
Similar conclusions were reached by Smith, Hardman, and Higgins (2006), who investigated the impact of IWBs on teacher–pupil interaction at Key Stage 2 in the teaching of literacy and numeracy in English primary schools. After observing and analysing 184 lessons (of which 70 used IWBs), the authors concluded that traditional patterns of whole class interaction (i.e. the typical IRF sequence) persist despite the emphasis on interactive whole class teaching and the introduction of IWBs.

Therefore, it seems to be the case that what has been reported in the literature as ''enhanced interaction'' might carry different meanings depending on the level of interaction that is being referred to. This article draws on research data to discuss the importance of developing computer-mediated classroom activities which support cognitive and socio-cognitive levels of interactivity (Aldrich et al., 1998). The next section describes the research context in which this investigation took place and presents the research methods used in this study.

3. Research context and research methodology

The EAP/Study Skills summer programme is intended to prepare international students for study at the university during the coming academic year in terms of their English language and study skills needs.


The programme recruits international students from a number of countries who will start a variety of different degrees at the university. Most students enrolled on the programme were from mainland China and Taiwan, but there were also students from Japan, Korea, Africa, South America and Europe. Their target subject area was mainly Management Sciences; other subjects included Computer Sciences, Engineering, Law, Environmental Sciences and Linguistics. They had, in general, a good command of the English language (upper intermediate to advanced level). All the students in both studies were postgraduates, apart from one undergraduate in course 1, and their ages ranged from 20 to 36 years.

The main components of the EAP/Study Skills programme are: (a) academic reading and writing, (b) listening comprehension and (c) speaking skills. The students are also offered a number of ''complementary courses'', two of which – ''Using ICT for Academic Study in English'' (in 2003 – Study 1) and ''Web resources for learning English'' (in 2004 – Study 2) – provided the sites for the research which is the focus of this article. The addition of these two ''complementary courses'' was motivated by the interest in creating a component within the EAP/Study Skills programme on the use of ICT in developing academic literacy practices (this had not been done in any meaningful way prior to this). Given their focus, the IWB was seen as a particularly appropriate technology to use for teaching such courses. The other components of the programme, however, were taught in the same way as in previous years, i.e., in classrooms equipped with a computer and a projector, which were rarely used by the teachers.

Thus, in the summer of 2003, the first complementary course, entitled ''ICT for Academic Study in English'', was implemented. Four units of teaching activities, each lasting 90 min of class time, were developed for this course on the following topics: (a) Websites for lexical study, (b) evaluating Web material, (c) Internet search strategies and (d) avoiding online plagiarism. For the course investigated in stage 2, implemented in the summer of 2004, four units of teaching activities were developed on the following topics: (a) University Life in the UK, (b) British Life and Culture, (c) Learning with Computers and (d) Writing and Reading Online.

The IWB technology (and the peripheral hardware associated with it) was used quite intensively in all stages of the lessons. Several electronic flipcharts and voting questionnaires (involving the use of the ACTIVote system) were designed for all units; the teacher and students used the ACTIVboard (which replaced the traditional whiteboard) as a presentation device and as a platform for integrating different types of technology (e.g., video, sound, multimedia, Internet). The teacher also used the ACTIVslate to give ''technical'' support to the students while they were at the board.

The ACTIVote system was used to support a wide range of classroom activities. It was sometimes used in the introduction stage to find out what the students already knew about the theme of a session and/or to foster their curiosity about a certain topic. For instance, for the session entitled ''British life and culture'' the students did a voting activity at the beginning of the lesson in the form of a quiz entitled: What do you know about British life and culture? The quiz contained eight questions related to cultural concepts (e.g., shepherd's pie, double-decker, cricket, etc.) and general knowledge about British politics and culture (e.g., the prime minister, Shakespeare, etc.). Students were then divided into groups of three in order to take part in a ''competition game''.

In other sessions, the voting system was used to launch discussions and stimulate debate. For instance, in the session entitled ''Internet search strategies'', the voting activity contained questions about the best online search strategies to use for specific searches on the Internet, for instance: ''if you want to find out about the relationship between food and hyperactivity, which of these search strategies would you use: (a) ''food hyperactivity'', (b) intitle:food hyperactivity, (c) food or hyperactivity and (d) allintitle:food hyperactivity''. The voting exercise was designed so that there was not one correct answer but several possibilities; its aim was to trigger reflection and discussion about search strategies and to consolidate the students' knowledge of the topic.

The voting system was also used to evaluate students' level of understanding before making certain pedagogical decisions. For instance, in the session entitled ''Learning with computers'', a listening comprehension activity was carried out in the form of a voting activity. By doing this, I was able to check the percentage of students who got the right answer for a specific question, and I used this information to decide whether or not to show the students the electronic transcript of a specific section of the video clip.
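As a rough illustration of this kind of on-the-spot decision (not part of the study's actual materials), the short Python sketch below turns the percentage of correct votes on a question into a next step for the teacher. The function name, the keypad IDs and the middle branch are assumptions made for the example; the 50% threshold mirrors the rule described later in the article (show the transcript if at least half the students were wrong).

# Illustrative sketch: using live voting results to choose the next teaching step.
def next_step(votes: dict[str, str], correct_option: str) -> str:
    right = sum(choice == correct_option for choice in votes.values())
    percent_correct = 100 * right / len(votes)
    if percent_correct == 100:
        return "move on to the next question straightaway"
    if percent_correct <= 50:
        # At least half the students were wrong: show the electronic transcript.
        return "show the transcript of this section of the video clip"
    # Hypothetical middle case, not specified in the article.
    return "briefly discuss the distractors before moving on"

# Example: nine students voted on a multiple-choice listening question.
votes = {f"keypad{i:02d}": choice for i, choice in enumerate("BBABACBDB", start=1)}
print(next_step(votes, correct_option="B"))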


Since there were 29 students enrolled on the programme in Study 1 and 33 in Study 2, it was decided that each unit would be taught three times to different sub-groups of students. In this way, the teacher–researcher could work with relatively small groups composed of only eight to eleven students, thus facilitating the implementation of a hands-on approach with the use of IWB technology, where the students had the opportunity to use the technology to take part in class activities and carry out tasks. Another advantage of this arrangement is that it resulted in each lesson being taught three times, which facilitated fine-tuning of the materials and broadened possibilities for data collection.

The aim of the research was to explore the potential of the IWB technology (and peripheral hardware) for supporting the language learning process and also for helping learners to acquire electronic literacy during these two courses. It was a classroom-based investigation, in which I (the teacher–researcher) was responsible for designing and implementing the course as well as collecting and analysing the data. All the lessons were video recorded, and after each lesson I wrote field notes that focused mainly on my perceptions of the impact of IWB technology on the pedagogical process. The research approach was collaborative in the sense that ten other critical colleagues (five in Study 1 and five in Study 2) were involved in the investigation process. All of them were highly experienced language teachers and four of them were academic researchers with a good level of expertise in qualitative research. Each critical colleague observed one lesson, wrote field notes and filled out a research questionnaire. At the end of both modules, the students filled out a course evaluation questionnaire focusing on their perceptions of the impact of IWB technology on their learning process. In the final week of Study 1, I also ran a focus group meeting with 12 students who volunteered to take part. At the end of Study 2, I carried out semi-structured individual interviews with ten students who volunteered for this role; each interview lasted on average 30 min.

4. Data analysis

4.1. Perceived pedagogical benefits

4.1.1. Increasing the scope of interactivity

The analysis of students' questionnaires and focus group findings showed that the ACTIVote system was the component of Promethean IWB technology that students most associated with interactivity and active participation. The following quotes are some examples of students' statements on this score, taken from the post-course questionnaires:

The voting activity made me feel participating more in class. (Post-course questionnaire – Study 1)

It made everyone take part in it, and sometimes brought a little competition. (Post-course questionnaire – Study 1)

I can express my views through voting technology. (Post-course questionnaire – Study 2)

Through voting everyone can participate in the activities during the lessons. (Post-course questionnaire – Study 2)

One of the critical colleagues in Study 1 pointed out that he/she was impressed with the levels of students' participation during the lessons in which the voting system was used. He/she observed:

That's maybe the main advantage of the whiteboard technology over a traditional whiteboard. Each and every student was involved throughout the whole lesson. (Critical colleague – 20/08/03 – Unit 2 – Study 1)

In the individual interviews conducted in Study 2, several students emphasised the important role played by the voting system in enhancing the scope of interactivity and student active participation during the sessions. In the following extract, the student pointed out that, by using the voting system, she was encouraged to join in classroom activities and this made her feel more ''useful in the class'':


1. T: The whiteboard and other things, the software, the voting or the tablets that we
2. used, that sometimes I used, can you think of any advantages?
3. S: I think it's very convenient for students to follow the teacher's step and students
4. can use it, and students like to enjoy the class, it's very interesting for students.
5. T: And why? Why do you think the student found it interesting?
6. S: Cause the students will think they are in the class and they will join the class and
7. he is a member of the class, he's useful in the class, I think.
8. T: Useful?
9. S: Not just listen what teacher said, he is joining, joining the class.
10. T: Joining?
11. S: Yeah.
12. T: How? How did the students join?
13. S: Students can use the whiteboard and voting and the students get the main point
14. of this class.
(Post-course interview with CJ – Study 2)

In lines 6, 7 and 9, the student highlighted that the voting system was especially valuable for allowing her (and the other students) to participate actively in the class and interact with the teacher and other classmates. It also seems that the use of the voting system had a positive impact on the affective dimension of the pedagogical process. In line 7, the student also made some reference to how she felt during the voting activities, i.e., as a useful member of the class. It is reasonable to assume that this also affected her disposition to participate. Another student also highlighted this specific advantage of the technology, as can be seen in the following extract of a post-course interview:

S: [. . .] The voting system for me is very new, I had never seen something like that, the students can take part of the courses, you have to be active on the course, so I think it's a good thing. (Post-course interview with Andy – Study 2)

Also, in the following interview extract, the student referred to the use of the voting system as a ''fun obligation'' (line 5), i.e., although the students were ''forced'' to participate, they enjoyed doing so and benefited from it:

1. T: Can you talk a little bit more about the advantages of using the voting system in
2. class? For your learning, or for the dynamics, or for anything. What kind of
3. advantages can you see?
4. S: Maybe because it makes that all the people can participate in the class, this
5. sounds as an obligation, but a fun obligation.
6. T: An obligation, but a fun obligation? (laughing)
7. S: Yes, and there are many people who don't like speaking their points of view and
8. with this system, all of us need to participate.
(Post-course individual interview with Lauren – Study 2)

According to this student, the main advantage of using this system is that it ''forces'' everyone to participate, even those students who did not feel comfortable expressing their points of view in public. The added value of this technology for encouraging active participation of shy and self-conscious students was a recurring theme in the interviews. Although the students recognized the value of active participation in class, they also admitted that many of them were too shy, and the voting system was a good way to give these students a voice, as can be seen in the following extract of a Study 2 interview:

1. S: Yeah, I liked the voting, because it's quite interesting, because sometimes our
2. course goes until noon, so sometimes we want to go away, but I think it's quite
3. interesting, so I can wake for the course.
4. T: So you can wake up during the session because you have to vote.
5. S: It's quite interesting . . . and I can know other's thinking, and maybe
6. sometimes people are too shy and the voting (. . .) explain the idea . . .
7. T: Sorry? The voting?
8. S: I mean that (pointing to the voting keypads). I don't know how to say that.
9. T: Ah, you mean the voting keypads?
10. S: Yeah, yeah.
11. T: You said sometimes people are shy and then . . .
12. S: They are shy if they have to speak something in the class, but the voting
13. you have to explain your idea . . . even via that machine, but I think it's a
14. good way.
(Post-course Interview with June – Study 2)

In this extract, the student mentioned three advantages of the technology, all related to active participation and enhanced interactivity. In line 3, the student stated that the voting system helps her to stay awake during the sessions, since she finds the voting activities interesting. In lines 5, 12 and 13, she explains further why she finds the use of the voting system attractive: it helps her to know what other people think about the topics discussed in class, and it allows shy people to express their own points of view. In the following extract, the student also referred to the same topic but added some thoughts about the importance of the anonymity provided by the equipment:

1. T: What about . . . are there any advantages of using the voting system in class, in
2. terms of learning, because you said this first voting activity was very interesting
3. and you talked, conversation, etc. . . . but can you see any advantages of using
4. this voting system in class?
5. S: It will encourage students to answer, because if no one glances at the
6. number, nobody will know my answer, so maybe people will be very nervous
7. to raise their hand, to answer in public, but it's just another way,
8. I can hide but I can still express my opinion. And sometimes you have to do
9. that, because there is a small line on the top that will show who didn't vote.
10. T: Yeah, you have to vote anyway.
11. S: Yeah, it will force us to think, to think, to get the answer very quickly, so . . .
12. T: It forces you to think?
13. S: Cause for Asian students, the teaching style for us is just normally just the
14. teacher talking, talking, talking and we just listening, active students will listen
15. and have responses but it's not necessary in class, so I can imagine that most
16. students will get lost . . . but this one (pointing to the voting system) . . .
17. concentrate our attention, with questions.
(Post-course interview with Sheena – Study 2)

In line 8, the student stated: ''I can hide but I can still express my opinion''. For this student, the fact that she could express herself without the fear of getting embarrassed in front of the whole group was an important advantage of the voting system. By using the voting keypads the students can answer questions without embarrassing themselves, since the voting system can be operated in anonymous mode.5 Although there is the possibility of working in named mode,6 this option was not used because I was not particularly interested in evaluating the progress of individual students, and especially in order to avoid inhibiting their participation.

5 The term anonymous mode is used to indicate that users will be asked to vote and the results will then be displayed, but the group will be unable to determine who voted for which answer.

6 The term named mode is used to indicate that users will be asked to vote and the results will then be displayed with each individual named alongside their vote.

From line 13 onwards, the student referred to the kind of classroom interaction Asian students are usually used to, i.e., the teacher talks while the students listen. She then went on to say that the voting system was


effectively used as a tool to break this pattern, since the students were required to answer questions and this helped them to concentrate on the lessons and not ''get lost'' so easily, as would have been the case in a traditional teacher–student interaction.

In the context of EAP/Study Skills courses the characteristics of students can vary a lot, since the groups are composed of learners from different nationalities, age ranges and with different cultural and educational backgrounds. Some students are eager to participate while others typically never say a word during the lessons, and this system is especially beneficial for the latter type of students because it appears to stimulate all of them to participate more actively. I also perceived the use of the voting system as an ally in keeping the students alert and engaged in the lessons. In my field notes on Unit 3 of Study 1, I wrote:

The class became very interactive, with the students being forced to participate through the voting system. As they said, they were forced to think all the time because I required them to answer questions about the lesson and they didn't want to make mistakes. There was a kind of competition going on among the students. They also didn't want me to think that they weren't following the class. (Teacher's field note – 28/08/03 – Unit 3 – Study 1)

Here I suggested that one of the reasons why the students tended to keep alert all the way through the lesson was because they knew that at any time I might ask them for feedback or involve them in some kind of discussion. Unlike the normal classroom, where this could also happen, in the IWB lesson this can be done in an instantaneous, completely comprehensive and public way, thus greatly increasing the force of this element of the pedagogy.

Taken as a whole, then, the data reviewed in this section indicate that the voting system added to the functionality of the IWB by increasing the scope of interactivity between teacher and students, and among the students themselves, during the lessons. The data indicate that the students thought that the voting system used in conjunction with IWB technology forced them to become more actively involved in the learning process, especially those who tended to be more passive in other lessons. However, it must also be said that other parts of the research data point towards a variety of potential problems related to the appropriate pedagogical exploitation of the interactive voting system. While some of these problems were related to inherent characteristics of the technology, e.g., the fact that it only supports multiple-choice questions, other problems were related to the pedagogical treatment of the interactions that surrounded its use. These issues will be further discussed in the following section.

4.2. Perceived pedagogical challenges

4.2.1. Guessing and lack of subsequent clarification

The fact that the voting system allows the students to guess the correct answer was cited by several students in the individual interviews as one of the main disadvantages of this technology. In the following extract, for instance, a student discusses this issue and its implications for the learning process:

1. T: Can you think of any disadvantages of using the voting system in class? For
2. learning during the sessions?
3. S: Ah, yes, the sole disadvantage I see is that it gives you the opportunity to
4. guess, and I think this is not a good thing, we have to be sure of what we are
5. doing.
6. T: And then you can guess?
7. S: Yeah, you can guess, you can choose, I vote A, A, A; and in the end you may
8. have nearly 50%.
9. T: Yeah, but you just guessed.
10. S: Yeah, you just guessed.
(Post-course Interview with Soul – Study 2)

In line 7, he referred to a strategy that can be used by students when they want to obtain a good overall score through guessing. He also highlighted, in lines 4–5, that this kind of attitude impedes the learning


process, since the students ''are not sure of what they are doing'', i.e., they do not actively engage with the learning material to construct new knowledge.

The analysis of the interaction patterns between teacher and learners indicates that another factor that possibly encouraged guessing was the excessive focus on the ''search for the correct answer'' aspect of this kind of activity. This limited students' active participation and tended to work against remediation. The following classroom exchange, for instance, illustrates the type of teacher–learner interaction that typically characterized the follow-up discussions of voting results. The main aim of the activity that was being carried out was to check students' understanding of the numerous online resources that could be used to check meanings of words, collocations, related words and translations. These resources had been presented earlier in the session. Each multiple-choice question presented a language problem, and after the voting activity the students were asked about the resources that could be used to solve that specific problem. The exercise being discussed here asked them about the preposition that goes together with the verb draw:

1. T: Ok, so let's see the correct answer. Oh! Only three people got it right!
2. Ss: OOOOh!!
3. T: The correct answer is number?
4. Ss: Two.
5. T: Two, I draw on research I conducted in Egypt. So you have three correct
6. ones and six wrong, ok? Six wrong . . . let's see . . . 33% right and 66%
7. wrong.
8. Ss: AAAAh!
9. T: Ok, imagine for this problem, if you were writing an essay, what kind of
10. website would you use? I mean in terms of these resources that we were
11. discussing, what do you think?
12. S1: I would use Collins.
13. S2: Collins.
14. T: Collins, yeah, because you could use Verb + preposition, ok? And then you
15. could see, ah! It's draw on, not draw out and so on . . . ok? So now let's see the next one.
(13/08/03 – Unit 1 – Study 1)

In this exchange, the students were not encouraged to move beyond the ''search for the correct answer'' and to use their previous knowledge, experience and creativity to initiate or contribute to a follow-up discussion after the voting results had been shown. In lines 4, 12 and 13, for instance, the students provided answers to the teacher's questions, but they were not encouraged to justify their answers. Although the students were asked about the resource they would choose to solve the problem, they were not asked why the one they proposed constituted the best option. Instead, I provided the explanation myself in line 14 and moved on to the next exercise. Another interesting feature of this exchange is the emphasis placed by the teacher and the students on the percentage of correct and wrong answers and their emotional involvement with the results. In lines 2 and 8, the whole class lamented not having ''scored'' well in the exercise.

A review of the videos of the lessons shows that the interaction pattern of most voting follow-up discussions was: (a) the students answered a question and (b) the teacher–researcher commented on their responses, taking more care over those questions where many students had answered incorrectly. However, the students rarely went beyond this. They were not routinely given the opportunity to justify their answers, ask for clarification, ask questions or raise issues. As a result, the level of interactivity mediated by the voting system in the context investigated can still be considered relatively shallow, since the focus was on the level of physical interactivity (Aldrich et al., 1998). These findings are similar to those obtained by, e.g., Goodison (2003) (in her analysis of the history lesson) and Hall and Higgins (2005), who observed that most examples of ''interaction'' around IWBs comprised mainly quick-fire question and answer work, and the pattern of control of content was fully in the teachers' hands.


The analysis of videotaped recordings of the lessons has also shown that I often underexploited the potential of the technology for providing an insight into students' level of understanding. For instance, when the question involved right and wrong answers, I tended to focus on the correct answer and overall scores, and hardly ever tried to check what each student had answered by making use of the ''who answered what'' mode of showing voting results. By knowing what the students had answered or why they had made a mistake, I would be in a better position to guide them in the process of knowledge construction, since I would be able to find out what their current level of understanding was and work within their ''zone of proximal development'' (Vygotsky, 1978). This would also probably help me to find out whether the students were guessing or whether their answers were the result of active engagement with the learning material.

As an example, in Unit 3 of Study 2 I used the voting system in conjunction with a listening activity. The students were given a handout with multiple-choice questions and listened to the video clip. After watching the video twice and providing answers to the questions, the students had to submit their responses through a voting activity. Their responses were then analysed and, depending on the results, I could make decisions regarding what to do next, which would vary from moving on to the next question straightaway (if all students were right) to showing the transcript of the video extract (if at least half the students were wrong). The analysis of all the video recordings of this specific activity has shown that the only time I used the ''who answered what'' mode to try to understand why the students had made a mistake was in the following sequence:

1. T: Wow, 40% correct and wrong 60%! So let's discuss this, maybe I'm wrong. I
2. would say letter B is the correct answer: ''older people are getting more
3. acquainted with computers''. Do you understand the meaning of this expression
4. here: ''acquainted with''?
5. S: Yes.
6. T: What does it mean, Perry?
7. S: To become better, to know how to use a computer.
8. T: Yeah, to become familiar with computers, to get knowledge of computers,
9. Ok. So this is the correct answer, but some people got different answers, so
10. let's see who answered what. Many people chose A ''older people are afraid
11. of technology'', let's see if this is the focus, maybe she said something like
12. that, but what has she really noticed in the last three years? So let's
13. see . . .
(24/08/04 – Unit 3 – Study 2)

Since there was a high percentage of wrong answers to this specific question, I decided to find out what kind of answers the students had provided. By doing this, I learnt that most students had chosen option (A). Although the students were not asked to explain their reasoning, the analysis of the results allowed me to deduce why they had made a mistake, and in lines 11–13 I explained to the students where they had gone wrong. In this sequence, therefore, it can be seen that the interactive voting system was used to support higher levels of cognitive and socio-cognitive interactivity, since the system was used as a medium for facilitating the process of knowledge construction and not only for assessing students' performance.

The research findings, however, have shown that this specific use of the interactive voting system was not exploited to its full potential in the context investigated. I believe that the students could have benefited from this application of the technology, especially if it had been used in conjunction with a more interactive approach, in which the students were regularly encouraged to explain the reasoning behind their choices if they felt like doing so. The following section discusses some pedagogical implications of the findings and provides some recommendations for solving problems associated with classroom interactive voting activities.

4.3. Pedagogical implications

4.3.1. Dealing with guessing

Wit (2003) analysed the implementation of a Personal Response System (PRS) within a statistics service course for first-year psychology students at the University of Glasgow. According to him, a good way to


discourage guessing is to include an option through which the students can admit their ignorance. He therefore suggests that an ''I don't know'' choice be included by default in all voting questions. In his investigation he found that this strategy worked best when the ''expression of ignorance'' was taken seriously by students and teachers, i.e., the students only started using this option when they understood that their honesty could contribute positively to their learning process.

Another possible way to discourage guessing during the voting sessions would be through the use of confidence buttons. Some other systems, such as the PRS hardware, include devices that enable the students to indicate the level of confidence with which they answered the question. The handsets usually have 10 number buttons and 2 ''confidence buttons''. A student can use these extra two buttons to modify their response to show either higher or lower confidence in their answer, and the handset defaults to ''normal confidence''. In my view, this seems to be a better way to deal with the problem of guessing, since the students would probably be less afraid of losing face this way than by simply choosing an ''I don't know'' option. It also provides the teacher with some idea of how certain the students are that their answers are correct.
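As a rough sketch of how such responses could be recorded and summarised for the teacher (hypothetical data structures only, not the PRS hardware or its software), each vote could carry an optional confidence level alongside the chosen option, together with an ''I don't know'' choice of the kind Wit (2003) recommends:

# Illustrative sketch: votes carrying an optional confidence modifier, in the
# spirit of the PRS handsets described above (two extra buttons raise or lower
# the default "normal" confidence). All names and structures are hypothetical.
from collections import Counter
from dataclasses import dataclass

@dataclass
class Vote:
    keypad: str
    choice: str                 # e.g. "A", "B", "C", "D" or "I don't know"
    confidence: str = "normal"  # "low", "normal" or "high"

votes = [
    Vote("keypad01", "B", "high"),
    Vote("keypad02", "B", "low"),
    Vote("keypad03", "I don't know"),
    Vote("keypad04", "C"),
]

# Teacher's summary: how many students chose each option, and how confidently.
print(Counter(v.choice for v in votes))
print(Counter((v.choice, v.confidence) for v in votes))

# Low-confidence answers and "I don't know" flag where remediation is needed,
# and help to separate genuine understanding from lucky guesses.
unsure = [v.keypad for v in votes if v.confidence == "low" or v.choice == "I don't know"]
print("Follow up with:", unsure)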
While examining the usefulness of discussions during the voting activities, one student in Study 2 suggested that one way to discourage guessing would be to launch peer discussions before voting. According to him, by discussing all the alternatives with a peer or group, students may be in a better position to provide more confident responses:

S: I prefer discussing before voting because I don't want to guess, I want to be sure of my answers, I want to have objective answer, no (. . .) I want to discuss before, I don't want to guess, I want to be sure of my answers. (Post-course Interview with Soul – Study 2)

In my view, this kind of strategy could be used in cases where the question is really difficult and an individual response would not be possible for many students in the group. In such cases, a group discussion might help them to unpack the question and possibly discourage guessing. Another factor that seems to have an important influence on the amount of guessing done by the students is the kind of remediation process (or subsequent clarification) that is carried out during the voting sessions. As the research data discussed in the following section will show, the students tend to think more carefully about their answers if they know that they might be asked to provide the reasoning behind them during the remediation process.

4.3.2. Providing effective follow-up discussion or subsequent clarification

Several authors (e.g., Cutts & Kennedy, 2005; Stuart, Brown, & Draper, 2004; Draper & Brown, 2004) have drawn attention to the importance of combining the use of voting systems with the provision of effective remediation (or subsequent clarification). Research has pointed out that lecturers using this technology often become fully aware of the range of misunderstandings in their groups, but cannot address them all due to time pressure. Stuart et al. (2004), for instance, highlighted that voting systems provide an interesting way to gain insight into students' progress; however, this is not valuable unless the lecturer is flexible enough to respond to students' feedback. These authors claim that, in the long term, the lack of effective remediation may have a negative impact on students' level of engagement with voting activities. Cutts and Kennedy (2005) have thrown some light on this issue. They conducted a three-year investigation of the use of an electronic voting system (EVS) in programming lectures, and their findings showed that many students did not use the system as often as expected. One of the reasons they pointed out as a possible explanation for this behaviour was that lecturers did not always seem to provide effective remediation during voting sessions, which according to them ''limited learning and reduced over time the enthusiasm to work out an answer or respond to a question'' (p. 3).

How, then, could teachers work effectively on remediation in this context? In the study reported here, the students made several suggestions as to how to make the best use of the interactive voting system. Some students provided hints, for instance, on how to make sure that all students, even those who guessed, engage actively with the learning material. The student below, for instance, suggested that, after the results had been shown, the students who had provided correct answers in multiple-choice


exercises should be given the opportunity to provide a rationale for their choices, i.e., provide arguments for and against certain options (lines 6–7):

1. S: When you are voting, sometimes you ask about questions and we choose the
2. wrong answers and we want to know which one is right, so we can get the right
3. one quickly.
4. T: So you like that?
5. S: Yes, but the important one is that after you get the right answer, I think
6. the teacher should give the students the opportunity to explain why they chose
7. this right answer.
(Post-course interview with Win/Danny)

Another student emphasised the importance of the discussions that happened after the voting activity. For her, the most important aim of these discussions should be to explore the rationale behind the answers for each question, which in line 9 she refers to as ''profound opinions'':

1. T: And after knowing the results, after I showed the results, did you compare
2. your performance with the others in the group?
3. S: Yes, I always compared with others and maybe I am right, but I want to know
4. why I am right and why he or she is wrong.
5. T: So even when you are right you want to know why the others were wrong?
6. S: Yeah.
7. T: Why, why do you think this happens?
8. S: Because the questions are not only right or wrong, the answers are not only
9. right or wrong, maybe some profound opinions.
(Post-course Interview with Birdie – Study 2)

In this extract the student highlighted that the learning process involves not only knowing what is right or wrong, but mainly understanding what it actually means to be right or wrong. The student thus stressed the importance of students developing principled knowledge instead of ritual knowledge (Edwards & Mercer, 1987).7 This issue was also raised by another student in the following extract:

1. S: I think the voting system can never answer who is right and who is wrong.
2. After the voting system the teacher says, let the students who answered right
3. stand up and explain. (laughter)
4. T: Why do you think this is important?
5. S: Because the students who got the wrong answer want to get the
6. explanation.
7. T: Why do you think I should get the people who got it right to
8. explain?
9. S: Makes the class more active.
10. T: Makes the class think?
11. S: I think the web resources class is . . . most of the time . . . you speak
12. too much.
13. T: I speak too much? (all laughing)
14. S: make the students say more.
15. T: make the students say more?
16. S: Share their opinions, say . . . they (. . .)
17. T: If you don't force people they won't say. Do you agree?
18. S: Yes, imagine you are wrong, and your classmate is right, but he just
19. guessed, you will think it's unfair.
(Post-course Interview with Win/Danny – Study 2)

7 Edwards and Mercer (1987) defined principled knowledge as ''essentially explanatory, oriented towards an understanding of how procedures and processes work, of why certain conclusions are necessary or valid'' (1987, p. 97). Ritual knowledge of curriculum content, on the other hand, would be merely knowing what to say or do when asked by the teacher.

In line 1, the student drew attention to a certain limitation of the voting system in terms of providing an overall picture of students' knowledge and level of understanding of issues taught in the class, since the students are able to guess. He therefore proposed, in lines 2–3, that the students be asked to justify their choices when they provided the right answers. In line 6, he argued further that this kind of technique would also benefit those students who were not as successful and needed to accommodate their knowledge. He then concluded, in line 11, by ''criticizing'' the teacher for not giving the students enough opportunities to speak in class, which according to him would allow the students to become more active.

In this section, therefore, the students emphasised the importance of working towards principled knowledge, i.e., conceptual understanding. In other words, for these students it is important that the teacher focus more on the rationale behind the answers, i.e., on the ''whys'', and not so much on the voting results or on students' performances.

4.3.3. Improving levels of learner-initiated dialogue

As discussed earlier in this article, the typical model of use of voting systems in lessons is that the teacher asks a question, waits until learners respond to the question using their voting handsets and then analyses the answers, thus following the pattern of the IRF sequence. However, some authors have pointed out possibilities for exploiting the system for alternative modes of interaction, for instance for opening up opportunities for learner questions and reactions.

Cutts, Kennedy, Mitchell, and Draper (2004), for instance, conducted an investigation on the use of voting systems in connection with Group Response Systems (GRSs) and extensions to GRS, such as the Clapometer. According to the authors, this software opens up more possibilities for learner-initiated dialogue, since it is a GRS mode which continuously allows learners to inform the lecturer whether they are following the presentation or not. By using the Clapometer, the students are able to respond to a question while the teacher is delivering material. The lecturer can display questions like: Are you confused by the current topic? Or are you bored by the current topic? Although the students cannot ask questions, they can at least ''flag'' that they have something to say by using the system. The current response set can be continuously displayed, either privately to the teacher or to the entire class, and the students can change their votes at any time.

According to the authors, this system can be used to facilitate dialogue between learners and lecturers and also among learners themselves. While the lecturer can use the response display to adapt his presentation to the needs and reactions of the audience, the students are able not only to expose their own reactions, but also to find out how the whole class is responding to the lecture. As a result, they may feel more comfortable raising issues or asking questions, since they know that they are not the only ones who want to say something.
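A minimal sketch of this kind of continuously updated response display (an illustration of the idea only, not the GRS/Clapometer software; the class and method names are invented for the example) might keep a running tally that students can change at any time and that the lecturer can consult while teaching:

# Illustrative sketch of a continuously open "are you following?" poll, in the
# spirit of the Clapometer mode described above. Names are hypothetical.
class RunningPoll:
    def __init__(self, prompt: str, options: tuple[str, ...]):
        self.prompt = prompt
        self.options = options
        self.current: dict[str, str] = {}  # keypad ID -> latest choice

    def vote(self, keypad: str, choice: str) -> None:
        # Students may change their vote at any time; only the latest one counts.
        if choice not in self.options:
            raise ValueError(f"unknown option: {choice}")
        self.current[keypad] = choice

    def display(self) -> dict[str, int]:
        # Snapshot that can be shown privately to the teacher or to the whole class.
        return {opt: sum(c == opt for c in self.current.values()) for opt in self.options}

poll = RunningPoll("Are you confused by the current topic?", ("yes", "no"))
poll.vote("keypad01", "no")
poll.vote("keypad02", "yes")
poll.vote("keypad02", "no")  # a student changes her mind mid-presentation
print(poll.prompt, poll.display())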
In my view, the use of such extensions to the voting system could add a great deal to the pedagogical process, since the students would have access to additional tools that could motivate them to contribute more actively to the lesson. The distinctive feature of this technology is that it would allow them to voice their questions and doubts with less risk of losing face, so the amount of learner-initiated dialogue would be more likely to increase. Although the students did not refer specifically to the voting system, I believe that, in the context investigated, the system was underexploited in this regard, i.e., in encouraging learners to use the technology to initiate dialogue rather than only to respond to questions created by the teacher, and in this way to support higher levels of interactivity (Aldrich et al., 1998). Although the voting system has great potential for encouraging active student participation, this involvement remains limited if all the voting activities are designed by the teacher and leave little space for learner-initiated questions.

In Study 1, all the voting activities were designed by the teacher. In Study 2, however, I decided to involve learners in the design of one of the activities. During the Unit 3 sessions each class was divided into two groups, and each member of the two groups was asked to send two questions by email related to the content of the course. I then chose some of these questions to design the questionnaires that the groups would use to test each other in the final assessment competition. The students therefore took part in two different stages of the voting activity, i.e., design and implementation. I believe that the students could have benefited from more opportunities like this, in which they would be asked to interact with the content of the course in a different way, i.e., by "playing the tutor" (Clarke, 1989). This would also give the teacher a better idea of what students regard as important and which topics they find most interesting.
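Purely as an illustration of this two-stage design, and not of the procedure actually used (the Study 2 questionnaires were compiled by hand, and all names and questions below are invented), the following sketch shows student-submitted questions being pooled and each group's quiz being drawn from the other group's submissions.

```python
import random

# Hypothetical data: each student emails two questions related to the course content.
submitted = {
    "Group A": {
        "Student A1": ["What does 'plagiarism' mean?", "Name two features of academic style."],
        "Student A2": ["What is a topic sentence?", "When should you paraphrase rather than quote?"],
    },
    "Group B": {
        "Student B1": ["What is a thesis statement?", "How is a reference list ordered?"],
        "Student B2": ["What does 'hedging' mean?", "Give one example of a linking word."],
    },
}

def build_quiz(author_group, n_items=3, seed=1):
    """Pool one group's submitted questions and select a few for the voting activity."""
    pool = [q for questions in submitted[author_group].values() for q in questions]
    random.Random(seed).shuffle(pool)
    return pool[:n_items]

# Each group answers questions authored by the other group ("playing the tutor").
quiz_for_group_a = build_quiz("Group B")
quiz_for_group_b = build_quiz("Group A")
print(quiz_for_group_a)
print(quiz_for_group_b)
```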

5. Summary and conclusion

This article first presented and discussed data indicating that the use of a voting system in conjunction with IWB technology can contribute to enhancing the scope of interactivity in the language classroom. In other words, the use of the voting system seems to have the potential to help learners develop a greater sense of ownership of their learning, since during the voting sessions the attention appears to shift away from the teacher to the views of the students themselves, in a very clear-cut and rapid way.

In the second part, the article discussed some pedagogical challenges involved in the use of the voting system in the context investigated. The data indicate, for instance, that the potential of the voting system for opening up opportunities for guessing was seen negatively by some students. The article also presented classroom data showing how certain patterns of teacher–student interaction slowed down the remediation process and possibly encouraged guessing, such as the excessive focus on the "search for the correct answer". This means that the effects of the interactive voting system discussed in Section 4.1.1 can by no means be taken for granted: much depends on the pedagogical treatment of the interaction that surrounds its use. The data analysis suggests that, in order to use the voting system to support high levels of cognitive and socio-cognitive interactivity, it would be necessary to work further on the appropriate management of the voting sessions. Throughout this article, some pedagogical implications drawn from this investigation and recommendations for the use of classroom voting activities were also presented and discussed, involving techniques such as ways of handling guessing in the context of pedagogical voting activities, providing effective subsequent clarification and improving levels of learner-initiated dialogue (see Table 1 for a summary of the main findings).

Table 1
Summary of main findings

Perceived pedagogical benefits: increasing the scope of interactivity (enhanced participation; anonymity; ownership of learning; instant feedback; fun).

Perceived challenges and corresponding pedagogical implications:
- Guessing: dealing with guessing by including an "I don't know" option, using confidence buttons, and launching peer discussions before voting.
- Lack of subsequent clarification: providing effective subsequent clarification through students giving a rationale for their choices, effective follow-up discussions and effective remediation.
- Lack of learner-initiated dialogue: encouraging learner-initiated dialogue by using the Clapometer and involving learners in the design of voting sessions.

One possible limitation of this research is that the nature of the data did not allow me to draw any definite conclusions about the effects of technology use on learning outcomes. Although I consider "learning outcomes" a valuable feature to investigate, its evaluation requires methods that measure students' performance at the end of a period of study, which is beyond the scope of this investigation. The focus was therefore mainly on learning processes, as a way of understanding the impact of the technology on how students learn and on how I teach.
The main aim of the research was to provide a thorough analysis of classroom practices so as to better understand the process of IWB technology integration in a specific educational context. The results of this analysis (although not generalizable) can then be used by other educationalists either as food for thought or as possible guidance in their own endeavours.

The findings of this research have indicated that the use of a voting system in conjunction with IWB technology has the potential to enhance the scope of interactivity in the language classroom. Therefore, I believe it is worth pursuing this study on a larger scale, making use of the technology in a greater number of lessons and involving more students as well as other teachers in the pedagogical process. This would open up possibilities for a deeper evaluation of the impact of the technology on language teaching practices and language learning processes. Also, because this investigation was conducted only at an early stage of implementation, it would be interesting to investigate the use of the technology in lessons in which any initial novelty effect had worn off, in order to find out whether pedagogical practices and teachers' and students' perceptions change considerably once they have acquired greater familiarity with the IWB technology.

References

Aldrich, F., Rogers, Y., & Scaife, M. (1998). Getting to grips with "interactivity": helping teachers assess the educational value of CD-ROMs. British Journal of Educational Technology, 29(4), 321–332.
Bell, M. A. (2000). Impact of the electronic interactive whiteboard on students' attitudes and achievement in eighth-grade writing instruction. Unpublished PhD dissertation, Baylor University.
Bruner, J. (1985). Vygotsky: A historical and conceptual perspective. In J. Wertsch (Ed.), Culture, communication and cognition: Vygotskian perspectives. Cambridge: Cambridge University Press.
Burns, C., & Myhill, D. (2004). Interactive or inactive? A consideration of the nature of interaction in whole class teaching. Cambridge Journal of Education, 34, 35–49.
Clarke, D. (1989). Materials adaptation: why leave it all to the teacher? ELT Journal, 43(2), 133–141.
Cutts, Q., & Kennedy, G. (2005). Connecting learning environments using electronic voting systems. In Proceedings of the 7th Australasian Conference on Computing Education (Vol. 42, pp. 181–186). Newcastle, Australia.
Cutts, Q., Kennedy, G., Mitchell, C., & Draper, S. (2004). Maximising dialogue in lectures using group response systems. In Proceedings of the 7th IASTED International Conference on Computers and Advanced Technology in Education, Hawaii, USA.
DfES (2004). Information and communications technology in schools in England 2004 – First Release. Accessed 23rd August 2005 from: http://www.dfes.gov.uk/.
Draper, S. W., & Brown, M. I. (2004). Increasing interactivity in lectures using an electronic voting system. Journal of Computer Assisted Learning, 20, 81–94.
Edwards, D., & Mercer, N. (1987). Common knowledge: The development of understanding in the classroom. London: Methuen.
English, E., Hargreaves, L., & Hislam, J. (2002). Pedagogical dilemmas in the National Literacy Strategy: primary teachers' perceptions, reflections and classroom behaviour. Cambridge Journal of Education, 32(1), 9–26.
Glover, D., & Miller, D. (2001). Running with technology: the pedagogic impact of the large-scale introduction of interactive whiteboards in one secondary school. Journal of Information Technologies for Teacher Education, 10(3), 257–278.
Glover, D., & Miller, D. (2002). The interactive whiteboard as a force for pedagogic change: the experience of five elementary schools in an English Education Authority. Information Technology in Childhood Education Annual, 1, 5–19.
Goodison, T. (2003). Integrating ICT in the classroom: a case study of two contrasting lessons. British Journal of Educational Technology, 34(5), 549–566.
Greiffenhagen, C. (2000). From traditional blackboards to interactive whiteboards: a pilot study to inform technology design. In Proceedings of the 24th International Conference on the Psychology of Mathematics Education (Vol. 24(2), pp. 305–312).
Hall, I., & Higgins, S. (2005). Primary school students' perceptions of interactive whiteboards. Journal of Computer Assisted Learning, 21(2), 102–117.
Hargreaves, L. (2003). How do primary school teachers define and implement 'interactive teaching' in the National Literacy Strategy in England? Research Papers in Education, 18(3), 217–236.
Levy, P. (2002). Interactive whiteboards in learning and teaching in two Sheffield schools: a developmental study. Department of Information Studies (DIS), University of Sheffield. Accessed 5th December 2002 from: http://dis.shef.ac.uk/eirg/projects/wboards.htm.
Mroz, M. A., Smith, F., & Hardman, F. (2000). The discourse of the literacy hour. Cambridge Journal of Education, 30, 379–390.
Sinclair, J. M., & Coulthard, R. M. (1975). Towards an analysis of discourse: The English used by teachers and pupils. London: Oxford University Press.
Smith, F., Hardman, F., & Higgins, S. (2006). The impact of interactive whiteboards on teacher–pupil interaction in the national literacy and numeracy strategies. British Educational Research Journal, 32(3), 443–457.
Smith, F., Hardman, F., Wall, K., & Mroz, M. (2004). Interactive whole class teaching in the National Literacy and Numeracy Strategies. British Educational Research Journal, 30, 395–411.
Smith, H. J., Higgins, S., Wall, K., & Miller, J. (2005). Interactive whiteboards: boon or bandwagon? A critical review of the literature. Journal of Computer Assisted Learning, 21(2), 91–101.
Stuart, S., Brown, M., & Draper, S. W. (2004). Using an electronic voting system in logic lectures: one practitioner's application. Journal of Computer Assisted Learning, 20, 95–102.
Vygotsky, L. S. (1978). Mind in society: Development of the higher psychological processes. Harvard: Harvard University Press.
Wit, E. (2003). Who wants to be . . . the use of a personal response system in statistics teaching. MSOR Connection, 3(2), 14–20.

Euline Cutrim Schmid is an assistant professor of English and applied linguistics at the University of Education (Pädagogische Hochschule) Heidelberg in Germany. She has a Ph.D. in Linguistics from Lancaster University, U.K. Her doctoral research, concluded in November 2005, focused on the use of interactive whiteboard technology for the teaching of English as a Foreign Language (EFL). From 2003 to 2005 she worked as a part-time tutor at Lancaster University, where she taught applied linguistics and English for Academic Purposes (EAP) at undergraduate and postgraduate levels. She has an M.A. in Language Teaching from Lancaster University and an M.A. in Applied Linguistics from the Federal University of Rio de Janeiro, Brazil. Before teaching in the higher education context, she taught EFL for more than 10 years in various language schools in Brazil.