Computers and Composition 21 (2004) 217–235
The impact of e-feedback on the revisions of L2 writers in an academic writing course

Frank Tuzi
MTS Technologies, Johnstown, PA 15901, USA
Abstract

This study explored electronic feedback (e-feedback) and its impact on second-language (L2) writers' revisions, specifically focusing on how L2 students responded to their peers and what kinds of revisions they made as a result of the feedback they received. The 20 L2 writers wrote, responded, and revised on a database-driven web site specifically designed for writing and responding. Other forms of feedback they received included oral feedback from friends and peers and from face-to-face meetings with university writing center tutors. Results suggest students preferred oral feedback. However, e-feedback had a greater impact on revision than oral feedback, implying that e-feedback might be more useful. Additionally, e-feedback helped L2 writers focus on larger writing blocks. Thus, L2 writers may use e-feedback to create macro revisions. This exploratory study highlights a new form of revising and responding and offers insights into joining oral response to online collaboration.

© 2004 Elsevier Inc. All rights reserved.

Keywords: E-feedback; Electronic response; Peer feedback; Revision; Second language (L2) writing; Teacher feedback
1. Introduction

A new form of feedback is emerging with the expansion of the Internet. Electronic feedback (e-feedback)—feedback in digital, written form and transmitted via the web—transfers the concepts of oral response into the electronic arena. The merging of oral response and technology benefits the second-language (L2) classroom; however, it can also filter out much-needed and often-used modes of communication. It is important to discover if limiting the modes of communication to digital written messages is a benefit to or an obstacle for L2 writers. The
literature on oral feedback is substantial and continues to grow. Yet, there have been few prior studies that investigated the impact of e-feedback on L2 writers' revision. Thea Van der Geest and Tim Remmers (1994) were correct when they stated, "With increasing pace, people are using networks to collaborate on writing tasks" (p. 249). Instructors and students are using technology to collaborate on writing tasks, but the research on the effects of this technology on L2 writing has not kept up with the increase of technology in L2 instruction. The increasing impact of technology on L2 instruction is inevitable. What is important for L2 instructors and researchers is to determine how to most effectively use technology in curricula and, if possible, to have a part in molding the technology even as it molds methodology. Of central concern in this research is determining how technology influences L2 writing. More specifically, this study focuses on how e-feedback impacts L2 writers' revisions. The results of this study can help educators better understand how e-feedback and web-based writing can impact L2 writing and suggest ways to incorporate this writing environment into L2 writing programs.

This study attempted to mold technology by having an online application developed for the specific purpose of posting student writings online and accepting e-feedback from web site users and visitors. This research combined response theory and writing technology, the very components that form electronic feedback, and investigated their impact on the L2 writing process. The central focus is to examine how e-feedback affects L2 writers' revisions.
2. Related literature

There is a growing body of literature on the first component of this study—response to writing. Several studies investigated the effects of peer and teacher response on L2 revision and indicated the benefits it has for L2 writing. For example, oral feedback increases participation and interaction among L2 writers because it includes more opportunities for negotiating meaning (Mendonça & Johnson, 1994) and for scaffolding (Carson & Nelson, 1996). Oral feedback also aids in developing critical reading skills, analyzing writing skills, helping L2 writers to recognize audience needs, and encouraging writing as a process (Ferris & Hedgecock, 1998; Mendonça & Johnson, 1994; Mittan, 1989). Oral-based feedback also increases opportunities for practicing social interaction skills like turn taking, collaboration, and taking and relinquishing authority (Villamil & DeGuerrero, 1996).

Although oral feedback provides a number of advantages, some studies have suggested that there are disadvantages associated with adopting oral feedback in L2 writing courses. One caveat centers on the changes made as a result of oral feedback. For example, L2 writers tend to focus too heavily on surface errors in oral response (Berger, 1990; Hall, 1990; Hedgecock & Lefkowitz, 1992; Paulus, 1999). It is also apparent that English-as-a-Second-Language (ESL) students overwhelmingly value teacher feedback (Curtis, 1997; Nelson & Carson, 1998; Paulus, 1999). Shuqiang Zhang (1995), therefore, encouraged L2 writing teachers to be wary of applying first-language (L1) research results to L2 writing classes, where students often bring different ideas about intervention and revision to the writing process. Zhang (1995) maintained that L2 writers do not appreciate "the affective advantage of peer feedback" as claimed in L1 writing studies (p. 218; see also Zhang, 1999). L1 writers have prior cooperative educational
experience at their disposal while L2 writers have limited experience with student-centered education. Hence, they rely on the expertise of the teacher (Tsui & Ng, 2000). A final factor that influences the effectiveness of feedback is training. Composition researchers focus on a variety of response characteristics, but they have one thing in common: the belief that it is important to prepare students to participate in peer response activities (Berg, 1999; Connor & Asenavage, 1994; MacLeod, 1999; Mittan, 1989; Stanley, 1992; Tannacito, 1999). Jane Stanley (1992) spent much time training students to be effective peer evaluators, and the result of her study was confident students who demonstrated greater ability to offer specific, meaning-level feedback. Thus, response training attempts to establish competent responders.

All of the aforementioned studies focused on oral rather than electronic response. However, electronic response differs from traditional response in a number of areas. Table 1 summarizes the basic differences between oral, written, and electronic response.

Table 1
General differences between oral, written, and e-feedback

Oral feedback                | Written feedback                   | E-feedback
Face-to-face                 | Face-to-face/distant               | More distant
Oral                         | Written                            | Written
Time dependent               | Depends                            | Time independent
Pressure to quickly respond  | Pressure to respond by next class  | No pressure to quickly respond
Place dependent              | Depends                            | Place independent
Nonverbal components         | No nonverbal components            | No nonverbal components
More personally intrusive    | Depends                            | More personally distant
Oral/cultural barriers       | Written/cultural barriers          | Written/cultural barriers
Greater sense of involvement | Greater sense of involvement       | Greater sense of anonymity
Negotiation of meaning       | Negotiation of meaning             | Less negotiation of meaning
Less delivery effort         | Greater delivery effort            | Less delivery effort
N/A                          | No cut & paste                     | Cut & paste

In typical oral response, writers and responders communicate and negotiate verbally and nonverbally in real time as well as employ the printed text, which they can view, refer to, and mark up. In written response, responders read and then write a response on paper. Students may be required to write a response in class or by the next class. After the response is written, it is often given to the writer during a peer group session in which negotiation and interaction often take place. In the electronic environment, however, L2 writers using e-feedback may not be able to participate in the myriad communication activities used in traditional oral response because the nonverbal elements are missing, there is a time delay involved in the dialog, or the added writing filter in e-feedback makes encoding and deciphering messages more difficult. Additionally, the greater sense of anonymity may discourage a sense of community in some students, which can also inhibit scaffolding.

Mark Warschauer (1997) suggested that, "the special features of online communication—that it is text-based and computer-mediated, many-to-many, time- and place-independent, and distributed via hypermedia links—provide an impressive array of new ways to link learners" (p. 475). E-feedback is a potentially powerful tool for collaborative writing under certain
circumstances, but because e-feedback and oral feedback are generated, transmitted, received, and deciphered differently, the benefits evident in oral feedback may not exist in e-feedback. Further studies need to be conducted to investigate whether the benefits of oral-based feedback also exist in e-feedback. In other words, the environment that teachers and students typically perform in when responding is being replaced with a new forum, an electronic one.

A number of studies have investigated the impact of this new type of feedback on writing instruction and the writing process. For example, one benefit of an e-feedback system is a reduction in paperwork problems like lost or forgotten papers (Palmquist, 1993; Sullivan, Brown, & Nielson, 1998). Provided that the hardware and software are properly established, students can submit and retrieve their work online, and reviewers can respond online. Thus, students will no longer lose or forget their work, and teachers will not need to carry, or possibly lose, bundles of papers. Elaine DiGiovanni and Girija Nagaswami (2001) indicated that e-feedback provided a better means of monitoring conversations. Since online peer feedback was available to the instructors, they could monitor the conversations and provide guidance to writers who needed it. Knowing that the teachers could monitor their conversations also encouraged the students to stay on task. DiGiovanni and Nagaswami also identified that "computer conversations are a form of hybrid communication that allows students to respond spontaneously, yet offers them the opportunity to reflect on their ideas, rehearse their responses, and respond at their own pace" (p. 269). These options are not typically evident in traditional oral feedback or in traditional written feedback.

Laura MacLeod (1999) highlighted several important characteristics of responding and of e-feedback. First, e-feedback helped the students be more honest in responding. Because the reviewers could criticize peer writers without having to face them, the reviewers felt more comfortable stating their true thoughts. In connection with honesty, e-feedback can allow students to respond anonymously, which MacLeod considered "a plus" (p. 92).

Beth Hewett's (2000) research is the only related literature available that actually explored the impact of e-feedback on revision. She researched "whether and how students apply what they learned through peer talk to their writing-in-progress" (p. 266). She found that each type of interaction had a different impact on revision: whereas oral talk included abstract and global idea development, e-feedback focused on more concrete writing issues. It is important to mention that Hewett used technology specifically designed for responding to writing; most research incorporating e-feedback is conducted with technology not designed for writing and responding.

George Braine (1997) offered a more critical view of e-feedback based on his comparison of traditional writing versus local-area-network (LAN)-based writing. Braine concluded that LAN-based writing is no more advantageous than traditional writing. His reason for this conclusion is based in part on the technology used in the LAN-based writing system. The project used a bulletin board system on which students posted comments, and it was cumbersome for students to search through the list of comments for those directed to them. Thus, the response system was seen more as an obstacle to writing than as a benefit.
To summarize these studies, e-feedback increases the amount of student participation, reduces the role of the teacher, increases the ability to monitor conversations, increases the amount of time students actually write, and provides multiple and redundant responses for students. These are interesting and praiseworthy findings, but they do not shed any light on the
impact e-feedback has on revision. Does e-feedback help L2 writers improve their writing abilities, and, if it does, how does it help? In an attempt to address this question, this study explored peer and teacher electronic feedback and their impact on L2 writers' revisions. In particular, this study focused on how peer and teacher e-feedback affected L2 writers' revision processes in a multi-draft, process-approach writing environment and what kinds of revisions L2 writers made as a result of peer and teacher e-feedback.
3. Method

The study was conducted in a freshman composition course at a four-year college in Pennsylvania. The writing activities were primarily completed in an Internet-accessible classroom where the participants studied and practiced academic writing. To make this research possible, I developed a database-driven web site that housed all of the essays and responses. The participants had the technological abilities required for this study, which included using a standard Internet browser and email. They wrote their essays and responses directly into a standard browser; the information was secured using individual accounts and organized on a few easy-to-manage web pages. Figure 1 depicts the class page. All of the gray numbers represent links to drafts that students posted. From this page, the participants and visitors could read and respond to the posted essays.

When the initial deadline for each essay arrived, the L2 writers would post their essays onto the writing web site with their user accounts. The accounts allowed them to add new essays and drafts or edit existing drafts automatically.
Fig. 1. The class webpage. Note: The names of the participants have been altered.
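The article does not describe the web application's internal design. The following is a minimal, hypothetical sketch, in Python, of the kind of data model and notification flow a database-driven writing-and-response site like the one described above might use; the names (`Draft`, `Response`, `notify_by_email`) and the example addresses are my own illustration, not Tuzi's implementation.

```python
# A minimal, hypothetical sketch of the data model a database-driven
# writing/response site might use. Names and fields are assumptions,
# not the actual implementation used in the study.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List


@dataclass
class Draft:
    """One posted version of an essay, owned by a student account."""
    author: str          # student user account
    essay_number: int    # which of the course essays (1-6)
    version: int         # draft 1-5 for that essay
    text: str
    posted: datetime = field(default_factory=datetime.now)


@dataclass
class Response:
    """One piece of e-feedback attached to a specific draft."""
    responder: str       # peer, instructor, or web site visitor
    draft: Draft
    comment: str
    sent: datetime = field(default_factory=datetime.now)


def notify_by_email(response: Response, addresses: List[str]) -> None:
    """Stand-in for the site's behavior of sending each response to the
    email accounts of the author and the instructor (actual mechanism
    not described in the article)."""
    for address in addresses:
        print(f"To {address}: new feedback on essay {response.draft.essay_number}, "
              f"draft {response.draft.version}, from {response.responder}")


# Example (invented names and addresses): a peer responds to a posted first draft.
draft = Draft(author="aiko", essay_number=2, version=1, text="...")
feedback = Response(responder="birgid", draft=draft,
                    comment="You might want to include an example here.")
notify_by_email(feedback, ["instructor@example.edu", "aiko@example.edu"])
```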
After the writers posted their initial essays, visitors to the web site could read the essays and submit comments to the authors. The students had approximately ten days to read any e-feedback they received from their peers, teachers, and web site visitors and to revise their papers before the final drafts were due. E-feedback was submitted via the web site and sent to the email accounts of the instructor and of the author. The students could also obtain oral or written feedback from their peers and assistance from visiting the writing center.

The students wrote six papers and could revise each essay up to five times. They did so at their own discretion after receiving comments from their peers, the instructors, and visitors. Of these six papers, the first was used to practice posting, reading, and responding to essays on the writing web site. I collected data for this study from four of the remaining five essays. The sixth essay was a magazine spread that had to be placed on poster board and, therefore, could not be posted to the web site.

The research was conducted in a natural setting that incorporated an emergent design and subjective data collection from human subjects in the form of interviews and observations. Coupled with this qualitative data was the statistical analysis and coding of the written drafts and responses.

3.1. The participants

The participants in this study included 20 L2 writers, the instructor, and the researcher as a participant observer. I selected the L2 participants of this classroom-based study using purposive sampling. These ESL students were taking a freshman writing course at a four-year state university. The writers came from several different regions of the globe, including Africa (4), the Americas (1), Asia (9), Europe (3), and the Middle East (3). The first languages of these L2 writers were as follows: Arabic (2), Dutch (2), French (1), German (1), Indonesian (1), Japanese (6), Korean (1), Mandarin (1), Portuguese (1), Spanish (2), Turkish (1), and Urdu (1). The average age was 20, and the average duration in the US was 5.2 months.

The other participants were the instructor and the researcher. The instructor, a professor with special interest and experience in Teaching English as a Second Language (TESL) and composition, gladly participated in the study despite the fact that she had little knowledge of computer technology. Finally, I am an ESL instructor with extensive knowledge and experience with computers, ESL, and composition. I took responsibility for all technological aspects of the research and assumed responsibility for training the L2 writers to create quality e-feedback.

3.2. Preparation for online writing and responding

I trained the participants to use the technology and to produce quality e-feedback. Research by Stanley (1992) suggested that students who receive training and coaching look at their peers' writings more closely and offer more specific guidelines for revision than untrained students. Students who received training developed better quality responses, which contained more specific suggestions for improving an essay. Her research was later confirmed by Catherine Berg (1999) and Trena Paulus (1999). Training students to create quality feedback resulted in a baseline or template from which all students could expect to receive specific suggestions for revising papers instead of simply receiving vague comments and empty praise.
Thus, the L2 writers in this study received approximately two hours of training on how to create effective responses and one hour of hands-on practice using the technology. The participants received hands-on practice using an Internet browser and received instructions for logging onto and using the web-writing application during the first week. After I modeled the software, the L2 writers practiced using the site, accessing their accounts, posting drafts, and writing responses.

The e-feedback training process began with introducing and familiarizing the L2 writers with the process approach to essay writing and with effective responding concepts. I explained the process approach to writing as a method that incorporates multiple revisions of the same essay. The focus of the process approach was a four-step process including brainstorming, organizing, drafting, and revising. The L2 writers also received instructions about generating quality e-feedback responses, as previous research suggested. Comments were to include critiques, suggestions, questions, statements, advice, and alternatives as well as the typical praise comments students include. To enhance the e-feedback instruction, the L2 writers practiced responding by reading previously written multiple-revision essays in small groups and then writing responses to these drafts. The class discussed the appropriateness and value of the responses. The training focused on which elements of an essay to respond to and when the response should be given. For example, the participants were instructed to respond to meaning first and to form in later drafts. The training also focused on the proper tact that should accompany a response.

Following the training period, the L2 writers and the instructor practiced creating e-feedback during the development of their first writing task. This task was used to practice with the web application and to practice creating e-feedback. After the students posted their first essay drafts, I collected samples of the e-feedback that the participants wrote from the web site database. Some of the responses were well crafted and contained praise, criticism, and alternatives. I placed the e-feedback on transparencies and displayed them in class for analysis. The writers viewed the e-feedback and commented on the various components they contained. It took little coaxing for the students to express their feelings and ideas about the e-feedback being displayed, primarily because the e-feedback authors were viewing their own responses. One or two of the writers admitted to being the author of a displayed response and tried to explain why they wrote what they wrote. I also asked them to describe how they would react to receiving the e-feedback that I placed on the overhead. One of the writers commented on a poorly constructed response, saying, "I do not want to get this response because it has no information to help me with my paper." I agreed and encouraged the responders to create quality comments according to the format I introduced. I also suggested that e-feedback recipients write to their responders and request more information if they felt the responder did not send a quality response. Another student commented, "It was good to see the responses on the screen. I could see the good responses and the bad ones."
4. Data collection

Following the initial training and practice, the L2 writers completed four more tasks that were used in the analysis in this study. Each task consisted of producing an essay and up to
Table 2
The essay analysis rubric for evaluating the revisions

Level       | Type      | Purpose                                         | Stimulus
Clause      | Add       | Clarify intended meaning                        | E-feedback
Essay       | Combine   | Grammar                                         | No changes
No change   | Delete    | Impact                                          | No recall
Paragraph   | Move      | New information                                 | Notes
Phrase      | No change | Structure                                       | Oral
Punctuation | Replace   | Surface (spelling, capitalization, punctuation) | Self
Sentence    | Rewrite   | Unnecessary                                     | Unknown
Word        | Split     |                                                 | Writing center

Note. Appendix A contains explanations and illustrations of these components.
five revisions that they posted on the web site. To assist the L2 writers in their revisions, the other participants read the drafts and responded to the writers via e-feedback. The instructor, the L2 writers, and I read and responded to at least two or three papers for each draft. As each task's deadline passed, I analyzed the drafts and responses and then interviewed a number of writers regarding their revisions and the changes they made.

I chose to develop a taxonomy based on Chris Hall's (1990) revision analysis rubric. Hall attempted to develop a multi-layered approach to revision analysis that included the time, level, type, and purpose of revision. I modified Hall's rubric to meet the needs of this study. The essay analysis was completed by identifying specific characteristics of the essays, including the level and type of revisions made and the purpose and stimulus for those revisions. The essay analysis rubric is summarized in Table 2 and detailed in Appendix A. To establish the reliability of this analysis, I enlisted two raters to review 20% of the essays. The inter-rater reliability rates for the two raters were 93% and 98%.

The analysis process began by collecting all of the revisions of a particular essay and comparing each draft with the subsequent revision to determine the differences. All of the changes were logged in a notebook, and I also noted an initial rationale for each change. After the changes were identified and logged, I interviewed the authors to discuss each of the changes they made. I asked the authors why they made the changes and what the stimulus was for each change, whether it was their own idea or some other stimulus like a conversation (oral) or e-feedback. Finally, all of the data was entered into a database.

In addition to the analysis of the essays, I analyzed the e-feedback. The e-feedback analysis rubric was based on Stanley's (1992) response analysis. Table 3 summarizes the e-feedback analysis rubric for the responses and provides an example of each component. I evaluated each response the participants sent and identified the components each contained. A typical response contained 90 words and 50 components. These documents were the focal point of my research as they contained the actual feedback and changes. Not only did I use these documents as the basis of many of my questions during the interviews, I also analyzed them to determine the changes that the students made, and I compared the revisions and the e-feedback to determine if any of the e-feedback suggestions influenced the changes in subsequent revisions.
Table 3
The response analysis rubric for evaluating the responses

Components   | Example
Advises      | "You might want to include an example here."
Alternatives | "You need a more specific claim. For example, XXXX."
Questions    | "What's the topic of your paper?" "Is this a logical response?"
Quick fixes  | "I try to break the door down → tried to break."
Statements   | These components did not fit any of the other categories. "I can relate to your experience. When I was young I did the same thing."
Praise       | "This is an excellent beginning to your essay!"
Criticism    | "Your first two sentences don't fit the rest of the paragraph at all."
Requests     | "Can you give me a reason for believing this?" "Can you explain this for me?"
Throughout the semester, I conducted a total of 61 interviews with the L2 writers and the instructor using Benjamin Bloom's (1950) stimulated recall techniques. The interviews focused on getting the L2 writers' perspectives on the writing response process and on questioning them about the actual changes they made in their revisions as well as the stimuli for those changes. The interviews helped me to identify how the L2 writers took e-feedback and incorporated it into their revisions.
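To make the coding scheme concrete, here is a small, hypothetical sketch in Python of how a change identified between two drafts could be logged with the level/type/purpose/stimulus rubric of Table 2 and then tallied by stimulus, as in the Results tables. The record structure, the example entries, and the tallying function are my own illustration under those assumptions, not the database or analysis code actually used in the study.

```python
# A hypothetical illustration of logging revision changes with the
# level/type/purpose/stimulus rubric (Table 2) and tallying them by
# stimulus; not the actual database or analysis code used in the study.
from collections import Counter
from dataclasses import dataclass


@dataclass
class RevisionChange:
    writer: str
    essay: int
    draft: int          # the revision in which the change appeared
    level: str          # e.g., "word", "sentence", "paragraph"
    change_type: str    # e.g., "add", "delete", "replace"
    purpose: str        # e.g., "new info", "meaning", "grammar"
    stimulus: str       # e.g., "self", "e-feedback", "oral", "writing center"


# Example coded changes (invented for illustration).
changes = [
    RevisionChange("aiko", 2, 2, "sentence", "add", "new info", "e-feedback"),
    RevisionChange("aiko", 2, 2, "word", "replace", "grammar", "writing center"),
    RevisionChange("thomas", 2, 3, "paragraph", "add", "impact", "self"),
]

# Tally changes by stimulus, mirroring the layout of Table 5.
by_stimulus = Counter(change.stimulus for change in changes)
total = sum(by_stimulus.values())
for stimulus, count in by_stimulus.most_common():
    print(f"{stimulus:15s} {count:4d}  {100 * count / total:5.1f}%")
```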
5. Results

5.1. The essays

The L2 writers posted 274 drafts; 97 were first drafts and 177 were revisions. A revision could be as small as adding or moving a comma or as large as completely rewriting the essay. Table 4 summarizes the number of essay tasks and drafts that the L2 writers created. On average, the L2 writers produced about 14 drafts each throughout the semester, or approximately three drafts for each of the five essays posted online. Students were allowed to post up to five drafts of an essay but were only required to hand in the final revision for a grade.

Table 4
A summary of the online L2 writing drafts and revisions (n = 20)

Essays       | Practice | #1 | #2 | #3 | #4 | Total
First drafts | 19       | 20 | 20 | 19 | 19 | 97
Revisions    | 32       | 43 | 34 | 34 | 34 | 177
Total drafts | 51       | 63 | 54 | 53 | 53 | 274
Table 5
The stimuli for the changes made in the drafts (n = 20)

Stimulus           | Self | Unknown | E-feedback | Writing center | Oral | Total
Revisions          | 791  | 352     | 296        | 280            | 178  | 1,897
Change portion (%) | 42.1 | 17.9    | 15.6       | 14.8           | 9.6  | 100

Note. Appendix A contains explanations and illustrations of these components.
For example, one student wrote and posted an initial draft of her second essay and then posted two more revisions to the web site after receiving feedback on each posted draft.

The interviews and essay analysis allowed me to identify the changes made in the revisions. Using the rubric detailed in Table 2, I compared the drafts and subsequent revisions to identify the changes between each set of essays. Table 5 summarizes the stimuli for the changes made. There were a total of 1,897 recorded revision changes. These data indicate that most changes were introduced by the individual writers themselves, with 42.1% of the changes resulting from the students' own decisions. The large percentage of changes assigned to self is consistent with an earlier study by Paulus (1999), in which 52% of the changes were stimulated not by peers or the instructor but by some other source, including the individual writers themselves. E-feedback (15.6%) compared favorably with the writing center (14.8%) as a stimulus for changes.

In addition to analyzing the individual elements of this rubric, I also conducted cross-element analyses. One cross-element analysis, shown in Table 6, compared the levels of the changes with their stimuli. The purpose of this analysis was to illuminate the stimulus in reference to the level of the changes made. At every level of change except punctuation, the primary stimulus was self. The main stimulus for punctuation was the writing center. The writing center was also the third most often used stimulus at the word and phrase levels. Although never the primary stimulus, e-feedback ranked second at the sentence, clause, and paragraph levels. This suggests that e-feedback has a greater effect on larger units of writing.
Table 6
The stimuli for each level of revision (n = 20)

Stimulus       | Word | Sentence | Phrase | Paragraph | Punctuation | Clause | Total
Self           | 356  | 235      | 84     | 77        | 23          | 16     | 791
Unknown        | 210  | 71       | 42     | 15        | 14          | 0      | 352
E-feedback     | 98   | 107      | 26     | 44        | 12          | 9      | 296
Writing center | 171  | 39       | 31     | 10        | 28          | 1      | 280
Oral           | 115  | 29       | 12     | 12        | 5           | 5      | 178
Total          | 950  | 481      | 195    | 158       | 82          | 31     | 1,897

Note. Appendix A contains explanations and illustrations of these components.
Table 7
The stimuli for each type of revision (n = 20)

Stimulus       | Replace | Add | Delete | Move | Split | Combine | Rewrite | Rephrase | Total
Self           | 368     | 359 | 28     | 12   | 8     | 6       | 5       | 5        | 791
Unknown        | 211     | 117 | 11     | 5    | 3     | 5       | 1       | 0        | 352
E-feedback     | 117     | 152 | 12     | 2    | 7     | 1       | 2       | 1        | 296
Writing center | 187     | 79  | 8      | 1    | 2     | 2       | 0       | 1        | 280
Oral           | 108     | 52  | 12     | 2    | 0     | 2       | 1       | 1        | 178
Total          | 991     | 759 | 71     | 22   | 20    | 16      | 9       | 8        | 1,897

Note. Appendix A contains explanations and illustrations of these components.
Another cross-element analysis compared the stimulus for the revisions with the type of revisions that the L2 writers made. Table 7 summarizes the types of revisions resulting from the different stimuli. All of the types of revisions listed in Table 7 share the same primary stimulus: the L2 writer himself or herself. E-feedback ranked second in the add, delete, and split types and ranked third in the move and rephrase types. The writing center ranked third in the replace and delete types and fourth in the add, move, and split types. There seems to be a pattern here. The oral stimulus consistently ranked low for each type except for the delete type, where it ranked second along with the e-feedback and unknown stimulus types.

The third cross-element analysis compared the stimuli with the purposes of the revisions. In other words, this comparison identifies the relationship between the stimuli and the purpose for the revisions made. Table 8 summarizes the data from this comparison. The primary stimulus for all but one of the revision purposes was the individual writers themselves. The writing center ranked first in providing a stimulus for grammar changes and ranked second in providing a stimulus for clarifying the intended meaning. E-feedback was never a primary stimulus for revision purposes. However, e-feedback ranked second in providing a stimulus for the purpose of adding new information and ranked third in providing a stimulus for the purpose of enhancing the impact of a revision. Hence, we could say e-feedback was an important stimulus for giving and receiving new ideas that can be incorporated into subsequent drafts.

Table 8
The stimuli for each purpose of revision (n = 20)

Stimulus       | Meaning | Impact | New info | Surface | Grammar | Structure | Not needed
Self           | 185     | 216    | 216      | 80      | 30      | 41        | 19
Unknown        | 98      | 85     | 56       | 55      | 36      | 14        | 8
Writing center | 105     | 39     | 22       | 48      | 52      | 10        | 4
E-feedback     | 71      | 69     | 97       | 19      | 18      | 23        | 7
Oral           | 87      | 33     | 20       | 10      | 23      | 3         | 1
Total          | 546     | 442    | 411      | 212     | 159     | 91        | 39

Note. Appendix A contains explanations and illustrations of these components.
Table 9
The components of the responses sent by the students, instructor, and researcher (n = 22)

Responders      | Statements | Advice | Question | Praise | Quick fix | Criticism | Alternatives | Request | Useless advice | Total
Instructors (2) | 404        | 283    | 245      | 138    | 108       | 89        | 31           | 8       | 4              | 1,310
Students (20)   | 467        | 269    | 81       | 315    | 41        | 89        | 24           | 4       | 18             | 1,308
Total           | 871        | 552    | 326      | 453    | 149       | 178       | 55           | 12      | 22             | 2,618

Note. Appendix A contains explanations and illustrations of these components.
Of the 97 new-information additions stimulated by e-feedback, 26 were paragraph additions and 63 were sentence additions. One L2 writer expressed this idea by saying that e-feedback provided "good ideas for adding" to the paper. At least six of the L2 writers responded similarly during interviews. They suggested that e-feedback was beneficial in providing larger-level additions to their documents.

In summary, the analysis of the revisions included investigating the level, type, purpose, and stimulus of the revisions as well as comparing the stimulus for the revisions with each level and purpose. E-feedback was never a primary stimulus for the revisions that the L2 writers made, but it did play an important part in the revisions, especially those that focused on adding new information or on increasing the impact of a section of a paper.

5.2. The responses

I analyzed the nearly 300 e-responses sent by the instructors (instructor and researcher) and the students and identified more than 2,600 components in those responses. Table 9 summarizes the breakdown of the components identified in the e-feedback the participants sent. The total amounts of advice, alternatives, and criticism are similar between the instructors and the L2 writers; given the group sizes, this represents a per-person ratio of nearly 10:1, roughly 10 advice components per instructor for every 1 advice component per student. The amount of praise, however, was different, with the L2 writers offering 315 praise comments to 138 for the instructors. That is a per-person ratio of only about 4.5:1, suggesting that the L2 writers are more comfortable writing praise comments. Praise was the second most common message component for the L2 writers, but it was the fourth most common component for the instructors.
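As a rough check on the ratios quoted above, the per-responder rates can be derived from the Table 9 counts. The short Python sketch below simply divides each group's advice and praise counts by the number of responders in that group; it is an illustration of the arithmetic, not analysis code from the study.

```python
# Per-responder rates behind the ratios discussed above, computed from
# the Table 9 totals (2 instructors, 20 students).
instructors, students = 2, 20

advice = {"instructors": 283, "students": 269}
praise = {"instructors": 138, "students": 315}

advice_ratio = (advice["instructors"] / instructors) / (advice["students"] / students)
praise_ratio = (praise["instructors"] / instructors) / (praise["students"] / students)

print(f"Advice per instructor vs. per student: {advice_ratio:.1f}:1")  # ~10.5:1
print(f"Praise per instructor vs. per student: {praise_ratio:.1f}:1")  # ~4.4:1
```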
6. Discussion

6.1. The types of changes resulting from e-feedback

The largest share of the draft changes that resulted from e-feedback (35%) focused on adding new information to the original text. Additionally, e-feedback affected L2 writers' revisions at a higher structural level: it had a greater impact on revisions at the clause, sentence, and paragraph levels. In contrast, out of a possible seven stimuli, e-feedback was fifth for stimulating essay changes at the word level, suggesting that e-feedback is more effective at encouraging changes at the sentence and paragraph levels. I am unsure why e-feedback had a greater impact on macro-level changes than on micro-level changes. Further research on this finding should be conducted.

Although e-feedback is a relatively new form of feedback, it was the cause of a large number of essay changes. In fact, e-feedback resulted in more revisions than feedback from the writing center or oral feedback. E-feedback may be a viable avenue for L2 writers to receive comments.

Another interesting observation is that although the L2 writers stated that they preferred oral feedback, they made more e-feedback-based changes than oral-based changes. These L2 writers believed that there was a substantial difference between oral and e-feedback. Their preference for oral feedback may lie in the fact that they are more accustomed to it. But the
benefit of having responses written down in electronic format may translate into more changes in subsequent revisions.

Finally, the influence of e-feedback was concentrated primarily on the initial drafts. Later drafts received fewer comments for two probable reasons: (a) more students were willing to help initially so they would receive e-feedback in exchange, and (b) they perceived that earlier drafts required more work than later drafts that had already received assistance.

The interviews and comments from the L2 writers in this study suggest e-feedback was a tool for getting and creating ideas for inclusion in their papers. One L2 writer, Aiko, said, "e-feedback helps me get good [sic] idea for my paper." She used e-feedback for generating ideas and used other stimuli, like writing center tutors or oral feedback, for smaller surface types of problems. Birgid made a similar comment when she said, "I took their ideas and included them in my text." Weiko echoed the same idea during an interview when she said, "e-feedback was good for idea building." This pattern seems to suggest that the L2 writers used e-feedback as a tool for larger blocks of text like ideas, examples, introductions, and conclusions rather than smaller elements like grammar, punctuation, or single-word changes. The analysis of the drafts seems to corroborate their perceptions.

The changes the L2 writers made as a result of the e-feedback in this study centered on two areas: changing existing text to clarify meaning and adding new information. The majority of these e-feedback-based additions were enacted at the sentence and paragraph levels. Many studies conclude that oral response affects meaning-preserving changes at the word level (Berger, 1990; Connor & Asenavage, 1994; Leki, 1990; Mangelsdorf & Schlumberger, 1992). However, e-feedback influenced changes at the sentence and paragraph levels. Thus, L2 writing instructors may wish to include e-feedback in their response reservoir to encourage a more balanced response system.

6.2. How e-feedback affected the L2 writing process

Many L2 writers mentioned that e-feedback influenced their writing process. The L2 writers indicated that receiving e-feedback from many people helped them focus on the strengths and weaknesses of their writings. Receiving multiple e-feedback responses encouraged students to re-think their papers and revise more. One student, Elizabeth, stated the situation clearly in one of her postings:

E-mail feedback was really helpful. First of all I ignored the responses and said to myself that this person doesn't have a clue. But after I got several more feedbacks which sounded all similar, I started to rethink my paper. Maybe they are right?. . . I thought of the fact that I might use my critics, like try to use more examples, to improve my own paper.
It appears that Elizabeth would not have revised her paper had she not received the e-feedback. The e-feedback she received altered the way she wrote to include a greater awareness of the audience and a willingness to revise her paper when she determined that her intended message was not being received. The L2 writers also indicated that receiving detailed comments that they could review at a later time encouraged them to revise more. For example, Birgid, one of the L2 writers, said, “If you watch the responds [sic] you got, you also know what you are doing wrong and you
will review you essays better." Another student mentioned that the e-feedback encouraged him to write more revisions: "They gave me specific ideas that my essay did not have. I read all of them and felt they would help me. If I did not join [this web site], I would not edit or revise my essay at all. So it gave me a chance to revise my essay and to make it better." The process by which this student wrote had changed: by writing and responding online, the student expanded the writing process to include more revision. Another student, Thomas, also commented that "as I studied English specially writing in English, I found many mistakes in my writing, either grammars or organizing the idea. But with feedback it helps me a lot because I know where is [sic] my weakness in my writing and I try to work on it."

6.3. Caveats

It could be argued that the reason for the success of e-feedback was that the students were trained to provide quality feedback, and they did just that. This training could have influenced the results e-feedback had on student revisions. Although more than 60% of the response components written by the students contained no guidelines or recommendations to improve a paper, these L2 writers did learn to create specific feedback to assist their peers in improving their papers. The L2 students were encouraged to suggest macro-level changes on initial drafts and micro-level changes on subsequent drafts. The training may have influenced the results.

Another issue I did not investigate was the impact that learning styles had on the revision of papers. All writers employ a learning style when writing and revising a paper. For example, some students prefer oral feedback to e-feedback. Other writers perform revisions in their mind before making changes on paper. Moreover, cultural influences may also affect the type and amount of feedback students give and receive. Learning styles and cultural influences may have influenced the results of this study.

A final concern was the impact the interface had on the e-feedback given. It is possible to theorize that the interface—writing and responding in a web browser—influenced the types of feedback students offered. For example, it may be easier to describe a macro-level problem in a paper when revising online than it is to describe a micro-level problem like a grammar or spelling error. More research should be conducted to determine the influence that the interface has on creating responses.

6.4. Implications for L2 writing instruction

This study has a number of implications for L2 writing instruction and L2 writing. First, online writing and e-feedback are wonderful tools for writing and receiving feedback as well as effective tools for expanding the audience and allowing L2 writers to feel that they are writing to more than just their classmates or instructor. Although it is a useful tool, I do not believe it is a replacement for oral feedback or classroom interaction even as Terry Tannacito (1999) suggested. Different writers have different means by which they get help and judge others. Thus, a variety of responding avenues—including e-feedback—should be available to the writers.
Second, this study corroborates Berg's (1999) and Stanley's (1992) recommendations that training be given to all L2 writers. The L2 writers received training and learned response rhetoric, and the effects of that training are evident in their responses. Most of the L2 writers used their feedback training to respond to each writing assignment. Training students helped them become effective responders and highlighted areas that they needed to attend to when writing and responding.

Third, a web-based writing environment expands the audience for L2 writers and provides a new avenue for receiving feedback. The expanded audience offers many advantages for both instructor and writer. For example, the instructor can read and send comments from any Internet location and provide specific written comments to each student without consuming class time. The expanded audience also allows L2 writers to receive input from many other people and gain a clearer picture of their audience and their own writing weaknesses. An added benefit of the expanded audience is the ability to read other writers' drafts, thereby providing opportunities for L2 writers to learn from the writing styles of others and incorporate them into their own writing.

Finally, this study suggests that a web-based writing environment enables writers to submit drafts and e-feedback from any Internet-accessible computer regardless of the word-processing applications available. In other words, instructors can teach writing with any word processor or even without a word processor. It is even possible to theorize that Internet-based writing systems as robust as current word processors will flourish and allow greater interaction between writers and their audience.
Acknowledgments

I would not have been able to complete this article without the vital assistance of Dan Tannacito. Thank you, Dan, for your insight and assistance.
Appendix A. Data coding explanations and examples

Level       | Example
Clause      | We sat down and took a long rest. → After reaching the top of the mountain, we sat down and took a long rest.
Paragraph   | I find that this movie is not making light of one of the most horrific events in human history at all. It shows us simply how horrible this was and how it tore families apart, and shows the determined strength and great love of a father in a dire situation. I don't know whether it is right or wrong to depict war comically. However I appreciate this movie making us rouse such opinions.
Phrase      | Jesse waited for the call. → Jesse waited in her office for the call.
Punctuation | Then he admitted what he did; he killed his horse. → Then he admitted what he did. He killed his horse.
Sentence    | He got off the bus slowly.
Word        | Look at the ball. → Look on the ball. → Look on the mirror.

Type    | Example
Add     | Salome went out the door. → When she heard the gun shots, Salome went out the door.
Combine | Debra was looking for the keys to the house. She found them under the plant. → Debra was looking for the keys to the house and found them under the plant.
Delete  | Jeff took his shoes off all by himself. → Jeff took his shoes off by himself.
Move    | The old man walked slowly off the ramp. → The old man slowly walked off the ramp.
Replace | Thomas runs to the phone. → Thomas ran to the phone.
Rewrite | I got angry at my father when I learned what he had done. → Having discovered the terrible crimes he committed, Alice became enraged at her own father.
Split   | Jill liked the new home well enough, but she had a strange feeling because it was in the middle of the woods. → Jill liked the new home well enough. But, she had a strange feeling because it was in the middle of the woods.

Purpose    | Example/explanation
Grammar    | We try to hit the ball over the fence. → We tried to hit the ball over the fence.
Impact     | My father read quickly through the manual to find out what to do next. → My father raced through the manual to find out what to do next.
Meaning    | Why do athletes do doping anyway? → Why do athletes take drugs and steroids anyway?
New info   | Swimming was lots of fun for the kids. → Swimming was lots of fun for the kids. But they did not know an enemy was lurking under the surface—a snake.
Not needed | A particular section is deleted because its contents change or are redundant, making the section unnecessary.
Structure | My father had many reasons for not trusting us. First of all, we all had a criminal record.
Surface   | We tried to hit the ball ove the fence. → We tried to hit the ball over the fence.

Stimulus       | Example
E-feedback     | The author claimed in the interview that e-feedback was the stimulus for the change.
No changes     | No changes were made in the paper between the revisions.
No recall      | The author did not remember what the stimulus for the change was.
Notes          | The instructor wrote notes on the student's paper.
Oral           | The author claimed in the interview that an oral conversation was the stimulus for the change.
Self           | The author claimed in the interview that they themselves were the stimulus for the change.
Unknown        | The writer was not interviewed.
Writing center | The author claimed in the interview that the writing center was the stimulus for the change.
References

Berg, E. Catherine. (1999). The effects of trained peer response on ESL students' revision types and writing quality. Journal of Second Language Writing, 8(3), 215–241.
Berger, Virginia. (1990). The effects of peer and self-feedback. CATESOL Journal, 3, 21–35.
Bloom, Benjamin S. (1950). Problem solving processes of college students: An exploratory investigation [Supplemental educational monographs]. The Elementary School Journal, 73.
Braine, George. (1997). Beyond word processing: Networked computers in ESL writing classes. Computers and Composition, 14, 45–58.
Carson, Joan G., & Nelson, Gayle L. (1996). Chinese students' perceptions of ESL peer response group interaction. Journal of Second Language Writing, 5(1), 1–19.
Connor, Ulla, & Asenavage, Karen. (1994). Peer response groups in ESL writing classes: How much impact on revision? Journal of Second Language Writing, 3(3), 257–276.
Curtis, Anne. (1997, November). Is it better to give than receive: Feedback in student writing. Paper presented at the World Skills: Language and Living Conference, Victoria, British Columbia.
DiGiovanni, Elaine, & Nagaswami, Girija. (2001). Online peer review: An alternative to face-to-face? ELT Journal, 55(3), 263–272.
Ferris, Dana, & Hedgcock, J. S. (1998). Teaching ESL composition: Purpose, process and practice. Mahwah, NJ: Lawrence Erlbaum.
Hall, Chris. (1990). Managing the complexity of revising across languages. TESOL Quarterly, 24(1), 43–60.
Hedgecock, John, & Lefkowitz, Natalie. (1992). Collaborative oral/aural revision in foreign language writing instruction. Journal of Second Language Writing, 1(2), 255–276.
Hewett, Beth. (2000). Characteristics of interactive oral and computer-mediated peer group talk and its influence on revision. Computers and Composition, 17, 265–288.
Leki, Ilona. (1990). Potential problems with peer responding in ESL writing classes. CATESOL Journal, 3, 5–19.
MacLeod, Laura. (1999). Computer-aided peer review of writing. Business Communications Quarterly, 62(3), 87–94.
Mangelsdorf, Kate, & Schlumberger, Ann. (1992). ESL student response stances in a peer-review task. Journal of Second Language Writing, 1(3), 235–254.
Mendonça, Cássia O., & Johnson, Karen E. (1994). Peer review negotiations: Revision activities in ESL writing instruction. TESOL Quarterly, 28(4), 745–769.
Mittan, Robert. (1989). The peer review process: Harnessing students' communicative power. In Donna M. Johnson & Duane H. Roen (Eds.), Richness in writing: Empowering ESL students (pp. 207–219). New York: Longman.
Nelson, Gayle L., & Carson, Joan G. (1998). ESL students' perceptions of effectiveness in peer response groups. Journal of Second Language Writing, 7(2), 113–131.
Palmquist, Michael. (1993). Network-supported interaction in two writing classrooms. Computers and Composition, 10(4), 25–57.
Paulus, Trena M. (1999). The effect of peer and teacher feedback on student writing. Journal of Second Language Writing, 8(3), 265–289.
Stanley, Jane. (1992). Coaching student writers to be effective peer evaluators. Journal of Second Language Writing, 1(3), 217–233.
Sullivan, Dave, Brown, Carol E., & Nielson, Norma L. (1998). Computer-mediated peer review of student papers. Journal of Education for Business, 74(2), 117–121.
Tannacito, Terry. (1999). Electronic peer response groups: Case studies of computer mediated communication in a composition class. Unpublished doctoral dissertation, Indiana University of Pennsylvania.
Tsui, Amy B., & Ng, Maria. (2000). Do secondary L2 writers benefit from peer comments? Journal of Second Language Writing, 9(2), 141–170.
Van der Geest, Thea, & Remmers, Tim. (1994). The computer as a means of communication for peer review groups. Computers and Composition, 11, 237–250.
Villamil, Olga S., & DeGuerrero, Maria C. (1996). Peer revision in the L2 classroom: Social-cognitive activities, mediating strategies, and aspects of social behavior. Journal of Second Language Writing, 5(1), 51–75.
Warschauer, Mark. (1997). Computer-mediated collaborative learning: Theory and practice. Modern Language Journal, 81(3), 470–481.
Zhang, Shuqiang. (1995). Reexamining the affective advantage of peer feedback in the ESL writing class. Journal of Second Language Writing, 4(3), 209–222.
Zhang, Shuqiang. (1999). Thoughts on some recent evidence concerning the affective advantage of peer feedback. Journal of Second Language Writing, 8(3), 321–326.
Dr. Frank Tuzi, of MTS Technologies, Inc., has published several research articles on computer-supported writing processes. He can be reached at [email protected].