
JOURNAL OF SECOND LANGUAGE WRITING, 6 (2), 155-182 (1997)

Teacher Commentary on Student Writing: Descriptions & Implications

DANA R. FERRIS, California State University, Sacramento
SUSAN PEZONE, American River College
CATHY R. TADE, Winters High School
SHAREE TINTI, Sacramento City College

Teacher response to student writing is a vital, though neglected, aspect of L2 composition research. The present study adds to the previous research through the development and implementation of an original analysis model, designed to examine both the pragmatic aims and the linguistic forms of teachers' written commentary. This model was used in the examination of over 1,500 teacher comments written on a sample of 111 essay first drafts by 47 advanced ESL university students. It was found that the teacher changed her responding strategies over the course of two semesters, that she provided different types of commentary on various genres of writing assignments, that the amount of her feedback decreased as the term progressed, and that she responded somewhat differently to students of varying ability levels. The study raises several implications for L2 writing instruction as well as for analyses of teacher commentary.

Most experienced writing instructors know that responding to student writing can be the most frustrating, difficult, and time-consuming part of the job. Providing written feedback on student papers is, however, arguably the teacher's most crucial task: It allows for a level of individualized attention and one-on-one communication that is rarely possible in the day-to-day operations of a class, and it plays an important role in motivating and encouraging students. Notwithstanding a number of discouraging L1 and L2 studies reporting the ineffectiveness of teacher response, Leki (1990) notes that "Writing teachers and students alike do

Direct all correspondence to: Dana R. Ferris, 115 Guaymas Place, Davis, CA 95616.


intuit that written responses can have a great effect on student writing and attitude toward writing" (p. 58). Given the fundamentally unchallenged importance of teacher response to student writing, it is surprising that there has been so little examination or systematic description of teacher commentary, particularly in L2 writing. Further, research on teacher response has often failed to take into account the overall pedagogical context of the written feedback: the writing classroom itself and the relationship between teachers and students. This has led to skewed results that fail to capture the dynamics of the composition class and the developing communication between instructor and students (Leki, 1990; Reid, 1994). Previous L1 and L2 research on teacher feedback has been limited in other important ways as well: it typically examines teacher response at one particular moment in time, rather than over the course of an entire writing class, and it is often vague and imprecise. In addition to these limitations, which characterize both L1 and L2 research on teacher commentary, L2 composition researchers and teachers have, perhaps, been premature in applying the suggestions and prescriptions of L1 writing theorists to the teaching of L2 students. Silva (1993), Goldstein and Conrad (1990), and Zhang (1995) have argued that ESL writers are significantly different from native speakers in their linguistic, rhetorical, and cultural knowledge, and that it should not be assumed uncritically that techniques and principles recommended for L1 students will be effective or optimal for L2 writers.

The present study was designed to address these issues. It is longitudinal, in that a teacher and her students were tracked over the course of a semester-long ESL college composition class by looking at multiple drafts of several different writing assignments, thus allowing the examination of variation across writing assignments, points in the semester, and student ability levels, as well as of the impact of the teacher commentary on student revision.1 To accomplish these goals, an original analytic model was designed to describe, in more precise terms than have been employed by previous researchers, how the teacher responded. Finally, the analysis model was designed, in part, to look carefully at responding behaviors (e.g., the use of questions, hedges, grammar terminology) which have been posited elsewhere (Ferris, 1995b; Goldstein & Conrad, 1990; Patthey-Chavez & Ferris, 1997) as being potentially problematic for L2 writers.

Besides these theoretical and methodological issues, this study was undertaken in response to the pedagogical concerns of the authors and their colleagues in a large MA TESOL program. Though we have successfully communicated to our graduate students the principles of teacher response in process-oriented writing classes (allow time for multiple drafts, give between-draft feedback, focus on ideas rather than grammar on early drafts, don't "appropriate" the students' ideas, etc.), these pieces of advice have not proven specific or focused enough to help our pre-service teacher trainees learn to identify the major strengths and weaknesses of a student paper, prioritize the type(s) of response they will provide,


and compose marginal and end comments which are clear and effective. Our students’ resulting weaknesses have been painfully evident in their performance in the practicum course, as teaching assistants, and on the comprehensive examination at the conclusion of the M.A. program. Thus, a second but equally important objective of this study was to develop tools and identify principles which could be used to prepare ESL writing teachers more effectively.

Background

The Scarcity of L2 Research on Teacher Response. Though studies of peer response abound in the L2 literature, systematic studies of teacher response have been surprisingly rare. A recently published comprehensive bibliography of work on second language writing (Tannacito, 1995) lists only 111 entries on the topic of "Feedback" or "Responding"; of those, 95 deal wholly or partially with teacher response. Of those 95 citations (many of which are opinion pieces or lists of teaching suggestions, rather than empirical research), only 35 are readily accessible (published in major journals or in books by major publishers), 20 are accessible only with some effort (doctoral dissertations, articles in smaller journals, or chapters in books published by obscure publishers), and the rest are inaccessible (conference papers, unpublished M.A. theses). Of the 55 accessible or somewhat accessible citations, only a handful deal primarily with the description of what teachers do when they respond to student writing.

Limitations of the Previous Research. Some of the most often-cited and reprinted L1 and L2 articles on the subject of teacher response to student writing have serious limitations and/or methodological flaws. For instance, Sommers (1982) criticizes the responding behaviors of 35 teachers as being "arbitrary and idiosyncratic" (p. 149). Ironically, her analysis appears to share the weaknesses she attributes to the teachers' commentary, as nowhere in her article does she explain how the teacher commentary was analyzed or on what basis she drew her conclusions. The best-known L2 study in this vein is that of Zamel (1985), who, following the approach taken by Sommers, analyzed 15 teachers' comments on 105 student texts, arriving at conclusions similar to those reported for L1 students by Sommers, by Brannon and Knoblauch (1982), and later by Connors and Lunsford (1993). However, similar to the L1 researchers she cites, Zamel does not describe her method of analysis, saying merely that she "examined the comments, reactions, and markings that appeared on compositions" (p. 85). Silva (1988) discusses the limitations of Zamel's 1985 study. Another frequently cited L1 study is that of Connors and Lunsford (1993), notable in that the authors examined the commentary written by 300 teachers on some 3,000 papers. Yet the authors themselves admit, if somewhat sarcastically, that "the Methodology Police would probably bust us for the way the sample was gathered" (p. 205): The


teachers, solicited by mail, simply sent their marked papers to the researchers, with no information provided as to which draft they sent, what the assignment was, at what point in the term the paper was written, or any other background about themselves or their students. Though Connors and Lunsford make interesting observations about their sample and offer compelling conclusions based on their findings, the design flaws certainly limit the study's applicability and generalizability to any specific context.

Other L1 studies on teacher feedback, for instance Sperling (1994) and Beason (1993), have been more systematic in their analyses (or at least more explicit in reporting them). Although Sperling's study on the "orientations" (emotional, pedagogical, etc.) of teacher commentary is carefully designed and longitudinal, the author herself notes that she "did not examine systematically why such comments were made" (1994, p. 200). In contrast, Beason's (1993) study examines the "aims of the comments" provided by peers and teachers, including categories such as "Detected Problem," "Advised," "Edited," "Praised," etc. (p. 405). Though he describes the "aims" of the feedback, he does not examine the specific characteristics of the feedback (e.g., the linguistic forms used, the relative length of the comments, etc.).

Other L2 Research on Response to Student Writing

Experimental/Comparison Studies. Besides Zamel's (1985) analysis of teacher commentary, there have been several experimental or comparison studies examining various issues surrounding L2 teacher feedback (including both commentary and error correction). These have included examinations of the effects of: praise versus criticism (Cardelle & Corno, 1981); feedback on content versus feedback on form (Fathman & Whalley, 1990; Robb, Ross, & Shortreed, 1986); and corrections/rule reminders versus meaningful commentary (Kepner, 1991). While all of these studies were systematically carried out and reported and have yielded valuable insights, they are also limited in scope by the constraints of the various experimental designs. For instance, Fathman and Whalley's subjects were given only 30 minutes in class to revise their papers after receiving one of four possible feedback treatments.2

Other Types of Response: Peer Feedback and Teacher-Student Conferences. Two forms of feedback recommended frequently by process advocates are peer response and face-to-face writing conferences between teacher and student. Studies of peer response and of one-to-one writing conferences (Carson & Nelson, 1994, 1996; Connor & Asenavage, 1994; Goldstein & Conrad, 1990; Lockhart & Ng, 1995; Mendonça & Johnson, 1994; Nelson & Murphy, 1992; Patthey-Chavez & Ferris, 1997; Villamil & de Guerrero, 1996) have utilized precise and systematic analytic methods and have contributed important insights


about these two important pedagogical techniques. However, feedback from peers has different purposes and effects than feedback from an expert or authority; teacher-student conferences, because they involve primarily spoken interaction, operate under different dynamics and constraints than does written teacher feedback. Though most L1 and L2 writing experts remain enthusiastic about peer feedback and one-to-one writing conferences as instructional options, they are not always more desirable than written teacher commentary, given individual student variation in L2 listening/speaking ability, in learning style preferences, and in cultural expectations of the teacher-student relationship. For these reasons and others (including constraints on teacher and class time), written teacher comments are likely to remain the most viable and common form of response to student writing and thus need equally careful analysis and evaluation.

Surveys of ESL Student Reactions to Teacher Feedback. A particularly relevant line of research on teacher response to L2 writing has examined ESL student preferences about, reactions to, and strategies for dealing with teacher feedback (Cohen, 1987, 1991; Cohen & Cavalcanti, 1990; Ferris, 1995b; Hedgcock & Lefkowitz, 1994, 1996; Leki, 1991; Radecki & Swales, 1988), but with the exception of two small studies by Cohen (1991) and Cohen and Cavalcanti (1990), none of this research attempts to link what students say about teacher feedback with actual commentary provided by teachers. Further, the survey instruments used tend to be very broad, discussing the relative weight given to such general categories as "grammar" or "content" in teacher response without looking more closely at what teachers say about grammar or content, and how they say it.

Summary and Research Questions

Thus, despite a great deal of prior L1 research on the topic of teacher response to student writing, little information was found that was either relevant to L2 writers or that provided analysis models specific enough to address the research questions in the present study. L2 research on this subject was surprisingly scarce and generally vague. To address these gaps, the present discourse analytic study was guided by the following research questions:

1. What is the nature, considering both pragmatic intent and linguistic form, of the teacher's written comments?

2. Is there evidence of variation in teacher response:
   * across student ability levels
   * across assignment types
   * at different points during the term?

It was expected that the development of an analysis model to address these questions would provide new insights into the analysis of teacher feedback and tools


which could be used to help practicing teachers evaluate their own feedback and to develop the schemata and skills of novice teachers.

METHOD

Participants

The teacher. The ESL teacher studied is an experienced instructor of ESL composition who has also trained writing teachers and supervised interns. She is considered an exemplary teacher at her institution (a large public university in northern California); other teachers have modelled their syllabi and materials after hers.

The students. The 47 students were mostly freshmen and sophomores at the university. They were enrolled in three different sections of a sheltered-ESL freshman composition course during the Spring and Fall 1993 semesters. Nearly all of the students were permanent residents of the U.S.; most had attended local high schools. These subjects therefore were different from the more typical international ESL student at an American university, as they likely had more experience with Western teachers' expectations and response practices. The findings of this study may thus be limited in applicability to writing classes in which permanent resident students, rather than international students, predominate. The native language groups represented included Spanish, Vietnamese, Hmong, Lao, Chinese, Korean, Amharic, Estonian, Greek, and Russian. For the purposes of this study, each student was identified by the teacher at the conclusion of the term as being a "Strong," "Average," or "Weak" writer, based upon the students' cumulative written efforts (four essays of at least three drafts each, homework journal entries, in-class midterm and final essay examinations) over the course of a fifteen-week semester.3

Procedures

The course. Students could be placed in the freshman composition course either by obtaining a qualifying score on the in-house placement test for ESL students or by having passed the previous course in the composition sequence. It was a graded three-unit course which counted toward graduation and fulfilled the freshman composition requirement. It was followed by a sophomore-level expository writing class in which the ESL students were mainstreamed with native speakers. The text used in the course was Guidelines (Spack, 1990). In the Fall 1993 semester, the teacher also required an editing handbook (Fox, 1992); students


were given regular, individualized assignments from this handbook. The in-class activities included discussion of readings, instruction on reading and writing strategies, grammar mini-lessons, pre-writing and revision activities, peer response, student-led discussions, and oral reports. The students had four major essay assignments, with the fourth being a library research paper. They were required to write at least three drafts of each paper and to submit all drafts, prewriting and peer response notes, etc. together in a folder. After grades were given to the papers (on the third draft), the students were allowed to revise them further for higher grades if they so chose. Though the teacher did a fair amount of one-to-one conferencing in class and in her office, the conferences were voluntary and not regularly scheduled.

The writing sample. With the permission of the teacher and students, copies of preliminary and revised drafts from the first three essay assignments were collected. The first drafts contained handwritten commentary, both marginal and end comments, provided by the teacher. Between the two semesters, the teacher slightly changed her method of providing end commentary, using a separate sheet for end comments rather than writing them on the students' papers; copies of these sheets were collected as well. All told, 247 papers from the 47 different students were gathered. Of these, 111 first drafts were usable for analysis,4 and there were 110 complete pairs of first and second drafts, for a complete sample of 220 papers. The papers were written in response to writing assignments (described in Figure 1) of three different types: personal narrative/analysis, relating another author's text(s) to personal experience, and analysis of another author's argumentative essay. During the second semester of data collection (Fall 1993), the teacher elected to drop the personal narrative assignment and to require two argument analyses instead (Figure 1). Table 1 details the frequencies and percentages found in analysis of the student and class variables, demonstrating that the sample included a range of assignment types, student abilities, and points in the semester at which the papers were written.

Figure 1. Essay Assignments: Overview

Spring 1993
Essay 1: Write about a personal experience and analyze its significance. First draft due around Week 3 of the semester.
Essay 2: Choose one or more of several readings and relate the ideas in the reading(s) to personal experience. First draft due around Week 7 of the semester.
Essay 3: Choose one of several argumentative essays and critically analyze the author's arguments. First draft due around Week 10 of the semester.

Fall 1993
Essay 1: Choose one or more of several readings and relate the ideas in the reading(s) to personal experience. First draft due around Week 4 of the semester.
Essay 2: Choose one of several argumentative essays and critically analyze the author's arguments. First draft due around Week 7 of the semester.
Essay 3: Synthesize several readings on the "Baby Jessica" case and write an argumentative essay on a choice of several topics related to the case. First draft due around Week 10 of the semester.

[All assignments except for "Baby Jessica" were taken from Guidelines (Spack, 1990).]

TABLE 1. Description of Sample: Frequencies and Percentages

Number of Students:
  Spring 1993: 32 (68.1%)
  Fall 1993: 15 (31.9%)
  Total: 47

Number of First Drafts:
  Spring 1993: 66 (59.5%)
  Fall 1993: 45 (40.5%)
  Total: 111

Student Ability (number of papers in each category):
  Strong: 47 (42.3%)
  Average: 29 (26.1%)
  Weak: 35 (31.5%)
  Total: 111

Assignment Types:
  Personal Narrative: 28 (25.2%)
  Relating Text to Experience: 37 (33.3%)
  Argumentative: 46 (41.4%)
  Total: 111

Semester Point (number of papers):
  Early in Semester: 41 (36.9%)
  Mid-Semester: 39 (35.1%)
  Late in Semester: 31 (27.9%)
  Total: 111

The Research Model

Analysis of sample: Development of the model. As previously discussed, review of the previous literature on teacher response provided no analysis models which were clear or specific enough for the purposes of this study. Thus an original model was developed through the "constant comparative method" of analysis (Glaser & Strauss, 1967). This method is described by Lockhart and Ng (1995) as "an inductive approach that produces theory grounded in data" (p. 614). In the present study, a number of marked papers were examined to develop possible categories for analysis. During the preliminary development of categories, we also asked the teacher for some clarification and explanation as to the intent of her comments.5 The categories were tested on a subsample of papers, further refinements to the model were made, and then the remainder of the sample was analyzed according to the finalized scheme. Following several training meetings, during which the readers went over sample papers, discussed how specific comments fit into the categories, and resolved various problems associated with the system, each paper, with the teacher's original handwritten commentary, was read and coded by two researchers.6 The form used for each paper included the descriptive categories shown in Figure 2 and discussed below.

Figure 2. Categories Used for Analysis

A. Aim or Intent of the Comment:
1. Directives:
   a. Ask for information
   b. Make suggestion/request
   c. Give information
2. Grammar/Mechanics
3. Positive Comments

B. Linguistic Features of the Comment:
1. Syntactic Form:
   a. Question
   b. Statement/Exclamation
   c. Imperative
2. Presence/Absence of Hedge(s)
3. Text-Specific/Generic

Explanation of categories. The examination of the teacher commentary included two specific phases: First, the teacher's goal(s) in writing the comments were judged, and then the linguistic forms of the comments were categorized.7 The specific categories are described below.

Intent or Purpose of the Comment

A. Directives. A number of L1 and L2 studies have analyzed either teacher commentary and/or student survey responses to see whether the teacher commented on issues of "content," "organization," etc. One of the goals of this study was to categorize these larger areas of response more specifically and systematically, in an effort to describe teachers' aims for specific comments as they respond to various issues. In so doing, we relied on Searle's (1976) taxonomy of speech acts. One general class distinguished by Searle is directives, consisting of "all specific acts whose function is to get the hearer to do something" (Ellis, 1994, p. 337), and including asking for specific information and requests. In developing categories for analysis, three directive types (asking for information, giving a suggestion/request, and giving information) were identified.

1. Asking for Information. A major purpose of the teacher's comments (accounting for nearly a third of all marginal commentary) was to ask the student writer for further information. These requests for information occurred in three distinct subtypes:

a. Asking for information unknown to the teacher/reader (e.g., specific details from the writer's personal experience or questions about the writer's intent):

EXAMPLE: Can you explain what your friendship was like before her illness? (Paper #7, Comment #5)

EXAMPLE: What is your focus here? (Paper #3, Comment #5)

b. Asking the student to provide information known to the teacher/reader (e.g., details from a course reading to which the writer is responding):

EXAMPLE: Student text: This subject is very well discussed by Mead and Metraux in their essay "On Friendship." It was interesting for me to read their thorough research on this issue. Teacher comment: What do they say? (Paper #1, Comment #5)

c. Rhetorical questions designed to spur the student to further thought:

EXAMPLE: But wouldn't other (older) countries also benefit from "the results of human civilization"? (Paper #3, Comment #7)

EXAMPLE: Are you saying that all adoptive parents are better than all biological parents? (Paper #168, Comment #7)

2. Making a Suggestion or Request. In this sample, a substantial number of both marginal and end comments were judged to have this aim. As can be seen in the two examples below, suggestions or requests could appear in either statement or question form:

EXAMPLE: You need a brief summary of the 2 essays here. (Paper #1, Comment #1)

EXAMPLE: Can you summarize Barna's essay a bit more in your intro? (Paper #54, Comment #1)

3. Giving Information. Another important goal of the commentary was to provide information to the student. Unlike the previous category, the teacher does not directly tell the student what to do with this information, but it is certainly implied (from the context, if nothing else) that the student is expected to somehow act upon the information; in other words, the illocution (intended effect) is still directive. As noted by Leki (1990), Reid (1993, 1994), and Sperling (1994), teachers exhibit different personae and orientations in their response. In this category, the teacher can be seen as an interested and informed reader, responding actively to the writer and the text. These types of comments occurred in two distinct forms:

3a. "Reader response" information: Giving the student writer information about how the reader/teacher perceives the paper's ideas or organization:

EXAMPLE: You go a bit off track in this paragraph. You start by talking about the American system, then talk about Estonia, then compare Estonia to Indonesia. (Paper #1, Comment #4)

EXAMPLE: I think this is a stereotype - an unfair one. Many Americans value their friendships highly. (Paper #71, Comment #15)

3b. Textual information: Giving the writer information about another author's text which s/he may have missed or misinterpreted:

EXAMPLE: This experience is not the same as Levine's. His problem was that all the clocks were wrong. Yours was that you didn't understand the time difference between California and Mexico. (Paper #12, Comment #6)

EXAMPLE: Neusner's point here is that college is easy, forgiving, and fun - unlike the real world. This is a different argument from what you discuss here. (Paper #56, Comment #5)

B. Grammar/Mechanics Comments. Another form of directive involved suggestions or requests for information concerned exclusively with issues of grammar, mechanics (spelling, punctuation, typing), or other classroom management issues. Though these comments, like the ones discussed previously, fall into the speech act category of directives, they were analyzed separately because they dealt with the form(s), rather than the content, of the students' papers. The responses in this category consisted of verbal comments in the margins or end notes (i.e., not symbols or corrections) about the students' problems with grammar, spelling, punctuation, typing, leaving adequate margins, etc. Examples follow.

EXAMPLE: The essay needs a lot of editing: There are a lot of spelling/typing errors and some run-on sentences. (Paper #14, End Comment #7)

EXAMPLE: Past or present tense? (Paper #21, Comment #3)

EXAMPLE: Your essays are supposed to be in a folder, with the assignment, any notes you made, and your classmates' comments. (Paper #1, End Comment #6)

C. Positive Comments. Another comment category analyzed separately from directives involved comments whose sole purpose appeared to be to give positive feedback to the writer. Despite reports that positive comments are scarce in teacher feedback (Connors & Lunsford, 1993), that praise is not more effective than criticism in facilitating student improvement (Knoblauch & Brannon, 1981), and that too much praise may confuse, mislead, or demotivate students (Cardelle & Corno, 1981), L1 and L2 researchers and teachers generally agree that comments of praise or encouragement are important to developing writers (Bates, Lane, & Lange, 1993; Connors & Lunsford, 1993; Ferris, 1995b). In this study, both positive marginal and end comments were found, as in the excerpts below:

EXAMPLE: Nice image (Paper #1, Comment #3)

EXAMPLE: You have an interesting perspective on Ho's ideas. (Paper #3, End Comment #1)

~i~gujstic~urm of the co~~~?e~ts. In addition to rating the teacher’s intent in composing each comment, the linguistic aspects of the comments were analyzed. This analysis consisted of three distinct steps: 1. Noting the syntactic form of each comment (statement, question, exclamation, or imperative); 2. Noting whether the individual comment contained any “hedgeslsofteners” (Biber, 1988). These hedges could be either lexical (“Please,” “Maybe”), syntactic/pragmatic (using an indirect question form such as “Can you” to make a suggestion or request), or sentential (softening a criticism or suggestion with a positive statement at the beginning of a sentence). Examples of comments containing hedges include the following:

TEACHER

EXAMPLE:

COMMENTARY

Please summarize

167

what he did say. (Paper #7, Comment

#3) EXAMPLE: Maybe start this paragraph by summarizing/quoting Viorst and then go on to discuss what you think. (Paper #21, Comment #7) EXAMPLE: Can you introduce the second paragraph with a sentence that ties it to the ideas in your introduction? (Paper #16, Comment #2) EXAMPLE: You have raised some good points here, but they need more discussion and development. (Paper #1, End Comment #1).

3. Noting whether or not the comment was "text-specific." As noted by Leki (1990), teacher commentary has been criticized by researchers both for being too generic (able to be written on any paper) and for being too specific (unable to be generalized to other writing assignments or tasks). Bates et al. (1993), Sommers (1982), and Zamel (1985) encourage teachers to provide text-specific commentary rather than vague generalizations which demonstrate little teacher involvement with the individual student or his/her paper. For the purposes of this study, text-specific comments were defined as comments which could only have been written on this particular essay, versus "generic" comments which could have appeared on any student paper; these distinctions are illustrated by the following examples from the sample.

3a. Text-Specific Comments:

EXAMPLE: You need a brief summary of the 2 essays here (Paper #1, End Comment #1)

EXAMPLE: You also need to explain and illustrate the "hard-to-discover" stumbling blocks that you mention (Paper #1, End Comment #3).

3b. Generic (Non-Text-Specific) Comments:

EXAMPLE: Good intro! (Paper #1, Comment #2)

EXAMPLE: Please spell-check and proofread your papers before giving them to me. (Paper #1, End Comment #5)

Analysis Procedures

Though the teacher employed a few correction symbols (such as circling spelling errors), only verbal comments were considered in the analysis. As the raters read through the marked first drafts, they numbered each marginal and end comment on the paper and then completed a rating form which included the analysis categories described above.

168 FERRIS, PEZONE, TADE & TINTI

Student/Class Variables:
- Semester Taken (Spring or Fall, 1993)
- Student Ability (Strong, Average, or Weak)
- Assignment Type (Personal Narrative, Relating Text to Personal Experience, or Argumentative)
- Point in Semester (Early, Mid, Late)

Text Variables: Totals of Each Comment Type and Form (Separated by Marginal or End Comments)
- Asking for Information
- Requests
- Giving Information
- Positive
- Grammar/Mechanics
- Question
- Statement/Exclamation
- Imperative
- Hedges
- Text-Based Comments

Figure 3. Variables Used in Statistical Analysis

When the forms were completed for the entire sample of 111 first drafts, they were coded for statistical analyses. Besides examining frequencies and percentages for each comment type, analyses of variance (Endnote 8) examined the differences in comment types across several different student/class variables. These variables are summarized in Figure 3.
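The Kruskal-Wallis comparison described above can be illustrated with a small sketch. This is not the authors' SPSS analysis; the per-paper comment counts below are invented, and the tie handling is the standard mid-rank approach without a tie-variance correction:

```python
# Hypothetical sketch (pure Python) of the Kruskal-Wallis one-way analysis
# of variance used in the study; the per-paper counts are invented.

def kruskal_wallis_h(*groups):
    """Return the Kruskal-Wallis H statistic (chi-square approximation,
    df = number of groups - 1). Tied values receive mid-ranks."""
    pooled = sorted((value, gi) for gi, group in enumerate(groups)
                    for value in group)
    n = len(pooled)
    # Assign mid-ranks to runs of tied values
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j < n and pooled[j][0] == pooled[i][0]:
            j += 1
        mid = (i + j + 1) / 2.0  # average of 1-based ranks i+1 .. j
        for k in range(i, j):
            ranks[k] = mid
        i = j
    # Sum the ranks obtained by each group
    rank_sums = [0.0] * len(groups)
    for (value, gi), r in zip(pooled, ranks):
        rank_sums[gi] += r
    return 12.0 / (n * (n + 1)) * sum(
        rs ** 2 / len(g) for rs, g in zip(rank_sums, groups)) - 3 * (n + 1)

# Hypothetical per-paper counts of one comment type in each semester
spring = [2, 1, 3, 0, 2, 1, 4]
fall = [3, 4, 2, 5, 3, 4, 2]
print(round(kruskal_wallis_h(spring, fall), 2))
```

A large H (judged against the chi-square distribution with k - 1 degrees of freedom) indicates that the groups' comment counts differ more than chance rank variation would predict.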

RESULTS

Problems with the Model

In working through the sample of first drafts with the analysis model, the raters had two major difficulties:

1. Determining where one comment ended and another began. The teacher's comments did not always break down neatly into single phrases, sentences, or idea units. Sometimes a comment was several sentences long; other times two comments were connected by dashes. Often, these "compound comments" crossed categories of both pragmatic intent and linguistic form, and were thus hard to classify. In these cases, the raters were told to use their best judgment as to whether the comment consisted of one large idea unit (to be analyzed as one "Very Long" comment) or as several different points (to be categorized separately). The example below shows a "compound comment" and explains how it was analyzed.

TEACHER COMMENTARY 169

(a) You go a bit off track in this paragraph. (b) You start by talking about the American system, then talk about Estonia, then compare Estonia to Indonesia. (c) What is your focus here? (Paper #3, Comments #4 & 5)

In this example, sentences (a) and (b) were analyzed as being one extended comment, which in the scheme described above was classified as "Give Information" and "Statement." Sentence (c), on the other hand, was analyzed as a separate comment (though obviously related to the previous one), and classified as "Asking for Information" and "Question."

2. Categorizing "text-specific" comments. Operationalizing the notion of a "text-specific" comment for research purposes was not as simple as it had appeared. For instance, if the "generic" comment "Good example" is written in the margin right next to the example being praised, it seemed likely that the writer would understand which specific point is being commented upon (see Reid, 1994, p. 284, for further discussion and examples of this point). Another example occurred frequently in end notes: "You have a lot of verb tense errors in this draft. I've underlined some examples on the first page." It could be argued that this comment is text-specific (because it referred to a particular error pattern and to erroneous linguistic forms in the text) or generic (because the same comment could be written on any paper containing verb tense errors). After meeting to discuss this difficulty, the raters were told to use their best judgment, following the definitions for text-specificity established at the beginning of the study. Inter-rater reliabilities for this variable were somewhat lower than for the other parts of the model (Endnote 4), but were still respectable (over 86%), indicating that the raters had a reasonably consistent notion of what was "text-specific" as they completed the analysis.

Frequencies. Table 2 shows the frequencies of each comment type, with means and percentages (per paper) for each type. This table demonstrates some important quantitative distinctions between the marginal and end comments in this sample. First, there were more total marginal comments than end comments, although the end comments tended to be much longer. There were many more instances of "Asking for Information" and "Questions" in the marginal comments than in the end comments, while the overwhelming majority of the end comments were in statement form (rather than questions or imperatives). The end notes had a greater relative percentage of positive comments than did the marginal notes. Overall, there were few occurrences of grammar comments, hedges, or imperatives. Finally, although the vast majority of both marginal and end comments were classified as text-based, it is important to note that the percentage of text-based comments was considerably higher for marginal comments (81.3%) than for end comments (67.4%), probably reflecting the more general, summative nature of the end commentary.


TABLE 2
Types of Comments by Category (Rank-Ordered by Mean and % of Total Comments)

MARGINAL NOTES                    Mean Per Paper    (%)
A. Comment Type
  Ask for Information                  2.90       (31.4%)
  Request                              2.87       (31.1%)
  Positive                             1.64       (17.8%)
  Give Information                     1.52       (16.5%)
  Grammar/Mechanics                     .34        (3.6%)
B. Comment Form
  Statement                            4.26       (46.2%)
  Question                             4.07       (44.1%)
  Imperative                            .86        (9.3%)
C. Other
  Hedges                               1.54       (16.7%)
  Text-Based                           7.50       (81.3%)
Total Marginal                         9.23

END NOTES                         Mean Per Paper    (%)
D. Comment Type
  Request                              1.86       (34.5%)
  Positive                             1.70       (31.5%)
  Grammar/Mechanics                     .79       (14.7%)
  Give Information                      .71       (13.2%)
  Ask for Information                   .40        (7.4%)
E. Comment Form
  Statement                            4.38       (81.3%)
  Imperative                            .56       (10.4%)
  Question                              .49        (9.1%)
F. Other
  Hedges                                .93       (20.4%)
  Text-Based                           3.63       (67.4%)
Total End                              5.39
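The per-paper means and percentages reported in Table 2 can be tallied from coded comment records. The bookkeeping can be sketched as follows; the records, category labels, and field names here are invented illustrations, not data from the study:

```python
# Hypothetical sketch: tallying coded comment records into per-paper
# means and percentages of total comments. The records are invented.
from collections import Counter

coded_comments = [
    {"paper": 1, "type": "Request", "form": "Question", "text_specific": True},
    {"paper": 1, "type": "Positive", "form": "Statement", "text_specific": False},
    {"paper": 2, "type": "Ask for Information", "form": "Question", "text_specific": True},
]

n_papers = len({c["paper"] for c in coded_comments})  # would be 111 in the study
type_counts = Counter(c["type"] for c in coded_comments)
total = sum(type_counts.values())

for comment_type, count in type_counts.most_common():
    print(f"{comment_type}: mean {count / n_papers:.2f} per paper "
          f"({100 * count / total:.1f}% of comments)")
```

Sorting with `most_common` reproduces the rank-ordering by mean used in the table, since every count is divided by the same number of papers.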

Differences in Comment Types Across Student & Class Variables. Tables 3-6 show the ways in which the distribution of comment types differed across various contextual and student variables. Each table includes the standardized group mean for each text variable, showing, for instance, in which semester (Table 3) the teacher made more positive comments. In addition, each table shows the chi-square (the result obtained from the Kruskal-Wallis one-way analyses of variance). The variables are ranked from the largest to the smallest chi-square, and the level of significance of each chi-square is reported (with statistical significance established at p < .05).

Differences Between Semesters (Spring & Fall). Table 3 shows the comment types which proved to be statistically significant across the two semesters of data collection. This table shows a number of significant differences between the two semesters in the teacher's responding patterns, despite the fact that it was the same teacher teaching the same course with only a few months between semesters. Discussion with the teacher and examination of the course materials suggest that these differences may be attributed to two possible explanations (or some combination of the two):

1. Different assignments (no personal narrative and two argumentative papers in the fall);

2. Increased teacher sensitivity: During the period when the data were being collected, the teacher cooperated in two separate research projects related to

TABLE 3
Differences in Comment Types/Forms Across Semesters

Variable [M=marginal; E=end]      Spring     Fall      Chi-Square (df = 1)

A. Significant Differences
Give Information (E)              43.08      63.07     13.41 (p < .0005)
Give Information (M)              46.55      67.48     12.33 (p < .0005)
Textbased (M)                     46.30      67.85     12.29 (p < .0005)
Question (E)                      61.80      44.95     11.16 (p < .001)
Ask for Information (E)           60.92      46.25      9.89 (p < .005)
Textbased (E)                     47.47      66.13      9.39 (p < .005)
Positive (E)                      47.85      65.56      9.37 (p < .005)
Statement (M)                     47.98      65.36      8.10 (p < .005)
Request (M)                       61.50      45.40      6.98 (p < .01)
Hedge (E)                         49.38      63.30      5.75 (p < .05)

B. Non-Significant Differences
Imperative (M)                    59.13      48.90      3.42 (n.s.)
Imperative (E)                    58.69      49.55      2.84 (n.s.)
Statement (E)                     50.93      61.03      2.79 (n.s.)
Positive (M)                      51.44      60.26      2.19 (n.s.)
Hedge (M)                         51.49      60.18      2.12 (n.s.)
Ask for Information (M)           59.90      51.68      1.81 (n.s.)
Question (M)                      52.45      58.77      1.07 (n.s.)
Grammar/Mechanics (E)             56.90      52.19       .69 (n.s.)
Total Comments (E)                56.38      52.95       .32 (n.s.)
Grammar/Mechanics (M)             54.12      56.30       .23 (n.s.)
Request (E)                       54.83      55.25       .01 (n.s.)

the teaching of writing, one of which dealt directly with teacher commentary. She feels that her heightened awareness of her own responding practices and of students' reactions to them had an effect over time on the amount, substance, and tone of her written comments. For instance, Table 3 demonstrates that the teacher wrote more "Ask for Information" comments and more questions during the Spring semester. Since it was also found that these types of comments appeared most frequently on papers involving the writer's personal experience (see Table 5 and discussion below), that is, on either the personal narrative or the relating-text-to-personal-experience assignments, it seems likely that the change in assignments between semesters is the explanation for this change. On the other hand, there were more "Give Information" comments in statement form during the Fall semester. These comments appeared more frequently on papers which responded to other authors' texts, as all three of

TABLE 4
Differences in Comment Types/Forms Across Point in Semester at Which Feedback Was Given

Variable [M=marginal; E=end]      Early      Mid       Late      Chi-Square (df = 2)

A. Significant Differences
Total Comments (E)                71.99      …         47.81     16.69 (p < .0005)
Grammar/Mechanics (E)             67.01      …         41.17     11.67 (p < .005)
Ask for Information (M)           65.70      …         56.51      9.36 (p < .01)
Textbased (E)                     58.87      …         67.94      8.40 (p < .05)
Question (E)                      63.72      …         49.31      6.27 (p < .05)
Request (E)                       63.88      …         56.09      6.26 (p < .05)
Give Information (M)              56.48      62.56     …          6.20 (p < .05)
Imperative (M)                    63.51      …         55.40      6.16 (p < .05)

B. Non-Significant Differences
Imperative (E)                    65.35      40.33     52.01      5.76 (n.s.)
Question (M)                      60.93      59.86     44.63      5.47 (n.s.)
Ask for Information (E)           62.67      52.76     51.37      4.94 (n.s.)
Statement (E)                     63.67      53.00     49.69      3.99 (n.s.)
Hedge (E)                         63.17      49.99     54.08      3.98 (n.s.)
Textbased (M)                     56.81      61.71     47.74      3.32 (n.s.)
Positive (M)                      58.55      60.90     50.79      1.98 (n.s.)
Request (M)                       51.04      61.56     51.60      1.94 (n.s.)
Hedge (M)                         52.85      60.46     54.51      1.78 (n.s.)
Grammar/Mechanics (M)             57.54      57.31     51.16      1.12 (n.s.)
Give Information (E)              58.63      57.64     56.74       .91 (n.s.)
Statement (M)                     52.51      58.03     58.06       .78 (n.s.)
Positive (E)                      58.79      56.00     51.31       .77 (n.s.)
the Fall assignments required. Possible examples of increased teacher sensitivity include the fact that during the Fall semester (the second semester of data collection) she wrote more positive comments and text-based comments, included more hedges, and made fewer "Request" comments.

Comments Given at Different Points of the Semester (Early, Mid, Late). Table 4 shows that, in general, the teacher produced fewer of most types of comments as the semester progressed. There were some variables for which the highest mean was at mid-semester, but with only one exception, all of the variable means were lower at the end of the semester than they had been at the beginning. Possible explanations for this general trend include student improvement and greater shared knowledge: Because of in-class teaching and earlier feedback, the teacher did not need to tell the students as much on the papers as the semester went on. For the mid-semester increases, the explanation may have been a combination of

TABLE 5
Differences in Comment Types and Forms Across Assignment Type

Variable [M=marginal; E=end]      Pers. Narr.  Text-to-Experience  Argument  Chi-Square (df = 2)

A. Significant Differences
Total Comments (E)                79.32        50.53               46.21     20.69 (p < .0001)
Imperative (E)                    70.39        56.90               43.87     17.17 (p < .0005)
Question (E)                      73.46        49.86               50.30     16.71 (p < .0005)
Ask for Information (E)           71.96        52.00               49.50     16.55 (p < .0005)
Grammar/Mechanics (E)             73.38        46.47               53.70     12.71 (p < .005)
Give Information (M)              39.82        57.46               64.67     11.27 (p < .005)
Textbased (M)                     44.63        66.50               54.48      7.59 (p < .05)
Imperative (M)                    68.20        53.33               50.72      7.12 (p < .05)

B. Non-Significant Differences
Ask for Information (M)           59.36        63.01               37.51      5.93 (n.s.)
Question (M)                      54.48        65.40               49.30      5.34 (n.s.)
Give Information (E)              51.97        52.99               66.01      5.21 (n.s.)
Statement (M)                     44.63        58.47               60.93      4.90 (n.s.)
Request (M)                       54.13        64.74               50.11      4.48 (n.s.)
Request (E)                       61.79        60.35               48.98      4.09 (n.s.)
Statement (E)                     62.82        56.58               51.38      2.31 (n.s.)
Hedge (M)                         49.91        55.82               59.85      1.78 (n.s.)
Positive (E)                      50.46        59.00               56.96      1.36 (n.s.)
Textbased (E)                     61.82        53.65               54.35      1.27 (n.s.)
Hedge (E)                         59.00        53.85               55.90       .46 (n.s.)
Positive (M)                      54.82        58.03               55.09       .24 (n.s.)
Grammar/Mechanics (M)             56.43        56.30               55.50       .04 (n.s.)

differing assignments and the fact that the students had received some teaching and feedback but had not had as much writing practice as they would later in the term. Another possibility, of course, is that teacher fatigue and an overwhelming paper load may have affected the number of comments given as the semester wore on.

Variation Across Assignment Type. Table 5 shows a number of significant differences across the three assignment types. As previously mentioned, there were more "Ask for Information" comments on assignments which required inclusion of personal experience, and many more questions on the personal narrative assignment, but more "Give Information" comments, more statements, and more text-based comments on assignments which were related to readings. There were also more imperatives, grammar comments, and more total marginal comments on the personal narrative essays, but this trend could also be related to the

TABLE 6
Differences in Comment Types and Forms Across Student Ability Levels

Variable [M=marginal; E=end]      Strong     Average   Weak      Chi-Square (df = 2)

A. Significant Differences
Imperative (M)                    57.94      65.86     58.66      7.28 (p < .05)
Request (M)                       53.66      68.60     48.70      6.66 (p < .05)
Grammar/Mechanics (E)             53.65      47.91     65.86      6.36 (p < .05)
Grammar/Mechanics (M)             49.64      58.59     62.40      6.24 (p < .05)

B. Non-Significant Differences
Question (M)                      61.84      59.00     45.67      5.49 (n.s.)
Positive (M)                      62.55      56.95     46.41      5.45 (n.s.)
Statement (M)                     56.78      65.45     47.13      5.30 (n.s.)
Total Comments (M)                54.67      65.86     49.61      4.22 (n.s.)
Textbased (M)                     56.69      62.55     49.64      2.61 (n.s.)
Positive (E)                      60.35      54.10     51.73      1.80 (n.s.)
Request (E)                       54.97      60.95     53.29      1.07 (n.s.)
Give Information (M)              58.94      55.95     52.10       .97 (n.s.)
Ask for Information (M)           58.38      55.93     52.86       .61 (n.s.)
Hedge (M)                         58.33      55.31     53.23       .57 (n.s.)
Give Information (E)              53.74      58.38     57.06       .54 (n.s.)
Textbased (E)                     51.27      52.69     57.04       .43 (n.s.)
Ask for Information (E)           56.56      57.60     53.91       .41 (n.s.)
Question (E)                      56.56      57.66     53.87       .37 (n.s.)
Statement (E)                     57.64      56.33     53.53       .35 (n.s.)
Total Comments (E)                55.02      57.88     55.76       .15 (n.s.)

point in semester at which the assignment and feedback were given (the beginning) than to the assignment type itself.

Feedback Differences Across Student Ability Level. Table 6 shows the distribution of comments across students designated by the teacher as "Strong," "Average," or "Weak" writers. It was rather surprising to note that the teacher did not show much variety in her feedback according to students' perceived ability levels. Only four of the variables were statistically significant; even these have relatively low chi-squares and significance levels relative to those in Tables 3, 4, and 5. Of the four significant variables, the distribution of comments is not especially clear-cut (with the "average" writers having the highest group mean on two of the four significantly different variables). It is worth mentioning, though, that the "weak" group received the most comments on grammar, while the "strong" group was addressed with the fewest imperatives. These trends appear to provide some confirmation of reports from several studies of teacher-student writing conferences (Freedman & Sperling, 1985; Patthey-Chavez & Ferris, 1997; Walker & Elias, 1987) that teachers take a more collegial, less directive stance when responding to stronger students, while focusing more on surface-level problems with weaker students. However, it is important to remember that grammar/mechanics comments were relatively infrequent on all papers (Table 2) and that the differences in questioning patterns were not statistically significant. The more accurate and important generalization is that there were few significant differences in comment type and form across student ability level and that those differences were relatively small, suggesting that the teacher was quite consistent in her feedback practices with all students.

DISCUSSION & CONCLUSIONS

Summary of Major Findings

Of course, analysis of the responding practices of one teacher does not yield sweeping implications which are generalizable to all composition teaching. Nor can it be claimed, without consideration of the students’ revisions and subsequent texts, that the commenting strategies used by this particular teacher were optimal in effecting student improvement as they revised and as the writing class progressed. Still, there are several implications of the study resulting from the development of the model itself and the analysis of the sample:

1. Description of teacher response to student writing must go well beyond simple discussions of whether a teacher should respond to "content" or "form." Previous L1 and L2 research in the area of written teacher commentary (Cohen, 1987, 1991; Cohen & Cavalcanti, 1990; Ferris, 1995b; Hedgcock & Lefkowitz, 1994; Leki, 1991; Sommers, 1985; Zamel, 1985) has focused on very general categories of analysis: Do teachers comment on students' "ideas/organization" or "grammar/mechanics"? But such description does not capture either the goals of teacher response or the specific forms that it may take. As seen in this analysis, the teacher had a variety of apparent aims for her commentary (as judged by the researchers), and these manifested themselves in a range of linguistic forms.

Studies of ESL students’ problems with and strategies for dealing with teacher feedback have suggested that L2 students may struggle with responding to teachers’ written questions, with understanding their teachers’ symbols and terminology, and even with reading the teacher’s handwriting (Cohen, 1987; Ferris, 1995b; Leki, 1990). These difficulties may be rooted in inadequate linguistic and pragmatic knowledge, whether of rhetorical and grammatical jargon used by the


teacher or of the nature and function of indirect speech acts such as requests phrased as questions. They may also result from a mismatch of cultural expectations: A student, for instance, may misinterpret a teacher's praise or questions as signs of incompetence, as abdications of authority (Goldstein & Conrad, 1990; Patthey-Chavez & Ferris, 1997), or as indications that there is nothing wrong with the paper (since the teacher has not said directly what the student should "fix"). With an analytic model which focuses both on the teacher's aims in writing a particular comment and on the linguistic aspects of the commentary, it becomes possible to identify types and forms of feedback which may be more or less helpful to ESL students. Just as ESL writers may have problems coping with teacher feedback that is unclear to them, novice teachers learning to provide commentary on students' papers are not helped much by vague prescriptions such as "Address content before form" and "Use questions rather than statements or imperatives." Such advice does not address the issues of how to determine the most important issues or problems in a student's paper, of what goals to set in responding, or of what forms of commentary are most comprehensible to L2 readers/writers. Tools such as the analytic model developed for this study can help writing teachers to examine written response (whether that of other teachers or their own) more critically and carefully, thus aiding them in building their own knowledge and skills.

2. The substance and form of teacher commentary can vary significantly depending upon the genre of writing being considered, the point in the term at which the feedback is given, and the abilities and personalities of individual students. The results of this study showed, for instance, that the teacher asked more questions when the student was writing about personal experience, ostensibly because only the writer knew the answers to the questions. By the same token, the teacher tended to give more information when the writer was responding to or analyzing other authors' texts, generally to address issues in the reading which had been omitted or misread. Suggestions given by L1 and L2 scholars as to how to respond to student writing rarely, if ever, address the issue of varying feedback according to the specifications of the task. As a result, teachers' responses may be poorly focused and occasionally even counterproductive (e.g., asking for more descriptive detail in a persuasive essay).

The amount of teacher commentary (in total number of comments) varied at different points of the semester, and, as the term progressed, the frequency of all comment types decreased. Though it is possible and even tempting to blame teacher fatigue and burnout for this trend, it has been noted by Reid (1994) that as the student and teacher's shared knowledge increases as the consequence of ongoing teacher feedback and in-class discussion and instruction, the need for extensive commentary lessens somewhat. Further, it is to be hoped that, as the student shows progress over the semester, his/her need for copious response should decrease proportionately. It has been suggested that teachers should systematically decrease the amount of feedback given during a writing course to help students develop as independent self-editors (Bates et al., 1993; Ferris, 1995a); along similar lines, Fathman & Whalley (1990) point out that the mere act of rewriting (i.e., without any feedback) can lead to student improvement. The findings in this study also indicate some variation in teacher commentary across student ability levels. This result is similar to those reported by Cohen and Cavalcanti (1987) and in L1 & L2 conferencing studies (Freedman & Sperling, 1985; Patthey-Chavez & Ferris, 1997; Walker & Elias, 1987). Though it is tempting to criticize the teacher in this study for "inequitable, disempowering" behavior (collegial and positive towards strong students, directive and mechanical with weaker students), the situation is likely far more complex than this. It is important to remember that response to student writing is a form of two-way communication, and that like any form of human interaction, it will vary according to the personalities and abilities of the participants. Presumably, the students' personalities and in-person interaction with the teacher would also affect the nature and tone of the teacher's written feedback, though this study, being discourse analytic rather than ethnographic, was not able to address this point specifically.

Implications for L2 Writing Instruction

Though a description of one teacher's feedback practices should of course be treated cautiously, several potential applications for L2 writing pedagogy arise from the development of the model and from the results of this study. First, teachers could use or adapt this analysis system to become more aware (or make their writing students more aware) of both the intent and the forms of their written comments. Similarly, teacher trainers and practicum supervisors could use marked student writing and the analysis categories induced from the data in this study to illustrate "real" teacher response and to build trainees' and interns' schemata about the substance and form of written commentary. The analysis model has already been used for these purposes in graduate teacher preparation courses at the authors' institution and at a colleague's school. In follow-up reflective journal entries, the graduate students commented that they had found the exercise helpful in a variety of ways (quotations used with permission):

With an instrument like this one, I could spot check myself during each correction session to see if I am on target with making the kinds of text-based comments I wish to make, especially since realizing that I begin to stray from specific to more generic comments.

If I had it to do over again, I think I would try to be as explicit in my positive remarks as I was with my negative ones. The first thing I noticed about my analysis sheets was that more words were devoted to curmudgeonly comments in great detail as opposed to 2 to 5 words of "good job."

It was very helpful for me to see what I wasn't doing in my responses to the papers. I see now that I have "statements" down pat, but that I need to use the Socratic approach more, i.e., questioning the writer. I noted...that I tend to save everything up for the end note and I don't utilize the potential of the margins enough.

Second, in discussing and providing commentary to L2 writing students, we should avoid simplistic prescriptions such as "Give your feedback in the form of questions" or "Don't appropriate the students' texts by telling them what to do." In this study, different assignments appeared to lead to different types of responses, and this seems appropriate. Third, teachers should also feel free to adjust the amount and types of feedback they give as a course goes on in order to build on feedback and instruction given already, respond to the students' improvement, and build increasingly independent revising and editing skills. Finally, of course, teachers should be sensitive to the needs, abilities, and personalities of their students in providing feedback. There is no "one-size-fits-all" form of teacher commentary!

Directions for Further Research

Analysis of teachers' written response to L2 student writing is a crucial, yet neglected, area of inquiry. Future studies should compare and contrast the results of this study with descriptions of commentary provided by other teachers in other contexts and teaching students at different stages of writing development and L2 acquisition. Such descriptions should also be linked systematically to student revision and overall achievement. Because it is likely that L2 students have differing needs for feedback (Leki, 1990) and different strategies for processing teacher response, researchers should also examine similarities and differences in the commentary given to L1 and L2 students and its effects on student writing development. Finally, since teacher response is only one aspect of the complex interaction among the student, teacher, and institutional factors which describe a writing class, ethnographic techniques such as observation and interviews of teachers and students should be employed to assess whether the conclusions of discourse analytic research are accurate reflections of classroom settings and teacher-student relationships.


Teacher response to student writing is important at all levels and in all contexts of instruction. Responding effectively to student writing is a skill which, according to previous research, can elude even experienced teachers. Precise descriptions of teachers' responding behaviors and of their effects on students' revision practices can inform practicing teachers as to what types and forms of comments may be most effective; they can also help pre-service and novice teachers to develop this skill. Studies which examine feedback in a systematic, contextualized, and longitudinal manner can help L2 researchers and teachers to build both the theoretical knowledge and practical skills needed for this critical endeavor.

Acknowledgments: A version of this paper was presented at the CATESOL State Conference, San Francisco, April 12, 1996. This research was partially funded by a California State University Summer Fellowship.

NOTES

1. The portion of the study which links teacher commentary to student revision is described in a separate manuscript (Ferris, in press).
2. A recent replication of Fathman & Whalley's study (Russikoff & Kogan, 1996) allowed subjects one week to revise their papers at home; the findings and conclusions of this study as to the effects of content- or form-focused feedback differ from those of Fathman & Whalley.
3. Though there were other considerations which factored into the students' final grades (attendance, punctuality, participation, whether assignments were submitted on time), the teacher's categorization of the students' writing abilities for research purposes was based only on her cumulative assessment of the students' written products in the entire writing course. Although final essay drafts and examination essays were scored in accordance with the English department's standard freshman composition rubric, the teacher's judgments were ultimately relative and subjective, yet informed by a great deal of evidence (i.e., multiple and varied student texts).
4. The other papers were discarded from the sample because they either did not photocopy well enough to be analyzed or had no matching revised draft (due to student drops or late submissions).
5. It has been pointed out that the researchers' subsequent judgments represent only their interpretations of the teacher's aims in writing the commentary, which should not be equated with the goals themselves. While this is quite true, because we were able to consult with the teacher during the development of the model, we are confident that we have assessed her aims as accurately as possible.
6. Four different inter-rater reliability procedures were conducted to assess the agreement between the pairs of raters:

A. For Comment Type and Comment Form, Friedman's chi-squares (a reliability test for ranked data) were calculated. The agreement (correlation) between raters for


comment type was .9233 (i.e., 92.33% agreement between raters); for comment form, the agreement was .9874 (98.74% agreement).

B. For identification of comments containing Hedges and of Text-Based comments, Cochran's Q tests (used for dichotomous variables) were performed. The agreement between raters for hedges was .9772 (97.72%); for text-based comments it was .8635 (86.35%).

All reliability statistics were generated by the statistical package SPSS for Windows (Version 6.1).

7. An additional step in the analysis was to calculate the length of the comment (in number of words, later categorized into "Short," "Average," "Long," or "Very Long" comments). This portion of the analysis is discussed in a separate paper (Ferris, in press).
8. The test used to compare means of different types of comments across different class/student variables was the Kruskal-Wallis One-Way Analysis of Variance, a chi-square test appropriate for categorical data such as the coded text variables in this study. Examination of the chi-square results allowed us to examine whether, for instance, observed differences in comment types across various assignments were statistically significant.
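The simpler reliability figures in Note 6 can be illustrated as raw percent agreement between two raters on a dichotomous code. This sketch is only an illustration (the study used Friedman's chi-squares and Cochran's Q via SPSS); the two raters' codes below are invented:

```python
# Minimal sketch of percent agreement between two raters on a dichotomous
# code (e.g., hedge present / absent); the ratings are hypothetical.

def percent_agreement(rater_a, rater_b):
    """Proportion of comments on which the two raters assigned the same code."""
    if len(rater_a) != len(rater_b):
        raise ValueError("Raters must code the same set of comments")
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

rater_a = [1, 0, 1, 1, 0, 1, 0, 0]  # 1 = hedge present, 0 = absent
rater_b = [1, 0, 1, 0, 0, 1, 0, 1]
print(f"{percent_agreement(rater_a, rater_b):.2%}")  # 6 of 8 codes match
```

Raw agreement does not correct for chance agreement, which is why rank- and dichotomous-variable tests were preferred for the reliability figures reported in the study.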

REFERENCES

Bates, L., Lane, J., & Lange, E. (1993). Writing clearly: Responding to ESL compositions. Boston: Heinle & Heinle.
Beason, L. (1993). Feedback and revision in writing across the curriculum classes. Research in the Teaching of English, 27, 395-421.
Biber, D. (1988). Variation across speech and writing. Cambridge: Cambridge University Press.
Brannon, L., & Knoblauch, C.H. (1982). On students' rights to their own texts: A model of teacher response. College Composition and Communication, 33, 157-166.
Cardelle, M., & Corno, L. (1981). Effects on second language learning of variations in written feedback on homework assignments. TESOL Quarterly, 15, 251-261.
Carson, J.G., & Nelson, G.L. (1994). Writing groups: Cross-cultural issues. Journal of Second Language Writing, 3, 17-30.
Carson, J.G., & Nelson, G.L. (1996). Chinese students' perceptions of ESL peer response group interaction. Journal of Second Language Writing, 5, 1-19.
Cohen, A. (1987). Student processing of feedback on their compositions. In A.L. Wenden & J. Rubin (Eds.), Learner strategies in language learning (pp. 57-69). Englewood Cliffs, NJ: Prentice-Hall.
Cohen, A. (1991). Feedback on writing: The use of verbal report. Studies in Second Language Acquisition, 13, 133-159.
Cohen, A.D., & Cavalcanti, M.C. (1987). Giving and getting feedback on composition: A comparison of teacher and student verbal report. Evaluation in Research in Education, 1, 63-73.

Cohen, A.D., & Cavalcanti, M.C. (1990). Feedback on compositions: Teacher and student verbal reports. In B. Kroll (Ed.), Second language writing: Research insights for the classroom (pp. 155-177). New York: Cambridge University Press.
Connor, U., & Asenavage, K. (1994). Peer response groups in ESL writing classes: How much impact on revision? Journal of Second Language Writing, 3, 257-276.
Connors, R., & Lunsford, A. (1993). Teachers' rhetorical comments on student papers. College Composition and Communication, 44, 200-223.
Ellis, R. (1994). Learning to communicate in the classroom: A study of two language learners' requests. In H.D. Brown & S. Gonzo (Eds.), Readings on second language acquisition (pp. 334-360). Englewood Cliffs, NJ: Prentice Hall Regents.
Fathman, A., & Whalley, E. (1990). Teacher response to student writing: Focus on form versus content. In B. Kroll (Ed.), Second language writing: Research insights for the classroom (pp. 178-190). New York: Cambridge University Press.
Ferris, D.R. (1995a). Student reactions to teacher response in multiple-draft composition classrooms. TESOL Quarterly, 29, 33-53.
Ferris, D.R. (1995b). Teaching ESL composition students to become independent self-editors. TESOL Journal, 4(4), 18-22.
Ferris, D.R. (in press). The effects of teacher commentary on student revision. TESOL Quarterly.
Fox, L. (1992). Focus on editing. London: Longman.
Freedman, S.W., & Sperling, M. (1985). Written language acquisition: The role of response and the writing conference. In S.W. Freedman (Ed.), The acquisition of written language. Norwood, NJ: Ablex.
Glaser, B.G., & Strauss, A.L. (1967). The discovery of grounded theory: Strategies for qualitative research. Chicago: Aldine.
Goldstein, L., & Conrad, S. (1990). Student input and the negotiation of meaning in ESL writing conferences. TESOL Quarterly, 24, 443-460.
Hedgcock, J., & Lefkowitz, N. (1994). Feedback on feedback: Assessing learner receptivity to teacher response in L2 composing. Journal of Second Language Writing, 3, 141-163.
Hedgcock, J., & Lefkowitz, N. (1996). Some input on input: Two analyses of student response to expert feedback in L2 writing. Modern Language Journal, 80, 287-308.
Kepner, C. (1991). An experiment in the relationship of types of written feedback to the development of second-language writing skills. Modern Language Journal, 75, 305-313.
Knoblauch, C., & Brannon, L. (1981). Teacher commentary on student writing: The state of the art. Freshman English News, 10, 1-4.
Leki, I. (1990). Coaching from the margins: Issues in written response. In B. Kroll (Ed.), Second language writing: Research insights for the classroom (pp. 57-68). New York: Cambridge University Press.
Leki, I. (1991). The preferences of ESL students for error correction in college-level writing classes. Foreign Language Annals, 24, 203-218.
Lockhart, C., & Ng, P. (1995). Analyzing talk in ESL peer response groups: Stances, functions, and content. Language Learning, 45, 605-655.


Mendonça, C.O., & Johnson, K.E. (1994). Peer review negotiations: Revision activities in ESL writing instruction. TESOL Quarterly, 28, 745-769.
Nelson, G.L., & Murphy, J.M. (1992). An L2 writing group: Task and social dimensions. Journal of Second Language Writing, 1, 171-193.
Patthey-Chavez, G.G., & Ferris, D.R. (1997). Writing conferences and the weaving of multi-voiced texts in college composition. Research in the Teaching of English, 31, 51-90.
Radecki, P., & Swales, J. (1988). ESL student reaction to written comments on their written work. System, 16, 355-365.
Reid, J. (1993). Teaching ESL writing. Englewood Cliffs, NJ: Regents/Prentice Hall.
Reid, J. (1994). Responding to ESL students' texts: The myths of appropriation. TESOL Quarterly, 28, 273-292.
Robb, T., Ross, S., & Shortreed, I. (1986). Salience of feedback and its effect on EFL writing quality. TESOL Quarterly, 20, 83-93.
Russikoff, K., & Kogan, S. (1996, March). Feedback on ESL writing. Paper presented at the 30th Annual TESOL Convention, Chicago.
Searle, J. (1976). Indirect speech acts. In P. Cole & J. Morgan (Eds.), Syntax and semantics 3: Speech acts (pp. 59-82). New York: Academic.
Silva, T. (1988). Comments on Vivian Zamel's "recent research on writing pedagogy": A reader reacts... TESOL Quarterly, 22, 517-520.
Silva, T. (1993). Toward an understanding of the distinct nature of L2 writing: The ESL research and its implications. TESOL Quarterly, 27, 657-677.
Sommers, N. (1982). Responding to student writing. College Composition and Communication, 33, 148-156.
Spack, R. (1990). Guidelines. New York: St. Martin's Press.
Sperling, M. (1994). Constructing the perspective of teacher-as-reader: A framework for studying response to student writing. Research in the Teaching of English, 28, 175-207.
Tannacito, D. (1995). A guide to writing in English as a second or foreign language: An annotated bibliography of research and pedagogy. Alexandria, VA: TESOL.
Villamil, O.S., & deGuerrero, M.C.M. (1996). Peer revision in the L2 classroom: Social-cognitive activities, mediating strategies, and aspects of social behavior. Journal of Second Language Writing, 5, 51-76.
Walker, C.P., & Elias, D. (1987). Writing conference talk: Factors associated with high- and low-rated writing conferences. Research in the Teaching of English, 21, 266-285.
Zamel, V. (1985). Responding to student writing. TESOL Quarterly, 19, 79-102.
Zhang, S. (1995). Reexamining the affective advantage of peer feedback in the ESL writing class. Journal of Second Language Writing, 4, 209-222.