System 29 (2001) 221–234
www.elsevier.com/locate/system
When an exit test fails

G. Braine *

Department of English, The Chinese University of Hong Kong, Shatin, Hong Kong

Received 20 April 2000; received in revised form 10 July 2000; accepted 28 August 2000
Abstract

Within the context of recent concerns that mainstream composition studies in the United States have largely neglected ESL writing, this report describes the performance of ESL students in the exit test of a first year writing program at a US institution. The passing rate, which was high when the exit test was based solely on a prompt, declined sharply when the format was changed to a reading–writing test. The report is based on the analyses of exit test prompts, exit exam transcripts, scoring guides used in writing classes and during the calibration sessions to evaluate exit exams, reading passages, and interviews with students and teachers of first year writing courses. Results of the analysis show that a lack of consistency in the scoring of the exit test, the use of inappropriate reading passages, and careless prompt design contributed to the decline in the passing rate. The report suggests that the employment of new PhDs, who have had little exposure to ESL theory and practice, as directors of Freshman Writing may be detrimental to programs which enroll large numbers of ESL students. The report concludes with a suggestion for the inclusion of some coursework in ESL writing in rhetoric and composition programs. © 2001 Elsevier Science Ltd. All rights reserved.

Keywords: ESL students; Writing programs; First-year writing courses; Exit tests
In a recent article in Written Communication, three well known ESL writing specialists, Tony Silva, Ilona Leki, and Joan Carson (1997), provide convincing evidence that mainstream composition studies in the United States have largely neglected writing in ESL and in other languages. The neglect was seen in conference programs, bibliographies, monographs by the best-known scholars, scholarly journals, and composition textbooks. Silva, Leki, and Carson suggested that such an attitude resulted in a limited perspective of second and other language writing, leading to problems "on both theoretical and practical levels" (p. 402).

* Tel.: +852-2609-7445; fax: +852-2603-5270. E-mail address: [email protected] (G. Braine).

0346-251X/01/$ - see front matter © 2001 Elsevier Science Ltd. All rights reserved. PII: S0346-251X(01)00009-4
Compounding this neglect is the absence of course work in second language writing in most graduate programs in rhetoric and composition. For students in such graduate programs, encounters with ESL writers may occur only when ESL students enroll in the lower-division writing courses these graduate students teach or when they tutor ESL students in writing labs. Thus, often with only fleeting encounters with ESL writers (and no exposure to the theory and practice of ESL writing), new PhDs are sometimes put in charge of Freshman Writing programs which enroll hundreds of ESL students. Such arrangements could be detrimental to the welfare of ESL students; instead of accelerating their acquisition of academic literacy, the practices of these inexperienced administrators could become obstacles to the students' learning process. This report describes the performance of ESL students at the exit test of a first year writing course at a US university, where the Freshman Writing program was administered by a new PhD with little experience with ESL students. The report traces the rapid decline in the passing rate of ESL students when the exit test changed from one based solely on a prompt (as in the TOEFL Test of Written English) to one based on a reading passage. It also analyzes the reasons for the decline, which appear to be due to a lack of consistency in the scoring of the exit test, the use of inappropriate reading passages, and careless prompt design. The report is based on the analyses of exit test prompts, exit exam transcripts, scoring guides used in writing classes and during the calibration sessions to evaluate exit exams, reading passages, and interviews with students and teachers of first year writing courses.

1. Background

Each year, about half a million international students, who are mainly ESL speakers, enroll in US colleges and universities (Desruisseaux, 1994).
A significant percentage of these students are undergraduates, who are required, like their American counterparts, to take a first year writing course. In fact, the presence of at least one or two international students in every first year writing class is more the norm than the exception. ESL students are usually placed in three types of first year writing courses: with native speakers in regular sections (mainstreaming), with basic/developmental writers, and in sections designated for ESL students (see Braine, 1994; Silva, 1993, 1994, for more on the placement of ESL students in first year writing courses).

This report focuses on the first year writing courses at a medium-sized urban university which was founded in 1964. By the mid-1990s, the period covered by this report, the enrollment had grown to more than 12,000 in the University's eight colleges. During the Fall quarters of 1991 through 1994, a total of 6379 first year students were admitted to the University. The College of Arts & Sciences enrolled the highest percentage (33%) of these students. The University aggressively recruits international students. As a result, by the mid-1990s, more than 800 international students, defined as resident and nonresident aliens, were enrolled at the University annually. They were from 86 countries,
68% from Asia and Oceania, 8% from the Middle East, 14% from Europe, and 10% from the Americas. The largest number of international students (120) came from Pakistan, followed by Malaysia (110) and Vietnam (65). At this University, first year writing courses are sequenced as Composition I and Composition II. During the period covered by this study, students were required to pass an exit test on completion of the Composition I course in order to enroll in Composition II. About 2000 students took the exit test annually, most during the fall quarter.

1.1. ESL students in first year writing courses

Up to the fall quarter of 1991, which precedes the period covered by this report, many international/ESL students repeatedly failed the exit test. Concerns about the high failure rate were voiced by the students, the Office of International Students Services, and some academic departments. The main complaint by the students was that the exit test was designed and evaluated unfairly, with no consideration for second language writers, who would have to reach the proficiency levels of native-speaker students in order to pass the test. ESL students also complained of feeling isolated in mainstream sections, receiving little attention from teachers and classmates. The Office of International Students Services was concerned about a possible drop in the enrollment of international students due to the "bad reputation" earned by the Composition I course. Academic departments that required their majors to pass the Composition I and II courses as a prerequisite to further study were concerned because their international students were unable to proceed with coursework in their majors. In response to these complaints, the English Department reserved three sections of Composition I for ESL students from the Winter quarter of 1992. Both immigrants and international students whose native language was not English had the choice of enrolling in ESL or mainstream sections.
As a result, of the 258 ESL students who took Composition I during the 1992/1993 academic year, 168 enrolled in ESL sections and 90 chose to enroll in mainstream sections. The ESL sections were taught by teachers who had voluntarily participated in a 3-day workshop on teaching writing to ESL students.

2. The Composition I course

During the 1991–1992 through 1993–1994 academic years, the period covered by this report, the Undergraduate Bulletin of the university described Composition I as taking "the student through a series of sequenced assignments designed to move from expressive, personal writing ... to informal, expository writing by the end of the quarter." The grading system was satisfactory/unsatisfactory. Both mainstream and ESL Composition I sections had similar course syllabi. During the 9-week academic quarter, students wrote four assignments, starting a new assignment every 2 weeks; in each cycle, the first week was devoted to writing the draft and the second
to revision. Students began an assignment by pre-writing in class and later turned in the first draft to the teachers, who selected a few drafts as model essays for class discussion. Following the discussion, the students holistically scored each other's papers in small groups according to a common scoring guide designed for evaluating student papers (see Appendix A for a sample scoring guide). They later revised the assignment at home and handed the final version to the teacher.

In the first writing assignment of the 1991–1992 and 1992–1993 academic years, students described an activity they disliked, such as standing in line or doing laundry. For the second assignment, they wrote a "meditation" on returning to a place that once had a special significance for them, describing the place as they remembered it and the changes they observed on their return, and discussing the significance of the place then and now. For the third assignment, students summarized a magazine or newspaper article, and for the fourth assignment, they selected a library source on a phenomenon of modern life, such as fast food, computers, or nursing homes, and wrote a "reaction" paper. They were required to begin with a description of their phenomenon and conclude with a reflection, using the library source to support their ideas.

The textbook used in mainstream Composition I sections was Write to Learn (Murray, 1993). In ESL sections, teachers used either In Our Own Words (Mlynarczyk and Haber, 1991) or weekly issues of Newsweek magazine. Both mainstream and ESL sections used The Prentice-Hall Reference Guide to Grammar and Usage (Harris, 1992) as a supplement to the main text. The four writing assignments were only loosely linked to the textbooks; instead, the texts served mainly to introduce students to the writing process and as ready sources of supplementary reading.
Most teachers used handouts consisting of further readings or grammar exercises in addition to the textbooks or Newsweek magazine.

For the 1993–1994 academic year, the Freshman English Committee, which was chaired by the director of Freshman Writing (a recent PhD from a rhetoric and composition program), decided to change the emphasis of Composition I to both reading and writing, although the course description in the Undergraduate Bulletin remained unchanged. In a handout distributed to students enrolled in the course, Composition I was now described as emphasizing invention through personal and autobiographical experiences in the first two assignments and critically responding to a text in the later assignments. The handout listed separate reading and writing objectives for the course. As in the previous year, students continued to write four assignments during the course, spending the usual 2 weeks from draft to final version of each assignment. In the first assignment, titled "Remembering," students described an unforgettable event from their childhood, recalling the event from a child's viewpoint. In the second assignment, "Reflecting," students described a childhood game they played, reflecting on the purpose of such games. The task was "to analyze from a mature point of view the meaning or social purpose of a childhood experience." In the third assignment, "Analyzing through Observation," they analyzed an advertisement from a popular magazine to determine the advertisement's "concealed message." In the final assignment, "Analyzing an Analysis," students analyzed a review of a movie, a TV program, or a concert.
The Critical Eye (Taylor, 1990) was the textbook used by all mainstream sections and some ESL sections. The other ESL sections continued to use weekly issues of Newsweek magazine. As during the previous years, both mainstream and ESL sections used The Prentice-Hall Reference Guide to Grammar and Usage (Harris, 1992) as a supplement to the main text. Again, the four writing assignments were only loosely linked to the textbooks.

3. The exit test

Until Spring 1992, three prompts, designed by the director of Freshman Writing with the help of the Freshman English Committee, were given to the students a week before the exit test. In most sections, students prepared for the test by writing responses to all the prompts, which were discussed and commented upon by the teachers. Only one of the prompts would be given to them at the 2-h exit test. Typical prompts given during this period are shown in Fig. 1.

From the beginning of the 1992–1993 academic year, although the Composition I syllabus remained the same, the format of the exit test was changed to emphasize both reading and writing. This was in anticipation of a change of emphasis in Composition I to both reading and writing from the following year, as proposed by the Freshman English Committee. The process that led to the new reading–writing test began with a screening of suitable reading passages by members of the Freshman English Committee. To be considered for the test, the readings had to be about 1500 words long and judged accessible to all students irrespective of linguistic or cultural background. Three readings were selected, copied, and distributed to all students about a week before the test was administered. The committee then developed
Fig. 1. Sample prompts from Composition I exit test, 1991–1992.
prompts for each reading, and selected one prompt based on one of the readings to be administered at the test. As in previous years, when the test prompts were discussed and analyzed in class, the readings too were discussed extensively and analyzed in Composition I sections. In addition, all the teachers created hypothetical prompts based on the readings, and students wrote practice responses to these prompts, both in and out of class. One of the prompts given in Fall 1992, when the new reading–writing exit test was introduced, is shown in Fig. 2. This prompt was based on a reading titled "Comparing work ethics: Japan and America."

3.1. Calibration sessions

Throughout the period covered by this report, teachers in the English Department participated in a calibration session about 2 weeks before the test. In preparation for a calibration session, the teachers read sample student papers from previous tests. During the calibration session, the same scoring guide used by students to holistically score their papers (in Appendix A) was used in rating the sample papers. These calibration sessions, led by the director of Freshman Writing, lasted about 2 h. They were attended by most of the teachers, who also participated actively in the discussion that followed.

3.2. Test evaluation

The exit test lasted 2 h. Students were identified by a computer-generated three-digit number on the "Blue Book" in which they wrote. They were allowed to use a dictionary during the test, and most ESL students did so. Immediately after each test, a few teachers volunteered to read a sample of student papers, create a second scoring guide (such as in Appendix B) for evaluating the papers, and also select a set of range finders from the sample student papers which related to each level in the scoring guide, all under the guidance of the director of Freshman Writing.
The new scoring guide and copies of the range finders were given to the teachers along with student papers for use during the evaluation. With the help of a computer program, student papers were randomly assigned to the teachers for evaluation. The papers of ESL students were not identified as such, and no special provision was made for their grading. All papers were read twice in the same day, in the morning by half the teachers and in the afternoon by the rest of the teachers. Again, the readers were matched randomly by the computer program. During the fall quarter, when the test was taken by the largest number of
Fig. 2. Prompt from Composition I exit test, Fall 1992.
students, each teacher read about 40 papers. The number dropped to around 25 papers in the Spring quarter. As agreed by the Freshman English Committee, papers receiving a score of 4 or above in the scoring guide passed. For a paper to pass or fail, both readers had to agree; when a disagreement occurred, the paper was read by a third reader on the following day. The process of calibration, adherence to a scoring guide, and the availability of range finders ensured a high rate of correlation (averaging 0.80) between the first and second readers.

3.3. Exit test results

Data were collected from the Winter quarter of the 1991–1992 year, when three sections of Composition I were first reserved for ESL students, until the Winter quarter of 1993–1994, when the exit test was abolished (Fig. 3). In the Winter and Spring quarters of 1992, when the exit test was based solely on a prompt, students in ESL sections performed extremely well, achieving a mean passing rate of 72% (Fig. 3). In fact, their passing rate was better than that of the mainly native-speaker students in mainstream sections, whose passing rate was only 58.5%.¹ According to some teachers, this high passing rate was partly attributable to a strategy used by some ESL students, who wrote responses to all three prompts (given to them a week before the test), memorized the responses, and copied the appropriate response at the test. When the "reading–writing" format was introduced from Fall 1992, ESL students no longer had the advantage of the "memorize and copy" strategy. Nevertheless, in Fall 1992, the passing rate for students in ESL sections was only 2% less than for students in mainstream sections.
Fig. 3. Percentage of ESL students passing the Composition I exit test. Note: The prompt-based test was changed to a reading–writing test in Fall 1992.

¹ See Braine (1996) for a comparison of the performance of native-speaker and ESL students in the exit test.
However, from the subsequent quarter (Winter 1993), the passing rate for students in ESL sections began to decline rapidly. As Fig. 3 indicates, during the 1992–1993 academic year, the overall passing rate fell to 45.5% from the previous year's 72%. (It must be noted that in the same period, the passing rate for the mainly native-speaker students in mainstream sections also fell, from 58.5 to 46%.) By Winter 1993–1994, the passing rate had fallen to a low 29% for ESL students.

4. Discussion

What could the rapid decline in the passing rate be attributed to? The first probable cause could be the dual scoring guides used in the calibration sessions and during the actual scoring of exit test papers. As noted earlier, in preparation for scoring exit test papers, English teachers participated in a calibration session about 2 weeks before the exit test, reading and scoring sample student papers selected from previous tests. The scoring guide seen in Appendix A was used in evaluating the sample papers. However, after the test, a hastily designed scoring guide (Appendix B) was used to score exit test papers. The confusion caused by the dual scoring guides may have been partly responsible for the low passing rate.

However, carelessly designed test prompts may also have led to the low passing rate. For instance, some prompts were as long as 80 words and required students to perform complex tasks within a 2-h time period, undoubtedly a difficult challenge for many novice second language writers. Fig. 4 shows a prompt from the Fall 1993 test that may be illustrative of a poorly designed prompt. An 80-word prompt may have become a challenging reading task in itself, causing anxiety and stress to ESL students writing under time pressure. In addition, the prompt demanded multiple writing tasks, requiring the writers to explain the quotation, describe a special time for the student writer, discuss the special time, and relate the special time to the writer's review of White's essay.
A comparison of this prompt with the prompt for Fall 1992 (in Fig. 2), when 56% of the ESL students passed the exit test, shows a stark contrast. The prompt for Fall 1992 consisted of a mere 38 words and did not demand the multiple writing tasks that the Fall 1993 prompt did.

The passing rate for ESL students plummeted to 29% in Winter 1994, probably due to an inappropriate reading passage for the test and poor prompt design. One of
Fig. 4. Prompt from Composition I exit test, Fall 1993.
the readings for the exit test was a seven-page, 47-paragraph essay titled "Thinking as a Hobby" by William Golding. In the essay, Golding classifies thinking into three grades. Grade three thinking is described as feeling: thoughts full of prejudice, ignorance, and hypocrisy. Twenty-nine of the 47 paragraphs are given to describing grade three thinking with vivid anecdotes and examples. Grade two thinking is described as the detection of contradictions. Again, Golding illustrates grade two thinking with anecdotes and examples in six lengthy paragraphs. However, grade one thinking is described with one anecdote, Golding's meeting with Einstein, who is identified as an "undeniably grade one thinker." During the meeting, Golding and Einstein stand side by side on a small bridge at Oxford University in near silence for about 10 min, watching the stream below. Finally, Einstein points to a fish in the stream, utters the word "Fish," and ambles off. No other anecdotes or examples are cited, and the entire description takes only five short paragraphs. Nevertheless, the test prompt based on this reading is the one shown in Fig. 5.

To expect novice ESL writers barely out of their teens to have met a "grade one" thinker of Einstein's caliber would be unreasonable; asking them to describe and discuss "grade one" thinking "vividly and thoroughly" is, to say the least, unfair. Thus, it comes as no surprise that the passing rate at this test was only 29% for ESL students. The students' desperation when confronted with this prompt could be seen in the 20 exit test papers I examined for this report. A number of students referred to the difficulty of the prompt in their responses. One student wrote:

To tell something about my personal experience is really hard for me because I am not old enough to gather experience and it is almost out of my dreams to have a personal experience with a grade one thinker.
Other ESL students resorted to creative writing, identifying the following individuals as grade one thinkers: a private tutor; a rich child named Peter who read a newspaper at age one and history books at age three; a mathematics teacher; a classmate who was a genius; a middle-school friend named Jim; a classmate named Hiroshi who wore eyeglasses and looked intelligent; a physics teacher; an economics teacher; a chemistry teacher; a classmate who designed a model ship; and a truck driver named John. There is no doubt that many of these were fictitious characters hurriedly invented during the test.

I interviewed two teachers who taught ESL Composition I sections in Winter 1994 and asked for their reactions to the prompt. They were both surprised that "grade one thinking" was the focus of the prompt, since the reading referred to such thinkers
Fig. 5. Prompt from Composition I exit test, Winter 1994.
only briefly, and the only example given of a grade one thinker was Einstein. Therefore, they had paid little attention to grade one thinkers during class discussions of the Golding essay. They said that it was difficult to explain Einstein's genius without a sound knowledge of physics. Both teachers felt that their students were unfairly penalized by the prompt.

Later, I also interviewed a number of ESL students who had taken the exit test. They too said that they did not expect a prompt on grade one thinkers. Since grade two and grade three thinkers were discussed in greater detail in the Golding essay, they could relate such thinkers to people they personally knew and could describe these people with anecdotes. Although they knew Einstein to be a genius, they did not know enough about Einstein to relate his thinking habits to someone they knew personally. Many of these students stated that they were unlikely to meet anyone of Einstein's caliber in their lifetime and admitted that when they saw the test prompt, they created a character whom they could label as a grade one thinker. But, since the Golding essay did not clearly state what grade one thinking was, they could not proceed beyond naming or physically describing their fictitious character. When asked why they had not asked their teachers to describe grade one thinkers, one student said that he had. However, the teacher simply named Socrates without explaining why he was considered a grade one thinker.

The TOEFL Test of Written English (TWE) provides an interesting contrast to the haphazard way in which these exit tests were conducted. In use since 1986, the TWE has been administered to more than 2.5 million test takers and has been scored more than 5 million times. It is a 30-min essay test administered along with the multiple-choice TOEFL, which is used by higher education institutions to evaluate the English proficiency of students whose native language is not English.
Like the exit tests described in this study, the TWE is criterion referenced and scored holistically according to a 6-point scoring guide. Each paper is scored twice, each time by a different reader. When the scores differ by more than one point, the papers are sent to a third reader to resolve the discrepancy (Test of Written English Guide, 1996). TWE prompts are developed by a team of ESL writing specialists, who also review the prompts for "accuracy, content appropriateness, suitability of language, and difficulty" (p. 6). Rigorous pre-testing ensures that the topics are "fair, accessible, and appropriate" to all (international) test takers. Further, the TWE Scoring Guide ensures that the scoring standards are consistent and have a high inter-rater reliability. Although the University where the exit tests were conducted does not have the financial resources of the Educational Testing Service, which conducts the TWE, careful prompt development and review, pre-testing on a small scale, and the consistent use of a single scoring guide would have ensured the fairness, accessibility, and appropriacy of the exit tests.

5. Conclusion

The plummeting passing rate evoked protests from students, parents, and teachers from subject departments which required first year writing courses as prerequisites
to their programs of study. As a result, the exit test for Composition I was abolished at the end of the Winter 1994 quarter. This report indicates that a lack of consistency in the scoring of the exit test, the use of inappropriate reading passages, and careless prompt design contributed to the rapid decline in the passing rate.

The first year writing course, a standard requirement in American tertiary education, is a challenge to ESL students. When successful completion of the course is a prerequisite for higher level courses, or when exit from the course is determined by a competitive test, it can be an obstacle to the students' academic success. Writing program administrators who are sensitive to these problems often take careful measures to "level the playing field," ensuring that ESL students have a fair chance to succeed in these courses. At most institutions, special sections of first year writing courses for ESL students take years to establish. Skeptical administrators, members of curriculum committees, English department chairs, and writing course administrators have to be convinced that such courses would not segregate ESL students while also not favoring them (Braine, 1994). Even after they are established, these courses are often fragile, open to scrutiny and criticism. At best, when ESL students do better than native speakers at an exit test, for instance, there is bound to be envy, which can create problems. On the other hand, when ESL students appear to be faring badly, as in the case described above, the very existence of ESL courses will be threatened.

At the beginning of this report, I quoted Silva, Leki, and Carson (1997), who provide evidence that mainstream composition studies have largely neglected writing in ESL.
Nevertheless, Silva, Leki, and Carson also remind us that mainstream composition studies have embraced and thereby been enriched by "feminist, critical, postcolonial, and postmodernist perspectives imported from literary and cultural studies" (p. 399). In fact, mainstream composition studies have also promoted multiculturalism and diversity in recent years. Yet, as this report indicates, more remains to be done. The inclusion of some course work in ESL writing in rhetoric and composition graduate programs, so that new PhDs will avoid debacles such as the one reported here, would be another step in the right direction.

Appendix A. Scoring guide used by students in Composition I classes and by teachers in calibration sessions

A.1. 6 pt. essay

- establishes a context for the essay by providing background and purpose and a distinct subject.
- is rich in detail.
- is well organized, easy to follow, easy to read.
- is virtually free of spelling, punctuation, sentence, and paragraph errors.
- operates on a high level of significance.
- answers all the reader's basic questions.
- has a first sentence that makes you want to read the second.
- presents a clear and significant position.
A.2. 5 pt. essay

- establishes a context for the essay by providing background and purpose and a distinct subject.
- has many details.
- is generally well organized, easy to read, and easy to follow.
- is generally free of spelling and punctuation errors; also free of sentence and paragraph errors.
- answers most of the reader's basic questions.
- has a subject that may be fuzzy.
- has a good first sentence, but it may lack originality.
- has a clear but not necessarily significant position.

A.3. 4 pt. essay

- answers some but not all of the reader's basic questions.
- establishes a context for the essay by providing background and purpose and a distinct subject.
- has some details.
- is organized, although not as easy to follow as a 5 pt. essay.
- has occasional errors in punctuation, spelling, sentences, and paragraphs.
- the author seems to be holding more than one position.
- has an uninteresting first sentence.

A.4. 3 pt. essay

- doesn't answer nearly enough questions.
- fails to establish a context by providing background information.
- is not very detailed.
- has frequent errors in spelling, punctuation, sentences, and paragraphs.
- doesn't have a subject; the writer may be attempting the impossible.
- the first sentence is boring.

A.5. 2 pt. essay

- will significantly compound the problems of a 3 pt. essay.
- doesn't have a subject.

A.6. 1 pt. essay

- lacks background.
- lacks purpose.
- lacks detail.
- may be off topic.
- composition is trivial.
- is unorganized and hard to follow.
- has serious surface errors in spelling and punctuation, and in sentence and paragraph structure.

Appendix B. A scoring guide for evaluating the exit test, Fall 1992

EH 101 Test Scoring Guide

6. Excellent discussion and analysis of Ouchi's view. Excellent details in the personal account. Few if any writing mistakes. Originality and insight. The discussion engages the text on an intellectual plane. It's not a report. Focus, order, and development are excellent.

5. Good discussion and analysis of Ouchi's view. Solid focus, order, and development. Not the same level of analysis and discussion as a 6. Personal account is detailed and relevant to the discussion. A few but not critical writing mistakes.

4. Fair discussion and analysis of Ouchi's view. Discussion of Ouchi's view must be present for passing; this is necessary to fulfill the demands of the topic. Focus, order, and development are adequate and reflect competence. Writing mistakes are more numerous than in a 5 paper.

3. Weak discussion of Ouchi's view. In fact, there may be no discussion at all and the personal account dominates the paper. Or there may be a discussion but no personal account. Few details. Not working well (or at all) with the text. Or a paper that may otherwise be a 4 except for numerous writing mistakes.

2. Seriously deficient discussion and personal account. Few if any details. Writing needs remediation.

1. Titanic.

Attached to the scoring guide are six anchor papers, chosen as "range-finders" to help you see distinctions between papers. A passing paper would earn a 4 or above; obviously, a failing paper would receive a 3 or below.

References

Braine, G., 1994. ESL students in first year writing courses: an evaluation of the placement options. TESL Reporter 27, 41–49.
Braine, G., 1996. ESL students in first-year writing courses: ESL versus mainstream classes. Journal of Second Language Writing 5, 91–107.
Desruisseaux, P., 1994. U.S. enrolls record number of foreign students.
The Chronicle of Higher Education 23 (November), A38.
Harris, M., 1992. Prentice-Hall Reference Guide to Grammar and Usage. Prentice Hall, Englewood Cliffs.
Mlynarczyk, R., Haber, S., 1991. In Our Own Words. St. Martin's Press, New York.
Murray, D., 1993. Write to Learn, 4th Edition. Harcourt Brace, Orlando, FL.
Silva, T., 1993. Toward an understanding of the distinct nature of L2 writing: the ESL research and its implications. TESOL Quarterly 27, 627–656.
Silva, T., 1994. An examination of options for the placement of ESL students in first year writing classes. Writing Program Administration 18, 37–43.
Silva, T., Leki, I., Carson, J., 1997. Broadening the perspective of mainstream composition studies. Written Communication 14, 398±428. Taylor, S., 1990. The Critical Eye. Harcourt Brace, Orlando, FL. Test of Written English Guide, 1996. Educational Testing Service, Princeton, NJ.