Computers & Education 58 (2012) 459–469
Reflective behaviors under a web-based portfolio assessment environment for high school students in a computer course

Chi-Cheng Chang*, Cheng-Chuan Chen, Yi-Hui Chen

Department of Technology Application and Human Resource Development, National Taiwan Normal University, No. 162, He-Ping E. Road, Section 1, Taipei 106, Taiwan, ROC
Article history: Received 9 February 2011; received in revised form 16 August 2011; accepted 17 August 2011.

Abstract
This research attempted to categorize reflection in a Web-based portfolio assessment using the Chinese Word Segmenting System (CWSS). Another aim of this research was to explore reflective performance, in which individual differences were further examined. Participants were 45 eighth-grade students from a junior high school taking a computer course. The results indicated that the words used most often in reflective journals fell into the cognition and evaluation categories rather than the emotion and memory categories. Based on lexical attributes, reflection was thus classified into cognition, evaluation and mix types. Cognition was the most common type and evaluation the least; emotion and memory types failed to emerge. Although reflective journals tended to be short, the average scores on reflection were acceptably high, which implied that it was the quality rather than the length of a reflective journal that students were primarily concerned about. In addition, significant group differences were detected in terms of word counts and reflection scores. Reviews of peer reflections were seldom carried out, covering merely one-third of the peer work, and there were significant group differences in the number of reviews. The duration of peer reflection reviews was usually short, and again, significant differences were found across the duration groups. © 2011 Elsevier Ltd. All rights reserved.
Keywords: Portfolio; Portfolio assessment; Web-based portfolio; Reflection; Chinese word segmenting
1. Introduction

The developmental process of learning portfolios entails projection of purpose, collection, selection, reflection, and presentation (Barrett, 2010; Barrett & Garrett, 2009; Falls, 2001). It is critical that portfolio development allows students to establish learning goals and to identify their strengths as well as weaknesses. Falls (2001) noted that reflective practice plays an indispensable role in portfolio creation, for it is particularly instrumental for students. Tomkinson (2002) presented a four-stage scheme of portfolio development: reflection, inspection, reaction, and documentation.

It is commonly accepted that one of the advantages of portfolio assessment is the promotion of learner reflection (Coombe & Barlow, 2004; Lopez-Fernandez & Rodriguez-Illera, 2009; Tubaishat, Lansari, & Al-Rawi, 2009; Wang, 2009). Web portfolios, in turn, can be regarded as an authentic process that documents and fosters reflective thought (Avraamidou & Zembal-Saul, 2002; Carroll, Markauskaite, & Calvo, 2007; Milman, 1999; Morris & Buckland, 2000; Zembal-Saul, 2001). A study conducted by Hawkes and Romiszowski (2001) showed that computer-mediated reflections achieved a significantly higher level than did face-to-face reflections. Hence, learners' involvement in reflective activities should be valuable and beneficial in the context of Web-based portfolio assessment.

Given that learner reflections are associated with cognitive and emotional elements, an issue has been raised regarding how to organize, classify and evaluate reflection contents. With respect to reflective behaviors, Morgan (1999) reported a hierarchy with four levels of reflection: Not qualified, Fair, Good, and Excellent. Teachers from South Brunswick schools adopted assessment criteria comprising purpose of reflection, supportability, systematicity, sentence structure and vocabulary, and writing mechanics, each of which was accompanied by a four-level scheme (King-Shaver, 1999). Cheng (2002), who assessed reflection in a Web-based portfolio setting, discovered that students generally performed at a medium level and failed to attain higher levels of reflection. This result has been supported by several researchers. Lin (2004) indicated that although both shallow and deep types of reflection were found, students tended to produce the former more frequently.
* Corresponding author. Tel.: +886 2 77343439; fax: +886 2 23921015. E-mail addresses: [email protected] (C.-C. Chang), [email protected] (C.-C. Chen), [email protected] (Y.-H. Chen).
0360-1315/$ - see front matter © 2011 Elsevier Ltd. All rights reserved. doi:10.1016/j.compedu.2011.08.023
Likewise, Wood (2001) concluded that students were more likely to engage in pre-reflection than in quasi-reflection or reflection. A study conducted by Li (2002) suggested that lower levels of reflection appeared to be more accessible to high school students. In other words, previous investigations have revealed a tendency of reflectors toward lower levels of reflection.

When it comes to the types of reflection, a number of researchers have made classifications from multiple perspectives. Santos (1997) categorized reflection into four processes. The reviewing process refers to the improvement of learning in which one looks at the past and learns from it in order to avoid the same mistakes; the contemplative process engages one in self-observation or introspection; the comparing process concerns a learner's self-examination of the extent to which anticipated goals are fulfilled; the judging process involves self-assessment of learning performance and progress. On the other hand, Kember (1999) treated reflection as a double-faced notion containing reflective and non-reflective action; more specifically, reflective action is made up of several subdivisions, such as content, process, and premise reflection. According to Lin's (2004) categorization, students' journals were generally characterized by affective, self-aware, and integrative reflection. In a study of vocabulary analysis, Avraamidou and Zembal-Saul (2002) identified a four-stage trend in which reflection authors shifted from being descriptive to being explanatory, reflective, and elaborative. The studies of Chen, Kinshuk, Wei, and Liu (2010) and Hsieh, Jang, Hwang, and Chen (2011) evaluated the quality of reflection based on five levels: reporting, responding, relating, reasoning, and reconstructing.

As described above, many types of reflection have been proposed in different contexts. Recognizing the importance of learning reflections, researchers have classified reflections into various levels and types. Due to the diversity of reflective practice, learners are expected to perform differently, leading to a variety of evaluation methods and ways of classification. In the context of Web-based portfolio assessment, however, what levels and types of reflection take place, and how can they be appropriately categorized? Another unanswered issue concerns the representation of reflective behaviors and applicable methods of evaluation. Rama and Battistoni (2001) suggested that the number and duration of reflective practices can serve as useful indicators in measuring reflective performance. Thus, to evaluate reflections, assessors should take into account not only the quality of reflection work, but also the word count of reflective journals as well as the number and duration of peer reviews. On the other hand, a study conducted by Irby and Brown (1999) revealed that students had similar lengths of reflective journals, drawing our attention to the question of whether there are individual differences in reflective performance.

In response to the issues discussed above, the purpose of this study was to investigate reflective behaviors (types and performances), and to examine differences in individual reflective performance. The research questions are as follows:
1. Into how many types can the reflections in reflective journals be classified? What is the frequency of use of each reflection type?
2. What is students' reflective performance in terms of the number of words in journals, the number and duration of peer reflection reviews, and the overall quality of reflection?
3. Are there significant individual differences in reflective performance?
2. Literature review

First, this section presents the relationship between Web-based portfolio assessment and online reflection. Second, the analysis of reflection is discussed, including the application of discourse analysis to reflection and research on discourse analysis. Finally, types of reflection and contents of reflection are presented.

2.1. Web-based portfolio assessment and online reflection

A portfolio assessment refers to the examination of a systematic collection of student work that documents evidence of artifacts, reflections, learning growth and achievements (Chang, 2008; Chang & Tseng, 2009a, 2009b; Eppink, 2002). It is advisable that a Web-based portfolio assessment incorporate guidelines, learning goals, artifacts, work display, assessment rubrics and scoring. Moreover, to enhance critical reflection, learners have to engage in the following activities: journal writing, online discussion, peer-assessment and feedback, and self-evaluation (Eppink, 2002). By doing so, the authenticity of Web-based portfolio assessment will be enhanced. According to Eppink (2002), a portfolio assessment plays a productive role, since it encourages reflective practice, which in turn fosters learners' metacognitive and introspective awareness. In discussing the advantages of portfolio assessment, McAlpine (2000) believed that the implementation of this assessment tool urges students to reflect upon the process of learning and assessment; Chang and Tseng (2009a) showed that a portfolio assessment is significantly instrumental in developing both reflective and writing skills. What makes portfolio assessment favorable is the process of portfolio creation, in which learners are allowed not only to establish learning goals, but also to ruminate on their feasibility and on ways to achieve them. Meanwhile, it also sharpens reflective ability as learners comment on their own artifacts and give reasons why certain works were put into their portfolios. Taken together, portfolio assessments offer learners a wide range of benefits, one of which is the facilitation of critical reflection (Coombe & Barlow, 2004).

2.2. Application of discourse analysis in reflection

2.2.1. Discourse features in Chinese
In computational linguistics, Chang, Chen, and Huang (2000a) proposed a new representation of lexical knowledge, known as the Module-Attribute Theory of Verbal Semantics (MARVS), in which they noted that syntactic expressions are restricted to the core verbs regardless of the content of an event. In the analysis of speech acts, Chen (2001) suggested that researchers pay attention to the forms of verbs and the events derived from core verbs. All verbs in Mandarin Chinese can be divided into stative and active verbs (Institute of Information, 2007). A verb whose modifiers incorporate adjectives is grouped as stative; otherwise, it is counted as an active verb. More specifically, stative verbs tend to express several kinds of psychological states; hence they are further differentiated into cognition, memory, emotion, and evaluation (Chang, Chen, & Huang, 2000b).
In summary, core words shape the features of a discourse, and they carry the associated sentence meanings and messages. Therefore, in order to pinpoint reflective types and idiosyncrasies, stative verbs play a decisive role in analyses of the latent and patent forms associated with reflections.

2.2.2. Research on discourse analysis
Hawkes and Romiszowski (2001), in studies comparing face-to-face with computer-mediated teacher discourse, strove to determine the nature of the discourse produced in both settings and its reflective levels. The results showed that the computer-mediated dialog was significantly more reflective than the face-to-face discourse. Cheng (2002), who employed an analytical scheme derived from Sparks-Langer, Simmons, Pasch, Colton, and Starko (1990) and Kember (1999), reported that a high level of reflection did not emerge in the context of portfolio learning, whereas a medium level was repeatedly attained by learners. Chung (2003) adapted Henri's (1991) model for discourse analysis of computer-mediated conferencing, focusing on learners' communicative behaviors, and further distinguished the content of discourse into three dimensions, namely task-oriented, social and procedural dialogs. To summarize online interactive behaviors, Huang (2002) explored relevant speech acts from three perspectives. Procedure, which makes the purposes of a discussion achievable, relates to the manner and process in which communication takes place. Task refers to dialogs or conversations related to the ongoing content of the subject matter. Social messages were defined as statements, having little to do with the formal discussion, used for the expression of feelings. Chen (2001) investigated the correlation between communication patterns and leadership styles, and the Chinese Word Segmenting System (CWSS) was utilized to extract types of stative verbs. Based on a speaker's verb usage, patterns of communication were classified into emotion, cognition, memory, and evaluation types, and leadership styles were affected by the degree to which cognition and evaluation words were used in a dialog.

It has long been recognized that language and thought are interdependent and inseparable; meanwhile, reflective practice is considered the documentation of both critical thinking and language use. This study, therefore, attempted to employ the CWSS in order to shed light on learners' mental processes and representations as well as the lexical characteristics of reflective journals. Moreover, previous studies on discourse analysis and their significant findings served as references for our categorization of reflection.

2.3. Types of reflection
Kember (1999) divided reflection into reflective and non-reflective action. Three types of non-reflective action were categorized, i.e. habitual action, thoughtful action and introspection. There are three subdivisions of reflection (content, process, and premise reflection), among which premise reflection was considered a higher level of reflection because it involves a learner's awareness of why he or she perceives, thinks, feels, or acts. Content reflection deals with "what" one reflects on, while process reflection examines "how" reflection is performed. Langer (2002), however, regarded this framework as flawed, considering that reflective action cannot be assured once "perceived presence" fails to emerge in learners' written journals.

In addressing the Reflective Judgment model, Wood (2001) identified three levels of individual reflection: pre-reflection, quasi-reflection, and reflection. Wood (2001) found that higher levels of reflection appeared to become more accessible as the educational level advances. For example, high school students showed a preference for pre-reflection; reflectors at the college level lay between the first and second levels; graduate students fell into the area between the second and third levels of reflection. Li (2002) noted that reflection can be seen as a metacognitive activity comprising three categories, i.e. awareness, comparison, and evaluation. Awareness is concerned with thinking about past events or experiences for which concrete examples are given. Comparison refers to the examination of cognitive development after a learning experience. Evaluation involves assessing and accounting for one's progress or changes in self-cognition. In Li's (2002) study, the results suggested that lower levels of reflection seemed more achievable than higher levels. It was also discovered that students who made progress were characterized by "inferential reflection", while those who did not were characterized by "intuitive reflection". Lin (2004), on the other hand, outlined three kinds of reflection in terms of the features of reflective journals. Reflectors belonging to the affective type are inclined to convey feelings or emotions, mostly positive ones. Self-aware reflection concerns the ability of thinkers to articulate and reflect upon their own learning and thinking processes. Integrative reflection refers to the combination and application of acquired knowledge and real-life experiences, and learners of this type are likely to become actively involved in teacher-learner interaction. In her study, Lin (2004) also separated learning reflection into shallow and deep levels, and the former was more commonly produced by students.

Reflections, as a multifaceted assessment tool, document learners' real-world experiences covering both cognitive and affective domains (Tomkinson, 2002). In Lin's (2004) view, the vocabulary used by a learner in fact mirrors what he or she perceives and feels. In order to gain deeper insights, the contents of reflective journals should be examined in alignment with the characteristics of speech acts. In the categorization of reflection, five dimensions of speech acts were considered in this study, namely cognition, emotion, evaluation, memory, and mix.

2.4. Contents of reflection
According to Falls (2001), a rubric empowers reflection authors to realize what should be incorporated in journals, and to track their efforts or accomplishments in and outside the classroom. Rama and Battistoni (2001) suggested that learners consider a wide range of aspects in their reflective practice, e.g. learning outcomes, reflection approaches, the number and duration of reflections, peer/teacher feedback, and assessment criteria. In order to enhance the authenticity of reflection, Reckase (2002) provided learners with three powerful reflection questions: How much effort did you put into the learning process? What knowledge or skills did you acquire from it? What are the things that you think you are able to do now? Accordingly, Whitfield (2000) indicated that reflectors have to become aware of the weaknesses and strengths in their own learning and thinking processes, find suitable remedies for the weak elements, and at the same time strengthen the strong ones.
Wu (2008) concluded that assessors must scrutinize and evaluate the contents of reflection from multiple dimensions, i.e. students' achievements and self-expectations, learning problems and the solutions proposed for them, comments on peer work, and responses to teacher/peer feedback. Reflection is a form of mental processing that can be exhibited in a number of ways. Either way, instructors need to provide learners with the purposes of and guidelines for reflection, since these form a basis for writing journals and make the evidence of reflection solid and well constructed. In addition to the aforementioned, this study further introduced several aspects associated with the evaluation of reflective behaviors, namely the word counts of reflective journals and the number and duration of peer reflection reviews.

3. Method

3.1. Participants
The subjects were 45 eighth-grade students taking a computer course at a junior high school. The study lasted 10 weeks, with two sessions per week. Students were required to undertake portfolio creation and self- and peer-assessment on the Web-based portfolio assessment system. To develop a portfolio, the students had to engage in activities such as goal setting, reflection, and online artifact submission. The two-unit computer course addressed "Computer Animation" and "Timeline Control"; for each unit, an assignment had to be completed using software such as PhotoImpact and Dreamweaver MX. The course content covered not only technical skills but also the cognitive and affective domains, which in turn offered students an ideal scenario for reflective practice.

3.2. Framework
In this study, the CWSS was employed to sort out the different types of reflection produced during students' use of the Web-based portfolio assessment. In addition, the researchers retrieved students' records of reflective performances and scores from the Web-based portfolio assessment system in order to statistically compare possible differences in reflective performance. The research variables involved in this study were as follows:
1. Reflective types were derived from the CWSS, including the cognition, emotion, evaluation, memory, and mix types.
2. Reflective performance was concerned with the word count of a reflective journal, the number and duration of peer reflection reviews, and the scores on reflection.
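For illustration only, the sketch below shows one way such a per-student performance record could be represented; the field names and the Python representation are our own assumptions and are not part of the assessment system described in the paper.

```python
from dataclasses import dataclass

@dataclass
class ReflectivePerformance:
    """One student's reflective-performance record (illustrative field names only)."""
    student_id: str
    journal_word_count: int      # words written across both unit journals
    peer_review_count: int       # number of peer reflection reviews performed
    peer_review_minutes: float   # total duration of those reviews, in minutes
    reflection_score: float      # score obtained on the reflection questionnaire

# Hypothetical record, loosely echoing the individual averages reported in Section 4.
example = ReflectivePerformance("1104", 632, 9, 34.1, 72.8)
print(example)
```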
3.3. Procedure of experiment (see Table 1)
1. Stage 1 (preparation): The teacher began the course with a brief overview of learning portfolios and provided tips for writing reflective journals. After the course introduction and a demonstration of the Web-based portfolio assessment system, the teacher helped students become familiar with the system functions through hands-on experience.
2. Stage 2 (Unit 1): The first unit, "Computer Animation", was taught in alignment with the assessment system. Outside the classes, the students were responsible for a number of course activities, such as portfolio development, online discussion, peer reflection reviews, and self- and peer-assessment. To create a portfolio, students completed the required work by filling out the available forms, including goal setting, reflection writing, artifact submission, and so on. Afterward, the teacher and assistants assessed students' performances on the basis of portfolio contents and learning behaviors.
3. Stage 3 (Unit 2): For better learning efficiency, the teacher gave instructions and assistance focusing on the problems or difficulties students had encountered in the previous course unit. Subsequently, the unit on "Timeline Control" was started, with the students repeating the same coursework and activities as in Unit 1.
4. Stage 4 (oral presentation): In this stage, students were required to deliver an oral presentation covering their portfolio contents, and to share experiences as well as advice on portfolio development.
3.4. Data gathering and analysis

3.4.1. Data collection
The recorded information available on the assessment system was collected, including students' written journals, scores, and reflective performances centering on the number of words in journals as well as the number and duration of peer reflection reviews.

3.4.2. Data organization
1. Data organization: The researchers began by carefully examining the reflective journal documents and correcting incorrectly written words. Reflective contents were then compiled according to their common attributes and organized into textual data.
2. Unit of analysis: In this study, a "word" (vocabulary item) was chosen as the minimal unit of analysis, based on the CWSS.
Table 1. Experimental procedures.

Stage 1 (preparation), Week 1
Teacher: 1. Prepared each material and activity on the Web-based assessment system. 2. Introduced learning portfolios, assessment methods and strategies for writing reflections. 3. Showed detailed course outlines. 4. Demonstrated how to use the system. 5. Assisted students in operating the system, and delineated scoring criteria.
Student: 1. Had basic understandings of learning portfolios, assessment methods and strategies for writing reflections. 2. Tried out the assessment system. 3. Familiarized oneself with the assessment rubrics.

Stage 2 (Unit 1), Weeks 2-5
Teacher: 1. Had Unit 1 started with the use of the assessment system. 2. Recapitulated scoring criteria, and offered assistance if needed.
Student: 1. Completed assigned course activities, such as developing a portfolio that contained learning goals, artifacts, reflection journals. 2. Uploaded preliminary, tentative and revised versions of artifacts.
Teacher: 1. Set a deadline for Unit 1 assignments, and sent reminders for late submission. 2. Determined grouping for peer-assessment.
Student: Turned in revised assignments, if applicable.
Teacher: 1. Reviewed learning portfolios. 2. Checked students' participation and performances. 3. Gave scores.
Student: 1. Observed peer portfolios, and learned from it. 2. Undertook self-assessment. 3. Anonymously conducted group-to-group peer-assessment.

Stage 3 (Unit 2), Weeks 6-9
Teacher: 1. Had the unit started with the use of the assessment system. 2. Recapitulated scoring criteria, and offered assistance if needed. 3. Provided additional instructions and remedial solutions aiming at students' problems encountered at Unit 1.
Student: See Stage 2.

Stage 4 (presentation), Week 10
Student: Orally presented portfolio contents, and shared opinions or ideas about the process of creating it.
Teacher: 1. Offered timely assistance during the presentation session. 2. Evaluated and marked students' portfolios and presentation performances.

[Fig. 1. Flowchart of data processing: reflective journals are input to the CWSS, which outputs part-of-speech tagging; stative verbs (intransitive verbs, causative verbs, transitive verbs, etc.) are extracted as vocabulary types (cognition, memory, emotion, evaluation) and then classified into reflection types (cognition, evaluation, mix).]
3.4.3. Data analysis (see Fig. 1)
1. Input: The contents of reflection were uploaded to the CWSS.
2. Word segmentation: The CWSS marked up the words in the reflective journals with the corresponding part-of-speech tags.
3. Extraction: The researchers specifically focused on the various types of stative verbs in Chinese, e.g. intransitive verbs, causative verbs, transitive verbs, etc.
4. Classification and frequency count of vocabulary: The researchers, with the help of Mandarin teachers and experts, grouped the vocabulary (i.e. stative verbs) into types and counted the frequency of use of each type of word within a reflective journal. The following formula was used to verify the consistency between the two experts. In terms of the total transfer from stative verbs to mental lexicons for all students, the consistency between the two experts was sufficient; in terms of the transfer for each individual student, the consistencies between the two experts were likewise all sufficient.

Percent of consistency = 2 x (number of consistent classifications) / (total number of classifications) = 2 x 2139 / (2516 + 2516) = 4278 / 5032 = 0.85
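To make steps 2-4 and the consistency check concrete, the following sketch (in Python, which the study does not use; the CWSS is a separate service) parses segmented output in the "word (TAG)" layout shown in Appendix 1, keeps only stative-verb tags as they appear in the appendices (VH, VJ, VK), tallies mental-lexicon categories, and computes the consistency ratio defined above. The word-to-category dictionary is a small placeholder of our own; in the study this assignment was made by the researchers together with Mandarin teachers and experts.

```python
import re
from collections import Counter

# Tokens in the segmented output look like "XiWang (VK)"; see Appendix 1.
TOKEN_RE = re.compile(r"(\S+)\s*\((\w+)\)")

# Stative-verb tags as they appear in Appendices 1 and 2 (active tags such as
# VA/VC/VE are ignored for the psychological-state analysis).
STATIVE_TAGS = {"VH", "VJ", "VK"}

# Placeholder word-to-category mapping (romanized forms are ours).
CATEGORY_OF = {
    "XiWang": "evaluation",   # 'hope', listed under Evaluation in Appendix 2
    "JueDe": "cognition",     # 'feel', listed under Cognition in Appendix 2
    "GanXie": "emotion",      # 'appreciate', listed under Emotion in Appendix 2
}

def count_categories(segmented_text: str) -> Counter:
    """Count mental-lexicon categories over the stative verbs in one journal."""
    counts = Counter()
    for word, tag in TOKEN_RE.findall(segmented_text):
        if tag in STATIVE_TAGS and word in CATEGORY_OF:
            counts[CATEGORY_OF[word]] += 1
    return counts

def percent_consistency(agreements: int, total_a: int, total_b: int) -> float:
    """Inter-expert consistency: 2 * agreements / (total_a + total_b)."""
    return 2 * agreements / (total_a + total_b)

print(count_categories("Wo (Nh) XiWang (VK) Wo (Nh) NengGou (D) JiaoQi (VC)"))
print(round(percent_consistency(2139, 2516, 2516), 2))  # 0.85, as reported above
```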
5. Types of reflection: For this part, we continued to examine the categorization of vocabulary through repeated comparison, organization and cross-validation. We removed repeatedly categorized words and eliminated barely used vocabulary types. Three types of reflection were ultimately determined based on a self-developed guideline (see details in the Results and Discussion section).

3.5. Research instruments

3.5.1. Web-based portfolio assessment system
The functions of the system are as follows: (1) 'Portfolio Guidance' assists users by illustrating the process of creating portfolio entries. (2) 'Portfolio Creation' comprises a student profile, learning goal setting, uploading of artifacts, reflective writing, and other content creation (e.g. anecdotes, Website sharing, e-document sharing, achievement test outcomes, and other entries). (3) 'Portfolio Assessment' encompasses student self-assessment, peer-assessment and teacher assessment. Students may review portfolio contents such as digital artifacts, reflections, learning goals and other entries prior to assessment in order to get to know their peers' work. (4) 'Portfolio Score' consists of scores obtained by student self-assessment, peer-assessment and teacher assessment. Students may access these scores. (5) 'Course Descriptions' comprises the names of course units, the course syllabus, course content outlines, and teacher profiles. (6) 'Online Discussion Board' includes course discussion and portfolio discussion.

Basically, the mechanism for online reflection includes writing assistance, a writing editor, review, feedback, and evaluation. Regarding writing assistance, the system provides students with an outline for writing their reflections, covering reflection on learning goals, learning outcomes, learning attitude, peer performance, and feedback. The system also provides suggested keywords and frequently used sentences to make it easier for students to write reflections. With regard to the review function, when students write a reflection, the system allows them to simultaneously enter the "Portfolio Assessment" area to review their own and peers' portfolio contents (i.e. learning goals, artifacts, self-assessment, peer-assessment, teacher assessment, online participation records, etc.) created earlier in the system. Based on comparisons between their own and their peers' portfolio contents, students can write reflections more easily. When finishing a reflection, students may click the "Save" or "Save and Send" button, and the system will automatically save the reflection in their personal portfolios. After finishing their reflections, students may enter the "Portfolio Assessment" area to see their own and peers' reflections. With regard to the feedback and evaluation functions, the system allows students to grade and comment on their own and peers' reflections.

3.5.2. Web-based portfolio assessment questionnaire
1. Assessment questionnaire: The Web-based portfolio assessment questionnaire created by Wu (2008) was adopted in order to measure students' learning performances based on the quality of their portfolios.
Table 2. Factor analysis of reflection questionnaire.

Test | KMO | Explained variance (%)
Unit 1 | 0.89 | 84.51
Unit 2 | 0.88 | 81.06
Table 3. Percentage of individual use of stative verbs (part of samples).

Student ID | Cognition | Memory | Emotion | Evaluation | Total of use | Assigned reflective type
1104 | 12 (42.9) | 0 (0) | 2 (7.0) | 14 (49.1) | 28 | Evaluation
1105 | 31 (53.4) | 0 (0) | 0 (0) | 27 (46.1) | 58 | Cognition
Note: Frequencies of stative-verb use, with percentages of each student's total in parentheses.
The assessment questionnaire is six-dimensional and covers portfolio creation, learning goals, artifacts, reflection, attitude, and others, among which reflection performance was of primary interest in this research project. Furthermore, scores were calculated using a five-point Likert scale. Within this assessment questionnaire, the "reflection questionnaire" was specifically designed for the evaluation of a student's reflection contents, looking at the following aspects: goal setting, artifacts, learning achievement, attitude, peer review and feedback. A student's reflection score was determined by the grade he or she acquired on this reflection questionnaire.

2. Questionnaire reliability: With regard to the reflection questionnaire, the Cronbach's alpha coefficients for course Unit 1 and Unit 2 were 0.819 and 0.864, respectively. The results indicated that both tests achieved high reliability, and were in line with Wu's (2008) findings, in which the Cronbach's alpha coefficient was 0.923.

3. Questionnaire validity: In Wu's (2008) factor analysis, the explained variance of the assessment questionnaire was 72.09%, and that of the reflection questionnaire, 89%. This implied that the assessment questionnaire and the reflection questionnaire had high validity. Principal Component Analysis (PCA) with varimax orthogonal rotation was used to examine the appropriateness and accuracy of the reflection questionnaire. As shown in Table 2, the Kaiser-Meyer-Olkin (KMO) values for each test were greater than 0.5, meaning that factor analysis could be applied. The explained variances for both course units exceeded 70%, which meant the reflection questionnaire had high validity.
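As a hedged illustration of the reliability figures reported above, the snippet below computes Cronbach's alpha from a generic respondents-by-items score matrix using the standard formula; the data shown are invented for demonstration and are not the study's questionnaire responses.

```python
import numpy as np

def cronbach_alpha(item_scores) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    items = np.asarray(item_scores, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Toy five-point Likert responses (6 respondents x 4 items), for illustration only.
demo = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
    [3, 4, 3, 3],
])
print(round(cronbach_alpha(demo), 3))
```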
Table 4. Percentage distribution of reflective types.

Reflective type | Cognition | Memory | Emotion | Evaluation | Mix
Percent of students | 53.3 | 0 | 0 | 16.7 | 30
Table 5. Analysis of reflective behavior groups.

Reflective behavior | High (mean) | Medium (mean) | Low (mean) | Total | Individual average
Word count in journals (words) | 1165 | 552.1 | 279.6 | 18945 | 631.5
Number of peer reflection reviews (times) | 16 | 7.4 | 5 | 272 | 9.1
Duration of peer reflection reviews (minutes) | 72.2 | 67.7 | 7.1 | 1021.4 | 34.1
Reflection score (points) | 81.3 | 73.9 | 59.6 | 2184 | 72.8
3.5.3. Chinese Word Segmenting System
The CWSS was employed to deal with the word segmentation of the reflective journal documents. The Chinese lexical database available in this system contains about 100,000 vocabulary entries accompanied by POS tags, word frequencies, POS-tag bigram information, etc. It is a state-of-the-art system, with unknown-word identification and syntactic category prediction, which was ranked first at the International Chinese Word Segmentation Bakeoff. Due to its high accuracy and consistency (Chen & Bai, 1998; Tsai, 2004), we found it well suited and efficient for data analysis: it is not only time- and labor-saving, but also helps to improve reliability and validity. In addition to word segmentation, the CWSS is also equipped with word-tagging capability, dividing the lexicon into active verbs, stative verbs, and other parts of speech such as conjunctions, adverbs, nouns, and pronouns (Institute of Information, 2007). However, in psycholinguistics active verbs are not suited to the interpretation of psychological states. In this study, we therefore focused on the stative verbs, which served as references for our classification of reflection.

4. Results and discussion

4.1. Types of reflection
In each reflective journal, stative verbs were extracted using the CWSS (Appendix 1), and the four kinds of psychological-state words and their numbers of occurrence were summarized (Appendix 2). Considering that some stative verbs might be categorized into more than one vocabulary type, the researchers continued to scrutinize and filter overlapping classifications. For example, preliminary estimates suggested that Student 1104 wrote a reflective journal with cognition words used 22 times; memory, 0; emotion, 2; and evaluation, 24. After taking out the repeatedly classified words (Appendix 3), the result was refined as follows: cognition, 12; memory, 0; emotion, 2; and evaluation, 14.

Table 3 illustrates that, according to frequency of use, cognition was the most commonly used vocabulary type, followed by evaluation; memory was used the least. This finding suggested that reflectors had a preference for cognition and evaluation vocabulary.

Initially, the categorization of reflection was to be determined by the degree to which each vocabulary type (cognition, emotion, memory, and evaluation) was used by an individual student. It was discovered, however, that emotion and memory words were nearly absent from students' journals. In light of this, we proposed a three-category scheme consisting of cognition, evaluation and mix, in which emotion and memory were not included. The criteria for assigning a reflector to each type are as follows (a sketch of this assignment rule appears after the list):
1) Cognition type refers to reflective authors who favor cognition words over evaluation words, while the other two vocabulary types barely occur.
2) Evaluation type refers to reflective authors who predominantly select evaluation words over cognition words, while the other two vocabulary types are barely used.
3) Mix type covers two circumstances of vocabulary use: authors who use nearly equal amounts of cognition and evaluation words while the other two types are scarcely used, and authors for whom three or more vocabulary types are found, each accounting for over 10% of word use.
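The following sketch implements the assignment rule just described. The 10% criterion for the mix type is taken from the text; the margin used to treat cognition and evaluation shares as "nearly equal" is not specified by the authors, so the 3-percentage-point value here is our own assumption.

```python
def assign_reflection_type(counts: dict) -> str:
    """Assign cognition / evaluation / mix from per-category word counts."""
    total = sum(counts.values()) or 1
    share = {k: counts.get(k, 0) / total
             for k in ("cognition", "memory", "emotion", "evaluation")}

    # Mix, case 2: three or more vocabulary types each exceed 10% of word use.
    if sum(s > 0.10 for s in share.values()) >= 3:
        return "mix"
    # Mix, case 1: cognition and evaluation used in nearly equal amounts
    # (margin of 3 percentage points is our assumption).
    if abs(share["cognition"] - share["evaluation"]) <= 0.03:
        return "mix"
    return "cognition" if share["cognition"] > share["evaluation"] else "evaluation"

# Student 1104 (Table 3): cognition 12, memory 0, emotion 2, evaluation 14.
print(assign_reflection_type(
    {"cognition": 12, "memory": 0, "emotion": 2, "evaluation": 14}))  # evaluation
```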
As Table 4 outlines, the students fell into three categories, namely cognition, evaluation and mix, while the memory and emotion types failed to emerge. Additionally, the cognition type accounted for at least half of the students. Comparable results were reported by a number of researchers, such as Eppink (2002), Li (2002), Lin (2004), and Kember (1999). In these previous investigations, it was concluded that the meta-cognition or cognition type generally accounts for a substantial portion of reflective practice.

4.2. Reflective performance

4.2.1. Word count in reflective journals
Counting only relevant and meaningful text, Unit 1 journals had an average length of 345.6 words per person and Unit 2 journals 285.9 words; in total, each person wrote on average 631.5 words.
Table 6. ANOVA of reflective behavior groups.

Reflective behavior | F | Sig. | Estimated effect size
Word count of journals | 80.563 | 0.000 | 0.396
Number of peer reviews | 33.864 | 0.000 | 0.215
Duration of peer reviews | 42.203 | 0.000 | 0.237
Reflection score | 90.663 | 0.000 | 0.428
***p < .001.
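For readers who want to reproduce this kind of analysis, the sketch below runs a one-way ANOVA with an eta-squared effect-size estimate (the paper does not state which effect-size estimate it used) and a Tukey HSD post-hoc comparison analogous to Table 7 (the paper does not name its post-hoc test). The group scores are fabricated for illustration; the study's raw per-student data are not published.

```python
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical reflection scores for the high / medium / low groups.
high = np.array([84, 80, 79, 83, 81, 82])
medium = np.array([75, 72, 74, 73, 76, 74])
low = np.array([60, 58, 61, 59, 60, 62])

f_value, p_value = f_oneway(high, medium, low)

# Eta-squared: between-group sum of squares over total sum of squares.
all_scores = np.concatenate([high, medium, low])
grand_mean = all_scores.mean()
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in (high, medium, low))
ss_total = ((all_scores - grand_mean) ** 2).sum()
eta_squared = ss_between / ss_total

print(f"F = {f_value:.3f}, p = {p_value:.4f}, eta^2 = {eta_squared:.3f}")

# Pairwise post-hoc comparison between the three groups.
labels = ["high"] * len(high) + ["medium"] * len(medium) + ["low"] * len(low)
print(pairwise_tukeyhsd(all_scores, labels, alpha=0.05))
```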
Table 7. Post-hoc comparison test of reflective behavior groups.

Reflective behavior | Between-group difference (Sig.)
Word count of journals | High > Medium > Low (0.000***)
Number of peer reflection reviews | High > Medium, Low (0.000***); Medium > Low (0.007**)
Duration of peer reflection reviews | High > Medium, Low (0.000***)
Reflection score | High > Medium > Low (0.000***)
**p < .01, ***p < .001.
The average length of a reflective journal was 274.6 words per person (69 reflective journals in total). This statistical analysis showed that the journal lengths were, in general, still far from satisfactory. Here is an example demonstrating a student's concerns about writing mechanics and the ideal length of reflective journals.

"Writing reflections at first seemed to be a lot of work that I didn't know where to start with. But I grasped a basic picture about what should be put into a reflective journal after I checked the guidelines on the assessment system. Though, it's kind of frustrating that I sometimes feel I don't have enough vocabulary for doing this. It would be great if the teacher would post some commonly used words or phrases, like a glossary, to help us get things done faster. By the way, I have no idea what the optimal length of a reflective journal is, but I don't want it to be unusually long or short either."

4.2.2. Number of peer reflection reviews
Reviews were ignored if they did not exceed a 5-s minimum. For both Unit 1 and Unit 2, the average number of reviews was 4.5 times per person. In sum, the individual average for the entire course was 9.1, indicating that students read only some of the peer journals, or one-third of them at best. The following is a piece of evidence in which a student articulated the value of reading peer journals as well as concerns about the time commitment involved.

"It was an amazing experience reading other people's work, especially their reflective journals. It allows us to take a look at the inner world of people, which is pretty inspiring to me. But I haven't got the chance to read them all because of time issues, since it is such time-consuming work doing all this."

4.2.3. Duration of peer reflection reviews
As the course progressed from unit to unit, the mean duration per person decreased from 21.9 to 12.2 min. Taken together, the average duration per person for the entire course was 34 min; the individual duration per review was 3.8 min. The finding suggested that students seemed to invest insufficient time in reviewing peer journals.

4.2.4. Quality of reflection
At each stage of our research project, approximately 77% of the students were ranked into the high-score or medium-score groups, and only a small portion of them (23%) were not. This coincided with Cheng's (2002) statement that the majority of students had the ability to produce high-quality reflection outcomes. Below is an excerpt of a reflective journal in which the author described her need to enhance reflection contents by viewing and observing peer journals.

"I realized there'll be a lot of work to do on improving writing skills after I got the reflection score for Unit 1. Even though a general framework is available on the assessment system, it all depends on us to formulate more detailed and in-depth journal contents. The chances are I should spend more time reading peer work to become familiar with proper word usage and sentence structures, and to find out the focuses of reflections, or Unit 2 is gonna give me a headache."

4.3. Overall reflective performance
In Table 5, reflective performances were classified into high, medium and low groups according to four kinds of reflective performance. Table 6 outlines the results of the analysis of variance (ANOVA), in which the F-value for each kind of reflective behavior reached a significant level (Table 7). In addition, "reflection score" yielded the largest estimated effect size (0.428), and "number of peer reflection reviews" the smallest (0.215).
That is to say, the differences among the "reflection score" groups were the largest, while the "number of peer reflection reviews" groups had the smallest differences.

5. Conclusion
This study examined reflective behaviors, including reflection categories and performances, during the process of Web-based portfolio assessment. Reflection categories were induced using the CWSS, and reflection performances were retrieved from the Web-based portfolio assessment system.

For reflection categories, according to frequency of occurrence, the ranking of the four types of vocabulary (i.e. stative verbs), from high to low, was cognition, evaluation, emotion, and memory. The former two types were used far more than the latter two. In other words, students showed the ability to spontaneously engage in critical thinking and self-inspection, given their preference for cognition and evaluation words. In order to make the categorization meaningful, we eliminated the barely used vocabulary types, i.e. emotion and memory, and ultimately proposed three types of reflection: cognition, evaluation, and mix. Among them, the cognition type was the most common, followed by the mix type, whereas the evaluation type was rarely seen.

For reflection performances, although the reflective journals overall appeared to be short, the average reflection score was moderately high. That is to say, in writing journals students placed their focus on quality over quantity. According to the ANOVA tests, significant group differences were detected not only across word-count groups (the average words per person) but also across reflection-score groups (the average score per person).
Furthermore, students generally conducted an inadequate number of peer reflection reviews, reading only about one-third of the peer work. The ANOVA test revealed significant group differences in the number of reviews. The time students spent on reviewing was scanty as well, with an average duration of about 4 min per person each time. In the ANOVA test, significant group differences were also found across the "duration of peer reflection reviews" groups.

In this paper, word counts in reflective journals were treated as independent variables. However, the Web-based portfolio assessment system did not yet include a time-counting mechanism to record how much time a student spent writing a reflective journal. Had it done so, we would have been able to delve into reflective behaviors related to writing time. In addition, the arrangement appeared complete in that students had access to guidelines and strategies for composing reflections, and teachers were always ready to offer instruction. The truth was, however, that a handful of students still encountered difficulties, including a lack of vocabulary and an inability to select proper words. It is desirable that instructors provide them with appropriate and abundant vocabulary, so that the quality of reflection contents can be enhanced.

This study aimed to gain further understanding of reflective practice, such as its categories, by analyzing word frequency and content in reflective journals. Even so, we found a great number of researchable questions and topics, such as the extent to which reflection outcomes mirror a learner's mental growth as well as academic achievement. It is recommended that researchers or scholars interested in related issues deepen the scope of investigation into the underlying implications and messages within them.

No matter what types of reflection were produced and how well the reflection performances were shown, students' involvement in writing reflections via a Web-based portfolio assessment should be valuable. This is confirmed by the present study and other research: portfolio assessment may facilitate the involvement of students in reflection (Coombe & Barlow, 2004; Lopez-Fernandez & Rodriguez-Illera, 2009; Tubaishat et al., 2009; Wang, 2009). Moreover, Web-based portfolio assessment may provide an authentic context that fosters reflection (Avraamidou & Zembal-Saul, 2002; Carroll et al., 2007; Milman, 1999; Morris & Buckland, 2000; Zembal-Saul, 2001).

Appendix 1
Extraction of stative verbs: an excerpt of a reflective journal from Student 1104.
1. Wo (Nh) XiWang (VK) Wo (Nh) Zhe (Nep) XueQi (Na) de (DE) ZuoYe (Na) NengGou (D) QuanBu (Neqa) JiaoQi (VC), (COMMACATEGORY) 'I hope I can turn in all the assignments given during this semester.'
2. BingQie (Cbb) ShangKe (VA) ZhuanXin (VH) de (DE) Ting (VE) LaoShi (Na) ZhiDao (VC), (COMMACATEGORY) 'and (I will) listen attentively to the teacher's instructions in class.'
3. ShangKe (VA) de (DE) ShiHou (Na) Yao (D) AnJing (VH) BuYao (D) JiangHua (VA), (COMMACATEGORY) 'I will stay quiet, and won't chat in class.'
Appendix 2

List of stative verbs and assigned vocabulary types (Student 1104).

Cognition: Feel (VK), Have something to do with (VJ), Outperform (VJ) x2, Work hard (VH) x2, Endeavor (VH), Slow down (VH) x3, Make progress (VH) x2, Do something quickly (VH) x2, Hurry up (VH), Quiet down (VH), Concentrate on (VH) x5, Make good use of (VJ). Total: 22.
Memory: none. Total: 0.
Emotion: Appreciate (VK) x2. Total: 2.
Evaluation: Outperform (VJ) x2, Hope (VK) x3, Work hard (VH) x2, Anticipate (VK), Endeavor (VH), Slow down (VH) x3, Make progress (VH) x2, Do something quickly (VH) x2, Hurry up (VH), Quiet down (VH), Concentrate on (VH) x5, Make good use of (VJ). Total: 24.
Appendix 3

List of repeatedly classified words (Student 1104).

Word | Cognition | Memory | Emotion | Evaluation | Total
Outperform | 1 | 0 | 0 | 1 | 2
Work hard | 1 | 0 | 0 | 1 | 2
Endeavor | 0.5 | 0 | 0 | 0.5 | 1
Slow down | 1.5 | 0 | 0 | 1.5 | 3
Make progress | 1 | 0 | 0 | 1 | 2
Do something quickly | 1 | 0 | 0 | 1 | 2
Hurry up | 0.5 | 0 | 0 | 0.5 | 1
Quiet down | 0.5 | 0 | 0 | 0.5 | 1
Concentrate on | 2.5 | 0 | 0 | 2.5 | 5
Make good use of | 0.5 | 0 | 0 | 0.5 | 1
Total | 10 | 0 | 0 | 10 | 20
References

Avraamidou, L., & Zembal-Saul, C. (2002). Exploring the influence of web-based portfolio development on learning to teach elementary science. Paper presented at the annual meeting of the National Association for Research in Science Teaching, New Orleans, Louisiana.
Barrett, H. (2010). Balancing the two faces of ePortfolios. Educação, Formação & Tecnologias, 3(1), 6-14.
Barrett, H., & Garrett, N. (2009). Online personal learning environments: structuring electronic portfolios for lifelong and life wide learning. On the Horizon, 17(2), 142-152.
Carroll, N. L., Markauskaite, L., & Calvo, R. A. (2007). E-portfolios for developing transferable skills in a freshman engineering course. IEEE Transactions on Education, 50(4), 360-366.
Chang, C.-C. (2008). Enhancing self-perceived effects using web-based portfolio assessment. Computers in Human Behavior, 24(3), 1753-1771.
Chang, C.-C., & Tseng, K.-H. (2009a). Using a web-based portfolio assessment system to elevate project-based learning performances. Interactive Learning Environments, 16(2), 25-37.
Chang, C.-C., & Tseng, K.-H. (2009b). Use and performances of web-based portfolio assessment. British Journal of Educational Technology, 40(2), 358-370.
Chang, L. L., Chen, K. J., & Huang, C. J. (2000a). Alternation across semantic field: a study on Mandarin verbs of emotion. Computational Linguistics and Chinese Language Processing, 5(1), 61-80.
Chang, L. L., Chen, K. C., & Huang, C. J. (2000b). A lexical-semantic analysis of Mandarin Chinese verbs: representation and methodology. International Journal of Computational Linguistics and Chinese Language Processing, 5(1), 1-18.
Chen, C. W. (2001). The correlation between verbal communication patterns and leader behaviors of middle school principals. Unpublished master's thesis, National Dong Hwa University, Hualien, Taiwan.
Chen, K. J., & Bai, M. H. (1998). Unknown word detection for Chinese by a corpus-based learning method. Computational Linguistics and Chinese Language Processing, 3(1), 27-44.
Chen, N. S., Kinshuk, Wei, C. W., & Liu, C. C. (2010). Effects of matching teaching strategy to thinking style on learner's quality of reflection in an online learning environment. Computers & Education, 56(1), 53-64.
Cheng, Y. W. (2002). The design and application of Web-based reflective portfolio. Unpublished master's thesis, National Taiwan Normal University, Taipei, Taiwan.
Chung, W. L. (2003). A study of group communication and interaction in Web-based project-based learning. Unpublished master's thesis, National Taiwan University, Taipei, Taiwan.
Coombe, C., & Barlow, L. (2004). The reflective portfolio: two case studies from the United Arab Emirates. Forum Online, 42(1). Retrieved June 29, 2011, from http://exchanges.state.gov/forum/vols/vol42/no1/p18.htm.
Eppink, J. A. (2002). The effect of Web-based portfolio assessment strategies on the attitudes and self-perceived growth in music learning of non-music elementary general classroom educators in a basics of music course. Unpublished doctoral dissertation. Muncie, IN: Ball State University.
Falls, J. A. (2001). Using a reflective process to implement electronic portfolios. Unpublished doctoral dissertation. Blacksburg, VA: The Virginia Polytechnic Institute and State University.
Hawkes, M., & Romiszowski, A. (2001). Examining the reflective outcomes of asynchronous computer-mediated communication on inservice teacher development. Journal of Technology and Teacher Education, 9(2), 283-306.
Henri, F. (1991). Computer conferencing and content analysis. In A. R. Kaye (Ed.), Collaborative learning through computer conferencing (pp. 117-136). New York: Springer-Verlag.
Hsieh, S. W., Jang, Y. R., Hwang, G. J., & Chen, N. S. (2011). Effects of teaching and learning styles on students' reflection levels for ubiquitous learning. Computers & Education, 57(1), 1194-1201.
Huang, H. H. (2002). Learners' interactive process in Web-based learning: A case study on text-based communication. Unpublished master's thesis, National Chung Cheng University, Chiayi, Taiwan.
Institute of Information. (2007). The Chinese word segmenting system. Retrieved June 29, 2011, from http://ckipsvr.iis.sinica.edu.tw.
Irby, B. J., & Brown, G. (1999). A gendered dichotomy in written reflections in professional development portfolios. Paper presented at the annual meeting of the American Educational Research Association, Montreal, Canada.
Kember, D. (1999). Determining the level of reflective thinking from students' written journals using a coding scheme based on the work of Mezirow. International Journal of Lifelong Education, 18(1), 18-30.
King-Shaver, B. (1999). Reflection and portfolios across the curriculum. In L. Reid, & J. N. Golub (Eds.), Reflective activities: Helping students connect with text. Classroom practices in teaching English (pp. 156-165). Urbana, IL: National Council of Teachers of English.
Langer, A. M. (2002). Reflecting on practice: using learning journals in higher and continuing education. Teaching in Higher Education, 7(3), 337-351. Retrieved June 29, 2011, from http://www.alanger.com/learningjournal.html.
Li, Q. R. (2002). The exploration of the reflective thinking of second-grade senior high students on gas property changes. Unpublished master's thesis, National Kaohsiung Normal University, Kaohsiung, Taiwan.
Lin, H. C. (2004). A study of elementary school third-graders' reflective features in learning diaries. Unpublished master's thesis, National University of Tainan, Tainan, Taiwan.
Lopez-Fernandez, O., & Rodriguez-Illera, J. (2009). Investigating university students' adaptation to a digital learner course portfolio. Computers & Education, 52(3), 608-616.
McAlpine, D. (2000). Assessment and the gifted. Tall Poppies, 25(1). Retrieved June 29, 2011, from http://www.tki.org.nz/r/gifted/pedagogy/portfolio_e.php.
Milman, N. B. (1999). Web-based electronic teaching portfolios for preservice teachers. In J. D. Price, J. Willis, D. A. Willis, M. Jost, & S. Boget-Mehall (Eds.), Proceedings of the 10th international conference of the Society for Information Technology and Teacher Education 1999 (pp. 1174-1179). Charlottesville, VA: Association for the Advancement of Computing in Education.
Morgan, B. M. (1999). Portfolios in a preservice teacher field-based program: Evaluation of a rubric for performance assessment. Education, 119(3), 416-426.
Morris, J., & Buckland, H. (2000). Electronic portfolios for learning and assessment. Paper presented at the annual meeting of the Society for Information Technology and Teacher Education, San Diego, California.
Rama, D. V., & Battistoni, R. (2001). Structuring the reflection process. Service learning Website. Retrieved June 29, 2011, from http://www.compact.org/disciplines/reflection/structuring/decisions.html.
Reckase, M. D. (2002). Portfolio research. Paper presented at the Workshop on Portfolio Assessment, December 16-18, Taipei, Taiwan.
Santos, M. G. (1997). Portfolio assessment and the role of learner reflection. English Teaching Forum Online, 35(2), 10. Retrieved June 29, 2011, from http://exchanges.state.gov/forum/vols/vol35/no2/p10.htm.
Sparks-Langer, G. M., Simmons, J. M., Pasch, M., Colton, A., & Starko, A. (1990). Reflective pedagogical thinking: How can we promote it and measure it? Journal of Teacher Education, 41(5), 23-32.
Tomkinson, B. (2002). A reflective approach to teaching and learning. Teaching and Learning Support Centre. Retrieved June 29, 2011, from http://ctls.concordia.ca/pdf/resources/reflective.approach.to.teaching.pdf.
Tsai, Y. F. (2004). Chinese word segmenting and POS tagging. Retrieved June 29, 2011, from http://dlm.ntu.edu.tw/dlm/93NDAP/professional/pro-930906-7/8B.pdf.
Tubaishat, A., Lansari, A., & Al-Rawi, A. (2009). E-portfolio assessment system for an outcome-based information technology curriculum. Journal of Information Technology Education: Innovations in Practice, 8, 1-15.
Wang, S. (2009). E-portfolios for integrated reflection. Issues in Informing Science and Information Technology, 6, 449-460.
Whitfield, A. (2000). Student teacher self-assessment: A proposed method of professional development. Unpublished doctoral dissertation. New Orleans, LA: University of New Orleans.
Wood, P. K. (2001). Summary of reflective judgment levels. Retrieved June 29, 2011, from http://web.missouri.edu/wwood/html/refl.judg.html.
Wu, M. F. (2008). Construction of the rubrics, reliability and validity of web-based portfolio assessment. Unpublished master's thesis, National Taipei University of Technology, Taipei, Taiwan.
Zembal-Saul, C. (2001). Web-based portfolios in a professional development school context. Paper presented at the annual meeting of the American Educational Research Association, Seattle, Washington.