Mini-clinical evaluation exercise as a student assessment tool in a surgery clerkship: Lessons learned from a 5-year experience Luise I. M. Pernar, MD,a Sarah E. Peyre, EdD,a,c Laura E. G. Warren, MEd,c Xiangmei Gu, MS,b Stuart Lipsitz, ScD,b,c Erik K. Alexander, MD,b,c Stanley W. Ashley, MD,a,c and Elizabeth M. Breen, MD,a,c Boston, MA
Background. The mini-clinical evaluation exercise (mini-CEX) used for clinical skill assessment in internal medicine provides in-depth assessment of single clinical encounters. The goals of this study were to determine the feasibility and value of implementation of the mini-CEX in a surgery clerkship.
Methods. Retrospective review of mini-CEX evaluations collected for surgery clerkship students at our institution between 2005 and 2010. Returned assessment forms were tallied. Qualitative feedback comments were analyzed using grounded theory. Principal components analysis identified thematic clusters. Thematic comment counts were compared to those provided via global assessments.
Results. For 124 of 137 (90.5%) students, mini-CEX score sheets were available. Thematic clusters identified comments on 8 distinct clinical skill domains. On the mini-CEX, each student received an average of 6.5 ± 2.2 qualitative feedback comments covering 4.5 ± 1.2 separate skills. Of these, 42.7% were critical. Comments provided in global evaluations were fewer (2.9 ± 0.6; P < .001), constrained in scope (0.8 ± 0.2 skills; P < .001), and rarely critical (9.1%).
Conclusion. A mini-CEX can be incorporated into a surgery clerkship. The number and breadth of feedback comments make the mini-CEX a rich assessment tool. Critical and supportive feedback comments, both highly valuable, are provided nearly equally frequently when the mini-CEX is used as an assessment tool. (Surgery 2011;150:272-7.)
From the Departments of Surgerya and Medicine,b Brigham and Women's Hospital; and the Harvard Medical School,c Boston, MA
TRADITIONALLY, EVALUATION OF STUDENT PERFORMANCE in core surgery clerkships relies on written and oral examinations, performance on simulated clinical examinations such as the Objective Structured Clinical Examination, and, predominantly, on global ratings provided by faculty at the end of a rotation or clerkship.1 Despite the ubiquity and strengths of these methods of evaluation, it has been recognized that they may not be ideal for measuring student clinical performance and clinical competency.
Supported by Departmental Funds.
Accepted for publication June 14, 2011.
Reprint requests: Elizabeth M. Breen, MD, Brigham and Women's Hospital, 75 Francis Street, Boston, MA 02115. E-mail: [email protected].
0039-6060/$ - see front matter © 2011 Mosby, Inc. All rights reserved.
doi:10.1016/j.surg.2011.06.012
Specifically, although the National Board of Medical Examiners subject examination is an effective tool for assessing medical knowledge, performance on this examination has not shown good correlation with clinical skills.1 Oral examinations assess medical knowledge in addition to data gathering, but are plagued by the concern that they are inherently subjective and prone to bias.2,3 Objective Structured Clinical Examinations, based on scripted encounters with standardized patients, are time and resource intensive.4 Global assessments of trainees' clinical skills provided by faculty have the advantage of requiring little additional time beyond typical interactions between students and faculty. They also derive from performance in real situations, typically span several weeks, and highlight not only snapshots of performance but allow commentary on the student's progress. Despite these strengths, evaluation by global assessment is remote, relies predominantly on the aggregation of disparate events,1 is subjective, may fail to capture deficiencies in performance,5 and may in fact overestimate clinical skills.6 Generally, none of the
evaluations described provides timely feedback that might allow students either to build on good skills or to recognize and correct deficiencies in performance.7 Acknowledgment of the limitations of the assessment tools currently used has led to the pursuit of direct observation and evaluation of clinical encounters as a means to improve clinical skills assessment.8,9 Ideally, such observations should be brief; assess clinical, communication, and interpersonal skills; foster self-reflection; and allow for immediate feedback.7,9 A tool for such direct clinical observation is the mini-clinical evaluation exercise (mini-CEX). During a mini-CEX, a brief, actual trainee–patient encounter is observed and evaluated by a member of the faculty. The observer scores performance on the mini-CEX in 6 domains of clinical competence using a 9-point Likert scale. Feedback is provided at the end of the exercise.10,11 The available literature suggests that the mini-CEX is valid and reliable when used as an assessment tool of resident performance, particularly in internal medicine.12,13 The mini-CEX has also been used for assessment of medical students; here, published reports of its use appear restricted primarily to internal medicine core clerkships.14-16 With this report, we seek to answer two questions: whether administration of a mini-CEX in a core surgery clerkship is feasible, and what it adds to the assessment of medical students' clinical skills in the context of the currently used evaluation tools.

METHODS

Implementation. In 2005, the decision was made to incorporate direct clinical observation into the Harvard Medical School Core Surgery Clerkship held at Brigham and Women's Hospital with the introduction of a mini-CEX. At Brigham and Women's Hospital, the clerkship spans 12 weeks. Two 3-week blocks are spent on 2 different general or vascular surgery services, and the remaining 6 weeks are 1-week blocks: 1 week is devoted to breast surgery, 1 week to anesthesia, and the remaining 4 blocks are electives. Additionally, students spend 6 separate 12-hour shifts in the emergency room, where they are paired with a senior surgical resident. The mini-CEX was administered once per rotation to each rotating medical student. The examinations were administered weekly throughout the clerkship; no more than 2 students participated on any given day of administration. Students were excused from their other rotation responsibilities to participate.
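For readers who want the instrument's structure in concrete terms, the following is a minimal sketch of a mini-CEX record, assuming the 6-domain, 9-point structure described above. The domain names, class, and validation logic are our illustration, not the published form.

```python
# Illustrative sketch of a mini-CEX record; the domain names below are
# assumptions for demonstration, not the published instrument.
from dataclasses import dataclass, field

DOMAINS = (
    "medical interviewing",
    "physical examination",
    "professionalism",
    "clinical judgment",
    "counseling",
    "organization/efficiency",
)

@dataclass
class MiniCEXRecord:
    student_id: str
    observer: str
    scores: dict = field(default_factory=dict)    # domain -> score on the 9-point scale
    comments: list = field(default_factory=list)  # free-text feedback comments

    def add_score(self, domain: str, score: int) -> None:
        if domain not in DOMAINS:
            raise ValueError(f"unknown domain: {domain}")
        if not 1 <= score <= 9:
            raise ValueError("scores are on a 9-point scale (1-9)")
        self.scores[domain] = score
```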
The students were assigned a day for their mini-CEX by the clerkship coordinator within the first 2 weeks of the rotation. The first mini-CEX was typically not scheduled before the fourth week of the rotation. The schedule and instructions for the exercise were e-mailed to the students, and the same structure was always followed (Table I). The exercise was administered in 30 minutes. If students took too much time for any portion of the exercise, they were redirected by the observer. The mini-CEX was observed by either the clerkship director or 1 of 2 surgery residents trained to conduct the examination. The participating residents were PGY-2 residents; each participated for approximately 1 nonoverlapping year. Training of the residents consisted of orientation by the clerkship director, observation of multiple mini-CEXs, role-playing, and finally conducting a mini-CEX under the observation of the clerkship director with feedback on the exercise. Student performance was scored by the observer, and the mini-CEX score sheet was submitted to the surgery education department to become part of the student's file. At this time, the mini-CEX is used for formative but not summative feedback and as such is not used in student grading.

Data collection. After the Harvard Medical School institutional review board reviewed and declared this study exempt, student demographic data were extracted from a student database kept by the student coordinator. Returned mini-CEX forms were collected for all students who were eligible to take the mini-CEX between 2005 and 2010. For comparison, global assessment forms, completed by residents and faculty at the end of students' rotations on the surgical services, were collected. The global assessments are the assessment standard currently used for clerkship performance evaluation at all Harvard Medical School-affiliated hospitals. Evaluators use the assessment form to rate students' clerkship performance on a 5-point descriptive scale; narrative feedback comments are also requested (Appendix 1; available online).

Qualitative analysis. We used a grounded theory approach17 to analyze the qualitative comments provided on the mini-CEX and the global assessments. Using the codes arrived at through this approach, comments were classified independently by 3 analysts. The comments were further scored for attribution: supportive (positive) or critical and containing suggestions for improvement (negative). A coding consensus conference was held to ensure the analysts came to agreement on all scores. Emergent themes were clustered using principal component analysis.
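As an illustration of the clustering step, a minimal sketch follows: qualitative codes are tabulated per evaluation, and principal component analysis of the count matrix surfaces codes that load together, suggesting a thematic cluster. The codes, counts, and use of scikit-learn are our assumptions for demonstration, not the study's actual data or software.

```python
# Hypothetical sketch of clustering qualitative codes by principal component
# analysis: rows are evaluations, columns are counts of each code.
import numpy as np
from sklearn.decomposition import PCA

codes = ["follow-up questions", "exam sequence", "pathophysiology", "plan", "rapport"]
X = np.array([[2, 1, 0, 1, 1],
              [1, 2, 0, 2, 0],
              [0, 0, 2, 1, 1],
              [2, 1, 1, 0, 2],
              [0, 1, 2, 2, 0]], dtype=float)

pca = PCA(n_components=2)
pca.fit(X)  # PCA centers the data internally
for i, component in enumerate(pca.components_):
    # Codes with large loadings of the same sign tend to co-occur,
    # which is the signal used to group codes into a thematic cluster.
    loadings = sorted(zip(codes, component), key=lambda cw: -abs(cw[1]))
    print(f"Component {i + 1}:", [(c, round(w, 2)) for c, w in loadings])
```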
Table I. Structure of the mini-clinical examination exercise in the core surgery clerkship
Scenario: Patient postoperative day 1 after abdominal surgery
Proctor: Clerkship director or trained surgery resident
Information given: Brief history, operation performed
Asked to perform: Interview, focused physical examination
Asked to present: History, physical examination, assessment and plan
Duration (min): Patient encounter, 5–10; presentation, 5; feedback/discussion, 15
The clusters were named to reflect the language used in the medical student grading sheets.

Statistical analysis. For each student, comment counts by theme and attribution were tabulated for the mini-CEX and the global assessments, respectively. Study results were calculated using proportions, means with standard deviations, and medians with ranges. Wilcoxon rank-sum tests were used to compare ordinal or continuous variables between the 2 groups. For dichotomous and categorical variables, the Fisher exact test was used to determine differences in proportions between the 2 groups. All tests were 2-tailed; P < .05 was considered significant.

RESULTS

Participation and completion. One hundred thirty-seven students were eligible to complete the mini-CEX. The age (mean ± standard error of the mean) of the students at the start of the academic year during which they took their core surgery clerkship was 25 ± 2 years. Approximately half of the students were men (52%). For 124 students, a completed mini-CEX score form was on file, for a completion rate of 90.5%. There were no significant differences in age or gender distribution between the students for whom a mini-CEX score form was available and those for whom it was not (data not shown).

Thematic clusters. Nine thematic clusters emerged from the qualitative data analysis. Comments classified under history taking skills refer to the verbal data gathering process; focused physical examination skills to the performance of the relevant physical examination;
fund of knowledge to the knowledge base with regard to the physiologic basis of disease; clinical management skills to clinical reasoning and decisions about patient management; interpersonal skills to communication skills and the ability to establish rapport with patients; presentation skills to communication of findings and management plans to other professionals; professionalism to demonstrated integrity, reliability, collegiality, and responsibility; and initiative and desire to learn to any student-driven initiative to improve performance and knowledge. Summary comments were also made; these were nonspecific, did not comment on any skill or competency, and were therefore the least valuable. Examples of positive and negative comments extracted from the evaluations for each of these themes are shown in Table II.

Distribution and attribution of qualitative comments provided on the mini-CEX. On average, 6.5 ± 2.2 qualitative feedback comments were provided on each mini-CEX. These comments touched on 4.5 ± 1.2 thematic clusters. The 3 most frequently addressed themes were history taking skills, focused physical examination skills, and clinical management skills.

Comparison of qualitative comments provided on the mini-CEX and the global assessments. Per evaluation, mini-CEX evaluations contained more than twice the number of comments (6.5 ± 2.2 vs 2.9 ± 0.6; P < .001), addressing nearly 4 more themes (4.5 ± 1.2 vs 0.8 ± 0.2; P < .001), than global assessments. Each evaluation tool highlighted different skills (Fig 1). Expressed as a percentage of all comments made on the respective evaluation tools, the mini-CEXs addressed history taking skills, focused physical examination skills, and clinical management skills much more frequently than the global assessments (16% vs 2.5%, 11.8% vs 1.5%, and 28% vs 5.4%, respectively). In contrast, comments made on the global assessments were weighted toward fund of knowledge (11.4% vs 4.3%), presentation skills (8.2% vs 1%), professionalism (24.8% vs 2.2%), and initiative and desire to learn (13% vs 4.4%) (all P < .001). Also, more summary comments were made in the global assessments than in the mini-CEX (17.3% vs 15.4%; P = .01). There was no significant difference in the percentage of comments that addressed interpersonal skills between the mini-CEX (16.8%) and the global assessments (15.9%). Whereas the mini-CEX provided nearly equal proportions of supportive comments and comments suggesting improvement (57.3% and 42.7%, respectively), the global assessments provided predominantly supportive comments (90.9%; P < .001; Fig 2).
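As an illustration of the analysis plan behind these comparisons, the sketch below applies a Wilcoxon rank-sum test to per-student comment counts and a Fisher exact test to the supportive-versus-critical proportions. All numbers are hypothetical stand-ins, and scipy is our choice of library; the paper does not state which software was used.

```python
# Hypothetical sketch of the paper's two comparison types.
from scipy.stats import ranksums, fisher_exact

# Per-student comment counts (hypothetical) for each evaluation tool
mini_cex_counts = [7, 5, 6, 9, 4, 8, 6, 7]
global_counts = [3, 2, 3, 3, 2, 4, 3, 3]
stat, p = ranksums(mini_cex_counts, global_counts)  # Wilcoxon rank-sum test
print(f"rank-sum statistic={stat:.2f}, P={p:.4f}")

# 2x2 table (hypothetical totals): rows = tool, columns = (supportive, critical)
table = [[460, 343],  # mini-CEX: roughly 57% supportive, 43% critical
         [327, 33]]   # global assessments: roughly 91% supportive
odds_ratio, p = fisher_exact(table)
print(f"Fisher exact: OR={odds_ratio:.2f}, P={p:.3g}")
```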
Table II. Examples of comments for identified thematic clusters
History taking. Positive: "Asked appropriate questions about pain." Negative: "Should have asked more follow-up questions."
Focused physical examination. Positive: "Abdominal examination was correctly sequenced and complete." Negative: "Physical examination was not thorough."
Fund of knowledge. Positive: "High level of understanding of pathophysiology of postoperative state." Negative: "No recognition of severity of illness."
Clinical management skills. Positive: "Generated a well thought-out plan based on information gathered." Negative: "Did not provide assessment."
Interpersonal skills. Positive: "Was comfortable and established nice rapport." Negative: "Was very shy and apologetic."
Presentation skills. Positive: "Presentation was appropriately organized." Negative: "Presentation lacked key information."
Professionalism. Positive: "Remained very professional." Negative: "Ignored explicit instructions."
Initiative and desire to learn. Positive: "Received feedback well." Negative: "Need for independent reading."
Summary comments. Positive: "Overall excellent." Negative: "Satisfactory job."
Fig 1. Distribution of comments across thematic clusters, shown as percentage of all comments, provided on the mini-CEX and global assessments. *P < .001; †P = .01.
DISCUSSION

In our study, we have shown that evaluation by direct observation of brief clinical encounters in the context of a mini-CEX can be successfully incorporated into a surgical clerkship. The implementation of the mini-CEX in the clerkship was initially a pilot program that was expanded over 5 years to include all students. In the initial year, 10 students were involved. By 2009, all students rotating at Brigham and Women's Hospital were included; approximately 45–50 students now rotate through our surgical service over the course of a year. We are able to test all of these students with the current format. As highlighted in Methods, approximately 30 minutes is spent with each student; perhaps 10 additional minutes are necessary to complete the assessment form. Thus, scaling up the administration of the exercise to a greater number of students was achievable using a single observer.
Fig 2. Percentage of positive and negative comments provided on the mini-CEX and the global assessments. *P < .001.
Unlike an Objective Structured Clinical Examination, the exercise is designed to occur in the clinical setting, in real time, under observation by a cadre of clinician observers, and does not require students to leave their clinical duties. Adding the mini-CEX does not entail the infrastructure and resources necessary to run an Objective Structured Clinical Examination. The workload of administering mini-CEX evaluations could be distributed further with the addition of more observers. Close analysis of the feedback comments provided on the mini-CEX score sheets revealed that the mini-CEX serves as a rich and robust assessment tool; multiple comments covering a wide range of clinical skills were provided routinely. Interestingly, these comments were not only supportive (positive) but almost as frequently contained criticism and suggestions for improvement (negative). Because the mini-CEX is most similar to global assessments, we chose the latter as a point
of comparison to understand what the mini-CEX may add to the evaluative process for our students. Although the feedback provided on both the mini-CEX and the global assessments spanned 8 domains of clinical performance, the feedback comments provided on the mini-CEX were richer in detail than those provided on the global assessments. On average, each mini-CEX provided approximately 6 feedback comments covering approximately 4 domains of clinical performance. The global assessments contained roughly half this detail. Considering that verbal feedback likely is more detailed than written comments, the benefit to the students may be even more pronounced than our findings reveal. The mini-CEX preferentially provided a forum for commentary on history taking, physical examination, and clinical management skills. These domains were generally not well covered by the global assessments. Also, feedback provided on the global assessments was almost exclusively supportive, omitting the constructive criticism provided in the mini-CEX evaluations. Overall, the addition of a mini-CEX to the evaluative process seems complementary to the current evaluation methods. It strengthens the evaluative process by addressing gaps in the global assessments and by adding detail. Furthermore, the data suggest that the feedback provided on the mini-CEX satisfies the requirements that feedback should meet to contribute meaningfully to learning18,19: it is detailed, highlighting good behaviors while also addressing areas for improvement, and it is timely and based on direct observation.

We believe that the differences between the mini-CEX and the global assessments may be due to several factors relating to the fact that the former is rooted in direct observation. First, the immediacy of feedback related to the encounter allows for more detail. Second, the mini-CEX represents a snapshot in time; observers therefore may feel more license to point out opportunities and suggest improvements, because a single observed encounter does not label the learner as carrying these deficiencies across all of their clinical encounters, as a global assessment form might.

The mini-CEX has served the purpose of facilitating direct observation of brief clinical encounters for the evaluation of clinical skills, with proven validity in the field of internal medicine. For residents, the instrument discriminates between trainees at different levels of training or proficiency, and scores correlate with performance on in-training examinations and in-training evaluations.12,13 For medical students in internal medicine, mini-CEX scores also correlate with scores on course examinations and with final course grades.14-16 As few as 4 encounters are sufficient to rank candidates, and between 8 and 10 encounters yield a reproducibility of 0.8.10,15 The reliability and validity of the mini-CEX make it an attractive instrument for educators; trainees welcome the feedback provided in the context of a mini-CEX.20,21 Similar to our study, analysis of transcripts of feedback following the mini-CEX has previously shown feedback to focus on history taking and physical examination skills.22 These data, along with our findings, suggest that the mini-CEX is a potentially useful evaluation tool for surgical training.
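The relationship between the number of encounters and reproducibility can be illustrated with the Spearman-Brown prophecy formula. The sketch below is our illustration only: the single-encounter reliability of 0.33 is a hypothetical value chosen so that 8-10 encounters land near the 0.8 reported in the cited studies, not a figure taken from them.

```python
# Spearman-Brown prophecy formula: reliability of the mean of n encounters,
# given single-encounter reliability r1. The r1 below is hypothetical.
def spearman_brown(r1: float, n: int) -> float:
    return n * r1 / (1 + (n - 1) * r1)

for n in (1, 4, 8, 10):
    print(f"{n:2d} encounters -> reproducibility {spearman_brown(0.33, n):.2f}")
# Output: 1 -> 0.33, 4 -> 0.66, 8 -> 0.80, 10 -> 0.83 (near the reported 0.8)
```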
Although we were able to implement the mini-CEX successfully in our clerkship and found that it adds to the evaluative process, there are limitations to this study. It is a retrospective, single-institution study; a specific clinical scenario was used; and a few trained individuals proctored the exercises. These factors potentially limit the generalizability of our findings. Furthermore, although the mini-CEX provides structured 1-on-1 sessions between students and surgery faculty and promotes robust, detailed, student-specific feedback through observation, our study does not answer the question of whether participation in the mini-CEX has a measurable effect on learning, a challenge for many educational tools.

The American Board of Internal Medicine has mandated use of the mini-CEX in internal medicine residency training programs to document residents' clinical skills.11 The tool has also been used in anesthesia21 and neurology23 training programs to assess clinical skill competency. To address the need for more objective evaluation of clinical skills in surgery residency training, a performance-based examination has been designed, the Patient Assessment and Management Examination.24,25 This examination evaluates residents based on their performance in clinical scenarios using standardized patients. Although reliability and validity for this examination have been established, it relies on simulated scenarios rather than real patients and takes several hours to complete.24,25 A drawback to this method of evaluation is that, because of the necessary time commitment, it is not practical to administer repeatedly throughout the course of residency; it therefore does not lend itself well to monitoring resident progress or to allowing early intervention if deficiencies are detected.
The mini-CEX is a time-efficient, low-infrastructure, and objective evaluation tool of clinical skills grounded in real clinical situations, and it may be more appropriate for routine use in the ongoing evaluation of surgical residents. A future direction will be to determine whether the mini-CEX is a feasible and useful addition to the evaluation process in a surgical residency program.

CONCLUSION

A mini-CEX can be incorporated into a surgery clerkship. The depth and breadth of feedback comments make the mini-CEX a rich assessment tool. Critical and supportive feedback comments, both highly valuable to students, are provided nearly equally frequently when the mini-CEX is used as an assessment tool.

REFERENCES
1. Kassebaum DG, Eaglen RH. Shortcomings in the evaluation of students' clinical skills and behavior in medical school. Acad Med 1999;74:841-9.
2. Haq I, Higham J, Morris R, Dacre J. Effect of ethnicity and gender on performance in undergraduate medical examinations. Med Educ 2005;39:1126-8.
3. Fernandez A, Wang F, Braveman M, Finkas LK, Hauer KE. Impact of student ethnicity and primary childhood language on communication skill assessment in a clinical performance examination. J Gen Intern Med 2007;22:1150-60.
4. Turner JL, Dankoski ME. Objective structured clinical exams: a critical review. Fam Med 2008;40:574-8.
5. Schwind CJ, Williams RG, Boehler ML, Dunnington GL. Do individual attendings' post-rotation performance ratings detect residents' clinical performance deficiencies? Acad Med 2004;79:453-7.
6. Silber CG, Nasca TJ, Paskin DL, Eiger G, Robeson M, Veloski JJ. Do global rating forms enable program directors to assess the ACGME competencies? Acad Med 2004;79:549-56.
7. Turnbull J, Gray J, MacFadyen J. Improving in-training evaluation programs. J Gen Intern Med 1998;13:317-23.
8. Holmboe ES. Faculty and the observation of trainees' clinical skills: problems and opportunities. Acad Med 2004;79:16-22.
9. Williams RG, Dunnington GL, Klamen DL. Forecasting residents' performance – partly cloudy. Acad Med 2005;80:415-22.
10. Norcini JJ, Blank LL, Arnold GK, Kimball HR. The mini-CEX (clinical evaluation exercise): a preliminary investigation. Ann Intern Med 1995;123:795-9.
11. Norcini JJ, Blank LL, Duffy D, Fortna GS. The mini-CEX: a method for assessing clinical skills. Ann Intern Med 2003;138:476-81.
12. Durning SJ, Cation LJ, Markert RJ, Pangaro LN. Assessing the reliability and validity of the mini-clinical evaluation exercise for internal medicine residency training. Acad Med 2002;77:900-4.
13. Holmboe ES, Huot S, Chung J, Norcini J, Hawkins RE. Construct validity of the mini clinical evaluation exercise (mini-CEX). Acad Med 2003;78:826-30.
14. Kogan JR, Bellini LM, Shea JA. Implementation of the mini-CEX to evaluate medical students' clinical skills. Acad Med 2002;77:1156-7.
15. Kogan JR, Bellini LM, Shea JA. Feasibility, reliability, and validity of the mini-clinical evaluation exercise (mCEX) in a medicine core clerkship. Acad Med 2003;78(Suppl 10):S33-5.
16. Kogan JR, Hauer KE. Brief report: use of the mini-clinical evaluation exercise in internal medicine core clerkships. J Gen Intern Med 2006;21:501-2.
17. Pope C, Ziebland S, Mays N. Qualitative research in health care: analysing qualitative data. BMJ 2000;320:114-6.
18. Sachdeva AK. Use of effective feedback to facilitate adult learning. J Cancer Educ 1996;11:106-18.
19. Ericsson KA. An expert-performance perspective of research on medical expertise: the study of clinical performance. Med Educ 2007;41:1124-30.
20. Malhotra S, Hatala R, Courneya CA. Internal medicine residents' perception of the mini-clinical evaluation exercise. Med Teach 2008;30:414-9.
21. Weller JM, Jolly B, Misur MP, Merry AF, Jones A, Crossley JGM, et al. Mini-clinical evaluation exercise in anaesthesia training. Br J Anaesth 2009;102:633-41.
22. Holmboe ES, Yepes M, Williams F, Huot SJ. Feedback and the mini clinical evaluation exercise. J Gen Intern Med 2004;19:558-61.
23. Wiles CM, Dawson K, Hughes TAT, Llewelyn JG, Pickersgill TP, Robertson NP, et al. Clinical skills evaluation of trainees in a neurology department. Clin Med 2007;7:365-9.
24. MacRae HM, Cohen R, Regehr G, Reznick R, Burnstein M. A new assessment tool: the patient assessment and management examination. Surgery 1997;122:335-43.
25. MacRae H, Regehr G, Leadbetter W, Reznick RK. A comprehensive examination for senior surgical residents. Am J Surg 2000;179:190-3.