Journal of Clinical Anesthesia (2009) 21, 38–43
Original contribution
Perceived predictive value of the Medical Student Performance Evaluation (MSPE) in anesthesiology resident selection

Christopher Swide MD (Program Director and Associate Professor), Kathie Lasater EdD, RN, ANEF (Assistant Professor and Education Specialist)⁎, Dawn Dillman MD (Director of Medical Student Education, Assistant Program Director, and Associate Professor)

Department of Anesthesiology and Peri-Operative Medicine, Oregon Health and Science University School of Medicine, Portland, OR 97239, USA

Received 6 September 2007; revised 2 June 2008; accepted 19 June 2008
Keywords: Anesthesiologists: residents; Medical student performance evaluation (MSPE); Residency programs
Abstract

Study Objective: To study the perceptions of anesthesiology residency program directors about the value of the Medical Student Performance Evaluation (MSPE) in predicting successful residents.

Design: Survey instrument.

Setting: Anesthesiology department of a university hospital.

Measurements: An online survey was sent to 115 U.S. medical school-based anesthesiology residency program directors. Descriptive statistics were used to report which sections of the MSPE were perceived as predictive and which were not. More than 30 qualitative comments were hand-coded for frequency and emerging themes.

Main Results: Sections perceived as predictive of success included the (a) academic history summary, (b) academic progress, (c) academic ranking, and (d) candidate's comparative clinical performance. Sections perceived as not predictive included the (a) unique characteristics, (b) pre-clinical comparative performance, (c) professional behaviors versus those of classmates, (d) summary statement, and (e) Appendix E. The strongest theme emerging from the qualitative findings was a desire for the MSPE to indicate candidates' rank.

Conclusions: Anesthesiology programs tend to rely on the most objective sections of the MSPE. Although program directors valued comments from clinical faculty, they did not hold pre-clinical performance relative to peers in similar esteem, and they perceived the MSPE's assessment of professional behaviors as unreliable.
⁎ Corresponding author. Department of Anesthesiology and Peri-Operative Medicine, Oregon Health and Science University School of Medicine, Mailcode: UHN 2, Portland, OR 97239, USA. Tel.: +1 503 494 8325; fax: +1 503 418 1389. E-mail address: [email protected] (K. Lasater).
doi:10.1016/j.jclinane.2008.06.019
1. Introduction

The Medical Student Performance Evaluation (MSPE) has replaced the traditional Dean's Letter as a component of the medical student's application for residency. In 2002, the Association of American Medical Colleges (AAMC) published guidelines to standardize the content of the MSPE, which now includes these recommended sections: (a) Identifying Information, (b) Unique Characteristics, (c) Academic History, (d) Academic Progress, (e) Summary, and (f) Appendices [1]. The changes were intended to help medical schools present information more uniformly and to improve the likelihood of reporting important but potentially negative information about the applicant. An analysis of 532 Dean's Letters in 1999, prior to the release of the AAMC guidelines, showed that potentially negative information, such as marginal or failing grades and leaves of absence, often was not reported [2]. The AAMC evaluated MSPEs in 2005 and found that 70% of schools were using the recommended guidelines at an “adequate” level [3].

Program directors in family medicine [4], obstetrics/gynecology [5], and physical medicine [6] report that the MSPE is one of the most useful tools in the application process. In contrast, orthopedic program directors rank the MSPE tenth in importance on a list of 26 criteria [7]. The value of the MSPE and, more specifically, the perceived importance of its individual sections to program directors in anesthesiology have not been previously evaluated.

The purpose of this study was to determine how anesthesiology residency programs currently use the standardized MSPE in screening resident applicants, and to gain insight into the MSPE's perceived value for predicting resident success during the selection process. We did not attempt to determine the predictive value of the MSPE for actual anesthesiology resident performance.
2. Materials and methods

2.1. Design

Early in 2006, we reviewed MSPEs from medical schools across the country and constructed an online survey to gather information from anesthesiology program directors about the value of each section of a typical MSPE and its perceived effectiveness in predicting the success of their residents. The survey offered respondents a 4-choice Likert scale and an opportunity for comments on each of 9 questions. The tenth question was open-ended, requesting suggestions to enhance the predictive value of the MSPE. The survey concluded with seven demographic questions about the respondent and his/her institution (Table 1).
Table 1 MSPE survey questions

1. How helpful/predictive of success in your program is the Unique Characteristics (candidate's pre-med history, family background, pre-med institution(s), research and publication record, global and local community service activities, and language skills) section? Comments?
2. How helpful/predictive of success in your program is the Academic History Summary (candidate's progress through the curriculum, need for remediation, matriculation and graduation dates) section? Comments?
3. How helpful/predictive of success in your program is the Academic Progress (candidate's preclinical course record, clinical clerkship evaluations, comments from clinical faculty and housestaff, categorizations of performance, such as Pass, High Pass, Honors) section? Comments?
4. How helpful/predictive of success in your program is the Overall Academic Ranking (candidate's achievement by comparison to classmates, usually depicted visually by quartile or quintile) section? Comments?
5. How helpful/predictive of success in your program is the Pre-Clinical Comparative Performance to Classmates (candidate's achievement by comparison to classmates in courses such as microbiology, pathology, biochemistry) section? Comments?
6. How helpful/predictive of success in your program is the Clinical Comparative Performance to Classmates (candidate's achievement by comparison to classmates in clinical clerkships, such as General Medicine, Pediatrics, Surgery) section? Comments?
7. How helpful/predictive of success in your program is the Professional Behaviors/Attitudes Comparative Performance to Classmates (by attribute, such as team and patient interactions, professional appearance and demeanor, accountability) section? Comments?
8. How helpful/predictive of success in your program is the Summary Statement (endorsement by the letter writer of the candidate's personal and academic qualities, use of categories defined by the institution, such as “good,” “very good,” or “excellent”) section? Comments?
9. How helpful/predictive of success in your program is the Appendix E (information specific to the medical school, such as programmatic emphases, average length of enrollment, use of the USMLE and OSCE criteria) section? Comments?
10. What changes to the Dean's Letter would you suggest to enhance its value in predicting success in your program during your resident selection process?

MSPE = Medical Student Performance Evaluation; USMLE = United States Medical Licensing Examination; OSCE = Objective Structured Clinical Examination.
Eleven members of the Resident Selection Committee in the Department of Anesthesiology and Peri-Operative Medicine at Oregon Health & Science University (OHSU) pilot tested the initial survey to make sure that the questions were clearly stated and measured what we intended to measure. We carefully reviewed their responses and comments and made wording changes to better capture the information needed for the purposes of this study.
The basic wording was the same for each question; the only difference was that each focused on a different section of the MSPE (Table 1).
2.2. Participants

After approval by the Oregon Health & Science University Institutional Review Board, we sent the survey electronically to 115 program directors of Accreditation Council for Graduate Medical Education (ACGME)-accredited anesthesiology residency programs in the United States. The AMA's Fellowship and Residency Electronic Interactive Database (FREIDA) was our resource to identify and contact potential respondents. We distributed the survey late in 2006, when anesthesiology programs were actively evaluating MSPEs.

Anesthesiology programs use a variety of approaches to screen applicants, such as selection committees, senior faculty members, or simply the program director. Since it was not feasible to determine the methods that all ACGME-approved anesthesiology programs use for this process, we invited program directors, as point persons, to complete the survey. Since the ACGME requires program directors to have authority and accountability for the operation of the program [8], we believed that this group would accurately report their departments' utilization of the MSPE in applicant screening, regardless of their specific selection processes.
2.3. Analysis

Quantitatively, the 4 response choices on the Likert scale were Highly Predictive, Consistently Predictive, Sometimes Predictive, and Not Predictive. The researchers used descriptive statistics to analyze the responses: Highly Predictive and Consistently Predictive formed the Predictive group, while Sometimes Predictive and Not Predictive formed the Not Predictive group. Respondents provided 30 qualitative comments in response to the open-ended Question 10. In addition, each of the 9 previous questions elicited at least two comments per question. All comments were hand-coded for frequency and emerging themes.

Table 2 Respondents by geographic area

Geographic area | Number (%) survey respondents | Proportion of FREIDA
Northeast | 10 (23%) | 34%
Central Atlantic | 3 (7%) | 10%
Southeast | 8 (18%) | 12.5%
Midwest | 10 (23%) | 23%
Rocky Mountain | 1 (2%) | 1.5%
Southwest | 8 (18%) | 17%
Pacific Northwest | 2 (4.5%) | 2%
No response to geographic location | 2 (4.5%) | N/A
TOTAL | 44 (100%) | 100%

FREIDA = Fellowship and Residency Electronic Interactive Database of the American Medical Association.

Table 3 Respondents by number of anesthesiology residents selected in 2006 [16]

Number of residents | Number (%) survey respondents | Proportion of FREIDA
1-3 | 1 (2%) | 8%
4-6 | 4 (9%) | 23%
7-10 | 15 (34%) | 26%
11-15 | 11 (25%) | 22%
16-20 | 5 (11.5%) | 14%
21 or more | 6 (13.5%) | 7%
No response to number selected question | 2 (5%) | N/A
TOTAL | 44 (100%) | 100%

FREIDA = Fellowship and Residency Electronic Interactive Database of the American Medical Association.
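To make the grouping rule concrete, the following minimal sketch (Python, with hypothetical tallies; this is an illustration, not the study's actual data or analysis software) shows how one section's 4-point Likert responses collapse into the Predictive/Not Predictive dichotomy and how the percentage compared against the 60% classification cutoff reported in Section 3.2 would be computed.

```python
from collections import Counter

# The survey's 4-point scale; the first two choices form the Predictive group.
PREDICTIVE_CHOICES = {"Highly Predictive", "Consistently Predictive"}

def percent_predictive(responses):
    """Return the percentage of responses falling in the Predictive group."""
    counts = Counter(responses)
    n_predictive = sum(n for choice, n in counts.items()
                       if choice in PREDICTIVE_CHOICES)
    return 100.0 * n_predictive / len(responses)

# Hypothetical tallies for one MSPE section from 44 responding programs.
responses = (["Highly Predictive"] * 12
             + ["Consistently Predictive"] * 18
             + ["Sometimes Predictive"] * 10
             + ["Not Predictive"] * 4)

pct = percent_predictive(responses)
# 60% cutoff as described in Section 3.2.
group = "Predictive" if pct >= 60 else "Not Predictive"
print(f"{pct:.0f}% -> {group}")  # prints: 68% -> Predictive
```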
3. Results

3.1. Respondents

A total of 44 anesthesiology residency programs from all geographic areas responded to the survey, for a response rate of 38%. The gender distribution was 70% men and 30% women, and 70% of respondents were between 45 and 55 years of age. The responding programs included all institutional types. Although the response rate was lower than anticipated, the geographic distribution of respondents was similar to that of the national FREIDA database (Table 2).
3.2. Quantitative findings

Each section of the typical MSPE fell clearly into either the perceived Predictive group (rates of 60% or higher) or the perceived Not Predictive group (less than 60%). The sections of the MSPE that were perceived as Predictive were the (a) Academic History Summary, (b) Academic Progress, (c) Academic Ranking, and (d) Comparative Clinical Performance. Conversely, the sections that were perceived as Not Predictive were the (a) Unique Characteristics, (b) Pre-clinical Comparative Performance, (c) Professional Behaviors/Attitudes Compared to Classmates, (d) Summary Statement, and (e) Appendix E (Table 4).
3.3. Qualitative findings

The comments following each of the 9 questions about specific sections of the MSPE and the responses to the final open-ended question were aggregated and coded by theme. The following themes emerged, in descending order of frequency: (a) the MSPE should provide the candidate's class rank; (b) comments about interpersonal skills and other professionalism issues are important for the selection of residents; (c) the MSPE should be standardized in order to enhance the value of the comments; (d) it should be shortened to minimize replication of other application components; and, finally, (e) the MSPE should include accurate comments, whether positive or negative.
4. Discussion

The Medical Student Performance Evaluation (MSPE), historically known as “the Dean's Letter,” is an integral component of the application to a residency program. Most of the anesthesiology program directors who responded to our survey perceived the information from the following sections of the standardized MSPE as useful in predicting successful anesthesiology residents: Academic Progress, Academic History Summary, Comparative Clinical Performance to Classmates, and Overall Academic Ranking. They perceived the other sections of the MSPE as less predictive of success: Unique Characteristics, Appendix E (school/class demographics), Pre-Clinical Comparative Performance to Classmates, Summary Statement, and Professional Behaviors/Attitudes (Table 4).

Based on the open-ended survey comments and the quantitative responses, there appears to be a trend toward relying on the most objective sections of the MSPE in selecting anesthesiology residents, such as class rank, need for remediation, and academic performance relative to peers. This finding is of special note given that only 17% of MSPEs currently provide comparative data in the summary statement as recommended [3].

Although these program directors value comments from clinical faculty and residents in the clinical rotations, they do not equally value preclinical performance relative to peers, despite the importance of the basic sciences to anesthesiology practice. This finding may reflect the fact that many programs use the United States Medical Licensing Examination (USMLE) Step 1 score as a screening tool for competency of medical students after completing their preclinical curricula, and USMLE Step 1 scores may be more predictive of performance on standardized examinations during and after residency than USMLE Step 2 results [9,10]. In addition, many applicants do not have a USMLE Step 2 score at the time of application; therefore, program directors may rely on other comparative measures of clinical skills. The lack of perceived predictive value in the Unique Characteristics section may be related to the fact that this information is available elsewhere, including the curriculum vitae or personal statement, and thus may be considered redundant in the MSPE.

The most surprising finding was the lack of confidence among anesthesiology program directors in the value of the MSPE's assessment of professional behaviors, despite the evidence that professionalism is a key indicator of success in residency programs and later in practice. The literature is definitive that professional behaviors exhibited in medical school and residency often predict future behaviors, particularly problematic behavior [11-13]. The MSPE is designed to describe the total performance of the student in medical school, including objective records of grades and descriptions of other achievements, as well as subjective information about the student's professional development and behavior. Rhoton [14] published a report of 71 anesthesiology residents and found that 15 (21%) had received comments about unprofessional behaviors. These 15 residents had significant problems in many areas of practice, including diminished eagerness to learn, multiple critical incidents, and a poor knowledge base. In contrast, 21 residents had excellent performance overall and no comments describing unprofessional behavior. The study concluded “that clinical excellence and unprofessional behavior rarely coexist” [14]. In addition, Papadakis et al. found that physicians disciplined by the state medical board were significantly more likely to have had professionalism concerns raised in medical school [15]. It is expected, therefore, that such concerns reported on the MSPE would be of great interest to program directors as possible predictors of future problem behavior as a resident and practicing physician, and should certainly be discussed and clarified during the applicant's interview.
Table 4 Survey values of MSPE sections

Predictive/helpful | % Predictive | Not predictive/helpful | % Not predictive
Academic progress | 68 | Unique characteristics | 86
Academic history | 63 | Appendix E (program attributes) | 81
Overall academic ranking (Appendix D) | 62 | Pre-clinical comparative performance to classmates (Appendix A) | 79
Comparative clinical performance to classmates (Appendix B) | 61 | Summary statement | 73
 | | Professional behaviors/attitudes; comparative performance to classmates | 60

MSPE = Medical Student Performance Evaluation.
However, our data suggest that the MSPE is still not viewed by program directors as a reliable source of information regarding a student's professional behavior. This may be secondary to experience with Dean's Letters prior to the change to the MSPE format, or it may be a continued perception that medical schools gloss over professional deficiencies in the MSPE. Indeed, based on reports from previous studies and our qualitative findings, program directors often question the accuracy of the professional behaviors report in most MSPEs. The comments associated with this survey question seem to support a perception that medical schools lack a reliable tool to measure professionalism, as well as a continued belief that the MSPE, in general, avoids “negative” comments, rendering a section on professionalism inherently unreliable.

The qualitative themes that emerged from our study strongly support the following changes to improve the value of the MSPE for anesthesiology program directors:

• Include class rank;
• Encourage reliable comments about interpersonal skills and professionalism;
• Standardize the MSPE to increase the value of the comments, but shorten it to eliminate replication of other components of the application, such as the curriculum vitae;
• Include appropriate negative comments;
• Release the MSPE earlier to allow program directors to use it during the applicant screening process.

This study has several important limitations. Our response rate of 38% was lower than we expected. Before the survey was distributed, we reasoned that program directors would be more responsive during the resident selection season, when they were actively working with both the strengths and the limitations of the current system. Perhaps the rate of return would have been higher if the survey had been sent immediately at the close of the selection season. While a higher response rate would have been desirable, the 44 programs that responded mirror the range of anesthesiology residency programs in the U.S. (Table 2).

We assumed that program directors would accurately reflect how their programs utilize the MSPE in residency selection. We did not attempt to determine the specific methods that anesthesiology residency programs used for screening applicants. It might have been useful to ask about these methods so as to determine the range of processes represented by the respondents. We also did not attempt to determine the optimal timing of release of the MSPE in anesthesiology resident selection. Senior medical students applying to residency programs through the National Resident Matching Program (NRMP) can begin the process in September, while MSPEs are not released until November [16,17].
This study represents the first attempt to determine how anesthesiology residency program directors perceive the value of the MSPE as a predictor of applicants' success. As a result of our survey, future research comparing anesthesiology program directors' perceptions with actual resident performance would be helpful in determining the role of the MSPE in resident selection.
5. Conclusion

Anesthesiology residency program directors clearly value sections of the MSPE as a tool for resident selection. The MSPE is viewed by anesthesiology program directors as most worthwhile when it is concise, organized in a standardized format, and includes information that allows selection committee members to evaluate a student's performance relative to his or her medical school peers. The paucity of negative comments in most MSPEs and the lack of reliable methods to measure and report professionalism weaken the MSPE as an effective tool for selecting those residents who have the best communication skills and professional behavior.
References

[1] Association of American Medical Colleges (AAMC), Dean's Letter Advisory Committee. A guide to the preparation of the Medical Student Performance Evaluation. Association of American Medical Colleges; 2002. p. 13. [http://www.aamc.org/members/gsa/mspeguide.pdf].
[2] Edmond M, Roberson M, Hasan N. The dishonest dean's letter: an analysis of 532 deans' letters from 99 U.S. medical schools. Acad Med 1999;74:1033-5.
[3] Shea JA, O'Grady E, Morrison G, Wagner BR, Morris JB. Medical student performance evaluations in 2005: an improvement over the former dean's letter? Acad Med 2008;83:284-91.
[4] Travis C, Taylor CA, Mayhew HE. Evaluating residency applicants: stable values in a changing market. Fam Med 1999;31:252-6.
[5] Taylor CA, Weinstein L, Mayhew HE. The process of resident selection: a view from the residency director's desk. Obstet Gynecol 1995;85:299-303.
[6] DeLisa JA, Jain SS, Campagnolo DI. Factors used by physical medicine and rehabilitation residency training directors to select their residents. Am J Phys Med Rehabil 1994;73:152-6.
[7] Bernstein AD, Jazrawi LM, Elbeshbeshy B, Della Valle CJ, Zuckerman JD. An analysis of orthopaedic residency selection criteria. Bull Hosp Jt Dis 2002-2003;61:49-57.
[8] ACGME. ACGME program requirements for graduate medical education in anesthesiology. 2007. [Retrieved 8/07 from www.acgme.org].
[9] McCaskill QE, Kirk JJ, Barata DM, Wludyka PS, Zenni EA, Chiu TT. USMLE Step 1 scores as a significant predictor of future board passage in pediatrics. Ambul Pediatr 2007;7:192-5.
[10] Armstrong A, Alvero R, Nielsen P, et al. Do US Medical Licensure Examination Step 1 scores correlate with Council on Resident Education in Obstetrics and Gynecology in-training examination scores and American Board of Obstetrics and Gynecology written examination performance? Mil Med 2007;172:640-3.
[11] Hunt DD, MacLaren C, Scott C, Marshall SG, Braddock CH, Sarfaty S. A follow-up study of the characteristics of dean's letters. Acad Med 2001;76:727-33.
[12] Mallott D. Interview, dean's letter, and affective domain issues. Clin Orthop Relat Res 2006;449:56-61.
[13] Self DJ, Baldwin DC Jr. Should moral reasoning serve as a criterion for student and resident selection? Clin Orthop Relat Res 2000;(378):115-23.
[14] Rhoton MF. Professionalism and clinical excellence among anesthesiology residents. Acad Med 1994;69:313-5.
[15] Papadakis MA, Hodgson CS, Teherani A, Kohatsu ND. Unprofessional behavior in medical school is associated with subsequent disciplinary action by a state medical board. Acad Med 2004;79:244-9.
[16] Association of American Medical Colleges (AAMC). ERAS 2009 timeline for residency (allopathic) programs. [Retrieved 6/08 from http://www.aamc.org/programs/eras/programs/timeline/timeline_res.htm].
[17] National Resident Matching Program (NRMP). Results and data: 2003-2007 main residency match. Washington, DC: National Resident Matching Program; 2007.