ORIGINAL REPORTS
Program Director Perceptions of the General Surgery Milestones Project

Brian C. Drolet, MD,*,† Jayson S. Marwaha, BS,‡ Abdul Wasey, BS,‡ and Adam Pallant, MD, PhD§

*Department of Plastic Surgery, Vanderbilt University Medical Center, Nashville, Tennessee; †Department of Biomedical Informatics, Vanderbilt University Medical Center, Nashville, Tennessee; ‡The Warren Alpert Medical School of Brown University, Rhode Island Hospital, Providence, Rhode Island; and §Department of Pediatrics, Rhode Island Hospital, Providence, Rhode Island
OBJECTIVE: As a result of the Milestones Project, all Accreditation Council for Graduate Medical Education accredited training programs now use an evaluation framework based on outcomes in 6 core competencies. Despite their widespread use, the Milestones have not been broadly evaluated. This study sought to examine program director (PD) perceptions of the Milestones Project.

DESIGN, SETTING, AND PARTICIPANTS: A national survey of general surgery PDs distributed between January and March of 2016.

RESULTS: A total of 132 surgical PDs responded to the survey (60% response rate). Positive perceptions included value for education (55%) and evaluation of resident performance (58%), as well as the ability of Milestones to provide unbiased feedback (55%) and to identify areas of resident deficiency (58%). Meanwhile, time input and the ability of Milestones to discriminate underperforming programs were less likely to be rated positively (25% and 21%, respectively). More than half of PDs (55%) felt that the Milestones were an improvement over their previous evaluation system.

CONCLUSIONS: Using the Milestones as competency-based, developmental outcomes measures, surgical PDs reported perceived benefits for education and objectivity in the evaluation of resident performance. The overall response to the Milestones was generally favorable, and most PDs would not return to their previous evaluation systems. To improve future iterations of the Milestones, many PDs expressed a desire for customization of the Milestones' content and structure to allow for programmatic differences. (J Surg Ed ]:]]]-]]]. Published by Elsevier Inc on behalf of the Association of Program Directors in Surgery)

KEY WORDS: milestones, assessment, evaluation, surgery, graduate medical education

COMPETENCIES: Practice-Based Learning and Improvement, Patient Care and Procedural Skills, Systems-Based Practice, Medical Knowledge, Interpersonal and Communication Skills, Professionalism
Correspondence: Inquiries to Brian C. Drolet, MD, Department of Plastic Surgery, Vanderbilt University Medical Center, Medical Center N, D-4219, Nashville, TN 37232; fax: (615)936-0167; e-mail:
[email protected]
INTRODUCTION

Since the implementation of the Accreditation Council for Graduate Medical Education (ACGME) Outcome Project in 1999, outcomes-based education and assessment have become a cornerstone of graduate medical education.1-3 More than a decade later, the Next Accreditation System was implemented to further promote this outcomes focus.4 A central feature of the Next Accreditation System is the specialty-specific Milestones, which are competency-based developmental outcomes that form the basis for evaluative metrics within the framework of the core competencies.5,6

Although the Milestones are now used for resident and fellow evaluations at all ACGME-accredited training programs, their use in practice has not been broadly studied, and some concerns have been raised. An earlier study of the 1999 Outcome Project demonstrated significant barriers to successful utilization, including lack of time, funding, and faculty support, as well as resistance to the ACGME mandate.7 Authors of another study, which examined similar competency-based evaluations outside of medicine (K-12 education and the Department of Defense), found several concerning features of the Milestones that may lead to failure, including differences in learner styles, evaluators' assessment constructs, and the time needed for direct observation to perform these evaluations.8

In this study, we sought to evaluate program director (PD) experience with and perceptions of the Milestones in general surgery.
Journal of Surgical Education. Published by Elsevier Inc on behalf of the Association of Program Directors in Surgery. 1931-7204/$30.00. http://dx.doi.org/10.1016/j.jsurg.2017.02.012
TABLE 1. Summary of Main Survey Results (% respondents, 95% CI)

How would you rate Milestones in terms of: (Negative / Neutral / Positive)
  Value for education: 9.8 (6.3-13.4) / 34.8 (29.2-40.5) / 55.3 (49.4-61.2)
  Frequency of evaluation: 7.6 (4.4-10.7) / 35.6 (29.9-41.3) / 56.8 (50.9-62.7)
  Evaluation of resident performance: 12.1 (8.2-16) / 29.5 (24.1-35) / 58.3 (52.4-64.2)
  Ability to provide unbiased evaluations: 12.9 (8.9-16.9) / 31.8 (26.3-37.4) / 55.3 (49.4-61.2)
  Time input for completion: 36.4 (30.6-42.1) / 38.6 (32.8-44.5) / 25 (19.8-30.2)
  Comparison to your previous evaluation system: 15.9 (11.5-20.3) / 28.8 (23.4-34.2) / 55.3 (49.4-61.2)

How have Milestones changed your practice as PD with regard to: (Negatively / No change / Positively)
  Teaching of residents: 5.3 (2.7-8) / 74 (68.8-79.3) / 20.6 (15.8-25.4)
  Identifying resident deficiencies: 6.1 (3.2-9) / 35.9 (30.1-41.6) / 58 (52.1-63.9)
  Decreasing bias in evaluations: 3.1 (1-5.1) / 48.9 (42.9-54.8) / 48.1 (42.1-54.1)
  Moving your evaluative process toward competency: 6.1 (3.2-9) / 37.4 (31.6-43.2) / 56.5 (50.6-62.4)
  Your overall feelings about being a PD: 13.7 (9.6-17.9) / 61.8 (56-67.6) / 24.4 (19.3-29.6)

Indicate your level of agreement: (Disagree / Neutral / Agree)
  Milestones enhances trainee competence: 28.2 (22.9-33.6) / 26.7 (21.4-32) / 45 (39.1-51)
  I have adequate support (e.g., protected time and trained coordinator) for Milestones requirements: 34.1 (28.4-39.8) / 34.1 (28.4-39.8) / 31.8 (26.3-37.4)
  I wish Milestones were customizable for aspects unique to my program: 26.5 (21.2-31.8) / 34.1 (28.4-39.8) / 39.4 (33.6-45.2)
  Milestones enhances my effectiveness as a PD: 31.8 (26.3-37.4) / 32.6 (27-38.2) / 35.6 (29.9-41.3)
  Milestones have positively influenced my career plans: 40.2 (34.3-46) / 54.5 (48.6-60.5) / 5.3 (2.6-8)
  Milestones data would be able to discriminate between successful and underperforming training programs: 50 (44-56) / 28.8 (23.4-34.2) / 21.2 (16.3-26.1)
  I was given enough instruction/training on how to complete the Milestones: 23.5 (18.4-28.5) / 37.1 (31.4-42.9) / 39.4 (33.6-45.2)

Bold indicates statistically significant plurality or majority response (P < 0.05).
MATERIALS AND METHODS

Data Collection
We began by identifying the developmental themes and goals of outcomes-based evaluation in graduate medical education through a literature review.1-6 From this review, we developed a survey instrument to evaluate PD experience with the Milestones. We used a 5-point Likert scale for all survey responses, with the exception of one question ("Overall, how do you feel about the Milestones?"), which offered 5 possible responses (Table 2). We pretested the survey with faculty and then piloted it with PDs on the graduate medical education committee at one institution. Throughout this process, we iteratively revised the survey for content and clarity. Cronbach's alpha (0.83) demonstrated excellent internal consistency of the questionnaire. The project was granted exempt status by the institutional review board.

The reference set for this national survey included all general surgery PDs (N = 249) listed by the ACGME in the Accreditation Data System. We obtained functional email addresses for 219 PDs (88%) from multiple sources, including the ACGME Accreditation Data System, the American Medical Association Residency and Fellowship Database (FREIDA), and a broad internet search. The anonymous, electronic survey was then distributed in 3 rounds to all contacts by individualized email between January and March 2016.
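The internal-consistency statistic reported above can be computed directly from a matrix of item scores. A minimal sketch follows; the response matrix here is synthetic illustration, not the study's data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a respondents x questions matrix of Likert scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of questions
    item_vars = items.var(axis=0, ddof=1)       # sample variance of each question
    total_var = items.sum(axis=1).var(ddof=1)   # variance of each respondent's summed score
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses: 4 respondents x 3 questions
scores = [[4, 5, 4], [3, 3, 3], [5, 5, 4], [2, 3, 2]]
alpha = cronbach_alpha(scores)
```

Values of alpha near 1 indicate that the questionnaire items vary together, which is the sense in which the study's 0.83 supports internal consistency.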
Statistical Analysis

For our analysis, we studied 5-point Likert scale responses as 3 groups (negative/disagree, neutral/no change, and positive/agree) by combining the proportions of similar responses. For example, we combined the proportions of "Strongly agree" and "Agree" responses into 1 group. Two-sided confidence intervals were calculated for the proportion of respondents in each group using the standard error of proportions with a finite population correction and α = 0.05. We considered there to be a significant difference between groups when the confidence intervals did not overlap. All analyses were performed in SPSS, version 22.0 (IBM Corp).
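The interval calculation described above can be sketched as follows. The text does not state the exact formula used in SPSS, so this sketch assumes the normal approximation with z = 1.96 and the (N - n)/(N - 1) form of the finite population correction:

```python
import math

def proportion_ci(p_hat, n, N, z=1.96):
    """Two-sided confidence interval for a proportion, with finite population correction.

    p_hat: observed proportion; n: respondents; N: size of the finite population.
    """
    se = math.sqrt(p_hat * (1 - p_hat) / n)   # standard error of a proportion
    fpc = math.sqrt((N - n) / (N - 1))        # finite population correction
    return p_hat - z * se * fpc, p_hat + z * se * fpc

# e.g., 55.3% of 132 respondents, sampled from all 249 general surgery PDs
lo, hi = proportion_ci(0.553, 132, 249)
```

Under these assumptions the interval is roughly 49.5% to 61.1%, close to the 49.4-61.2 reported in Table 1 for the 55.3% responses; small differences would follow from rounding or a slightly different correction term.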
TABLE 2. Overall Perceptions of Milestones (% respondents, 95% CI)

Overall, how do you feel about the Milestones?
  Return to my old system: 9.8 (6.3-13.4)
  Modify the current Milestones, but they are a good start: 46.2 (40.3-52.2)
  Keep Milestones as they are, but I need more resources to make them work: 24.2 (19.1-29.4)
  Fully support Milestones in current form: 14.4 (10.2-18.6)
RESULTS

A total of 132 surgical PDs responded to the survey (60% response rate). On average, respondents had 7.8 years (SD = 6.0) of experience in their role as PD. Training programs ranged in size, with an average of 31 residents per program (range: 5-80). Most PDs reported spending approximately 1.5 hours to complete an individual resident's Milestones evaluation (mean: 94 ± 38 min). Milestones were most frequently completed on a twice-annual basis (88%); the second largest grouping completed Milestones 12 times annually, at the end of 1-month rotations (1.5%). There were no significant differences in time spent completing the Milestones among PDs with more experience or those in larger programs.

A statistically significant majority of PDs reported favorable views of the Milestones for many of the dimensions evaluated, including value for education (55%), assessment of resident performance (58%), ability of Milestones to provide unbiased feedback (55%), and ability to identify areas of resident deficiency (58%). Although respondents' opinions were split on whether they had enough resources to complete evaluations, most felt that the Milestones were an improvement over their previous evaluation system (55%) (Table 1).

The survey results showed little perceived effect of the Milestones beyond resident evaluations, and no perceived benefit for teaching of residents (74%). The Milestones did not change most respondents' feelings toward being a PD (62%) and had no effect on their career plans (55%). Most respondents disagreed that data from the current Milestones would effectively distinguish successful training programs from underperforming programs (54%).

Although most PDs (57%) agreed that the Milestones have moved their evaluative process toward competency (the central goal of the Milestones Project), many felt that the Milestones should be more customizable for their programs (39%), and a plurality believed that the current Milestones are a good start but should be modified going forward (46%) (Table 2).

CONCLUSIONS

In this national survey, we found that many general surgery PDs had positive perceptions of the Milestones. Few respondents wished to return to their former evaluation systems or expressed negative feelings about the Milestones compared with their old evaluations. Negative perceptions of the Milestones were primarily related to the time and resource input necessary to complete the evaluations. Meanwhile, there were perceived benefits for competency-based education, evaluation of resident performance, and the ability to provide unbiased feedback.

One important finding was a desire for customizability and changes to the Milestones. Almost half of PDs felt the Milestones were a good start but expressed a desire to modify their content and structure. Although the Milestones are "specialty-specific," the format remains a one-size-fits-all approach for programs of varying size and specialty. For example, even if there are more faculty and support staff in larger programs, the Milestones may be more cumbersome at larger scales. Therefore, customization of content could help mitigate some of the concerns that arise with scale.

Although the Milestones have been widely discussed in the literature, few studies have examined outcomes or perceptions of the Milestones as they were first described for general surgery.9 In one study of internal medicine residents, evaluations using the Milestones were reportedly better than those from the previous system.10 However, no other studies have examined the response of the graduate medical education community to the Milestones.

The primary weakness of this study is the methodology, which evaluates only perceptions and thus assesses the Milestones indirectly on factors such as education and evaluation. Importantly, psychometric research may be confounded by survey and respondent biases. Questions may have been interpreted differently by participants, leading to inconsistent reporting. Furthermore, there is potential for nonresponse bias (40% of the surveys were not returned) as well as various cognitive biases (e.g., social desirability bias). Finally, not all aspects of the Milestones were evaluated by the limited number of questions used in the survey. Direct measurement of the Milestones' effect may be difficult (or impossible), as competency is reflected in patient outcomes, which are often missing or highly variable across specialties. Even with high-quality outcomes data from the National Surgical Quality Improvement Program (NSQIP), it would be a challenge for researchers to correlate patient outcomes with competency-based evaluation strategies because of a variety of confounding variables (e.g., contemporaneous changes in duty hours or healthcare legislation).

Nevertheless, these survey results suggest that PDs feel the Milestones have had a positive effect on resident assessment. This may result from identifying specific areas of deficiency for trainees and helping to tailor educational objectives to mitigate these weaknesses.
REFERENCES

1. Swing SR. The ACGME outcome project: retrospective and prospective. Med Teach. 2007;29(7):648-654.

2. Lurie SJ, Mooney CJ, Lyness JM. Measurement of the general competencies of the accreditation council for graduate medical education: a systematic review. Acad Med. 2009;84(3):301-309.

3. Iobst WF, Sherbino J, Cate OT, et al. Competency-based medical education in postgraduate medical education. Med Teach. 2010;32(8):651-656.

4. Nasca TJ, Philibert I, Brigham T, Flynn TC. The next GME accreditation system—rationale and benefits. N Engl J Med. 2012;366(11):1051-1056.