ORIGINAL REPORTS
The Program Evaluation Committee in Action: Lessons Learned From a General Surgery Residency's Experience

Shanley B. Deal, MD,* Heather Seabott, BA,* Lily Chang, MD,† and Adnan A. Alseidi, MD, EdM†

*Graduate Medical Education, Virginia Mason Medical Center, Mailstop H8-GME, Seattle, Washington; and †Department of General, Thoracic, and Vascular Surgery, Virginia Mason Medical Center, Mailstop C6GS, Seattle, Washington

OBJECTIVE: To evaluate the success of the annual program evaluation process and describe the experience of a Program Evaluation Committee for a General Surgery residency program.

DESIGN: We conducted a retrospective review of the Program Evaluation Committee's meeting minutes, data inputs, and outcomes from 2014 to 2016. We identified top priorities by year, characterized supporting data, summarized the improvement plans and outcome measures, and evaluated whether the outcomes were achieved at 1 year.

SETTING: Virginia Mason Medical Center General Surgery Residency Program.

PARTICIPANTS: Program Evaluation Committee members including the Program Director, 2 Associate Program Directors, 2 Senior Faculty Members, and 1 Resident.

RESULTS: All outcome measures were achieved or still in progress at 1 year. This included purchasing a GI Mentor to improve endoscopic simulation training, establishing an outpatient surgery rotation to increase the volume of cases, and implementing a didactic course on adult learning principles for faculty development to improve intraoperative teaching. The primary reasons for slow progress were lack of follow-through by delegates and communication breakdowns.

CONCLUSIONS: The annual program evaluation process has been successful in identifying top priorities, developing action plans, and achieving outcome measures using our systematic evaluation process. (J Surg Ed ]:]]]-]]]. © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.)

KEY WORDS: program evaluation in medical education, surgical education, general surgery, professionalism

COMPETENCIES: Practice-Based Learning and Improvement, Professionalism, Systems-Based Practice

We would like to thank the Program Evaluation Committee members at Virginia Mason Medical Center for their work over the past 3 years. This research was made possible by support from the Patterson Surgery Research Endowment at Virginia Mason Medical Center, Seattle, WA.

Correspondence: Inquiries to Shanley B. Deal, MD, Virginia Mason Medical Center, Mailstop H8-GME, 1100 9th Ave, Seattle, WA 98101; e-mail: Shanley.[email protected]
INTRODUCTION

Over the past 2 decades, the methods by which we evaluate and improve Graduate Medical Education have changed dramatically. Residency programs have been mandated to conduct an annual program evaluation (APE), as directed by the Accreditation Council for Graduate Medical Education (ACGME), since July 2013.1,2 It is the responsibility of each residency program's Program Evaluation Committee (PEC) to conduct the APE and implement changes. Programs are encouraged to initiate this process and ensure ongoing improvement. Program directors, coordinators, faculty, residents, alumni, and staff all have a direct or indirect role in this annual review. Typically, data are gathered from surveys and standardized evaluations. However, few resources are available to guide this process. For this reason, each program tends to design its own annual program evaluation, create its own documentation tools, and determine how best to conduct the review and respond to its findings, with little guidance. For many General Surgery programs, this task is time-consuming and frustrating because of competing demands, delegation of tasks to busy faculty, and limited guiding resources that address the unique needs of technical training residency programs. The objective of this retrospective review is to analyze the process our PEC uses to set priorities, determine what action plans were developed, and assess whether improvement was achieved in measured outcomes
on an annual basis. We hypothesized that the APE would be successful as measured by achievement of action plan goals.
MATERIALS AND METHODS

The principal investigator, with the assistance of a research librarian, conducted a literature review of the PubMed, Embase, and Cochrane databases to identify publications that included any of the keywords: ACGME, annual program evaluation, residency, residency program, program evaluation committee, and general surgery. We chose to include articles published after the ACGME requirement for an annual program evaluation was implemented. These articles were reviewed and contributed to our program evaluation process. Additionally, the Joint Committee on Standards for Educational Evaluation supports the framework for our process improvement structure.3 These proposed models for evaluation from the literature, materials available through the ACGME, and resources provided by the Association of Program Directors in Surgery (APDS), in combination with the Virginia Mason Production System lean methodology, were used to develop a robust, visual, stepwise approach to creating a program evaluation plan, starting in 2014 at our institution. To explicate what was learned, we review the methods and results from our APE experience between 2014 and 2016.

In 2014, we established a PEC composed of the following members: Program Director, Associate Program Director, 2 Senior Faculty Members, Program Coordinator, and Research Resident. The committee meets biannually, in February and July, to review all pertinent source data, which are summarized in Table 1. The source data include survey results from stakeholders, examination pass rates, and clinical competency committee findings. Data were summarized into a report by the PEC Chair and provided to
PEC members for review before each meeting. Priorities were chosen by the PEC based on review of the source data to determine areas for improvement, implement new program mandates, and develop plans to address weaknesses or threats. The PEC members determined action plans, goals, and outcome measures. Delegates assigned to monitor each priority were primarily PEC members, but faculty or residents were also assigned as delegates when their interest or expertise aligned with the priority. No process existed to track progress between biannual meetings or to remind delegates to complete tasks.

Meeting minutes of the PEC from 2014 to 2016 and results from all source data were compiled for review. A sample template of our PEC annual meeting report is provided to guide readers in the creation of their own tool at their institution (Appendix 1). Using thematic analysis of written feedback, quantitative data review, and focused interviews with PEC members, we reviewed the top priorities for improvement and the action plans with their specific goals, and we evaluated actual outcome measures and progress at 1 year. These data were available from the annual meeting minutes and PEC reports. In addition, for outcome measures not achieved at 1 year, we analyzed the perceived causes of slow or stalled progress identified in our thematic analysis and focused interviews.
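For readers who wish to capture the same elements electronically, the sketch below illustrates one way a single priority record could be structured. It is a minimal, hypothetical example, not the tool our committee used; the field names simply mirror the headings of the report template in Appendix 1, and the flag for missing oversight reflects a failure mode identified in our review.

```python
# Hypothetical sketch of one PEC priority record, mirroring the Appendix 1
# template headings; illustrative only, not the committee's actual tool.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class PriorityRecord:
    year: int                       # academic year the priority was set
    category: str                   # e.g., "Resident Performance", "Program Quality"
    outcome_metric: str             # data point that prompted the priority
    improvement_opportunity: str    # what the committee wants to change
    action_plan: str                # concrete steps agreed on at the meeting
    monitoring_plan: str            # how success will be measured
    delegate: Optional[str] = None  # person responsible for follow-through
    target_year: Optional[int] = None
    notes: List[str] = field(default_factory=list)

    def missing_oversight(self) -> bool:
        """Flag records with no delegate or no target date."""
        return self.delegate is None or self.target_year is None

# Example drawn from the scholarly-activity priority described in this report.
scholarly = PriorityRecord(
    year=2015,
    category="Resident Performance",
    outcome_metric="42% of graduates completed scholarly work",
    improvement_opportunity="Increase scholarly work participation",
    action_plan="Appoint research mentors; residents declare a project by R2 year",
    monitoring_plan="Percentage of R2s with a declared project",
)
print(scholarly.missing_oversight())  # True: no delegate was assigned
```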
RESULTS

We identified 242 publications that met our literature review criteria. The principal investigator reviewed all 242 abstracts, and 11 publications were relevant to the ACGME annual program evaluation in residency.4-14 No General Surgery-specific articles were identified. Four of these articles are of particular importance to the design of the annual program evaluation process and are summarized in Table 2.9,11,13,15
TABLE 1. Source Data Gathered for Review by the Program Evaluation Committee Annually

(1) Faculty survey—institution developed
(2) Resident survey—institution developed
(3) Alumni survey—self-assessment
(4) Alumni survey—program assessment
(5) ACGME annual survey of faculty and residents
(6) Performance and pass rates on the following examinations:
    (A) American Board of Surgery (ABS) in-service training examination (ABSITE)
    (B) ABS qualifying examination (written)
    (C) ABS certifying examination (oral)
(7) Clinical competency committee (CCC)—meeting minutes and review of resident progression, including these data inputs:
    (A) Resident 360 evaluations
    (B) Faculty evaluations of residents
    (C) Resident ABSITE scores
    (D) Resident performance on mock oral examination
    (E) Resident technical performance:
        (I) Fundamentals of laparoscopic surgery (FLS) performance
        (II) Fundamentals of endoscopic surgery (FES) performance
        (III) Rotation evaluation technical assessment
TABLE 2. Literature Review Summary of 4 Key References on Annual Program Evaluation

Rose 2010 (Anesthesia)
Highlights: (1) Annual program review process; (2) In-depth review of specific metrics based on literature review
Tools: (1) Annual program review metrics; (2) Color-coded program report card tool

Nadeau 2012 (Family medicine)
Highlights: (1) Systematic process improvement; (2) Faculty and resident input; (3) SWOT (strengths, weaknesses, opportunities, and threats) analysis
Tools: (1) Data sources; (2) APE meeting agenda outline

Philibert 2014 (specialty: N/A)
Highlights: ACGME self-study review including (1) committee, (2) data collection, (3) program aims, (4) strengths and areas for improvement, and (5) opportunities and threats
Tools: (1) ACGME self-study steps; (2) Elements of documentation

Guralnick 2015 (Internal medicine)
Highlights: How to organize, plan, and conduct the annual program evaluation
Tools: (1) Self-study components and action items; (2) Special considerations; (3) Process timeline
The results of our program evaluation review are outlined in Table 3. We discuss the findings by year.

In 2014, 14 priorities were identified. Of these, 64% met the stated goals and achieved the desired outcome measure at the 1-year review, 14% had a 2-year goal for which it was not appropriate to gauge achievement at 1 year, and 21% were not achieved. We investigated why 2 of the 3 unsuccessful improvement plans did not meet the stated goals: the reasons were lack of follow-through and a change in leadership for the delegate responsible for overseeing progress. The third improvement plan fell short because the chief residents were not informed that they were expected to attend the day-long course identified as the improvement opportunity.

In 2015, we identified 18 top priorities. On review, 72% met the stated goals at 1 year, 11% met the stated 2-year goal set in 2014, 6% had a 2-year goal to be reassessed in 2016, and 11% were not achieved. Two of the unsuccessful goals did not have specific outcome measures that were tracked. One of these was to increase resident scholarly activity by monitoring completion of a research activity by the end of the PGY-2 and PGY-5 years. A curriculum and culture promoting research activity were successfully established, but no specific plan was outlined to monitor the outcome goals. In addition, no delegate was assigned to track progress or report to the committee; this was overlooked for this priority.

In 2016, we set 15 top priorities. We are unable to report our findings, as we have not reached our 1-year review. We did incorporate a new strategy of identifying 2 key focus areas for the academic year. The first focus area aimed to implement a structured laparoscopic and endoscopic curriculum to help residents prepare for the new fundamentals of endoscopic surgery requirement. Specifically, this includes semiannual proctored simulation training sessions, tracking of performance metrics, and measurement of pass rates on the fundamentals of endoscopic surgery examination. We will review our outcome measures at our 2017 annual program evaluation; however, we can report that we have been successful in implementing the curriculum and tracking performance metrics thus far. We attribute this success to delegating oversight to 2 individuals, a resident and our simulation coordinator, who had particular interest in this priority. Our second focus area for 2016 was to revitalize our resident recruitment process. The outcome measure was defined as faculty and resident satisfaction and participation. We restructured our website, applicant materials, and screening process; added a preinterview social event; and restructured our interview day. Surveys of residents and faculty members will reveal the successes, failures, and areas for improvement in this new process.

Priorities, strategies, and outcome measures varied over the past 3 years. On average, 9 priorities remained the same from year to year. Lack of follow-through by delegates responsible for monitoring specific outcome measures was the most frequent reason for lack of improvement or slow progress, followed by the absence of timelines or reminders to guide delegates in tracking outcome measures during the year.
TABLE 3. Summary of Results From Retrospective Review of the Annual Program Evaluation Committee From 2014 to 2016

2014
Top Priority Identified: Need to improve endoscopy training
Supporting Data (alumni survey): (1) 70% of graduates felt unprepared to perform EGD/colonoscopy; (2) 45% of alumni reported endoscopy as an important part of their practice
Improvement Plan: Acquire GI Mentor and implement an FES curriculum
Rationale: (1) Accelerate junior resident skills before the endoscopy rotation; (2) Provide observed feedback during simulation sessions
Outcome Measures: (1) Acquire GI Mentor by 2015; (2) Implement FES simulation program by 2016; (3) Monitor FES pass rates beginning in 2018; (4) Reassess graduate preparedness in 2020
Outcome Measure Achieved? (1) Yes; (2) Yes; (3) Pending 2018 review; (4) Pending 2020 review

2015
Top Priority Identified: Low case numbers for outpatient General Surgery cases
Improvement Plan: Identify a high-volume outpatient surgery rotation for residents
Rationale: (1) Increase case volume; (2) Provide a community-based, outpatient surgery experience
Outcome Measures: (1) Establish rotation by 2016; (2) Send 1 resident for 1 rotation to determine benefit and case volume achieved; (3) Establish permanent rotation in 2017
Outcome Measure Achieved? (1) Yes; (2) Yes: a PGY4 resident rotated, reported benefit, and logged 140 cases over 7 weeks; (3) Yes

2016
Top Priority Identified: Faculty development to improve intraoperative teaching
Supporting Data (faculty survey): >85% of faculty reported interest in improving operative teaching skills
Improvement Plan: Provide a didactic course on psychomotor skills teaching
Rationale: Improve faculty understanding of adult learning theory and how to teach psychomotor skills in the intraoperative setting
Outcome Measures: (1) Postcourse faculty survey satisfaction (>75% satisfied) with the course; (2) Review resident evaluations of faculty operative teaching to assess for improvement
Outcome Measure Achieved? (1) Pending 2017 review; (2) Pending 2017 review

CONCLUSIONS

The purpose of the Program Evaluation Committee is to identify top priorities, develop an improvement plan, and
reevaluate outcome measures annually to determine whether each outcome has been achieved or requires further attention. This process has brought about meaningful change in our residency program. Anecdotally and via survey, faculty and residents have reported that they feel the program is responsive to identified opportunities for improvement. Rather than serving as a committee that confirms compliance and checks boxes, this living, purposeful group of people can significantly affect the growth and improvement of a General Surgery residency program. We have summarized our program's success in identifying opportunities for improvement through this process and developing meaningful action plans. Lessons have been learned through participation in this process and through reflecting on and analyzing our progress over the past 3 years. We have identified 3 main challenges: first, assigning a delegate to each priority and developing a timeline for accomplishing the outcome is critical; second, we must define success or achievement of our intended outcome measures more precisely, whether numerically or descriptively, so that it is measurable and clear to all
stakeholders; finally, meeting biannually is essential to tracking progress and reinvigorating the delegates who are responsible for guiding the achievement of annual outcome measures. Identifying a PEC member or administrative assistant who can periodically send reminders, or using an automatic reminder system, would help keep delegates on track. In addition, having delegates "report out" on progress to an authoritative figure may improve adherence to timelines and action plans. Our graduate medical education department also contributes to monitoring progress: the PEC reports annually to the Graduate Medical Education Committee (GMEC), and outcome measures that require administrative oversight are identified so that the GMEC can lend support to the more challenging ones. This process has been helpful to our PEC, and our report now lists a responsible delegate and a target date for reassessing each outcome measure. In addition, we have a reporting tool for the GMEC that assigns a point value to each annual priority in order to highlight priorities deemed "critical." These critical priorities then become our focus areas for the year and are monitored more regularly by GMEC leadership.
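As an illustration only, a lightweight script along the following lines could flag overdue outcome measures and surface point-weighted "critical" priorities between biannual meetings. The structure, threshold, dates, and delegate labels below are assumptions for the sketch, not a description of our GMEC reporting tool.

```python
# Hypothetical sketch of an automatic reminder and "critical" priority report
# for delegates between biannual PEC meetings; illustrative only.
from datetime import date

# Each tracked priority: description, delegate, due date, and a point value
# used to flag "critical" priorities for GMEC-level monitoring.
priorities = [
    {"title": "Implement FES simulation curriculum", "delegate": "Simulation coordinator",
     "due": date(2016, 7, 1), "points": 5},
    {"title": "Revitalize resident recruitment process", "delegate": "Program Coordinator",
     "due": date(2016, 10, 1), "points": 3},
]

CRITICAL_THRESHOLD = 4  # assumed cutoff; the actual weighting would be set by the GMEC

def reminder_report(items, today=None):
    """Return reminder lines marking overdue items and critical priorities."""
    today = today or date.today()
    lines = []
    for p in items:
        status = "OVERDUE" if p["due"] < today else "on track"
        critical = " [CRITICAL]" if p["points"] >= CRITICAL_THRESHOLD else ""
        lines.append(f'{p["title"]}{critical}: {p["delegate"]}, due {p["due"]} ({status})')
    return lines

for line in reminder_report(priorities, today=date(2016, 8, 1)):
    print(line)
```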
FIGURE. Comprehensive timeline for the annual program evaluation review process. Leadership meetings are reviews conducted by the Graduate Medical Education staff.
A comprehensive diagram of our annual program review process is summarized in the Figure. It is critical for programs to recognize that priorities are developed from external and internal data sources and must be evaluated in the context of changing priorities unique to their specialty's needs. Examining all of the data collected for review by the PEC with a broad lens will help identify top priorities. The annual program evaluation process can go beyond fulfilling requirements. We propose that our approach can be individualized to program needs. We encourage programs to adopt this holistic approach to residency program
improvement and to use the materials and references accompanying this article. Residency programs need to be both self-reflective and proactive to improve their educational curricula. It is important for programs to recognize that their priorities will adapt to program changes annually. Thus, a flexible program evaluation process and a willingness to modify action plans are critical to developing a more meaningful approach to program evaluation. Through an intentional evaluative process, residency programs can restructure their curricula to align with learner, instructor, regulatory, and marketplace changes in an efficient, straightforward manner. The annual program evaluation provides a starting point for this process.
APPENDIX 1. TEMPLATE OF AN ANNUAL PROGRAM EVALUATION REPORT WITH EXAMPLES

Annual Program Evaluation
Graduate Medical Education (Institution Name)
Program ( )
Academic Year ( )
Date Submitted to GMEC: _____________
Date Submitted to Program Director: ______________
Committee Chair:
Participating Committee Members:

The committee makes the following findings and performance improvement recommendations:

Resident Performance
Outcome Metrics/Measures: Example: 42% of Surgery graduates completed scholarly work
Improvement Opportunity: Increase scholarly work participation
Action Plan: Appoint research mentors for each resident. Clarify expectation that all residents will declare a project by their R2 year.
Outcome Measure/Monitoring Plan: Percentage of R2s who have declared their project focus. Percentage of R3s who have begun work.

Faculty Development
Outcome Metrics/Measures: Example: >75% of faculty requested training for operative teaching skills
Improvement Opportunity: Provide a faculty development opportunity to improve intraoperative teaching skills.
Action Plan: Implement a didactic session for faculty on adult learning and teaching psychomotor skills.
Outcome Measure/Monitoring Plan: Percentage of faculty requesting training on operative teaching skills decreases.

Graduate Performance
Outcome Metrics/Measures: Example: 87% of graduates pass the oral board examination the first time.
Improvement Opportunity: Increase the percentage of graduates passing the oral board examination the first time by 5% over 3 years.
Action Plan: Identify at-risk senior residents based on mock oral performance and provide additional mock oral practice before graduation or develop an individualized study plan.
Outcome Measure/Monitoring Plan: Percentage of graduates who pass the oral board examination the first time.

Program Quality
Outcome Metrics/Measures: Example: Case numbers for bread and butter General Surgery cases in the 10th percentile.
Improvement Opportunity: Increase opportunity for case volume in outpatient General Surgery cases.
Action Plan: Identify rotation which could be added to the curriculum to increase case volume in these areas.
Outcome Measure/Monitoring Plan: Addition of rotation to increase case volume. Improvement in case numbers to 30th percentile.

Signature of Program Director:
Date of Review by Program Faculty:
Date of Review by GMEC:
REFERENCES

1. Martinez S, Robertson WW Jr, Philibert I. Initial tests of the ACGME self-study. J Grad Med Educ. 2013;5(3):535-537.
2. Nasca TJ, Philibert I, Brigham T, Flynn TC. The next GME accreditation system—rationale and benefits. N Engl J Med. 2012;366(11):1051-1056.
3. Joint Committee on Standards for Educational Evaluation. Available from: 〈http://www.jcsee.org/program-evaluation-standards-statements〉; 2017.
4. Amedee RG, Piazza JC. Institutional oversight of the Graduate Medical Education enterprise: development of an annual institutional review. Ochsner J. 2016;16(1):85-89.
5. Balmer DF. The initiative for innovation in pediatric education: a snapshot of a program evaluation. Pediatrics. 2012;129(6):1017-1018.
6. Bellini L, Shea JA, Asch DA. A new instrument for residency program evaluation. J Gen Intern Med. 1997;12(11):707-710.
7. Bierer SB, Fishleder AJ, Dannefer E, Farrow N, Hull AL. Psychometric properties of an instrument designed to measure the educational quality of graduate training programs. Eval Health Prof. 2004;27(4):410-424.
8. Gardner AK, Scott DJ, Choti MA, Mansour JC. Developing a comprehensive resident education evaluation system in the era of milestone assessment. J Surg Educ. 2015;72(4):618-624.
9. Guralnick S, Hernandez T, Corapi M, et al. The ACGME self-study—an opportunity, not a burden. J Grad Med Educ. 2015;7(3):502-505.
10. Murray PM, Valdivia JH, Berquist MR. A metric to evaluate the comparative performance of an institution's graduate medical education program. Acad Med. 2009;84(2):212-219.
11. Nadeau MT, Tysinger JW. The annual program review of effectiveness: a process improvement approach. Fam Med. 2012;44(1):32-38.
12. Phitayakorn R, Levitan N, Shuck JM. Program report cards: evaluation across multiple residency programs at one institution. Acad Med. 2007;82(6):608-615.
13. Rose SH, Long TR. Accreditation Council for Graduate Medical Education (ACGME) annual anesthesiology residency and fellowship program review: a report card model for continuous improvement. BMC Med Educ. 2010;10(1):13.
14. Torbeck L, Canal DF, Choi J. Is our residency program successful? Structuring an outcomes assessment system as a component of program evaluation. J Surg Educ. 2014;71(1):73-78.
15. Philibert I, Lieh-Lai M. A practical guide to the ACGME self-study. J Grad Med Educ. 2014;6(3):612-614.