
ORIGINAL REPORTS

The Changing Landscape of Surgical Education: What are Residency Education Teams and do we Need Them?

Nicole Woll, PhD, MEd, Marie Hunsinger, RN, James Dove, BA, Linda Famiglio, MD, John Boker, PhD, and Mohsen Shabahang, MD, PhD

Department of General Surgery, Academic Affairs, Geisinger Health System, Danville, Pennsylvania

OBJECTIVES: This study aims to understand how general surgery training programs constitute their residency education team (RET), how they define the roles of RET members, and how they measure the success of the team. It fundamentally asks the question, "What is a RET and do we need one?"

DESIGN AND PARTICIPANTS: Program directors, associate program directors, educators, program coordinators, and chief residents from Accreditation Council for Graduate Medical Education (ACGME) general surgery training programs were asked to anonymously complete a survey categorized into 3 sections: (1) roles and responsibilities, (2) views of his/her RET and team members, and (3) general views about RETs. All respondents provided their opinions on the importance of a RET for administering and leading a surgical residency, whom the ideal members would be, and the main outcomes of a high-functioning RET.

RESULTS: Respondents (n = 167) included 59 (35.3%) program directors, 16 (9.6%) associate program directors, 8 (4.8%) educators, 67 (40.1%) program coordinators, and 6 (3.6%) chief residents. Overall, 84.4% of respondents were part of a RET, defined as 2 or more individuals who are responsible and accountable for oversight and conduct of the residency training program. RET respondents ascribed significantly higher importance to a RET (p < 0.0001) than their non-RET counterparts.

CONCLUSIONS: This study provides a snapshot of how some of those associated with general surgery residencies view and value RETs. The results of this survey are preliminary and suggest a need for educators within surgery programs and ambiguity about the role of the associate program director. They also suggest that a closer look at role responsibilities may be of value, especially in view of the changing landscape of surgical education. Overall, most respondents felt that a RET was important to the main outcomes of a successful residency program. (J Surg Educ ]:]]]-]]]. © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.)

KEY WORDS: team, educator, associate program director, surgical education

COMPETENCY: Systems-Based Practice

Correspondence: Inquiries to Nicole Woll, PhD, MEd, Academic Affairs, Geisinger Health System, 100 North Academy Avenue, MC 13-34, Danville, PA 17822; e-mail: [email protected]

INTRODUCTION

The definition of "residency education team" (RET) differs among residency programs depending on their structure and needs. Although every program must have a program director and a program coordinator,1 the individuals needed for program oversight, curriculum development, evaluation, and additional tasks remain unclear. Our program uses a team-based approach that includes the program director, 2 associate program directors, a nonphysician surgical educator, a program coordinator, and 3 chief residents. This group meets regularly and defines roles and responsibilities to accomplish daily tasks and long-term goals. This study aims to understand how other surgical training programs constitute their RET, how they define the roles of RET members, and how they measure the success of the team. It fundamentally asks the question, "What is a RET and do we need one?"

Surgical education has changed dramatically over the past 15 years, largely in response to several initiatives developed and implemented by the Accreditation Council for Graduate Medical Education (ACGME). These initiatives include the Outcomes Project (1999), duty-hour restrictions (2003), the Next Accreditation System (NAS; 2013), and the current Milestones Project.2 The Outcomes Project had the goal of changing the focus of graduate medical education to improve training, to focus on outcomes, and to allow public access to physician competencies. The framework of this focus was embedded in the 6 core competencies designated to "competently and compassionately" treat patients in today's changing health care system.3 The duty-hour restrictions limited the number and duration of residents' work hours, and this dramatic change led many programs to rethink their curriculum and educational structure.2,4 Initial implementation of the NAS aimed to increase peer review, to accelerate accreditation based on outcomes, and to reduce the burden associated with the current process.5 The educational milestones of the NAS extend the core competencies and are intended to provide annual educational outcomes data.6 In addition to the ACGME initiatives, dramatic development, growth, and change occurred separately in surgical simulation, information technology, reimbursement for services, and research requirements.7 Particularly in the educational setting, such growth and evolution have positive value, potentially enhancing outcomes, increasing learner and faculty satisfaction, and meeting standards set by accrediting bodies. Even when these changes to medical education are viewed as improvements, they bring the realization that change often requires additional resources. For example, the current requirement of a program director and a program coordinator may not be enough to achieve the desired outcomes of surgical education. A systematic review of program director surveys across 9 specialties found that time and financial constraints were the major barriers to accomplishing outcome-based education; furthermore, these administrative constraints may lead to dissatisfied program directors.8-10 One program developed a new surgical residency model consistent with duty-hour restrictions and, through self-assessment, concluded that work-hour limits did not degrade the educational experience but did produce new challenges and required additional resources.4 As the original aims of the Outcomes Project continue to be fulfilled and new initiatives are developed, support for program directors and training programs will become more critical. We propose that a RET is an essential component of a training program if it is to meet the demands of outcomes-based surgical education, but equally important are the definition of roles and responsibilities and the measurement of team success.

METHODS

Pilot Study

A pilot study to validate the survey tool used in this research began in November 2011 by recruiting all program directors, associate program directors, educators, program coordinators, and chief residents at our institution into an Institutional Review Board–exempt survey.11 Specifically, participants were surveyed in 4 distinct areas: (1) demographics, (2) views of self as team member, (3) views of other team members and the team, and (4) general views about RETs. Participants provided comments and feedback to refine the survey items for use in the subsequent national study.

National Study

The national study was approved by the Institutional Review Board in September 2012 as exempt research. The first step identified program directors, associate program directors, and program coordinators from a comprehensive list of accredited general surgery programs maintained by the Association of Program Directors in Surgery. All identified individuals then received an e-mail containing the study description and the online survey link. The message specifically asked recipients to forward our e-mail to any other members of their RET, particularly educators and chief residents, who functioned in an administrative role in the residency. We defined "RET" for them as "two or more individuals who are responsible and accountable for oversight and conduct of the residency training program." Based on the Association of Program Directors in Surgery list, the response rates for program directors, associate program directors, and program coordinators were 23%, 18%, and 25%, respectively. The denominator for educators was not known, so a response rate for that role could not be calculated. Only 6 chief residents responded, so their results were used only for aggregate comparisons and are not intended to represent the opinions of this group.

Survey respondents anonymously completed 25 structured-response items categorized into 3 sections: (1) roles and responsibilities, (2) views of his/her RET and team members, and (3) general views about RETs. We queried them about how long they held their current position, how they acquired it, and how they prepared themselves to assume the position. They also listed their 3 most important residency job functions from a list and indicated whether they now served on a RET. Finally, all respondents provided their opinions on the importance of a RET for administering and leading a surgical residency, whom the ideal members would be, and the main outcomes of a high-functioning RET.

Statistical Analyses

We performed all statistical analyses with SAS software, version 9.3 (SAS Institute Inc., Cary, NC). Categorical data were expressed as frequencies (percentages), and continuous data were expressed as median values (interquartile range [IQR]). Between-groups comparisons used the chi-square or Fisher exact test for categorical data, the Wilcoxon Mann-Whitney test for continuous data, and ordered logistic regression for ordinal data. Each comparison applied a nominal p < 0.05 for statistical significance. Subgroup analyses compared responses from RET vs non-RET members, large (>30 residents) vs small (≤30) programs, and attending physician (program director/associate program director) vs nonattending physician (educator/program coordinator/chief resident/other) members.
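The analyses above were performed in SAS 9.3. For readers who want to run the same kinds of comparisons outside SAS, the sketch below shows roughly equivalent tests in Python using scipy and statsmodels; the variable names and the small randomly generated dataset are illustrative assumptions only, not the study data or the authors' code.

# Illustrative sketch only: reproduces the *types* of tests described above
# (chi-square / Fisher exact, Wilcoxon Mann-Whitney, ordered logistic regression)
# on made-up data. The study itself was analyzed in SAS 9.3.
import numpy as np
import pandas as pd
from scipy import stats
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(0)

# Hypothetical respondent-level data: RET membership, program size,
# and an ordinal importance rating (0 = "not at all" ... 4 = "to a very great extent").
df = pd.DataFrame({
    "on_ret": rng.choice([0, 1], size=167, p=[0.16, 0.84]),
    "n_residents": rng.integers(10, 80, size=167),
    "importance": rng.integers(0, 5, size=167),
})
df["large_program"] = (df["n_residents"] > 30).astype(int)

# Categorical comparison: chi-square, or Fisher exact for a sparse 2x2 table.
table = pd.crosstab(df["on_ret"], df["large_program"])
chi2, p_chi, _, expected = stats.chi2_contingency(table)
if (expected < 5).any() and table.shape == (2, 2):
    _, p_cat = stats.fisher_exact(table)
else:
    p_cat = p_chi

# Continuous comparison: Wilcoxon Mann-Whitney on number of residents,
# summarized as median (IQR) by group.
ret = df.loc[df["on_ret"] == 1, "n_residents"]
non_ret = df.loc[df["on_ret"] == 0, "n_residents"]
_, p_resid = stats.mannwhitneyu(ret, non_ret)
medians = df.groupby("on_ret")["n_residents"].quantile([0.25, 0.5, 0.75]).unstack()

# Ordinal comparison: ordered (proportional-odds) logistic regression of the
# importance rating on RET membership.
olr = OrderedModel(df["importance"], df[["on_ret"]], distr="logit").fit(method="bfgs", disp=False)
p_ordinal = olr.pvalues["on_ret"]

print(f"categorical p={p_cat:.4f}, continuous p={p_resid:.4f}, ordinal p={p_ordinal:.4f}")
print(medians)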

RESULTS

Roles in the Residency Program

Respondents (n = 167) included 59 (35.3%) program directors, 16 (9.6%) associate program directors, 8 (4.8%) educators, 67 (40.1%) program coordinators, 6 (3.6%) chief residents, and 11 (6.6%) who categorized themselves as "other" (Table 1). Examination of narrative responses revealed that education managers, supervisors, and site directors mostly self-identified as "other." Program directors and program coordinators comprised 75.4% of the survey sample. The majority (58.7%) of respondents reported holding their current role for less than 5 years. Respondents acquired their roles most often by formal application (37.7%), recruitment (24%), or assignment (22.2%). Almost one-half (47.9%) prepared themselves by "learning on the job."

Role Responsibilities in the Residency Program

Respondents selected their 3 primary role functions from 10 provided options. A summed score computed from a weighted scale, where 3 = "most important," 2 = "second most important," and 1 = "third most important," determined the top functions collectively and by role (Table 2). All respondents ranked recruitment and selection (score = 243), program accreditation (233), and assessment and evaluation (199) as their 3 most important functions. When examined by separate roles, however, program directors and associate program directors both ranked assessment and evaluation (program director score = 94; associate program director score = 22), recruitment and selection (87 and 17), and curriculum development (70 and 16) as most important. Educators replicated the program director/associate program director functions but ranked them differently, i.e., curriculum development (16), assessment and evaluation (12), and recruitment and selection (8). Program coordinators ranked their main functions as program accreditation (130), recruitment and selection (113), and assessment and evaluation (45).

Selected subgroup analyses using ordered logistic regression further examined the aforementioned role responsibility data (Table 3). Comparing "large" (>30 residents) vs "small" (≤30) programs yielded a statistically significant difference, with the latter subgroup placing more importance on duty-hour enforcement (p = 0.0097). Comparing attending physicians (program director/associate program director) vs nonattending physicians (educator/program coordinator/chief resident/other) yielded significant contrasts for 5 role responsibilities. Attending physicians ranked assessment and evaluation and curriculum development as more important (p = 0.0005 and p < 0.0001, respectively). Conversely, nonattending physicians viewed rotation and conference scheduling, program accreditation, and duty-hour enforcement as more important (p = 0.0044, p = 0.0001, and p = 0.0046, respectively). Additional analysis yielded no significant difference in how attending physicians vs educators ranked their 3 main role functions, despite their selecting identical responsibilities. Finally, program coordinators ranked program accreditation significantly higher in role importance than the rest of the respondents (p < 0.0001).
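As a concrete illustration of this weighted scoring, the short Python sketch below computes a summed score from hypothetical top-3 selections; the rows of responses are made up for illustration and are not the survey data.

# Minimal sketch of the weighted top-3 scoring described above:
# each respondent's first/second/third most important function earns
# 3/2/1 points, and points are summed per function within each role.
# The example rows are hypothetical, not the survey responses.
import pandas as pd

WEIGHTS = {"first": 3, "second": 2, "third": 1}

responses = pd.DataFrame([
    # role, first choice, second choice, third choice (hypothetical)
    ("PD", "assessment/evaluation", "recruitment and selection", "curriculum development"),
    ("PC", "program accreditation", "recruitment and selection", "assessment/evaluation"),
    ("E",  "curriculum development", "assessment/evaluation", "recruitment and selection"),
], columns=["role", "first", "second", "third"])

# Reshape to one row per (respondent, rank), attach the 3/2/1 weight,
# then sum the weights for each function within each role.
long = responses.melt(id_vars="role", var_name="rank", value_name="function")
long["points"] = long["rank"].map(WEIGHTS)
scores = long.groupby(["role", "function"])["points"].sum().sort_values(ascending=False)
print(scores)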

TABLE 1. Roles in the Residency Program (No. (%))

Demographic/Characteristic            PD          APD         E           PC          CR          Other       Total
Total no. of respondents              59 (35.3)   16 (9.6)    8 (4.8)     67 (40.1)   6 (3.6)     11 (6.6)    167
Length of time in current role
  <1 y                                12 (20.3)   2 (12.5)    1 (12.5)    8 (11.9)    5 (83.3)    2 (18.2)    30 (18.0)
  1-5 y                               32 (54.2)   7 (43.8)    6 (75)      20 (29.9)   1 (16.7)    2 (18.2)    68 (40.7)
  6-10 y                              8 (13.6)    6 (37.5)    0           16 (23.9)   0           1 (9.1)     31 (18.6)
  11-15 y                             4 (6.8)     0           0           8 (11.9)    0           4 (36.4)    16 (9.6)
  >15 y                               3 (5.1)     1 (6.3)     1 (12.5)    15 (22.4)   0           2 (18.2)    22 (13.2)
How did you acquire your position?
  Formally applied                    29 (49.2)   3 (18.8)    2 (25)      27 (40.3)   0           2 (18.2)    63 (37.7)
  Assigned                            17 (28.8)   8 (50)      1 (12.5)    7 (10.4)    3 (50)      1 (9)       37 (22.2)
  Part of broader job description     0           1 (6.3)     2 (25)      0           2 (33.3)    0           5 (3.0)
  Recruited                           3 (5.1)     3 (18.8)    1 (12.5)    26 (38.8)   0           7 (63.6)    40 (24.0)
  No one else to do it                6 (10.2)    0           0           1 (1.5)     0           0           7 (4.2)
  Rotating responsibility             0           0           0           0           1 (16.7)    0           1 (<1)
  Other                               4 (6.8)     1 (6.3)     2 (25)      6 (9.0)     0           1 (9.0)     14 (8.4)
How did you prepare yourself for the position? (select all that apply)*
  Relevant education                  11 (18.6)   0           3 (37.8)    18 (26.9)   1 (16.7)    5 (45.5)    38 (22.8)
  Relevant prior experience           23 (39)     3 (18.8)    4 (50)      16 (23.9)   1 (16.7)    5 (45.5)    52 (31.1)
  Same position elsewhere             11 (11.6)   2 (12.5)    3 (37.8)    17 (25.4)   3 (50)      2 (18.2)    38 (22.8)
  Mentored by team member             16 (27.1)   5 (31.2)    1 (12.5)    18 (26.9)   1 (16.7)    4 (36.4)    45 (27.0)
  Learned on job                      29 (49.2)   10 (62.5)   1 (12.5)    33 (49.3)   3 (50)      4 (36.4)    80 (47.9)
  Other                               2 (3.3)     1 (6.3)     0           4 (6.0)     0           0           7 (4.2)

APD, associate program director; CR, chief resident; E, educator; PC, program coordinator; PD, program director.
*Columns do not sum to 100% because respondents could select multiple responses to this item.

TABLE 2. Role Responsibilities in the Residency Program (Summed Scale Score)

What are the 3 most important functions you perform in this position? (using a weighted scale)

Functions                            PD      APD     E       PC      CR      Other   Total
Recruitment and selection            86²     17²     8³      113²    3       16²     243¹
Promotion/retention/discipline       6       0       0       0       0       4       10
Assessment/evaluation                94¹     22¹     12²     45³     16¹     10³     199³
Curriculum development               70³     16³     16¹     27      1       1       131
Rotation/conference scheduling       4       8       0       29      9²      0       50
Simulation training                  1       4       5       1       0       0       11
Vacation requests                    0       0       0       4       3       0       7
Program accreditation                62      9       6       130¹    0       26¹     233²
Duty-hour enforcement                5       1       0       23      4³      2       35
Program evaluation                   20      4       1       13      0       6       44
Other                                4       9       0       12      0       1       26

Note: Superscripts within a column denote the most, second most, and third most important function, respectively.
APD, associate program director; CR, chief resident; E, educator; PC, program coordinator; PD, program director.

RET Status, Program Size, and Other Team Members

Survey respondents answered whether they now served on a RET (as we defined it for them). Table 4 shows that 84.4% responded positively. Respondents also reported the total number of categorical and preliminary residents in their programs. The median number of residents for RET and non-RET respondents combined was 29 (IQR: 19-47). RET and non-RET respondents did not significantly differ in their respective numbers of residents (RET median = 30 [IQR: 20-53]; non-RET median = 24.5 [IQR: 17-34]). Only participants responding "yes" to being on a RET also answered the question, "What other staff are on your RET?" The responses showed that a large majority of respondents had a program director, associate program director, and program coordinator (97.2%, 78.7%, and 88.7%, respectively) on their RET, with the roles of chief resident and educator less likely (62.4% and 37.6%, respectively).

TABLE 3. Subgroup Analysis From Table 2 (Ordered Logistic Regression)

What are the 3 most important functions you perform in this position? (using a weighted scale) — p value by comparison group

Functions                            Large (>30) vs Small+   PD/APD* vs E/PC/CR/O+   PD/APD vs E   PC* vs PD/APD/E/CR/O
Recruitment and selection            0.3634                  0.4227
Promotion/retention/discipline       0.2542                  0.4975
Assessment/evaluation                0.4774                  0.0005*
Curriculum development               0.3824                  <0.0001*                0.0692
Rotation/conference scheduling       0.2436                  0.0044+
Simulation training                  0.8614
Vacation requests                    0.5004                  0.8067
Program accreditation                0.8542                  0.9562                                <0.0001*
Duty-hour enforcement                0.0097+                 0.0001+
Program evaluation                   0.0930                  0.0046+
Other                                0.9278                  0.3099

Note: * or + indicates which comparison group significantly ranked a function as more important.
APD, associate program director; CR, chief resident; E, educator; PC, program coordinator; PD, program director.


TABLE 4. Residency Education Team Status, Program Size, and Team Membership

Responses                                                          Yes           No             Total        p Value
Are you a part of a residency education team (RET)? (no. (%))     141 (84.4)    26 (15.6)      167
Number of residents in program, categorical and preliminary       30 (20-53)    24.5 (17-34)   29 (19-47)   0.0656
  (median (IQR))
What other staff are on your RET? (no. (%))
  Program director                                                 137 (97.2)
  Associate program director                                       111 (78.7)
  Educator                                                         53 (37.6)
  Program coordinator                                              125 (88.7)
  Chief resident                                                   88 (62.4)
  Other                                                            52 (36.9)

Note: RET definition—"two or more individuals who are responsible and accountable for oversight and conduct of the residency training program."

General Views About RETs

All respondents encountered a series of items measuring general opinions about RETs. Table 5 shows the results from analyses of subgroup responses to 1 question, "How important is a RET for the Administration and Leadership of a Residency Program?" RET respondents ascribed significantly higher importance to a RET (p < 0.0001) than their non-RET counterparts. Likewise, the attending physician subgroup, compared with nonattending physicians, also ascribed a significantly higher degree of importance to RETs (p = 0.0372). Finally, contrasting large (>30) and small (≤30) programs yielded no significant difference in views of RET importance.

Next, all respondents—when presented with a list of roles including program director, associate program director, educator, program coordinator, and chief resident—answered the second question, "Who should comprise the ideal RET?" (Table 6).

TABLE 5. General Views About Residency Education Teams

How Important is a RET for the Administration and Leadership of a Residency Program?

                            RET (no. (%))        Non-RET (no. (%))      p Value
                            n = 141              n = 26
Not at all                  0 (0)                1 (3.9)                <0.0001
To a little extent          3 (2.1)              4 (15.4)
To some extent              7 (5.0)              7 (26.9)
To a great extent           39 (27.7)            6 (23.1)
To a very great extent      92 (65.3)            8 (30.8)

                            PD/APD (no. (%))     E/PC/CR/O (no. (%))    p Value
                            n = 75               n = 92
Not at all                  1 (1.3)              0                      0.0372
To a little extent          6 (8)                1 (1.1)
To some extent              9 (12)               5 (5.4)
To a great extent           20 (26.7)            25 (27.2)
To a very great extent      39 (52)              61 (66.3)

                            Large (>30) (no. (%))  Small (≤30) (no. (%))  p Value
                            n = 89                 n = 78
Not at all                  1 (1.1)                0                      0.0949
To a little extent          6 (6.7)                1 (1.3)
To some extent              8 (9)                  6 (7.7)
To a great extent           28 (31.5)              17 (21.8)
To a very great extent      46 (51.7)              54 (69.2)

APD, associate program director; CR, chief resident; E, educator; non-RET, not on a team; PC, program coordinator; PD, program director; RET, respondent now part of a residency education team.


TABLE 6. General Views About Residency Education Teams

Who Should Comprise the Ideal RET?

                     RET (no. (%))        Non-RET (no. (%))      p Value
                     n = 141              n = 26
PD                   138 (97.9)           25 (96.2)              0.50
APD                  129 (91.5)           17 (65.4)              0.0012
Educator             102 (72.3)           19 (73.1)              0.99
Coordinator          134 (95.0)           20 (76.9)              0.0065
Chief resident       125 (88.7)           16 (61.5)              0.0016

                     PD/APD (no. (%))     E/PC/CR/O (no. (%))    p Value
                     n = 75               n = 92
PD                   75 (100)             88 (95.7)              0.1283
APD                  67 (89.3)            79 (85.9)              0.5019
Educator             59 (78.7)            62 (67.4)              0.1047
Coordinator          68 (90.7)            86 (93.5)              0.5000
Chief resident       62 (82.7)            79 (85.9)              0.5701
Other                20 (26.7)            24 (26.1)              0.9326

                     Large (>30) (no. (%))  Small (≤30) (no. (%))  p Value
                     n = 89                 n = 78
PD                   86 (96.6)              77 (98.7)              0.6238
APD                  72 (80.9)              74 (94.9)              0.0066
Educator             63 (70.8)              58 (74.4)              0.6061
Coordinator          77 (86.5)              77 (98.7)              0.0031
Chief resident       72 (80.9)              69 (88.5)              0.1787
Other                20 (22.5)              24 (30.8)              0.2246

APD, associate program director; CR, chief resident; E, educator; non-RET, not on a team; PC, program coordinator; PD, program director; RET, respondent now part of a residency education team. In the published table, p values <0.05 appear in bold.

The RET subgroup rated associate program directors, program coordinators, and chief residents as significantly more ideal (p = 0.0012, p = 0.0065, and p = 0.0016, respectively) than did non-RET respondents. No significant subgroup differences in ideal RET role composition resulted from comparing attending physician vs nonattending physician responses. However, respondents from large (>30) vs small (≤30) residencies did produce significant differences, with smaller programs viewing the associate program director and program coordinator roles as more ideal for RET composition.

The third general question about RETs asked all respondents, "What do you think are the main outcomes of a well-functioning RET?" Respondents selected as many outcomes as they felt applied from a list of 11 options (Table 7). Both RET and non-RET respondents selected satisfied residents (88.7% and 76.9%, respectively) and a smoothly running program (81.6% and 88.5%, respectively) as the top 2 outcomes. Statistically significant RET vs non-RET differences emerged for the outcomes of few to no citations (p = 0.0424) and greater recognition from peers and colleagues (p = 0.0410); RET respondents favored both outcomes more highly. Similarly, the attending physician and nonattending physician subgroups selected satisfied residents (78.7% and 93.5%, respectively) and a smoothly running program (77.3% and 87%, respectively) as the top 2 main team outcomes. The nonattending physician subgroup also viewed 8 outcomes as significantly more important to a well-functioning RET than did their attending physician peers. Those outcomes included high in-training examination scores (p = 0.0001), satisfied residents (p = 0.0049), updated and current curriculum (p = 0.0102), few to no accreditation citations (p = 0.0100), low attrition rates (p = 0.0198), greater recognition from institutional leadership (p = 0.0001), greater recognition from peers and colleagues (p = 0.0016), and enhanced program reputation and credibility (p = 0.0003). The last subgroup analysis compared large (>30) vs small (≤30) programs and again showed the same top 2 main outcomes of satisfied residents (84.3% and 89.7%, respectively) and a smoothly running program (79.8% and 85.9%, respectively). No statistically significant differences between these 2 subgroups occurred for any of the 11 listed main outcomes.


DISCUSSION

Roles and the RET

One particular role in surgical education with an increased presence in recent years is that of the nonphysician surgical educator. Educators had low representation in this survey, comprising only 4.8% of respondents (Table 1).

TABLE 7. General Views About Residency Education Teams

What do you Think are the Main Outcomes of a Well-Functioning RET? (Select all That Apply)*

                                                     RET (no. (%))     Non-RET (no. (%))   p Value
                                                     n = 141           n = 26
High board passage rate                              83 (58.9)         11 (42.3)           0.14
High in-training examination scores                  63 (44.7)         9 (34.6)            0.39
Satisfied residents                                  125 (88.7)        20 (76.9)           0.12
Satisfied faculty                                    94 (66.7)         15 (57.7)           0.38
Smoothly running program                             115 (81.6)        23 (88.5)           0.57
Updated/current curriculum                           99 (70.2)         16 (61.5)           0.49
Few/no accreditation citations                       100 (70.9)        13 (50.0)           0.0424
Low attrition rates                                  69 (48.9)         10 (38.5)           0.39
Greater recognition from institutional leadership    55 (39.0)         5 (19.2)            0.07
Greater recognition from peers/colleagues            53 (37.6)         4 (15.4)            0.0410
Enhanced program reputation/credibility              95 (67.4)         14 (53.9)           0.19
Other                                                8 (5.7)           1 (3.9)             0.99

                                                     PD/APD (no. (%))  E/PC/CR/O (no. (%)) p Value
                                                     n = 75            n = 92
High board passage rate                              36 (48)           58 (63)             0.0512
High in-training examination scores                  20 (26.7)         52 (56.5)           0.0001
Satisfied residents                                  59 (78.7)         86 (93.5)           0.0049
Satisfied faculty                                    46 (61.3)         63 (68.5)           0.3347
Smoothly running program                             58 (77.3)         80 (87)             0.1025
Updated/current curriculum                           44 (58.7)         71 (77.2)           0.0102
Few/no accreditation citations                       43 (57.3)         70 (76.1)           0.0100
Low attrition rates                                  28 (37.3)         51 (55.4)           0.0198
Greater recognition from institutional leadership    15 (20)           45 (48.9)           0.0001
Greater recognition from peers/colleagues            16 (21.3)         41 (44.6)           0.0016
Enhanced program reputation/credibility              38 (50.7)         71 (77.2)           0.0003
Other                                                4 (5.3)           5 (5.4)             0.9999

                                                     Large (>30) (no. (%))  Small (≤30) (no. (%))  p Value
                                                     n = 89                 n = 78
High board passage rate                              49 (55.1)              45 (57.7)              0.7319
High in-training examination scores                  42 (47.2)              30 (38.5)              0.2558
Satisfied residents                                  75 (84.3)              70 (89.7)              0.2967
Satisfied faculty                                    55 (61.8)              54 (69.2)              0.3141
Smoothly running program                             71 (79.8)              67 (85.9)              0.2974
Updated/current curriculum                           59 (66.3)              56 (71.8)              0.4436
Few/no accreditation citations                       57 (64)                56 (71.8)              0.2854
Low attrition rates                                  45 (50.6)              34 (43.6)              0.3679
Greater recognition from institutional leadership    29 (32.6)              31 (39.7)              0.3360
Greater recognition from peers/colleagues            30 (33.7)              27 (34.6)              0.9018
Enhanced program reputation/credibility              55 (61.8)              54 (69.2)              0.3141
Other                                                4 (4.5)                5 (6.4)                0.7352

APD, associate program director; CR, chief resident; E, educator; non-RET, not on a team; PC, program coordinator; PD, program director; RET, respondent now part of a residency education team.
*Columns do not sum to 100% because respondents could select multiple responses to this item.

However, when asked about ideal RET composition, 72.3% of RET and 73.1% of non-RET respondents nominated an educator (Table 6), and 37.6% of RET respondents reported having an educator on their team (Table 4). In addition, subgroup comparisons of opinions about educators yielded no statistically significant differences for RET vs non-RET, attending physician vs nonattending physician, or large vs small programs. That is, all subgroups agreed that an ideally composed RET would include an educator (Table 6). This finding suggests a growing trend and need for educators within surgery residency programs, and several recent publications support the role of the nonphysician surgical educator.12-14 Summarizing the literature's conclusions, "professional educators provide support needed to meet the growing demands and requirements of surgical education,"12 "nonphysician educators serve as vital members to the team,"13 and "one strategy for assisting with the increase in program director workload that has accompanied the changes in surgical education is to hire nonphysician educators with relevant education and experience in curriculum design, teaching techniques, adult learning theories, and research methods."14 Interestingly, only 22.8% of respondents answered that they had "relevant education" in preparation for their residency positions. The latter finding clearly suggests a need for educational content expertise (Table 1).

Compared with the nonphysician educator in surgery or any other specialty, little published literature addresses the associate program director role. The majority (50%) of surveyed associate program directors indicated that they acquired their position by being "assigned," and 62.5% learned their role "on the job" (Table 1). Associate program directors and program directors listed and ranked the same 3 most important residency role functions, suggesting that associate program directors perhaps tend to shadow the role of the program director and do not necessarily contribute to efficient workload distribution (Table 2). That said, 78.7% of RET respondents listed an associate program director on their team, so clearly associate program directors occupy a well-established team presence (Table 4). However, our survey findings showed some divergence in opinion about whether an associate program director is an ideal RET role. Subgroup analyses produced statistically significant differences in opinions about the associate program director as an ideal team member, with respondents from RETs and from smaller programs both viewing this role as more ideal (Table 6). We attribute these differences in opinion more to the lack of clarity in the associate program director's job description and less to the necessity of the role. Thus, the need exists to better define and use the associate program director's role in conjunction with the program director and educator to navigate successfully the current and evolving accreditation and other expectations of surgical residencies.

Role Responsibilities and Main Outcomes

To meet all demands of surgical training successfully, a well-functioning RET seems necessary. Most survey respondents agreed that a RET was important for residency programs. For example, 93% of RET and 53.9% of non-RET respondents answered either "to a great extent" or "to a very great extent" about RET importance. Other subgroups generally replicated (range: 78.7%-93.5%) the RET respondents' high positive endorsement (Table 5). In 2008, Arora and Kaplan10 described the needs of surgery program directors and concluded, "most Program Directors do not have protected time, and some feel insufficient institutional resources are available for their responsibilities." Results presented in Tables 2 and 3 suggest we perhaps must think not only about who is on our education team, but also about what job functions should be assigned to each role and how to ensure all responsibilities are covered adequately. From the survey results, only 1 or none of the respondents in a given role group ranked several job functions in their top 3. For example, none of the following job functions appeared among the top 3 of any role group: promotion, retention, and discipline; simulation training; vacation requests; and—rather interesting given the current medical education landscape—program evaluation. Only 5 respondents ranked promotion, retention, and discipline among their 3 most important job functions (raw frequency data from Table 2, not shown). Surely the main functions of the program should be consistent among team members. However, if a majority of the team focuses on just 1 or 2 job functions, a reasonable question to ask is, "Who is taking responsibility or devoting significant attention to other areas of the program?" In addition, "If only one team member's role focuses on a job function, is that enough?"

Attrition from surgical residency programs continues to be a major challenge.15,16 The 2012-2013 ACGME Data Resource Book lists an attrition rate of 4.8%.17 Considering that low attrition is an essential aspect of successful program growth and development, respondents' failure to ascribe primary importance to the promotion, retention, and discipline job function was notable. Respondents consistently listed low attrition rates among the main outcomes of a well-functioning RET (Table 7). However, the nonattending physician group viewed low attrition rates as significantly more important than did their attending physician peers (p = 0.0198). Perhaps we either have placed insufficient emphasis on attrition in surgical training or have not given the nonattending physician roles enough responsibility in this area.

Currently, the American Board of Surgery mandates 2 simulation-based graduation requirements: Fundamentals of Laparoscopic Surgery and Fundamentals of Endoscopic Surgery (starting with graduating residents in 2017-2018).18 Despite the known benefits of simulation training and increasing training requirements, no team role group ranked this function as a top 3 responsibility (Table 2). In fact, only 6 individual respondents ranked simulation training as a top function at all (raw frequency data from Table 2, not shown). Perhaps decentralized resources exist at institutions to address simulation training needs, e.g., a simulation center staffed by a dedicated simulation coordinator. The latter role falls outside of our definition of a RET in this survey. A similar discussion can be had for duty-hour enforcement and program evaluation as well. The fundamental question remains: whom do we need on our education teams to fulfill all job functions, and how do we best match individual role members with the separate RET job functions?

When we asked respondents about their main functions in the residency, we provided the option of "other" with free-text space available. This option identified 19 other unique job functions, including faculty development, call schedules, mentoring, counseling, budget, and payroll, among others. This suggests that other team responsibilities exist that this survey did not consider. Thus, although the investigators set and consistently applied a reasonable definition for a RET, they also recognize that alternative definitions certainly are tenable.

LIMITATIONS

The findings of this work must be interpreted in the context of the survey methodology. One issue is the definition of a RET and the specification of team role functions. This work should be taken as a stimulus for an ongoing discussion to clarify the roles and functions of RETs in general surgery residency training, and perhaps in other specialties as well. A second issue pertains to sampling adequacy and representativeness. We deliberately administered the online survey anonymously, so we do not know how many different programs participated or the extent to which program responses overlapped, i.e., whether more than 1 individual from the same program responded. Some team roles were surely underrepresented, but response rates were difficult to estimate because of how we collected the data. Only 8 educators responded, but we do not know the true denominator for this role, for several reasons. One reason is that we have not, as a surgical community, defined this role completely enough to catalog its current membership. Some attempts in the literature suggest there are approximately 40 nonphysician surgical educators, which would suggest about a 25% response rate for this survey.13 The role of the chief resident was inadequately represented, with only 6 respondents, so no generalizable conclusions can be made about the opinions of this group. However, the educator and chief resident data were included as part of the aggregate data for the "nonattending physician" vs "attending physician" comparisons.

CONCLUSIONS

This study provides a snapshot of how some of those associated with surgery residencies view and value RETs. The results of the survey identified a need for educators within surgery programs and ambiguity about the role of the associate program director. They also suggest that a closer look at role responsibilities may be of value. Overall, most respondents felt that a RET was important to the main outcomes of a successful residency program.

REFERENCES

1. ACGME Program Requirements for Graduate Medical Education in General Surgery. Available at: 〈http://www.acgme.org/acgmeweb/Portals/0/PFAssets/ProgramRequirements/440_general_surgery_01012008_07012012.pdf〉.

2. Available at: 〈https://www.acgme.org/acgmeweb/〉.

3. Lee THL, Berger DH, Awad SS, Brandt ML, Brunicardi FC. Accreditation Council for Graduate Medical Education core competencies. In: Brunicardi FC, Andersen DK, Billiar TR, Dunn DL, Hunter JG, Matthews JB, Pollock RE, editors. Schwartz's Principles of Surgery. 9th ed. New York: McGraw-Hill Companies, Inc; 2010 [chapter 1].

4. Schneider JR, Coyle JJ, Ryan ER, Bell RH, DaRosa DA. Implementation and evaluation of a new surgical residency model. J Am Coll Surg. 2007;205(3):393-404.

5. Nasca TJ, Philibert I, Brigham T, Flynn TC. The next GME accreditation system—rationale and benefits. N Engl J Med. 2012;366(11):1051-1056.

6. Cogbill TH, Malangoni MA, Potts JR, Valentine RJ. The General Surgery Milestones Project. J Am Coll Surg. 2014;218(5):1056-1062.

7. Pellegrini CA, Warshaw AL, Debas HT. Residency training in surgery in the 21st century: a new paradigm. Surgery. 2004;136(5):953-965.

8. Malik MU, Diaz Voss Varela DA, Stewart CM, et al. Barriers to implementing the ACGME outcome project: a systematic review of program director surveys. J Grad Med Educ. 2012;4(4):425-433.

9. Beasley BW, Kern DE, Kolodner K. Job turnover and its correlates among residency program directors in internal medicine: a three-year cohort study. Acad Med. 2001;76(11):1127-1135.

10. Arora TK, Kaplan BJ. Who are surgery program directors and what do they need? J Surg Educ. 2008;65(6):504-511.

11. Woll N, Boker J, Famiglio L, Hunsinger M, Shabahang M. Characteristics and perceptions of residency education teams. Poster presentation. AAMC Annual Meeting, San Francisco, CA, November 2012.

12. Mendoza KA, Hauge LS, DaRosa D. The responsibilities and contributions of professional educators in surgery departments. Am J Surg. 2004;188(2):126-130.

13. Tarpley MJ, Davidson MA, Tarpley JL. The role of the nonphysician educator in general surgery residency training: from outcome project and duty-hours restrictions to the next accreditation system and milestones. J Surg Educ. 2014;71(1):119-124.

14. Torbeck L, Sidhu R, Smink DS, Peyre SE. How to recruit, retain, and reap the rewards of working with PhD/EdD educators in surgery. J Surg Educ. 2013;70(2):212-216.

15. Longo WE. Attrition: our biggest continuing challenge. Am J Surg. 2007;194(5):567-575.

16. Bell RH Jr, Banker MB, Rhodes RS, Biester TW, Lewis FR. Graduate medical education in surgery in the United States. Surg Clin North Am. 2007;87(4):811, v-vi.

17. ACGME. Graduate Medical Education Data Resource Book, Academic Year 2012-2013. Chicago: Accreditation Council for Graduate Medical Education; 2013. p. 65.

18. Available at: 〈http://www.abs.surgery.org〉.