Establishing Absolute Standards for Technical Performance

dependent on highly variable caseloads. This study aimed to determine the overall cost-effectiveness of a PROficiency-based StePwise Endovascular Curricular Training (PROSPECT) program, including e-learning and hands-on virtual reality (VR) simulation.

METHODS: A randomized controlled trial (RCT) was performed to assess endovascular performance after structured training (PROSPECT; n=11) compared with solely e-learning (n=10) or conventional training (n=11). Costs for development of e-learning and VR simulation sessions were determined. Time spent studying and practicing within the curriculum was converted to indirect saving of operating time. Logistic costs, faculty time supervising simulation sessions, and 30-day complication rates were registered.

RESULTS: Fifty-eight peripheral endovascular interventions, performed by 29 surgical trainees, were included in this RCT from October 2014 to February 2016. Yearly costs included €6,588.5 for curriculum design, €31,483.53 for implementation, and €1,143.2 for operational costs. Per trainee at our university, simulation-based training until proficiency would require a total of €3,805.86. In comparison, if endovascular proficiency levels had been obtained during conventional training in the hybrid angiosuite, this would have cost €5,001.85 per trainee.

CONCLUSIONS: Simulation-based training according to PROSPECT provides cost-effective endovascular training, mainly because training occurs outside the operating room. Structured, proficiency-based simulation training curricula should be included in surgical education.

Establishing Absolute Standards for Technical Performance
Mitchell G Goldenberg, MBBS, Alaina Garbens, MD, Peter Szasz, MD, Tyler M Hauer, Teodor P Grantcharov, MD, PhD, FACS
University of Toronto, Toronto, ON

INTRODUCTION: Standard-setting methodologies have been used in medicine primarily for written examinations at the undergraduate and postgraduate level. The objective of this systematic review was to perform an in-depth review of the medical and surgical literature to identify studies that systematically establish cutoff values, focusing on procedural skill assessment.

METHODS: A systematic review describing the use of standard-setting methodologies to assess performance, specifically focusing on procedural skills, was conducted by searching MEDLINE, EMBASE, PsycINFO, and the Cochrane Database of Systematic Reviews. Abstracts of retrieved studies were reviewed, and those meeting the inclusion criteria were selected for full-text review. Data were retrieved in a systematic manner, and the validity and quality of evidence presented in the included studies were assessed using the Medical Education Research Study Quality Instrument (MERSQI).

RESULTS: Of the 1,762 studies identified, 37 used standard-setting methodology for assessment of procedural skill (Table). Of these, 24 used participant-centered methods, and 13 used item-centered methods. Twenty-eight studies took place in a simulated environment, and 9 studies were conducted in the clinical setting. The included studies assessed residents (26/38), fellows (6/38), and staff physicians (17/38). Seventeen articles were MERSQI graded as 14/18 or higher, while 20 did not meet this mark.

Table.

Standard setting method                    | Simulation setting, n | Clinical setting, n | Total studies, n | Mean MERSQI score out of 18 (range)
Participant-centered                       | 17                    | 7                   | 24               | 13.91 (12.5-15.5)
  Contrasting-groups                       | 12                    | 5                   | 17               | 13.90
  Borderline-group                         | 1                     | 1                   | 2                | 14.25
  Generalized examinee-centered            | 0                     | 1                   | 1                | 14.5
  Receiver operator characteristic curve   | 4                     | 0                   | 4                | 13.63
Item-centered                              | 11                    | 2                   | 13               | 13.17 (11-14.5)
  Angoff                                   | 11                    | 2                   | 13               | 13.17
    (+ Hofstee)                            | 5                     | 1                   | 6                | 13.41
    (+ Ebel)                               | 1                     | 0                   | 1                | 14.00
Total                                      | 28                    | 9                   | 37               | 13.70 (11-15.5)

CONCLUSIONS: The 37 studies included in this analysis demonstrate that absolute standard-setting methodologies can be used to establish cutoffs for procedural skill assessments, including those taking place in the clinical setting. Establishing benchmarks in technical skill is particularly important given the movement to introduce competency-based assessments into surgical training programs.

Evaluation of a Statewide Surgical Coaching Program for Continuing Professional Development
Lane L Frasier, MD, Hala N Ghousseini, PhD, Heather L Beasley, PhD, Sudha R Pavuluri Quamme, MD, Nicole A Brys, MPH, Douglas Wiegmann, PhD, Caprice C Greenberg, MD, MPH, FACS
University of Wisconsin, Madison, WI

INTRODUCTION: Multiple disciplines use coaching for continuing professional development, but studies identify barriers to implementation among practicing surgeons. We sought to develop and evaluate a video-based coaching program for board-eligible/certified surgeons.

METHODS: Peer-nominated coaches from our state surgical society (n=8) received training on core principles of coaching. Trained coaches were paired with participating surgeons (coachees, n=11) using an a priori algorithm. After setting individual goals, surgeons recorded operations for video-based coaching sessions that were audio-recorded, transcribed, and analyzed. Program evaluation surveys were distributed to all participants.

RESULTS: Coach-coachee pairs targeted technical, cognitive, and interpersonal aspects of performance. Specific topics included managing intraoperative stress and adopting new procedures. Participants rated the program highly. Coaches gave their training a 4.4