Acquiring basic surgical skills: Is a faculty mentor really needed?




The American Journal of Surgery (2009) 197, 82–88. doi:10.1016/j.amjsurg.2008.06.039

Association for Surgical Education

Aaron R. Jensen, M.D., M.Ed.,a,b,* Andrew S. Wright, M.D.,a Adam E. Levy, M.D.,a Lisa K. McIntyre, M.D.,a Hugh M. Foy, M.D.,a Carlos A. Pellegrini, M.D.,a Karen D. Horvath, M.D.,a Dimitri J. Anastakis, M.D., M.H.P.E., M.H.C.M.c

aDepartment of Surgery, University of Washington, School of Medicine, Room BB-487, 1959 NE Pacific St., Box 356410, Seattle, WA 98195, USA; bCollege of Education, University of Washington, Seattle, WA, USA; cDivision of Plastic Surgery, University of Toronto, Toronto, Ontario, Canada

KEYWORDS: Surgical education; Resident training; Simulation; Basic skills; Knot-tying; Suturing; Skin closure; Bowel anastomosis; Transfer

*Corresponding author. Tel.: +1-206-616-5687; fax: +1-206-543-8136. E-mail address: [email protected]. None of the authors has received financial support or has a conflict of interest related to this study to disclose. University of Washington Human Subjects Approval #06-1593-E/A 01. Presented at the 2008 Association for Surgical Education meeting, April 17, 2008, Toronto, Ontario, Canada. Manuscript received May 16, 2008; revised manuscript June 2, 2008.

Abstract

BACKGROUND: We evaluated the impact of expert instruction during laboratory-based basic surgical skills training on subsequent performance of more complex surgical tasks.

METHODS: Forty-five junior residents were randomized to learn basic surgical skills in either a self-directed or faculty-directed fashion. Residents returned to the laboratory 2 days later and were evaluated while performing 2 tasks: skin closure and bowel anastomosis. Outcome measures included Objective Structured Assessment of Technical Skill scores, time to completion, final product quality, and resident perceptions.

RESULTS: Objective Structured Assessment of Technical Skill scores, time to completion, and skin esthetic ratings were not better in the faculty-directed group, although an isolated improvement in anastomotic leak pressure was seen. Residents perceived faculty-directed training to be superior.

CONCLUSIONS: Our data provided minimal objective evidence that faculty-directed training improved transfer of learned skills to more complex tasks. Residents perceived a benefit of faculty mentoring. Curriculum factors related to the training of basic skills and their subsequent transfer to more complex tasks may explain these contrasting results.

© 2009 Elsevier Inc. All rights reserved.

Historically, the training and development of technical skills have been performed largely in the operating room. A number of factors, including resident work-hour restrictions, societal concerns, and financial pressures, have transitioned a significant portion of this training to the simulation laboratory. Laboratory-based training is often unsupervised, in contrast to learning in the operating room, where there is almost always a faculty member or more senior resident present. In a self-directed skills laboratory, trainees must rely on self-assessment to ensure that proper techniques are learned and practiced. This model creates the potential for trainees to develop bad habits that then need to be unlearned before proper technique can be learned. Real-time feedback during basic skills training has the potential to minimize the development of poor technique. Feedback can be either formative or summative, and either computer- or expert-administered. One of the main limitations of expert-administered feedback is the cost involved in taking large numbers of faculty away from clinical practice and into the simulation laboratory. Many surgical residencies minimize this burden by convening knot-tying workshops in which 1 or 2 faculty members proctor large-group training sessions. Although this approach is attractive, research suggests that the ideal student:instructor ratio for teaching suturing and knot-tying is much smaller (4:1).1

For the purposes of communication among surgical educators, a standardized taxonomy has been laid out defining various aspects of surgical psychomotor training, including the following: (1) technical skills, (2) complex tasks, and (3) complete procedures.2 Technical skills are the fundamental building blocks of surgical technique, such as instrument handling, incision making, performing dissection, knot-tying, ligating structures, and suturing. Complex tasks require the integration of technical skills and include items such as excision, wound closure, and anastomosis. Complete procedures require the integration of multiple tasks, such as the combination of the following: (1) laparotomy, (2) bowel resection, (3) bowel anastomosis, and (4) closure of the abdomen.

The ultimate goal of laboratory-based training is the transfer of a learned skill from the laboratory to the operating room. The concept of transfer refers to a learner's ability to apply learned knowledge or skills in different contexts, and it provides an important index of how well an individual has learned.3 Such transfer has been shown for laparoscopic procedures after simulated skill training,4-6 but studies of simulation-based open surgical skill training have been less successful in showing transfer. A study of a 2-year partial-task, simulation-based curriculum failed to show any significant transfer of learned skills to the operating room.7 The authors hypothesized that rather than focusing on tasks, laboratory-based training would be more productive if it focused on basic technical skills and afforded ample time for the supervised practice of psychomotor skills.

Working on the assumption that superior training in basic technical skills would lead to improved performance of more complex tasks, we aimed to study the effect of expert-directed basic technical skills training on the differential transfer of those learned skills to more complex tasks. We hypothesized that the inclusion of a faculty expert in a laboratory-based basic technical skills training session, as compared with self-directed training of basic skills, would lead to improved performance on 2 complex tasks.

Methods

Study population and group assignment

First- and second-year (R1 and R2) surgical residents, both preliminary and categorical, were recruited to participate in the study during a dedicated rotation in which residents have time protected for laboratory-based technical skills training. Study procedures were performed over the course of 12 months at the University of Washington Institute for Surgical and Interventional Simulation. Participation in basic skills training and subsequent task performance was required by the residency program, but participation in the study (ie, outcome measure assessment) was voluntary. Study participation was kept confidential from the residents' files, as were the outcome measures and survey responses obtained. Institutional review board approval was obtained for this study and all subjects provided written informed consent before participation. Before training, all subjects were surveyed for level of postgraduate training, prior surgical experience, and prior postgraduate residency training.

All subjects participated in basic skills training and task performance in small groups (n = 14 groups, 2-4 residents per group). Training groups were assigned in accordance with resident rotation schedules, with all R1 and R2 residents on a particular technical skills rotation participating in training together. These training groups were stratified based on R-level composition and were assigned randomly to receive basic skills training in a self-directed or expert-directed fashion (treatment groups, Fig. 1). In accordance with the institutional review board protocol, all residents crossed over into the opposite treatment group and were given an additional basic skills training session after task performance outcome measures were obtained (ie, self-directed groups all received an equivalent expert-directed training session after task outcome measures were obtained, and expert-directed groups received additional time for self-directed practice).

Figure 1 Experimental design. Residents were assigned to groups based on rotation schedules. Groups were stratified by R-level and assigned to self-directed or expert-directed basic skills training. Outcome measures were obtained during task performance 2 days later.

Basic skills training

All subjects, regardless of treatment group, received 8 hours of protected time in the laboratory for basic skills training and had access to equivalent materials and simulation platforms. Educational materials available to both groups in the laboratory included textbooks, knot-tying manuals, on-line text, and on-line video. The expert-directed group had a surgical faculty member in the laboratory for the duration of training to give expert demonstration as needed, as well as both positive and negative feedback. The faculty instructor was chosen because of his particular interest in laboratory-based training of open technical skills and because of frequent resident feedback praising his ability to teach technical skills. This faculty member instructed all sessions.

Both treatment groups were assigned the same 14 basic technical skills to learn. These skills included instrument handling, suture selection, and incisions, as well as various forms of knot-tying, suturing, and ligature techniques. Simulation platforms included low-fidelity materials such as nitrile gloves, simulated skin pads, and foam suturing models. Performance of surgical tasks (ie, skin closure or anastomosis technique) was specifically not addressed during basic skills training sessions.

At the end of basic skills training, subjects from both treatment groups were given a CD containing cognitive training materials (text and figures similar to a surgical atlas, as well as narrated expert demonstration videos) related to 2 surgical tasks. Residents were asked to review the materials at home and to return to the laboratory 2 days later to perform the 2 tasks. Materials to practice the 2 tasks were not distributed to trainees, and trainees were not asked to practice the tasks before performance and evaluation. Aside from reviewing the cognitive materials, residents received no formal training in task performance before evaluation. Residents were not protected from clinical duties between basic skills training and task performance.

Outcome measures

The goal of this study was to measure outcomes of basic surgical skill training as a function of transfer of learned skills to more complex surgical tasks. As such, after a 24-hour period in which residents from both treatment groups were to review cognitive materials at home, we asked subjects to perform 2 surgical tasks: (1) excision of a simulated skin lesion and interrupted vertical mattress closure of the wound, and (2) hand-sewn bowel anastomosis. Partial-task simulation was performed using previously frozen porcine tissues. Subjects were given up to 65 minutes to perform each task. Residents did not practice the 2 tasks before evaluation.

Outcome measures were 3-fold for each task: (1) assessment of technical skill, (2) time to completion, and (3) assessment of final product quality. Task performance was videorecorded for subsequent blinded review (Fig. 2A). Only the surgical field was recorded, including gowned and gloved hands, and all audio tracks were deleted to ensure a truly blinded review. The Objective Structured Assessment of Technical Skill (OSATS) global rating scale was used to assess technical skill based on blinded review of the videorecordings.8-10 The same 2 evaluators performed OSATS assessment for all subjects for skin closure and bowel anastomosis. Time to completion was measured in real time in the laboratory. Final product quality was measured by esthetic rating for skin closure and by anastomotic leak pressure for bowel anastomosis. Digital photographs were taken of finished skin closure specimens, and the photographs for each specimen were evaluated by 3 independent blinded reviewers using a rating scale (Fig. 2B). Rating scale items included suture spacing, mattress limb symmetry, gaps in closure, dog ears, scar length, suture alignment, and wound eversion. Bowel anastomotic leak pressure was obtained by fixing the completed anastomosis to a pressure source, clamping off the opposite end of the lumen, and gradually increasing the water pressure until a leak was visible at the suture line (Fig. 2C).

Figure 2 (A) Videorecordings of the surgical field were made for subsequent blinded OSATS assessment. (B) Digital photographs were used for blinded esthetic quality review. (C) Completed anastomoses were fixed to a pressure source and the amount of pressure required to make the suture line leak was measured.


At the completion of training (after subjects had crossed over and completed basic skills training using the opposite modality), residents were surveyed for perceptions related to training. Survey items included stress level of training, appropriateness of time spent, likelihood of transfer of learned skills to more complex tasks and to the operating room, subjective comparison of self-directed and expert-directed training, and the value of laboratory-based basic skills training as compared with learning basic skills in the operating room. All perception measures used a 5-point scale.

Statistical methods

An a priori power analysis showed 80% power to detect an effect size of .85 for the objective outcome measures. OSATS Global Rating Scale and final product quality scores were treated as continuous ratio data and were analyzed for differences between treatment groups using 1-way analysis of covariance, with both months of training and self-reported prior postgraduate experience as covariates. Months of training (range, 0-24 mo) was calculated as the difference between the date of basic skills training and July 1st of the subject's R1 year. For skin esthetic ratings, inter-rater reliability was assessed with the Cronbach alpha, and the mean score of all 3 reviewers was used for statistical comparison between treatment groups. Time to completion data for each task were significantly non-normal and were analyzed using the Mann-Whitney U test. To control for experiment-wide type I error (alpha = .05), a conservative Bonferroni correction for 6 comparisons was used, with a P value of .008 or less per comparison considered significant. Potential differences in confounding variables were assessed using the independent t test for continuous data and the Fisher exact test for frequency data. Survey data were homogeneous between groups and statistical comparisons were not made. Statistical analysis was performed using SPSS for Windows version 15 (SPSS, Inc., Chicago, IL) and G*Power (Bonn University, Bonn, Germany).
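For readers who wish to reproduce this style of analysis, the sketch below outlines the main steps in Python rather than in SPSS and G*Power, which were used for the actual analysis. The data frame, column names, and numeric values are illustrative assumptions only and do not represent study data.

    # Illustrative re-analysis sketch; synthetic data stand in for the study data set.
    import numpy as np
    import pandas as pd
    import scipy.stats as stats
    import statsmodels.api as sm
    import statsmodels.formula.api as smf
    from statsmodels.stats.power import TTestIndPower

    rng = np.random.default_rng(0)
    n = 45
    df = pd.DataFrame({
        "group": rng.choice(["expert", "self"], size=n),   # training modality
        "months_training": rng.uniform(0, 24, size=n),     # covariate, 0-24 mo
        "prior_postgrad": rng.choice([0, 1], size=n),      # covariate, yes/no
        "osats": rng.normal(20, 4, size=n),                # synthetic OSATS total
        "time_min": rng.gamma(4.0, 8.0, size=n),           # synthetic, right-skewed
    })

    # A priori power: residents per group needed for 80% power at d = 0.85, alpha = .05
    n_per_group = TTestIndPower().solve_power(effect_size=0.85, alpha=0.05, power=0.80)

    # Bonferroni correction for 6 planned comparisons: .05 / 6 = .0083, ie, p <= .008
    alpha_corrected = 0.05 / 6

    # 1-way ANCOVA: outcome by group, adjusting for months of training and prior experience
    ancova = smf.ols("osats ~ C(group) + months_training + C(prior_postgrad)", data=df).fit()
    print(sm.stats.anova_lm(ancova, typ=2))

    # Mann-Whitney U test for the non-normal time-to-completion data
    u_stat, p_val = stats.mannwhitneyu(
        df.loc[df.group == "expert", "time_min"],
        df.loc[df.group == "self", "time_min"],
        alternative="two-sided",
    )

    # Cronbach's alpha for the 3 blinded esthetic-quality raters (one column per rater)
    def cronbach_alpha(ratings: pd.DataFrame) -> float:
        k = ratings.shape[1]
        item_variance = ratings.var(axis=0, ddof=1).sum()
        total_variance = ratings.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_variance / total_variance)

In the study itself, the same covariate-adjusted model was applied to each continuous outcome and the Mann-Whitney U test to the time-to-completion data, giving the 6 Bonferroni-corrected comparisons described above.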

Results

Forty-five surgical residents participated in the study. Confounding factors were not significantly different between groups (Table 1). Prior surgical experience was homogenous among residents and was not suitable for inclusion as a covariate in the statistical model. Most residents had performed a significant number of skin closures and few had performed a bowel anastomosis. Inter-rater reliability of the skin closure esthetic quality composite scores was acceptable (Cronbach alpha = .8). Significant differences between treatment groups were not seen in technical skill or time to completion for either task, or in skin closure esthetic quality. An isolated difference in anastomotic leak pressure was seen, with the expert-directed basic skills group showing superior quality (Fig. 3). Subjects perceived the basic skills training to be stress-free, likely to transfer to more complex tasks and to the operating room, more productive when faculty-directed, appropriate in the amount of time spent, and slightly more valuable than learning basic skills in the operating room (Table 2).

Table 1    Confounding variables, expert-directed versus self-directed groups

Subject characteristics               Expert-directed (n = 23)   Self-directed (n = 22)   P value*
R1/R2, frequency (%)                  17/6 (74/26)               13/9 (59/41)             .35
Prior postgraduate training, n (%)    5 (22)                     2 (9)                    .41
Training, mo, mean (SD)               9.79 (7.19)                10.71 (6.48)             .66

*The Fisher exact test was used for frequency data and the independent t test was used for continuous data.

Figure 3 Comparison of surgical task outcome measures by basic skill training modality (self- vs expert-directed). †One-way analysis of covariance with months of training and prior postgraduate training as covariates. ‡Mann-Whitney U test.

Table 2    Resident posttraining perceptions (mean, SD)*
1. The laboratory sessions were stress-free: 4.4 (.7)
2. Basic skill training made (would have made) subsequent task performance easier: 4.3 (.9)
3. Skills acquired in basic skills training will transfer to the operating room: 4.6 (.5)
4. Faculty-supervised training was more productive than self-directed training: 4.6 (.7)
5. The amount of time spent on basic skills training: 3.3 (.7)
6. The time spent in the laboratory for this session as compared with the equivalent amount of time spent in the operating room for a resident at my level of training: 3.8 (.8)

*Response scales (all 5-point): items 1-4: 1, disagree strongly; 3, neutral; 5, agree strongly. Item 5: 1, too little; 3, perfect; 5, too much. Item 6: 1, less valuable than training in the operating room; 3, equal to training in the operating room; 5, more valuable than training in the operating room.

Comments

This study was designed to help elucidate the effect of 8 hours of expert-directed basic technical skills instruction on the differential transfer of learned skills to more complex tasks. Other studies of transfer of training after isolated partial-task simulation have shown the following: (1) correlation between performance on a laboratory model and operating room performance for saphenofemoral dissection; (2) equivalence between cadaver-based training and low-fidelity bench models when performance is measured on cadaver-based simulations; and (3) transfer of training from a bench model to a live animal model.11-13 We assumed some degree of transfer of learned skills to more complex tasks and aimed to show improved performance on complex tasks by presumably improving skills training. Despite subjective evidence of a benefit from the presence of an expert, we have shown only minimal objective evidence of differential performance, with only 1 of 6 outcome measures showing a difference. The results of our study, taken in the context of the existing literature, can be interpreted in 2 ways: a true lack of difference, or a true difference that was not detected.

There may truly be no difference in outcomes between faculty-directed and self-directed learning for these skills. Modern multimedia-based learning may be sufficient to provide adequate demonstration of these skills, with residents able to pick up on correct technique, create a cognitive representation of the performance of the skill, and practice until competent. This was shown in a recently published study by Xeroulis et al,14 which showed equivalent long-term retention with computer-based video instruction as compared with experts providing summative feedback. Furthermore, medical students using self-directed video-based instruction for knot-tying have been shown to be able to recognize when they have reached the plateau portion of the learning curve, suggesting that, equipped with a well-designed multimedia curriculum, trainees may master these basic technical skills without additional expert direction.15 The cognitive aspect of training, however, likely does benefit from expert mentoring, as was shown by the 2000 study of knot-tying training by Rogers et al,16 which compared computer-assisted learning with computer-assisted learning plus expert feedback; students receiving expert feedback performed better on immediate posttesting. In addition, others have shown that the cognitive component of self-directed video-based learning of knot-tying can be improved by adding a demonstration of commonly performed errors (in addition to correct technique).17 Finally, one could hypothesize that the interactions between peers in the self-directed group may have improved outcomes, but prior research has suggested that this is not the case.18

Our survey results, however, suggest that there is indeed a benefit of expert-directed training that we have not measured. There are many plausible explanations for this, the first of which relates to the number of tasks covered in a single block of time. In our study, residents were trained in 14 skills in an 8-hour period (about 35 minutes per skill). Although this may have resulted in cognitive understanding of the skills to be performed, there likely was not adequate practice time to develop psychomotor skills to a level of competence, let alone to automaticity, in either group.19 We did not routinely measure basic surgical skill level before or after training, and therefore cannot comment on the amount of actual immediate learning that may have occurred in the laboratory. Recent work studying microsurgical skill acquisition has shown that shorter training sessions distributed over 4 weeks yield superior skill retention compared with 4 training sessions in 1 day.20 This model is proposed to allow cognitive rehearsal between sessions, leading to improved memory encoding and, therefore, to long-term retention. Although our measurement of transfer was delayed by only 2 days, it is possible that the rapid progression through skills during basic skills training prevented trainees from dwelling on prior skills, leading to minimal retention even at the 2-day point.

It also is possible that a number of the basic surgical skills taught in the faculty-directed session did not directly impact the time and quality of the more complex tasks used in this study. Many of the skills, although consuming a significant portion of the skills training laboratory session, were not used in the performance of the 2 tasks (specifically, ligation techniques). Furthermore, attention to squareness of knots may not impact the time to completion or the esthetic results of a skin closure. On the other hand, such technical aspects may be more important in performing a secure bowel anastomosis, which may explain the one positive difference seen in favor of faculty-directed training.

We assumed, perhaps falsely, some degree of transfer of learned skills to task performance. Learning theory dictates that learners must achieve a sufficient threshold of learning to support transfer.3 Because we did not obtain measures of technical skill performance at the end of training, we do not know whether the lack of differential task performance was caused by insufficient mastery of basic skills or by equivalent mastery of technical skills. Direct measurements of basic skill performance may have shown more of a difference between groups and may have elucidated whether the lack of difference we observed is owing to a lack of skill acquisition or to a false assumption of transfer to the 2 selected tasks. Currently, there are few validated measures of basic surgical skills; consequently, this will be an area of future research for our group.

Another explanation of the lack of difference may be the fact that both groups received only cognitive training for the tasks to be performed. We assumed that basic skills would transfer to the complex tasks, but we do not have control-group data. The outcome measures for the tasks performed were obtained in the absence of any specific hands-on training on those tasks and, thus, our results may reflect the fact that neither group had achieved its optimal potential performance on the task at hand. This issue may have been compounded by the short period of time between skills training and task performance. An additional measurement at a delayed time point may have shown differential retention. In the previously mentioned study by Xeroulis et al,14 immediate differences were not seen, but at 1-month follow-up, retention was significantly lower in the formative feedback group as compared with the expert summative feedback and self-assessment with video instruction groups. Although the training in our study was not standardized, a significant portion of the mentoring was of a formative nature, potentially aligning our training with the formative feedback group in the Xeroulis et al14 study.

Finally, there was a substantial amount of variability in the sample, which may have led to a type II error. This variability may be owing to resident differences and/or to study procedures being performed over a 1-year period. Over this period of time, residents presumably were exposed to technical skills training in other environments, including the operating room. The inclusion of months of training and prior postgraduate experience as covariates significantly reduced this variability, but other methods, such as pretesting basic surgical skill level before training, likely would have been more effective. Alternatively, performing this and similar studies with true novices, such as medical students, may achieve the same goal. Although that approach is attractive, the real question is whether this training is effective for residents; we therefore performed the study within the context of residency training, where it is likely to be implemented in other programs.

This study had a number of limitations. The first limitation was the lack of standardization of basic surgical skills training. Although the expert was not given a script to work from, we did attempt to standardize the training by having the same expert instructor at every session with the same number of skills to be taught. The second limitation was the use of outcome metrics that have not necessarily been shown to be valid. Although the OSATS assessment generally is accepted to be a valid measure in the live setting, it has not been studied formally for use with videorecorded performance of open tasks or procedures. In addition, the skin esthetic rating scale used in our study, despite being of acceptable reliability, may not be able to discriminate at a fine enough level to show a difference. Furthermore, the use of anastomotic leak pressure recently has been questioned for assessment across the range of resident training levels, but it has not been studied adequately among novices.21 The third limitation of this study was the small sample size. Given the amount of variability in the sample, our power to detect a difference was lower than expected (4-5 Global Rating Scale points, 6-7 minutes for time to completion, and 4 esthetic rating scale points). Regardless of statistical power, from a practical standpoint, if a difference is not shown across 45 residents, the likelihood of an economically meaningful difference is low. Finally, because this study examined only residents in a single program with a single instructor, further research is needed to generalize these findings.

The most effective method of basic technical skill instruction is likely a combination of self-directed training with the assistance of multimedia aids and intermittent expert feedback distributed over a period of weeks, not hours. Efforts to use motion-tracking devices to provide feedback have not been successful in producing retention of skill when compared with expert-based instruction, and further development of these automated systems is needed.22 Future research is needed to further elucidate curriculum factors such as session timing, length, and modality of training in order to optimize the use of valuable faculty resources. In addition, the use of less costly alternative expert instructors, including senior residents, laboratory technicians, scrub technicians, nurses, or physicians' assistants trained to expertise in these basic technical skills, may allow faculty time to be better used in the teaching of more complex tasks and may ease the significant financial burden of using faculty for laboratory-based training.
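As a rough illustration of the power limitation discussed above, the sketch below shows one way to back-calculate a minimum detectable difference from the two group sizes. The Bonferroni-corrected alpha and the pooled standard deviation used here are assumptions for illustration, not reported study parameters.

    # Minimum detectable effect size for n1 = 23, n2 = 22 at 80% power and the
    # Bonferroni-corrected alpha; the pooled SD is an assumed value for illustration.
    from statsmodels.stats.power import TTestIndPower

    detectable_d = TTestIndPower().solve_power(
        nobs1=23, ratio=22 / 23, alpha=0.008, power=0.80, alternative="two-sided"
    )
    assumed_pooled_sd = 4.0  # assumption, in Global Rating Scale points
    min_detectable_difference = detectable_d * assumed_pooled_sd

If the pooled standard deviation were about 4 rating-scale points, this calculation would give a minimum detectable difference in the 4-5 point range quoted above; the same approach applies to the time-to-completion and esthetic measures.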

Conclusions

The addition of an expert mentor to a day-long basic technical skills training laboratory did not result in objective improvements in the performance of 2 more complex tasks. Indeed, outcomes for skin closure and bowel anastomosis were similar among residents who had acquired 14 basic skills by self-directed practice when compared with those who acquired those skills under faculty-directed supervision. In this context, we must question the utility of using valuable faculty time. Other contexts, such as distributed mentoring and training under different curricula, may produce different results. Further research is needed to define the role of expert mentors in laboratory-based training.


References

1. Dubrowski A, MacRae H. Randomised, controlled study investigating the optimal instructor:student ratios for teaching suturing skills. Med Educ 2006;40:59-63.
2. Satava RM, Cuschieri A, Hamdorf J. Metrics for objective assessment. Surg Endosc 2003;17:220-6.
3. Bransford J, Brown A, Cocking R. How People Learn: Brain, Mind, Experience, and School. Washington, DC: National Academy; 2000.
4. Grantcharov TP, Kristiansen VB, Bendix J, et al. Randomized clinical trial of virtual reality simulation for laparoscopic skills training. Br J Surg 2004;91:146-50.
5. McCluney AL, Vassiliou MC, Kaneva PA, et al. FLS simulator performance predicts intraoperative laparoscopic skill. Surg Endosc 2007;21:1991-5.
6. Seymour NE, Gallagher AG, Roman SA, et al. Virtual reality training improves operating room performance: results of a randomized, double-blinded study. Ann Surg 2002;236:458-63.
7. Anastakis DJ, Wanzel KR, Brown MH, et al. Evaluating the effectiveness of a 2-year curriculum in a surgical skills center. Am J Surg 2003;185:378-85.
8. Regehr G, MacRae H, Reznick RK, et al. Comparing the psychometric properties of checklists and global rating scales for assessing performance on an OSCE-format examination. Acad Med 1998;73:993-7.
9. Reznick R, Regehr G, MacRae H, et al. Testing technical skill via an innovative "bench station" examination. Am J Surg 1997;173:226-30.
10. Martin JA, Regehr G, Reznick R, et al. Objective structured assessment of technical skill (OSATS) for surgical residents. Br J Surg 1997;84:273-8.
11. Anastakis DJ, Regehr G, Reznick RK, et al. Assessment of technical skills transfer from the bench training model to the human model. Am J Surg 1999;177:167-70.
12. Datta V, Bann S, Beard J, et al. Comparison of bench test evaluations of surgical skill with live operating performance assessments. J Am Coll Surg 2004;199:603-6.
13. Grober ED, Hamstra SJ, Wanzel KR, et al. The educational impact of bench model fidelity on the acquisition of technical skill: the use of clinically relevant outcome measures. Ann Surg 2004;240:374-81.
14. Xeroulis GJ, Park J, Moulton CA, et al. Teaching suturing and knot-tying skills to medical students: a randomized controlled study comparing computer-based video instruction and (concurrent and summary) expert feedback. Surgery 2007;141:442-9.
15. Jowett N, LeBlanc V, Xeroulis G, et al. Surgical skill acquisition with self-directed practice using computer-based video training. Am J Surg 2007;193:237-42.
16. Rogers DA, Regehr G, Howdieshell TR, et al. The impact of external feedback on computer-assisted learning for surgical technical skill training. Am J Surg 2000;179:341-3.
17. Rogers DA, Regehr G, MacDonald J. A role for error training in surgical technical skill instruction and evaluation. Am J Surg 2002;183:242-5.
18. Rogers DA, Regehr G, Gelula M, et al. Peer teaching and computer-assisted learning: an effective combination for surgical skill training? J Surg Res 2000;92:53-5.
19. Reznick RK, MacRae H. Teaching surgical skills: changes in the wind. N Engl J Med 2006;355:2664-9.
20. Moulton CA, Dubrowski A, MacRae H, et al. Teaching surgical skills: what kind of practice makes perfect? A randomized, controlled trial. Ann Surg 2006;244:400-9.
21. Vick LR, Vick KD, Borman KR, et al. Face, content, and construct validities of inanimate intestinal anastomoses simulation. J Surg Educ 2007;64:365-8.
22. Porte MC, Xeroulis G, Reznick RK, et al. Verbal feedback from an expert is more effective than self-accessed feedback about motion efficiency in learning new surgical skills. Am J Surg 2007;193:105-10.