Skill retention following proficiency-based laparoscopic simulator training Dimitrios Stefanidis, MD, PhD, James R. Korndorffer Jr, MD, Rafael Sierra, MD, Cheri Touchard, BS, J. Bruce Dunne, PhD, and Daniel J. Scott, MD, New Orleans, La

Background. Proficiency-based curricula using both virtual reality (VR) and videotrainer (VT) simulators have proven to be efficient and maximally effective, but little is known about the retention of acquired skills. The purpose of this study was to assess skill retention after completion of a validated laparoscopic skills curriculum.

Methods. Surgery residents (n = 14) with no previous VR or VT experience were enrolled in an Institutional Review Board–approved protocol and sequentially practiced 12 Minimally Invasive Surgical Trainer-VR and 5 VT tasks until proficiency levels were achieved. One VR task (manipulate diathermy) and 1 VT task (bean drop) were selected for assessment at baseline, after training completion (posttest), and at retention.

Results. All residents completed the curriculum. Posttest assessment occurred at 13.2 ± 11.8 days and retention assessment at 7.0 ± 4.0 months. After an early performance decrement at posttest (17%-45%), the acquired skill was maintained up to the end of the follow-up period. For VR, scores were 81.5 ± 23.5 at baseline, 33.3 ± 1.8 at proficiency, 48.4 ± 9.2 at posttest, and 48.4 ± 11.8 at retention. For VT, scores were 49.4 ± 12.5 at baseline, 22.0 ± 1.4 at proficiency, 25.6 ± 3.6 at posttest, and 26.4 ± 4.2 at retention. Skill retention was better for VT than for VR (P < .02). The extent of skill deterioration did not correlate with training duration or resident level.

Conclusions. Although residents do not retain all acquired skills (more so for VR than for VT) according to simulator assessment, proficiency-based training on simulators results in durable skills. Additional studies are warranted to further optimize curriculum design, investigate simulator differences, and establish training methods that improve skill retention. (Surgery 2005;138:165-70.)

From the Tulane Center for Minimally Invasive Surgery, Tulane University School of Medicine

Presented at the 66th Annual Meeting of the Society of University Surgeons, Nashville, Tennessee, February 9-12, 2005. doi:10.1016/j.surg.2005.06.002

THE TEACHING of operative skills in the clinical setting is constrained by the complexity of procedures, medicolegal and ethical concerns, and fiscal and time limitations (especially within the 80-hour workweek); these constraints have created the need for formal training outside the operating room.1-3 Moreover, the increased incidence of complications observed after the undisciplined introduction of laparoscopic techniques in the early 1990s4 has raised public awareness and resulted in an outcry for safety.5 Loss of depth perception6 and haptic feedback, the fulcrum effect, and the use of instruments with limited range of motion7 make laparoscopic tasks difficult and introduce new skill sets that must be mastered. Acquisition of laparoscopic skills on both virtual reality (VR) and videotrainer (VT) simulators can help trainees overcome the learning curve of new, complex, and difficult tasks and can lead to improved operative performance.8-12 Moreover, proficiency-based curricula have proven to be maximally effective and efficient.12-14 Besides skill acquisition, skill retention is vitally important15,16 but has not been well studied for surgical motor tasks. In contrast to the nonsurgical literature, in which many publications have investigated skill retention,15,17-24 only a handful of surgical papers have addressed this issue.25-27 In addition, no studies have compared the durability of skill between VR and VT simulators, which have previously demonstrated distinct learning characteristics.28 The purpose of this study was to investigate the durability of skill that surgery residents acquired during a proficiency-based basic laparoscopic skills curriculum using both VR and VT simulators.

Table I. Likert scales and distribution of responses in baseline questionnaire

Laparoscopy self-rating: 1. Very poor, 7.7%; 2. Poor, 46.1%; 3. Moderate, 38.5%; 4. Good, 7.7%; 5. Excellent, 0%
Past video game exposure: 1. None, 7.7%; 2. Very little, 30.7%; 3. Moderate, 46.2%; 4. Extensive, 15.4%
Prior VR experience: 1. None, 84.6%; 2. <30 minutes, 15.4%; 3. <1 hour, 0%; 4. <2 hours, 0%; 5. <3 hours, 0%
Course expectations: 1. Useful, 69.2%; 2. Somewhat useful, 30.8%; 3. Not useful, 0%
Current video game exposure: 1. None, 69.3%; 2. Very little, 23.0%; 3. Moderate, 7.7%; 4. Extensive, 0%
Prior VT experience: 1. None, 84.6%; 2. <30 minutes, 15.4%; 3. <1 hour, 0%; 4. <2 hours, 0%; 5. <3 hours, 0%

METHODS

Surgery residents (n = 14) of varying levels (R1-4) with no or minimal prior VR or VT experience were enrolled in an Institutional Review Board–approved training curriculum comprising 12 MIST-VR and 5 VT tasks. The Minimally Invasive Surgical Trainer (MIST)-VR (Mentice, Göteborg, Sweden) tasks consisted of 6 core skills 1 (CS1) and 6 core skills 2 (CS2) tasks on the easy default setting; the 5 Southwestern VT tasks consisted of the bean drop, running string, checkerboard, block move, and suture foam drills performed on a 6-station VT (Karl Storz Endoscopy, Culver City, Calif), as described in detail elsewhere.29 Scores for the MIST-VR were generated automatically by its software on the basis of completion time, errors, economy of movement, and economy of diathermy (for tasks using diathermy). VT scores were based on completion time, recorded with a stopwatch (Fisher Scientific International Inc, Hampton, NH). All residents completed a questionnaire regarding demographics, handedness, curriculum expectations, and prior experience with laparoscopic surgery, VR or VT simulators, and video games; responses were recorded on Likert scales (Table I). After a standardized demonstration of each task and administration of a baseline test at the beginning of the curriculum (3 repetitions of each task), all residents sequentially practiced the 17 tasks until previously established proficiency levels13,14

were achieved on 2 consecutive repetitions. The proficiency scores were posted at each station so that trainees could easily follow their own progress. Practice was scheduled during 1-hour weekly sessions, and a research assistant was available for assistance, although no active instruction was given. For the evaluation of skill retention, we elected to test trainees on 2 tasks to simplify the testing sessions. We chose the manipulate diathermy (MD) VR task (proficiency score, 36.6) and the bean drop (BD) VT task (proficiency score, 24) because these represent the most valid tasks of the 2 simulators.12,30 All residents performed 3 repetitions of both tasks after achieving proficiency (posttest) and at the end of the academic year (retention test). During the follow-up period, no resident had additional simulator exposure, but all had routine on-the-job training. Composite scores consisting of the mean of the 3 repetitions were calculated and used to compare performance differences. To compare relative performance on each simulator, we normalized composite scores according to the respective proficiency levels. To examine expert performance over time, we tested 3 experts with extensive laparoscopic and simulator experience on both tasks at a time remote from any practice on the simulators and without allowing any warm-up period on the tasks, thus mimicking the testing conditions for the subjects of this study. One of the experts also had been involved in the creation of the proficiency levels for both tasks. Statistical analysis was performed with the use of a t test, paired t test, and Pearson correlation;


Fig 1. Comparison of trainees’ performance on the bean drop and manipulate diathermy tasks. Scores have been normalized to proficiency levels (set at 100%), and high scores represent inferior performance. P values reflect differences between the 2 tasks. For both tasks, performance scores are different for all testing session comparisons (P < .01) except between Posttest and Retention. Error bars represent SEM.

P less than .05 was considered significant (SigmaStat; SPSS Inc, Chicago, Ill). Values are expressed as mean ± SD unless noted otherwise.

RESULTS

All 14 residents successfully completed the curriculum; six R1, four R2, two R3, and two R4 residents participated. Mean age was 29.9 ± 3.1 years; 12 residents were male and 2 female; 12 were right-handed, 1 was left-handed, and 1 used both hands equally. Questionnaire data are shown in Table I. Training on the MD and BD tasks required 22.3 ± 15.3 and 19.9 ± 9.1 repetitions, respectively (P = NS). The interval between achieving proficiency and posttest was 13.2 ± 11.8 days, and between proficiency and retention, 7.0 ± 4.0 months. As seen in Figure 1, after an early performance decrement at posttest, there was no further skill loss for either task at retention. In particular, for MD, scores were 81.5 ± 23.5 at baseline, 33.3 ± 1.8 at proficiency (59% improvement compared with baseline, P < .001), 48.4 ± 9.2 at posttest (55% skill retention compared with proficiency, P < .001), and 48.4 ± 11.8 at retention (0% deterioration compared with posttest, P = NS). For BD, scores were 49.4 ± 12.5 at baseline, 22.0 ± 1.4 at proficiency (56% improvement compared with baseline, P < .001), 25.6 ± 3.6 at posttest (83% skill retention compared with proficiency, P < .001), and 26.4 ± 4.2 at retention (3% deterioration compared with posttest, P = NS). The early performance decay was greater for MD than for BD (45% vs 17%; P < .001).

Fig 2. Skill loss over the follow-up period (7 ± 4 months), broken down into 4 intervals. After an initial loss within the first 2 weeks after training, trainee performance remained stable for the duration of follow-up. P values refer to differences between the 2 tasks at each time frame. Both posttest and retention performances of each resident are included in this graph. Error bars represent SEM.

As seen in Figure 2, most of the skill loss occurred within the first 2 weeks after training, and performance stabilized thereafter for the duration of follow-up. In contrast, as expected, the mean performance scores of the 3 experts on both tasks at a time remote from any simulator practice (37.0 for MD and 21 for BD) were very close to our proficiency levels (36.6 for MD and 24 for BD). Because only 1 of the 3 experts had been involved originally in the creation of our proficiency levels, we specifically examined his performance and found no changes over time (35.9 ± 2.1 at retest vs 36.6 ± 5.2 at level development for MD, and 24.6 ± 3.2 vs 22.6 ± 5.4 for BD; P = NS for both). Skill loss did not correlate with resident level, duration of training (number of repetitions), or any of the other demographic or questionnaire parameters. Resident level and past laparoscopic experience correlated with baseline performance (indicating construct validity), but only for BD (r = 0.68-0.71, P < .01). As seen in Figure 3, there was no difference in performance between junior and senior residents for the MD task at any time interval; the BD baseline performance of seniors was superior to that of juniors (indicating further construct validity).
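The percentages reported above follow directly from the composite scores. The sketch below shows the arithmetic under stated assumptions: improvement is expressed relative to baseline, decay relative to the score at proficiency, normalization (as in Fig 1) as a percentage of the posted proficiency level, and SEM as SD/√n. The paper does not state its rounding conventions, so small differences from the printed figures (eg, 56% and 17%) are expected.

```python
import math

# Sketch of the arithmetic behind the reported percentages (assumptions
# noted above). Scores are times/penalty-based, so lower is better.

def pct_improvement(baseline, proficiency):
    """Percent improvement from baseline to proficiency, relative to baseline."""
    return (baseline - proficiency) / baseline * 100

def pct_decay(proficiency, later):
    """Percent performance decay at a later test, relative to proficiency."""
    return (later - proficiency) / proficiency * 100

def normalize(score, proficiency_level):
    """Composite score as a percent of the posted proficiency level (Fig 1)."""
    return score / proficiency_level * 100

def sem(sd, n):
    """Standard error of the mean from SD and sample size (error bars)."""
    return sd / math.sqrt(n)

# Manipulate diathermy (MD): baseline 81.5, proficiency 33.3, posttest 48.4
print(round(pct_improvement(81.5, 33.3)))     # ~59% improvement
print(round(pct_decay(33.3, 48.4), 1))        # ~45.3% early decay, ie, ~55% retention

# Bean drop (BD): baseline 49.4, proficiency 22.0, posttest 25.6
print(round(pct_improvement(49.4, 22.0), 1))  # ~55.5% improvement (printed as 56%)
print(round(pct_decay(22.0, 25.6), 1))        # ~16.4% early decay (printed as 17%)
```

Note that expressing decay relative to the proficiency score (rather than to the baseline-to-proficiency gain) is what reproduces the printed 55%/83% retention figures.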


Fig 3. Resident performance is shown according to junior (R1-2) and senior (R3-4) levels. There were no statistically significant differences between levels except for baseline performance on the bean drop (P < .05). Error bars represent SEM.
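The construct-validity finding above (r = 0.68-0.71) is a Pearson product-moment correlation between resident level (or prior laparoscopic experience) and baseline score. A minimal sketch of that computation, run on small hypothetical data (not the study's data; with time-based scores, lower is better, so a negative r indicates that seniors were faster):

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = math.sqrt(sum((x - mx) ** 2 for x in xs) *
                    sum((y - my) ** 2 for y in ys))
    return num / den

# Hypothetical illustration only: resident level vs baseline completion time.
levels = [1, 2, 3, 4]
baseline_seconds = [60, 50, 40, 30]
r = pearson_r(levels, baseline_seconds)  # -1.0 for this perfectly linear toy data
```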

DISCUSSION

As seen in previous studies, our trainees achieved significant improvement in performance after completing a carefully structured proficiency-based curriculum on previously validated VR and VT simulators; skill acquisition was similar for both systems (59% and 56% improvement compared with baseline, respectively). Despite an early performance decrement (45% for VR and 17% for VT at 13.2 ± 11.8 days after training), the acquired skill persisted over a mean 7-month follow-up period. While the early skill loss may seem substantial, little is known regarding skill retention after simulator-based surgical training. Grober et al25 showed that retention of skills was better at 4 months after hands-on bench model training for rodent vas deferens anastomoses than after didactic training alone. Torkington et al26 showed deterioration in basic laparoscopic skill according to MIST-VR testing using the manipulate diathermy task (25% over 3 months); however, training was conducted on a box trainer, not on the VR system. Anastakis et al27 found that a single session of a surgical skills curriculum did not confer long-term benefit to participants. Nonetheless, many more studies from nonsurgical fields have been published on the durability of skill.15,17,18,21-23 Factors considered important for skill retention include the duration of the retention interval, the quality and quantity of the original training, certain task characteristics, and individual differences.15 While a meta-analysis of skill retention found that performance decay increased with longer retention intervals and documented a 92% skill loss at 1 year,15 this finding appears to be more applicable to cognitive tasks, because several authors have shown long-term persistence of psychomotor skills.22,23 Likewise, in our study the retention interval did not appear to be related to skill deterioration: Skill acquired during proficiency-based training persisted for many months despite an early decrement in the first few weeks after training. The quality and quantity of training are also important factors for skill retention.15 Even though the quantity of training (number of repetitions) did not appear to be related to skill retention, the quality of the training in this study (proficiency based) may account for the longevity of the acquired skill; having to achieve expert-derived performance may confer skills that are resistant to decay. On the other hand, it is unrealistic to believe that the trainee's ability to achieve the expert-derived performance goals on 2 consecutive attempts (the endpoint of training for our subjects) provides enough skill to perform consistently at an expert level.
Indeed, we showed in this study that while expert performance, as measured by simulators, remains stable and consistent over time, trainees lost 20% to 45% of the skill they achieved during proficiency-based training. Along those lines, the performance achieved during the posttest may actually reflect true trainee ability more accurately than the performance achieved at the end of intensive training. Thus, even though we used a proficiency-based curriculum, it may not have been rigorous enough or ideal for maximal skill retention; alternative protocols for initial training or ongoing practice may be needed to maintain expert-level performance. Individual differences such as baseline ability can also affect learning, because higher-skilled individuals acquire and retain more skill over time.15,17,19 However, in our study, skill retention was not influenced by resident level, even though senior residents outperformed junior residents at baseline on both tasks (indicating construct validity; statistically significant only for VT). This lack of correlation could be a consequence of our proficiency-based curriculum: All residents trained to the same expert-derived level, and their performance at training completion was by far superior to their baseline. In this context, acquired skill (quality of training) appears to be the primary determinant of retention, and baseline skill (resident level) may be of less importance, especially for basic laparoscopic skills. Another factor, motivation, also plays an important role in skill acquisition31 and may influence long-term retention.15 Our baseline questionnaire data revealed that all participants were highly motivated, because 100% expected to benefit from training. Motivation during training, however, may be affected by the type of task. In psychologic terms, "natural tasks are generally retained better than artificial tasks" because trainees exhibit a higher level of interest in learning tasks that seem "natural."15 A study by Hamilton et al28 showed a clear preference of trainees for VT over VR, citing better visualization and tactile feedback that made VT more realistic. With regard to the significantly higher early skill decay on the VR system, the superior face validity (realism) of the VT task may have motivated trainees to maximize their learning and may account for the lower skill deterioration on that task. In addition, interface differences between the 2 systems may have contributed to the skill retention differences at posttest. Although VR training has proven effective in enhancing operative performance and incorporates sophisticated metrics,12 the environment is artificial and relatively gamelike compared with VT.
While some of the "tricks" for obtaining a good score may easily be forgotten for both VR and VT (possibly explaining the early performance decay), the VR system seemed to have many more such tricks and was much less forgiving, as judged by the higher skill loss at posttest (45% vs 17%, respectively). Nevertheless, performance, as measured by both simulators, remained stable over a long period of time, substantiating their value as training tools. This knowledge should be taken into consideration when simulators are used as training or testing tools. Undoubtedly, as technology continues to improve, so too will the fidelity of the simulators, especially for VR. While the findings of this study may not generalize easily to the clinical setting, the results have important implications for the design of training curricula for residents and for the continuing education of practicing surgeons. Knowing how much skill is retained and when deterioration occurs (if it does) allows educators to plan retraining sessions at appropriate time frames to maintain optimal performance. Moreover, applying automaticity theory to training (how experts become automated at performing a task) and defining proficiency in new, multidimensional fashions may prove more effective. These issues may be of lesser importance when, after initial training, a surgeon has ongoing clinical practice with a task that closely resembles the simulator task (eg, laparoscopic suturing); in that instance, the surgeon's skill is unlikely to deteriorate, because of the ongoing practice. Skill decay potentially occurs, however, when a surgeon neither performs a procedure nor uses a specific skill for a long time; refresher training on a simulator, such as just before the procedure, might then be of great value. In addition, although simulator training has proven to improve operative performance,8-12 the longevity of that benefit is unknown. Such studies are currently underway in our laboratory.

CONCLUSION

This study shows a clear long-term benefit for trainees after proficiency-based training on simulators. Additional studies are warranted to further optimize curriculum design and enhance the acquisition of durable skills.

REFERENCES

1. Hamdorf JM, Hall JC. Acquiring surgical skills. Br J Surg 2000;87:28-37.
2. Bridges M, Diamond DL. The financial impact of teaching surgical residents in the operating room. Am J Surg 1999;177:28-32.
3. Scott DJ, Valentine RJ, Bergen PC, et al. Evaluating surgical competency with the American Board of Surgery In-Training Examination, skill testing, and intraoperative assessment. Surgery 2000;128:613-22.
4. Moore MJ, Bennett CL. The learning curve for laparoscopic cholecystectomy. The Southern Surgeons Club. Am J Surg 1995;170:55-9.
5. Kohn LT, Corrigan JM, Donaldson MS. To err is human: building a safer health system. Washington (DC): National Academy Press; 2000.
6. Jones DB, Brewer JD, Soper NJ. The influence of three-dimensional video systems on laparoscopic task performance. Surg Laparosc Endosc 1996;6:191-7.
7. Gallagher AG, McClure N, McGuigan J, Ritchie K, Sheehy NP. An ergonomic analysis of the fulcrum effect in the acquisition of endoscopic skills. Endoscopy 1998;30:617-20.
8. Peters JH, Fried GM, Swanstrom LL, et al. Development and validation of a comprehensive program of education and assessment of the basic fundamentals of laparoscopic surgery. Surgery 2004;135:21-7.
9. Fried GM, Feldman LS, Vassiliou MC, et al. Proving the value of simulation in laparoscopic surgery. Ann Surg 2004;240:518-25.
10. Scott DJ, Bergen PC, Rege RV, et al. Laparoscopic training on bench models: better and more cost effective than operating room experience? J Am Coll Surg 2000;191:272-83.
11. Hamilton EC, Scott DJ, Kapoor A, et al. Improving operative performance using a laparoscopic hernia simulator. Am J Surg 2001;182:725-8.
12. Seymour NE, Gallagher AG, Roman SA, et al. Virtual reality training improves operating room performance: results of a randomized, double-blinded study. Ann Surg 2002;236:458-63.
13. Brunner WC, Korndorffer JR, Sierra R, et al. Determining standards for laparoscopic competency using virtual reality. Am Surg 2005;71:29-35.
14. Korndorffer JR Jr, Scott DJ, Sierra R, et al. Developing and testing competency levels for basic laparoscopic skills training. Arch Surg 2005;140:80-4.
15. Arthur W Jr, Bennet WJ, Stanush PL. Factors that influence skill decay and retention: a quantitative review and analysis. Human Performance 1998;11:57-101.
16. Schmidt RA, Björk RA. New conceptualizations of practice: common principles in three paradigms suggest new concepts in training. Psychol Sci 1992;3:207-17.
17. Farr MJ. The long-term retention of knowledge and skills: a cognitive and instructional perspective. New York: Springer-Verlag; 1987.
18. Hurlock RE, Montague WE. Skill retention and its implications for Navy tasks: an analytical review. NPRDC Special Rep No. 82-21. San Diego (CA): Navy Personnel Research and Development Center; 1982.
19. Schendel JD, Shields JL, Katz MS. Retention of motor skills: review (Technical Paper 313). Alexandria (VA): US Army Research Institute for the Behavioral and Social Sciences; 1978.
20. Schendel JD, Hagman JD. On sustaining procedural skills over a prolonged retention interval. J Appl Psychol 1982;67:605-10.
21. Annett J. Memory for skill. In: Gruneberg MM, Morris PE, editors. Applied problems in memory. London: Academic Press; 1979.
22. Hikosaka O, Rand MK, Nakamura K, et al. Long-term retention of motor skill in macaque monkeys and humans. Exp Brain Res 2002;147:494-504.
23. Shadmehr R, Brashers-Krug T. Functional stages in the formation of human long-term motor memory. J Neurosci 1997;17:409-19.
24. Bodilly S, Fernandez J, Kimbrough J, Purnell S. Individual Ready Reserve skill retention and refresher training options. AD-A183416. Santa Monica (CA): Rand Corporation; 1986.
25. Grober ED, Hamstra SJ, Wanzel KR, et al. Laboratory based training in urological microsurgery with bench model simulators: a randomized controlled trial evaluating the durability of technical skill. J Urol 2004;172:378-81.
26. Torkington J, Smith SG, Rees B, Darzi A. The role of the basic surgical skills course in the acquisition and retention of laparoscopic skill. Surg Endosc 2001;15:1071-5.
27. Anastakis DJ, Wanzel KR, Brown MH, et al. Evaluating the effectiveness of a 2-year curriculum in a surgical skills center. Am J Surg 2003;185:378-85.
28. Hamilton EC, Scott DJ, Fleming JB, et al. Comparison of video trainer and virtual reality training systems on acquisition of laparoscopic skills. Surg Endosc 2002;16:406-11.
29. Scott DJ, Jones DB. Virtual reality training and teaching tools. In: Soper NJ, Swanström LL, Eubanks WS, editors. Mastery of endoscopic and laparoscopic surgery. Philadelphia: Lippincott Williams & Wilkins; 2005. p. 146-60.
30. Korndorffer JR Jr, Clayton JL, Tesfay ST, et al. Multicenter construct validity for Southwestern laparoscopic videotrainer stations. J Surg Res. In press.
31. Kanfer R. Motivation theory and industrial and organizational psychology. In: Dunnette MD, Hough LM, editors. Handbook of industrial and organizational psychology. Palo Alto (CA): Consulting Psychologists Press; 1992. p. 75-155.