Self-assessment in simulation-based surgical skills training


The American Journal of Surgery 185 (2003) 319 –322

Association for surgical education

Jeannie MacDonald, M.D.*, Reed G. Williams, Ph.D., David A. Rogers, M.D.
Department of Surgery, Southern Illinois University School of Medicine, PO Box 19638, Springfield, IL 62794-9638, USA
Manuscript received September 3, 2002; revised manuscript November 16, 2002.
Presented at the 22nd Annual Meeting of the Association of Surgical Education, Baltimore, Maryland, April 4–6, 2002.

Abstract

Background: Simulation-based training provides minimal feedback and relies heavily on self-assessment. Research has shown medical trainees are poor self-assessors. The purpose of this study was to examine trainees' ability to self-assess technical skills using a simulation trainer.
Methods: Twenty-one medical students performed 10 repetitions of a simulated task. After each repetition they estimated their time and errors made. These were compared with the simulator data.
Results: Task time (P < 0.0001) and errors made (P < 0.0001) improved with repetition. Both self-assessment curves reflected the actual performance curves (P < 0.0001). Self-assessment of time did not improve in accuracy (P = 0.26), but error estimation did (P = 0.01) when compared with actual performance.
Conclusions: Novices demonstrated improved skill acquisition using simulation. Their estimates of performance and the accuracy of their error estimation improved with repetition. Clearly, practice enhances technical skill self-assessment. These results support the notion of self-directed skills training and could have significant implications for residency training programs. © 2003 Excerpta Medica, Inc. All rights reserved.

Keywords: Self-assessment; Technical skills; Simulation training

A number of educational studies have examined self-assessment. In medicine, most such studies address cognitive knowledge rather than surgical or technical skills [1–4]. This is supported by Gordon's review [2] of self-assessment in health care, which found studies focused on written examinations and clinical skills but none on technical or surgical skills. Most of these studies have shown medical professionals and trainees to be inaccurate self-assessors [1–3,5]. In a study of medical students, Anthoney [6] found that the least proficient tend to be very inaccurate in self-assessment and to overestimate their ability. This raises the larger issue of patient safety when medical personnel are required to self-assess in practice.

As surgical educators, we need to establish the value of self-assessment in training medical students and residents in skill acquisition [2,4]. Because simulation-based training is emerging as a principal method of acquiring surgical skills, trainees would benefit greatly if they could enter a simulation-based training center and learn certain surgical skills independently. This would offer advantages to both the learner and the residency program in terms of demands on faculty teaching time and program costs. Before such self-directed simulation-based training programs can be implemented, it is important to examine self-assessment skills in a practical setting [1,7]. The purpose of this research was to study the ability of students to self-assess while learning a new surgical skill on a simulation-based laparoscopic trainer. If a learner can accurately self-assess progress in skill acquisition over time, then self-assessment may serve as a useful learning and self-evaluation tool in simulation training. The hypothesis of this study was that the accuracy of novices' self-assessment of technical skills would improve with repetition.

* Corresponding author. Tel.: +1-217-782-8880; fax: +1-217-524-1793. E-mail address: [email protected]

0002-9610/03/$ – see front matter © 2003 Excerpta Medica, Inc. All rights reserved. doi:10.1016/S0002-9610(02)01420-4


Table 1
Self-assessment form

Please answer the following questions after performing the task.
(1 = poor, 2 = fair, 3 = average, 4 = good, 5 = excellent)

1. Performance of left hand     1  2  3  4  5
2. Performance of right hand    1  2  3  4  5
3. Overall speed                1  2  3  4  5
4. Ability to avoid errors      1  2  3  4  5
5. Overall performance          1  2  3  4  5

Fill in what you EXPECT you will do for the following items for the task:
1. Time to complete one repetition of task (sec)
2. Number of times the object was entered with grasper tips closed
3. Number of times the object was hit incorrectly with shaft of grasper
4. Number of times the object was entered correctly but grasper removed without closing
5. Number of times the target hit the side of the box while inside the box
6. Number of times the tools hit each other

Methods

The participants for this study were 21 second- and third-year medical students with no previous exposure to laparoscopic training. Human subjects approval was received from the institutional review board, and participants were informed that involvement was voluntary when informed consent was obtained. A small stipend of a $5.00 lunch certificate was offered for participating. Each student was assigned an individual code number to ensure anonymity.

The Minimally Invasive Surgical Trainer (MIST) was used as the study instrument. The MIST is a computer program with an interface that allows the learner to practice six different skills related to laparoscopic procedures using actual laparoscopic equipment. For simplicity, only one of the six skills was selected for this study. The selected task, "withdraw insert," required the operator to pick up the target with one grasper and place it in the target box without releasing the target. The tip of the free grasper was then withdrawn out of the viewing screen and returned to touch the target, still held by the other grasper inside the target box. This task simulates the eye-hand coordination required when performing laparoscopic surgery.

The MIST allows three levels of difficulty: easy, medium, and hard. These settings vary the size of the target, target box, and graspers. Easy and hard were chosen for this study to ensure greater performance variation by minimizing the possibility of ceiling or floor effects. The MIST also measures and records specific errors related to each task. The students could not see the error scores the computer generated, but they did see color changes on the screen whenever an error was made or when the task was being performed correctly. The students were instructed only in the meaning of the target's color change when the grasper was withdrawn the appropriate distance, a condition required for task completion. They were not educated on the meaning of the other color changes that occurred during the procedure.

The principal investigator demonstrated the task on the MIST three times at the easy setting. The student performed the task three times, alternating hands to complete the set, then filled out the posttask self-assessment form, and repeated this process for a total of 10 sets. The errors specific to "withdraw insert" constituted the error questions on the posttask self-assessment form. The students filled out this form after each trial and were asked to estimate globally their overall speed and accuracy and to indicate the number of errors they made for each of the five forms of error (see Table 1). After 10 sets at easy, the MIST automatically changed the task setting to hard; the students were told this at the beginning of the study. The student then repeated the same process as at the easy setting: again, 10 sets were done with posttask self-assessment forms completed. The students were told that they had 1 hour to complete all 20 sets at easy and hard.

The data were categorized into easy and hard levels of difficulty and examined using two outcome measures: task completion time and number of errors. The five task errors were combined to give an overall number of errors, and time was measured in seconds. Repeated measures analysis of variance was used to establish changes in time and errors over repetitions. Two-factor repeated measures analysis of variance was used to compare estimated and actual results over the 10 trials.
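The data layout implied by this design can be sketched in code: each student contributes one record per repetition, pairing the simulator's actual measurements with the form estimates, and the five task-specific error counts are summed into the single overall error score used in the analysis. The sketch below is purely illustrative; all names and values are hypothetical, not study data.

```python
# Hypothetical data layout for the study design described above.
# All field names and example values are illustrative only.
from dataclasses import dataclass
from typing import List

@dataclass
class TrialRecord:
    student: int                 # anonymized student code number
    repetition: int              # set number, 1..10 within a difficulty level
    level: str                   # "easy" or "hard"
    actual_time_s: float         # completion time recorded by the simulator (s)
    estimated_time_s: float      # student's estimate from the posttask form (s)
    actual_errors: List[int]     # the five task-specific error counts (simulator)
    estimated_errors: List[int]  # the five estimated error counts (form)

def overall_errors(counts: List[int]) -> int:
    """Combine the five task-error counts into one overall error score,
    as done for the error outcome measure."""
    return sum(counts)

rec = TrialRecord(student=1, repetition=1, level="easy",
                  actual_time_s=42.0, estimated_time_s=35.0,
                  actual_errors=[2, 1, 0, 3, 1],
                  estimated_errors=[1, 1, 0, 2, 0])
print(overall_errors(rec.actual_errors))  # 7
```

With records in this shape, comparing estimated against actual values across the 10 repetitions is naturally a two-factor repeated-measures design (factor 1: repetition; factor 2: estimated vs. actual), matching the analysis described above.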

Results

In Fig. 1 the slopes of both lines reflect statistically significant performance improvement through the 10 repetitions. The analysis of time estimates and actual performance times indicated significant improvement in both time estimates and actual times over repetitions (P < 0.0001). Participant performance times improved, and their estimates reflected that improvement. However, participant estimates of time did not become more accurate with task repetition (P = 0.31). Similar results were found in the hard-level data set, not presented in this paper.

Fig. 1. Actual and estimated time to completion over repetitions at easy level.

In Fig. 2 the slopes of both lines represent statistically significant performance improvement through the 10 repetitions. The analysis of error estimates and actual performance errors indicated a significant decrease in both error estimates and actual errors over task repetition (P < 0.0001). In this case, the participants did increase their estimate accuracy with experience (P = 0.01). Similar results were found in the hard data set, except that the students consistently underestimated their number of errors for each task. These data are not presented in this paper.
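The "estimate accuracy" examined here can be operationalized as the per-repetition absolute difference between estimated and actual values: a gap that shrinks across repetitions corresponds to the convergence of the estimate and actual curves. A toy illustration with invented numbers (not study data):

```python
# Toy illustration of estimation accuracy as the per-repetition gap
# between estimated and actual values. Numbers are invented.
def estimation_gaps(estimates, actuals):
    """Absolute estimate-vs-actual difference for each repetition."""
    return [abs(e - a) for e, a in zip(estimates, actuals)]

estimated_errors = [8, 6, 5, 4, 3]   # hypothetical per-set error estimates
actual_errors    = [5, 5, 4, 4, 3]   # hypothetical simulator error totals

gaps = estimation_gaps(estimated_errors, actual_errors)
print(gaps)  # [3, 1, 1, 0, 0] -- a shrinking gap means accuracy improved
```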

Fig. 2. Actual and estimated error over repetitions at easy level.

Comments

It has been shown that learning does occur in simulation-based training with repetition [8–11]. This study also demonstrated that, with repetition, trainees improved their skill level on the simulated skills trainer. This was true for both the time taken to perform the skill and the number of errors made during each skill repetition. The level of difficulty, easy or hard, did not appear to affect the students' ability to learn and improve. Learners' estimates of performance mirrored their actual performance learning curves (time and error). This indicated that the learners had a sense of improving, a finding that has motivational benefits during learning. The accuracy of error estimates also improved with repetition, as reflected in the convergence of the estimate and actual error lines over task repetition. On the other hand, the accuracy of time estimates did not improve with repetition.

Previous research has suggested that trainees are poor self-assessors of knowledge and clinical skills [1–4,7]. Most of those studies reflected single estimates of performance compared with poorly controlled measures of actual performance. The current study used a more objective measure of actual performance and investigated improvement over multiple trials.

Work by Rogers et al [12,13] has shown that feedback plays an important role in computer-assisted learning. What remains to be determined is how much computer-generated feedback is required for optimal error recognition. The end goal is to show improved correction in skills training through error recognition.

The real value of this study was demonstrating that students recognized their own improvement through self-assessment. These results support the goal of implementing more self-directed skills training centers for surgical skills acquisition. The importance of such centers is twofold. Increased student independence in surgical skill acquisition may reduce demands on faculty teaching time and program costs. More significant, the time spent in self-directed skills acquisition in the skills laboratory may translate into an overall reduction in operating room errors.

Acknowledgments Partial funding for this study was provided by the Memorial Medical Center.

References

[1] Ginsburg S, Regehr G, Hatala R, et al. Context, conflict, and resolution: a conceptual framework for evaluating professionalism. Acad Med 2000;75:S6–11.
[2] Gordon MJ. A review of the validity and accuracy of self-assessments in health professions training. Acad Med 1991;66:762–9.
[3] Jankowski J, Crombie I, Block R, et al. Self-assessment of medical knowledge: do physicians overestimate or underestimate? J R Coll Physicians Lond 1991;25:306–8.
[4] Regehr G, Hodges B, Tiberius R, Lofchy J. Measuring self-assessment skills: an innovative relative ranking model. Acad Med 1996;71:S52–4.
[5] Risucci DA, Tortolani AJ, Ward RJ. Ratings of surgical residents by self, supervisors and peers. Surg Gynecol Obstet 1989;169:519–26.
[6] Anthoney TR. A discrepancy in objective and subjective measures of knowledge: do some medical students with learning problems delude themselves? Med Educ 1986;20:17–22.
[7] Henbest RJ, Fehrsen GS. Preliminary study at the Medical University of Southern Africa on student self-assessments as a means of evaluation. J Med Educ 1985;60:66–8.
[8] Gallagher AG, McClure N, McGuigan J, et al. Virtual reality training in laparoscopic surgery: a preliminary assessment of minimally invasive surgical trainer virtual reality (MIST VR). Endoscopy 1999;31:310–13.
[9] Pohl D, Eubanks TR, Kao CC, et al. Synthetic material simulation improves performance of laparoscopic cholecystectomy in an animate model. Seattle, Washington: University of Washington.
[10] Taffinder N, Sutton C, Fishwick RJ, et al. Validation of virtual reality to teach and assess psychomotor skills in laparoscopic surgery: results from randomized controlled studies using the MIST VR laparoscopic simulator. MMVR 1998;124–30.
[11] Smith CD, Farrell TM, McNatt SS, Metreveli RE. Assessing laparoscopic manipulative skills. Am J Surg 2001;181:547–50.
[12] Rogers DA, Regehr G, Howdieshell TR, et al. The impact of external feedback on computer-assisted learning for surgical technical skills training. Am J Surg 2000;179:341–3.
[13] Rogers DA, Regehr G, Yeh KA, Howdieshell TR. Computer-assisted learning versus a lecture and feedback seminar for teaching a basic surgical technical skill. Am J Surg 1998;175:508–10.