Current Status of Simulation-based Training Tools in Orthopedic Surgery: A Systematic Review

ORIGINAL REPORTS

Michael Morgan, BSc,* Abdullatif Aydin, BSc, MBBS,† Alan Salih, BSc, MBBS,‡ Shibby Robati, MRCS, MSc,‡ and Kamran Ahmed, PhD, FRCS†

*School of Medicine, King’s College London, London, United Kingdom; †MRC Centre for Transplantation, Guy’s Hospital, King’s College London, London, United Kingdom; and ‡Department of Orthopedic Surgery, East Sussex Healthcare NHS Trust, Eastbourne, United Kingdom

OBJECTIVE: To conduct a systematic review of orthopedic training and assessment simulators with reference to their level of evidence (LoE) and level of recommendation.

DESIGN: Medline and EMBASE library databases were searched for English language articles published between 1980 and 2016, describing orthopedic simulators or validation studies of these models. All studies were assessed for LoE, and each model was subsequently awarded a level of recommendation using a modified Oxford Centre for Evidence-Based Medicine classification, adapted for education.

RESULTS: A total of 76 articles describing orthopedic simulators met the inclusion criteria, 47 of which described at least 1 validation study. The most commonly identified models (n = 34) and validation studies (n = 26) were for knee arthroscopy. Construct validation was the most frequent validation study attempted by authors. In all, 62% (47 of 76) of the simulator studies described arthroscopy simulators, which also contained validation studies with the highest LoE.

CONCLUSIONS: Orthopedic simulators are increasingly being subjected to validation studies, although the LoE of such studies generally remains low. There remains a lack of focus on nontechnical skills and on cost analyses of orthopedic simulators. ( J Surg Ed ]:]]]-]]]. © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.)

INTRODUCTION

Halsted’s method of “see one, do one, teach one” has traditionally been the preferred method of surgical training.1 Learning as an “apprentice” in the operating room (OR) was, until relatively recently, the principal method of gaining skills at any level of a surgical trainee’s learning curve.1 With increased focus on patient safety, heightened patient expectations, and restrictions on weekly working hours, the Halstedian method of training is now less applicable.2,3 The successful implementation of simulation within the military and the aviation industries has paved the way for simulation-enhanced training in surgery.3,4 The benefits of simulation training in the current climate are recognized by most surgical specialties, and increasing numbers of simulators have been developed as a result.5 Orthopedic simulation has generally lagged behind other specialties, with fewer validated simulators available, though this trend is now changing.5

Surgical simulators may be divided into several categories, including synthetic bench, animal and human cadaver models, and computer-assisted “virtual reality” (VR) simulators. Before these can be used for training and assessment, they must initially undergo a multiparametric assessment of validity.6,7 The aim of this study is to identify all of the orthopedic simulators described in the literature and review their validity.

KEY WORDS: orthopedic surgery, simulation, training, systematic review

COMPETENCIES: Patient Care, Practice-Based Learning and Improvement, Interpersonal and Communication Skills

Correspondence: Inquiries to Abdullatif Aydin, BSc (Hons), MBBS, MRC Centre for Transplantation, 5th Floor Southwark Wing, Guy’s Hospital, King’s College London, London SE1 9RT, United Kingdom; e-mail: [email protected]

METHODS

Search Methods
The EMBASE and MEDLINE databases were searched for articles that described orthopedic training models or simulators between 1980 and March 2016. The search strategy

Journal of Surgical Education © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved. 1931-7204/$30.00. http://dx.doi.org/10.1016/j.jsurg.2017.01.005


Identification: Titles and abstracts from Medline and Embase electronic databases (n = 4430); additional records identified through other sources (n = 8).
Screening: Records after duplicates removed (n = 3050); titles and abstracts screened (n = 3050); records excluded (n = 2748).
Eligibility: Full-text articles assessed for eligibility (n = 302); full-text articles excluded, not about training simulators (n = 226).
Included: Studies included in qualitative synthesis (n = 76).

FIGURE 1. Systematic review algorithm, employing the PRISMA guidelines in the EMBASE and MEDLINE databases.

employed the following terms: “orthopaedic” or “orthopedic” or “arthros*” and “simulat*.” Duplicates were removed and titles and abstracts were screened for relevance, using the PRISMA guidelines8 (Fig. 1).
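As a sanity check, the record counts reported in Figure 1 are internally consistent; a minimal Python sketch (the counts come from Figure 1; the variable names are ours):

```python
# Plausibility check of the PRISMA flow counts reported in Figure 1.
database_records = 4430    # titles/abstracts from MEDLINE and EMBASE
other_sources = 8          # additional records identified through other sources
after_duplicates = 3050    # records remaining once duplicates were removed
excluded_on_screen = 2748  # records excluded at title/abstract screening
excluded_full_text = 226   # full texts excluded as not about training simulators

duplicates_removed = database_records + other_sources - after_duplicates
full_text_assessed = after_duplicates - excluded_on_screen
included = full_text_assessed - excluded_full_text

print(duplicates_removed)  # 1388 duplicates removed
print(full_text_assessed)  # 302 full-text articles assessed
print(included)            # 76 studies in the qualitative synthesis
```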

Selection Criteria
Articles describing an orthopedic training simulator or validating an existing training model/simulator were included. Articles were excluded if they were not in the English language or if they were not complete by their authors’ description. Models and simulators were classified into the following categories: bench, VR, cadaver, animal model, and augmented reality. These categories have, in places, been expanded to include details about the type of bench model, such as a Sawbones product, or the use of an additional system such as motion analysis.

Data Extraction
After the initial articles were screened using their titles and abstracts, the remaining articles were examined in their entirety. Articles were included if they described an orthopedic simulator used for training. If the reference list of an article contained a study that was not found in the search result but appeared relevant to this article, that study was subjected to the same selection criteria.

Data Analysis
The outcomes for the validation studies were selected and reported. Definitions of validity were based on the definitions of van Nortwick et al.9 (Fig. 2). Some studies did not explicitly state the type of validation study undertaken; in these cases, they were classified according to the definitions below. Face and content validity assess the realism and

Face Validity – Degree to which the simulator resembles clinical scenarios, i.e., realism
Content Validity – Whether the domain or criteria attempting to be measured is actually being measured by the assessment tool or simulator
Construct Validity – Capability of the simulator to distinguish between different levels of expertise
Transfer Validity – A gauge of whether the simulator has the effect it proposes to have, i.e., will the simulator improve performance whilst operating, as a consequence of learning

FIGURE 2. Validation definitions.9

Journal of Surgical Education · Volume ]/Number ] · ] 2017

A. Levels of evidence (LoE)
1a Systematic reviews (meta-analysis) containing at least some trials of level 1b evidence, in which results of separate, independently conducted trials are consistent
1b Randomized controlled trial of good quality and of adequate sample size (power calculation)
2a Randomized trials of reasonable quality and/or of inadequate sample size
2b Nonrandomized trials, comparative research (parallel cohort)
2c Nonrandomized trial, comparative research (historical cohort, literature controls)
3 Nonrandomized, noncomparative trials, descriptive research
4 Expert opinions, including the opinion of Work Group members

B. Levels of recommendation (LoR)
1 Based on one systematic review (1a) or at least two independently conducted research projects classified as 1b
2 Based on at least two independently conducted research projects classified as level 2a or 2b, within concordance
3 Based on one independently conducted research project level 2b, or at least two trials of level 3, within concordance
4 Based on one trial at level 3 or multiple expert opinions, including the opinion of Work Group members (e.g., level 4)

FIGURE 3. (A) Levels of evidence (LoE) and (B) levels of recommendation (LoR).10

content-suitability, respectively, of the simulators. They are obtained through subjective questionnaires and tend to confer the lowest level of evidence (LoE). Construct and transfer validity are more objective measures and confer a higher LoE. An LoE (Fig. 3A)10 and a level of recommendation (LoR) (Fig. 3B)10 were awarded to each study and model, respectively, using a modified educational Oxford Centre for Evidence-Based Medicine (OCEBM) classification system, as adapted by the European Association of Endoscopic Surgery,10 where a recommendation of 1 is the highest and 4 the lowest.
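As an illustration of how the Figure 3B criteria combine, the mapping from a model’s validation-study LoEs to its LoR can be sketched as follows (a hypothetical sketch assuming concordant results; the function name and input format are ours, not part of the OCEBM/EAES classification):

```python
def level_of_recommendation(loes):
    """Assign a level of recommendation (LoR) to a simulator from the
    LoE grades of its validation studies, following the Figure 3B
    criteria. `loes` is a list of grade strings such as ["1b", "2a"].
    Concordance between studies is assumed."""
    counts = {grade: loes.count(grade) for grade in set(loes)}
    # LoR 1: one systematic review (1a) or at least two 1b trials.
    if counts.get("1a", 0) >= 1 or counts.get("1b", 0) >= 2:
        return 1
    # LoR 2: at least two studies at level 2a or 2b.
    if counts.get("2a", 0) + counts.get("2b", 0) >= 2:
        return 2
    # LoR 3: one level 2b study, or at least two level 3 trials.
    if counts.get("2b", 0) >= 1 or counts.get("3", 0) >= 2:
        return 3
    # LoR 4: one level 3 trial or expert opinion only.
    return 4
```

For example, a simulator validated by one 2a and one 2b study would receive LoR 2, whereas a single level 3 study yields only LoR 4.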

RESULTS

Description of Studies
From the original 4430 articles retrieved from the databases, 76 studies11-86 described orthopedic simulators and met the inclusion criteria (Fig. 1). A large number of promising articles were excluded because they were early designs of simulators, generally in engineering or computer science journals. Of the 76 articles selected, 47 (62%) described at least 1 validation study. Table 1 shows an overview of the individual simulators and their manufacturers.

Description of Orthopedic Models
Of 76 articles, 47 (62%) described arthroscopy simulators. These were also the articles with the greatest number of validation studies. Knee arthroscopy simulation studies were the most abundant (n = 34), followed by shoulder arthroscopy (n = 15). Spine simulator studies were the third most frequent (n = 12). One particular simulator, the ArthroMentor, is occasionally referred to as the Insight Arthro VR in the literature. In this article, they are tabulated together.

Knee Arthroscopy
The most popular knee arthroscopy model used was the Sawbones knee bench model (Pacific Research Laboratories, Washington, USA), described in 4 separate studies. Variations on the Sawbones model knee accounted for 8 studies in total. Furthermore, 23 studies described VR simulators for knee arthroscopy, 14 of which were developed or partly developed by academic institutions.

Knee Replacement
Two studies were retrieved: the first described a VR simulator46 and the second described a knee replacement simulator in which a cadaver model was used with computerized analysis for assessment.45 The system is able to map the position of the femur and tibia of the cadaver as well as the participants’ surgical instruments.45

Hip Arthroscopy
One study described a hip arthroscopy simulator, in which the authors used a Sawbones hip simulator in conjunction with a PATRIOT motion tracking system. The combination of the 2 allowed the “total path length of the subject’s hands,” “the total number of hand movements,” and the “time taken to complete the task” to be measured.47

Shoulder Arthroscopy
Of the 15 shoulder arthroscopy simulator studies, 13 used VR simulators. There were only 5 unique simulators between them, noticeably fewer than the knee arthroscopy simulator studies. The 5 unique simulators include version


TABLE 1. Overview of Simulators Name of Model (Institution/Manufacturer) Knee arthroscopy ArthroMentor/Insight Arthro (Simbionix, Airport City, Israel) (University of Rochester Medical Center, Rochester, NY) Sawbones Model Knee (Pacific Research Laboratories, Washington, USA) Cadaveric Knee (Dept. of Orthopedic Surgery, Mayo Clinic, Rochester, USA) ArthroSim (Touch of Life Technologies, Colorado, USA) Model Knee 1517 (Pacific Research Laboratories, Washington, USA) Model Knee 1414-1/1413-1 (Pacific Research Laboratories, Washington, USA) Sawbones Knee “unspecified” (Pacific Research Laboratories, Washington, USA) modified by (Lawson Health Research Institute, Ontario, Canada) Dry Arthroscopy Knee 1401 (Sawbones, Malmo, Sweden) ArthroS (VirtaMed, Zurich, Switzerland) (Chinese University of Hong Kong, Hong Kong, China) (Department of Computer Science and Engineering, Chinese University of Hong Kong, Hong Kong, China) VR-AKS (American Board of Orthopedic Surgery, North Carolina, USA) Porcine Knee (University of Manitoba, Winnipeg, Canada) SKATS (University of Sheffield, Sheffield, UK) (CRIM, Scuola Superiore Sant'Anna, Pisa, Italy) with modified Sawbones (Sawbones Europe, Malmo, Sweden) and unspecified soft tissue from Limbs and Things (Limbs and Things Ltd, Bristol, UK) Kneetrainer 1 (SEIDI, São Paulo, Brazil) ArthroS V1.2 (VirtaMed, Zurich, Switzerland) The TKA Serious Game (University of Toronto, Toronto, and University of Ontario Institute of Technology, Ontario, Canada) (Computer Vision Laboratory, ETH Zurich, Switzerland) Procedicus VA Knee (Mentice Corp, Gothenburg, Sweden) (Chung Yuan Christian University, Taoyuan City, Taiwan) Knee phantom (Orthopedic Research Center, Amsterdam, and Delft University of Technology, Mekelweg, Netherlands) Bovine Knee Arthroscopy (Marmara University School of Medicine, Istanbul, Turkey) (Fraunhofer Institute for Computer Graphics, Darmstadt, Germany) Knee replacement (Institute of Orthopedic Research and Education, Texas, USA)
(Sejong University, Seoul, South Korea) Hip arthroscopy Sawbones Hip “unspecified” (Sawbones Europe, Malmo, Sweden)


Fidelity

Type of Model

Describing Study Akhtar et al.,11 Chang et al.,12 Jacobsen et al.,13 and Rebolledo et al.14 Butler et al.15

H

VR

H

Cadaver

L

Bench

H

Cadaver

H

VR

H

Bench

Camp et al.,19 Cowan et al.,20 and Cannon et al.21,22 Chang et al.12

H

Bench

Dwyer et al.23

H

Bench

Escoto et al.24

H

Bench

Ferguson et al.25

H

VR

H H

VR VR

Fucentese et al.,26 Rahm et al.,27 and Stunt et al.28 Heng et al.29 Heng et al.30

H

VR

Mabrey et al.31 and Poss et al.32

H

Animal

Martin et al.33

H H

VR VR

McCarthy et al.34 Megali et al.35

L H H

Bench VR VR

Peres et al.36 Roberts et al.37 Sabri et al.38

H H

VR VR

Spillmann et al.39 Strom et al.40

H

VR

Tsai et al.41

H

Bench

Tuijthof et al.42

L

Animal

Unalan et al.43

H

VR

Ziegler et al.44

H

Cadaver

Conditt et al.45

L

VR

Jun et al.46

L

Bench

Pollard et al.47

Butler et al.,15 Howells et al.,16 Jackson et al.,17 and Tashiro et al.18 Camp et al.19


TABLE 1 (continued) Name of Model (Institution/Manufacturer) Shoulder arthroscopy ArthroMentor/Insight Arthro (Simbionix, Airport City, Israel) Alex Shoulder Professor (Sawbones Europe, Malmo, Sweden) Procedicus arthroscopy (Mentice Corp, Gothenburg, Sweden) ArthroS (VirtaMed, Zurich, Switzerland) ArthroS V1.2 (VirtaMed, Zurich, Switzerland) Basic skills Casting Simulator—Sawbones model forearm “unspecified” (Pacific Research Laboratories, Washington, USA) with thermocouples (Omega, Stamford, CT) Turkey Wing Microvasculature for microsurgery (Stony Brook University Medical Center, New York, USA) Burring Simulator (Chung Yuan Christian University, Chung Li, Taiwan) with haptics (SensAble Technologies, Massachusetts, USA) Drilling Simulator (Human Machine Symbioses Lab, Arizona State University and Banner Good Samaritan Medical Centre, Arizona, USA) Bone sawing simulator (Institute of Biomedical Manufacturing and Life Quality Engineering, Shanghai, China) Amputation Amputation Simulator (Chung Yuan Christian University, Chung Li, Taiwan) with haptics from SensAble Technologies, Massachusetts, USA Fractures Hip fracture fixation (University of Auckland, Auckland, New Zealand) Distal radial fracture simulator (Cork University, Cork, Ireland) TraumaVision (Swemac, Linkoping, Sweden) Sawbones Ulna “1017” (Pacific Research Laboratories, Washington, USA) Ulna Simulator (University of Calgary, Calgary, Canada) with haptics (SensAble Technologies, Massachusetts, USA) Sawbones model forearm with fracture modification (Pacific Research Laboratories, Washington, USA) Sawbones Forearm “unspecified” (Pacific Research Laboratories, Washington, USA) Sawbones Proximal Femur (Pacific Research Laboratories, Washington, USA) Hip Fixation CAOSS (University of Hull, Yorkshire, UK) Touch Surgery IM femoral nail (TouchSurgery, London, UK) Sawbones Ankle “1518” (Pacific Research Laboratories, Washington, USA) with modification (University of Iowa, Iowa, USA) Spine Lumbar discectomy simulator
(Leipzig University of Applied Sciences, Leipzig, Germany) Minimally invasive spinal surgery (Department of Medical Device Engineering, Upper Austria University of Applied Sciences, Wels, Austria)

Fidelity

Type of Model

Describing Study Andersen et al.,48 Dunn et al.,49 Martin et al.,50 Martin et al.,51 Rebolledo et al.,14 and Waterman et al.52 Ferguson et al.25 and Howells et al.53

H

VR

L

Bench

H

VR

H H

VR VR

Gomoll et al.,54 Gomoll et al.,55 Henn et al.,56 Pedowitz et al.,57 and Srivastava et al.58 Rahm et al.59 Roberts et al.37

H

Bench

Brubacher et al.60

L

Animal

Grossman et al.61

L

VR

Tsai and Hsieh62

H

VR

Vankipuram et al.63

H

VR

Yanping et al.64

H

VR

Hsieh et al.65

L

VR

Blyth et al.66

H

Bench

Egan et al.67

H H

VR Bench

Froelich et al.68 and Pedersen et al.69 LeBlanc et al.70

H

VR

LeBlanc et al.70

H

Bench

Mayne et al.71

H

Bench

Moktar et al.72

L

Bench

Nousiainen et al.73

H L

VR VR

Rambani et al.74 Sugand et al.75 and Sugand et al.76

H

Bench

Yehyawi et al.77

H

Bench

Adermann et al.78

H

Bench (Hybrid with tracking)

Fuerst et al.79


TABLE 1 (continued) Name of Model (Institution/Manufacturer) Cervical lateral mass screw simulator (Emory University, Georgia, USA) Sawbones Cervical Vertebrae “unspecified” (Pacific Research Laboratories, Washington, USA) Pedicle screw insertion (University of Toronto, Ontario, Canada) (Department of Orthopedic Surgery, Orlando Regional Medical Centre, Florida, USA) Sawbones pelvis (Pacific Research Laboratories, Washington, USA) (Orthopedic Biomechanics Laboratory, University of Toronto, Ontario, Canada) (Orthopedic and Traumatology Department, Michallon Hospital, Grenoble, France) (MedStar Union Memorial Hospital, St Joseph Medical Centre and The Johns Hopkins University School of Medicine, Maryland, USA) Perk Station (Queen's University, Ontario, Canada and Johns Hopkins University, Maryland, USA)

Fidelity

Type of Model

Describing Study

H

Cadaver

Gottshalk et al.80

L

Bench

Gottshalk et al.80

H

Podolsky et al.81

L

Cadaver + VR Bench

Riehl and Widmaier82

L

Bench

Riehl and Widmaier82

L

VR

Rush et al.83

L

VR

Tonetti et al.84

H

Cadaver

Tortalani et al.85

L

VR

Yeo et al.86

H, high; L, low; VR, virtual reality.

1 and version 1.2 of the ArthroS Shoulder and Knee arthroscopy simulator.

Basic Skills
Five articles described basic skills simulators for casting, microsurgery, burring, drilling, and bone sawing. All 5 simulators were developed in higher education institutions.

Amputation
Only 1 study described an amputation simulator, which used computed tomography images as a base for the VR environment, simulating multiple amputation sites.65

Fracture Fixation
Eleven unique fracture fixation simulators were described in the literature, 5 of which used Sawbones bench models as their basis. Nousiainen et al.,73 in particular, created a heavily modified version of the standard model 1518 ankle. In contrast, the literature also describes 5 examples of VR simulators, with Froelich et al.68 and Pedersen et al.69 both using the Swemac TraumaVision VR simulator.

Validation Studies
Of the 76 articles, 47 described at least 1 validation study. Of these, 22 studies attempted to prove face validity, 28 attempted to show construct validity, 16 attempted to demonstrate some degree of transfer validity, and only 2 attempted to show content validity.
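Because a single article may attempt more than one type of validation, these per-type counts sum to more than 47; a quick sketch with the counts from the text:

```python
# Validation attempts by validity type, as reported in the review.
attempts = {"face": 22, "construct": 28, "transfer": 16, "content": 2}
validating_articles = 47

total_attempts = sum(attempts.values())
print(total_attempts)  # 68 attempts across 47 validating articles
# On average, each validating article attempted roughly 1.4 validity types.
print(round(total_attempts / validating_articles, 1))  # 1.4
```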

Knee Arthroscopy
Table 2 shows the 26 knee arthroscopy validation studies that the search provided. Both of the transfer validation studies for the Sawbones knee simulator were positive. The transfer validation study of the Procedicus knee simulator was not able to demonstrate a positive outcome; the authors attributed this to the lack of time given to the participants in the training environment.40

Spinal Surgery
All but 2 of the studies looked at simulators for screw insertion at various levels of the spine. Adermann et al.,78 in contrast, developed a simulator for lumbar discectomies, whereas Fuerst et al.79 developed a minimally invasive spine surgery simulator.

Shoulder Arthroscopy
There were 16 validation studies of shoulder arthroscopy simulators (Table 2). The Procedicus VR simulator was validated in 6 studies across 5 separate articles. A more recent article by Gomoll et al.55 reaffirmed construct validity of the Procedicus through a comparison of a new cohort of participants against the cohort from their earlier article. Both Martin et al.51 and Gomoll et al.55 repeated validation studies on their respective simulators in subsequent articles. Three shoulder arthroscopy validation studies were randomized and received an LoE greater than 2b, with the study by Waterman et al.52 receiving the highest LoE at 1b.


TABLE 2. Validation Studies for Knee, Shoulder, and Hip Arthroscopy Simulators Name of Model (Institution/ Manufacturer) Knee arthroscopy Sawbones Model Knee (Pacific Research Laboratories, Washington, USA)

Type of Model Bench

Participants Study (Year)

Type of N Demographics Validity

Howells et al.16 Jackson et al.17

20 Junior trainees

Tashiro et al.18

30 12 Novices, 12 intermediates, and 6 experts 40 29 Residents, 5 fellows and 6 staff surgeons 5 Experts

19 Residents

Sawbones Model Knee 1414-1 /1413-1 (Pacific Research Laboratories, Washington, USA

Bench

Dwyer et al.23

Sawbones Knee “unspecified” (Pacific Research Laboratories, Washington, USA) modified by (Lawson Health Research Institute, Ontario, Canada)

Bench

Escoto et al.24

Knee phantom (Orthopedic Research Center, Amsterdam and Delft University of Technology, Mekelweg, Netherlands)

Bench

Tuijthof et al.42

28

ArthroSim (Touch of Life Technologies, Colorado, USA)

VR

Cannon et al.22

18

Cannon et al.21

48

McCarthy et al.34

33

SKATS (University of Sheffield, Sheffield, UK)

13

VR

23

3

Qualitative

Yes, OCAP score (p = 0.0007) and global rating scale (p = 0.0011) showed improvement Transfer Yes, there was a learning curve for trainees at every level. The experienced resident group exhibited a plateau in its learning curve by the 21st trial. Construct Yes, the more experienced groups performed better than the less-experienced groups (p = 0.024 and p = 0.003, respectively) Construct Yes, post hoc analysis showed significant difference in global ratings and checklist scores between the 3 groups of participants (p < 0.5) Transfer

Face

LoE LoR 2a

2

2a 2b 3


Average Likert score of 4.16/5 in 5 different 2c measures, indicating a high level of realism by expert surgeons Novices Construct Reduction in time to completion of task as well as tool path length and hand path length. However, statistical significance unclear 20 Intermediates Face Yes, majority agreed that PASSPORT can be used to 2a (surgeons) and train knee joint inspection and navigation (93%) 8 experts Construct Yes, median task time for all trials was faster in the (residents) surgeons than residents. All task differences in time were significant (p ≤ 0.01) 6 PGY 1 Construct Yes, time to completion is less in experts (p = 0.006) 2b residents, 6 PGY 5 residents, and 6 attendees 48 Year-3 Transfer Yes, experienced group had a better procedural 2a residents checklist score (p = 0.031) but not significant in global impression criteria (p = 0.061). Visualization skills were not performed better by experimental group Surgeons Face Yes, “strong agreement with all the statements 2a regarding realism except the realism of the physical limb model” on a 4-point Likert scale 5 Junior (5-50 Construct Yes, surgeons with the most experience completed the KAs), task faster than the other 2 groups (p = 0.004 and 7 intermediate p = 0.01). The path length of the arthroscope was (51-100 KAs), also shorter in the experienced group (p = 0.015) 11 fellows (1000 KAs) Novices Transfer Yes, nonsurgeon novices demonstrated significant improvements in task completion time (χ2, p =

4

4

3

2

3


TABLE 2 (continued) Name of Model (Institution/ Manufacturer)

ArthroS (VirtaMed, Zurich, Switzerland)

Type of Model

VR

ArthroS V1.2 (VirtaMed, Zurich, Switzerland) VR

Participants Study (Year)

Type of N Demographics Validity

Stunt et al.28

27 9 Novices (0 Face KAs), 9 intermediates (1-59 KAs), and 9 experts Construct (≥60 KAs)

Fucentese et al.26

68 33 Novices Face (<20 KAs), 19 intermediates (21-99 KAs), and 16 experts Construct (>100 KAs) 60 30 Novices, 20 Face intermediates, and 10 experts

Roberts et al.37

Construct Journal of Surgical Education  Volume ]/Number ]  ] 2017

Procedicus VA Knee (Mentice Corp, Gothenburg, Sweden)

VR

Strom et al.40

28 Novices (medical Transfer students

ArthroMentor/Insight Arthro (Simbionix, Airport City, Israel)

VR

Akhtar et al.11

37 Not stated

Face Construct

Jacobsen et al.13

Rebolledo et al.14 Porcine Knee (University of Manitoba, Winnipeg, Canada)

Animal Martin et al.33

Qualitative 0.001) and arthroscopic path length (p = 0.05), over 5 wk Partly, a questionnaire was given after 3 tasks. 4/9 questions for face validity scored under 7/10, with instruments probing and instruments cutting scoring 1, by expert and intermediate groups Yes, median task time to completion was significantly different between experts and beginners but not between other pairings Yes, face validity questionnaires to participants after 2 exercises scored 5.9/7 for overall realism, and 3.9/7 for tactile sensation scored, 5 (71%) being seen as acceptable Yes, experts were significantly faster and had shorter camera lengths Yes, the simulator demonstrated face validity about realism of external appearance (93.5%) and the instrumentation (93.6%); however, the realism of the tissues was not supported (51.6%) Yes, construct validity was demonstrated on both the ASSET global rating scale (p < 0.00003) and time to complete task (p < 0.001) between the 3 groups No, “one hour of training in different visual-spatial contexts was not enough to improve performance in virtual arthroscopy tasks” Yes, participants agreed that the simulator accurately reflected knee arthroscopy Yes, path length covered by arthroscope (p = 0.02) and path length travelled by the probe (p = 0.028) The pass-or-fail standard for the simulator was set at a z-score of 15.5 points. 66% of novices passed with only one experienced surgeon failing, a demonstration of construct validity

LoE LoR

2b

2

2a

2a

3

2a

3

2a

2

26 13 Novices Construct 2b (interns and residents) + 13 experts (attendees) 14 Junior residents Transfer Yes, randomized group that trained with the 2a simulator showed increased cartilage grading index scores and decreased time to completion when performing arthroscopy on a cadaver 15 Face Yes, there was a high level of concordance between 4 human and porcine knees among the participants

4


11 Residents, 1 fellow, and 3 attendees 17 Experts

Face

Transfer

Knee Trainer 1 (SEIDI, São Paulo, Brazil)

Bench

Peres et al.36

Shoulder arthroscopy ArthroMentor/Insight Arthro (Simbionix, Airport City, Israel)

VR

Rebolledo et al.14

14 Junior residents

Martin et al.50

19

Martin et al.51

27

Waterman 22 et al.52

Procedicus arthroscopy (Mentice Corp, Gothenburg, Sweden)

VR

Gomoll et al.54

43

Gomoll et al.55

10

Henn et al.56

17

Pedowitz et al.57

78

Srivastava et al.58

35

Yes, face validity was demonstrated for both 4 meniscectomy and ACL reconstruction with expert ratings of 64.7% and 82.4%, respectively

Yes, randomized group that trained with the simulator showed increased cartilage grading index scores and decreased time to completion when performing arthroscopy on a cadaver 15 Novices Construct Yes, experts completed the task faster than the (residents) and less-experienced group (p = 0.016) 4 experts Transfer Yes, “The task performance time on the simulator (attendees) correlated strongly with the performance on the cadaveric model (r = 0.736, p < 0.001)” Mixed resident Transfer Yes, “Every additional postgraduate year” resulted group (years in a 16-second decrease in task completion time 1-5) (p < 0.005) Trainees Transfer Yes, the authors' simulator cohort improved their live diagnostic shoulder arthroscopy times by 80 s and improved probe distance by 41 mm compared to the control cohort. Their ASSET safety scores were also significantly better than the control cohort. 8 Novices, 11 Construct Yes, time to complete the task was 62% less in the PGY 2-3 expert group, “path length and hook collisions residents, 14 were more than halved” in the expert group, and PGY 4-5 the “average probe length more than doubled” in residents, and the expert group 10 experts Residents Construct Group compared with group of moderate experience from previous study—no statistically significant differences Transfer Yes, subjects significantly improved their performance on the simulator retesting 3 y after initial evaluation Novices (year Transfer Yes, experimental group with VR training showed 1 medical improved scores from baseline in cadaver. students) Completion time was significantly improved in final test (p < 0.05) 35 Novices, 22 Construct Yes, more experienced surgeons had a “shorter and intermediates, more consistent” time distribution than the other and 21 experts groups 21 Novices, 5 intermediates, and 9 experts Construct Yes, the more experienced surgeons were quicker and more accurate in the hook manipulation and scope navigation exercises, respectively

2a

4

2

2b

2b 1b

2b

2b

2a

2b 2b

2


TABLE 2 (continued) Name of Model (Institution/ Manufacturer)

Type of Model

ArthroS (VirtaMed, Zurich, Switzerland)

VR

ArthroS V1.2 (VirtaMed, Zurich, Switzerland) VR

Alex Shoulder Professor (Sawbones Europe, Malmo, Sweden) Journal of Surgical Education  Volume ]/Number ]  ] 2017

Hip arthroscopy Sawbones Hip “unspecified” (Sawbones Europe, Malmo, Sweden)

Participants Study (Year)

Type of N Demographics Validity

Qualitative

LoE LoR

Rahm 51 25 Novices and Face et al.59 26 experts

2b

3

Roberts et al.37

4

3

Bench

Howells et al.53

Bench

Pollard et al.47

Yes, overall impression of the realism was rated “Good” (6/7). The overall training potential was also rated “Good.” Construct Yes, experts were significantly faster than novices in completing both diagnostic and therapeutic exercises (p < 0.0001 for both). Experts had a significantly shorter camera path length in the diagnostic task (p < 0.0001) but not so in the therapeutic test. 60 30 Novices, 20 Face Yes, the simulator demonstrated realism of the intermediates, external appearance (93.5%) and the and 10 experts instrumentation (93.6%); however, the realism of the tissues was not supported (51.6%). Construct Yes, construct validity was demonstrated on both the ASSET rating scale (p < 0.00003) and task completion time (p < 0.001) between the 3 groups. 6 Consultant Transfer Yes, every parameter showed a learning curve. orthopedic There was skill loss after 6 mo. surgeons (naïve to Bankart repair and without a shoulder fellowship) 20 Orthopedic trainees without fellowships

KA, knee arthroscopy; LoE, level of evidence; LoR, level of recommendation; VR, virtual reality.

Transfer

2a

2b

3

Yes, baseline performances were improved in each 1b group significantly. The less-experienced participants had significantly worse initial diagnostic sessions (p ¼ 0.05) but improved on their second round.

3

Journal of Surgical Education  Volume ]/Number ]  ] 2017

TABLE 3. Validation Studies of Basic Skills and Fracture Fixation Simulators

Name of Model (Institution/Manufacturer) | Type | Study (Year) | Participants | Validity | Result | LoE | LoR

Basic skills:
Drilling simulator (Human Machine Symbioses Lab, Arizona State University and Banner Good Samaritan Medical Centre, Arizona, USA) | VR | Vankipuram 201063 | 23: 6 novices, 11 intermediates (residents), and 6 experts | Construct | Yes, task completion time was greater in the expert group, owing to the experts taking their time to familiarize themselves with a new environment; however, the number of errors made by the expert group by the 4th trial was significantly lower than in both the novice and resident groups | 2b | 3
 | | | | Transfer | Yes, testing on a bone model after use of the simulator suggests that skills learned on the simulator can be transferred | |
Sawbones model forearm "unspecified" (Pacific Research Laboratories, Washington, USA) with thermocouples (Omega, Stamford, CT) | Bench | Brubacher 201560 | 24: 12 interns, 9 residents, and 3 attendees | Face | Yes, face validity was established by discriminating between "good" and "bad" techniques and measuring the mean maximum temperatures between them, with the "good" technique producing significantly lower temperatures | 2b | 3
 | | | | Construct | Yes, the difference in temperature between groups provided good evidence of construct validity, with novices producing the highest maximum temperatures | |
Bone sawing simulator (Institute of Biomedical Manufacturing and Life Quality Engineering, Shanghai, China) | VR | Yanping 201464 | 25: 16 novices and 9 experienced surgeons | Face | Yes, 94% of all participants scored the simulator at least 7/10 on three metrics: "safe force learning," "stable hand controlling," and "overall performance" | 2b | 3
 | | | | Construct | Yes, significant differences between the surgeons' and novices' maximal acceleration and haptic force use | |
 | | | | Transfer | Yes, the experimental group showed a significant difference from the control group in maximal acceleration when performing a Le Fort I osteotomy, suggesting a positive training effect | |

Fracture fixation:
Sawbones Ulna "1017" (Pacific Research Laboratories, Washington, USA) | Bench | LeBlanc 201370 | 22: 12 junior and 10 senior residents | Face | Yes, the orthopedic surgeons agreed that the simulator would help with the introduction of surgical skills | 2a | 3
 | | | | Construct | Yes, the senior surgeons performed better on all metrics, including a checklist, Global Rating Score, and time | |
Ulna simulator (University of Calgary, Calgary, Canada) with haptics (SensAble Technologies, Massachusetts, USA) | VR | LeBlanc 201370 | 22: 12 junior and 10 senior residents | Face | Yes; however, "participants would use the Sawbones simulator preferentially" | 2a | 3
 | | | | Construct | Yes, the senior surgeons performed better than the juniors on all metrics except for Global Rating Score | |
Sawbones Ankle "1518" (Pacific Research Laboratories, Washington, USA) with modification (University of Iowa, Iowa, USA) | Bench | Yehyawi 201377 | 12: 7 junior and 5 senior residents | Construct | Yes, senior residents outperformed junior residents by 1 minute 32 seconds and by 311 m in cumulative hand motion during the fracture reduction | 2b | 3
Distal radial fracture simulator (Cork University, Cork, Ireland) | Bench | Egan 201367 | 55: 19 registrars, 27 specialist registrars, and 9 consultants | Face | Yes, 78% of those questioned agreed that the model was a good approximation to a real reduction | 4 | 4
Sawbones Forearm "unspecified" (Pacific Research Laboratories, Washington, USA) | Bench | Moktar 201472 | 3 medical students, 3 residents, 3 fellows, and 1 orthopedic technologist | Content | Yes, "the casting simulation model and evaluation instrument is a reliable assessment of casting skill" | 2c | 4
TraumaVision (Swemac, Linkoping, Sweden) | VR | Pedersen 201469 | 20: 10 novices and 10 experts | Construct | Yes, the score for the novice group was 30% and the score for the expert group was 76% after three attempts | 2b | 3
Touch Surgery IM Femoral Nail (Touch Surgery, London, UK) | VR | Sugand 201575 | 49: 39 novices and 10 experts | Face, Content | Yes, face and content validity were demonstrated through the use of questionnaires | 2b | 3
 | | | | Construct | Yes, significant difference between the median expert score (72-77.55%) and the novice score (41-60%) | |
Sawbones model forearm with fracture modification (Pacific Research Laboratories, Washington, USA) | Bench | Mayne 201671 | 20: 10 junior and 10 senior residents | Face | Yes, a questionnaire demonstrated face validity | 2b | 3
 | | | | Construct | Yes, senior residents displayed significantly higher OSATS and GRS scores (p < 0.001) | |
Hip fracture fixation (University of Auckland, Auckland, New Zealand) | VR | Blyth 200766 | 10: 3 medical students, 4 junior trainees, and 3 fellows | Face | Yes, the participants judged that the simulator gave a realistic view of the operation, with a median score of 8.2/10 | 4 | 4

LoE, level of evidence; LoR, level of recommendation; VR, virtual reality.

Basic Skills

There were a total of 7 validation studies of basic skills simulators (Table 3). Each of the 3 specific simulators validated was given a LoR of 3.

Fracture Simulators

Of the 14 validation studies of fracture simulators found by the search (Table 3), 4 were performed by LeBlanc et al.70 on 2 simulators. In both of the face validation studies, surgeons with mixed levels of experience agreed that the simulators would be useful in training.70 The construct studies measured checklist scores, Global Rating Score, and time to completion; experienced surgeons generally performed better, except on the Global Rating Score in the VR simulator.70 The construct validation study of the Sawbones ankle model 1518 demonstrated a significant difference between senior and junior surgeons in one parameter, cumulative hand motion: the total distance travelled by the surgeon's hands during the procedure, with a lower value indicating greater proficiency in the procedure.77 The highest LoR for a fracture simulator did not surpass 3.
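Metrics like cumulative hand motion come directly from motion-tracking data: the tracker samples hand positions at a fixed rate, and the metric is the sum of straight-line distances between consecutive samples. A minimal sketch of that computation (the function name and sample trace are illustrative, not taken from any study above):

```python
import math

def cumulative_hand_motion(positions):
    """Total distance travelled by the hand: the sum of Euclidean
    distances between consecutive tracked 3-D positions (mm here).
    A lower total is read as greater proficiency."""
    return sum(math.dist(a, b) for a, b in zip(positions, positions[1:]))

# Hypothetical trace: 10 mm along x, then 10 mm along y.
trace = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0), (10.0, 10.0, 0.0)]
print(cumulative_hand_motion(trace))  # 20.0
```

Real trackers sample at tens to hundreds of hertz, so a procedure yields thousands of points; the same summation applies unchanged.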

TABLE 4. Validation Studies for Spine Simulators

Name of Model (Institution/Manufacturer) | Type | Study | Participants | Validity | Result | LoE | LoR
(Leipzig University of Applied Sciences, Leipzig, Germany) | Bench | Adermann et al.78 | 12 expert surgeons | Face | Yes, the experts agreed that the simulator was realistic visually and haptically | 4 | 4
(Department of Orthopedic Surgery, Orlando Regional Medical Centre, Florida, USA) | Bench | Riehl and Widmaier82 | 16: 6 novices, 6 intermediates (residents), and 4 experts | Face | Yes, a questionnaire revealed that the simulator provided a realistic approximation of the operation (9.2/10) | 2b | 3
 | | | | Construct | Yes, 2 of the 4 measurements showed differences between novice and experienced groups: a time difference of 5 min 40 s between groups, and a 50% rate of cortical breaches in the novice group whereas none occurred in the experienced group | |
(University of Toronto, Ontario, Canada) | Cadaver + VR | Podolsky et al.81 | 28: mixed resident group (orthopedics and neurosurgery) | Face | Yes, there was strong agreement between senior and junior groups that the simulator was "beneficial as an educational tool" | 4 | 4

LoE, level of evidence; LoR, level of recommendation; VR, virtual reality.

Hip Arthroscopy

The search returned a single validity study of a hip arthroscopy simulator (Table 2). Pollard et al.47 demonstrated transfer validity, through a high-quality randomized trial, in 20 participants using the Sawbones hip model. This validation study received a 1b LoE, the highest of all studies in this review, shared only with the study by Waterman et al.52 on the ArthroMentor for shoulder arthroscopy.


Spine Simulators

Four validation studies of spine procedure simulators were found through the search (Table 4). Three were face validation studies, which receive the lowest LoE as they are measured by expert opinion. Riehl and Widmaier82 attempted a construct validation study with their low-fidelity bench simulator using 4 parameters: wire positioning in the bone, time to complete the task, presence of a cortical breach, and number of times the wire required removal.82 Their simulator received the highest LoR of any spine simulator, at 3.

DISCUSSION

There is no official list of validation definitions for surgical simulators, although the consensus guidelines by Carter et al.10 provide a robust framework. These guidelines are often not implemented, and different terms are used to describe similar studies across articles. An interspecialty guideline for definitions of validity would prove useful, as would authors explicitly stating their validation studies (which has become more common in recent articles).

Study Design

Of the 76 studies, 29 (38%) were exclusively descriptive and did not attempt any type of validation study. This rate is significantly lower than that seen among commercially available simulators across all specialties (94%).87 This is potentially because a large portion of the simulators in this review were developed in academic institutions rather than commercially, where priorities may differ. Further efforts should be made to ensure that validation studies become the norm in simulator design.

Twenty-five unique knee arthroscopy procedural simulators were described in the literature, compared with only 5 unique shoulder arthroscopy simulators. Moreover, all of the shoulder arthroscopy simulators were designed in industry, whereas 56% (14 of 25) of the knee arthroscopy simulators were developed, or partly developed, in academic institutions. This disproportionate tendency of academic institutions to design knee arthroscopy simulators is an interesting trend that has not yet been examined in the literature. It may be due, in part, to the greater number of knee arthroscopy procedures performed worldwide compared with shoulder arthroscopy procedures, at 4 million and 1.4 million, respectively.64,65 The substantially greater number of arthroscopic than nonarthroscopic orthopedic simulators may reflect an attempt to address trainees feeling relatively unprepared for arthroscopic procedures.5 Interestingly, 58% (18 of 31) of all the arthroscopy simulators were VR simulators, a higher proportion than among the fracture or spine simulators. VR simulators have the advantage of providing instantaneous measures of performance, though non-VR simulators can still use the Objective Structured Assessment of Technical Skills (OSATS) to inform assessment metrics, as evidenced by several studies in this review.88
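For context, an OSATS-style assessment rates a performance on several domains with a 1-to-5 global rating each and sums them. The domain list below follows the commonly cited seven-item OSATS global rating form; treat both the domains and the aggregation as an illustrative assumption, since the review does not reproduce the instrument:

```python
# Seven domains of the widely used OSATS global rating form (assumed here,
# not enumerated in this review); each is rated on a 1-5 Likert scale.
OSATS_DOMAINS = (
    "respect for tissue",
    "time and motion",
    "instrument handling",
    "knowledge of instruments",
    "use of assistants",
    "flow of operation and forward planning",
    "knowledge of specific procedure",
)

def osats_total(ratings):
    """Sum the per-domain ratings into a global score (range 7-35)."""
    if set(ratings) != set(OSATS_DOMAINS):
        raise ValueError("one rating per OSATS domain is required")
    if not all(1 <= r <= 5 for r in ratings.values()):
        raise ValueError("ratings are 1-5 Likert scores")
    return sum(ratings.values())

mid_level = {d: 3 for d in OSATS_DOMAINS}
print(osats_total(mid_level))  # 21
```

Because the total is a plain sum of ordinal ratings, it can be computed identically for bench-top and VR performances, which is what makes OSATS usable as a common metric across simulator types.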

Validation Studies

Knee arthroscopy simulators also had the most validation studies, at 26; in comparison, there were 16 for shoulder arthroscopy and 14 for fracture simulators. The only validation studies of a high enough standard to be rated 1b by the guidelines used were the transfer studies conducted on the Sawbones hip by Pollard et al.47 and on the ArthroMentor by Waterman et al.52 These were the only studies to include an explicit and positive power analysis. The lower-level validation studies were randomized trials that either neglected power analyses or had small sample sizes. The construct validation study by Pedowitz et al.,57 performed on the Procedicus shoulder arthroscopy simulator, was the largest, with 78 participants. The Procedicus simulator, designed by the Mentice Corporation, Gothenburg, Sweden, was also the simulator with the most validation studies (6) and the most participants (183 in total). The generally small sample sizes may reflect the difficulty of coordinating surgeons in a hospital to attend specific time slots for simulation. A multicenter approach has been suggested as a preferred future model for validation; it would likely improve the power of future studies while increasing flexibility at individual institutions.89

Arthroscopy simulators received notably higher LoR than nonarthroscopy simulators. The Sawbones hip model and the ArthroMentor would each need another study with a 1b LoE to become the first orthopedic simulator with a LoR of 1. Orthopedic simulators may be lagging behind those of other specialties, such as urology, in their LoR: a systematic review by Aydin et al.90 found that 6 urology simulators had a LoE of 1b, with the URO Mentor receiving a LoR of 1.

The ultimate aim of a surgical simulator is to provide an environment for learning skills that will be relevant in the operating room (OR). There were a total of 14 transfer validation studies in this systematic review. Although promising, the vast majority of these studies used learning curves or cadavers as proxies for the OR, likely because proxies are easier to study than true transfer in the operating theater.91 Howells et al.16 were able to demonstrate transfer of skill to the OR from a Sawbones bench simulator. Validation studies, especially construct and transfer studies, require considerable coordination within departments and are both financially costly and time consuming; these demands are a significant factor in why so few validation studies exist. Nonetheless, to ensure the widespread adoption of simulators, a greater effort is needed to produce high-quality validation studies, with an emphasis on establishing transfer validity.5

Future Considerations

Nontechnical surgical skills can broadly be classified as situational awareness, communication and teamwork, decision-making, and leadership.92 These skills are crucial to preoperative, intraoperative, and postoperative care, yet simulation of such skills was notably lacking in the literature. Poor nontechnical skills have been linked to a number of adverse events in operating theaters: Gawande et al.93 reported that 43% of surgical adverse events could be attributed to communication error. Nontechnical skills are poorly self-assessed by surgeons compared with technical skills and have also been shown to have a corollary effect on surgeons' technical skills,94 indicating a need for objective assessment.95 Nontechnical skills simulations tend to use fully simulated ORs rather than VR consoles and have been successfully adopted in urology and general surgery, suggesting that orthopedics may also benefit from such techniques.94-96 These simulators come with considerable financial burden, highlighting the need for newer designs that are lightweight and portable, such as the "distributed simulator" used in endourology, which offers a fully immersive experience to test both technical and nontechnical skills.97

Balancing financial cost against perceived benefit is a universal concern with all simulators98; it is therefore prudent to include the financial cost of simulators in their descriptions. The articles included in this systematic review seldom mentioned the cost of their simulators or considered the economic justification for the benefits of training on their particular simulator. The distal radial fracture simulator by Egan et al.67 was one of the few models that was priced, at $455, with a projection of cost at large-scale production also provided.67 Training a single surgeon in theater has been estimated to cost $48,000 in the United States,98 partly because of the increased operating time when a trainee is present.5 Over 4 years of training, the time lost to training a single trainee was estimated at 11,184 minutes.5 In future, a cost-benefit analysis should be included in systematic reviews of this kind.

Another problem facing widespread adoption of surgical simulators is the limited range of tasks a simulator can provide. The cost of some high-fidelity VR simulators would be easier to justify if they could simulate several different tasks, preferably in different anatomical regions.5 The ArthroS simulator by VirtaMed, Zurich, Switzerland, offers both shoulder and knee physical models for its VR arthroscopy system.26,28 Both its knee and shoulder simulation capabilities performed well, demonstrating a versatility that may be replicated in future simulators to increase their appeal.59
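The cost figures quoted above support a rough back-of-envelope comparison. Attributing the entire $48,000 training cost to the 11,184 lost theater minutes is a deliberate oversimplification, made here only to illustrate the kind of cost-benefit arithmetic future reviews could report:

```python
# Figures quoted in the text; treating the full training cost as driven by
# lost theater time is a simplifying assumption for this sketch only.
TRAINING_COST_USD = 48_000   # estimated cost of training one surgeon in theater
LOST_THEATER_MIN = 11_184    # minutes lost over 4 years with a trainee present
SIMULATOR_COST_USD = 455     # Egan et al.'s distal radial fracture model

cost_per_minute = TRAINING_COST_USD / LOST_THEATER_MIN
breakeven_minutes = SIMULATOR_COST_USD / cost_per_minute

print(f"~${cost_per_minute:.2f} per lost theater minute")          # ~$4.29
print(f"break-even after ~{breakeven_minutes:.0f} saved minutes")  # ~106
```

On these assumptions, a $455 bench model would pay for itself if it saved under 2 hours of theater time across a trainee's program; high-fidelity VR platforms costing orders of magnitude more face a correspondingly higher bar.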

Limitations

Heterogeneity between studies prohibited the use of a pooled meta-analysis. Publication bias may have skewed the validation studies of the simulators. The exclusion of non-English language articles may also have contributed to bias.

CONCLUSION

Orthopedic simulators predominantly consist of a range of arthroscopy simulators. Although nonarthroscopy orthopedic simulators exist, they are few in comparison to arthroscopy simulators, and their validation studies are fewer still. This systematic review supports the notion that orthopedic simulators have the potential to translate useful skills into the operating theater. In particular, several arthroscopy simulators are awarded the second-highest LoR. Future work in streamlining validation terms and enhancing the quality of validation study designs would strengthen the evidence for the translation of skills. Work should also be done to justify the financial cost of simulators and to develop simulators that enable the assessment of nontechnical skills.

REFERENCES

1. McDougall EM. Validation of surgical simulators. J Endourol. 2007;21(3):244-247.
2. Aydin A, Raison N, Khan MS, Dasgupta P, Ahmed K. Simulation-based training and assessment in urological surgery. Nat Rev Urol. 2016;13(9):503-519.
3. Kneebone R, Aggarwal R. Surgical training using simulation. Br Med J. 2009;338:b1001.
4. Tay C, Khajuria A, Gupte C. Simulation training: a systematic review of simulation in arthroscopy and proposal of a new competency-based training framework. Int J Surg. 2014;12(6):626-633.
5. Thomas GW, Johns BD, Marsh JL, Anderson DD. A review of the role of simulation in developing and assessing orthopaedic surgical skills. Iowa Orthop J. 2014;34:181-189.
6. Frank R, Erickson B, Frank J, et al. Utility of modern arthroscopic simulator training models. Arthroscopy. 2014;30(1):121-133.
7. Samia H, Khan S, Lawrence J, Delaney CP. Simulation and its role in training. Clin Colon Rectal Surg. 2013;26(1):47-55.
8. Panic N, Leoncini E, de Belvis G, Ricciardi W,

Boccia S. Evaluation of the endorsement of the preferred reporting items for systematic reviews and meta-analysis (PRISMA) statement on the quality of published systematic review and meta-analyses. PLoS ONE. 2013;8(12):e83138.
9. Van Nortwick SS, Lendvay TS, Jensen AR, Wright AS, Horvath KD, Kim S. Methodologies for establishing validity in surgical simulation studies. Surgery. 2010;147(5):622-630.
10. Carter FJ, Schijven MP, Aggarwal R, et al. Consensus guidelines for validation of virtual reality surgical simulators. Surg Endosc. 2005;19(12):1523-1532.
11. Akhtar K, Wijendra A, Bayona S, et al. Assessing skills decay on a knee arthroscopy simulator. Arthroscopy. 2013;29(Suppl 10):e139.
12. Chang J, Banaszek DC, Gambrel J, Bardana D. Global Rating Scales and motion analysis are valid proficiency metrics in virtual and benchtop knee arthroscopy simulators. Clin Orthop Relat Res. 2016;474(4):956-964.
13. Jacobsen ME, Andersen MJ, Hansen CO, Konge L. Testing basic competency in knee arthroscopy using a virtual reality simulator: exploring validity and reliability. J Bone Joint Surg Am. 2015;97(9):775-781.
14. Rebolledo B, Leali A, Hammann J, Ranawat A. Arthroscopy skills development with a surgical simulator: a comparative study in orthopaedic surgery residents. Arthroscopy. 2014;30(6):e36.
15. Butler A, Olson T, Koehler R, Nicandri G. Do the skills acquired by novice surgeons using anatomic dry models transfer effectively to the task of diagnostic knee arthroscopy performed on cadaveric specimens? J Bone Joint Surg. 2013;95(3):e15.
16. Howells NR, Gill HS, Carr AJ, Price AJ, Rees JL. Transferring simulated arthroscopic skills to the operating theatre: a randomised blinded study. J Bone Joint Surg. 2008;90(4):494-499.
17. Jackson WF, Khan T, Alvand A, et al. Learning and retaining simulated arthroscopic meniscal repair skills. J Bone Joint Surg. 2012;94(17):e132.131-138.
18. Tashiro Y, Miura H, Nakanishi Y, Okazaki K, Iwamoto Y. Evaluation of skills in arthroscopic training based on trajectory and force data. Clin Orthop Relat Res. 2009;467(2):546-552.
19. Camp CL, Krych AJ, Stuart MJ, Regnier TD, Mills KM, Turner NS. Improving resident performance in knee arthroscopy: a prospective value assessment of simulators and cadaveric skills laboratories. J Bone Joint Surg Am. 2016;98(3):220-225.
20. Cowan JB, Seeley MA, Irwin TA, Caird MS. Computer-simulated arthroscopic knee surgery: effects of distraction on resident performance. Orthopedics. 2016;39(2):e240-e245.
21. Cannon WD, Garrett WE Jr, Hunter RE, et al. Improving residency training in arthroscopic knee surgery with use of a virtual-reality simulator. A randomized blinded study. J Bone Joint Surg. 2014;96(21):1798-1806.
22. Cannon WD, Nicandri GT, Reinig K, Mevis H, Wittstein J. Evaluation of skill level between trainees and community orthopaedic surgeons using a virtual reality arthroscopic knee simulator. J Bone Joint Surg. 2014;96(7):e57.51-57.
23. Dwyer T, Slade Shantz J, Chahal J, et al. Simulation of anterior cruciate ligament reconstruction in a dry model. Am J Sports Med. 2015;43(12):2997-3004.
24. Escoto A, Le Ber F, Trejos AL, Naish MD, Patel RV, Lebel ME. A knee arthroscopy simulator: design and validation. Conf Proc IEEE Eng Med Biol Soc. 2013;2013:5715-5718.
25. Ferguson J, Middleton R, Alvand A, Rees J. Newly acquired arthroscopic skills: are they transferable during simulator training of other joints? Knee Surg Sports Traumatol Arthrosc. 2015:1-8 [ahead of print].
26. Fucentese SF, Rahm S, Wieser K, Spillmann J, Harders M, Koch PP. Evaluation of a virtual-reality-based simulator using passive haptic feedback for knee arthroscopy. Knee Surg Sports Traumatol Arthrosc. 2015;23(4):1077-1085.
27. Rahm S, Wieser K, Wicki I, Holenstein L, Fucentese SF, Gerber C. Performance of medical students on a virtual reality simulator for knee arthroscopy: an analysis of learning curves and predictors of performance. BMC Surg. 2016;16(1):1-8.
28. Stunt JJ, Kerkhoffs GM, van Dijk CN, Tuijthof GJ. Validation of the ArthroS virtual reality simulator for arthroscopic skills. Knee Surg Sports Traumatol Arthrosc. 2015;23(11):3436-3442.
29. Heng PA, Cheng CY, Wong TT, et al. A virtual-reality training system for knee arthroscopic surgery. IEEE Trans Inf Technol Biomed. 2004;8(2):217-227.
30. Heng PA, Cheng CY, Wong TT, et al. Virtual reality techniques. Application to anatomic visualization and orthopaedics training. Clin Orthop Relat Res. 2006;442:5-12.
31. Mabrey JD, Gillogly SD, Kasser JR, et al. Virtual reality simulation of arthroscopy of the knee. Arthroscopy. 2002;18(6):E28.
32. Poss R, Mabrey JD, Gillogly SD, et al. Development of a virtual reality arthroscopic knee simulator. J Bone Joint Surg. 2000;82-A(10):1495-1499.
33. Martin RK, Gillis D, Leiter J, Shantz JS, MacDonald P. A porcine knee model is valid for use in the evaluation of arthroscopic skills: a pilot study. Clin Orthop Relat Res. 2016;474(4):965-970.
34. McCarthy AD, Moody L, Waterworth AR, Bickerstaff DR. Passive haptics in a knee arthroscopy simulator: is it valid for core skills training? Clin Orthop Relat Res. 2006;442:13-20.


35. Megali G, Tonet O, Dario P, Vascellari A, Marcacci M. Computer-assisted training system for knee arthroscopy. Int J Med Robot. 2005;1(3):57-66.
36. Peres LR, Junior WM, Coelho G, Lyra M. A new simulator model for knee arthroscopy procedures. Knee Surg Sports Traumatol Arthrosc. 2016 [ahead of print].
37. Garfjeld Roberts P, Guyver P, Baldwin M, et al. Validation of the updated ArthroS simulator: face and construct validity of a passive haptic virtual reality simulator with novel performance metrics. Knee Surg Sports Traumatol Arthrosc. 2016:1-10.
38. Sabri H, Cowan B, Kapralos B, Porte M, Backstein D, Dubrowskie A. Serious games for knee replacement surgery procedure education and training. Procedia. 2010;2(2):3483-3488.
39. Spillmann J, Tuchschmid S, Harders M. Adaptive space warping to enhance passive haptics in an arthroscopy surgical simulator. IEEE Trans Vis Comput Graph. 2013;19(4):626-633.
40. Strom P, Kjellin A, Hedman L, Wredmark T, Fellander-Tsai L. Training in tasks with different visual-spatial components does not improve virtual arthroscopy performance. Surg Endosc. 2004;18(1):115-120.
41. Tsai MD, Hsieh MS, Jou SB. Virtual reality orthopedic surgery simulator. Comput Biol Med. 2001;31(5):333-351.
42. Tuijthof GJ, van Sterkenburg MN, Sierevelt IN, van Oldenrijk J, van Dijk CN, Kerkhoffs GM. First validation of the PASSPORT training environment for arthroscopic skills. Knee Surg Sports Traumatol Arthrosc. 2010;18(2):218-224.
43. Unalan PC, Akan K, Orhun H, et al. A basic arthroscopy course based on motor skill training. Knee Surg Sports Traumatol Arthrosc. 2010;18(10):1395-1399.
44. Ziegler R, Fischer G, Muller W, Gobel M. Virtual reality arthroscopy training simulator. Comput Biol Med. 1995;25(2):193-203.
45. Conditt MA, Noble PC, Thompson MT, Ismaily SK, Moy GJ, Mathis KB. A computerized bioskills system for surgical skills training in total knee replacement. Proc Inst Mech Eng H. 2007;221(1):61-69.
46. Jun Y, Lee K-Y, Gwak K-W, Lim D. Anatomic basis 3-D surgical simulation system for custom fit knee replacement. Int J Precision Eng Manufacturing. 2012;13(5):709-715.
47. Pollard TC, Khan T, Price AJ, Gill HS, Glyn-Jones S, Rees JL. Simulated hip arthroscopy skills: learning curves with the lateral and supine patient positions: a randomized trial. J Bone Joint Surg. 2012;94(10):e68.
48. Andersen C, Winding TN, Vesterby MS. Development of simulated arthroscopic skills. Acta Orthop. 2011;82(1):90-95.
49. Dunn JC, Belmont PJ, Lanzi J, et al. Arthroscopic shoulder surgical simulation training curriculum: transfer reliability and maintenance of skill over time. J Surg Educ. 2015;72(6):1118-1123.
50. Martin KD, Belmont PJ Jr, Schoenfeld AJ, et al. Arthroscopic basic task performance in shoulder simulator model correlates with similar task performance in cadavers. J Bone Joint Surg. 2011;93(21):e1271-e1275.
51. Martin KD, Cameron K, Belmont PJ, Schoenfeld A, Owens BD. Shoulder arthroscopy simulator performance correlates with resident and shoulder arthroscopy experience. J Bone Joint Surg. 2012;94(21):e160.
52. Waterman BR, Martin KD, Cameron KL, Owens BD, Belmont PJ Jr. Simulation training improves surgical proficiency and safety during diagnostic shoulder arthroscopy performed by residents. Orthopedics. 2016;39(3):e479-e485.
53. Howells NR, Auplish S, Hand GC, Gill HS, Carr AJ, Rees JL. Retention of arthroscopic shoulder skills learned with use of a simulator: demonstration of a learning curve and loss of performance level after a time delay. J Bone Joint Surg. 2009;91(5):1207-1213.
54. Gomoll AH, O'Toole RV, Czarnecki J, Warner JJ. Surgical experience correlates with performance on a virtual reality simulator for shoulder arthroscopy. Am J Sports Med. 2007;35(6):883-888.
55. Gomoll AH, Pappas G, Forsythe B, Warner JJ. Individual skill progression on a virtual reality simulator for shoulder arthroscopy: a 3-year follow-up study. Am J Sports Med. 2008;36(6):1139-1142.
56. Henn RF III, Shah N, Warner JJP, Gomoll AH. Shoulder arthroscopy simulator training improves shoulder arthroscopy performance in a cadaveric model. Arthroscopy. 2013;29(6):982-985.
57. Pedowitz RA, Esch J, Snyder S. Evaluation of a virtual reality simulator for arthroscopy skills development. Arthroscopy. 2002;18(6):E29.
58. Srivastava S, Youngblood PL, Rawn C, Hariri S, Heinrichs WL, Ladd AL. Initial evaluation of a shoulder arthroscopy simulator: establishing construct validity. J Shoulder Elbow Surg. 2004;13(2):196-205.

59. Rahm S, Germann M, Hingsammer A, Wieser K, Gerber C. Validation of a virtual reality-based simulator for shoulder arthroscopy. Knee Surg Sports Traumatol Arthrosc. 2016;24(5):1730-1737.
60. Brubacher JW, Karg J, Weinstock P, Bae DS. A novel cast removal training simulation to improve patient safety. J Surg Educ. 2016;73(1):7-11.
61. Beth Grossman L, Komatsu DE, Badalamente MA, Braunstein AM, Hurst LC. Microsurgical simulation exercise for surgical training. J Surg Educ. 2016;73(1):116-120.
62. Tsai MD, Hsieh MS. Accurate visual and haptic burring surgery simulation based on a volumetric model. J Xray Sci Technol. 2010;18(1):69-85.
63. Vankipuram M, Kahol K, McLaren A, Panchanathan S. A virtual reality simulator for orthopedic basic skills: a design and validation study. J Biomed Inform. 2010;43(5):661-668.
64. Lin Y, Wang X, Wu F, Chen X, Wang C, Shen G. Development and validation of a surgical training simulator with haptic feedback for learning bone-sawing skill. J Biomed Inform. 2014;48:122-129.
65. Hsieh MS, Tsai MD, Yeh YD. An amputation simulator with bone sawing haptic interaction. Biomed Eng. 2006;18(5):229-236.
66. Blyth P, Stott NS, Anderson IA. A simulation-based training system for hip fracture fixation for use within the hospital environment. Injury. 2007;38(10):1197-1203.
67. Egan C, Egan R, Curran P, Bryan K, Fleming P. Development of a model for teaching manipulation of a distal radial fracture. J Bone Joint Surg. 2013;95(5):433-438.
68. Froelich JM, Milbrandt JC, Novicoff WM, Saleh KJ, Allan DG. Surgical simulators and hip fractures: a role in residency training? J Surg Educ. 2011;68(4):298-302.
69. Pedersen P, Palm H, Ringsted C, Konge L. Virtual-reality simulation to assess performance in hip fracture surgery. Acta Orthop. 2014;85(4):403-407.
70. LeBlanc J, Hutchison C, Hu Y, Donnon T. A comparison of orthopaedic resident performance on surgical fixation of an ulnar fracture using virtual reality and synthetic models. J Bone Joint Surg. 2013;95(9):e60 S1-S5.
71. Mayne IP, Brydges R, Moktar J, Murnaghan ML. Development and assessment of a distal radial fracture model as a clinical teaching tool. J Bone Joint Surg Am. 2016;98(5):410-416.
72. Moktar J, Popkin CA, Howard A, Murnaghan ML. Development of a cast application simulator and evaluation of objective measures of performance. J Bone Joint Surg. 2014;96(9):e76.
73. Nousiainen MT, Omoto DM, Zingg PO, Weil YA, Mardam-Bey SW, Eward WC. Training femoral neck screw insertion skills to surgical trainees: computer-assisted surgery versus conventional fluoroscopic technique. J Orthop Trauma. 2013;27(2):87-92.
74. Rambani R, Viant W, Ward J, Mohsen A. Computer-assisted orthopedic training system for fracture fixation. J Surg Educ. 2013;70(3):304-308.

75. Sugand K, Mawkin M, Gupte C. Validating Touch

Surgery™: a cognitive task simulation and rehearsal app for intramedullary femoral nailing. Injury. 2015;46(11):2212-2216. 76. Sugand K, Mawkin M, Gupte C. Training effect of

using Touch Surgery™ for intramedullary femoral nailing. Injury. 2016;47(2):448-452. 77. Yehyawi TM, Thomas TP, Ohrt GT, et al. A

simulation trainer for complex articular fracture surgery. J Bone Joint Surg. 2013;95(13):e921-e928. 78. Adermann J, Geissler N, Bernal LE, Kotzsch S, Korb W.

Development and validation of an artificial wetlab training system for the lumbar discectomy. Eur Spine J. 2014;23 (9):1978-1983. 79. Fuerst D, Hollensteiner M, Schrempf A. Assessment

parameters for a novel simulator in minimally invasive spine surgery. Conf Proc IEEE Eng Med Biol Soc. 2015;2015:5110-5113. 80. Gottschalk MB, Yoon ST, Park DK, Rhee JM,

Mitchell PM. Surgical training using threedimensional simulation in placement of cervical lateral mass screws: a blinded randomized control trial. Spine J. 2015;15(1):168-175. 81. Podolsky DJ, Martin AR, Whyne CM, Massicotte

EM, Hardisty MR, Ginsberg HJ. Exploring the role of 3-dimensional simulation in surgical training: feedback from a pilot study. J Spinal Disord Tech. 2010;23(8): e70-e74. 82. Riehl J, Widmaier J. A simulator model for sacroi-

liac screw placement. J Surg Educ. 2012;69(3): 282-285. 83. Rush R, Ginsberg HJ, Jenkinson R, Whyne CM.

Beyond the operating room: a simulator for sacroiliac screw insertion. Surg Innov. 2008;15(4):321-323. 84. Tonetti J, Vadcard L, Girard P, Dubois M, Merloz P,

Troccaz J. Assessment of a percutaneous iliosacral Journal of Surgical Education  Volume ]/Number ]  ] 2017

screw insertion simulator. Orthop Traumatol Surg Res. 2009;95(7):471-477.

85. Tortolani PJ, Moatz BW, Parks BG, Cunningham BW, Sefter J, Kretzer RM. Cadaver training module for teaching thoracic pedicle screw placement to residents. Orthopedics. 2013;36(9):e1128-e1133.
86. Yeo CT, Ungi T, U-Thainual P, Lasso A, McGraw RC, Fichtinger G. The effect of augmented reality training on percutaneous needle placement in spinal facet joint injections. IEEE Trans Biomed Eng. 2011;58(7):2031-2037.
87. Stunt J, Wulms P, Kerkhoffs G, Dankelman J, van Dijk C, Tuijthof G. How valid are commercially available medical simulators? Adv Med Educ Pract. 2014;5:385-395.
88. Martin JA, Regehr G, Reznick R, et al. Objective structured assessment of technical skill (OSATS) for surgical residents. Br J Surg. 1997;84(2):273-278.
89. Schout BM, Hendrikx AJ, Scherpbier AJ, Bemelmans BL. Update on training models in endourology: a qualitative systematic review of the literature between January 1980 and April 2008. Eur Urol. 2008;54(6):1247-1261.
90. Aydin A, Shafi AM, Khan MS, Dasgupta P, Ahmed K. Current status of simulation and training models in urological surgery: a systematic review. J Urol. 2016;196(2):312-320.
91. Madan SS, Pai DR. Role of simulation in arthroscopy training. Simul Healthc. 2014;9(2):127-135.
92. Yule S, Flin R, Maran N, Rowley D, Youngson G, Paterson-Brown S. Surgeons’ non-technical skills in the operating room: reliability testing of the NOTSS behavior rating system. World J Surg. 2008;32(4):548-556.
93. Gawande AA, Zinner MJ, Studdert DM, Brennan TA. Analysis of errors reported by surgeons at three teaching hospitals. Surgery. 2003;133(6):614-621.
94. Hull L, Arora S, Aggarwal R, Darzi A, Vincent C, Sevdalis N. The impact of nontechnical skills on technical performance in surgery: a systematic review. J Am Coll Surg. 2012;214(2):214-230.
95. Arora S, Miskovic D, Hull L, et al. Self vs expert assessment of technical and non-technical skills in high fidelity simulation. Am J Surg. 2011;202(4):500-506.
96. Brunckhorst O, Shahid S, Aydin A, et al. Simulation-based ureteroscopy skills training curriculum with integration of technical and non-technical skills: a randomised controlled trial. Surg Endosc. 2015;29(9):2728-2735.
97. Kneebone R, Arora S, King D, et al. Distributed simulation—accessible immersive training. Med Teach. 2010;32(1):65-70.

98. Bridges M, Diamond DL. The financial impact of teaching surgical residents in the operating room. Am J Surg. 1999;177(1):28-32.