Ear Disease Knowledge and Otoscopy Skills Transfer to Real Patients: A Randomized Controlled Trial

ORIGINAL REPORTS

Vincent Wu, BHSc, Joobin Sattar, MSc, Stephanie Cheon, BHSc, and Jason A. Beyea, MD, PhD, FRCSC

Department of Otolaryngology, Hotel Dieu Hospital, Queen's University School of Medicine, Kingston, Ontario, Canada

OBJECTIVE: To determine which teaching method—otoscopy simulation (OS), web-based module (WM), or standard classroom instruction (SI)—produced greater translation of knowledge and otoscopy examination skills to real patients.

DESIGN: In a prospective randomized controlled nonclinical trial, medical students were randomized to 1 of 3 interventional arms: (1) OS, (2) WM, or (3) SI. Students were assessed at baseline for diagnostic accuracy and otoscopy skills on 5 volunteer patients (a total of 10 ears), followed by the intervention. Testing was repeated immediately after the intervention on the same patients. Student-reported confidence in diagnostic accuracy and the otoscopy examination was also captured. Assessors were blinded to the intervention group and to whether students were pre- or post-intervention.

SETTING: Clinical Teaching Centre, Queen's University.

PARTICIPANTS: Twenty-nine participants were initially randomized. Two students were unable to attend their specific intervention sessions and withdrew. Final group sizes were: OS—10, WM—9, SI—8. Five patients with external/middle ear pathologies were voluntarily recruited to participate as testing subjects.

RESULTS: Baseline diagnostic accuracy and otoscopy clinical skills did not differ across the groups. Post-intervention, there were improvements in diagnostic accuracy in all groups: OS (127.78%, 2.30 ± 1.42, p = 0.0006), WM (76.40%, 1.44 ± 1.88, p = 0.0499), and SI (100.00%, 1.50 ± 1.20, p = 0.0093). For otoscopy skills, post-intervention improvements were noted in OS (77.00%, 3.85 ± 2.55, p < 0.0001) and SI (22.20%, 1.25 ± 1.20, p = 0.0011), with no significant improvement from WM (13.46%, 0.78 ± 1.92, p = 0.1050). Students across all groups reported significantly improved confidence in diagnostic accuracy (p < 0.0001) and otoscopy skill (p < 0.0001) after the intervention.

CONCLUSION: All 3 teaching modalities showed an improvement in diagnostic accuracy immediately post-intervention. Otoscopy clinical skills increased only in OS and SI, with the OS group demonstrating the largest improvement. Simulation-based medical education in Otolaryngology may provide the greatest transfer of medical knowledge and technical skills when evaluated with real patients. (J Surg Ed ]:]]]-]]]. © 2018 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.)

KEY WORDS: medical education, otoscopy, simulation, web-based learning module

COMPETENCY: Medical knowledge

Correspondence: Inquiries to Jason A. Beyea, MD, PhD, FRCSC, Otology/Neurotology, Department of Otolaryngology, Queen's University, 144 Brock Street, Kingston, Ontario, Canada K7L 5G2; e-mail: [email protected]

BACKGROUND

The development of simulation-based training models has drastically changed the educational environment of modern medical schools, allowing for more hands-on and active learning.1–3 Specifically within Otolaryngology, simulation has filled important learning gaps in both the undergraduate and postgraduate medical curricula.4,5 For ear disease, a number of commercially available simulators currently exist, including the web-based OtoTrain, the Life/form Diagnostic and Procedural Ear Trainer, the Earsi Otoscope, and the OtoSim Ear Training and Simulation System.6–12 Various validation studies have been performed on these simulators, with some showing significant improvements in both the diagnostic capabilities and the technical skills of

Journal of Surgical Education © 2018 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved. 1931-7204/$30.00 https://doi.org/10.1016/j.jsurg.2017.12.011


medical trainees.6,12,13 Additionally, simulators have been shown to be effective in increasing students' diagnostic confidence and their interest in Otolaryngology.14 Our research group previously established that acquisition and long-term retention of knowledge and clinical skills were highest among students who participated in simulation-based teaching, compared to web-based modules and classroom lectures.12 Unlike other modalities of learning commonly employed by medical schools today, simulation-based learning can provide students with the opportunity to mimic what they will encounter in a clinical setting. At the same time, simulators reduce the need for multiple volunteer patients, who are often required for the repetitive exposure needed for clinical skills to develop.5

The ultimate goal of simulation-based medical education is to translate the knowledge and skills learned in a simulated environment to a clinical setting, thereby improving patient care and outcomes.15,16 Such successful translations have been documented in the past. However, research in simulation-based education demonstrating transfer of learning to clinical practice is difficult to design and conduct.6,17 Many of the studies currently published on simulation-based medical education in the field of Otology have measured student performance on the same simulator that was used as part of the training.12,13 The limitation of this approach is that it may bias the results, favoring those trained on the simulator, since these participants may have increased familiarity from additional exposure. Testing on a surrogate measure has also been explored previously.6,18 However, without direct comparison to students' performance in an actual clinical context, it is difficult to know whether surrogate measures are true and accurate reflections of students' learning.

To date, simulation-based education within Otolaryngology has not demonstrated knowledge and skill translation to real patients, which is the gold standard in assessing the utility and effectiveness of an educational intervention.15,16 Herein, we aimed to evaluate this by directly comparing 3 different teaching modalities: otoscopy simulation (OS), web-based module (WM), and standard classroom instruction (SI).

METHODS

This prospective randomized controlled nonclinical trial (RCT) was approved by the Queen's University Health Sciences and Affiliated Teaching Hospitals Research Ethics Board (#6019936) and the Queen's University School of Medicine Undergraduate Medical Education Curriculum Committee.

Participants

All first- and second-year medical students from Queen's University were invited to participate voluntarily in the study. Students were excluded if they had previous training on the studied otoscopy simulator or if they had participated in previous research studies using it. Written consent was obtained from each student prior to beginning the study.

The sample size was determined based on the previous educational study conducted by our research group, which evaluated diagnostic accuracy and otoscopy skill across 3 interventional arms.12 Based on the smallest significant effect size of 2.54 observed between groups and a standard deviation (SD) of 1.29, the required sample size was calculated to be 5 participants per intervention arm at α = 0.05 and β = 0.20. To ensure the study was adequately powered, we aimed to recruit at least 6 participants per intervention arm.

Patients from an Otology/Neurotology outpatient clinic with current otologic pathologies were invited to participate in the study as volunteers to be examined. Written consent was obtained from each patient prior to enrollment. Of the 5 patients who volunteered to participate, 2 had bilateral pathologies and 3 had unilateral pathologies (a total of 7 pathological and 3 normal ears).

Design

Following recruitment, students were randomized to 1 of 3 parallel educational intervention arms (OS, WM, or SI) using simple randomization with a random number generator (http://www.random.org). The study flow diagram is shown in Figure 1. Students underwent baseline testing prior to receiving their intervention. The testing session was designed as an objective structured clinical examination (OSCE), whereby students were given 2.5 minutes to examine both ears of a volunteer patient, rotating through all 5 patients. Students were instructed to focus only on the otoscopy examination. Responses were written on provided answer sheets and submitted into opaque slotted envelopes inside each patient room following the examination. Answers from other students were not accessible, and no discussion was permitted between students during testing. Students were video-recorded during one of the patient encounters in order to capture their otoscopy clinical skills.

Following baseline testing, students underwent their assigned educational intervention. All interventions were 30 minutes in duration and included the same teaching material: otoscopy examination techniques and 25 images of middle and external ear pathologies, including images of normal ears.12 Examples of pathologies included acute otitis media, serous otitis media, tympanic membrane perforation, exostosis, and cholesteatoma. Students were asked not to take notes during the intervention sessions. Immediately following the intervention, testing was repeated and again video-captured. The baseline testing, intervention, and post-intervention testing all took place over the course of the same day, and all students were assessed on the same 5 volunteer patients.

FIGURE 1. Study flow diagram.
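The sample-size calculation described in the Methods (smallest significant effect size 2.54, SD 1.29, α = 0.05, β = 0.20, yielding 5 per arm) can be reproduced with the standard normal-approximation formula for a two-sample comparison. The sketch below is illustrative only and is not the authors' actual calculation:

```python
from math import ceil
from statistics import NormalDist

def n_per_group(effect, sd, alpha=0.05, power=0.80):
    """Normal-approximation sample size per arm for a two-sample comparison."""
    d = effect / sd                       # standardized effect (Cohen's d)
    nd = NormalDist()
    z_a = nd.inv_cdf(1 - alpha / 2)       # two-sided alpha
    z_b = nd.inv_cdf(power)               # power = 1 - beta
    return ceil(2 * (z_a + z_b) ** 2 / d ** 2)

print(n_per_group(2.54, 1.29))  # → 5, matching the reported 5 per arm
```

Recruiting at least 6 per arm, as the authors did, then provides a margin above the computed minimum.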

Otoscopy Simulation

The OtoSim Ear Training and Simulation System (v.1, OtoSim Inc, Toronto, ON, Canada) was used for the OS intervention arm, which was led by a sessional instructor in a small-group format. Correct performance of the otoscopy examination was demonstrated on the simulator, and the sessional instructor led students through a series of ear pathologies based on established methodology for simulation teaching.12,14 With the remaining time, students had the opportunity to practice the otoscopy exam on the simulator and review ear pathologies at their own pace.

Web-Based Module

The WM was designed using Google Slides (v.1.2017, Google Inc, Mountain View, CA, USA) and accessed from students' personal computers. The WM was intuitive to use and required no additional instructions. It outlined correct otoscopy examination techniques and provided a comprehensive review of the same ear pathologies. The website was time-restricted to 30 minutes, and all students within the WM group were sequestered for the full duration of this time. A study investigator was immediately available to answer any questions that arose.

Standard Classroom Instruction

The SI involved a standard classroom lecture delivered by a staff Otolaryngologist (J.A.B.). Proper techniques for performing the otoscopy examination were discussed, along with a review of ear pathologies. The lecturer answered any questions that arose during the lecture, and the remaining time after the lecture was used for further questions from students.

Outcome Measures

The primary outcome measure was diagnostic accuracy, based on the correct identification of otological pathologies and normal ears of volunteer patients. The secondary outcome measure was otoscopy examination skill. This was assessed by 2 independent, intervention-blinded reviewers based on recorded videos, using the 10-item Modified Minnesota Department of Health Otoscopy Checklist, which was previously developed and reported by our group (Table 1).12 Currently, there are no validated checklists that assess otoscopy examination skills on adult patients. Both primary and secondary outcome measures were assessed at baseline and immediately post-intervention.

Table 1. Modified Minnesota Department of Health Otoscopy Clinical Skills Scoring Checklist (each task scored, for a total score out of 10)

1. Otoscope assembled properly, speculum applied properly
2. Correctly selects the largest speculum size
3. Turns on the otoscope correctly and verifies illumination
4. Holds otoscope with power base up
5. Cushions patient head with hand to prevent trauma
6. Grasps pinna with the other hand
7. Correctly pulls the pinna back to straighten the ear canal
8. Gently inserts the speculum
9. Correctly looks through the magnifying lens
10. Identifies a visual image of the eardrum

Additionally, a survey questionnaire was administered after post-intervention testing to capture students' self-perceived confidence in both diagnostic accuracy and otoscopy skills before and after receiving their intervention. The questions also pertained to the organization and quality of the event itself, and included both qualitative and quantitative components. The survey questions are found in Table 2, and were answered either using a 5-point Likert scale or with open free text.

Table 2. Survey Questions

Please answer the following questions using the 5-point Likert scale (1—very poor, 2—poor, 3—neutral, 4—good, 5—excellent)
1. Overall quality of the event?
2. Organization of the event?
3. Confidence in using the otoscope before the event?
4. Confidence in using the otoscope after the event?
5. Confidence in diagnosing middle/external ear disease before the event?
6. Confidence in diagnosing middle/external ear disease after the event?

Please answer the following questions using the free-text box
7. What did you find to be the most useful/beneficial aspect of the event?
8. What did you find to be the least useful aspect of the event?
9. Do you have any other recommendations or comments?
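The agreement between the 2 blinded reviewers' checklist scores was summarized with an intra-class correlation coefficient. A minimal pure-Python sketch of a two-way random-effects, single-rater ICC(2,1) is shown below; the rater scores are hypothetical, not study data:

```python
def icc_2_1(scores):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    `scores` is a list of rows, one per subject, each with one score per rater.
    """
    n = len(scores)           # subjects
    k = len(scores[0])        # raters
    grand = sum(sum(row) for row in scores) / (n * k)
    row_means = [sum(row) / k for row in scores]
    col_means = [sum(scores[i][j] for i in range(n)) / n for j in range(k)]

    ss_total = sum((x - grand) ** 2 for row in scores for x in row)
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)   # between-subject
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)   # between-rater
    ss_err = ss_total - ss_rows - ss_cols

    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical checklist scores from 2 raters for 5 students:
ratings = [[5, 6], [8, 8], [3, 4], [9, 9], [6, 7]]
print(round(icc_2_1(ratings), 2))  # → 0.94 (high agreement)
```

Values near 1 indicate that most score variance comes from differences between students rather than between raters.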

Statistical Analysis

Prism (v7.0, GraphPad, La Jolla, CA, USA) was used for all statistical analyses, with statistical significance set at α = 0.05. Results are reported as mean ± SD. Repeated-measures two-way analysis of variance (ANOVA) was used to analyze between-group differences for diagnostic accuracy, otoscopy clinical skill scores, and students' self-perceived confidence levels at the baseline and post-intervention time points. Intra-subject differences between pre- and post-intervention scores were calculated, with one-way ANOVA performed to analyze differences in diagnostic accuracy and otoscopy skills between groups. Post hoc analysis was performed with the Holm-Bonferroni method. The intra-class correlation coefficient (ICC) was calculated as a measure of inter-rater reliability.
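The Holm-Bonferroni step-down procedure used for the post hoc comparisons can be sketched as follows. The three p-values below are the pairwise otoscopy-skill comparisons reported in the Results, used purely as illustrative inputs (whether those reported values are raw or already adjusted is not stated in the paper):

```python
def holm_bonferroni(pvals, alpha=0.05):
    """Return a rejection flag per hypothesis under Holm's step-down procedure."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])  # ascending p-values
    rejected = [False] * m
    for rank, i in enumerate(order):
        if pvals[i] <= alpha / (m - rank):  # threshold loosens at each step
            rejected[i] = True
        else:
            break  # step-down: once one fails, all larger p-values fail too
    return rejected

# OS vs WM, OS vs SI, WM vs SI (illustrative inputs from the Results section):
print(holm_bonferroni([0.0092, 0.0175, 0.5585]))  # → [True, True, False]
```

Holm's method controls the family-wise error rate like Bonferroni but is uniformly more powerful, since only the smallest p-value is compared against α/m.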

RESULTS

Twenty-nine participants were initially randomized, with 10 in the OS and WM groups, and 9 in the SI group. Two students were unable to attend their specific intervention sessions and withdrew. A total of 27 undergraduate medical students participated in the study (first year—23, second year—4). Final group sizes were: OS—10, WM—9, and SI—8. All students completed their assigned intervention and both testing sessions.

Diagnostic Accuracy

Diagnostic accuracy results (total score of 10) are shown in Figure 2. Improved diagnostic accuracy scores were noted post-intervention (F(1,24) = 34.87, p < 0.0001). Significant increases within all groups were seen: OS (127.78%, 2.30 ± 1.42, p = 0.0006), WM (76.40%, 1.44 ± 1.88, p = 0.0499), and SI (100.00%, 1.50 ± 1.20, p = 0.0093). The type of intervention did not affect diagnostic accuracy (F(2,24) = 0.6611, p = 0.5254), with no differences noted between groups at baseline (p = 0.8150) or post-intervention (p = 0.4100). No interaction was noted between the variables (F(2,24) = 0.9256, p = 0.4100).

FIGURE 2. Diagnostic accuracy scores at baseline (OS—1.80 ± 1.32, WM—1.89 ± 1.27, SI—1.50 ± 1.31) and post-intervention (OS—4.10 ± 0.88, WM—3.33 ± 2.45, SI—3.00 ± 1.31).

Otoscopy Examination Skill

Otoscopy examination skill results (total score of 10) are illustrated in Figure 3. The reliability between raters was excellent (ICC = 0.93, 95% confidence interval = 0.88–0.96, p < 0.0001). Post-intervention otoscopy scores were significantly improved (F(1,24) = 25.15, p < 0.0001). Within-group improvements were noted from OS (77.00%, 3.85 ± 2.55, p < 0.0001) and SI (22.20%, 1.25 ± 1.20, p = 0.0011), with no significant improvement from WM (13.46%, 0.78 ± 1.92, p = 0.1050). Intervention type did not significantly affect otoscopy skills (F(2,24) = 0.89, p = 0.4254); no differences were present at baseline between groups (p = 0.4692). However, an interaction between the variables was noted (F(2,24) = 6.37, p = 0.0060). This was reflected in significantly higher increases in otoscopy skill scores post-intervention in the OS group as compared to WM (p = 0.0092) and SI (p = 0.0175). No difference was noted post-intervention between WM and SI (p = 0.5585).

FIGURE 3. Otoscopy examination scores at baseline (OS—5.00 ± 2.20, WM—5.78 ± 2.05, SI—5.63 ± 1.86) and post-intervention (OS—8.85 ± 0.88, WM—6.56 ± 1.42, SI—6.88 ± 1.41).

Survey Questionnaire

The completion rate of the survey was 100%. On a 5-point Likert-type scale (1 = very poor, 3 = neutral, 5 = excellent), students found the event to be of great quality (4.44 ± 0.51) and well organized (4.63 ± 0.49). Qualitative results from student feedback surrounding the event are listed in Table 3.

Self-perceived confidence in diagnostic accuracy is illustrated in Figure 4. The type of intervention did not affect confidence (F(2,24) = 0.89, p = 0.4150). No differences between groups were noted at baseline (p = 0.4068) or post-intervention (p = 0.7715). Diagnostic accuracy confidence improved post-intervention (F(1,24) = 51.35, p < 0.0001). All groups reported significant improvements: OS (66.67%, 1.50 ± 0.85, p = 0.0010), WM (92.57%, 0.89 ± 0.92, p = 0.0005), and SI (84.36%, 1.25 ± 0.71, p = 0.0001). The interaction between the variables (F(2,24) = 0.08, p = 0.9250) was not significant.

FIGURE 4. Diagnostic confidence at baseline (OS—1.80 ± 0.63, WM—1.44 ± 0.53, SI—1.63 ± 0.52) and post-intervention (OS—3.00 ± 0.67, WM—2.78 ± 1.09, SI—3.00 ± 0.00).

Student-reported confidence with the otoscopy examination is shown in Figure 5. Similarly, the type of intervention did not affect otoscopy confidence (F(2,24) = 0.93, p = 0.4005), with no between-group differences noted at baseline (p = 0.5509) or post-intervention (p = 0.1315). Otoscopy confidence increased post-intervention (F(1,24) = 28.35, p < 0.0001), with improvements within all groups: OS (55.56%, 1.20 ± 0.79, p = 0.0003), WM (33.29%, 1.33 ± 0.71, p = 0.0207), and SI (50.00%, 1.38 ± 0.52, p = 0.0016). The interaction of the variables (F(2,24) = 0.60, p = 0.5509) was not significant.

FIGURE 5. Otoscopy confidence scores at baseline (OS—2.70 ± 0.82, WM—2.67 ± 1.12, SI—2.50 ± 0.93) and post-intervention (OS—4.20 ± 0.42, WM—3.56 ± 1.01, SI—3.75 ± 0.46).

Table 3. Qualitative Survey Results

Most useful aspect of the event
"practicing on a real patient"
"observing anatomic variations between different ears"
"re-examining the same patients after the intervention"
"learning about different disease presentations of the ear"
Least useful aspect of the event
"unable to take notes"
"uncomfortable with otoscopy usage during the first patient encounter prior to intervention"
Comments/recommendations
"to have the ability to re-examine patients after knowing the diagnosis"
"well done, great overall learning experience"

DISCUSSION

This prospective RCT was the first of its kind to assess Otolaryngology knowledge and skill transfer from 3 different ear disease teaching modalities with real patients. The transferability of knowledge and skills into real-world scenarios is the ultimate test for assessing the utility and effectiveness of any educational modality.15,16 Especially for simulation-based training, the fundamental assumption is that the skills and knowledge acquired are directly transferable to a clinical setting. This study reaffirmed several findings previously known in simulation-based education, with results that highlight the superiority of simulation-based teaching in the domain of skills transfer.

A primary strength of this study was that the diagnostic accuracy of all students was based on their assessment of the same 5 volunteer patients. True randomization was performed, as reflected in the similarity of baseline diagnostic accuracy across all the intervention groups. All groups scored poorly at baseline, but statistically significant improvements were achieved by all groups after the intervention, emulating findings from our previous study.12 The overall improvement in diagnostic accuracy across all groups was a mean of 1.78 ± 1.53 points out of 10. To evaluate the clinical significance of these diagnostic accuracy improvements for real-world patient benefit, it will be important to relate the changes seen to an external measure.19 Establishing an external anchor of students' performance outside of the study context, such as assessment of global performance within the Otolaryngology rotation, may help to further explain the significance of these improvements.19

We also found no significant post-intervention difference in diagnostic accuracy scores between groups. All study interventions were standardized for time and utilized the same 25 images of middle/external ear pathologies and variants of normal. The only factor that changed between intervention arms was the method of delivery of the information. Previously, we reported that the OS and WM groups outperformed SI in diagnostic accuracy when re-examined on the otoscopy simulator.12 As a limitation of any study that tests on a measure included as part of the intervention, the results may be biased toward that particular group due to increased exposure.12,13 Because we met the number of participants required by our sample size calculation, it appears that the type of teaching modality does not affect the transfer of knowledge when tested with real patients.

The assessment of otoscopy skill was performed in an intervention-blinded fashion, by 2 independent reviewers, based on the video recording of one patient assessment station. The same volunteer patient was video-recorded for all baseline and post-intervention testing sessions to maintain consistency across all students. The inter-rater reliability was high. The baseline scores had an average of 5.44 ± 2.04 among all students, which was similar to our previously reported baseline of 6.62 ± 2.18.12 The finding of OS scoring significantly higher than WM and SI also repeated the results of our previous study.12 Statistically significant post-intervention improvements were noted within the OS and SI groups. Similar to diagnostic accuracy, an external anchor may be required to further validate the clinical significance of these improvements in the context of students' global clinical performance.19

Factors unique to the OS group may have contributed to its improvement over WM and SI in otoscopy skill. These included actual usage of the otoscope during the intervention and the availability of immediate feedback from the instructor on proper otoscope usage, both of which are known to contribute to improved accuracy in the performance of a skilled task.20,21 The finding of superiority in the OS group's otoscopy skill is important for 2 reasons: (1) OS may offer the greatest transfer of skills from simulation to real patients, and (2) performance on the OS may be correlated with performance on real patients. Combined, the findings suggest that the otoscopy simulator provides the most optimal transfer of otoscopy examination skills, and may be the best substitute for real patients during the clinical assessment and testing of medical students.

The format of the session itself, with baseline OSCE-style testing, intervention, and post-intervention OSCE-style testing in that order, also made for an experiential learning experience that was very well received by the students, as indicated by the survey responses. The event was equally well received by the volunteer patients, many of whom stated to study investigators that it was a great opportunity to contribute to the learning and education of medical students. Many of the students indicated that being able to assess real patients before and after the intervention allowed them not only to practice their otoscopy skills, but also to become exposed to otological diseases and normal anatomy. Given the positive student and patient feedback, the authors plan similar future events as part of the undergraduate Otolaryngology curriculum at Queen's University.

This study has potential limitations. Students were only assessed immediately following the intervention, and not on long-term knowledge and skills retention. This decision was made for 2 main reasons: (1) the authors did not find a significant difference between immediate post-intervention and 3-month follow-up in our previous assessment of diagnostic accuracy and otoscopy skill,12 and (2) more practically, coordinating the same volunteer patients and students would have been too onerous for the participants. Additionally, all of the patients had chronic otological conditions, limiting the scope of the pathologies encountered. However, recruiting volunteer patients with acute untreated middle and/or external ear diseases was considered neither ethical nor practical. Moreover, the lack of practical skills training within the WM and SI teaching arms may have affected the development of otoscopy skills within these groups. The authors decided to exclude hands-on otoscopy training from the WM and SI arms based on current trends in undergraduate Otolaryngology teaching across Canadian medical schools, many of which do not include practical hands-on training.22 Interventions were also kept similar to our previous study, allowing for direct comparisons to be made.12 Future studies can aim to directly compare simulation with practical skills training within small groups, allowing for a direct evaluation of the simulator's benefits.
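One simple way to contextualize the reported within-group gains, pending the external anchor discussed above, is a standardized change score (mean change divided by the SD of the change). The sketch below uses only the summary statistics reported in the Results for otoscopy skill; it is an illustration, not an analysis of the raw data:

```python
def standardized_change(mean_change, sd_change):
    """Within-group standardized effect: mean of paired differences / their SD."""
    return mean_change / sd_change

# Reported otoscopy-skill changes (mean +/- SD) per group:
changes = {"OS": (3.85, 2.55), "SI": (1.25, 1.20), "WM": (0.78, 1.92)}
for group, (m, s) in changes.items():
    print(f"{group}: d = {standardized_change(m, s):.2f}")
# OS → 1.51, SI → 1.04, WM → 0.41
```

By this crude metric the OS gain is a very large effect, consistent with the significant interaction favoring the simulation group.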

CONCLUSION

The results of this study demonstrated an improvement in diagnostic accuracy of ear pathologies across all groups when tested with real patients. The simulation group demonstrated the most improved otoscopy skills. This study confirms the clinical relevance of otoscopy simulation. The authors believe OS is a vital part of Otolaryngology undergraduate medical education.

ACKNOWLEDGMENTS

The authors would like to thank the Queen's University Clinical Simulation Centre for their support through the Medical Student Simulation Research Grants to V.W. and J.S. This study received material support (use of OtoSim) from the Department of Otolaryngology, Queen's University.

REFERENCES

1. Johnson E. Surgical simulators and simulated surgeons: reconstituting medical practice and practitioners in simulations. Soc Stud Sci. 2007;37(4):585-608.
2. Dent JA. Current trends and future implications in the developing role of clinical skills centres. Med Teach. 2001;23:483-489.
3. Okuda Y, Bryson EO, DeMaria S, et al. The utility of simulation in medical education: what is the evidence? Mt Sinai J Med. 2009;76(4):330-343.
4. Thone N, Winter M, Garcia-Matte RJ, González C. Simulation in otolaryngology: a teaching and training tool. Acta Otorrinolaringol Esp. 2017;68(2):115-120.
5. Wiet GJ, Stredney D, Wan D. Training and simulation in otolaryngology. Otolaryngol Clin North Am. 2011;44(6):1333-1350.
6. Stepniak C, Wickens B, Husein M, et al. Blinded randomized controlled study of a web-based otoscopy simulator in undergraduate medical education. Laryngoscope. 2017;127(6):1306-1311.
7. VRmagic. Earsi Otoscope 2017. Available at: https://www.vrmagic.com/simulators/simulators/earsi-otoscope/. Accessed 12.05.17.
8. Morris E, Kesser BW, Peirce-Cottler S, Keeley M. Development and validation of a novel ear simulator to teach pneumatic otoscopy. Simul Healthc. 2012;7(1):22-26.
9. Wickens B, Lewis J, Morris DP, Husein M, Ladak HM, Agrawal SK. Face and content validity of a novel, web-based otoscopy simulator for medical education. J Otolaryngol Head Neck Surg. 2015;44:7.
10. Campisi P, Tirado Y, Chadha NK, Forte V. Otoscopy simulation: a new paradigm in undergraduate medical education. Laryngoscope. 2011;121(S5):S246.
11. Davies J, Djelic L, Campisi P, Forte V, Chiodo A. Otoscopy simulation training in a classroom setting: a novel approach to teaching otoscopy to medical students. Laryngoscope. 2014;124(11):2594-2597.
12. Wu V, Beyea JA. Evaluation of a web-based module and an otoscopy simulator in teaching ear disease. Otolaryngol Head Neck Surg. 2017;156(2):272-277.
13. Oyewumi M, Brandt MG, Carrillo B, et al. Objective evaluation of otoscopy skills among family and community medicine, pediatric, and otolaryngology residents. J Surg Educ. 2015;7204:198-205.
14. Lee DJ, Fu TS, Carrillo B, Campisi P, Forte V, Chiodo A. Evaluation of an otoscopy simulator to teach otoscopy and normative anatomy to first year medical students. Laryngoscope. 2015;125(9):2159-2162.
15. Beard JD. Assessment of surgical skills of trainees in the UK. Ann R Coll Surg Engl. 2008;90(4):282-285.
16. Tay C, Khajuria A, Gupte C. Simulation training: a systematic review of simulation in arthroscopy and proposal of a new competency-based training framework. Int J Surg. 2014;12(6):626-633.
17. McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003–2009. Med Educ. 2010;44(1):50-63.
18. Beyea JA, Wong E, Bromwich M, Weston WW, Fung K. Evaluation of a particle repositioning maneuver web-based teaching module. Laryngoscope. 2008;118(1):175-180.
19. Dwyer T, Wadey V, Archibald D, et al. Cognitive and psychomotor entrustable professional activities: can simulators help assess competency in trainees? Clin Orthop Relat Res. 2016;474(4):926-934.
20. Ericsson KA. Deliberate practice and acquisition of expert performance: a general overview. Acad Emerg Med. 2008;15(11):988-994.
21. Holland JP, Waugh L, Horgan A, Paleri V, Deehan DJ. Cadaveric hands-on training for surgical specialties: is this back to the future for surgical skills development? J Surg Educ. 2011;68(2):110-116.
22. Fung K. Otolaryngology–head and neck surgery in undergraduate medical education: advances and innovations. Laryngoscope. 2015;125(S2):S1-S14.