
Innovations in Radiology Education

Developing and Evaluating a Simulator for Complex IVC Filter Retrieval

Nam S. Hoang, BA, Benjamin H. Ge, MD, William T. Kuo, MD

Rationale and Objectives: Simulation models allow trainees to acquire and develop procedural skills without compromising patient safety. Complex inferior vena cava (IVC) filter retrieval requires the operator to be proficient with devices such as endobronchial forceps and with advanced techniques to carefully dissect free embedded filter tips encased in fibrous tissue adherent to the IVC. Therefore, it is important to develop an effective, inexpensive model to simulate tip-embedded IVC filter retrieval.

Materials and Methods: Silicone tubes (Flexi-Seal SIGNAL, ConvaTec Inc., Skillman, NJ), IVC filters (Cook Günther Tulip Vena Cava Filter, Cook Medical, Bloomington, IN), and endobronchial forceps (Lymol Medical, Woburn, MA) were obtained to assemble the model. A total of 12 combinations of adhesive binding methods were used to adhere IVC filter fragments to the silicone tubes, and these were blind tested. A single operator with over 10 years of experience using forceps scored the adhesives subjectively on a three-point scale for adherence, elasticity, and tactile feel. The adhesive most similar to IVC fibrous tissue was selected to assemble the final tip-embedded IVC filter model. Twenty trainees were then assigned to practice on the model. A three-point scoring metric measured trainee confidence before and after training on the model.

Results: Sil-Poxy Silicone Adhesive (Smooth-On, Macungie, PA) was found to be the most similar to human IVC fibrous tissue, with an average score of 3 of 3 on all metrics. Comparing scores from before and after use of the model, trainee confidence improved significantly (p < 0.01) in all three categories: from 1.20 to 2.10 (handling forceps), 1.05 to 2.15 (understanding the tactile feel of fibrous tissue), and 1.05 to 1.70 (overall confidence).

Conclusion: The development of a low-cost simulator for embedded IVC filters is feasible, and the simulator can be used to improve trainee confidence and skill for complex IVC filter retrieval.

Key Words: Low-Fidelity; Simulation; Training Model; IVC Filter; Retrievals.

© 2019 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.



From the Division of Vascular and Interventional Radiology, Stanford University School of Medicine, 300 Pasteur Drive, Room H3651, Stanford, CA 94305. Received April 15, 2019; revised August 22, 2019; accepted August 25, 2019. Address correspondence to: W.T.K. e-mail: [email protected]




INTRODUCTION

Clinical training has seen a paradigm shift in the past few decades as the Halstedian master-apprentice model has made way for competency-based medical education and simulation-based mastery learning [1–4]. With both the ACGME (Accreditation Council for Graduate Medical Education)-mandated restriction in resident work hours and the promotion of brief hospital stays, trainees are receiving their clinical education in the shortest amount of time and with only the sickest of patients [5–7]. Simulation has been proposed as a solution to improve training and patient safety, but as evaluation shifts from length of observership to quantitative measures of performance on simulators, training is becoming fragmented [8,9].

As simulation changes the landscape of education, fidelity has been debated as a factor in the transfer of learning. Fidelity refers to the quality of a simulator's visual representation, with low fidelity typically used for skills-based procedural training and high fidelity capturing a broader range of skills-, knowledge-, and rule-based cognitive training [10–15]. Low-fidelity simulators have been criticized for lacking the ability to transfer haptic cues and for not replicating the fine motor actions of real procedures [14,16,17], such that their usefulness is confined to the cognitive aspects of procedural skills development. However, some low-fidelity simulators have been shown to be just as effective as high-fidelity simulators in many contexts [11,18–23]. Some authors have distinguished between structural fidelity (how the simulator looks) and functional fidelity (how it works) and have proposed the possibility of a simulation with low structural fidelity and high functional fidelity [24,25]. High-fidelity simulations such as immersive virtual reality models are growing in popularity, but their cost often runs upwards of six figures [26], which is not feasible across a variety of procedures, departments, and academic centers. Therefore, the aim of our study was to develop and evaluate a low-cost training model that could act as a simulator with low structural and high functional fidelity.


METHODS

This study was granted an institutional review board exemption and a waiver of consent.

To construct the model, silicone tubes (Flexi-Seal SIGNAL, ConvaTec Inc., Skillman, NJ) were strung within a wooden box (2 ft x 1 ft x 0.75 ft) through circular cutouts at each end (Fig 1). Each end of the silicone tube was fastened to either end of the wooden box such that the tube ends were exposed. An LED light (A4 Pad, Amazon, Seattle, WA) was placed in a slot at the lower end of the box to illuminate the filter within the silicone tube. The total cost of the materials used to construct the model was approximately $80, which covered use of the model by all trainees; this cost did not include the expired inferior vena cava (IVC) filter or the silicone tubes.

Several adhesives were tested in different combinations to adhere the IVC filter tip to the silicone tube. The adhesives tested included Liquid Rubber (Smooth-On, Macungie, PA), Sil-Poxy Silicone Adhesive (Smooth-On, Macungie, PA), Gorilla Glue (Gorilla Glue Inc., Cincinnati, OH), Sugru (FORMFORMFORM, Hackney, East London), Silicone Thixotropic Agent (Thi-Vex, Smooth-On, Macungie, PA), and Barge Cement (Barge, North Brookfield, MA). Twelve combinations of adhesive binding methods were blind tested by an expert with over 10 years of experience with complex IVC filter retrieval (W.K.). Endobronchial forceps (Lymol Medical, Woburn, MA) were used to remove the embedded filter tip. The expert evaluated each adhesive method on a three-point scale for adherence, elasticity, and tactile feel relative to IVC fibrous tissue (Appendix 1), as sketched below. The binding method was not revealed to the expert until scoring was complete. The adhesive most similar to IVC fibrous tissue was used to secure the IVC filter (Cook Günther Tulip Vena Cava Filter, Cook Medical, Bloomington, IN) within the silicone tube for the tip-embedded IVC filter training model.

Seven medical students, three nurse practitioners, four radiology residents, and six interventional radiology fellows, all affiliated with or rotating through the IR section, were assigned to practice on the model. The trainees engaged with the practice model under the supervision and guidance of the expert (W.K.). Each trainee was given two opportunities to retrieve the filter and was allowed as much time as needed, typically about 5 minutes (time was not formally measured). A three-point scoring metric was used to subjectively measure trainee confidence pre- and postintervention (Appendix 2). Trainees rated their confidence in handling endobronchial forceps, in understanding the tactile feel/force required to remove tip-embedded IVC filters, and in the retrieval technique overall, both before and after engaging with the practice model. A paired Wilcoxon signed-rank test was used to compare the pre- and postintervention surveys (illustrated below).

Figure 1. A silicone tube is strung within the training model such that the tube is open at either end of the box and an IVC filter can be adhered inside one end. An LED light slides into a slot at the lower end of the box, illuminating the inside of the tubing. Abbreviations: IVC, inferior vena cava.
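The adhesive selection step amounts to tabulating the blinded three-point ratings across the three criteria and keeping the highest-scoring binding method. The following minimal Python sketch shows that bookkeeping; the adhesive names come from the Methods, but all ratings other than Sil-Poxy's reported 3/3/3 are hypothetical placeholders rather than the study's blinded data.

# Sketch of the blinded adhesive-scoring bookkeeping (hypothetical data).
# Each binding method receives 1-3 points for adherence, elasticity, and
# tactile feel; the method most similar to IVC fibrous tissue is kept.
CRITERIA = ("adherence", "elasticity", "tactile_feel")
scores = {  # placeholder ratings for illustration only
    "Sil-Poxy silicone adhesive": {"adherence": 3, "elasticity": 3, "tactile_feel": 3},
    "Gorilla Glue": {"adherence": 3, "elasticity": 1, "tactile_feel": 1},
    "Liquid Rubber": {"adherence": 2, "elasticity": 2, "tactile_feel": 2},
}
def total_score(method: str) -> int:
    """Sum the three-point ratings for one binding method."""
    return sum(scores[method][criterion] for criterion in CRITERIA)
best = max(scores, key=total_score)
print(f"Highest-rated binding method: {best} ({total_score(best)}/9)")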

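The pre/post comparison described in the Methods can be run with any standard statistics package. Below is a minimal sketch using SciPy's paired Wilcoxon signed-rank test on hypothetical 3-point confidence ratings for 20 trainees; the study's individual survey responses are not reproduced here.

# Paired Wilcoxon signed-rank test on pre- vs post-training confidence,
# as described in the Methods. The ratings are hypothetical placeholders
# on the study's 3-point scale, not the actual survey responses.
from scipy.stats import wilcoxon
pre_forceps = [1, 1, 2, 1, 1, 2, 1, 1, 1, 2, 1, 1, 1, 2, 1, 1, 1, 1, 2, 1]
post_forceps = [2, 2, 3, 2, 2, 3, 2, 2, 2, 3, 2, 1, 2, 3, 2, 2, 2, 2, 3, 2]
statistic, p_value = wilcoxon(pre_forceps, post_forceps)
print(f"Wilcoxon statistic = {statistic:.1f}, p = {p_value:.4f}")

Because the confidence ratings sit on a short ordinal scale, a nonparametric paired test such as the Wilcoxon signed-rank test is a reasonable choice, since the score differences cannot be assumed to be normally distributed.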
RESULTS

The Sil-Poxy silicone adhesive received the highest rating of the 12 adhesive methods tested, with a score of three in each category: adherence, elasticity, and tactile feel. Each of the 20 trainees successfully removed the IVC filter from the training model. Comparing scores from before and after use of the model, trainee confidence improved from 1.20 to 2.10 (handling forceps), 1.05 to 2.15 (understanding the tactile feel of fibrous tissue), and 1.05 to 1.70 (overall confidence). There was a statistically significant increase in each of the three confidence metrics (p ≤ 0.01) (Table 1). The largest increase in mean score was for understanding the tactile feel/force required to remove tip-embedded IVC filters. There was no significant difference in score increases between any two levels of trainees (p > 0.05), as determined by pairwise comparisons (illustrated after Table 1).

Trainees provided comments in the free-text portion at the end of the survey. Comments recommended the use of simulation training models earlier in medical education and for a variety of different procedures, particularly other endovascular interventions. One comment asked for a high-fidelity version of the model for upper-level trainees.

TABLE 1. Ratings of Confidence Pre- and Postintervention

Confidence Metric                        Pre     Post    p Value
Handling endobronchial forceps           1.20    2.10    ≤0.01
Tactile feel of fibrous tissue           1.05    2.15    ≤0.01
Retrieving tip-embedded IVC filters      1.05    1.70    ≤0.01

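The pairwise comparison between trainee levels reported above is illustrated by the sketch below. The paper does not name the specific pairwise test used, so this sketch assumes a two-sided Mann-Whitney U comparison of per-trainee score changes, with hypothetical group data.

# Pairwise comparison of confidence-score changes between trainee groups.
# The per-trainee deltas are hypothetical, and the Mann-Whitney U test is
# an assumed choice; the paper does not name the pairwise test it used.
from itertools import combinations
from scipy.stats import mannwhitneyu
score_changes = {
    "medical students": [1, 1, 2, 1, 1, 1, 2],
    "radiology residents": [1, 2, 1, 1],
    "IR fellows": [1, 1, 0, 1, 1, 2],
}
for (name_a, a), (name_b, b) in combinations(score_changes.items(), 2):
    stat, p = mannwhitneyu(a, b, alternative="two-sided")
    print(f"{name_a} vs {name_b}: U = {stat:.1f}, p = {p:.3f}")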

DISCUSSION

Medical procedural training has traditionally been taught from master to apprentice, such that the transmission of information occurs through direct observation and feedback [27]. This transmission of information can also occur through a physical model in a controlled setting without risk to a patient. Specific features of a procedure can be targeted and replicated within low-fidelity simulators so that trainees can gain haptic experiences typically associated with high-fidelity simulators.

The distinction between procedural versus cognitive, skills-based versus knowledge-based, and low- versus high-fidelity may be too divisive, as many procedures require both the cognition and the expert judgment that simulations seek to transfer from expert to learner [28,29]. This is especially pertinent in interventional radiology training, where proper procedure execution relies on learning diagnostic imaging interpretation along with the development of procedural skills [9,30]. Although expensive simulators exist [24,31], there is a lack of low-cost endovascular training models with high functional fidelity.

Complex IVC filter retrievals are potentially high-risk procedures during which endobronchial forceps are used to attempt filter tip detachment without injury to the underlying vessel. A simulator for this procedure allows trainees to practice this advanced procedure in a safe environment without risk of patient injury. The training model used in this study maintained the benefits of a low-fidelity model: it is low cost (approximately $80), easily constructed, and easily transportable. In terms of preparation, it took about 10 minutes for the Sil-Poxy to set completely, but multiple tubes with attached filters can be prepared in advance, and it took less than a minute to attach a new tube. Therefore, provided several tubes and filters are available, a dozen students can train within a 1-hour span.

Although low-fidelity models have traditionally been criticized for lacking the haptic feedback essential for endovascular skills education [14,17], our low-fidelity model was able to transfer haptic knowledge to trainees, allowing them to familiarize themselves with endobronchial forceps and to understand the procedural feel and tension during attempted embedded IVC filter removals. Furthermore, trainees could use the model for practice in a low-stress environment without relying on a supervising physician, as would be required in a live case.

The popularity of simulation-based mastery learning has prompted a quantitative approach to evaluation, often measured as the time required to complete certain tasks or components of a procedure. This training model creates avenues for quantitative evaluation, such as the time required to detach the embedded tip, the time to remove the filter from the model, and the extent of damage to the vessel wall. However, the extent to which these quantitative measures relate to clinical competence is limited [5,32].

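If those quantitative measures were adopted, each practice attempt could be logged with a small record such as the one sketched below. This is purely illustrative: the study itself did not time attempts or grade vessel damage, and the field names and damage grades are hypothetical.

# Hypothetical per-attempt record for the quantitative measures proposed
# above (tip-detachment time, total retrieval time, vessel-wall damage).
# The study did not collect these metrics; this only illustrates how a
# future simulator session could log them.
from dataclasses import dataclass
import time
@dataclass
class RetrievalAttempt:
    trainee_id: str
    seconds_to_detach_tip: float
    seconds_to_remove_filter: float
    vessel_damage_grade: int  # e.g. 0 = none, 1 = minor, 2 = tube perforated
start = time.monotonic()
# ... trainee dissects the embedded tip free with the forceps ...
tip_detached = time.monotonic()
# ... trainee withdraws the filter from the silicone tube ...
filter_removed = time.monotonic()
attempt = RetrievalAttempt(
    trainee_id="trainee_01",
    seconds_to_detach_tip=tip_detached - start,
    seconds_to_remove_filter=filter_removed - start,
    vessel_damage_grade=0,
)
print(attempt)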
Low-fidelity simulators are undeniably useful both in cost efficiency and in clinical skills development [33–35]. The free-text responses in this study suggest that lower-fidelity models are useful for trainees earlier in their education, while higher-fidelity models may be better suited to more advanced trainees who will soon be required to participate actively in procedures. Additionally, advanced trainees will have had more experience in the operating room and with patients, and may experience a greater degree of discordance between the simulation and their procedural experiences. Simulations necessarily lack an immeasurable quality of the real experience. The increase in confidence scores across all trainees, however, demonstrates that regardless of structural fidelity, models with high functional fidelity are useful across the entire spectrum of medical trainees.

Simulations have been praised for contributing to standardization in medical education, but they have also been criticized for destroying it [7,8,16,36]. However, if used appropriately, simulators have the potential to provide both standardization and customization in medical education. Since low-fidelity instruments require fewer resources to produce, they represent an opportunity for increasing accessibility for medical trainees and for teaching a wide range of procedures. For example, our training model has the potential not only for continued development in teaching IVC filter removal, but also for simulating other vascular interventions such as balloon angioplasty and stent deployment.

This study has important limitations. Our training model was visualized directly, without the use of live fluoroscopy or video, and this lack of realism was a limitation. However, the addition of fluoroscopy or video would have increased costs, and it is unclear whether this would have significantly improved the user experience. Also, the experience level of our trainees may have affected the change in the confidence metric; for a trainee without any prior experience using forceps, any introduction could potentially improve confidence. Although the silicone tubing was never damaged during training, it is unknown whether the tube shares similar durability with the human IVC. Finally, the study did not investigate the effect of simulator training on real-world patient care. Although we assumed that increased trainee confidence would translate into better clinical outcomes, this warrants further study.

CONCLUSION

The development of a low-cost simulator for embedded IVC filters is feasible, and the simulator can be used to confer haptic knowledge, offer repetitive skills training, and improve trainee confidence in attempting complex IVC filter retrieval.

AUTHOR CONTRIBUTIONS

Nam S. Hoang built the simulation, administered the questionnaires, and wrote the initial draft of the manuscript. Benjamin H. Ge wrote the questionnaires and revised the manuscript. William T. Kuo designed the simulation, scored the adhesives, revised the manuscript, and approved the final draft.


REFERENCES

1. Willis MH, Frigini LA, Lin J, et al. Clinical decision support at the point-of-order entry: an education simulation pilot with medical students. Acad Radiol 2016; 23:1309–1318.
2. Picard M, Nelson R, Roebel J, et al. Use of low-fidelity simulation laboratory training for teaching radiology residents CT-guided procedures. J Am Coll Radiol 2016; 13:1363–1368.
3. McGaghie WC, Issenberg SB, Cohen ER, et al. Does simulation-based medical education with deliberate practice yield better results than traditional clinical education? A meta-analytic comparative review of the evidence. Acad Med 2011; 86:706–711.
4. Denadai R, Saad-Hossne R, Toledo AP, et al. Low-fidelity bench models for basic surgical skills training during undergraduate medical education. Rev Col Bras Cir 2014; 41:137–145.
5. Gould DA, Reekers JA, Kessel DO, et al. Simulation devices in interventional radiology: validation pending. J Vasc Interv Radiol 2006; 17:215–216.
6. Dawson S, Gould DA. Procedural simulation's developing role in medicine. Lancet 2007; 369:1671–1673.
7. Becker GJ. Simulation and the coming transformation of medical education and training. Radiology 2007; 245:7–9.
8. Gunderman RB. Competency-based training: conformity and the pursuit of educational excellence. Radiology 2009; 252:324–326.
9. Patel AA, Gould DA. Simulators in interventional radiology training and evaluation: a paradigm shift is on the horizon. J Vasc Interv Radiol 2006; 17:S163–S173.
10. Patel AA, Glaiberman C, Gould DA. Procedural simulation. Anesthesiol Clin 2007; 25:349–359.
11. Klein KA, Neal CH. Simulation in radiology education: thinking outside the phantom. Acad Radiol 2016; 23:908–910.
12. Dankelman J, Wentink M, Grimbergen CA, et al. Does virtual reality training make sense in interventional radiology? Training skill-, rule- and knowledge-based behavior. Cardiovasc Intervent Radiol 2004; 27:417–421.
13. Gaca AM, Frush DP, Hohenhaus SM, et al. Enhancing pediatric safety: using simulation to assess radiology resident preparedness for anaphylaxis from intravenous contrast media. Radiology 2007; 245:236–244.
14. Gould D. Using simulation for interventional radiology training. Br J Radiol 2010; 83:546–553.
15. Mills BW, Carter OB-J, Rudd CJ, et al. Effects of low- versus high-fidelity simulations on the cognitive burden and performance of entry-level paramedicine students: a mixed-methods comparison trial using eye-tracking, continuous heart rate, difficulty rating scales, video observation and interviews. Simul Healthc 2016; 11:10–18.
16. Gould DA, Kessel DO, Healey AE, et al. Simulators in catheter-based interventional radiology: training or computer games? Clin Radiol 2006; 61:556–561.
17. Chetlen AL, Mendiratta-Lala M, Probyn L, et al. Conventional medical education and the history of simulation in radiology. Acad Radiol 2015. doi:10.1016/j.acra.2015.07.003.


18. de Giovanni D, Roberts T, Norman G. Relative effectiveness of high- versus low-fidelity simulation in learning heart sounds. Med Educ 2009; 43:661–668.
19. Lee JY, McFadden KL, Gowen CR 3rd. An exploratory analysis for Lean and Six Sigma implementation in hospitals: together is better? Health Care Manage Rev 2016. doi:10.1097/HMR.0000000000000140.
20. Basak T, Unver V, Moss J, et al. Beginning and advanced students' perceptions of the use of low- and high-fidelity mannequins in nursing simulation. Nurse Educ Today 2016; 36:37–43.
21. Munshi F, Lababidi H, Alyousef S. Low- versus high-fidelity simulations in teaching and assessing clinical skills. J Taibah Univ Med Sci 2015; 10:12–15.
22. Chen R, Grierson LE, Norman GR. Evaluating the impact of high- and low-fidelity instruction in the development of auscultation skills. Med Educ 2015; 49:276–285.
23. Norman G, Dore K, Grierson L. The minimal relationship between simulation fidelity and transfer of learning. Med Educ 2012; 46:636–647.
24. Hamstra SJ, Brydges R, Hatala R, et al. Reconsidering fidelity in simulation-based training. Acad Med 2014; 89:387–392.
25. Maran NJ, Glavin RJ. Low- to high-fidelity simulation - a continuum of medical education? Med Educ 2003; 37(Suppl 1):22–28.
26. Al-Elq AH. Simulation-based medical teaching and learning. J Fam Community Med 2010; 17:35–40.
27. Collins J, de Christenson MR, Gray L, et al. General competencies in radiology residency training: definitions, skills, education and assessment. Acad Radiol 2002; 9:721–726.
28. Kneebone R. Simulation and transformational change: the paradox of expertise. Acad Med 2009; 84:954–957.
29. Andersen DK. How can educators use simulation applications to teach and assess surgical judgment? Acad Med 2012; 87:934–941.
30. Shanks D, Wong RY, Roberts JM, et al. Use of simulator-based medical procedural curriculum: the learner's perspectives. BMC Med Educ 2010; 10:77.
31. Beydoun T, Beydoun A, Towbin R. Abstract No. 576 - Training beyond the interventional suite: application of Google Glass in the education of future interventional radiologists. J Vasc Interv Radiol 2016; 27:S254.
32. Dawson S. Procedural simulation: a primer. Radiology 2006; 241:17–25.
33. Evans LV, Dodge KL, Shah TD, et al. Simulation training in central venous catheter insertion: improved performance in clinical practice. Acad Med 2010; 85:1462–1469.
34. Manley KM, Park CH, Medland VL, et al. The training value of a low-fidelity cervical biopsy workshop. Simul Healthc 2015; 10:116–121.
35. McCaslin J, Trimmer CK, Reddick M, et al. Abstract No. 286 - Comparing capture efficiency of retrievable IVC filters in an in-vitro model of venous thromboembolism. J Vasc Interv Radiol 2014; 25:S133.
36. Ahmad R, Alhashmi G, Ajlan A, et al. Impact of high-fidelity transvaginal ultrasound simulation for radiology on residents' performance and satisfaction. Acad Radiol 2015; 22:234–239.