Interventional Radiology Simulation: Prepare for a Virtual Revolution in Training

Derek Alan Gould, FRCP, FRCR

It is becoming increasingly difficult to learn interventional radiology (IR) skills: there are fewer “straightforward” invasive diagnostic imaging studies, less time available for training, concerns about patient safety, and changing patient perceptions. Computer-based simulation has the potential to allow an operator to realistically perform a virtual procedure with feedback about performance, and could remove at least some of the patient’s role during the learning curve. To do this effectively requires a strategy for integrating simulator models into curricula and the development of standards for their validation.

J Vasc Interv Radiol 2007; 18:483–490

Abbreviations: IR = interventional radiology, VE = virtual environment

From the Department of Interventional Radiology, Royal Liverpool University Hospital, Prescot St, Liverpool L7 8XP, England. Received November 7, 2006; accepted November 8, 2006. Address correspondence to the author; E-mail: [email protected]. Supported by the Royal College of Radiologists Xappeal Fund (2003). The author has identified no conflicts of interest. © SIR, 2007. DOI: 10.1016/j.jvir.2006.11.005

EVEN today, medical skills are learned by practicing on patients in an apprenticeship, and this is no less the case for interventional radiology (IR) (1). To avoid isolating the acquisition of these skills from essential knowledge and rules, training is carried out in carefully structured curricula that also help develop the behavior and attitudes essential to professionalism (2). When we first learned to ride a bicycle, it was in a quiet playground or side street: safety dictated that the essential skills of remaining upright while pedaling and steering become automatic before cycling on a busy main road. Once these basic skills are mastered, automation allows attention to be given to the complexities of traffic. If we have not cycled for a few years, we might well wish to check and refresh our core skills in a quiet place before venturing onto a busy road.

In IR, the core skills of needle guidance using touch, imaging, and wire and/or catheter manipulation should first be practiced until automated, to avoid exceeding the learner’s attention capacity when moving on to more complex skills in patients (3,4). Once acquired, core and complex technical skills for IR are maintained by regular practice or may need to be refreshed by repeat training. The training and maintenance of core skills underpin IR practice, yet a dearth of invasive diagnostic work in the wake of the imaging revolution has reduced the opportunities for basic training. European and other working time regulations (5,6), together with a schedule for modernizing medical careers, have further condensed the time available to train. In addition, it is a paradox that although we should “first do no harm,” an essential component of learning on patients is the experience of feedback on errors made. At the same time, apprenticeship training prolongs procedures and occupies expert mentors, making patient treatment more expensive (7,8). On top of all this, the assessment of proficiency is an essential part of a curriculum, providing evidence for certification and the feedback that motivates the trainee to learn. However,

there are still no objective assessment methods in routine use for IR skills. There is therefore a pressing need for the imaging revolution to be closely followed by a revolution in training and assessing IR skills, with precisely and accurately defined minimum standards for success and an alternative to patient-centered learning.

Possible alternatives include various simulations such as models, animals, and computer-based methods. Simple deformable models of anatomy have been used to train and assess in surgery (9,10). They can also be punctured by needles under ultrasonographic (US) guidance (11) or act as a conduit for training catheter and guide wire skills. Rapid prototyping models can faithfully reproduce anatomy from computed tomography (CT) data, with pumps circulating fluid to mimic blood flow and permit contrast media injections, and with realistic guidance using through-transmission of light or even real fluoroscopy (12,13). Such models, however, are expensive, are destroyed by multiple needle punctures, and lack a facility to easily alter anatomy and pathology. Training can also use animals, which provide realistic physiology and “feel,” although it is difficult to reproduce human pathologic states in animals, their anatomy is somewhat dissimilar to that of humans, and they are expensive to maintain (14,15). The use of animals also raises political issues, particularly in the United Kingdom and the United States.

Technology-based simulation, conversely, “constructs a mathematical model to reproduce the characteristics of a phenomenon, system, or process, often using a computer, in order to infer information or solve problems” (16). A systematic review of 109 published studies examined whether medical simulation facilitates learning (17). Although the overall quality of the research was considered weak, the best available evidence did show a benefit for simulation when the following conditions are met: (a) educational feedback is provided, (b) learners are given the opportunity for repetitive practice, (c) tasks range in difficulty, and (d) the exercises based on the simulation are integrated into the curriculum. When criteria of functionality such as these are met, and in particular if the simulation and its content are appropriate, simulators could seemingly provide training and assessment of IR skills. The level of fidelity (accuracy) could then be chosen to suit the training objectives of the curriculum. A low level of fidelity, such as simple geometric environments (task primitives) (18,19), may suffice for training core skills (4), with high levels of fidelity used to train and maintain more complex skills. Training would become learner-centered, performed at the learner’s pace and remotely from patients, with a new opportunity to learn safely from mistakes. Herein, I examine the role of computer-based simulations in IR to train, maintain, and assess technical skills.

TECHNOLOGY

A computer-based simulator model provides an operator with a facility to interact with a range of data sets derived from medical imaging. These virtual worlds or virtual environments (VEs) display variable anatomy and pathology, which are supplemented where necessary by medical illustration (19). VEs have been developed by industry (20–30) and by a number of academic groups (31–38), with some providing a facility to upload patient-specific imaging data.

The operator views the VE by using a two-dimensional screen or a passive three-dimensional screen. Head-mounted systems provide a mono- or stereoscopic effect but are uncomfortable to wear for prolonged periods and are not widely used in practice. Active stereo glasses, which use synchronized polarizing screens to view a two-dimensional screen image, are easier to wear and are almost as effective. Auto-stereoscopic flat screen systems are now also showing promise for stereo viewing without the need for special glasses (39). These technology-based simulations have often been referred to as “virtual reality,” which attempts to replicate the real-life procedure in a VE, or “augmented reality,” which allows computer-generated imagery to be overlaid on the real-world scene. The latter, while usually applied to procedural guidance aids, can also be used with mannequins for training.

The sense of touch (haptics, feel, or force feedback) has been suggested as playing a more limited role in some aspects of laparoscopic procedures with forceps (40). Conversely, touch may be a key sensory element during IR procedures. In simulators, feel is communicated by an interface device between the operator and computer, providing both input (tracking) and output (haptics). The latter uses various devices to provide real-time, continuous sensory feedback in the form of force, with up to 6 degrees of freedom of movement. The interface to the operator’s hand is provided directly by tactile gloves, mechanical linkages, or gimbal mountings (24–27). Rollers can be used to contact catheters and wires, simulating resistance during catheterization (21–23,28,30). The haptic device can be co-located within the VE so that, by using a semitransparent mirror, the hand or instrument appears to touch objects where they are seen. The haptic device delivers the feel that is important for the correct performance of IR procedures.
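The servo-loop character of this force output can be sketched in a few lines: the device reports the instrument tip position, the simulation evaluates a contact model, and a reaction force is commanded back to the motors. The following is a minimal, illustrative 1-D spring-damper contact model of the kind commonly used in haptic rendering; the function names, stiffness, and damping values are invented for illustration and are not taken from any particular simulator.

```python
# Illustrative sketch of a 1-D haptic rendering loop.
# A spring-damper contact model pushes back when the virtual instrument
# tip penetrates a tissue surface located at x = 0 (negative x = inside).
# Stiffness and damping values are arbitrary placeholders.

def contact_force(x, v, k=200.0, c=1.5):
    """Reaction force (N) for tip position x (m) and velocity v (m/s)."""
    if x >= 0.0:            # tip is outside the tissue: no force
        return 0.0
    return -k * x - c * v   # spring resists penetration, damper resists motion


def haptic_step(x_prev, x_now, dt=0.001):
    """One iteration of a (nominally 1 kHz) haptic servo loop."""
    v = (x_now - x_prev) / dt          # finite-difference velocity estimate
    return contact_force(x_now, v)

# Example: tip 2 mm inside the surface, still moving slowly inward
force = haptic_step(x_prev=-0.00199, x_now=-0.002)
```

In practice such loops run at around 1 kHz so the rendered force feels continuous, with far more elaborate tissue models behind them.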
To avoid isolating the acquisition of these technical skills from knowledge and clinical skills, the simulation must be integrated into an appropriate curriculum. Indeed, to provide a more robust training package, relevant cognitive and clinical teaching material is often provided in additional windows within the simulation.

Approximately half of all IR work involves guided needle puncture with use of various imaging modalities to access visceral organ systems. In guided needle puncture, a rigid instrument (a needle) is guided to a target, typically with use of US guidance. Although the skills necessary to perform this procedure are currently largely learned in patients, they can also be learned by training with fixed models (eg, rapid prototyping [11]) or in VEs and hybrid (augmented reality) computer simulations (41). Developments are needed in visceral IR computer-based simulation to achieve more realistic US “images” derived from CT, physiologic motion (respiration, pulsation), and tissue deformation.

Simulation of catheterization brings further challenges. Visual-spatial and tactile skills are used to manipulate long, flexible instruments, with their intrinsic responses predicated on the physics of their structure and of their immediate anatomic and pathologic environment. Various simulations of IR catheterization have now been developed by academic groups (38,42) and a number of commercial manufacturers (21–23,28–30) (Fig 1).

Figure 1. Simbionix ANGIO Mentor cardiovascular computer-based simulator model. (Image courtesy of Simbionix, Lod, Israel.)

In flight simulators, all parameters of each specific aircraft model are known and modelled. Conversely, in medical simulation there is a great range of variability between case scenarios, with no standard human model giving reproducible responses to the same interaction. While performing a real-world procedure, an operator will appreciate a range of tactile sensory inputs (eg, pressure, vibration, temperature) arising from the interactions between the operator’s hands and instruments and the living tissues. Simulation of these interactions requires the anatomic structures


in a VE to be mathematically rendered. Modelling of the nonlinear, elastic, and viscous (viscoelastic) properties of tissues is, however, challenging. A further complicating factor is the currently incomplete characterization of the physical properties of IR instruments. In an effort to improve the feel of a simulated procedure, workers have directly measured tissue properties during deformation, penetration, and cutting in vitro (38,43–47) and in vivo in animals (48,49). Compared with the more limited variability of industrial and aeronautic systems, however, there are differences in the physical properties of tissues from different species, between living and cadaveric tissue, and between old and young patients (46). Studies have therefore also been performed to evaluate forces generated by instruments in living human tissues (50,51). Unobtrusive calibrated sensors, worn under surgical gloves, have been used to measure summated forces generated during vascular and visceral IR needle puncture procedures in patients (50). Data obtained in this way can be used to revise the underlying mathematical algorithms of simulator models, with the aim of improving the accuracy of force feedback in a simulated procedure.

Although it is not known for certain that highly accurate tactile representation of the real-world procedural task is required for learning, it is likely that some understanding of these “haptic cues” is necessary for competence. Studies of learning with and without haptics should provide evidence to more clearly define the role for more accurate feel. The availability of high levels of visual and tactile fidelity might allow trained individuals to maintain their skills in challenging procedures (eg, transjugular intrahepatic portosystemic shunts, embolization for bleeding) despite working in centers with low throughput.
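As an illustration of how such measured forces might feed a simulator’s force model, axial needle-insertion force is often decomposed in the research literature into a pre-puncture elastic phase and a post-puncture phase of shaft friction plus tip cutting. The sketch below is a simplified, hypothetical version of that idea; the puncture depth and every coefficient are invented placeholders, not fitted values from any of the studies cited here.

```python
# Simplified axial needle-force profile vs. insertion depth (illustrative only).
# Before capsule puncture, the surface deforms elastically (nonlinear spring);
# after puncture, force is modelled as sliding friction along the inserted
# shaft plus a constant cutting force at the tip. Coefficients are invented.

PUNCTURE_DEPTH = 0.010   # m: depth at which the capsule gives way (assumed)

def needle_force(depth):
    """Axial resistance (N) at a given insertion depth (m)."""
    if depth <= 0.0:
        return 0.0
    if depth < PUNCTURE_DEPTH:
        # nonlinear elastic deformation of the surface before puncture
        return 300.0 * depth + 40000.0 * depth ** 2
    # after puncture: friction grows with inserted shaft length, plus tip cutting
    friction = 120.0 * (depth - PUNCTURE_DEPTH)
    cutting = 0.9
    return friction + cutting

profile = [needle_force(d / 1000.0) for d in range(0, 21)]  # 0-20 mm in 1 mm steps
```

Note the characteristic drop in force at the moment of puncture, one of the “haptic cues” discussed above; reproducing it convincingly is part of what the in vivo force measurements are for.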
Experts could master new and complex skills and plan and rehearse difficult cases in a VE created from their patient’s own imaging data (mission rehearsal). High fidelity and complexity, however, carry a trade-off of financial and computational expense, and the latter can introduce difficulty when performing in real time. Lower fidelity should therefore be considered first when developing simulations to meet basic or core skills training objectives (4).

A major advantage of computer-based simulation is that it provides a facility to automatically evaluate and record an operator’s performance. The reliability and reproducibility of this assessment methodology might be improved by using higher fidelity to present a more natural and realistic task to the learner, with fewer test artifacts. It is, however, arguable that no tools for assessing technical proficiency (whether based on simulators or not) have yet been validated for use in IR curricula.

PERFORMANCE ASSESSMENT

To provide legitimate evidence for award of certification by a statutory body, an assessment process should follow content that is in keeping with the discipline’s curriculum (52). The methodology used should apply to a range of test scenarios and be unbiased, reproducible, cost-effective, feasible, and objective (53). Objective assessment of skills is of increasing importance for certification in surgery. For example, observer-based checklists and global scoring systems have been evaluated in the assessment of real-world surgical tasks (10,54–59). The use of actors to play the role of standardized patients (60) can help evaluate clinical and communication skills and has also been used to assess invasive procedural skills in surgery, with models attached to the actor (61) to avoid “operating” on the actor posing as the patient.

The in-training assessment of IR skills, however, remains to a great extent a subjective process. Although the widely used logbook method shows procedural experience, it lacks objectivity, fails to account for varying rates of learning, and is susceptible to fluctuating case mix and trainee experience from center to center. This method therefore falls short of reliably indicating proficiency. Among the small amount of work performed on objective assessment of proficiency for IR has been the use of time-action analysis (62), which objectively measures the duration and frequency of the different actions performed during a procedure. Economy of hand motion has been evaluated in surgery (63) and may also be applicable to IR. Computer-based simulation has now been used successfully not only for surgical training but also for assessment, with objective and automated evaluation of specific measures (metrics) of performance (18,64–66).

The computer games industry uses metrics to test performance. If we triumph in a simulated fight against a silver knight astride his horse, are we confident to enter the battlefield for real? Unless the game’s (or simulation’s) design is germane and the metrics are correctly replicated and relevant to the real world, our sword may prove heavier than we had thought, our opponent stronger, and ourselves weaker. We would have been misled by design compromises and imprecise assessments of our performance, perhaps with fatal results. Given appropriate levels of fidelity, however, the use of relevant metrics should permit valid modelling and assessment of tasks while reducing the risk of training inappropriate or incorrect skills (negative training).

In determining what a medical simulator should allow us to do and what it should measure, we therefore need detailed information about what the clinician is actually doing during a real procedure. Although metrics might be distilled by individual speculation, committees, or consensus, for relevance to a particular curriculum they should be derived from observations of real-world tasks in that curriculum. One approach is for psychologists or human factors experts (human factors being the interface between psychology and engineering) to perform this work together with medical experts in the task or procedure in question (subject matter experts) (67–70). Usually, these subject experts will have been transparently selected by the certifying organization itself (52). A literature search is first compiled to determine best practice and any known guidelines.
Then, in an interview, psychologists and subject experts use video-recorded procedures as a prompt to develop a complete procedure description that defines the nature of the required skills and their relationship to each other. The procedure description is then decomposed in further interviews to show, in hierarchical format, the decision-making process, including cues, the ensuing psychomotor actions, and automated steps. This is the cognitive task analysis that documents how the procedure is performed within the curriculum. It can then be further evaluated by subject experts to identify which steps are most crucial and most prone to error. Collectively, these are the metrics that can be used to evaluate the learner’s proficiency.

One way to use these metrics is in task checklists or global scoring systems, which trained observers then apply to assess performance in patients. Metrics can also be incorporated into simulators to automatically assess proficiency, although for this to occur reliably and reproducibly the simulator must actually be capable of using these metrics. Given appropriate functionality and relevant metrics, computer-based simulators will provide automatic, objective assessment of IR skills proficiency in a range of case scenarios, including critical events to determine how the trainee performs in adversity. Feedback will provide the trainee and/or operator with information about their development, progress, and learning needs (71). Requiring the trainee to obtain a satisfactory assessment in computer-based simulators before moving to procedures in patients would provide strong motivation for learning. The logbook will have had its day as an assessment tool. Instead, standardized test procedures will show stepwise, and perhaps more rapid, attainment of required competencies, providing fair, accountable, and blinded evidence for certification and revalidation, provided, that is, that suitable evidence can be found to support the use of simulation for assessing, as well as training, IR proficiency.
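By way of illustration, the checklist-style use of such metrics inside a simulator might look like the following sketch, in which the procedural steps, error types, weights, and penalties are entirely hypothetical examples rather than metrics drawn from any actual IR curriculum or task analysis.

```python
# Illustrative automatic scoring of a simulated procedure against a checklist
# of metrics derived from a task analysis. All event names, error types, and
# weightings below are hypothetical examples.

CHECKLIST = {                      # required procedural steps (order ignored here)
    "ultrasound_survey": 2,
    "local_anaesthetic": 1,
    "needle_aligned_in_plane": 3,
    "vessel_punctured": 3,
}
ERROR_PENALTIES = {                # critical errors the simulator can detect
    "posterior_wall_puncture": 4,
    "needle_tip_lost_from_view": 2,
}

def score_attempt(events):
    """Score a list of (event_name, timestamp) pairs recorded by a simulator."""
    done = {name for name, _ in events}
    earned = sum(w for step, w in CHECKLIST.items() if step in done)
    penalty = sum(ERROR_PENALTIES.get(name, 0) for name, _ in events)
    return max(0, earned - penalty), earned, penalty

log = [("ultrasound_survey", 1.2), ("needle_aligned_in_plane", 8.5),
       ("needle_tip_lost_from_view", 9.1), ("vessel_punctured", 14.0)]
total, earned, penalty = score_attempt(log)   # earned 8 of 9, penalty 2
```

A real simulator would of course also weight step ordering, timing, and motion economy; the point is only that a task analysis yields machine-checkable items.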

VALIDATION

There is a need for some proof of the effectiveness, or validity, of simulated training tasks or test items and of any complete procedures and their assessment methodologies. For training purposes alone, however, it is necessary to validate only that part of the simulation being used for training. A commonly performed and important study of validity is face validity, in which the test should appear to test takers to resemble the real-world task (how much does it look like the real thing?). Content validity is determined by subject experts who verify that, for assessment, the test measures what it is supposed to measure, and, for training, that it accurately replicates the procedure or process it claims to model. Construct validity evaluates whether the simulator assesses factors that are important to the acquisition of the required skills. Concurrent validity for training correlates the new method with a standard of reference; to show concurrent validity for assessment, it must be possible for the simulator to differentiate the performance of experts from that of novices. Predictive validity is proved when performance in the simulation is shown, by subsequent clinical studies, to correlate with future competence in patients. Ultimately, there should be proof that the skills acquired by means of simulator training transfer to procedures performed in patients (transfer of training) and are then maintained over time.

Validation of the laparoscopic trainer MIST VR (18,64) (Mentice, Goteborg, Sweden) has been successful in showing that skills transfer to patient procedures (65). With use of task primitives to train basic laparoscopic skills, a predictive validation study showed a 29% reduction in the time to perform a real laparoscopic procedure, with improved proficiency confirmed by a sixfold reduction of errors in patients (65). There has been similar success in validating simulators for improving operator performance in patients in the fields of colonoscopy (72) and anaesthetics (73). These successes in validation have, however, yet to be reproduced for IR endovascular simulations (74), of which there are now a small number of commercial producers (21–23,28,30). In general, the assessments provided by current endovascular simulators are of overall performance, such as the time taken to successfully perform a procedure, fluoroscopy time, C-arm handling, or volume of contrast media used.
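As a concrete illustration of the expert-versus-novice comparison described above, small-group validity studies often use a nonparametric rank-based statistic. The sketch below hand-rolls a Mann-Whitney U count over invented procedure-time data; it shows only the statistic itself, not the significance test or sample sizes a real validation study would report.

```python
# Illustrative construct/concurrent-validity check: does a candidate metric
# (here, total procedure time in seconds) separate experts from novices?
# Uses a hand-rolled Mann-Whitney U count; the sample data are invented.

def mann_whitney_u(group_a, group_b):
    """U statistic for group_a: number of (a, b) pairs with a < b
    (ties count one half). A large U means group_a's values are lower."""
    u = 0.0
    for a in group_a:
        for b in group_b:
            if a < b:
                u += 1.0
            elif a == b:
                u += 0.5
    return u

experts = [310, 295, 340, 305, 320]   # invented procedure times (s)
novices = [480, 450, 520, 610, 430]

u = mann_whitney_u(experts, novices)
u_max = len(experts) * len(novices)   # 25 possible pairings
# Here u == u_max: every expert was faster than every novice, i.e. complete
# separation, the pattern a metric with construct validity should show.
```

In practice a statistics library would be used and a P value reported, but the logic of the comparison is exactly this pairing of the two groups.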
Although these metrics provide some indication of proficiency, they give little information about the detailed skills or errors relevant to IR curricula. This might reflect their selection by those outside the specialty of IR, or it may simply mean that these are the only metrics that can currently be used by a particular simulation. Metrics that reflect the performance objectives of an IR curriculum would represent more specific measures of proficiency and would be more likely to differentiate between experts and untrained novices. Although this discrepancy may, in part, explain the current lack of successful validation of endovascular simulators, some of these simulators are being evaluated in a number of ongoing predictive validation studies, the outcomes of which are awaited with interest.

In 2002, the area of imaging-guided needle puncture was identified as a focus for simulator model development and validation by academic collaboration in the United Kingdom (32,37) (Fig 2). A bottom-up approach entailed that interventional radiologists, computer scientists, physicists, clinical engineers, and psychologists worked together closely. The academic developers of the simulator used the findings of a hierarchical cognitive task analysis of real procedures (69) to inform simulator design. Subject experts identified metrics from this task analysis with a statistically significant correlation between raters (P < .001), a finding that indicates expert agreement on key points of the procedures (75). The development phase of the simulator has been supported by multiple, continuing feedback sessions with clinicians within the academic environment, and also at large conferences where questionnaires were used to explore face and content validity. Consultation with such a wide range of end users has driven revisions to improve functionality and is regarded as essential to the development of successful simulators. Repeated content validation will continue until the simulator is regarded as ready for construct validation, to objectively determine whether the correct procedural elements are indeed being trained and assessed. This will then be followed by transfer-of-training studies.
Much of the impetus to develop medical simulation has come from concerns about patient safety. At the same time, elements of the medical device industry are working with a range of specialties by using simulation to train (76,77), often with an emphasis on specific manufacturers’ medical devices. Meanwhile, some IR competitor specialties are advocating the use of simulation in the training of procedures such as carotid stent placement (78), whereas the simulation industry itself clearly has a need to sell simulators. In this climate, training organizations should take the lead in setting standards, including validation standards, for medical simulation that is to be used within their curricula. These curricula, in turn, must become structured to fully utilize the foreseen advantages of procedural simulation.

Figure 2. (a) Demonstration of simulated visceral needle puncture using a computer-based simulation. (Image courtesy of Glyn Davies, Bangor.) (continues)

HOW TO ADOPT THE TECHNOLOGY?

Currently, simulators are expensive. In due course, however, as with all technology, the costs of software and hardware should decrease. In addition, there is a trade-off of fidelity against cost that is likely to remain, at least for some time. The realism of a simulation should therefore be tailored to its purpose: if a low-fidelity simulation is sufficient to meet a particular training objective, it can, and indeed should, be used. Costs could also be contained by lease hire arrangements that provide an opportunity to be freed from obsolescence and, at the same time, to perform validation studies and other assessments. Academic collaborations, supported by grant and industrial funding, can provide focused simulator development for training organizations, perhaps with some cost advantages. Web-based solutions have previously been demonstrated for lumbar puncture (33) and ventricular catheterization simulations (79) without the use of haptics. The use of haptics over the Web has now been demonstrated for a force feedback mouse (22,80), and it is conceivable that higher-fidelity simulations might become achievable over the internet. If borne out in practice, this might help realize the dual benefits of lower cost and accessibility to a wide range of archived case scenarios.

Figure 2. (continued) (b) Concept/operator’s viewpoint of simulated visceral needle puncture. A simulated US scan appears in a pop-up window. (Image courtesy of Professor Nigel John, PhD and Franck Vidal, Department of Informatics, University of Wales, Bangor.)

Aside from cost, when considering whether a simulator is suitable for training and assessment in a curriculum, it is important to first understand how, and by whom, the test items and metrics were developed (75,81). Was there adequate input from widely acknowledged medical and human factors experts? How were these experts selected, and by whom? What metrics are actually included, and are they relevant to IR practice? Are these metrics capable of being tested by the simulator? Can the simulator record and measure errors? Does the simulator’s assessment rely on surrogate endpoints, such as the time to perform a procedure, rather than the ability to actually measure key performance objectives? This information should have been fully documented by the manufacturer during development and should be available to prospective end users.

Flight simulation in aviation training is conducted with a focus on safety and lifelong learning. The safety record attests to the value of standards developed and applied over many years; as an example, the Federal Aviation Administration will not certify a simulator model for use outside the Federal Aviation Administration curriculum (82). Brunner et al (83) evaluated learning curves in the MIST VR laparoscopic simulator and concluded that standards defining performance-based endpoints should be established. Such standards, however, exist within the remit of the certifying authorities that develop their curricula, indicating the level of proficiency required and the test items used for assessment (52). It follows that the assessment processes in medical simulators should also fall within the remit of statutory organizations, although standards are only just being proposed (74,75,81).

The Cardiovascular and Interventional Radiological Society of Europe, the Society of Interventional Radiology, and the Radiological Society of North America have recently established individual medical simulation task forces and a joint task force. They have set out recommendations, supported by the British Society of Interventional Radiologists, for the development and use of medical simulation in the training and assessment of IR skills (81). Contemporary simulators are considered suitable for gaining certain aspects of procedural experience, such as learning the correct sequence of procedural steps and selecting appropriate tools; many medical errors result from incorrect procedural sequencing. Although this may be beneficial before procedures are performed in patients, the utility of simulators for acquiring catheter manipulation skills remains unproved and, as yet, experience on a simulator cannot be regarded as equivalent to training in patients.

The ideal objective of training using simulators is to demonstrate the transfer of trained skills to procedures in patients. Face and content validity are, however, of particular importance for integration into curricula, and the use of simulation for assessment requires, at a minimum, face, content, and construct validation. At which point in simulator development should these validation studies be performed? Validation too soon risks being overtaken by rapid technology development; evaluation after unchecked adoption might, however, prove to be validation too late. It would be an auspicious strategy indeed that proposed standards for the development and validation of simulation as part of an international curriculum, ideally available over the Web as open source. With such an approach, simulation for IR training and assessment might well attain an unassailable role within medical curricula in a short space of time. Improved patient safety, quality of care, and efficiency (which would strike a chord with the accountants) should ensue.

SUMMARY

For IR, the apprenticeship training method has become a defective and costly anachronism. Simulation, however, offers an alternative for at least some of the skills currently learned in patients. Although existing catheter-based simulations are suitable for understanding procedural steps and the use of instruments, disparity often exists between their content and the training goals of specific curricula. Close collaboration between human factors experts, clinicians, educators, certifying organizations, and computer scientists would help maintain clinical relevance and should facilitate successful validation. Research in medical simulation is therefore a burgeoning area. Successful validation of simulators and their adoption as a routine part of

systematically designed curricula will bring a step change in the mentored training of IR core skills. After demonstrating satisfactory progression, with perhaps a reduced time to proficiency, the trainee will move on to learn more complex skills by using a range of case scenarios, difficulty, background clinical data, equipment, and team interactions. Continuous and objective assessment will provide feedback and contribute to certification. Because simulation does not require the use of patients, who feel pain and experience complications, the operator will be able to deliberately make errors in safety to understand their avoidance, their consequences, and the bail-out maneuvers required. Experts will learn new skills, maintain and revalidate old ones, and rehearse their more difficult cases in advance. New medical devices may be simulated and undergo trial in a VE before ever being used in an animal or patient.

At some point, trainees will be assessed on the simulator as having attained a standard whereby it is deemed safe to commence mentored training in patients. This, however, raises the question, “How might this be received by patients themselves?” A study of patient preferences has shown that patients would be more likely to allow a medical student to perform a procedure after simulator training than without it, although 21%–55% of patients did not wish a medical student to perform a procedure on them regardless of the student’s level of training (84).

There seems to be a tendency to overestimate the short-term achievements of technology; however, what is accomplished in the longer term can greatly exceed expectations. As more interventions come to use visualization with monitors and computers, VE-based training will increasingly approximate real practice. In concert with this, the leisure pursuits of today’s youth are notable for a preoccupation with computer graphics from the games industry; for them, VE training will be a natural step.

Under the aegis of IR certification and training bodies, a strategy for the adoption and validation of medical simulation is set to facilitate this step forward (85): the virtual revolution will shortly be upon us.

J Vasc Interv Radiol, April 2007; Volume 18, Number 4

Acknowledgments: I am most grateful to Bill Lewandowski, MS (William E. Lewandowski Consulting, WV), Professor Nigel John, PhD, and Franck Vidal (Department of Informatics, University of Wales, Bangor) for their review, expertise, and insightful and invaluable comments about this manuscript.

References
1. Rosendahl K, Aasland OG, Aslaksen A, Nordby A, Brabrand K, Akre V. How is the specialist training in radiology? Tidsskr Nor Laegeforen 2000; 120:2289–2292.
2. The Royal College of Radiologists. Structured training in clinical radiology. 4th ed. Education Board of the Faculty of Clinical Radiology. London, England: The Royal College of Radiologists, 2004.
3. Gallagher AG, Ritter M, Champion H, et al. Virtual reality simulation for the operating room: proficiency-based training as a paradigm shift in surgical skills training. Ann Surg 2005; 241:364–372.
4. Dankelman J, Chmarra MK, Verdaasdonk EGG, Stassen LPS, Grimbergen CA. Fundamental aspects of learning minimally invasive surgical skills: review. Minim Invasive Ther Allied Technol 2005; 14:247–256.
5. IDS working time: European Working Time Directive. Incomes Data Services Limited: http://www.incomesdata.co.uk/information/worktimedirective.htm. Accessed January 17, 2007.
6. Working Time Regulations. Department of Trade and Industry Web site. http://www.dti.gov.uk/employment/employment-legislation/employmentguidance/page14232.html. Accessed May 4, 2006.
7. Bridges M, Diamond DL. The financial impact of training surgical residents in the operating room. Am J Surg 1999; 177:28–32.
8. Crofts TJ, Griffiths JM, Sharma S, Wygrala J, Aitken RJ. Surgical training: an objective assessment of recent changes for a single health board. BMJ 1997; 314:891.
9. Brehmer M, Tolley DA. Validation of a bench model for endoscopic surgery in the upper urinary tract. Eur Urol 2002; 42:175–180.
10. Datta V, Bann S, Beard J, Mandalia M, Darzi A. Comparison of bench test evaluations of surgical skill with live operating performance assessments. J Am Coll Surg 2004; 199:603–606.
11. Limbs and Things medical simulation models. Limbs and Things, Inc: http://www.golimbs.com/products/products.php?sectid=Ultrasound. Accessed January 17, 2007.


12. High quality reproduction of anatomical human vascular systems. Elastrat Web site. http://www.elastrat.com/index.php?option=com_content&task=view&id=56&Itemid=38. Accessed May 4, 2006.
13. Chong CK, How TV, Black RA, Shortland AP, Harris PL. Development of a simulator for endovascular repair of abdominal aortic aneurysms. Ann Biomed Eng 1998; 26:798–802.
14. Lunderquist A, Ivancev K, Wallace S, Enge I, Laerum F, Kolbenstvedt AN. The acquisition of skills in interventional radiology by supervised training on animal models: a three year multicenter experience. Cardiovasc Intervent Radiol 1995; 18:209–211.
15. Dondelinger RF, Ghysels MP, Brisbois D, et al. Relevant radiological anatomy of the pig as a training model in interventional radiology. Eur Radiol 1998; 8:1254–1273.
16. Encarta world English dictionary. Rooney K, ed. Bloomsbury, 1999; 1749.
17. Issenberg SB, McGaghie WC, Petrusa ER, et al. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach 2005; 27:10–28.
18. Seymour NE, Gallagher AG, Roman SA, et al. Virtual reality training improves operating room performance: results of a randomized, double-blinded study. Ann Surg 2002; 236:458–463; discussion 463–464.
19. Vidal FP, Bello F, Brodlie KW, et al. Principles and applications of computer graphics in medicine. Computer Graphics Forum 2006; 25:113–137.
20. VEST SYSTEM: the interactive virtual reality simulator for surgeons. Select-it VEST Systems AG: http://www.select-it.de. Accessed January 17, 2007.
21. ANGIO Mentor. Simbionix Web site. http://www.simbionix.com/ANGIO_Mentor.html. Accessed May 4, 2006.
22. Endovascular AccuTouch Simulator. Immersion Medical Web site. http://www.immersion.com/medical/products/endovascular/index.php. Accessed May 4, 2006.
23. Mentice Procedicus VIST-Radiology: carotid and renal stenting. Mentice Web site. http://www.mentice.com/. Accessed May 4, 2006.
24. PHANTOM devices. SensAble Web site. http://www.sensable.com/products/phantom_ghost/phantom.asp. Accessed May 4, 2006.
25. FakeSpace Advanced Visualisation Solutions Web site. http://www.fakespace.com/index.htm. Accessed May 4, 2006.
26. ReachIn touch enabled solutions. ReachIn Web site. http://www.reachin.se/products/. Accessed May 4, 2006.


27. Multisensory virtual reality solutions. SenseGraphics Web site. http://www.sensegraphics.se/products.html. Accessed May 4, 2006.
28. SimSuite technology. Medical Simulation Corporation Web site. http://www.medsimulation.com/education_partners/industry.asp. Accessed May 4, 2006.
29. Hofer U, Langen T, Nziki J, et al. CathI—catheter instruction system. In: Lemke HU, Vannier MW, Inamura K, Farman AG, Doi K, Reiber JHC, eds. Proceedings of CARS 2002, Computer Assisted Radiology and Surgery, Paris, France. Amsterdam: Elsevier, 2002; 101–106.
30. Xitact Medical Simulation. Xitact Web site. http://www.xitact.com/. Accessed May 4, 2006.
31. John NW. Basis and principles of virtual reality in medical imaging. In: Caramella D, Bartolozzi C, eds. Medical radiology: diagnostic imaging, 3D image processing—technique and clinical applications. Springer-Verlag, 2002; 35–41.
32. Vidal FP, Chalmers N, Gould DA, Healey AE, John NW. Developing a needle guidance virtual environment with patient specific data and force feedback. In: Proceedings of the 19th International Congress of CARS—Computer Assisted Radiology and Surgery, Berlin, Germany, 22–25 June 2005. International Congress Series, Elsevier 2005; 1281:418–423.
33. Moorthy K, Jiwanji M, Shah J, Bello F, Munz Y, Darzi A. Validation of a web-based training tool for lumbar puncture. Stud Health Technol Inform 2003; 94:219–225.
34. Pulmonary artery catheterization simulation. Manbit PL: http://www.manbit.com/PAC.htm. Accessed January 17, 2007.
35. Web3D Consortium, open standards for real-time 3D communication: medical real-time visualisation and communication using X3D. Web3D Consortium: http://www.web3d.org/applications/medical/. Accessed January 17, 2007.
36. HORUS: haptic operative realistic ultrasonography simulator. Institut de Recherche contre les Cancers de l'Appareil Digestif (IRCAD): http://www.ircad.fr/virtual_reality/horus.php?lng=en. Accessed January 17, 2007.
37. CRaIVE, collaboration in radiological interventional virtual environments. CRaIVE Web site: www.craive.org.uk. Accessed January 17, 2007.
38. CIMIT endovascular simulator. CIMIT simulation program: http://www.medicalsim.org/eve.htm. Accessed January 17, 2007.
39. Real D Scientific Web site. http://www.reald.com/scientific/. Accessed May 4, 2006.




40. Heijnsdijk EA, Pasdeloup A, van der Pijl AJ, Dankelman J, Gouma DJ. The influence of force feedback and visual feedback in grasping tissue laparoscopically. Surg Endosc 2004; 18:980–985.
41. Zhu Y, Magee D, Ratnalingam R, Kessel D. A virtual ultrasound imaging system for the simulation of ultrasound-guided needle insertion procedures. In: Proceedings of Medical Image Understanding and Analysis (MIUA). Surrey: BMVA Press, 2006; 1:61–65.
42. Dawson SL, Cotin S, Meglan D, Shaffer DW, Ferrell MA. Designing a computer-based simulator for interventional cardiology training. Catheter Cardiovasc Interv 2000; 51:522–527.
43. DiMaio SP, Salcudean SE. Interactive simulation of needle insertion models. IEEE Trans Biomed Eng 2005; 52:1167–1179.
44. Alterovitz R, Pouliot J, Taschereau R, et al. Simulating needle insertion and radioactive seed implantation for prostate brachytherapy. Stud Health Technol Inform 2003; 94:19–25.
45. Kataoka H, Toshikatsu W, Kiyoyuki C, et al. Measurement of the tip and friction force acting on a needle during penetration. In: Dohi T, Kikinis R, eds. Proceedings of Medical Image Computing and Computer-Assisted Intervention–MICCAI 2002. Lecture Notes in Computer Science (vol 2488). New York: Springer-Verlag, 2002; 216–223.
46. O'Leary MD, Simone C, Washio T, et al. Robotic needle insertion: effects of friction and needle geometry. In: Proceedings of the 2003 IEEE International Conference on Robotics and Automation. Los Alamitos: IEEE Robotics and Automation Society, 2003; 1774–1780.
47. Chanthasopeephan T, Desai J, Lau ACW, et al. Study of soft tissue cutting forces and cutting speeds. Stud Health Technol Inform 2004; 98:56–62.
48. Ottensmeyer MP. TeMPeST 1-D: an instrument for measuring solid organ soft tissue properties—experimental techniques. Experimental Techniques 2002; 26:48–50.
49. Brouwer I, Ustin J, Bentley L, et al. Measuring in vivo animal soft tissue properties for haptic modeling in surgical simulation. Stud Health Technol Inform 2001; 81:69–74.
50. Healey AE, Evans JC, Murphy MG, et al. In vivo force during arterial interventional radiology needle puncture procedures. Stud Health Technol Inform 2005; 111:178–184.
51. Chami G, Ward JW, Wills DPM, Phillips R, Sherman KP. Smart tool for force measurements during knee arthroscopy: in vivo human study. Stud Health Technol Inform 2006; 119:85–89.
52. Dauphinee WD. Licensure and certification. In: Norman GR, Van der Vleuten CPM, Newble DI, eds. International handbook of research in medical education, vol 2. Dordrecht, the Netherlands: Kluwer Academic Publishers, 2002; 836.
53. Shumway JM, Harden RM. AMEE Guide No. 25: the assessment of learning outcomes for the competent and reflective physician. Med Teach 2003; 25:569–584.
54. Faulkner H, Regehr G, Martin J, Reznick R. Validation of an objective structured assessment of technical skill for surgical residents. Acad Med 1996; 71:1363–1365.
55. Reznick R, Regehr G, MacRae H, Martin J, McCulloch W. Testing technical skill via an innovative "bench station" examination. Am J Surg 1997; 173:226–230.
56. Cushieri A, Francis N, Crosby J, Hanna GB. What do master surgeons think of surgical competence and revalidation? Am J Surg 2001; 182:110–116.
57. European Association of Endoscopic Surgeons. Training and assessment of competence. Surg Endosc 1994; 8:721–722.
58. Faulkner H, Regehr G, Martin J. Validation of an objective structured assessment of technical skill for surgical residents. Acad Med 1996; 71:1363–1365.
59. Moorthy K, Munz Y, Sarker SK, Darzi A. Objective assessment of technical skills in surgery. BMJ 2003; 327:1032–1037.
60. Battles JB, Wilkinson SL, Lee SJ. Using standardized patients in an objective structured clinical examination as a patient safety tool. Qual Saf Health Care 2004; 13(suppl 1):i46–i50.
61. Kneebone R, Kidd J, Nestel D, et al. Blurring the boundaries: scenario-based simulation in a clinical setting. Med Educ 2005; 39:580–587.
62. Bakker NH, Tanase D, Reekers JA, et al. Evaluation of vascular and interventional procedures with time-action analysis: a pilot study. J Vasc Interv Radiol 2002; 13:483–488.
63. Taffinder N, Smith S, Jansen J, Ardehali B, Russell R, Darzi A. Objective measurement of surgical dexterity: validation of the Imperial College Surgical Assessment Device (ICSAD). Minim Invasive Ther Allied Technol 1998; 7(suppl 1):11.

64. Taffinder N, McManus I, Jansen J, et al. An objective assessment of surgeons' psychomotor skills: validation of the MIST-VR laparoscopic simulator. Br J Surg 1998; 85(suppl 1):75.
65. Gallagher AG, Cates CU. Virtual reality training for the operating room and cardiac catheterisation laboratory. Lancet 2004; 364:1538–1540.
66. Sherman KP, Ward JW, Wills DPM, Mohsen AMMA. Surgical trainee assessment using a VE knee arthroscopy training system (VE-KATS): experimental results. Stud Health Technol Inform 2001; 81:465–470.
67. Clark RE, Estes F. Cognitive task analysis for training. Int J Educ Res 1996; 25:403–417.
68. Grunwald T, Clark D, Fisher SS, et al. Using cognitive task analysis to facilitate collaboration in development of simulators to accelerate surgical training. Stud Health Technol Inform 2004; 98:114–120.
69. Johnson SJ, Healey AE, Evans JC, et al. Physical and cognitive task analysis in interventional radiology. Clin Radiol 2006; 61:97–103.
70. Lewandowski W. Performing a task analysis: the critical step in creating a simulation that improves human performance. In: Syllabus: Medicine Meets Virtual Reality 12: building a better you—the next tools for medical education, diagnosis, and care. San Luis Obispo: Aligned Management Associates, 2004; 90–91.
71. Southgate L, Grant J. Principles for an assessment system for postgraduate medical training: a working paper for the Postgraduate Medical Education Training Board. PMETB: http://www.pmetb.org.uk/media/pdf/4/9/PMETB_principles_for_an_assessment_system_for_postgraduate_medical_training_(September_2004).pdf. Accessed January 18, 2007.
72. Sedlack R, Kolars J. Computer simulator training enhances the competency of gastroenterology fellows at colonoscopy: results of a pilot study. Am J Gastroenterol 2004; 99:33–37.
73. Rowe R, Cohen R. An evaluation of a virtual reality airway simulator. Anesth Analg 2002; 95:62–66.
74. Gould DA, Kessel DO, Healey AE, Johnson JJ, Lewandowski WE. Simulators in catheter based interventional radiology: training or computer games? Clin Radiol 2006; 61:556–561.
75. Gould DA, Healey AE, Johnson SJ, Lewandowski WE, Kessel DO. Metrics for an interventional radiology curriculum: a case for standardization? Stud Health Technol Inform 2006; 119:159–164.
76. SimSuite centers. Medical Simulation Corporation Web site. http://www.medsimulation.com/education_system/centers.asp. Accessed May 4, 2006.
77. Interventional Cardiology Practice Enhancement Programs. Boston Scientific: http://www.bostonscientific.com/common_templates/listPages.jsp?task=tskListPages.jsp&sectionId=4&relId=2,107,108. Accessed January 17, 2007.
78. Rosenfield K, Cowley MJ, Jaff MR, et al. SCAI/SVMB/SVS clinical competence statement on carotid stenting: training and credentialing for carotid stenting—multispecialty consensus recommendations, a report of the SCAI/SVMB/SVS writing committee to develop a clinical competence statement on carotid interventions. Catheter Cardiovasc Interv 2005; 64:1–11.
79. Phillips N, John NW. Web-based surgical simulation for ventricular catheterization. Neurosurgery 2000; 46:933–937.
80. Yu W, Reid D, Brewster S. Web-based multimodality graphs for visually impaired people. University of Glasgow, Department of Computer Science Web site. www.dcs.gla.ac.uk/~rayu/Publications/Yu_CWUAAT.pdf. Accessed April 24, 2006.
81. Gould DA, Reekers JA, Kessel DO, et al. Simulation devices in interventional radiology: validation pending. J Vasc Interv Radiol 2006; 17:215–216.
82. Aviation Safety Research. Federal Aviation Administration Web site. http://www.faa.gov/safety/programs_initiatives/aircraft_aviation/nsp/flight_training/qualification_process/media/GAO_PCATD.txt. Accessed May 2006.
83. Brunner WC, Korndorffer JR Jr, Sierra R, et al. Laparoscopic virtual reality training: are 30 repetitions enough? J Surg Res 2004; 122:150–156.
84. Graber MA, Wyatt C, Kasparek L, Xu Y. Does simulator training for medical students change patient opinions and attitudes toward medical student procedures in the emergency department? Acad Emerg Med 2005; 12:635–639.
85. Becker GJ, Connors B, Cardella J, et al. Joint International Task Force Simulation Strategy. CIRSE Web site. http://www.cirse.org/_files/contentmanagement/CIRSE_SIR_Joint_Strategy.pdf. Accessed July 2006.