Simulation for quality assurance in training, credentialing and maintenance of certification


Best Practice & Research Clinical Anaesthesiology 26 (2012) 3–15


Best Practice & Research Clinical Anaesthesiology journal homepage: www.elsevier.com/locate/bean


Randolph Herbert Steadman, MD, Professor and Vice Chair a,*
Yue Ming Huang, EdD, MHS, Assistant Adjunct Professor b,c

a Department of Anesthesiology, David Geffen School of Medicine at UCLA, 757 Westwood Plaza, Suite 2331E, Los Angeles, CA 90095-7403, USA
b Department of Anesthesiology, David Geffen School of Medicine at UCLA, UCLA Simulation Center, 700 Westwood Plaza, Suite A222 LRC, Los Angeles, CA 90095-7381, USA

Keywords: anaesthesiology/education; anaesthesia/methods; anaesthesia/standards; certification; clinical competence; competency-based education; computer-assisted instruction; education, medical, graduate/methods; feedback; humans; mannequins; outcome assessment (health-care); patient simulation; quality assurance; speciality boards/standards; teaching/methods

Simulation has become ubiquitous in medical education over the last decade. However, while many health-care professions and disciplines have embraced the use of simulation for training, its use for high-stakes testing and credentialing is less well established. This chapter explores the incorporation of simulation into training requirements and board certification, and its role for quality assurance of educational programmes and professional competence. Educational theories that underlie the use of simulation are described. The driving forces that support the simulation movement are outlined. Accreditation bodies have mandated simulation in training and maintenance of certification. It may be only a matter of time before simulation becomes one of the standards for performance assessment. © 2012 Elsevier Ltd. All rights reserved.

* Corresponding author. Tel.: +1 310 267 8693; Fax: +1 310 825 0037. E-mail addresses: [email protected] (R.H. Steadman), [email protected] (Y.M. Huang). c Tel.: +1 310 267 2114; Fax: +1 310 825 0037. 1521-6896/$ – see front matter © 2012 Elsevier Ltd. All rights reserved. doi:10.1016/j.bpa.2012.01.002


R.H. Steadman, Y.M. Huang / Best Practice & Research Clinical Anaesthesiology 26 (2012) 3–15

Historical and theoretical perspectives

Simulation's long history in anaesthesiology training

Simulation-based training is not new. For many years, anaesthesiologists have developed and used models for practice and training. From the inanimate models of the 1950s to the computerised mannequins of the 1980s, anaesthesiologists have been pioneers in the advent of simulation. Table 1 highlights anaesthesiologists' creative contributions to simulation. Comprehensive reviews on the history of simulation development and usage are available.1–3

Simulation exists in many formats and devices. Broadly, simulation has been defined as a "technique, not a technology, that replaces or amplifies real experiences with guided experiences that … replicate substantial aspects of the real world in a fully interactive manner."4 In the last few years, multiple external forces combined with exponential growth in interest in simulation have caused a substantial shift in the educational paradigm for health-care training and assessment.

Adult learning principles applied to simulation-based education

Understanding how people learn is critical in designing, implementing and evaluating educational programmes. Adult learning principles based on psychology and neuroscience research inform us that learning is largely shaped by experience. Adult learners are self-directed, goal-oriented individuals who need to know that what they are learning is timely, relevant and practical. Adults learn through a cycle of direct experiences, reflection, conceptualisation and experimentation (feeling, reflecting, thinking and doing).5 Expertise is developed through deep understanding and deliberate practice with immediate and constructive feedback.6 Learning is optimised when the right conditions provide a suitable milieu for experimentation, practice and mastery of skills.
Ideally, the ultimate goal of health-care education is not skill acquisition and professional competency for the sake of certification and licensure, but attainment of a higher level of proficiency to predict outcomes, explain ideas, learn from mistakes, build on prior knowledge/skills and reflect to promote safe, effective patient care. This advanced thinking is referred to as metacognition (awareness or analysis of one's learning or thinking processes) and is the basis for developing life-long skills as a mindful and reflective practitioner. Environments that foster skills mastery and metacognition are learner-centred (build on strengths, interests and needs of learners), knowledge-centred (build on metacognition and monitoring what you do not know) and assessment-centred (build on revisions through adequate feedback, with assessments directly related to learning goals).7

As a training technique, simulation fits well within the concepts of pedagogy (the science of teaching; instructional methods), andragogy (instructional techniques for adult learners) and instructional technology design.8 Drawing from Kolb's adult experiential learning cycle, simulation provides a controlled environment for 1) direct and immersive experiences that activate emotions to enhance learning and memory, 2) immediate facilitator-guided feedback that fosters reflection on the experience, 3) discussions of concepts without the stress of patient care, followed by further study, and 4) repeated practice and experimentation (Fig. 1).

Table 1
Anaesthesiologists prominent in the history of simulation.

Year | Name | Contribution
1960s | Safar, Lind | Resusci-Anne, first CPR mannequin
1960s | Denson | Sim One, first computerized mannequin
1980s | Philip, Sikorski, Smith, Schwid | Various screen-based simulations using physiology and pharmacology models
1987 | Gaba | Comprehensive Anesthesia Simulation Environment (CASE), a forerunner of modern computerized mannequins
1988 | Good, Gravenstein | Gainesville Anesthesia Simulator, a forerunner of modern computerized mannequins
1990s | Schaefer, Gonzales | SimMan, a second-generation computerized mannequin with updated airway anatomy

Table 1 highlights the creative contributions by anaesthesiologists to medical education. Adapted from Cooper 2008.2


Fig. 1. Simulation within the experiential learning cycle. Fig. 1 illustrates the activities within a simulation training programme that correspond to the adult learning cycle. Adapted from Kolb.5

Simulation has the potential to integrate all three domains of learning: cognitive (knowledge), psychomotor (skills) and affective (attitudes). Anaesthesiology and other specialities have taken advantage of the wide spectrum of simulation tools (screen-based, task trainers, virtual reality, standardised patients (SPs) and full-scale mannequins) for procedural skills training, diagnostic reasoning and decision making, teamwork in crisis management, communication and interprofessional skills, professionalism and systems and processes testing.1,9–11 With broad applications and high participant satisfaction, simulation has now been widely accepted as a useful educational strategy. For simulation-based education to be effective, it should adhere to a number of best practice features as delineated by Issenberg and McGaghie (Table 2).12,13 In accordance with learning principles, the most important features for effective simulation are feedback and deliberate practice.

Table 2
Best practice features of simulation for effective learning.

Item | Feature | Reasons to integrate feature
1 | Feedback | Promotes reflection for learning, reinforces desired behaviours, corrects mistakes
2 | Deliberate practice | Opportunities for repetition and practice to gain expertise
3 | Curriculum integration | Supplements rather than distracts from learning goals when integrated with other activities
4 | Outcome measurement | Clearly defined learning outcomes are achievable and measurable by reliable instruments
5 | Simulator validity and fidelity | Educational objectives should dictate choice of teaching tools
6 | Skill acquisition and maintenance | Building on activities with various difficulty levels will promote skill development and increase retention
7 | Mastery learning | Individual learning curves vary, so attention to the process of mastery learning is important and simulation should be adaptable to various learning styles
8 | Transfer to practice | Learning should be generalizable and transferable to different scenarios, and from the laboratory to clinical care
9 | Team training | Teamwork skills promote patient safety
10 | High-stakes testing | Adds further accountability, particularly as we move toward competency-based education
11 | Instructor training | Facilitator should be skilled in debriefing and simulation techniques
12 | Educational and professional context | Learning needs to be authentic, relevant and practical

Table 2 summarizes the key features that make simulation an effective instructional strategy. Adapted from Issenberg and McGaghie.12,13

Simulation's role in assuring the quality of training programmes

In training, "the goal of simulation is to replicate patient care scenarios in a realistic environment for the purposes of feedback and assessment."14 Feedback to the participating learners is one aspect of quality assurance for any simulation activity. Feedback from participants is equally important for programme improvement. Simulation can be used to assure quality in multiple areas.

Consistent and reproducible education

The public push for improved patient safety measures has made the clinical setting a limited learning environment. Patient care has always superseded learning, but now, with increasing patient acuity and a reduction in duty hours, there are fewer patient encounters for teaching.15 Moreover, the ethical implications of practising on real patients prevent true experimentation in the learning process. The advantage of simulation is that learning can occur in a controlled environment without any risk to actual patients, with each trainee receiving the same opportunities to experience and manage challenging cases that they might not otherwise encounter if left to random clinical exposure. New recommendations and requirements in nursing, undergraduate and graduate medical education ensure that all trainees receive a standardised education through simulation.

Simulation for nursing education

The National Council of State Boards of Nursing has acknowledged the usefulness of simulation for pre-licensure nursing students, and encourages the use of various forms of simulation for training.
However, it explicitly states that simulation cannot take the place of actual patients, leaving state boards to determine the extent to which simulation augments learning in the clinical environment. The California Board permits at most 25% of clinical encounters to be simulated patients and requires that at least 75% of clinical hours be in direct patient care.16,17

Simulation for undergraduate medical education

A survey by the Association of American Medical Colleges showed that a majority of medical schools use simulation throughout the 4-year medical school curriculum, with 52% of medical schools and 39% of teaching hospitals offering all types of simulation (SPs, full-scale mannequins, part-task trainers and screen-based virtual reality simulation). About 95% use SPs or full-scale mannequins. Within the medical school curriculum, simulation was commonly used for internal medicine, paediatric emergency medicine and anaesthesiology in clerkship training. Simulation was used mostly for training and assessing core competencies and less often as a tool for quality improvement and research. A sample simulation curriculum for medical students has been described.18,19

Simulation for graduate medical education

In recent years, several residency review committees have added a simulation requirement to their speciality programme requirements. These include anaesthesiology, internal medicine, surgery and surgical critical care. Others, such as emergency medicine, count simulated encounters the same as patient care for the purposes of their experience quotas. Table 3 shows the programme requirement language for the various specialities.20 Anaesthesiology has long used simulation for training; now that it is a required metric for accreditation, every anaesthesia resident can expect to learn through immersive experiences regardless of where they train.
Quality of instruction

Simulation has the potential not only to measure student learning but also to reflect the instructor's ability to teach. To maintain a high-quality simulation programme, many simulation centres offer and require faculty to attend instructor courses before they can teach with simulation.21 As feedback is at the forefront of learning, most simulation instructor courses focus on developing debriefing skills. Best practices for debriefing and features of effective instruction have been reported.13,22–24 Instructor certification is being explored, but standards for simulation educators are not fully established. Establishing resident-as-teacher programmes using simulation can prepare our next generation of medical educators.

Table 3
ACGME program requirements by specialty (Fall 2011).

Specialty | Program requirements
Anaesthesiology | Residents must participate in at least one simulated clinical experience each year (effective July 2011)
Internal Medicine | The sponsoring institution and participating sites must provide residents with access to training using simulation (effective July 2009)
Surgery | Resources must include simulation and skills laboratories. These facilities must address acquisition and maintenance of skills with a competency-based method of evaluation. (effective July 2012)
Surgical Critical Care | Resources should include a simulation and skills laboratory (effective July 2012)
Emergency Medicine | Encounter guideline numbers include both patient care and laboratory simulations
Family Medicine | None specified
Neurology | None specified
Pediatrics | None specified
Obstetrics & Gynecology | None specified

Table 3 shows the language from various specialty training programs regarding simulation training. Adapted from ACGME website.61

Adoption of a new educational paradigm

Workforce constraints are now shifting the educational structure from a time-based process to an outcomes-based education.15,25 The implication is that specific milestones will need to be met and competency demonstrated before a trainee can perform a procedure on a real patient or advance to the next level. This places more emphasis on simulation for assessment.

Testing and validation of assessment instruments

To truly measure learning outcomes, simulation programmes should target the highest level of evaluation (Level 4 in Table 4).26 Validated assessment tools are important to gauge learning and to measure the quality of training. Multiple-choice tests and oral board examinations test cognitive skills. Simulation, on the other hand, embeds all domains of learning in an applied setting, requiring practitioners to demonstrate competence.

Table 4
Kirkpatrick's levels of evaluation.

Level | Category | Description
1 | Reaction | Measures satisfaction of participation and reactions to: organization, presentation, content, teaching methods, materials, quality of instruction
2 | Learning | Measures learning through: change in attitudes/perceptions, acquisition of knowledge, acquisition of skills
3 | Behaviour | Measures behavioural change: transfer of learning to workplace, application of knowledge and skills
4 | Result | Measures results: changes in organizational practice, direct benefits to patients

Table 4 outlines the four categories for outcome evaluation. Adapted from Kirkpatrick.26

Simulation as a quality assurance tool has been shown to be effective in the clinical setting. SPs have been validated as assessment tools for use in Objective Structured Clinical Examinations (OSCEs) and have even been used as secret patients for quality assurance measures of physician competency and skills.27 SPs are now the gold standard by which interviewing and history-taking skills are measured. In the last 37 years, SPs have been used around the world as a valid and reliable evaluation tool to assess the performance and competency of health professionals.28 Extensive work goes into quality control of case development, SP and staff training and logistical considerations.29 Other forms of simulation have been studied for their effectiveness in evaluating learning and competency, including virtual reality and clinical vignettes.30,31

Transferability of learning to patient outcomes

The theoretical underpinnings of simulation should translate into improved learner outcomes; however, this has only been shown in isolated settings.
Procedural training tends to be more readily transferable, as demonstrated by one study in which the rates of mechanical complications for central line placement decreased significantly after residents trained with simulation.32 In another study, simulation-trained anaesthesiology residents weaned patients from cardiopulmonary bypass better than residents taught with traditional didactic methods.33 High-fidelity simulations have also been shown to improve the quality of care during cardiac arrests and during the management of shoulder dystocia.34,35 A meta-analysis of simulation studies showed that, when simulation training was compared with no intervention, there were large effects for outcomes of knowledge, skills and behaviours, and moderate effects for patient-related outcomes.36 The public demands that physicians maintain their skills, but traditional continuing medical education (CME) activities are not frequently associated with a change in practice.37 Transfer of learning to clinical environments has rarely been shown for other educational modalities or CME programmes.38

Simulation as a tool for competency assessment

Advanced cardiac life support, basic life support and fundamentals of laparoscopic surgery

One of the earliest uses of simulation for quality assurance was Basic Life Support (BLS) and Advanced Cardiac Life Support (ACLS) certification. ACLS guidelines were established in 1974, and certification became a requirement for health providers working in acute care settings. Recertification occurs every 2 years, but a recent study showed that this may not be sufficient, as knowledge decay occurs after a year.39 Simulation has been shown to increase retention of skills that may not be frequently exercised, but not all ACLS certification programmes use high-fidelity simulations.40 Another example of certification in procedural skills is the Fundamentals of Laparoscopic Surgery (FLS), which surgeons need to complete before they are eligible for board certification.41,42 This is described further in the section 'Simulation for board certification'.

ECFMG and United States Medical Licensing Examination step 2 clinical skills assessment

In 1998, the Educational Commission for Foreign Medical Graduates (ECFMG) required international medical graduates to pass a clinical skills assessment (CSA) before beginning a United States residency. Interns who passed the CSA were less likely to be deficient in interpersonal skills compared with those who had not taken the test.43 Six years and 400 000 simulated patient encounters later, SPs have proved to be valid and reliable tests of physicians' interpersonal skills.44 In 2005, the United States Medical Licensing Examination (USMLE) added a 1-day CSA using 12 SPs. During each 15-min encounter the student establishes rapport, elicits a history, performs a physical examination, documents findings, formulates a plan and communicates their findings. In a study comparing various student assessment


measures (grade point average, USMLE Step 1, USMLE Step 2 and USMLE Step 2 CS) with measures of performance during internship (residency directors' quartile ranking and average competency score in the six core Accreditation Council for Graduate Medical Education (ACGME) competencies), the grade point average and Step 2 CS scores were the best predictors of interns' performance. However, these two variables explained only 30% of the variation in performance, indicating the difficulty of predicting future performance.

Simulation for board certification and maintenance of certification

Simulation for high-stakes assessment has been developing over the last decade in many different countries as well as in the United States.45 The key efforts made in various specialities follow.

Anaesthesiology

Effective Management of Anaesthetic Crises (EMAC)

In 2002, a joint initiative between the Australian and New Zealand College of Anaesthetists (ANZCA) and Australasian simulation centres spawned a 2.5-day course designed for specialists and trainees.46 The course, titled Effective Management of Anaesthetic Crises (EMAC), is a required component of fellowship training. Specialists are incentivised to attend through Maintenance of Professional Standards (MOPS) points. The course consists of five modules (Table 5). The course was well received; over 90% of the nearly 500 participants felt that the topics were relevant. The majority agreed or strongly agreed that they would change their practice as a result of participation (71–88% across the various modules).

The Israeli National Board Examination in Anesthesiology

Beginning in 2003, the Israeli Board of Anesthesiology Examination Committee, in conjunction with the Israel Center for Medical Simulation and the Israel National Institute for Testing and Evaluation, incorporated an objective structured clinical evaluation (OSCE) into the Israeli board examination in anaesthesiology.
The simulations consisted of five 15-min stations that evaluated trauma management, resuscitation, operating room crisis management, mechanical ventilation and regional anaesthesia (Table 6). Scoring was done using a checklist comprising 12–20 technical tasks. Non-technical (behavioural) skills were not assessed. A score of 70% on the checklist was required for passing; performance of critical items was mandatory. Each of the two examiners also globally assessed the examinees' decision making, situational awareness and manual skills using a four-point scale. If the lowest score (designated 'insufficient') was awarded by both examiners, the examinee failed the examination regardless of the checklist score. The inter-rater correlations were high (0.75–0.8 for the total, critical and global scores), but inter-station correlations were considerably lower. The authors concluded that generalisability was limited, and that adding stations was likely to improve this shortcoming, albeit at an added cost. Examinees rated the simulations as being of reasonable difficulty and preferred them to an oral examination.47

The American Society of Anesthesiologists

In 2004, the American Society of Anesthesiologists (ASA) appointed a Workgroup on Simulation Education to organise a national network of simulation-based offerings for ASA members.

Table 5
The components of the ANZCA Effective Management of Anaesthetic Crises (EMAC) Course.

Module | Description
Human Performance Issues | Teamwork, crisis resource management
Airway Emergencies | Noninvasive and surgical airway management
Anaesthetic Emergencies | Critical interface with equipment and drugs
Cardiovascular Emergencies | Arrhythmias, pacing, coronary syndromes
Trauma Management | C-spine, chest and head injury management

Table 5 lists the skills assessed in the EMAC course. Adapted from Weller 2006.46


Table 6
The components of the Israeli National Board Examination in anesthesiology.

Task | Description
Trauma management | ATLS skills in an Emergency Department patient
Resuscitation | ACLS skills
OR crisis | Called in to manage an intra-operative event
Mechanical ventilation | Adjust a ventilator/artificial lung based on blood gas results
Regional anaesthesia | Demonstrate surface anatomy, volume of local anaesthetic and management of nerve block complications

Table 6 lists the five skills for which simulation is used to assess competency for credentialing in Israel. Adapted from Berkenstadt 2006.47
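The station-level pass/fail rule described for the Israeli examination (a 70% checklist cut-off, mandatory critical items, and automatic failure when both examiners award the lowest global score) can be sketched as follows. The function name and argument layout are illustrative assumptions; only the thresholds and the decision logic come from the text:

```python
def station_pass(checklist_done, checklist_total,
                 critical_items_done, examiner_global_scores):
    """Sketch of the scoring rule described in the text.

    checklist_done/checklist_total: technical checklist items (12-20 per station)
    critical_items_done: True only if every mandatory critical item was performed
    examiner_global_scores: two scores on a 1-4 scale (1 = 'insufficient')
    """
    # A score of 70% on the checklist was required for passing
    checklist_ok = checklist_done / checklist_total >= 0.70
    # If both examiners award the lowest global score, the examinee
    # fails regardless of the checklist score
    both_insufficient = all(score == 1 for score in examiner_global_scores)
    # Performance of critical items was mandatory
    return checklist_ok and critical_items_done and not both_insufficient
```

For example, 15 of 20 checklist items (75%) with all critical items done and global scores of 3 and 2 passes, while a perfect checklist with global scores of 1 and 1 fails.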

The group's primary work product was its 2006 White Paper, 'ASA Approval of Anesthesiology Simulation Programs'. Concurrently, the workgroup conducted an online survey, which revealed that the majority (81%) of the 1350 ASA member respondents had an interest in simulation-based CME, with a similar percentage (77%) indicating that they felt simulation-based CME offered benefits compared with traditional, lecture-based CME. ASA members identified the important features of simulation-based training: a realistic mannequin (77%), a high instructor-to-student ratio (76%) and a realistic simulation of the environment (69%). Videotaping of performance (51%) and multidisciplinary training (50%) were less frequently identified as important elements of simulation-based CME. Surprisingly, 71% sought an assessment of their performance. Cost and inconvenience were identified as obstacles to simulation-based training. Half of the survey respondents were not interested in paying course fees projected to considerably exceed standard CME fees (fees that support the high instructor-to-student ratio used in simulation). However, when incentives such as credit towards Maintenance of Certification in Anesthesiology (MOCA®) were hypothesised, only 16% of respondents objected to such fees. Based upon the interest identified in the survey, the workgroup recommended that the ASA transition the workgroup to a standing committee, which was done in the fall of 2006. Shortly thereafter, the newly formed ASA Committee on Simulation Education launched an online application for simulation programmes seeking ASA endorsement. The key programme attributes, previously identified in the 2006 White Paper, formed the evaluation elements of the application (Table 7).48 Since 2007, over 40 simulation programmes have submitted applications, and 27 programmes have been endorsed.

Table 7
Program attributes evaluated by the American Society of Anesthesiologists (ASA).

Application heading | Attribute evaluated
Overview | Unique programmatic features
Educational offerings | Course experience, target audience(s)
Scenario | Ability to script a scenario relevant to ASA members
Instructors | Quantity of instructors and their simulation experience
Facility/equipment | Ability to host proposed course(s) for ASA members
Leadership | Director's experience, organizational support
Customer perspective | Contact info, CME, confidentiality and videotaping policies, evaluation data from existing courses

Table 7 lists the application components required to become an ASA-endorsed simulation program. Adapted from the ASA website.48

The American Board of Anesthesiology

In January 2010, the American Board of Anesthesiology (ABA) made simulation-based training a mandatory portion of MOCA. A unique feature of the simulation component is that it is embedded into Part 4 of the Maintenance of Certification (MOC). Part 4, as designated by the American Board of Medical Specialties, entails the practice performance assessment and improvement (PPAI) portion of MOC for all speciality boards. The simulation experience serves as a stimulus for reflection on one's practice. After the course, the participant submits a list of at least three practice improvement items to the ASA. In order to receive MOCA credit, the participant must confirm that they have addressed their practice improvement plan within 90 days of the course. A little over a year after the implementation of the simulation requirement for MOCA, approximately 500 ABA diplomates had participated in simulation courses. The overwhelming majority strongly agreed that the activity was useful, and 94% indicated that they agreed or strongly agreed that they would change their practice as a result of the simulation experience.

Family medicine

The American Board of Family Medicine (ABFM) includes two self-assessment modules (SAMs) in Part II of its MOC. These modules consist of a knowledge assessment in a chosen domain, followed by clinical simulations. The simulations are screen-based patient care scenarios based upon the topic chosen for the knowledge assessment. The scenarios include "responses to therapeutic interventions, investigations, and passage of time."49 In 2005, after the first year of SAM modules, over 7000 diplomates of the ABFM had completed the knowledge assessment and clinical simulation modules. Two-thirds of the diplomates chose the diabetes module and the remainder chose the hypertension module. The knowledge assessment modules took an average of 10–11 h to complete; the simulation module took roughly 3 h. The majority of participants indicated that the experience would lead to changes in their practice (55% and 54% for the hypertension and diabetes modules, respectively).50 As of 2006, the ABFM has created additional clinical simulations for asthma, coronary artery disease, major depression and heart failure. The simulations were highly rated, yet some problems were encountered owing to difficulties navigating the user interface, slow system operation and browser incompatibilities with the web-based simulations.
Surgery The American Board of Surgery requires “documentation of successful completion of ACLS, ATLS and Fundamentals of Laparoscopic Surgery (FLS).” FLS is a comprehensive web-based education module that includes a hands-on skills training component and assessment tool. The module includes the opportunity to “definitely measure and document those skills.” The test measures “cognitive knowledge, case/problem management skills and manual dexterity.”51 The educational foundation for the manual dexterity assessment came from McGill University’s McGill Inanimate System for Training and Evaluation of Laparoscopic Skills (MISTELS) physical laparoscopic simulator, which was validated by evidence of construct validity (residents improved their scores over time), external validity (similar results at five additional institutions) and predictive validity (MISTELS scores correlated with intra-operative ratings obtained during laparoscopic cholecystectomy, r ¼ 0.81).52 Approximately two-thirds of the variance observed in the operating room was predicted from the MISTRELS scores. The authors’ opinion was that the remaining variance was related to patient factors and anatomic recognition. The five FLS skill tasks are listed in Table 8. To set competency cut-offs, groups of medical student and junior surgical residents were compared to chief surgical residents with laparoscopic surgery experience. The scores of the groups were compared and based upon a receiver-operating

Table 8
Fundamentals of laparoscopic surgery tasks.

Skill                  Description
Peg Transfer           Lift six pegs, transfer to other hand and place in second pegboard
Pattern Cutting        Cut a circular pattern 5 cm in diameter within 300 s
Place Ligating Loop    Place and secure pretied slipknot at specified location on a foam tube
Intracorporeal Knot    Place and tie 12 cm suture at two premarked points on a longitudinally slit Penrose drain within 600 s
Extracorporeal Knot    Place and tie a suture at two premarked points using a knot pusher

Table 8 lists the five skills modules for laparoscopic surgery certification. Adapted from Fraser 2003.53


R.H. Steadman, Y.M. Huang / Best Practice & Research Clinical Anaesthesiology 26 (2012) 3–15

characteristic curve, a cut-off score for competency was set to achieve a sensitivity of 80% and a specificity of 82%.53 FLS courses are offered at 70 regional centres, and at the annual ACS Clinical Congress and the Society of American Gastrointestinal and Endoscopic Surgeons (SAGES) Annual Meeting. Of note, the low cost of the FLS system makes it one of the least expensive simulations in use for credentialing.

Incentives and mandates for simulation as quality assurance

Just as the airline industry has mandated simulation training at regular intervals for airline pilots and crews, the health-care industry may demand that health-care providers participate in simulation training. In New England, the Controlled Risk Insurance Company (CRICO) provides incentives for anaesthesiologists to participate regularly in simulation training through reduced malpractice premiums, and there is evidence that simulation reduces the number of medicolegal claims.54 As the literature shows more evidence linking simulation to direct patient benefits, more hospitals will mandate simulation as a quality assurance tool to improve procedures and systematic processes.55,56 For at least one medical device, a then-newly approved percutaneously placed carotid stent, simulation-based proficiency must be demonstrated before practitioners can deploy the device in patients.57 Legislative initiatives such as the Enhancing Safety in Medicine Utilizing Leading Advanced Simulation Technologies to Improve Outcomes Now (SIMULATION) Act, first introduced in the United States House of Representatives in 2007 and again in 2009, have drawn attention to new training strategies for health-care professionals. These efforts, along with research reports, have resulted in federal funds to encourage further research on simulation as a patient safety tool.
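For readers interested in how a receiver-operating characteristic cut-off of the kind described above for FLS is derived, the approach can be sketched in a few lines of code. The scores below are invented for illustration (they are not the FLS data), and Youden's index is used here as one common selection rule; the FLS study itself chose the point on the curve yielding 80% sensitivity and 82% specificity.

```python
# Illustrative sketch only: choosing a pass/fail cut-off from two groups
# of scores by sweeping candidate thresholds along an ROC curve.

def roc_cutoff(competent, novice):
    """Sweep every observed score as a candidate pass mark and return the
    (cutoff, sensitivity, specificity) maximising Youden's index
    (sensitivity + specificity - 1)."""
    best = None
    for t in sorted(set(competent) | set(novice)):
        sens = sum(s >= t for s in competent) / len(competent)  # true-positive rate
        spec = sum(s < t for s in novice) / len(novice)         # true-negative rate
        youden = sens + spec - 1
        if best is None or youden > best[0]:
            best = (youden, t, sens, spec)
    return best[1:]

competent = [78, 82, 85, 88, 90, 91, 93, 95]  # hypothetical chief-resident scores
novice = [40, 55, 60, 62, 65, 70, 72, 80]     # hypothetical student/junior scores

cutoff, sens, spec = roc_cutoff(competent, novice)
print(cutoff, round(sens, 2), round(spec, 2))  # 78 1.0 0.88

# The reported simulator/intra-operative correlation r = 0.81 implies a
# proportion of variance explained of r**2, i.e. roughly two-thirds:
print(round(0.81 ** 2, 2))  # 0.66
```

In practice the cut-off is then fixed and applied prospectively; the sensitivity/specificity pair reported for FLS simply corresponds to a different operating point on the same curve.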
The Agency for Healthcare Research and Quality (AHRQ), the Anesthesia Patient Safety Foundation (APSF), the Foundation for Education and Research in Anesthesia (FAER) and many other speciality societies have allocated funds for simulation research. Furthermore, consensus meetings with simulation experts have generated many questions for collaborative research.58 Considerable resources have been dedicated to simulation-based training centres. In addition to the ASA and ABA efforts in anaesthesiology (described above), other clinical speciality societies have also established standing committees to address simulation education. The American College of Surgeons (ACS) created a consortium of ACS-accredited Education Institutes (AEI) that offer “global opportunities for collaboration, research and access to resources” designed to enhance patient safety through simulation. The consortium consists of 62 accredited programmes, 11 of which are outside the United States. To achieve accreditation, applicant centres must describe their education programmes targeting physicians, residents, medical students, nurses and/or allied health professionals. A full description can be found on the ACS website.59 Growing interest in simulation prompted the creation of an interprofessional society, the Society for Simulation in Healthcare (SSH), whose mission is to “facilitate excellence in healthcare education, practice and research through simulation.” SSH also offers accreditation as a quality control for simulation training and research.60

Summary

Training health-care providers on patients, as we have done for years, is no longer acceptable. Patients expect proficient health-care providers, and simulation provides an alternative to the outdated apprenticeship model. Training programmes agree, and the majority have instituted experiential learning using simulation.
The challenges facing educators in this new paradigm include designing simulation activities around sound theoretical principles, monitoring programmes to improve content and instruction, and managing the logistical challenges of simulation technology and the constraints of time, personnel and finances. Quality assurance of simulation training is measured by evidence that learning transfers to the clinical setting; such evidence exists, but more is needed. Assessment with simulation has primarily been formative, although simulation for high-stakes certification has been established for specific skills. Considerable work remains before simulation gains widespread acceptance for assessment. Yet it seems likely that simulation-based training is here to stay, and that simulation-based assessment is around the corner. Mandates from the federal government, the


national speciality boards, the state medical boards and licensing agencies, local medical centre medical staff offices and/or insurers may impose requirements before every medical practitioner is convinced that simulation-based assessment is warranted. As health-care professionals, each of us should be ready to demonstrate our skills. As educators, we need to prepare for evolving teaching strategies. As more and more of us train and teach with simulation, the thought of being assessed with simulation becomes less daunting and more tangible.

Practice points

- Educational principles such as feedback and deliberate practice must be integrated with simulation for it to be effective as a teaching tool.
- Simulation is firmly established in undergraduate training programmes, and graduate medical education accreditation committees are beginning to mandate simulation training for residents. Institutions need to allocate resources for these programmes to be successful.
- Training programmes need to prepare their educators to teach with simulation, as workforce restrictions and patient safety concerns shift the educational model from apprenticeship to new, interactive training methods.
- Simulation needs to be considered as an assessment tool given its theoretical advantages over other modalities of certification such as written or oral examinations. Knowledge is no longer sufficient by itself; the public demands demonstrated competence from health-care practitioners.
- Exposure to simulation during training prepares practitioners for simulation-based certification and credentialing.
- Competence is multidimensional, and high-stakes assessment must incorporate multiple forms of evaluation. Simulation should be considered as one component.

Research agenda

- Further research is needed to validate simulators and determine their optimal applications.
- The appropriate duration and frequency of training need to be determined.
- Evidence for the transferability of simulation training to clinical performance outcomes is needed.
- The circumstances in which simulation is superior to other training or testing modalities need to be determined.
- Performance metrics and scoring standards need to be established.
- The cost effectiveness of simulation needs to be evaluated.
- The key instructor attributes for simulation-based education need to be identified.

Funding source

None.

Conflict of interest

None.

References

1. Bradley P. The history of simulation in medical education and possible future directions. Med Educ 2006 Mar; 40(3): 254–262.
2. Cooper JB & Taqueti VR. A brief history of the development of mannequin simulators for clinical education and training. Postgrad Med J 2008 Nov; 84(997): 563–570.

3. Good ML. Patient simulation for training basic and advanced clinical skills. Med Educ 2003 Nov; 37(Suppl. 1): 14–21.
4. Gaba DM. The future vision of simulation in health care. Qual Saf Health Care 2004 Oct; 13(Suppl. 1): i2–10.
5. Kolb D. Experiential learning: experience as the source of learning and development. New Jersey: Prentice-Hall, 1984.
*6. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med 2004 Oct; 79(10 Suppl.): S70–S81.
7. Committee on Developments in the Science of Learning with additional material from the Committee on Learning Research and Educational Practice, NRC. How people learn. National Academy of Sciences, 2000. Available from: http://www.nap.edu/catalog.php?record_id=9853.
8. Hofmann B. Why simulation can be efficient: on the preconditions of efficient learning in complex technology based practices. BMC Med Educ 2009; 9: 48.
9. Levine AI & Swartz MH. Standardized patients: the “other” simulation. J Crit Care 2008 Jun; 23(2): 179–184.
10. Sinz E. Simulation-based education for cardiac, thoracic, and vascular anesthesiology. Semin Cardiothorac Vasc Anesth 2005 Dec; 9(4): 291–307.
11. Ostergaard D, Dieckmann P & Lippert A. Simulation and CRM. Best Pract Res Clin Anaesthesiol 2011 Jun; 25(2): 239–249.
*12. Issenberg SB, McGaghie WC, Petrusa ER et al. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach 2005 Jan; 27(1): 10–28.
*13. McGaghie WC, Issenberg SB, Petrusa ER et al. A critical review of simulation-based medical education research: 2003–2009. Med Educ 2010 Jan; 44(1): 50–63.
14. Okuda Y, Bryson EO, DeMaria Jr. S et al. The utility of simulation in medical education: what is the evidence? Mt Sinai J Med 2009 Aug; 76(4): 330–343.
*15. Albanese M, Mejicano G & Gruppen L. Perspective: competency-based medical education: a defense against the four horsemen of the medical education apocalypse. Acad Med 2008 Dec; 83(12): 1132–1139.
16. California Board of Registered Nursing. Final statement of reasons, November 9, 2009 [cited 2011 October 29]; Available from: www.rn.ca.gov/pdfs/regulations/fsoraddendum.pdf.
17. Clinical Instruction in Prelicensure Nursing Programs – National Council of State Boards of Nursing (NCSBN) Position Paper. 2005 [cited 2011 October 29]; Available from: www.rn.ca.gov/pdfs/ncsbn-clininstruct.pdf.
18. Steadman R & Matevosian R. Incorporating simulation into the medical school curriculum. In Riley RH (ed.). Manual of simulation in healthcare. New York, NY: Oxford University Press, 2008, pp. 421–434.
19. Passiment M, Sacks H & Huang G. Medical simulation in medical education: results of an AAMC survey. 2011 [cited 2011 October 29]; Available from: https://www.aamc.org/download/259760/data/medicalsimulationinmedicaleducationanaamcsurvey.pdf.
20. ACGME. Simulation: new revision to program requirements. RRC News Anesthesiology March 2011 [Sect. 1–6].
21. Dieckmann P & Rall M. Becoming a simulation instructor and learning to facilitate: the Instructor and Facilitation Training (InFacT) course. In Kyle RR & Murray WB (eds.). Clinical simulation: operations, engineering, and management. Burlington, MA: Academic Press, 2008, pp. 647–652.
22. Salas E, Klein C, King H et al. Debriefing medical teams: 12 evidence-based best practices and tips. Jt Comm J Qual Patient Saf 2008 Sep; 34(9): 518–527.
23. Rudolph JW, Simon R, Rivard P et al. Debriefing with good judgment: combining rigorous feedback with genuine inquiry. Anesthesiol Clin 2007 Jun; 25(2): 361–376.
24. Raemer D, Anderson M, Cheng A et al. Research regarding debriefing as part of the learning process. Simul Healthc 2011 Aug; 6(Suppl.): S52–S57.
25. Carraccio C, Wolfsthal SD, Englander R et al. Shifting paradigms: from Flexner to competencies. Acad Med 2002 May; 77(5): 361–367.
26. Miller GE. The assessment of clinical skills/competence/performance. Acad Med 1990 Sep; 65(9 Suppl.): S63–S67.
27. Ozuah PO & Reznik M. Using unannounced standardized patients to assess residents’ competency in asthma severity classification. Ambul Pediatr 2008 Mar–Apr; 8(2): 139–142.
28. Battles JB, Wilkinson SL & Lee SJ. Using standardised patients in an objective structured clinical examination as a patient safety tool. Qual Saf Health Care 2004 Oct; 13(Suppl. 1): i46–50.
*29. Furman GE, Smee S & Wilson C. Quality assurance best practices for simulation-based examinations. Simul Healthc 2010 Aug; 5(4): 226–231.
30. Peabody JW, Luck J, Glassman P et al. Comparison of vignettes, standardized patients, and chart abstraction: a prospective validation study of 3 methods for measuring quality. JAMA 2000 Apr 5; 283(13): 1715–1722.
31. Wendling AL, Halan S, Tighe P et al. Virtual humans versus standardized patients: which lead residents to more correct diagnoses? Acad Med 2011 Mar; 86(3): 384–388.
32. Barsuk JH, McGaghie WC, Cohen ER et al. Simulation-based mastery learning reduces complications during central venous catheter insertion in a medical intensive care unit. Crit Care Med 2009 Oct; 37(10): 2697–2701.
33. Bruppacher HR, Alam SK, LeBlanc VR et al. Simulation-based training improves physicians’ performance in patient care in high-stakes clinical setting of cardiac surgery. Anesthesiology 2010 Apr; 112(4): 985–992.
34. Wayne DB, Didwania A, Feinglass J et al. Simulation-based education improves quality of care during cardiac arrest team responses at an academic teaching hospital: a case-control study. Chest 2008 Jan; 133(1): 56–61.
35. Draycott TJ, Crofts JF, Ash JP et al. Improving neonatal outcome through practical shoulder dystocia training. Obstet Gynecol 2008 Jul; 112(1): 14–20.
*36. Cook DA, Hatala R, Brydges R et al. Technology-enhanced simulation for health professions education: a systematic review and meta-analysis. JAMA 2011 Sep 7; 306(9): 978–988.
*37. Davis D, O’Brien MA, Freemantle N et al. Impact of formal continuing medical education: do conferences, workshops, rounds, and other traditional continuing education activities change physician behavior or health care outcomes? JAMA 1999 Sep 1; 282(9): 867–874.
38. Steadman RH. Improving on reality: can simulation facilitate practice change? Anesthesiology 2010 Apr; 112(4): 775–776.
39. Pantazopoulos I, Aggelina A, Barouxis D et al. Cardiologists’ knowledge of the 2005 American Heart Association Resuscitation Guidelines: the Athens Study. Heart Lung 2011 Jul–Aug; 40(4): 278–284.


40. Wayne DB, Siddall VJ, Butter J et al. A longitudinal study of internal medicine residents’ retention of advanced cardiac life support skills. Acad Med 2006 Oct; 81(10 Suppl.): S9–S12.
41. Vassiliou MC, Dunkin BJ, Marks JM et al. FLS and FES: comprehensive models of training and assessment. Surg Clin North Am 2010 Jun; 90(3): 535–558.
42. Okrainec A, Soper NJ, Swanstrom LL et al. Trends and results of the first 5 years of Fundamentals of Laparoscopic Surgery (FLS) certification testing. Surg Endosc 2011 Apr; 25(4): 1192–1198.
43. Boulet JR, McKinley DW, Whelan GP et al. Clinical skills deficiencies among first-year residents: utility of the ECFMG clinical skills assessment. Acad Med 2002 Oct; 77(10 Suppl.): S33–S35.
*44. van Zanten M, Boulet JR & McKinley D. Using standardized patients to assess the interpersonal skills of physicians: six years’ experience with a high-stakes certification examination. Health Commun 2007; 22(3): 195–205.
*45. Park CS. Simulation and quality improvement in anesthesiology. Anesthesiol Clin 2011 Mar; 29(1): 13–28.
46. Weller J, Morris R, Watterson L et al. Effective management of anaesthetic crises: development and evaluation of a college-accredited simulation-based course for anaesthesia education in Australia and New Zealand. Simul Healthc 2006 Winter; 1(4): 209–214.
47. Berkenstadt H, Ziv A, Gafni N et al. Incorporating simulation-based objective structured clinical examination into the Israeli National Board Examination in Anesthesiology. Anesth Analg 2006 Mar; 102(3): 853–858.
48. Simulation education. American Society of Anesthesiologists, 2007 [cited 2011 October 29]; Available from: http://www.asahq.org/SIM/homepage.html.
49. American Board of Family Medicine. 2011 [cited 2011 October 29]; Available from: www.theabfm.org/moc/part2.aspx.
50. Hagen MD, Ivins DJ, Puffer JC et al. Maintenance of certification for family physicians (MC-FP) self assessment modules (SAMs): the first year. J Am Board Fam Med 2006 Jul–Aug; 19(4): 398–403.
51. Fundamentals of Laparoscopic Surgery. 2003–2010 [cited 2010 October 29]; Available from: http://www.flsprogram.org/.
52. Fried GM, Feldman LS, Vassiliou MC et al. Proving the value of simulation in laparoscopic surgery. Ann Surg 2004 Oct; 240(3): 518–525 [discussion 25–8].
53. Fraser SA, Klassen DR, Feldman LS et al. Evaluating laparoscopic skills: setting the pass/fail score for the MISTELS system. Surg Endosc 2003 Jun; 17(6): 964–967.
54. McCarthy J & Cooper J. Malpractice insurance carrier provides premium incentive for simulation based training and believes it’s made a difference. Anesth Patient Saf Found Newslett 2007; 17.
55. Geis GL, Pio B, Pendergrass TL et al. Simulation to assess the safety of new healthcare teams and new facilities. Simul Healthc 2011 Jun; 6(3): 125–133.
56. Hamman WR, Beaudin-Seiler BM, Beaubien JM et al. Using simulation to identify and resolve threats to patient safety. Am J Manag Care 2010 Jun; 16(6): e145–e150.
57. Gallagher AG & Cates CU. Approval of virtual reality training for carotid stenting: what this means for procedural-based medicine. JAMA 2004 Dec 22; 292(24): 3024–3026.
58. Dieckmann P, Phero JC, Issenberg SB et al. The first Research Consensus Summit of the Society for Simulation in Healthcare: conduction and a synthesis of the results. Simul Healthc 2011 Aug; 6(Suppl.): S1–S9.
59. American College of Surgeons, Division of Education, Accredited Education Institutes. Enhancing patient safety through simulation. 2011 [updated 08/24/2011]; Available from: http://www.facs.org/education/accreditationprogram/requirements.html.
60. Society for Simulation in Healthcare, Council for Accreditation of Healthcare Simulation Programs. Informational guide for the accreditation process from the SSH Council for Accreditation of Healthcare Simulation Programs. 2010 [updated 2010]; [1–52]. Available from: http://ssih.org/cats-accreditation.
61. Program Requirements for Residency Education in Anesthesiology. [cited 2011 October 29]; Available from: http://www.acgme.org/acWebsite/navPages/nav_comPR.asp.