Procedural training and assessment of competency utilizing simulation

Seminars in Perinatology (2016)
Available online at www.sciencedirect.com
www.seminperinat.com
Taylor Sawyer, DO, MEd, and Megan M. Gray, MD

Division of Neonatology, Department of Pediatrics, Neonatal Education and Simulation-Based Training (NEST) Program, University of Washington School of Medicine and Seattle Children's Hospital, 1959 NE Pacific St, RR451 HSB, Box 356320, Seattle, WA

Article info

Keywords: Neonatal procedures; Procedural skills; Procedural training; Competency assessment; Neonatal simulation

Abstract

This review examines the current environment of neonatal procedural learning, describes an updated model of skills training, defines the role of simulation in assessing competency, and discusses potential future directions for simulation-based competency assessment. To maximize impact, simulation-based procedural training programs should follow a standardized and evidence-based approach to designing and evaluating educational activities. Simulation can be used to facilitate the evaluation of competency, but must incorporate validated assessment tools to ensure quality and consistency. True competency evaluation cannot be accomplished with simulation alone: competency assessment must also include evaluations of procedural skill during actual clinical care. Future work in this area is needed to measure and track clinically meaningful patient outcomes resulting from simulation-based training, to examine the use of simulation to assist physicians undergoing re-entry to practice, and to examine the use of procedural skills simulation as part of maintenance of competency and life-long learning.

© 2016 Elsevier Inc. All rights reserved.

*Corresponding author. E-mail address: [email protected] (T. Sawyer).

http://dx.doi.org/10.1053/j.semperi.2016.08.004
0146-0005/© 2016 Elsevier Inc. All rights reserved.

Introduction

Procedures are fundamental to the care of the newborn. Currently, all graduating pediatric, neonatal, and obstetric physicians are expected to competently perform key neonatal resuscitation procedures without direct supervision, with additional procedures required to become a practicing pediatrician or neonatologist. Acquiring competency in performing neonatal procedures requires education, training, and practice. Once competency is acquired, maintaining skills through continued practice is essential in order to avoid natural skill decay. Medical educators are coming under increased pressure to develop valid methods to ensure the procedural competency of trainees and practicing clinicians.1–3 In recent years there has been a decrease in the number of invasive procedures in pediatric and neonatal care, and a subsequent increase in the use of simulation-based medical education for procedural training in these areas.4 Simulation has the benefit of being learner focused and posing no risk to patients. Today, almost all neonatal–perinatal medicine training programs use simulation to teach neonatal resuscitation and procedures.5

In this report we examine the topic of procedural training and assessment of competency utilizing simulation for neonatal clinicians, including those who provide delivery room and neonatal intensive care. We begin with a discussion of the importance of procedures in neonatal care, including cardiopulmonary resuscitation, airway management, and

interventions for life-threatening conditions. Next, we review an evidence-based pedagogy for procedural training. Then, we examine the ability to assess procedural competency using simulation. Finally, we investigate the potential future directions that procedural training and simulation research may take.

Procedures in neonatal care

Approximately 1% of all infants require extensive resuscitation at birth, and up to 15% of extremely low birth weight infants require cardiopulmonary resuscitation (CPR).6,7 The majority of preterm and ill newborns will need additional procedures during their hospitalization, including vascular access and airway management. The Accreditation Council for Graduate Medical Education (ACGME) states that pediatric, neonatal–perinatal medicine, and obstetrics and gynecology trainees must be competent to perform the procedures considered essential for their area of practice.8 For neonatal–perinatal fellows this includes neonatal resuscitation, venous and arterial access, evacuation of air leaks, and endotracheal intubation.8 For pediatric residents this includes neonatal resuscitation, neonatal intubation, and umbilical vascular access.9 For obstetrical residents this is limited to the provision of neonatal resuscitation.10 In the United States, the standard method for training providers in neonatal resuscitation is attendance at the Neonatal Resuscitation Program (NRP) course.
Despite many trainees and providers acquiring NRP provider status, video reviews of real delivery room resuscitations revealed that 50% of patients experienced deviations from recommended NRP care guidelines.11,12 The NRP advocates for the use of simulation as a methodology to educate and train providers in evidence-based resuscitation practices prior to providing clinical care in the delivery room.13 After institution of NRP training, one program reported significant improvements in APGAR scores,14 and another showed reductions in neurodevelopmental impairment in infants with perinatal asphyxia.15 In the obstetrics literature, a large retrospective review showed that simulation of delivery room emergencies was associated with significant reductions in infants born with low APGAR scores and hypoxic ischemic encephalopathy.16 Simulation plays an important role in identifying deficiencies that may lead to compromised clinical care. Simulation has uncovered deficiencies in CPR delivery,17 timing of basic life support initiation,18 timing of cardioversion,19 and intubation skills.20

Despite the importance of developing competency in performing resuscitation and technical procedures during training, there is a lack of high-quality studies focused on this area. Procedural experience acquired during training is highly variable and influenced by multiple factors, including number of weeks of service, nights on call, patient census, and natural variations in the acuity of the patients during trainee rotations.3 Several reports over the past decade describe a concerning trend toward a lack of competency in pediatric residents in neonatal procedures.21–25 These trends likely result from the decrease in time residents spend in the neonatal intensive care unit during training, and the increasing use of nurse practitioners and neonatal hospitalists, who


compete for procedural experience in order to maintain their own competency.26–28 Given these trends, it is possible that many pediatric residents are not adequately experienced to perform key neonatal procedures independently upon graduation from training. These changes place an increased importance on the utilization of simulation to ensure that trainees receive sufficient procedural training and experience prior to engaging in independent practice.

A recent survey by Sawyer et al.3 described the procedural experience of current neonatal–perinatal fellows during training, and examined the methods used by programs to determine procedural competency. Figure 1 shows the procedural experience for the most recent graduating year of neonatal–perinatal fellows; as expected, intubation and umbilical line placement are the most frequently performed neonatal procedures. Troublingly, procedures such as thoracentesis, chest tube placement, pericardiocentesis, and exchange transfusion are performed infrequently by fellows, despite the high likelihood of patient mortality and morbidity if they are performed incorrectly. For most of the programs surveyed, procedural competency was determined by direct observation of performance on a live patient by an attending physician, a predetermined but variable number of times. However, the majority of fellows reported no standardized process to define when they could perform a procedure independently.
To address inadequacies in procedural experience, many programs use simulation to augment clinical training.5 Simulation sessions can occur in the form of “boot camps” and as a series of simulation experiences conducted longitudinally during training.29,30 In either format, evidence-based methods of training, including incorporation of clear learning objectives, curriculum integration, feedback and debriefing, deliberate practice, mastery learning, and a range of difficulty and clinical variation, should be employed in order to achieve the highest impact.4

Procedural training using simulation: An evidence-based approach

The Halstedian mantra, “see one, do one, teach one,” is the traditional paradigm for teaching procedural skills in medicine. In this paradigm, procedural competency is acquired through direct patient care, with trainees practicing procedures on real patients as part of a medical apprenticeship model. This training model has been highly scrutinized within the past decade due to patient safety concerns.31,32 Simulation-based medical education is an instructional technique that offers the ability to safely gain competency in procedural skills without risk of harm to patients. Its use has been associated with better patient care and improved patient safety.33–39 The utility of simulation for procedural training has been recently reviewed,36 and its use is advocated by the ACGME.40 Thus, a modern pedagogy for procedural skill education should incorporate simulation as a fundamental component.

Sawyer et al.41 recently described an evidence-based pedagogy for procedural skills training. The model consists of 6 stages of skill training, identified as “Learn, See, Practice,


Fig. 1 – Procedural experience of graduating neonatal–perinatal fellows in 2015. [Bar chart; x-axis: Number of Procedures (0–120); procedures shown: endotracheal intubation, umbilical venous catheter placement, umbilical artery catheter placement, DR code, NICU code, peripheral arterial line placement, chest tube placement, PICC placement, double volume exchange transfusion, and thoracentesis (needle aspiration).]

Prove, Do, Maintain” (LSPPDM). The framework divides the teaching and learning of procedural skills into two phases: the cognitive phase and the psychomotor phase. An overview of the framework is presented in Figure 2. This model relies heavily on simulation, both as an educational technique and as a method of skills assessment.

The first stage of procedural training in the LSPPDM pedagogy involves learning about the procedure through reading, didactic sessions, or multimedia presentations (Learn). Traditional synchronous didactics, such as classroom sessions or one-on-one tutoring, have the benefit of allowing a two-way flow of information between the instructor and the learner, but can be time intensive and difficult to schedule. Asynchronous modalities, including recorded lectures, self-guided reading, and E-learning modules, allow the educational content to be distributed to a wider audience of

learners with a lower, one-time investment from the expert. However, asynchronous modalities leave little room for clarification if the learner needs more information or assistance with the material. Educators should use the level of the learner and the complexity of the material as a guide for choosing between synchronous and asynchronous formats. Certain aspects of the procedure, such as the indications, contraindications, equipment needed, basic steps, and potential complications, are discussed during the Learn stage. Verification of content knowledge with a standardized test can be used to ensure adequate cognitive preparation prior to moving to hands-on training. Learners who do not demonstrate adequate understanding of the basics of a procedure are at risk of making errors and may put patients in danger, and so should be required to review the material and retake the test until reaching a passing level.

Fig. 2 – The “Learn, See, Practice, Prove, Do, and Maintain” pedagogy. (Adapted with permission from Sawyer et al.41).


The next step of training involves demonstration and modeling of the procedure for the learner by an expert (See). This usually starts with an annotated demonstration of each step, deconstructed and described to point out the most salient aspects of each part of the procedure. The expert may also do a full run-through of the procedure, as it would occur in real time, to provide an accurate model for performance and timing. Demonstrations can be done either in person using a simulator, or via pre-recorded videos, which could utilize footage of either a real or simulated patient.

In the Practice step, learners are allowed the opportunity for deliberate practice using simulation. As defined by Ericsson, deliberate practice describes a regimen of effortful activity designed to optimize improvements in the acquisition of expert performance.42 Simulating the procedure leverages many of the key features of deliberate practice by allowing for well-defined learning objectives, permitting focused and repetitive practice, providing precise measurements of performance, and delivering formative feedback. Evaluation at this phase is directed at defining areas for improvement and sequential modification to optimize performance, and is provided through coaching from the educator while the learner performs the procedure on the simulator. Feedback is critical to learning, as evidenced by the study by van Schaik et al.,43 in which pediatric resident confidence with resuscitation correlated better with mock code experience than real code experience. This phenomenon is potentially due to the more active participation during mock codes compared to real codes, as well as the addition of a facilitated debriefing at the end of mock codes to correct deficiencies and solidify learning.

In the Prove step, learners have their skill objectively assessed on a simulator to ensure competency has been achieved prior to performing the procedure in clinical settings.
This step utilizes simulation-based mastery learning (SBML).35 Mastery learning augments deliberate practice through the addition of a clearly delineated level of performance that defines “mastery,” and the requirement for continuous practice until mastery-level performance is achieved.44 Mastery is defined practically as achieving a predetermined score on an assessment tool, such as a procedure checklist. The development of checklists used for SBML, and the process by which the mastery score is established, takes considerable effort and is discussed in the next section. Competency assessment using SBML is an important use of simulation as a patient safety modality as it offers a tool for determining which learners will benefit from further practice on a simulator, and clearly defines a level of performance required of a learner before they are eligible to attempt a supervised procedure on a live patient.45 In one study of pediatric residents undergoing newborn resuscitation training, approximately 40% of participants failed a simulation-based assessment and required additional training; with remediation, the majority were able to pass the assessment on retesting,46 highlighting the benefits of formative assessment. In a meta-analysis of simulation-based medical education, mastery learning was found to be superior to non-mastery instruction, with large improvements in skills and moderate improvements in patient outcomes.47
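The pass/fail logic of SBML described above can be illustrated with a short sketch. This is a hypothetical example, not any program's actual tool: the 21-item checklist and the 90% mastery threshold are invented, and in practice a threshold would be set by a formal, defensible standard-setting process.

```python
# Hypothetical sketch of a simulation-based mastery learning (SBML) decision:
# a learner repeats simulator practice until a predetermined mastery score
# is reached on a procedure checklist. Threshold and item counts are invented.

MASTERY_THRESHOLD = 0.90  # illustrative; real cut scores come from standard setting

def checklist_score(items_correct, items_total):
    """Fraction of checklist items performed correctly on the simulator."""
    return items_correct / items_total

def mastery_decision(items_correct, items_total, threshold=MASTERY_THRESHOLD):
    """Advance to supervised clinical practice only at mastery-level performance."""
    if checklist_score(items_correct, items_total) >= threshold:
        return "advance to supervised clinical practice"
    return "continue deliberate practice and reassess"

print(mastery_decision(19, 21))  # 19/21 ~ 0.905 -> advance
print(mastery_decision(15, 21))  # 15/21 ~ 0.714 -> continue practice
```

The key design point is that the decision is criterion-referenced: the learner is compared against a fixed performance standard, not against peers, and practice continues until the standard is met.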


In the next two stages of the LSPPDM method, procedural training moves from the realm of simulation to the patient bedside. This represents a key transition point for the learner. In the Do step, the trainee is allowed to perform the procedure on patients under close supervision, with real-time performance-based assessments and feedback. Providing a structured environment within which this supervised practice can take place is accomplished through direct, one-on-one training during a clinical rotation, or during a dedicated medical procedure rotation. Early in this stage an attending physician, or other experienced provider, should supervise the entire procedure and provide specific feedback throughout, as well as conduct a formative assessment of skills after the procedure is complete. Such performance-based assessment of procedural skill in clinical practice is a vital component of competency determination.48 Feedback and assessment in this stage can be accomplished using the same procedural skills checklist that was used for SBML, or via workplace-based assessments.49

Once achieved, competency with a procedural skill degrades over time if the procedure is not regularly practiced. The term “de-skilling” has been applied to the gradual loss of skills through infrequent practice.50 Rarely performed procedures, such as pericardiocentesis, are at particularly high risk for de-skilling, as they are required in only around 1% of neonatal intensive care patients, and individual providers are unlikely to perform them frequently enough to maintain a mastery level of skill.51 Therefore, a Maintain stage that uses ongoing clinical practice supplemented by simulation is essential to ensure the maintenance of procedural competency after the initial training period.
In the Maintain stage the provider tracks their individual procedural experiences prospectively and identifies infrequently performed procedures that would benefit from simulation training. The need for supplemental simulation-based training depends on multiple factors, including the individual experience and needs of the clinician, location-specific factors such as disease incidence, and intrinsic aspects of each procedure such as its difficulty and complication rate. The frequency with which procedures need to be performed in order to maintain competency has not been established. Additionally, competency with different procedures may be maintained for different durations based on prior experience. For example, very senior physicians may have high cumulative experience with a procedure and require only focused simulation on an infrequent basis to maintain their skills, while a very junior resident will likely require more frequent and in-depth simulation to maintain theirs. With this high level of variation in learner needs, an individualized and informed approach is needed to develop a robust procedural skills maintenance curriculum. For practices that do not have methods of tracking individual physicians' procedural experience, a survey-based needs assessment offers a reasonable means of targeting the procedures most in need of simulation for skills maintenance. The development of a procedural skills training curriculum for neonatal faculty, based on a needs assessment of faculty preference, has been associated with


improvements in neonatal attendings' self-reported competency with rare neonatal procedures.52

The LSPPDM pedagogy described above is evidence-based, but is clearly labor and time intensive. Thus, defining the optimal periods within training in which to conduct the initial Learn, See, Practice, and Prove stages of training and assessment is imperative. Resident and fellow “boot camps” are one such venue.29 Boot camps are short-duration (usually 1–3 days) intensive training sessions that generally take place in the first months of residency or fellowship, and involve a focused curriculum to teach fundamental clinical skills either prior to, or just after, beginning clinical rotations. Procedural and resuscitation skills are key components of most boot camps. Using this venue as a time when skills are not only trained but also assessed for mastery could improve the safety of procedures performed by physicians in training.45

Procedural competency assessment utilizing simulation

Competency-based medical education (CBME) has been defined as an “outcomes-based approach to the design, implementation, assessment, and evaluation of a medical education program using an organizing framework of competencies.”53 CBME describes an educational curriculum that encompasses teaching skills, behaviors, and knowledge, and utilizes ongoing assessments and review as the means to achieve competency.54 In CBME terminology, the ability to perform a procedure independently is a “competency,” that is, an observable ability of a health care professional. Possessing all the required abilities in all domains, in a certain context, at a defined stage of medical education or practice defines a practitioner as “competent.”53 For example, the development of competency in the performance of all key neonatal procedures is required to become a competent neonatologist.

The ability to use simulation to assess procedural competency in the context of CBME is an area of intense interest. Simulation offers the opportunity for procedural training and competency assessment in a safe environment, without risk of harm to patients. However, evidence to support the validity of the interpretation of the results obtained from competency-based assessment methods used in simulation is often lacking.55 Assessment methods with a weak validity argument include self-reported confidence or competency. Objective methods of assessment include procedural success, length of time to task completion, and counts of attempts.56 More detailed procedural competency assessment tools often take the form of observational checklists or global-rating scales (GRS), which can be completed in person or via video review.57,58 Checklists usually include a detailed list of steps in the performance of a procedure, designed to be objective in nature.
Limitations of checklists include their linear, stepwise nature and the need to complete all steps in order to score highly. Many procedural experts skip, or combine, steps of a procedure, yet are still successful. Such nuances in procedural performance may be overlooked on a checklist. Additionally, checklists rarely differentiate the relative importance of each step, and thus may produce a high score,


even when critical parts of the procedure are omitted or done incorrectly.58 GRS utilize performance rating scales such as: 1 = novice, 3 = competent, 5 = expert; or 1 = ready to observe only, 2 = ready to perform the procedure with supervision immediately available, and 5 = ready to supervise others. As such, GRS provide a more broad-based assessment, or gestalt impression, of competency.58 A specific type of GRS, which includes observable behaviors that define performance at each rating level, is called a behaviorally anchored rating scale (BARS). GRS have considerably less granularity than checklists, and as such provide limited means of giving specific feedback on individual procedural steps. However, they are easier to use in practice, and their ability to detect good and poor procedural performance correlates well with checklist scores in many cases.59 In a recent systematic review, GRS showed higher average inter-item and inter-station reliability, could be used across multiple tasks, and were found to better capture nuanced elements of expertise.60 Checklists and GRS can be used independently or in combination. Based on the benefits and drawbacks of these two assessment methods, the use of a hybrid assessment tool that includes both types of assessment has been advocated.41

Most procedural checklists focus predominantly on technical skills, some measure behavioral skills, such as leadership, teamwork, and communication, and a minority of tools assess both technical and behavioral skills. Behavioral skills are vital to procedural success and resuscitation performance.
Behavioral skills are highly correlated with technical skill in neonatal resuscitation.61–63 Teamwork training is associated with improved process measures and patient outcomes,64 and has been shown to improve the likelihood of error detection and correction.65 The application of behavioral skills in high-pressure situations, such as resuscitation, is termed Crisis Resource Management (CRM), and specific training aimed at enhancing CRM skills has been shown to translate to improved behavioral changes in the clinical setting, and improved patient outcomes.66 Deficiencies in behavioral skills are a well-documented risk factor for patient morbidity and malpractice litigation.64 Therefore, the use of checklists, GRS, or BARS during simulation training to evaluate performance and provide constructive feedback on both technical and behavioral skills affords educators a powerful tool for performance improvement. Such training has been shown to improve performance in procedural skills and neonatal resuscitation.62,67,68

Multiple tools to assess psychomotor skill and procedural competency have been developed for use in medicine and surgery. A recent review determined that the psychometric and edumetric properties of many such tools were limited.69 Only a few assessment tools with well-developed validity arguments have been published for neonatal procedures and neonatal resuscitation. Examples of some published tools are provided in the Table. Significant work is required to develop and perform validity testing of an assessment tool. As described by Pugh et al.,55 a validity framework, including various sources of validity evidence, needs to be applied when reporting and interpreting the results of a simulation-based assessment of procedural competency. Sources of validity evidence include: content, response process, internal


Table – Examples of published tools for assessing competency in neonatal procedures and resuscitation.

Author | Skill assessed | Validity evidence | Skill type (a) | Tool type (b)
Bismilla et al.80 | Neonatal nasal intubation | IRR = 0.88; scores correlate with global-rating score (Spearman's ρ = 0.68) | TS | CL
Gerard et al.59 | Infant LP | Cronbach's α = 0.77 | TS | CL
Lockyer et al.81 | Neonatal resuscitation performance | Cronbach's α = 0.6–0.7 | TS | CL
Rovamo et al.82 | Neonatal resuscitation performance and teamwork | IRR = 0.88 for technical and 0.96 for non-technical skills | TS and BS | CL
van der Heide et al.83 | Neonatal resuscitation performance | ICC = 0.95 and 0.77 for intra- and inter-rater reliability, respectively | TS | CL
Sawyer et al.65 | Teamwork during neonatal resuscitation | Median intra-rater agreement 100%; inter-rater agreement 78.6–84.0%; median κ = 0.85 (intra-rater) and 0.42–0.59 (inter-rater) | BS | CL
Sawyer et al.67 | Neonatal resuscitation performance | IRR = 0.86; IRR for each resuscitation subdomain 0.36–0.79, with a mean κ of 0.63; percentage agreement for each subdomain 84–100%, with mean overall agreement between the two raters of 90.7% | TS | CL
Sawyer et al.67 | Behavioral skill during neonatal resuscitation | Cronbach's α = 0.92 | BS | BARS

(a) Skill type: TS, technical skills; BS, behavioral skills.
(b) Tool type: CL, checklist; GRS, global-rating scale; BARS, behaviorally anchored rating scale.

structure, relationship to other variables, and the consequences of the assessment.70 Each of these sources can provide evidence to support the validity of score-based inferences of procedural competency.

Content validity is the basis for all validity,70,71 and refers to whether or not the content (e.g., steps/metrics) in a tool matches the construct that the tool is intended to measure. For example, does completion of all the items on a procedural skills checklist accurately reflect the ability to competently perform the procedure? Are there extraneous steps? Are there missing steps? Evidence of content validity can be obtained by basing the steps and metrics included in the checklist on previously published tools from the medical literature, textbooks, and/or procedure guides. Expert consensus, as determined by Delphi methodology, is commonly used to provide content validation of the items and steps included on an assessment tool.

Response process validity refers to ensuring the integrity of the data collection, and the work done to minimize error in administering the tool. For example, have raters been instructed on the use of the tool, and is there consistency in how the assessment tool is completed? Controlling for potential sources of error associated with the use of the assessment tool, including rater training and standardizing the environment during assessment, provides evidence of response process validity. Inter-rater reliability is one statistical method used to examine response process validity.72

Internal structure validity relates to the statistical reliability and psychometric properties of the assessment tool. Measures of internal consistency, such as Cronbach's α, provide evidence of internal structure validity. Other specific measures include generalizability analysis and item-total correlation.55
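Two of the reliability statistics discussed above, Cohen's κ (inter-rater agreement, bearing on response process) and Cronbach's α (internal consistency, bearing on internal structure), can be illustrated with toy data. The ratings below are invented for illustration only and are not drawn from any of the cited studies.

```python
# Toy computations of Cohen's kappa (chance-corrected inter-rater agreement)
# and Cronbach's alpha (internal consistency). All ratings are invented.
from statistics import pvariance

def cohens_kappa(a, b):
    """Chance-corrected agreement between two raters' 0/1 item scores."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n   # observed agreement
    pa, pb = sum(a) / n, sum(b) / n              # each rater's rate of "1"
    pe = pa * pb + (1 - pa) * (1 - pb)           # agreement expected by chance
    return (po - pe) / (1 - pe)

def cronbach_alpha(scores):
    """Internal consistency of a k-item tool; scores = per-learner item lists."""
    k = len(scores[0])
    item_vars = [pvariance([row[i] for row in scores]) for i in range(k)]
    total_var = pvariance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Two raters scoring the same 10 checklist items as done (1) / not done (0):
rater_a = [1, 1, 1, 0, 1, 0, 1, 1, 0, 1]
rater_b = [1, 1, 0, 0, 1, 0, 1, 1, 1, 1]
print(round(cohens_kappa(rater_a, rater_b), 2))  # 0.52

# Five learners rated 1-5 on each of four checklist items:
ratings = [[5, 4, 5, 4], [3, 3, 4, 3], [4, 4, 4, 4], [2, 3, 2, 3], [5, 5, 4, 4]]
print(round(cronbach_alpha(ratings), 2))  # 0.9
```

Note that κ corrects raw percentage agreement for the agreement two raters would reach by chance, which is why a tool can show high percentage agreement yet a modest κ, as in some of the tools in the Table.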

Relationship to other variables refers to the concept that the score provided on the assessment tool should correlate with other established methods of measuring procedural success or competency. Such evidence is provided if higher scores on the assessment tool correspond to advanced training levels, or if scores correlate with other performance measures, such as success or complication rates.

Consequence validity involves the intended and unintended consequences of an examination, and relates to the impact of the assessment score on the individual being assessed. An important aspect of consequence validity is the type of assessment being performed: formative versus summative. If summative, what are the consequences of a low score? Will a trainee be prohibited from performing the procedure on a patient based on their score on a simulator? Any pass/fail score determination on an assessment tool used for summative assessment must involve a defensible method for standard setting.55

Due to the inherent differences between simulation and real-life clinical practice, competency assessment during simulation, even using a tool with a high level of validity, should not be considered adequate evidence of true clinical performance.48 Assessment of skill during the supervised performance of procedures on real patients is necessary to determine if a learner can be entrusted to perform the procedure independently, without supervision. The concept of gradually allowing a trainee to perform procedures in the clinical environment without direct supervision is an example of an entrustable professional activity.73 The determination of clinical competency with a procedural skill is challenging, but methods such as procedure logs and direct observation are already in common use in neonatal–perinatal fellowship programs.3 More advanced methods, such as


cumulative sum (CUSUM) analysis of procedural success and failure rates, have also been explored.74 Each of these methods can be time and labor intensive for trainees and supervisors, but if performed regularly have the potential to provide specific, ongoing assessment in order to identify when trainees are ready for independent practice, and when remediation is required.41
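One simple form of the CUSUM idea can be sketched as follows. The acceptable failure rate p0 and decision limit h below are illustrative placeholders; published CUSUM methods derive their boundaries from explicitly chosen acceptable and unacceptable failure rates and error tolerances.

```python
# A toy cumulative sum (CUSUM) chart for monitoring procedural failures.
# Each attempt is coded 1 (failure) or 0 (success); p0 is an assumed
# acceptable failure rate and h a decision limit. Both values are
# illustrative only, not taken from any cited study.

def cusum(outcomes, p0=0.2):
    """Running CUSUM score: drifts up with failures, decays with successes."""
    scores, c = [], 0.0
    for failed in outcomes:
        c = max(0.0, c + failed - p0)
        scores.append(round(c, 2))
    return scores

attempts = [0, 1, 0, 1, 1, 0, 1, 1]   # 1 = failed attempt
scores = cusum(attempts)
print(scores)                          # [0.0, 0.8, 0.6, 1.4, 2.2, 2.0, 2.8, 3.6]
h = 2.0                                # decision limit (illustrative)
print([i for i, s in enumerate(scores) if s > h])  # attempts 4, 6, 7 flag concern
```

A sustained run of failures drives the score above the decision limit, flagging a trainee who may need remediation, while successes let the score decay back toward zero.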

Future directions

In this review we identified many areas in need of further investigation and research. More data are needed at every level: tracking of clinical and simulation procedural experience, design and implementation of effective procedure training programs, and development and utilization of procedural competency assessment tools, especially tools incorporating both technical and behavioral measures. As the nature of training adjusts to changes in duty hours, there is a need for improved methods of tracking and analyzing the procedural experiences of trainees. When deficiencies in experience or competency are identified, educators need evidence-based systems to enhance trainee learning, ranging from individual training and e-learning modules to comprehensive simulation programs. Assessment tools provide a vital means of correlating simulation-based competencies to clinical skills, and allow different programs to be compared in a meaningful way. In a review of pediatric resident training, Mills et al.75 found a limited number of studies focused on simulation-based procedural training and resuscitation, a lack of validated tools, limited collection of patient-level outcomes, and small subject numbers. The ultimate goal of procedural training programs is to improve patient outcomes. As such, further research is needed to measure and track clinically meaningful patient outcomes that are correlated with training efforts. Assessment data from checklists and global rating scales (GRS) used in simulation should be collected in such a way that they can be paired with clinical data to determine which skills and behaviors are most important to patient outcomes. Few studies have successfully tracked patient outcomes associated with simulation-based training endeavors, but the limited evidence is promising. For example, a study by Andreatta et al.76 demonstrated that survival after pediatric cardiopulmonary arrest increased after the start of a mock code program.
Another study by Su et al.77 showed faster extracorporeal cardiopulmonary resuscitation times in real patients after implementation of an extracorporeal cardiopulmonary resuscitation simulation program. Such results are critical to providing evidence of benefit to health system leaders, and may help garner support for simulation-based educational programs. Thus far we have focused on the use of simulation for procedural training and competency assessment for residents and fellows. However, there is increasing interest in the use of simulation to ensure procedural competency of attending physicians and physicians undergoing re-entry to practice. Maintenance of competency in procedural skills is a critical component of life-long learning, and key to the American Board of Medical Specialties core competencies of Patient Care and Procedural Skills, and Practice-based Learning and Improvement. For attending physicians who do not perform a specific procedure on a regular basis in clinical practice, or who have long gaps in clinical time, simulation provides a feasible method to refresh and maintain procedural skills. Descriptions of simulation-based procedural skills maintenance training sessions for neonatal attendings are available.52 Some medical specialty boards, such as anesthesiology, already utilize simulation-based practice performance assessment and improvement programs to satisfy Maintenance of Certification (MOC) requirements.78 The American Board of Pediatrics could feasibly implement such programs into pediatric and neonatal–perinatal medicine board certification. A report by Braun et al. describes skill refresher training used in the US Army for pediatricians returning from deployment, which could be used to assist civilian pediatricians in re-establishing clinical skills upon return to work after long breaks in practice.79 Further research in the area of re-skilling is needed to understand the needs of physicians during re-entry to practice.

Conclusion

The development and maintenance of competency with neonatal procedures is critical to all clinicians who provide care to newborns. Simulation provides a safe environment in which to develop competency with procedures, and to maintain, or re-establish, competency once it has been achieved. Evidence-based frameworks to develop and maintain procedural competency have been developed, but have not been fully embraced by the neonatal community. Future work is needed to develop standardized simulation-based training curricula for pediatric and neonatal programs that incorporate an evidence-based approach, to develop valid procedural competency assessment tools, and to utilize robust methods to track the development and maintenance of procedural competency. Attention to these issues provides an essential step toward improving the procedural competency of neonatal providers, and will translate to improvements in patient safety and care.

References

1. Gaies MG, Landrigan CP, Hafler JP, Sandora TJ. Assessing procedural skills training in pediatric residency programs. Pediatrics. 2007;120(4):715–722.
2. Kasten SJ, Prince ME, Lypson ML. Residents make their lists and program directors check them twice: reviewing case logs. J Grad Med Educ. 2012;4(2):257–260.
3. Sawyer T, French H, Ades A, Johnston L. Neonatal–perinatal medicine fellow procedural experience and competency determination: results of a national survey. J Perinatol. 2016;36(7):570–574. http://dx.doi.org/10.1038/jp.2016.19.
4. Lopreiato JO, Sawyer T. Simulation-based medical education in pediatrics. Acad Pediatr. 2015;15(2):134–142.
5. Johnson L, Mu T, Sawyer T. Use of medical simulation in neonatal–perinatal fellowship training programs. J Neonatal–Perinatal Med. 2012;5(4):339–345.
6. Wyckoff MH, Salhab WA, Heyne RJ, Kendrick DE, Stoll BJ, Laptook AR. Outcome of extremely low birth weight infants who received delivery room cardiopulmonary resuscitation. J Pediatr. 2012;160(2):239–244.e2.
7. Perlman JM, Risser R. Cardiopulmonary resuscitation in the delivery room. Associated clinical events. Arch Pediatr Adolesc Med. 1995;149(1):20–25.
8. Accreditation Council for Graduate Medical Education. ACGME Program Requirements for Graduate Medical Education in Neonatal–Perinatal Medicine; 2013 [cited January 6, 2016].
9. Accreditation Council for Graduate Medical Education. ACGME Program Requirements for Graduate Medical Education in Pediatrics; 2013 [cited January 13, 2016].
10. Council on Resident Education in Obstetrics and Gynecology. Educational Objectives: Core Curriculum in Obstetrics and Gynecology. 10th ed.; 2013 [cited January 13, 2016].
11. Schilleman K, Siew ML, Lopriore E, Morley CJ, Walther FJ, Te Pas AB. Auditing resuscitation of preterm infants at birth by recording video and physiological parameters. Resuscitation. 2012;83(9):1135–1139.
12. McCarthy LK, Morley CJ, Davis PG, Kamlin CO, O'Donnell CP. Timing of interventions in the delivery room: does reality compare with neonatal resuscitation guidelines? J Pediatr. 2013;163(6):1553–1557.e1.
13. Perlman JM, Wyllie J, Kattwinkel J, Atkins DL, Chameides L, Goldsmith JP, et al. Neonatal resuscitation: 2010 international consensus on cardiopulmonary resuscitation and emergency cardiovascular care science with treatment recommendations. Pediatrics. 2010;126(5):e1319–e1344.
14. Patel D, Piotrowski ZH, Nelson MR, Sabich R. Effect of a statewide neonatal resuscitation training program on Apgar scores among high-risk neonates in Illinois. Pediatrics. 2001;107(4):648–655.
15. Duran R, Gorker I, Kucukugurluoglu Y, Ciftdemir NA, Vatansever Ozbek U, Acunas B. Effect of neonatal resuscitation courses on long-term neurodevelopmental outcomes of newborn infants with perinatal asphyxia. Pediatr Int. 2012;54(1):56–59.
16. Draycott T, Sibanda T, Owen L, Akande V, Winter C, Reading S, et al. Does training in obstetric emergencies improve neonatal outcome? BJOG. 2006;113(2):177–182.
17. Hunt EA, Vera K, Diener-West M, Haggerty JA, Nelson KL, Shaffner DH, et al. Delays and errors in cardiopulmonary resuscitation and defibrillation by pediatric residents during simulated cardiopulmonary arrests. Resuscitation. 2009;80(7):819–825.
18. Hunt EA, Walker AR, Shaffner DH, Miller MR, Pronovost PJ. Simulation of in-hospital pediatric medical emergencies and cardiopulmonary arrests: highlighting the importance of the first 5 minutes. Pediatrics. 2008;121(1):e34–e43.
19. Shilkofski NA, Nelson KL, Hunt EA. Recognition and treatment of unstable supraventricular tachycardia by pediatric residents in a simulation scenario. Simul Healthc. 2008;3(1):4–9.
20. Overly FL, Sudikoff SN, Shapiro MJ. High-fidelity medical simulation as an assessment tool for pediatric residents' airway management skills. Pediatr Emerg Care. 2007;23(1):11–15.
21. Downes KJ, Narendran V, Meinzen-Derr J, McClanahan S, Akinbi HT. The lost art of intubation: assessing opportunities for residents to perform neonatal intubation. J Perinatol. 2012;32(12):927–932.
22. Falck AJ, Escobedo MB, Baillargeon JG, Villard LG, Gunkel JH. Proficiency of pediatric residents in performing neonatal endotracheal intubation. Pediatrics. 2003;112(6 Pt 1):1242–1247.
23. Haubner LY, Barry JS, Johnston LC, Soghier L, Tatum PM, Kessler D, et al. Neonatal intubation performance: room for improvement in tertiary neonatal intensive care units. Resuscitation. 2013;84(10):1359–1364.
24. Leone TA, Rich W, Finer NN. Neonatal intubation: success of pediatric trainees. J Pediatr. 2005;146(5):638–641.
25. O'Donnell CP, Kamlin CO, Davis PG, Morley CJ. Endotracheal intubation attempts during neonatal resuscitation: success rates, duration, and adverse effects. Pediatrics. 2006;117(1):e16–e21.
26. DeLaroche A, Riggs T, Maisels MJ. Impact of the new 16-hour duty period on pediatric interns' neonatal education. Clin Pediatr (Phila). 2014;53(1):51–59.
27. Juretschke LJ. New standards for resident duty hours and the potential impact on the neonatal nurse practitioner role. Adv Neonatal Care. 2003;3(4):159–161.
28. Schulman M, Lucchese KR, Sullivan AC. Transition from housestaff to nonphysicians as neonatal intensive care providers: cost, impact on revenue, and quality of care. Am J Perinatol. 1995;12(6):442–446.
29. Sawyer T, French H, Soghier L, Barry J, Johnston L, Anderson J, et al. Boot camps for neonatal–perinatal medicine fellows. NeoReviews. 2014;15(2):e46–e53.
30. Langhan TS, Rigby IJ, Walker IW, Howes D, Donnon T, Lord JA. Simulation-based training in critical resuscitation procedures improves residents' competence. CJEM. 2009;11(6):535–539.
31. Lenchus JD. End of the "see one, do one, teach one" era: the next generation of invasive bedside procedural instruction. J Am Osteopath Assoc. 2010;110(6):340–346.
32. Ziv A, Wolpe PR, Small SD, Glick S. Simulation-based medical education: an ethical imperative. Acad Med. 2003;78(8):783–788.
33. Cook DA, Hatala R, Brydges R, Zendejas B, Szostek JH, Wang AT, et al. Technology-enhanced simulation for health professions education: a systematic review and meta-analysis. J Am Med Assoc. 2011;306(9):978–988.
34. McGaghie WC, Draycott TJ, Dunn WF, Lopez CM, Stefanidis D. Evaluating the impact of simulation on translational patient outcomes. Simul Healthc. 2011;6(Suppl):S42–S47.
35. McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003–2009. Med Educ. 2010;44(1):50–63.
36. Ross JG. Simulation and psychomotor skill acquisition: a review of the literature. Clin Simul Nurs. 2012;8(1):341–349.
37. Schaefer JJ 3rd, Vanderbilt AA, Cason CL, Bauman EB, Glavin RJ, Lee FW, et al. Literature review: instructional design and pedagogy science in healthcare simulation. Simul Healthc. 2011;6(Suppl):S30–S41.
38. Scholtz AK, Monachino AM, Nishisaki A, Nadkarni VM, Lengetti E. Central venous catheter dress rehearsals: translating simulation training to patient care and outcomes. Simul Healthc. 2013;8(5):341–349.
39. Zendejas B, Brydges R, Wang AT, Cook DA. Patient outcomes in simulation-based medical education: a systematic review. J Gen Intern Med. 2013;28(8):1078–1089.
40. Accreditation Council for Graduate Medical Education. Accreditation Council for Graduate Medical Education Bulletin; [cited March 11, 2015].
41. Sawyer T, White M, Zaveri P, Chang T, Ades A, French H, et al. Learn, see, practice, prove, do, maintain: an evidence-based pedagogical framework for procedural skill training in medicine. Acad Med. 2015;90(8):1025–1033.
42. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004;79(Suppl 10):S70–S81.
43. van Schaik SM, Von Kohorn I, O'Sullivan P. Pediatric resident confidence in resuscitation skills relates to mock code experience. Clin Pediatr (Phila). 2008;47(8):777–783.
44. McGaghie WC, Issenberg SB, Cohen ER, Barsuk JH, Wayne DB. Medical education featuring mastery learning with deliberate practice can lead to better health for individuals and populations. Acad Med. 2011;86(11):e8–e9.
45. Cohen ER, Barsuk JH, Moazed F, Caprio T, Didwania A, McGaghie WC, et al. Making July safer: simulation-based mastery learning during intern boot camp. Acad Med. 2013;88(2):233–239.
46. Cusack J, Fawke J. Neonatal resuscitation: are your trainees performing as you think they are? A retrospective review of a structured resuscitation assessment for neonatal medical trainees over an 8-year period. Arch Dis Child Fetal Neonatal Ed. 2012;97(4):F246–F248.
47. Cook DA, Brydges R, Zendejas B, Hamstra SJ, Hatala R. Mastery learning for health professionals using technology-enhanced simulation: a systematic review and meta-analysis. Acad Med. 2013;88(8):1178–1186.
48. Rethans JJ, Norcini JJ, Baron-Maldonado M, Blackmore D, Jolly BC, LaDuca T, et al. The relationship between competence and performance: implications for assessing practice performance. Med Educ. 2002;36(10):901–909.
49. Miller A, Archer J. Impact of workplace based assessment on doctors' education and performance: a systematic review. Br Med J. 2010;341.
50. Levitt LK. Use it or lose it: is de-skilling evidence-based? Rural Remote Health. 2001;1(1):81.
51. Weil BR, Ladd AP, Yoder K. Pericardial effusion and cardiac tamponade associated with central venous catheters in children: an uncommon but serious and treatable condition. J Pediatr Surg. 2010;45(8):1687–1692.
52. Sawyer T, Strandjord T. Simulation-based procedural skills maintenance training for neonatal–perinatal medicine faculty. Cureus. 2014;4(4):e173.
53. Frank JR, Snell LS, Cate OT, Holmboe ES, Carraccio C, Swing SR, et al. Competency-based medical education: theory to practice. Med Teach. 2010;32(8):638–645.
54. Brightwell A, Grant J. Competency-based training: who benefits? Postgrad Med J. 2013;89(1048):107–110.
55. Pugh DM, Wood TJ, Boulet JR. Assessing procedural competence: validity considerations. Simul Healthc. 2015;10(5):288–294.
56. Klotz JJ, Dooley-Hash SL, House JB, Andreatta PB. Pediatric and neonatal intubation training gap analysis: instruction, assessment, and technology. Simul Healthc. 2014;9(6):377–383.
57. Cronin C, Cheang S, Hlynka D, Adair E, Roberts S. Videoconferencing can be used to assess neonatal resuscitation skills. Med Educ. 2001;35(11):1013–1023.
58. Lammers RL, Davenport M, Korley F, Griswold-Theodorson S, Fitch MT, Narang AT, et al. Teaching and assessing procedural skills using simulation: metrics and methodology. Acad Emerg Med. 2008;15(11):1079–1087.
59. Gerard JM, Kessler DO, Braun C, Mehta R, Scalzo AJ, Auerbach M. Validation of global rating scale and checklist instruments for the infant lumbar puncture procedure. Simul Healthc. 2013;8(3):148–154.
60. Ilgen JS, Ma IWY, Hatala R, Cook DA. A systematic review of validity evidence for checklists versus global rating scales in simulation-based assessment. Med Educ. 2015;49(2):161–173.
61. Riem N, Boet S, Bould MD, Tavares W, Naik VN. Do technical skills correlate with non-technical skills in crisis resource management: a simulation study. Br J Anaesth. 2012;109(5):723–728.
62. Sawyer T, Leonard D, Sierocka-Castaneda A, Chan D, Thompson M. Correlations between technical skills and behavioral skills in simulated neonatal resuscitations. J Perinatol. 2014;34(10):781–786.
63. Thomas EJ, Sexton JB, Lasky RE, Helmreich RL, Crandell DS, Tyson J. Teamwork and quality during neonatal care in the delivery room. J Perinatol. 2006;26(3):163–169.
64. Weaver SJ, Dy SM, Rosen MA. Team-training in healthcare: a narrative synthesis of the literature. BMJ Qual Saf. 2014;23(5):359–372.
65. Sawyer T, Laubach VA, Hudak J, Yamamura K, Pocrnich A. Improvements in teamwork during neonatal resuscitation after interprofessional TeamSTEPPS training. Neonatal Netw. 2013;32(1):26–33.
66. Boet S, Bould MD, Fung L, Qosa H, Perrier L, Tavares W, et al. Transfer of learning and patient outcome in simulated crisis resource management: a systematic review. Can J Anaesth. 2014;61(6):571–582.
67. Sawyer T, Sierocka-Castaneda A, Chan D, Berg B, Lustik M, Thompson M. Deliberate practice using simulation improves neonatal resuscitation performance. Simul Healthc. 2011;6(6):327–336.
68. Kessler DO, Auerbach M, Pusic M, Tunik MG, Foltin JC. A randomized trial of simulation-based deliberate practice for infant lumbar puncture skills. Simul Healthc. 2011;6(4):197–203.
69. Jelovsek JE, Kow N, Diwadkar GB. Tools for the direct observation and assessment of psychomotor skills in medical trainees: a systematic review. Med Educ. 2013;47(7):650–673.
70. Downing SM. Validity: on meaningful interpretation of assessment data. Med Educ. 2003;37(9):830–837.
71. Cook DA, Beckman TJ. Current concepts in validity and reliability for psychometric instruments: theory and application. Am J Med. 2006;119(2):166.e7–e16.
72. Downing SM. Reliability: on the reproducibility of assessment data. Med Educ. 2004;38(9):1006–1012.
73. ten Cate O. Trust, competence, and the supervisor's role in postgraduate training. Br Med J. 2006;333(7571):748–751.
74. Rangarajan V, Empie K, Sawyer T. Cumulative sum (CUSUM) analysis to determine procedural competency in neonatal–perinatal medicine fellows: a feasibility study. Pediatric Academic Societies (PAS) Annual Meeting; San Diego, CA; April 2015.
75. Mills DM, Williams DC, Dobson JV. Simulation training as a mechanism for procedural and resuscitation education for pediatric residents: a systematic review. Hosp Pediatr. 2013;3(2):167–176.
76. Andreatta P, Saxton E, Thompson M, Annich G. Simulation-based mock codes significantly correlate with improved pediatric patient cardiopulmonary arrest survival rates. Pediatr Crit Care Med. 2011;12(1):33–38.
77. Su L, Spaeder MC, Jones MB, Sinha P, Nath DS, Jain PN, et al. Implementation of an extracorporeal cardiopulmonary resuscitation simulation program reduces extracorporeal cardiopulmonary resuscitation times in real patients. Pediatr Crit Care Med. 2014;15(9):856–860.
78. Steadman RH, Huang YM. Simulation for quality assurance in training, credentialing and maintenance of certification. Best Pract Res Clin Anaesthesiol. 2012;26(1):3–15.
79. Braun L, Sawyer T, Kavanagh L, Deering S. Facilitating physician reentry to practice: perceived effects of deployments on US Army pediatricians' clinical and procedural skills. J Contin Educ Health Prof. 2014;34(4):252–259.
80. Bismilla Z, Finan E, McNamara PJ, LeBlanc V, Jefferies A, Whyte H. Failure of pediatric and neonatal trainees to meet Canadian Neonatal Resuscitation Program standards for neonatal intubation. J Perinatol. 2010;30(3):182–187.
81. Lockyer J, Singhal N, Fidler H, Weiner G, Aziz K, Curran V. The development and testing of a performance checklist to assess neonatal resuscitation megacode skill. Pediatrics. 2006;118(6):e1739–e1744.
82. Rovamo L, Mattila MM, Andersson S, Rosenberg P. Assessment of newborn resuscitation skills of physicians with a simulator manikin. Arch Dis Child Fetal Neonatal Ed. 2011;96(5):F383–F389.
83. van der Heide PA, van Toledo-Eppinga L, van der Heide M, van der Lee JH. Assessment of neonatal resuscitation skills: a reliable and valid scoring system. Resuscitation. 2006;71(2):212–221.