Interprofessional simulation to improve safety in the epilepsy monitoring unit

Epilepsy & Behavior 45 (2015) 229–233


Barbara A. Dworetzky a,⁎, Sarah Peyre b, Ellen J. Bubrick a, Tracey A. Milligan a, Steven J. Yule a, Heidi Doucette a, Charles N. Pozner a

a Brigham and Women's Hospital, Harvard Medical School, 75 Francis Street, Boston, MA 02115, USA
b University of Rochester Medical Center, 601 Elmwood Avenue, Rochester, NY 14642, USA

Article info

Article history: Received 1 October 2014; Revised 18 December 2014; Accepted 13 January 2015; Available online 23 March 2015

Keywords: All epilepsy, seizures; Medical simulation; Patient safety; Epilepsy monitoring; Education

Abstract

Objective: Patient safety is critical for epilepsy monitoring units (EMUs). Effective training is important for educating all personnel, including residents and nurses who frequently cover these units. We performed a needs assessment and developed a simulation-based team training curriculum employing actual EMU sentinel events to train neurology resident–nurse interprofessional teams to maximize effective responses to high-acuity events.

Methods: A mixed-methods design was used. This included the development of a safe-practice checklist to assess team response to acute events in the EMU using expert review with consensus (a modified Delphi process). All nineteen incoming first-year neurology residents and 2 nurses completed a questionnaire assessing baseline knowledge and attitudes regarding seizure management prior to and following a team training program employing simulation and postscenario debriefing. Four resident–nurse teams were recorded while participating in two simulated scenarios. Employing retrospective video review, four trained raters used the newly developed safe-practice checklist to assess team performance. We calculated the interobserver reliability of the checklist for consistency among the raters. We attempted to ascertain whether the training led to improvement in performance in the actual EMU by comparing 10 videos of resident–nurse team responses to seizures 4–8 months into the academic year preceding the curricular training to 10 that included those who received the training within 4–8 months of the captured video.

Results: Knowledge in seizure management was significantly improved following the program, but confidence in seizure management was not. Interrater agreement was moderate to high for consistency of raters for the majority of individual checklist items. We were unable to demonstrate that the training led to sustainable improvement in performance in the actual EMU by the method we used.

Conclusions: A simulated team training curriculum using a safe-practice checklist to improve the management of acute events in an EMU may be an effective method of training neurology residents. However, translating the results into sustainable benefits and confidence in management in the EMU requires further study. © 2015 Elsevier Inc. All rights reserved.

1. Introduction

While sentinel events in the epilepsy monitoring unit (EMU) are relatively infrequent [1], injuries and deaths have been reported in this electively admitted group of patients [2–4]. There is little evidence [5], and no consensus, regarding best practices for maximizing patient safety in the EMU. Recently, an expert panel recommended the immediate availability of physicians to manage acute events in the EMU [6]. However, there are no recognized guidelines for training personnel in

⁎ Corresponding author at: 75 Francis Street, Boston MA 02115, USA. Tel.: +1 617 732 5946; fax: +1 617 730 2880. E-mail addresses: [email protected] (B.A. Dworetzky), [email protected] (S. Peyre), [email protected] (E.J. Bubrick), [email protected] (T.A. Milligan), [email protected] (S.J. Yule), [email protected] (H. Doucette), [email protected] (C.N. Pozner).

http://dx.doi.org/10.1016/j.yebeh.2015.01.018 1525-5050/© 2015 Elsevier Inc. All rights reserved.

these units. As the goal is to record seizures as they occur, staff may delay urgent management or be confused by vague seizure-like presentations, resulting in misidentification of nonepileptic critical events. Junior residents and nurses are the typical first responders in these units and require training in the appropriate emergency response to acute events. Such training can fulfill several of the new ACGME milestones for residents, including managing nonconvulsive status epilepticus (http://www.acgme.org/acgmeweb/Portals/0/PDFs/Milestones/NeurologyMilestones.pdf) [7]. Medical simulation has been shown to improve patient safety by enhancing team performance during critical events [8]. Interdisciplinary collaboration using interprofessional teams has been shown to maximize safety in emergency situations [9]. Deliberate practice with reflection on performance through debriefing can provide robust and consistent learning opportunities targeted at low-frequency, high-acuity events [10] such as those that can occur in the EMU. Procedural checklists are a
powerful tool in deconstructing the complex tasks associated with patient care and have been shown to directly improve patient outcomes [11]. They have also been used to assess performance [12]. We sought to develop an interprofessional simulation curriculum and a valid and reliable safe-practice checklist that could be used to educate and/or assess the performance of first-responder teams, and to investigate whether this simulation-based training could affect clinical care in the actual EMU.

2. Materials and methods

2.1. Curriculum and checklist development

The curriculum and checklist were based on lessons learned from a root-cause analysis (RCA) and video review of 2 actual EMU sentinel events. These two events were reviewed and deconstructed by a panel of local experts (3 epileptologists with 5 or more years of EMU expertise, 1 emergency physician/simulation expert, and 2 senior RNs, both with neurology and EMU experience). They independently documented lapses in care, highlighting what they felt was absolutely required for the safe care of patients. Each member of the expert panel then submitted open-ended lists of observed appropriate and inappropriate actions. These lists were collated (BD), representing the first round of a modified Delphi process [13]. Redundancies were eliminated through independent review. Further discussions led to consensus and the development of a master list. This list was then resubmitted to the panel, asking members to add, omit, or edit steps as appropriate. This process was repeated 4 times until 100% consensus was reached. The final 10 items were then organized using the clinical communication format "SBAR" (situation, background, assessment, and recommendation) and became our checklist [14]. We used the following performance scale for each of the 10 items on the checklist: needs improvement (NI), meets standard (MS), or exceeds standard (ES).
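As an illustration, a weighted checklist score of this kind might be computed as in the sketch below. The item names follow Table 3, but the NI/MS/ES point values and the per-item weights are hypothetical placeholders; the study's actual scoring key appears in Appendix e-2.

```python
# Hedged sketch of weighted safe-practice checklist scoring.
# Rating points and per-item criticality weights are illustrative assumptions,
# NOT the study's scoring key (which is in Appendix e-2).
RATING_POINTS = {"NI": 0, "MS": 1, "ES": 2}

# Checklist items (names from Table 3) with hypothetical weights:
CHECKLIST = {
    "Notice change in baseline": 1,
    "First responder calls for appropriate help": 3,
    "Appropriate vital signs assessed": 2,
    "Brief history taken": 1,
    "Continued assessment of the patient": 2,
    "Brief appropriate neurology exam performed": 1,
    "Address EEG data": 1,
    "Appropriate treatment offered": 3,
    "Leader emerges": 2,
    "Team communication demonstrated": 2,
}

def score_team(ratings):
    """Weighted score for one team's response; `ratings` maps item -> NI/MS/ES."""
    return sum(CHECKLIST[item] * RATING_POINTS[r] for item, r in ratings.items())

example = {item: "MS" for item in CHECKLIST}
example["First responder calls for appropriate help"] = "ES"
print(score_team(example))  # -> 21 with the hypothetical weights above
```

The design choice captured here is that more critical steps (e.g., calling for help, treatment) contribute more to the total, so two teams with the same number of lapses can still be ranked by the criticality of what they missed.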
A weighted scoring system was developed by expert agreement between one of the senior neurologists and a senior nurse (BD and HD), in which steps deemed more critical received more points. The resulting safe-practice checklist (Appendix e-1) with scoring key (Appendix e-2) was created with strong evidence of construct and face validity. These items represented the learning objectives for the simulation curriculum and were used to assess team performance. An education specialist (SP) provided expertise and oversight throughout the process.

2.2. Simulator environment

The simulations were carried out in the Neil and Elise Wallace STRATUS Center for Medical Simulation at Brigham and Women's Hospital (BWH), a fully interdisciplinary and interprofessional medical simulation center in Boston. Two simulation scenarios were developed to recreate the two sentinel events. One employed a high-fidelity human patient simulator (SimMan 3G, Laerdal, Stavanger) and the other, a standardized patient (live actress). Both were carried out in an environment that emulated an EMU, including patient beds, clinical equipment, and monitors. In the simulator case, participants were able to assess the patient (including pupillary response), monitor vital signs, and perform a variety of procedures including airway management, defibrillation, intravenous access, and medication administration. Both scenarios were controlled and monitored from a control room separated from the simulation environment by a one-way mirror. Each scenario was digitally recorded for postscenario review. Permission to record was prospectively obtained from subjects.

2.3. Subjects

Subjects included 19 incoming first-year neurology residents (PGY2) and 5 neurology nurses with less than 2 years of direct EMU experience. The curriculum was presented as part of a resident orientation program for a large academic residency program. Although resident subjects were required to participate in the orientation, they could opt out of participation in the research study without threat of repercussion. Subjects who gave consent for their videos to be used for research purposes were included. Although five nurses joined the exercise, one was called away to an emergency and did not participate in the scenarios, and 2 of the remaining 4 nurses did not complete the posttest questionnaire and could not be included in the analysis. Subjects were randomly assigned to one of four teams, each consisting of 4–5 residents and one nurse. Since the training occurred during orientation, none of the residents had previously worked with each other or with the nurse on their team. This study was approved by the local institutional review board.

2.4. Program presentation

All subjects attended a 20-minute PowerPoint presentation on the principles of Crisis Resource Management. Initially developed by and for the aviation industry in response to the determination that mishaps most often result, at least in part, from human error, Crisis Resource Management emphasizes role clarity, clear communication, situational awareness, and a variety of other behavioral principles. These principles were introduced to health care over a decade ago by anesthesiologists and are now utilized by numerous institutions and disciplines. The presentation included each principle, supported by examples from both the medical and aviation realms. The subjects also attended an interactive session using videos to teach seizure classification (35 min). The prestudy instruction was intended to provide neither a comprehensive understanding of the management of seizures nor an in-depth approach to the management of medical emergencies.
In fact, it is hypothesized that the learning that occurs as a result of scenario-based simulation takes place during the postscenario debriefing, which leverages the experience of the scenario to emphasize important concepts to engaged adult learners. Subjects received a standardized orientation to the simulator and the simulation environment. The intent of the prestudy instruction was to ensure that each subject had a comparable understanding of these general concepts.

Each team participated in two preprogrammed scenarios modeled after the actual sentinel events. The simulated patient presented with a cardiac arrest mimicking a complex partial seizure. The standardized patient had nonconvulsive status presenting as a psychogenic nonepileptic event (Appendix e-3). In order to maximize fidelity, each scenario was piloted using two second-year neurology residents. For each scenario, a different resident subject was assigned to be the leader. After each scenario, trained facilitators (EB, CNP) provided 25 min of interactive debriefing. Strengths and weaknesses of team performance were identified, and the key learning objectives were emphasized. These included avoiding anchoring on a single diagnosis, calling for help early in the presentation, and concentrating patient management on the first 5 min prior to the arrival of emergency responders. Leadership skills including team communication, resource management, and global assessment were also emphasized. Scenarios were presented in counterbalanced order. The total educational time was 3 h and included a 10-minute closing/evaluation session.

2.5. Data collection and analysis

Preprogram and postprogram knowledge and attitude questionnaires were collected electronically. Course evaluations were collected from nurse and resident subjects immediately following the program.
Four blinded epileptologists (raters) were independently introduced to the 10-item checklist and scored the postprocessed, digitally recorded scenarios in randomized order for each of the four resident–nurse teams and each of the scenarios (a total of 8 scenarios to assess).


2.6. Rater training

Raters received one-on-one training on the itemized checklist using 4 actual videotaped EMU acute events. Training sessions lasted approximately 1.5 h. Videos revealing both well-managed and poorly managed seizures were included.

2.7. Procedure to compare performance in the actual EMU before and after the exercise

Actual EMU cases were reviewed to assess the impact of the training on resident–nurse teams. All videos were obtained as part of standard EMU procedure. Permission to utilize de-identified videos for education or research is obtained prior to admission. Ten archived videos of prolonged seizure events in the EMU from the year preceding the simulation exercise were selected by reviewing consecutive reports in which a junior resident and nurse "responded" at the bedside 4–8 months into the academic year. These ten videos were considered the "control" group. Epilepsy fellows then prospectively identified nurse–resident "responders" present at the patient's bedside for 10 consecutive prolonged seizure events involving different junior residents at least 4–8 months after the simulation orientation exercise. Videos from the 20 team responses were placed on a secure server for presentation in pseudorandom order to a single blinded, trained rater (an epileptologist) who utilized the developed scoring sheet (Appendix e-2). The rater was not familiar with the residents.

2.8. Statistics

Paired-sample one-tailed t-tests were used to assess changes in knowledge following the simulation exercise, comparing presimulation with postsimulation knowledge. The reliability of the checklist for assessing performance was assessed using intraclass correlation coefficients (ICCs).

3. Results

A large majority (86%) of the subjects had some prior educational experience in the simulator.
Subject demographics, including experience in seizure management and medical simulation as well as Advanced Cardiac Life Support (ACLS) certification, are shown in Table 1. The course/curriculum was very favorably reviewed, with over 97% of residents and nurses agreeing or strongly agreeing that the course was challenging, that the faculty were knowledgeable, and that the debriefing was beneficial. Ninety percent said that they planned to use what was learned in their future practice, and three quarters found the scenarios to be realistic. Eighty-two percent responded that they would definitely like to attend this type of course again. Only two of the nurses completed the postquestionnaire data set; therefore, statistical inferences for the nurses were unreliable. Resident knowledge of differential diagnoses, available intravenous medications and dosing, management of agitation and seizures in the EMU, and when the nurse should call

Table 1. Demographics of subjects. Number (%)

Adult neurology residents: 16 (67)
Pediatric neurology residents: 3 (12)
Nurses: 5 (21)
Male: 14 (61)
With medical simulation experience: 13 (69)
ACLS certified: 16 (84)
With ICU experience: 15 (79)
Mean age in years (range): 29.6 (26–34)


the doctor were all improved immediately following the curriculum. Items that did not show any change included knowledge of the duration of postictal states, complications of a seizure, comfort with notifying a supervisor when one did not have adequate knowledge or needed help, and comfort speaking up when recognizing the institution of inappropriate care. However, performance on these items was already high prior to the exercise, potentially representing a "ceiling effect" (Table 2). Two items showed decline after the simulation exercise: comfort with managing status epilepticus and comfort learning in the simulator. For expert raters, the interrater reliability for each checklist item is shown in Table 3. The ICCs for individual raters were low to moderate (0.22–0.58) but were moderate to high (0.59–0.85) for the average of the raters. Lastly, there was no difference in the actual EMU performance of experienced interprofessional teams in the year prior to the orientation exercise compared with 4–8 months following the curricular exercise when assessed by a single rater using the 10-point checklist.

4. Discussion

This study evolved from the investigation of actual sentinel events and near misses in our EMU. It led to the creation of a safe-practice checklist for assessing team performance and a simulation-based training program designed to increase resident awareness and competency in responding to emergencies in the EMU. The checklist, created by a local expert panel using a modified Delphi method, was designed with face and construct validity, and its use was shown to be feasible by trained physician raters. To our knowledge, this is the first time that simulation-based training and assessment of EMU management have been reported in the literature. The Joint Commission lists leadership, communication, coordination, and human factors as the leading causes of sentinel events [15]. Inability to collaborate as a team has been shown to correlate with hospital-based errors [16].
Simulation has been found to be an effective method of teaching these nontechnical skills [17]. Training and evaluation of leadership and communication skills were major objectives of this study and were prominent components of our checklist (items 9 and 10) and scoring system. Medical simulation has also been shown to improve technical skills, such as lumbar puncture in medical residents [18], and to raise PGY1 surgical residents' skills to the level of a PGY2 resident [19]. Observer ratings of team skills have been shown to correlate with team performance during a simulated task [20], and debriefing modestly enhances performance [21]. Medical simulation has not yet been used as a training method for response to acute events in the EMU. Seizures, common events in these units, require timely response and identification, either to confirm the seizure or to identify nonseizure-related emergencies, as delays can and do lead to catastrophic outcomes. Training to avoid such delays is imperative. In our study, significant improvement in the cognitive aspects of seizure management was demonstrated: differential diagnoses of an acute EMU event, appropriate anticonvulsant therapy and doses, the response for an agitated patient in the EMU, and when a nurse should page the doctor. Most of the residents were already comfortable asking for help and admitting that they did not understand something, and already knew the duration of the postictal state and the complications of a seizure. There may have been a ceiling effect associated with these items, or our simple metric was unable to discern any effect of our training. Two items showed decline following the simulation exercise: confidence/comfort managing status epilepticus and comfort learning in the simulator.
While at first this seems counterintuitive, one possible explanation is that once the residents were made aware of the complexities of the case scenarios, their confidence decreased and they recognized a need for further training. As these responses were collected using a standard testing format, we cannot be sure of the effect of the simulation curriculum on resident clinical competence, which is


Table 2. Knowledge and attitudes before and after simulation using paired t-tests.

Question | Mean score, presimulation | Mean score, postsimulation | t(18) | p-Value
Differential diagnosis of acute events in the EMU? | 6.11 | 6.95 | 4.748 | .00⁎⁎
Postictal duration? | 6.58 | 6.58 | 0 | 1, n.s.
Which AEDs are IV and what are adult doses? | 4.37 | 5.84 | 3.287 | .002⁎⁎
Response for an agitated patient? | 2.74 | 4.47 | 3.511 | .001⁎⁎
Complications of a seizure? | 3.50 | 3.66 | 1.242 | .150, n.s.
When should the nurse call the doctor? | 1.26 | 1.89 | 1.752 | .049⁎
Confident managing status epilepticus? | 3.74 | 3.00 | 3.441 | .002⁎⁎
Comfort learning in the simulator? | 2.58 | 2.11 | 1.761 | .048⁎
Comfortable in saying that they do not know/need help? | 1.84 | 1.74 | 0.438 | .667, n.s.
Comfortable in letting know that they make mistakes? | 2.37 | 2.11 | 1.045 | .310, n.s.

⁎⁎ Result significant at p < .01. ⁎ Result significant at p < .05.
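The paired one-tailed t-tests behind comparisons like those in Table 2 can be sketched as follows; the scores below are illustrative placeholders, not the study's raw data.

```python
# Minimal sketch of a paired one-tailed t-test on pre/post scores.
# The scores are hypothetical, NOT the study's data.
import math
import statistics

pre  = [5, 6, 4, 6, 5, 7, 6, 5, 6, 5]   # presimulation scores (illustrative)
post = [6, 7, 6, 7, 6, 7, 7, 6, 7, 6]   # postsimulation scores, same subjects

diffs = [b - a for a, b in zip(pre, post)]
n = len(diffs)
mean_d = statistics.mean(diffs)
sd_d = statistics.stdev(diffs)           # sample SD of the paired differences
t = mean_d / (sd_d / math.sqrt(n))       # t statistic with n - 1 degrees of freedom
print(f"t({n - 1}) = {t:.3f}")
```

The one-tailed p-value is then read from a t distribution with n − 1 degrees of freedom (e.g., via `scipy.stats.t.sf(t, n - 1)`); pairing on subject is what makes this a within-subject comparison rather than a comparison of two independent groups.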

certainly the most important outcome. There was no difference found between responses in the actual EMU for those residents who had undergone the simulation training. Simulation learning may not readily transfer to the real-life situation of caring for patients, and our reliance on multiple-choice responses is a major limitation of this measurement. The best way to assess competence in the response to and treatment of seizures in the EMU is not known, but it is likely better approximated through simulation than with multiple-choice testing. Guidelines for safety and quality in the EMU will need to be forthcoming, and simulation does seem to be an excellent modality for teaching the interprofessional first-response teams that typically cover these units.

This study has several limitations. The small sample size and data from a single institution limit generalizability; assessment of this learning process with greater numbers of subjects from different institutions will be needed. Because of loss to follow-up, we were unable to assess the effect that this process had on nurses, clearly an important component of any interprofessional EMU team; however, we feel strongly that this did not negatively impact the training process. The design was a simple pre–post design without a well-defined control group. We attempted to control for experience by using inexperienced residents and nurses; however, performance may still have been affected by individual experience. In addition, a Hawthorne effect of being observed might have produced nonvalid results. For the actual EMU videos, two different junior resident cohorts were compared; the lack of an effect may have been due to other undetected differences between these groups. Additionally, the training occurred several months before the videos reviewed in the actual EMU, which may not have allowed sufficient carryover.
Also, raters were trained but not calibrated, such that some may have been more lenient and others stricter in their ratings. This may have diminished the reliability of the instrument. Lastly, the cost of simulation training will vary between institutions and could be an important factor limiting the use of this type of educational exercise.

Although this pilot study presents modest statistical evidence of effect, based on our experience and numerous studies from other disciplines showing the benefits of medical simulation, we strongly feel that the creation of a novel simulation-based program for improving care in the EMU is an important development for interprofessional education in neurology.

Table 3. Reliability of safe-practice checklist items using the intraclass correlation coefficient (ICC).

Question | ICC for a single rater | ICC for the average of raters
Q1. Notice change in baseline | .511 | .807
Q2. The first responder calls for appropriate help | .556 | .833
Q3. Appropriate vital signs assessed | .277 | .605
Q4. Brief history taken | .063 | .212
Q5. Continued assessment of the patient | .298 | .630
Q6. Brief appropriate neurology exam performed | .222 | .588
Q7. Address EEG data | .487 | .792
Q8. Appropriate treatment offered | .477 | .785
Q9. Leader emerges | .541 | .825
Q10. Team communication demonstrated | .577 | .845
Mean | .401 | .692

All analyses were conducted using a two-way mixed-effects model with data from four raters (JL, CR, VA, and RS) on the 8 scenarios. The focus was on consistency of ratings rather than absolute agreement.
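Consistency-type ICCs from a two-way mixed-effects model, as reported in Table 3, can be computed from a standard two-way ANOVA decomposition. The sketch below is a hedged illustration: the `icc_consistency` helper and the example score matrix are assumptions for demonstration, not the study's analysis code or data.

```python
# Illustrative two-way mixed-effects ICC for consistency (McGraw & Wong forms).
# The helper and example scores are hypothetical, NOT the study's data.
import numpy as np

def icc_consistency(ratings):
    """ratings: (n_targets, k_raters) array.
    Returns ICC(C,1) for a single rater and ICC(C,k) for the rater average."""
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    # Mean squares from the two-way ANOVA decomposition (no interaction term):
    ms_rows = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # targets
    sse = ((x - x.mean(axis=1, keepdims=True)
              - x.mean(axis=0, keepdims=True) + grand) ** 2).sum()
    ms_err = sse / ((n - 1) * (k - 1))
    icc_single = (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)
    icc_average = (ms_rows - ms_err) / ms_rows
    return icc_single, icc_average

# Hypothetical 0/1/2 (NI/MS/ES) scores from 4 raters on 8 scenarios:
scores = np.array([
    [2, 2, 1, 2], [1, 1, 1, 2], [0, 1, 0, 0], [2, 1, 2, 2],
    [1, 0, 1, 1], [2, 2, 2, 1], [0, 0, 1, 0], [1, 2, 1, 1],
])
single, average = icc_consistency(scores)
print(f"ICC single rater: {single:.3f}, average of raters: {average:.3f}")
```

As in Table 3, the average-of-raters ICC is always at least as high as the single-rater ICC (the two are linked by the Spearman–Brown relation), which is why pooling four raters yielded moderate-to-high reliability despite modest single-rater values.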

5. Conclusions

Further work is needed to demonstrate the reliability of the safe-practice checklist for training personnel who work in the EMU and for general training in the response to seizures in other settings. The widespread use of a valid and reliable training program could have profound implications for patient safety in and beyond the EMU.

Author contributions

Dr. Barbara Dworetzky was involved in study concept and design, data collection and interpretation, and drafting of the manuscript. Dr. Sarah Peyre was involved in study concept and design, interpretation of the data, and revision of the manuscript. Dr. Ellen Bubrick was involved in study design and manuscript revision. Dr. Tracey Milligan was involved in study design and manuscript revision. Dr. Steven Yule performed the statistical analysis and helped with the interpretation of the data. Ms. Heidi Doucette was involved in study design and manuscript revision. Dr. Charles Pozner was involved in study design, interpretation of data, and revision of the manuscript.

Study sponsorship

This study was funded by an educational grant (C2010-354) from the American Academy of Neurology (AAN).

Ethical approval

We confirm that we have read the Journal's position on issues involved in ethical publication and affirm that this report is consistent with these guidelines.

Acknowledgments

The authors would like to thank the Brigham and Women's Neurology Nurses for their continued commitment to resident education, especially Susan Gordon, who worked as an expert rater. We dedicate this project


to the late Margaret "Peggy" Gulley, whose enthusiasm, passion, and help in launching this project were unmatched but who tragically passed away before its completion.

Conflict of interest

Dr. Dworetzky is a consultant for SleepMed and interprets EEGs for them. She has nothing to disclose related to this study. Dr. Sarah Peyre, Dr. Ellen Bubrick, Dr. Tracey Milligan, Dr. Steven Yule, Ms. Heidi Doucette, and Dr. Charles Pozner have nothing to disclose.

Appendix A. Supplementary data

Supplementary data to this article can be found online at http://dx.doi.org/10.1016/j.yebeh.2015.01.018.

References

[1] Dobesberger J, Walser G, Unterberger I, Seppi K, Kuchukhidze G, Larch J, et al. Video-EEG monitoring: safety and adverse events in 507 consecutive patients. Epilepsia 2011;52(3):443–52.
[2] Shafer PO, Buelow J, Ficker DM, Pugh MJ, Kanner AM, Dean P, Levisohn P. Risk of adverse events on epilepsy monitoring units: a survey of epilepsy professionals. Epilepsy Behav 2011;20(3):502–5.
[3] Rheims S, Ryvlin P. Patients' safety in the epilepsy monitoring unit: time for revising practices. Curr Opin Neurol 2014;27(2):213–8.
[4] Sauro KM, Macrodimitris S, Krassman C, Wiebe S, Pillay N, Federico P, et al. Quality indicators in an epilepsy monitoring unit. Epilepsy Behav 2014;33:7–11.
[5] Atkinson M, Hari K, Schaefer K, Shah A. Improving safety outcome in the epilepsy monitoring unit. Seizure 2012;21(2):124.
[6] Shafer PO, Buelow JM, Noe K, Shinnar R, Dewar S, Levisohn PM, et al. A consensus-based approach to patient safety in epilepsy monitoring units: recommendations for preferred practice. Epilepsy Behav 2012;25(3):449–56.
[7] http://www.acgme.org/acgmeweb/Portals/0/PDFs/Milestones/NeurologyMilestones.pdf.


[8] Schmidt E, Goldhaber-Fiebert SN, Ho LA, McDonald KM. Simulation exercises as a patient safety strategy: a systematic review. Ann Intern Med 2013;158:426–32.
[9] Freeth D, Avida G, Berridge EJ, Mackintosh N, Norris B, Sadler C, et al. Multidisciplinary obstetric simulated emergency scenarios (MOSES): promoting patient safety in obstetrics with teamwork-focused interprofessional simulations. J Contin Educ Health Prof 2009;29(2):98–104.
[10] Gardner R, Walzer TB, Simon R, Raemer DB. Obstetrics simulation as a risk control strategy: course design and evaluation. Simul Healthc 2008;3:119–27.
[11] Haynes AB, Weiser TG, Berry WR, Lipsitz SR, Breizat AH, Dellinger EP, et al. A surgical safety checklist to reduce morbidity and mortality in a global population. N Engl J Med 2009;360:491–9.
[12] Donoghue AJ, Durbin DR, Nadel FM, Stryiewski GR, Kost SI, Nadkarni VM. Effect of high-fidelity simulation on Pediatric Advanced Life Support training in pediatric house staff: a randomized trial. Pediatr Emerg Care 2009;25(3):139–44.
[13] McKenna H. The Delphi technique: a worthwhile approach for nursing? J Adv Nurs 1994;19:122–5.
[14] Velji K, Baker GR, Fancott C, Andreoli A, Boaro N, Tardif G, et al. Effectiveness of an adapted SBAR communication tool for a rehabilitation setting. Healthc Q 2008;11(Spec):72–9.
[15] http://www.jointcommission.org/assets/1/6/2007_Annual_Report.pdf.
[16] Risser DT, Rice MM, Salisbury ML, Simon R, Jay GD, Berns SD. The potential for improved teamwork to reduce medical errors in the emergency department. The MedTeams Research Consortium. Ann Emerg Med 1999;34(3):373–83.
[17] Sawyer T, Laubach VA, Hudak J, Yamamura K, Pocmich A. Improvements in teamwork during neonatal resuscitation after interprofessional TeamSTEPPS training. Neonatal Netw 2013;32(1):26–33.
[18] Barsuk JH, Cohen ER, Caprio T, McGaghie WC, Simuni T, Wayne DB. Simulation-based education with mastery learning improves residents' lumbar puncture skills. Neurology 2012;79(2):132–8.
[19] Chipman JG, Schmitz CC. Using objective structured assessment of technical skills to evaluate a basic skills simulation curriculum for first-year surgical residents. J Am Coll Surg 2009;209(3):364–70.
[20] Wright MC, Phillips-Bute BG, Petrusa ER, Griffin KL, Hobbs GW, Taekman JM. Assessing teamwork in medical education and practice: relating behavioural teamwork ratings and clinical performance. Med Teach 2009;31(1):30–8.
[21] Morgan PJ, Tarshis J, LeBlanc V, Cleave-Hogg D, DeSousa S, Haley MF, et al. Efficacy of high-fidelity simulation debriefing on the performance of practicing anaesthetists in simulated scenarios. Br J Anaesth 2009;103(4):531–7.