Clinical Simulation in Nursing (2013) 9, e585-e591

www.elsevier.com/locate/ecsn

Featured Article

Debriefing Simulations: Comparison of Debriefing with Video and Debriefing Alone

Shelly J. Reed, DNP, APRN, CNE (a,*); Claire M. Andrews, PhD, CNM, FAAN (b); Patricia Ravert, PhD, RN, CNE, ANEF, FAAN (c)

(a) Associate Teaching Professor, Brigham Young University College of Nursing, Provo, UT 84604, USA
(b) Professor, Arline H. and Curtis F. Garvin Professor in Nursing Excellence, Case Western Reserve University, Cleveland, OH 44106, USA
(c) Dean and Professor, Brigham Young University College of Nursing, Provo, UT 84604, USA

* Corresponding author: [email protected] (S. J. Reed).

Keywords: debriefing; video; discussion; simulation; learning; nursing student experience

Abstract

Background: Debriefing as part of the simulation experience is regarded as essential for learning. Evidence concerning best debriefing practices from the standpoint of a student nurse participant is minimal, particularly when comparing debriefing types. This study evaluated the differences in the student experience between two debriefing types: debriefing with video and debriefing without video (debriefing alone).

Method: Nursing students participating in an intensive care simulation were randomized into one of the two debriefing types following simulation completion: debriefing with video (n = 32) and debriefing alone (n = 32). After debriefing was completed, students were asked to complete a debriefing experience scale, designed to evaluate the nursing student experience during debriefing.

Results: Statistically significant differences were found in only 3 of 20 items on the Debriefing Experience Scale. Debriefing with video had higher means on two items, "Debriefing helped me to make connections between theory and real-life situations" (p = .007) and "I had enough time to debrief thoroughly" (p = .039). Debriefing alone had a higher mean on one item, "The debriefing session facilitator was an expert in the content area" (p = .006).

Conclusion: Students identified learning as part of their experience with both debriefing types. Although a few differences exist, nursing students reported overall that their experiences were minimally different between debriefing with video and debriefing alone.

Cite this article: Reed, S. J., Andrews, C. M., & Ravert, P. (2013, December). Debriefing simulations: Comparison of debriefing with video and debriefing alone. Clinical Simulation in Nursing, 9(12), e585-e591. http://dx.doi.org/10.1016/j.ecns.2013.05.007.

© 2013 International Nursing Association for Clinical Simulation and Learning. Published by Elsevier Inc. All rights reserved.

Introduction

The use of simulation in nursing education has become widespread as technology has advanced and become more accessible. The drive to improve teaching and learning practices has also provided an impetus for simulation use (Cato, 2012; Harder, 2009). Simulation is a learning activity designed to replicate clinical practice. It can be used as a teaching strategy to practice clinical skills, such as decision making and problem solving, and for assessment and evaluation of student skills (Cantrell, 2008; Jeffries, Bambini, Hensel, Moorman, & Washburn, 2009).

Key Points
- Minimal evidence is available concerning debriefing practices that best contribute to participant learning.
- Debriefing types compared in this study are debriefing with video and debriefing without video (debriefing alone).
- No clear evidence was found establishing one debriefing type as superior to the other.

Debriefing, the purposeful reflection that follows simulation, is an essential step to maximize learning and enable behavior change, both individually and systemically. It has been identified as the most important step of simulation (Dieckmann, Molin Friis, Lippert, & Ostergaard, 2009; Dreifuerst, 2009; Grant, Moss, Epps, & Watts, 2010; Mayville, 2011; Shinnick, Woo, Horwich, & Steadman, 2011). Expert opinion abounds about how a debriefing should be facilitated, particularly in the areas of medical simulation, anesthesia crisis resource management, and crew resource management (Fanning & Gaba, 2007; Gaba, Howard, Fish, Smith, & Sowb, 2001; Morgan et al., 2009; Rudolph, Simon, Rivard, Dufresne, & Raemer, 2007). Simulation experts in nursing generally model debriefing practices used in medical simulation, such as discussing performance gaps and examining mental frameworks (Arafeh, Hansen, & Nichols, 2010). Many different types of debriefing exist in nursing education, such as inquiry journals, e-mail-based dialogue, and structured group experiences (Decker, 2007). A literature review identified minimal nursing research exploring debriefing methods (Neill & Wotton, 2011). Dreifuerst (2009) stated that "questions remain on how to debrief, when to debrief, and whom to include in debriefing for the best student learning" (p. 110). There is also minimal evidence concerning the best debriefing practices from the perspective of a student nurse, particularly comparing debriefing types and their promotion of student learning. Additional debriefing research is needed to help close these knowledge gaps. This study addressed the question of "how" to debrief, as framed by Dreifuerst (2009). The specific aim of this comparison study was to assess the differences in the undergraduate student nurse experience using the Debriefing Experience Scale (Reed, 2009, 2012).

Literature Review

Debriefing is a facilitator-led activity following a simulation experience where participants are encouraged to think reflectively. During debriefing, the completed simulation is discussed and explored. Feedback is given by the facilitator and other group participants regarding simulation performance and the application to future situations (INACSL Board of Directors, 2011). Dreifuerst (2009) describes debriefing as "the process whereby faculty and student reexamine the clinical encounter, fostering the development of clinical reasoning and judgment skills through reflective learning processes" (p. 109).

Video playback of a simulation is sometimes used during debriefing, and its use involves several considerations. Prior to videotaping, participants should be informed of policies related to the use of the recordings, and they should sign a consent form giving permission to be recorded. Proficiency with video equipment is needed, both for those videotaping and those involved in later review. A recommendation for video segment viewing is to show two to four segments in each debriefing, with faculty introducing the expectation for each segment, showing the segment, and then allowing learners to self-critique or discuss. The time needed for the use of video with debriefing was not identified (Dreifuerst & Decker, 2012).

A review of debriefing literature from the past 10 years found few studies comparing debriefing with video and debriefing without video (debriefing alone). In nursing education, debriefing with video was found to positively affect the quality and speed of student assessment and psychomotor skills (Chronister & Brown, 2012). Mikasa, Cicero, and Jablonski (2012) concluded in another study that students supported the value of debriefing with video in their self-evaluation of critical thinking skills. In comparison, debriefing alone played more of a role in improved knowledge retention (Chronister & Brown, 2012). Grant et al. (2010) compared debriefing with video to a control group debriefing without video, evaluating target performance behaviors in a subsequent simulation with a performance checklist tool. No significant differences in total performance scores were noted between the two groups, although the authors stated that the debriefing with video group exhibited more desired behaviors in target performance areas than the control group.

Similarly, only a few studies comparing debriefing with video and debriefing alone were available in the medical simulation literature. Scherer, Chang, Meredith, and Battistella (2003) documented continued improvement in one-half of 10 algorithm-based behaviors for advanced trauma resuscitation over a 3-month study period with video-taped review, while no improvement in the behaviors was found over 3 months with verbal feedback only. Conversely, in two research studies comparing oral versus video-assisted oral debriefing, Savoldelli et al. (2006), in an anesthesia-simulated crisis debriefing, and Sawyer et al. (2012), with neonatal resuscitation performance debriefing, concluded that the addition of video review offered no additional advantage or educational benefit over oral debriefing alone.


Theoretical and Conceptual Framework

Kolb's (1984) Experiential Learning Theory explains learning as a continuous process, where reflection on concrete experiences creates learning, changing how a person thinks and behaves (DeCoux, 1990; Sewchuk, 2005). Experiential learning is used in nursing and vocational training and has been found to be useful in areas where theory and practice merge. Although experiential instruction methods are commonly used in nursing education, limited research exists exploring the nature and extent of their use (Brackenreg, 2004).

Thiagarajan (1980) describes experiential learning as learning that occurs through direct experiences followed by debriefing. Learning is provided by a "package," or tangible product, including objectives, a relevant experience, and a debriefing section where the learner reflects on the experience to achieve the instructional objective(s). The approach is identified as suited for the "helping professions," among others. Thiagarajan's (1980) model is useful as a framework to structure debriefing sessions (Brackenreg, 2004). It is especially suitable for an experiential learning method such as simulation, with the "package" providing objectives for the simulation session, and the simulation, or "experience," and the debriefing after it designed to achieve the instructional objectives. However, although this model seems very appropriate for simulation, Thiagarajan does not describe it as a theory but rather as an instructional model.

A combination of Kolb's Experiential Learning Theory and Thiagarajan's experiential model was used for this study. Kolb's (1984) Experiential Learning Theory was employed as the theoretical framework, allowing for the experiential learning that is provided by simulation, followed by debriefing. The model, or "package," described by Thiagarajan (1980) provided the conceptual framework for the simulation session itself. The "objectives" described by Thiagarajan correspond with Kolb's abstract conceptualization, or thinking, stage. Thiagarajan's "relevant experience," which in this case is a simulation, meshes with Kolb's active experimentation stage. Thiagarajan's debriefing section aligns with Kolb's concrete experience, or feeling, stage, where learning is solidified.

Method

Design

A quasi-experimental study design was used to compare nursing student experiences between two different debriefing types: debriefing with video and debriefing alone.

Sample

The convenience sample for this study consisted of 64 nursing students attending a baccalaureate undergraduate program at a university in the western United States. The students were participating in simulations that were part of their critical care course. They were in the fifth semester of a seven-semester curriculum and had participated in seven or eight simulation or debriefing exercises up to that point in their nursing curriculum. Debriefing sessions that students had participated in prior to this study were debriefing alone; none of the students had participated in debriefing with video as part of their debriefing experience.

Instrument of Data Collection

The tool used for the study was the Debriefing Experience Scale (Reed, 2009, 2012), created to evaluate the nursing student experience during debriefing. It consists of 20 items and is divided into four subscales: Analyzing Thoughts and Feelings; Learning and Making Connections; Facilitator Skill in Conducting the Debriefing; and Appropriate Facilitator Guidance. The items on the scale are rated in two different areas: the "experience" for the student and the "importance" of the experience to the student. Only the experience portion of the scale was used for this study. Cronbach's alpha for the overall experience portion was 0.93. Cronbach's alpha for the individual subscales was 0.80 for "Analyzing Thoughts and Feelings," 0.89 for "Learning and Making Connections," 0.80 for "Facilitator Skill in Conducting the Debriefing," and 0.84 for "Appropriate Facilitator Guidance." The 20 items were rated on a Likert-type scale from 1 (strongly disagree) to 5 (strongly agree). Validity of the scale was established through a comprehensive literature review of simulation and debriefing articles to identify components important to debriefing, as well as a two-step factor analysis process. In addition, the scale structure and content were reviewed by three nationally known simulation experts.
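For reference, the reliability statistic reported above, Cronbach's alpha, is the standard internal-consistency measure for a scale of k items (a general formula, not specific to this study):

\[
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right)
\]

where \(\sigma^{2}_{Y_i}\) is the variance of item i and \(\sigma^{2}_{X}\) is the variance of the total scale score. Alpha approaches 1 as items covary strongly, so the 0.93 reported for the overall experience portion indicates high internal consistency.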

Debriefing

The two debriefing methods used in this study were debriefing with video and debriefing alone. Five questions common to each debriefing type were used to guide the discussion (McDonnell, Jobe, & Dismukes, 1997):

1. What did you learn during the simulation session?
2. What did you think of your performance during the simulation?
3. What did you think of the group's performance during the simulation?
4. What are your questions concerning the simulation?
5. How can what you learned be applied to your future performance?

The combined simulation and debriefing session lasted 1 hour and 30 minutes, with 1 hour allotted for simulation, 5 minutes to allow movement to a debriefing room, and 25 minutes for debriefing. The simulation portion included four high-fidelity critical care scenarios. Each student simulation group consisted of eight students, with student groups of four participating in two of the simulations while the other four students observed the simulations in the same room. Following the first two simulations, those watching would then become the participants in two more simulations, with the first group becoming the observers. There were either one or two facilitators in each session. The facilitators were two experienced intensive care nurses with at least 1 year of simulation experience.

After completion of all four simulations, each eight-student simulation group and the facilitator(s) were randomized as a group to either debriefing with video (n = 32) or debriefing alone (n = 32). The facilitators who facilitated the simulation also facilitated the subsequent debriefing. The facilitators had not attended a formal simulation training course to learn to debrief, instead learning on the job during their facilitation experiences and through observation of other facilitators. For the study, the five discussion questions were used to provide a common structure for the debriefing. In addition, consistency among debriefers was provided as both debriefers cofacilitated at least two of the debriefing sessions, including debriefing with video and debriefing alone.

For debriefing with video, the session facilitators chose clips of the videotaped simulations they felt were important to discuss and viewed them with the students. These clips, along with the five discussion questions, were used to guide the discussion during debriefing. Debriefing alone consisted of the facilitators using the five discussion questions to guide the discussion; no video playback was used.

Data Collection and Statistical Analysis

University institutional review board approval was obtained for the study. No extra credit or compensation was given for study participation. Study investigators did not facilitate the simulation or debriefing, nor were they part of the critical care course. Randomization to a debriefing type for each group was performed by drawing the debriefing type from a hat; the debriefing type chosen was then replaced to repeat the randomization process for the next group. After finishing the debriefing session, students were invited to participate in the research study, which included completing the Debriefing Experience Scale to evaluate their debriefing experience. Students were informed that nonparticipation would have no effect on their grade. All students chose to participate, informed consent was obtained, and all students filled out the scale. The 20 items included in the final version of the Debriefing Experience Scale (Reed, 2012) were analyzed using an independent-samples t test in IBM SPSS version 19.0 (SPSS, Chicago, IL).
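As an illustrative sketch of the procedure just described (not the authors' code), the group-level randomization and the per-item comparison could look like the following in Python, with hypothetical ratings standing in for the study data and scipy substituted for the SPSS analysis:

```python
# Illustrative sketch only: the ratings below are hypothetical
# placeholders, not the study's data, and scipy stands in for SPSS.
import random

from scipy import stats

# Group-level randomization described above: each eight-student group
# draws a debriefing type "from a hat," with the slip replaced before
# the next draw, i.e., an independent draw with replacement per group.
debriefing_type = random.choice(["debriefing with video", "debriefing alone"])

# Per-item independent-samples t test on 1-5 Likert ratings.
with_video = [4, 5, 4, 4, 5, 4, 3, 5]  # hypothetical item ratings
alone = [4, 4, 3, 5, 4, 4, 4, 3]       # hypothetical item ratings
t_stat, p_value = stats.ttest_ind(with_video, alone)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```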

Results

The average age of the students was 22.8 years; 56 were female (88%), 7 were male (11%), and 1 participant did not identify. The independent-samples t test comparing debriefing with video and debriefing alone showed no statistically significant differences between the two debriefing types on 17 of 20 items. Statistically significant higher scores were found with debriefing with video (DWV) over debriefing alone (DA) on two items: "Debriefing helped me make connections between theory and real-life situations" (DWV: mean [M] = 4.3, standard deviation [SD] ± 0.45; DA: M = 4.2, SD ± 0.80; p = .007) and "I had enough time to debrief thoroughly" (DWV: M = 4.5, SD ± 0.57; DA: M = 4.2, SD ± 1.10; p = .039). In contrast, a statistically significant higher score was found for debriefing alone over debriefing with video on the item "The debriefing session facilitator was an expert in the content area" (DA: M = 4.8, SD ± 0.43; DWV: M = 4.6, SD ± 0.50; p = .006). The results are described in Table 1.
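As a hedged aside, group comparisons like those in Table 1 can also be approximated from summary statistics alone; scipy provides ttest_ind_from_stats for this purpose. The values below are illustrative placeholders rather than the study's published figures, since means and standard deviations rounded to one or two decimals are too coarse to reproduce exact p-values:

```python
# Approximate an independent-samples t test from summary statistics.
# Values are illustrative placeholders, not the study's published data.
from scipy.stats import ttest_ind_from_stats

result = ttest_ind_from_stats(
    mean1=4.5, std1=0.6, nobs1=30,  # group 1 (e.g., debriefing with video)
    mean2=4.1, std2=0.8, nobs2=31,  # group 2 (e.g., debriefing alone)
)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")
```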

Discussion

Both groups agreed learning occurred during debriefing, as seen by positive scores in the "Learning and Making Connections" subscale of the Debriefing Experience Scale. This supports Thiagarajan's (1980) concept that learning is provided by reflection on the experience during debriefing, in addition to solidification of learning during Kolb's (1984) concrete experience, or feeling, stage.

The mean for the item "Debriefing helped me make connections between theory and real-life situations" was higher for the debriefing with video group. The statistical significance found supports the findings of Arafeh et al. (2010), who commented that use of video during debriefing allows learners to view their abilities to interact with patients in a complex health care environment. In addition, although not statistically significant, there was a difference on the item "Debriefing helped me to make connections in my learning," which was higher with debriefing alone. This supports the claim by Dreifuerst and Decker (2012) that debriefing can be an opportunity to engage learners in the process of bridging content, knowledge, and experience, which in this study was more prominent in the debriefing alone group.

A statistically significant difference in mean scores was found on a second item, "I had enough time to debrief thoroughly," higher with debriefing with video than with debriefing alone. Both types of debriefing were 25 minutes in length. Although there are some recommendations for the length of time needed for debriefing, the concept of time and debriefing is largely unstudied. In a qualitative study examining four debriefing cases, time was found to be one of seven debriefing "patterns" (Overstreet, 2009), although the impact of time on debriefing was not examined in depth.


Table 1. The t Test Scores, Debriefing with Video (DWV; N = 30) Versus Debriefing Alone (DA; N = 31)

Subscale/Item                                                             DWV M   DWV SD (±)   DA M   DA SD (±)   p

Analyzing Thoughts and Feelings (4 items)
  Debriefing helped me to analyze my thoughts                             4.3     0.47         4.3    0.53        .507
  The facilitator reinforced aspects of the health care team's behavior  4.5     0.51         4.6    0.56        .968
  The debriefing environment was physically comfortable                  4.6     0.49         4.7    0.55        .755
  Unsettled feelings from the simulation were resolved by debriefing     4.1     0.57         4.2    0.78        .078
Learning and Making Connections (8 items)
  Debriefing helped me to make connections in my learning                4.3     0.54         4.5    0.77        .067
  Debriefing was helpful in processing the simulation experience         4.3     0.48         4.3    0.51        .812
  Debriefing provided me with a learning opportunity                     4.3     0.61         4.3    0.54        .418
  Debriefing helped me to find meaning in the simulation                 4.4     0.56         4.2    0.64        .750
  My questions from the simulation were answered by debriefing           4.3     0.74         4.2    0.62        .393
  I became more aware of myself during the debriefing session            3.9     0.90         4.1    0.94        .574
  Debriefing helped me to clarify problems                               4.3     0.64         4.2    0.69        .632
  Debriefing helped me to make connections between theory and
    real-life situations                                                 4.3     0.45         4.2    0.80        .007*
Facilitator Skill in Conducting the Debriefing (5 items)
  The facilitator allowed me enough time to verbalize my feelings
    before commenting                                                    4.5     0.68         4.5    0.77        .496
  The debriefing session facilitator talked the right amount
    during debriefing                                                    4.6     0.50         4.4    0.89        .081
  Debriefing provided a means for me to reflect on my actions
    during the simulation                                                4.5     0.51         4.5    0.72        .209
  I had enough time to debrief thoroughly                                4.5     0.57         4.2    1.10        .039*
  The debriefing session facilitator was an expert in the content area   4.6     0.50         4.8    0.43        .006*
Appropriate Facilitator Guidance (3 items)
  The facilitator taught the right amount during the debriefing session  4.4     0.50         4.3    0.73        .305
  The facilitator provided constructive evaluation of the simulation
    during debriefing                                                    4.5     0.57         4.5    0.57        .955
  The facilitator provided adequate guidance during the debriefing       4.5     0.63         4.4    0.67        .704

* Significance set at 0.05.

The statistically significant result in this study indicates that students felt they had enough time to debrief when video was used in debriefing, different from the experience of students in debriefing alone. However, more evidence is needed to determine the many facets concerning time and the impact it has on the debriefing experience.

Statistical significance was found with a third item: "The debriefing session facilitator was an expert in the content area." One explanation could be that viewing the video took attention away from the session facilitator in the debriefing with video group, while the debriefing alone group had more time to verbally interact with and receive instruction from the session facilitator. Many comments during a debriefing often involve explanation of issues, with the session facilitator involved in most interactions (Dieckmann et al., 2009). Cantrell (2008) found that students watching their own video-taped performance reported stress and intimidation performing in front of faculty during the simulation. Students in this study may have encountered the same feelings when watching their video-taped performance, distracting them from noticing the expertise of the facilitators. In addition, there was a trend toward significance with the item "The debriefing session facilitator talked the right amount during debriefing," with the higher mean for debriefing with video. This again raises the possibility that students were more focused on the video than on the facilitator, possibly affecting their perception of facilitator expertise and talking time.

Another difference, although not statistically significant, was for the item "Unsettled feelings from the simulation were resolved by debriefing," higher in the debriefing alone group. One possible explanation for this difference is that viewing a video of one's performance may provoke a recall of performance stress, as described by Cantrell (2008), rather than resolve it. Another possibility is that debriefing with video may lead to a discussion of performance and detract from a discussion regarding resolution of feelings. In addition, viewing a video may reduce the amount of discussion altogether, limiting the discussion time needed to resolve feelings.

In spite of the significant differences found in three items, 17 items showed no statistically significant difference between debriefing with video and debriefing alone. Dismukes, Gaba, and Howard (2006) state that "optimal methods of teaching [debriefing] might depend on the types of simulations being performed and what the teaching goals are for a particular session" (p. 26). In this mock code simulation in an undergraduate nursing education course, there is no clear evidence that either debriefing with video or debriefing alone is superior.

Limitations

A post hoc power analysis showed that the study was underpowered: the number of students in each group (32) was half the number needed (64) for statistical analysis in a study of this type (Soper, n.d.), limiting the generalizability of the results. Study results were also the product of subjective student measurement; the addition of objective measures would strengthen the conclusions. Another limitation was the lack of formal training for the facilitators in debriefing with video and debriefing alone, which could also have affected the results. With a combined 1.5-hour simulation and debriefing, student fatigue could also have affected the debriefing experience with either debriefing type.
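To illustrate the sample-size point above: the standard a priori calculation for a two-sided independent-samples t test with a medium effect size (Cohen's d = 0.5), alpha = .05, and power = .80 yields roughly 64 participants per group, consistent with the figure cited from Soper's calculator. A minimal sketch, assuming statsmodels in place of the online calculator:

```python
# A priori sample size for a two-sided independent-samples t test.
# Assumes a medium effect size (Cohen's d = 0.5); any standard power
# calculator (e.g., Soper's) gives the same result.
from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(
    effect_size=0.5,  # Cohen's d
    alpha=0.05,       # two-sided significance level
    power=0.80,       # desired statistical power
    ratio=1.0,        # equal group sizes
)
print(f"Required per group: {n_per_group:.1f}")  # about 63.8, i.e., 64
```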

Conclusions

This study contributes to the evidence comparing debriefing types following simulation in nursing education. No evidence was found showing either debriefing with video or debriefing alone to be superior when used with undergraduate nursing students. Although a few differences exist, students reported overall that their experiences were minimally different, including what they perceived they learned. Additional evidence is needed concerning the nursing student experience during debriefing, including the how, whom, and why, as described by Dreifuerst (2009). Research using a multiple-site randomized clinical trial, students diverse in age and ethnic origin, and comparisons of other debriefing types are just some areas that could be investigated. Replication of the study with a different population could provide additional evidence regarding the use of video with debriefing, particularly if the study were better powered. In addition, other measures of learning, such as postdebriefing written examinations or evaluation of clinical performance at the bedside after debriefing, could add to the evidence concerning knowledge gained through debriefing.

Acknowledgment

For support of prior debriefing research: Christine Hudak, Professor, Health Informatics, Kent State University, Kent, Ohio.

References

Arafeh, J. M. R., Hansen, S. S., & Nichols, A. (2010). Debriefing in simulated-based learning: Facilitating a reflective discussion. Journal of Perinatal & Neonatal Nursing, 24(4), 302-309. http://dx.doi.org/10.1097/JPN.0b013e3181f6b5ec

Brackenreg, J. (2004). Issues in reflection and debriefing: How nurse educators structure experiential activities. Nurse Education in Practice, 4, 264-270. http://dx.doi.org/10.1016/j.nepr.2004.01.005

Cantrell, M. A. (2008). The importance of debriefing in clinical simulations. Clinical Simulation in Nursing, 4, e19-e23. http://dx.doi.org/10.1016/j.ecns.2008.06.006

Cato, M. L. (2012). Using simulation in nursing education. In P. Jeffries (Ed.), Simulation in nursing education (2nd ed., pp. 1-10). New York, NY: National League for Nursing.

Chronister, C., & Brown, D. (2012). Comparison of simulation debriefing methods. Clinical Simulation in Nursing, 8(7), e281-e288. http://dx.doi.org/10.1016/j.ecns.2010.12.005

Decker, S. (2007). Integrating guided reflection into simulated learning experiences. In P. R. Jeffries (Ed.), Simulation in nursing education: From conceptualization to evaluation (pp. 73-85). New York, NY: National League for Nursing.

DeCoux, V. M. (1990). Kolb's learning style inventory: A review of its applications in nursing research. Journal of Nursing Education, 29(5), 202-207.

Dieckmann, P., Molin Friis, S., Lippert, A., & Ostergaard, D. (2009). The art and science of debriefing in simulation: Ideal and practice. Medical Teacher, 31, 287-294. http://dx.doi.org/10.1080/01421590902866218

Dismukes, R. K., Gaba, D. M., & Howard, S. K. (2006). So many roads: Facilitated debriefing in healthcare. Simulation in Healthcare, 1(1), 23-25.

Dreifuerst, K. T. (2009). The essentials of debriefing in simulation learning: A concept analysis. Nursing Education Perspectives, 30(2), 109-114. http://dx.doi.org/10.1043/1536-5026-030.002.0109

Dreifuerst, K. T., & Decker, S. I. (2012). Debriefing: An essential component for learning in simulation pedagogy. In P. Jeffries (Ed.), Simulation in nursing education (2nd ed., pp. 105-129). New York, NY: National League for Nursing.

Fanning, R. M., & Gaba, D. M. (2007). The role of debriefing in simulation-based learning. Simulation in Healthcare, 2(2), 115-125. http://dx.doi.org/10.1097/SIH.0b013e3180315539

Gaba, D. M., Howard, S. K., Fish, K. J., Smith, B. E., & Sowb, Y. A. (2001). Simulation-based training in anesthesia crisis resource management (ACRM): A decade of experience. Simulation and Gaming, 32(2), 175-193. http://dx.doi.org/10.1177/104687810103200206

Grant, J. S., Moss, J., Epps, C., & Watts, P. (2010). Using video-facilitated feedback to improve student performance following high-fidelity simulation. Clinical Simulation in Nursing, 6(5), e177-e184. http://dx.doi.org/10.1016/j.ecns.2009.09.001

Harder, B. N. (2009). Evolution of simulation use in health care education. Clinical Simulation in Nursing, 5, e169-e172. http://dx.doi.org/10.1016/j.ecns.2009.04.092

INACSL Board of Directors. (2011). Standard I: Terminology. Clinical Simulation in Nursing, 7(4S), s3-s7. http://dx.doi.org/10.1016/j.ecns.2011.05.005

Jeffries, P. R., Bambini, D., Hensel, D., Moorman, M., & Washburn, J. (2009). Constructing maternal-child learning experiences using clinical simulations. Journal of Obstetric, Gynecologic and Neonatal Nursing, 38(5), 613-623. http://dx.doi.org/10.1111/j.1552-6909.2009.01060

Kolb, D. (1984). Experiential learning: Experience as a source of learning and development. Upper Saddle River, NJ: Prentice-Hall.

Mayville, M. L. (2011). Debriefing: The essential step in simulation. Newborn & Infant Nursing Reviews, 11(1), 35-39. http://dx.doi.org/10.1053/j.nainr.2010.12.012

McDonnell, L. K., Jobe, K. K., & Dismukes, R. K. (1997). Facilitating LOS debriefings: A training manual (NASA Technical Memorandum 112192). Moffett Field, CA: National Aeronautics and Space Administration Ames Research Center.

Mikasa, A., & Cicero, T. (2012, July). Play it again: Effect of simulation recording on evaluation during debriefing. Oral presentation, Sigma Theta Tau International 23rd International Nursing Research Congress, Brisbane, Australia.

Morgan, P. J., Tarshis, J., LeBlanc, V., Cleave-Hogg, D., DeSousa, S., Haley, M. F., & Law, J. A. (2009). Efficacy of high-fidelity simulation debriefing on the performance of practicing anaesthetists in simulated scenarios. British Journal of Anaesthesia, 103(4), 531-537. http://dx.doi.org/10.1093/bja/aep222

Neill, M. A., & Wotton, K. (2011). High-fidelity simulation debriefing in nursing education: A literature review. Clinical Simulation in Nursing, 7(5), e161-e168. http://dx.doi.org/10.1016/j.ecns.2011.02.001

Overstreet, M. L. (2009). The current practice of nursing clinical simulation debriefing: A multiple case study. Doctoral dissertation, University of Tennessee. Retrieved May 28, 2013, from http://trace.tennessee.edu/utk_graddiss/627

Reed, S. J. (2009). Comparison of debriefing methods following simulation: Development of pilot instrument. Doctoral thesis, Case Western Reserve University, Cleveland, OH.

Reed, S. J. (2012). Debriefing experience scale: Development of a tool to evaluate the student learning experience in debriefing. Clinical Simulation in Nursing, 8(6), e211-e217. http://dx.doi.org/10.1016/j.ecns.2011.11.002

Rudolph, J. W., Simon, R., Rivard, P., Dufresne, R. L., & Raemer, D. B. (2007). Debriefing with good judgment: Combining rigorous feedback with genuine inquiry. Anesthesiology Clinics, 25(2), 361-376. http://dx.doi.org/10.1016/j.anclin.2007.03.007

Savoldelli, G. L., Naik, V. N., Park, J., Joo, H. S., Chow, R., & Hamstra, S. J. (2006). Value of debriefing during simulated crisis management: Oral versus video-assisted oral feedback. Anesthesiology, 105(2), 279-285.

Sawyer, T., Sierocka-Castaneda, A., Chan, D., Berg, B., Lustick, M., & Thompson, M. (2012). The effectiveness of video-assisted debriefing versus oral debriefing alone at improving neonatal resuscitation performance. Simulation in Healthcare, 7(4), 213-221. http://dx.doi.org/10.1097/SIH.0b013e3182578eae

Scherer, L. A., Chang, M. C., Meredith, J. W., & Battistella, F. D. (2003). Videotape review leads to rapid and sustained learning. American Journal of Surgery, 185, 516-520. http://dx.doi.org/10.1016/S0002-9610(03)00062-X

Sewchuk, D. (2005). Experiential learning: A theoretical framework for perioperative education. AORN Journal, 81(6), 1311-1318. http://dx.doi.org/10.1016/S0001-2092(06)60396-7

Shinnick, M. A., Woo, M., Horwich, T. B., & Steadman, R. (2011). Debriefing: The most important component in simulation. Clinical Simulation in Nursing, 7(3), e105-e111. http://dx.doi.org/10.1016/j.ecns.2010.11.005

Soper, D. (n.d.). A-priori sample size calculator for student t-tests. In Statistics calculators. Retrieved February 6, 2013, from http://www.danielsoper.com/statcalc3/calc.aspx?id=47

Thiagarajan, S. (1980). Experiential learning packages. Upper Saddle River, NJ: Educational Technology Publications.