Editorial

JEPM Is Growing in Many Ways

Armin Schubert, MD, MBA*
Department of General Anesthesiology, The Cleveland Clinic Foundation, Cleveland, OH

*Section Editor, Journal of Education in Perioperative Medicine, in the Journal of Clinical Anesthesia; Editor-in-Chief, Journal of Education in Perioperative Medicine; Chief, Department of General Anesthesiology, Cleveland Clinic.
Address correspondence to Dr. Schubert at the Department of General Anesthesiology, The Cleveland Clinic, E-31, 9500 Euclid Ave., Cleveland, OH 44195, USA.
Received and accepted for publication July 1, 2002.

After a short dry spell, contributions to the Journal of Education in Perioperative Medicine (JEPM) are again pouring in. During the last 12 months, JEPM's editorial board, assisted by several ad hoc reviewers, has completed over 35 independent peer reviews of more than ten submitted manuscripts. I would like to thank them for their efforts and for the value they provide to our authors. The JEPM web site (www.jepm.org) now holds over 150 pages of material relevant to anesthesia education, most of it peer-reviewed. The official electronic journal of the Society for Education in Anesthesia (SEA) is now in its fourth year of publication, having featured a total of nine issues and 40 articles or abstracts.

The increase in submissions, along with surging interest among educators in publishing, has led to the creation of two new sections within JEPM. At the beginning of this year, the section for Peer Review Educational Materials, edited by John Doyle, MD, PhD, was established. It will feature works describing, reviewing, and evaluating educational materials potentially useful in perioperative medicine. It is increasingly recognized that educators, when developing new educational materials, are creating intellectual property for which academic recognition is not currently available but should be. Such recognition can be achieved through peer review of the type to which authors and readers of scientific literature have been accustomed. In the next issue of JEPM, Dr. Doyle will further elaborate on the goals and requirements for this section.

JEPM's newest section is devoted to Simulation in Perioperative Education. Michael Olympio, MD, has been appointed section editor. This section will consider articles related to simulation, including well-documented simulation scenarios, which might be regarded as the equivalent of case reports in clinical medicine.
Again, through adherence to quality submission standards and a rigorous peer-review process, these education-related works should receive greater recognition than they would if merely presented at the author's home institution. Other manuscripts discussing or studying simulation will also be reviewed. One example of this type of work is the paper by Kim et al., published in this issue of the Journal of Clinical Anesthesia.1 In a carefully conducted, randomized prospective trial, textbook learning for advanced cardiac life support (ACLS) was compared with interactive screen-based simulation. The authors were unable to demonstrate an advantage of simulation over textbook learning. Based on written pre- and post-testing, immediate performance scores improved significantly after both kinds of learning, with the textbook group temporarily outperforming the simulation group. Learning performance was partially sustained for one week, but the difference between the two methods disappeared. Moreover, learner satisfaction was no better in the simulation group than in the textbook group.

At first glance, this study appears to deal an unexpected blow to the much-hailed concept of simulation as an effective learning method, one that may promote better student satisfaction, involvement, retention, and performance.2–5 However, it should be understood that the work of Kim et al. is likely to be applicable only in very specific circumstances: a relatively static, algorithm-based learning task; learners who are complete novices; and a limited amount of time (<3 hr) available for learning with simulation. One could also argue that the authors did not apply interactive simulation to the learning task best suited to show the unique advantages of simulation in adult learning of complex issues. For example, simulation may be at its best when used as an opportunity to practice the behavioral aspects of crisis management3 or to tutor health care professionals, rather than as preparation for a standardized test.4,6 In addition, both the learner population studied (medical students vs. anesthesia residents) and the assessment tool used (written multiple-choice test vs. behavior-based evaluation) may affect the results of investigations such as the one conducted by Kim et al. In a similar evaluation of ACLS learning, Schwid et al. showed better retention of performance by anesthesia residents and faculty tested by videotape scoring of simulated scenarios 10 to 11 months later.2 Moreover, Nyssen et al. recently found that screen-based simulation of an anaphylaxis crisis scenario contributed to performance improvement in Belgian anesthesia trainees.3 These observations may well mean that successful application of simulation for performance improvement presupposes a certain preexisting level of expertise and can be discerned only through structured, behavior-based evaluation. The (relatively negative) findings of Kim et al. would certainly support this notion. Despite the limitations of their study design and applicability, these investigators are to be congratulated for pursuing a rigorous approach to the evaluation of simulation learning relevant to perioperative educators.

References

1. Kim JH, Kim WO, Min KT, Yang JY, Nam YT: A comparison of textbook and computer simulation in the study of advanced cardiac life support. J Clin Anesth 2002;14:XXX-XXX.
2. Schwid HA, Rooke GA, Ross B, Sivarajan M: Use of a computerized advanced cardiac life support simulator improves retention of advanced cardiac life support guidelines better than a textbook review. Crit Care Med 1999;27:821–4.
3. Nyssen A, Larbuisson R, Janssens M, Pendeville P, Mayne A: A comparison of the training value of two types of anesthesia simulators: computer-screen based and mannequin-based simulators. Anesth Analg 2002;94:1560–5.
4. Eliot CR, Williams KA, Woolf BP: An intelligent learning environment for advanced cardiac life support. Proc AMIA Annu Fall Symp 1996:7–11.
5. Chopra V, Gesink BJ, de Jong J, Bovill JG, Spierdijk J, Brand R: Does training on an anaesthesia simulator lead to improvements in performance? Br J Anaesth 1994;73:293–7.
6. Schneider AJ, Murray WB, Mentzer SC, Miranda F, Vaduva S: "Helper:" a critical events prompter for unexpected emergencies. J Clin Monit 1995;11:358–64.

Journal of Clinical Anesthesia 14:386–387, August 2002. © 2002 Elsevier Science Inc. All rights reserved.
