eLearning techniques supporting problem based learning in clinical simulation


International Journal of Medical Informatics (2005) 74, 527-533. doi:10.1016/j.ijmedinf.2005.03.009

Charles Docherty*, Derek Hoy, Helena Topp, Kathryn Trinder

Glasgow Caledonian University, Cowcaddens Road, Glasgow G4 0BA, Scotland, UK

* Corresponding author. E-mail address: [email protected] (C. Docherty).

Received 1 November 2004; received in revised form 16 February 2005; accepted 23 March 2005

Keywords: Problem based learning; eLearning; Clinical simulation

Summary

This paper details the results of the first phase of a project using eLearning to support students' learning within a simulated environment. The locus was a purpose built clinical simulation laboratory (CSL) where the School's philosophy of problem based learning (PBL) was challenged through lecturers using traditional teaching methods. The solution was a student-centred, problem based approach to the acquisition of clinical skills that used high quality learning objects embedded within web pages, substituting for lecturers providing instruction and demonstration. This encouraged student nurses to explore, analyse and make decisions within the safety of a clinical simulation. Learning was facilitated through network communications and reflection on video performances of self and others. Evaluations were positive, with students demonstrating increased satisfaction with PBL, improved performance in exams, and increased self-efficacy in the performance of nursing activities. These results indicate that eLearning techniques can help students acquire clinical skills in the safety of a simulated environment within the context of a problem based learning curriculum.

© 2005 Elsevier Ireland Ltd. All rights reserved.

1. Introduction

PBL has recently been adopted as the educational philosophy of Glasgow Caledonian University (GCU) School of Nursing, Midwifery and Community Health. With this approach, learning is derived from the exploration of issues arising from patient centred scenarios, and is claimed to develop skills in critical thinking, decision making, team-working and problem solving [1]. As a teaching method, PBL is resource intensive, and because it is contentious whether long-term benefits for professionals actually materialise, there is demand for rigorous evaluation [2]. Being a new approach for many in GCU, staff development focused on how students acquire theoretical knowledge, with some success [3]. However, for clinical skills acquisition, the student-centred approach was compromised by lecturers providing information and demonstrating activities in a traditional, teacher-centred manner, where students were passive recipients with limited practice under direct supervision. There were also quality concerns, as lecturers had varied clinical expertise that was of unknown currency and validity.


This issue, clinical credibility, is a long-recognised challenge for nurse lecturers [4-6]. That aside, the contradictory philosophies of education, liberal for theory and traditional for skills acquisition, could only have been confusing for students. It was proposed, therefore, that teachers be encouraged to step back from their traditional role and allow students to develop team-working, analytical and decision making skills in simulated practice, by providing quality assured learning resources for students, facilitated through eLearning technologies.

The case for using eLearning techniques to support PBL derives from their shared philosophical underpinning of constructivism. The conceptual exploration, analysis and decision making central to PBL [7] parallel the exploration, analysis and decision making required in browsing hypertext based learning objects [8]. Both produce incidental learning, and students have the advantage of being in control of the process and content of their learning. Disadvantages such as studying to inappropriate depth, becoming distracted or spending excessive time searching for resources are common to both, requiring greater self-discipline by students [9]. The role of the educationalist is complementary: teaching in PBL is more accurately a process of facilitation, while eLearning can also be facilitative, through discussion forums and through learning objects such as animation or video that promote enquiry. Within the context of the clinical skills laboratory, where outcomes are measured in terms of clinical competence, the challenge for educationalists is to achieve the fine balance between giving instruction and promoting enquiry, so that efficient and effective skills acquisition occurs in the short term and the benefits of PBL materialise in the long term.

The purpose of this study is therefore to evaluate this change. The difficulties in evaluating such innovations in nursing education are, however, well documented: one review of 26 evaluative studies of computers in learning, spanning 30 years, found significant design flaws, including small sample sizes and lack of controls [10]. Evaluating virtual environments from an educational perspective is a developing field [11]. Using the rationale that learning technology is one factor in a complex situation, it can be argued that a multidimensional and flexible approach needs to be taken to its evaluation. Such an approach attempts to show how technology contributes to the overall learning experience, and has been widely accepted [11-16]. In this project, therefore, quantitative outcomes in terms of student satisfaction, exam results and self-efficacy in relation to clinical skills were combined with interview data to provide a holistic impression of the change in educational approach, rather than attempting to make discrete, causal links in a reductionist way.

This type of project requires skills in management, investigation and specification, design, production, validation and evaluation, with careful documentation and consideration given to future maintenance [17]. A team was formed to reflect the demands of the project and was supported by a substantial grant from GCU.

2. Materials and methods

2.1. Creating the ICU learning environment

A Draeger™ mannequin was attached to, among other equipment, a ventilator and monitor that could simulate and display clinical data. This became 'George Morgan', the central character of a PBL scenario that students would 'nurse' each week for 6 weeks. Typical bedside equipment completed the environment, and ICU background sounds provided an extra layer of fidelity. A networked computer was provided at a workstation, and video recording equipment was installed that could be controlled remotely. Software was created in Authorware™, supplemented with tools for network communication. Formative evaluation, based on the work of Tessmer [18], was extensive: content was checked for accuracy and currency by intensive care experts, while final year students determined usability and identified errors and misunderstandings. A 'minimum instruction' approach was taken to encourage students to investigate, explore and learn from each other. This has been shown to be more effective than providing explicit directions [19], and is consistent with the philosophy of PBL. With feedback, the interface became simple and intuitive, with easy access to scenario resources, e-mail links to facilitators and on-line multimedia quizzes.

Students in groups of four selected an hour-long, unsupervised lab session each week as the PBL scenario unfolded. This complemented face-to-face tutorials and fixed resource sessions such as lectures. Once logged onto the web-based system, students involved themselves in practice situations, receiving subtle guidance and prompts from multimedia learning objects embedded within the scenario text, with no teacher present to instruct or demonstrate. Instead, current, practising nurses from intensive care units were videoed demonstrating their clinical competence in a variety of situations. These 'learning objects' modelled nursing interaction with equipment, data, patient and family, and encouraged students to explore, practice and develop their competence in the activities being portrayed. Linking digital media to specific learning outcomes has since become standard in learning object production [20].

George's needs changed over the 6-week period and the intensive care environment reflected this. For example, get-well cards and personal items such as photographs and posters began to accumulate after the first week, as well as different monitoring and treatment equipment. Through pop-up videos and embedded learning objects, George's family regularly challenged students' values with ethical dilemmas related to consent, advocacy, confidentiality and the right to life, providing an insight into the social and psychological complexities of intensive nursing care. Like a good 'soap opera', the unfolding scenario and characterisation invited students to become involved at a personal and emotional level. For those that did, opportunities were there to empathise, and to practice the higher order skills required to support intensive care patients and their families through crises, all within the safety of a simulation. Vicarious learning was supported through students recording their laboratory performances for peer review. With consent, these became new learning objects, incorporated into the system for access by students' contemporaries.
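The paper does not describe the internal structure of the scenario content (the original system was authored in Authorware™ and delivered through web pages). As a purely illustrative sketch of the idea of linking embedded learning objects to specific learning outcomes, the snippet below shows one hypothetical, modern-style representation of a scenario week; all class names, fields and example content are assumptions, not the project's actual data model.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class LearningObject:
    """A single multimedia resource embedded in the scenario text (hypothetical model)."""
    title: str
    media_type: str                 # e.g. "video", "animation", "quiz"
    uri: str                        # location of the media file (illustrative path)
    learning_outcomes: List[str] = field(default_factory=list)


@dataclass
class ScenarioWeek:
    """One week of an unfolding PBL scenario such as 'George Morgan'."""
    week: int
    narrative: str                  # scenario text shown to students
    learning_objects: List[LearningObject] = field(default_factory=list)

    def outcomes_covered(self) -> List[str]:
        """Collect every learning outcome addressed by this week's embedded objects."""
        return sorted({o for lo in self.learning_objects for o in lo.learning_outcomes})


# Invented week 1 content: a video of a practising ICU nurse plus a multimedia quiz.
week1 = ScenarioWeek(
    week=1,
    narrative="George Morgan is admitted to ICU following a road traffic accident...",
    learning_objects=[
        LearningObject("Ventilator safety checks", "video",
                       "media/vent_checks.mp4",
                       ["Perform ventilator safety checks"]),
        LearningObject("Week 1 multimedia quiz", "quiz",
                       "quizzes/week1.html",
                       ["Interpret monitored clinical data"]),
    ],
)

print(week1.outcomes_covered())
```

A structure of this kind also makes the re-use reported later in the paper straightforward, since each learning object carries its own outcomes and can be embedded in a different scenario.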

2.2. Study design

The 'traditional' delivery of the course involved, each week, a 1-h lecture and a scenario-based 3-h PBL tutorial, during which practice skills were discussed and demonstrated by lecturers in class. The eLearning innovation involved, each week, the substitution of one tutorial hour for 1 h in the technology-facilitated clinical simulation, where the students were much more in control of their learning. In addition, on-line discussion and e-mail facilities were available to this group. One cohort (n = 160) that undertook this course in the traditional manner was compared with a second cohort (n = 143) following the introduction of the eLearning innovation, exactly 1 year later. Student satisfaction with PBL and exam results were compared. These comparisons were valid, as important variables were controlled through efforts to ensure that teaching personnel, group size, teaching materials, teaching environment, exams and fixed resources were as far as possible identical for each cohort. The independent variable was the eLearning innovation that substituted for 1 h of traditional teaching each week.

A 26-item 'student satisfaction' questionnaire, previously shown to have acceptable test/re-test reliability and internal consistency, consisted of sections relating to specific aspects of PBL, the process of facilitation, the availability of resources, and the effectiveness of learning. Each item was scored on a Likert-type scale from 1 to 5. The questionnaire was distributed at the end of the 6-week course for each cohort, and the data were analysed using the Wilcoxon Sum of Ranks test.

An identical exam was given to both cohorts, consisting of multiple choice (MCQ) and short answer type questions. Contamination was prevented by strictly controlling the circulation of papers. Samples of marking from each cohort, from the same teachers, were analysed (non-eLearning, n = 70; eLearning, n = 60), and students' marks were compared using Wilcoxon Sum of Ranks tests.

In addition to comparisons between cohorts, changes to self-efficacy within the eLearning group were measured. Perceptions of efficacy have been shown to influence the actions that individuals take, how much effort they invest, how long they persevere in the face of disappointing results, and whether tasks are approached self-assuredly or not [21-23]. These perceptions thus become strong predictors of performance [24]. Self-efficacy has therefore been found useful in the evaluation of educational interventions [22,25-27], and was chosen to provide an indirect measure of how the clinical performance of students in the eLearning cohort developed over time. It was not considered appropriate to compare the traditional cohort with the eLearning cohort, as the opportunity to develop self-efficacy in relation to clinical skills was very limited in the traditional group. With the help of senior intensive care nurses, 25 nursing activities were identified to reflect module learning outcomes and CSL activities. A questionnaire was then constructed to rate how confident students felt that they could perform each of these, using an established 5-point rating scale with criteria ranging from 'no confidence whatever' to 'very confident' [12]. Seventy-three students completed this questionnaire prior to the lab session in week 1, then again on completion of the labs in week 6. These were coded for confidentiality purposes, and 63 matched pairs were obtained for analysis using the Wilcoxon Matched Pairs test.

Finally, interviewing selected lecturers, students and their mentors within the eLearning cohort helped provide a fuller and much richer understanding of the effectiveness of the educational intervention. SPSS was used for all statistical analysis.
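The study used SPSS for its analyses. As an illustrative, hypothetical re-creation of the same two nonparametric procedures (not the study's data or output), the sketch below uses scipy: the Wilcoxon Sum of Ranks test for independent cohorts corresponds to the Mann-Whitney U test, and the Wilcoxon Matched Pairs test corresponds to the Wilcoxon signed-rank test. The score lists are invented for demonstration.

```python
# Illustrative reproduction of the study's two nonparametric tests using scipy
# rather than SPSS. All score lists below are made up, not the project's data.
from scipy.stats import mannwhitneyu, wilcoxon

# Between-cohort comparison (Wilcoxon Sum of Ranks / Mann-Whitney U), e.g.
# satisfaction or exam scores for the traditional vs eLearning cohorts.
traditional = [3.1, 2.8, 3.4, 3.0, 2.6, 3.3, 2.9]
elearning   = [3.8, 3.5, 3.9, 3.6, 3.2, 4.0, 3.7]
stat, p = mannwhitneyu(traditional, elearning, alternative="two-sided")
print(f"Rank-sum comparison: U = {stat}, P = {p:.4f}")

# Within-group pre/post comparison (Wilcoxon Matched Pairs / signed-rank), e.g.
# self-efficacy ratings before and after the six lab sessions, one pair per student.
pre  = [2.1, 2.6, 1.8, 3.0, 2.4, 2.2, 2.9]
post = [3.4, 3.6, 2.9, 3.8, 3.3, 3.1, 3.9]
stat, p = wilcoxon(pre, post)
print(f"Matched pairs comparison: W = {stat}, P = {p:.4f}")
```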


3. Results

3.1. Student satisfaction: differences between traditional and eLearning groups

Mean scores were calculated for each questionnaire item for the traditional and eLearning groups and compared using the Wilcoxon Sum of Ranks test (Table 1). A statistically significant difference in student satisfaction emerged (P < 0.001), the eLearning group demonstrating greater satisfaction.
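As a small worked illustration of the per-item comparison reported in Table 1 (not code or data from the study), the snippet below computes the difference in item means between two cohorts of Likert responses; the response lists are invented.

```python
# Worked illustration of the per-item mean-difference column shown in Table 1.
# The Likert responses (1-5) below are invented, not the study's data.
traditional_item = [3, 3, 4, 2, 3, 4, 3, 3]   # one questionnaire item, traditional cohort
elearning_item   = [4, 4, 5, 3, 4, 4, 4, 3]   # same item, eLearning cohort

mean_trad = sum(traditional_item) / len(traditional_item)
mean_elearn = sum(elearning_item) / len(elearning_item)
print(f"Traditional {mean_trad:.1f}, eLearning {mean_elearn:.1f}, "
      f"difference {mean_elearn - mean_trad:+.1f}")
```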

3.2. Exam scores: differences between traditional and eLearning groups

Applying the Wilcoxon Sum of Ranks test to the data, there was no statistically significant difference between the traditional and eLearning cohorts in either total marks (P = 0.25) or marks for short answer questions (P = 0.92). There was, however, a statistically significant difference in multiple choice marks, the eLearning cohort demonstrating higher scores (P = 0.0003).

3.3. Self-efficacy: growth within the eLearning group, measured pre- and post-experience

For the 63 matched pairs, scores for each item were compared before and after the eLearning experience (Table 4). The improved mean score for each item demonstrated that students were more confident in being able to perform each of the clinical skills represented in the questionnaire, this growth being statistically significant, ranging from P = 0.0047 to P < 0.0001.

4. Interviews

Two lecturers and eight students were purposively selected and agreed to participate in a series of individual semi-structured interviews at the beginning, middle and end of the module. Mentors were interviewed when these students went on placement. A systematic approach to data collection and analysis was taken: interviews were recorded and transcribed, and the transcripts were reviewed by the subjects and agreed to be a true reflection of what was said.

Table 1  Student satisfaction: mean scores (traditional cohort n = 107, 66% response; eLearning cohort n = 111, 78% response)

Item   Traditional   eLearning   Difference
1      2.0           3.3         +1.3
2      3.3           3.8         +0.5
3      3.2           3.9         +0.7
4      3.3           4.0         +0.7
5      3.2           3.9         +0.7
6      3.4           3.7         +0.3
7      3.3           3.7         +0.4
8      3.3           3.9         +0.6
9      3.3           3.4         +0.1
10     3.4           3.9         +0.5
11     3.4           3.8         +0.4
12     3.4           3.8         +0.4
13     3.4           3.9         +0.5
14     2.6           3.6         +1.0
15     3.0           3.5         +0.5
16     2.4           3.2         +0.8
17     3.3           4.0         +0.7
18     3.1           3.4         +0.3
19     3.1           3.6         +0.5
20     3.1           3.8         +0.8
21     2.8           3.4         +0.6
22     2.8           3.3         +0.5
23     2.9           3.5         +0.6
24     2.1           3.3         +1.4
25     2.4           3.4         +1.0
26     3.2           3.6         +0.4

Analysis using Wilcoxon Sum of Ranks: P < 0.001.

Two independent qualitative researchers analysed the transcripts. The principles of component analysis [28] were used to explain how the process of learning and teaching was perceived by participants. Three categories, 'understanding', 'learning relationships' and 'reflection', were identified as themes that inevitably overlapped.

Understanding related to a growth in the student's ability to interact with the scenario in a meaningful way, apply knowledge, learn from mistakes and reflect upon experiences with the aid of video. It became a product of learning activities within the simulated environment. Learning relationships were seen as essential to developing this understanding; they both shaped, and were shaped by, the experiences in which individuals and the group participated. Sharing experiences, such as resuscitating 'George' or role-playing his relatives, required team-work and created a bond of friendship between students. Reflecting on simulated practice, especially with the aid of video, helped develop understanding and identified strengths and weaknesses in learning. Shared learning experiences occurred through students supporting each other in team-work and role development, prompted by, for example, observing professional nurses on video, reflecting on performances and modelling behaviours accordingly. The social nature of learning that occurred in the CSL, as groups negotiated roles and activities, was consistent with theories of 'learning communities' [7,29].

The novelty of the experience contributed to students reflecting upon their experiences there. One student, referring to her experience of attempting to resuscitate 'George' in an emergency situation, found that when she told this story to family and friends afterwards, 'just talking about it reinforced it'. Mentors confirmed that transfer of learning was enhanced by students' experiences in the CSL, and students referred to this experience when reflecting on and making sense of practice. Students in this cohort seemed to have a greater understanding of the nurse's role in intensive care and were able to participate in nursing activities earlier than expected.

5. Discussion

The improvement in student satisfaction (Table 1) could be attributed to the change to a much more consistent educational philosophy, and to the provision of high quality learning materials intrinsic to this new development. Little else could realistically have produced such a statistically significant improvement, as there was no evident difference between the cohorts in terms of demography, education or social circumstances that might have explained why they were so satisfied with their educational experience.

Exam scores, overall, did not differ significantly (Tables 2 and 3); however, on deeper analysis of the data, MCQ scores were significantly higher for the eLearning cohort (P = 0.0003). This could possibly be attributed to the weekly multimedia quiz within the system, which in some ways was similar to, and could have been a rehearsal for, the MCQ exam.

Table 2  Analysis of exam results, traditional cohort (N = 70)

                     Mean    S.D.   Min   Max
Multiple choice      1.57    0.7    0     4
Short answer         7.7     2.6    1.9   13.1
Total (out of 20)    9.47    3.1    3     16

Table 3  Analysis of exam results, eLearning cohort (N = 60)

                     Mean    S.D.   Min   Max
Multiple choice      2.35    1.1    0     5
Short answer         7.6     2.2    3     12
Total (out of 20)    9.97    2.47   5     15

The self-efficacy results were extremely encouraging (Table 4). The slight variation in statistical significance between items reflected the fact that some students were already familiar with certain clinical skills through previous clinical experience, so the improvement in self-efficacy during the course was less marked for those items. Overall, these results reflect the self-efficacy enhancing measures integral to the eLearning development, such as vicarious learning, peer-group reinforcement and encouragement, and video role modelling.

The results were very positive and provide evidence that learning clinical skills in a simulated environment can be problem based, student-centred and facilitative, using multimedia learning objects and technology. It was clear that many students found the experience motivating and enjoyable, suspending disbelief and talking of George as if he were a real person, while others had more difficulty relating to the characters of the scenario. Feedback from lecturers undertaking tutorials with students indicated that these were more interactive and animated than previously, with greater depth of discussion.

Caution needs to be exercised in generalising from these findings, however, as the evaluation was limited to two cohorts, with the eLearning group in particular being the subject of much attention. Hawthorne and novelty effects [30] were bound to have contributed to differences in findings.

However, the system was robust and the educational approach soundly based on theoretical principles, contributing to consistency in teaching and learning: connecting students with each other and their facilitators, linking classroom and laboratory activities, and linking theory with practice. This reflects Collis' view that learning is more a process of making links and connections than of working through someone else's way of developing a thought [31].

Table 4  Pre- and post-measures of self-efficacy: mean scores (63 matched pairs)

Item   Pre-module score   Post-module score   Significance (Wilcoxon matched pairs test)
1      2.66               3.85                P < 0.0001
2      2.18               3.61                P < 0.0001
3      1.67               2.46                P < 0.0001
4      2.34               3.75                P < 0.0001
5      1.72               2.9                 P < 0.0001
6      2.77               3.49                P < 0.0001
7      2.41               3.33                P < 0.0001
8      3.25               3.92                P < 0.0001
9      2.33               3.26                P < 0.0001
10     2.51               3.47                P < 0.0001
11     1.80               2.98                P < 0.0001
12     1.58               2.82                P < 0.0001
13     1.66               2.98                P < 0.0001
14     1.85               3.10                P < 0.0001
15     2.44               3.25                P < 0.0001
16     2.93               3.50                P < 0.0001
17     3.13               3.95                P < 0.0001
18     2.28               3.15                P < 0.0001
19     2.10               3.54                P < 0.0001
20     1.90               3.30                P < 0.0001
21     2.05               2.63                P = 0.0001
22     2.69               3.38                P < 0.0001
23     3.03               3.69                P < 0.0001
24     3.66               4.00                P = 0.0047
25     2.77               3.74                P < 0.0001

A significant improvement in self-efficacy scores was obtained for each item, ranging from P = 0.0047 to P < 0.0001.

6. Conclusion

It is clear that the innovation consisted of many factors. These may have interacted and had different effects on outcomes, and further work is indicated if cause and effect relationships are to be established. From the available evidence, the initial phase of this project has been highly successful. Since its inception, 'George' has provided a standardised educational experience for over 700 students in its original form, while the learning objects have been re-used in different contexts by a further 400 students. Although detailed cost-benefit analyses have yet to be performed, this project has shown that savings in student contact time and CSL teaching time are possible while maintaining the quality of the educational experience. The creation of quality digital resources that were specifically designed for one context and re-usable in another indicates additional potential savings.

The results of this project provided the confidence and motivation to increase the scale of activities: the educational benefits for students were unequivocal. However, providing this facility and maintaining quality across a large number of simulated ward and community settings creates difficulties in terms of digital resource production, storage, access and maintenance. That is the next challenge.

References

[1] M.A. Albanese, S. Mitchell, Problem based learning: a review of literature on its outcomes and implementation issues, Acad. Med. 68 (1) (1993) 52-81.
[2] M. Newman, Project on the effectiveness of problem based learning. Available at http://www.hebes.mdx.ac.uk/teaching/Research/PEPBL, 2002.

[3] A. Johnston, R. Tinning, Meeting the challenge of problem based learning: developing the facilitators, Nurse Education Today 21 (3) (2001) 161-169.
[4] M. Crotty, Clinical role activities of nurse teachers in Project 2000 programmes, J. Adv. Nursing 18 (1993) 460-464.
[5] M. Crotty, T. Butterworth, The emerging role of the nurse teacher in Project 2000 programmes in England: a literature review, J. Adv. Nursing 17 (1992) 1377-1387.
[6] L. Peach, Fitness for Practice, the UKCC Commission for Nursing and Midwifery Education, United Kingdom Central Council for Nursing, Midwifery and Health Visiting, London, September 1999.
[7] S. Barab, T.M. Duffy, From practice fields to communities of practice, in: D. Jonassen, S. Land (Eds.), Theoretical Foundations of Learning Environments, Erlbaum, New Jersey, 1999.
[8] H. Burgess, Putting the e into EAL, in: SiSWE Symposium: the place of eLearning in SiSWE - aiding learners by using technology, February 28, 2003, Heriot Watt University, 2003.
[9] R. Mason, Ensuring the learning in eLearning, in: SiSWE Symposium: the place of eLearning in SiSWE - aiding learners by using technology, February 28, 2003, Heriot Watt University, 2003.
[10] M.J. Lewis, R. Davies, D. Jenkins, M.I. Tait, A review of evaluative studies of computer-based learning in nursing education, Nurse Education Today 21 (2001) 26-37.
[11] S. Britain, O. Liber, A framework for the pedagogical evaluation of virtual learning environments. Available at http://www.jtap.ac.uk/reports/htm/jtap-041.html, JISC Technology Applications Programme, 1999.
[12] S.W. Draper, M.I. Brown, F.P. Henderson, E. McAteer, Integrative evaluation: an emerging role for classroom studies of CAL, Comput. Education 26 (1996) 17-32.
[13] S.W. Draper, Prospects for summative evaluation of CAL in higher education, Assoc. Learning Technol. J. 5 (1) (1997) 33-39.
[14] C. Gunn, CAL evaluation: what questions are being answered? A response to the article 'Integrative evaluation' by Draper et al., Comput. Education 27 (1996) 157-160.
[15] C. Gunn, CAL evaluation: future directions, Assoc. Learning Technol. J. 5 (1) (1997) 40-47.
[16] N. Rushby, Quality criteria for multimedia, Assoc. Learning Technol. J. 5 (2) (1997) 18-30.


[17] D. Lowe, W. Hall, Hypermedia and the Web: An Engineering Approach, Wiley, London, 1999.
[18] M. Tessmer, Planning and Conducting Formative Evaluations, Kogan Page, London, 1993.
[19] J. Preece, H. Sharp, D. Benyon, S. Holland, T. Carey, Human-Computer Interaction, Addison-Wesley, Wokingham, 1994.
[20] D. Leeder, Reusable learning objects for professional education, in: SiSWE Symposium: the place of eLearning in SiSWE - aiding learners by using technology, Heriot Watt University, 2003.
[21] A. Bandura, Social Foundations of Thought and Action: A Social Cognitive Theory, Prentice-Hall, Englewood Cliffs, New Jersey, 1986.
[22] J.E. Murdock, P.J. Neafsey, Self-efficacy measurements: an approach for predicting practice outcomes in continuing education? J. Continuing Education Nursing 26 (4) (1995) 158-165.
[23] F. Pajares, Self-efficacy in academic settings, in: Annual Meeting of the American Educational Research Association, April 18-22, 1995, San Francisco, 1995.
[24] M.B. Hodge, Do anxiety, math self-efficacy, and gender affect nursing students' drug dosage calculations? Nurse Researcher 24 (4) (1999) 36-41.
[25] D. Goldenberg, C. Iwasiw, E. MacMaster, Self-efficacy of senior baccalaureate nursing students and preceptors, Nurse Education Today 17 (1997) 303-310.
[26] V.E. O'Halloran, S.E. Pollock, T. Gottlieb, F. Schwartz, Improving self-efficacy in nursing research, Clinical Nurse Specialist 10 (2) (1996) 83-87.
[27] C. Docherty, The Instructional Design and Evaluation of a Multimedia Program to Help Mentors Develop Skills in Assessing Student Nurses' Clinical Performance, Glasgow University, Glasgow, 2002.
[28] O. Werner, G. Schoepfler, Systematic Field Work, Sage Publications, London, 1987.
[29] J.T. Mayes, Learning technology and Groundhog Day, in: W. Strang, V. Simpson, D. Slater (Eds.), Hypermedia at Work: Practice and Theory in Higher Education, University of Kent Press, Canterbury, 1995.
[30] D.F. Polit, B. Hungler, Nursing Research: Principles and Methods, fifth ed., Lippincott, Philadelphia, 1995.
[31] B. Collis, Tele-learning in a Digital World: The Future of Distance Learning, International Thomson Computer Press, London, 1996.