Evaluation of a pilot e-learning primary health care skills training program for pharmacists


Available online at www.sciencedirect.com

Currents in Pharmacy Teaching and Learning 5 (2013) 580–592

Research

http://www.pharmacyteaching.com

Barbara Farrell, PharmD (a,b,c,d,*), Brad Jennings, MEd (e), Natalie Ward, PhD(c) (f), Pia Zeni Marks, MA (e), Natalie Kennie, PharmD (g,h), Lisa Dolovich, PharmD (i,j), Derek Jorgenson, PharmD (k), Caitlin Jones, BSc (Pharm) (b,d), Ashley Gubbels, BSc (Pharm) (b,d)

a Pharmacy Department, Bruyère Continuing Care, Ottawa, Ontario, Canada
b Bruyère Research Institute and CT Lamont Primary Health Care Research Centre, Ottawa, Ontario, Canada
c Department of Family Medicine, University of Ottawa, Ottawa, Ontario, Canada
d School of Pharmacy, University of Waterloo, Ontario, Canada
e Centre for Extended Learning, University of Waterloo, Waterloo, Ontario, Canada
f Department of Sociology and Anthropology, University of Ottawa, Ottawa, Ontario, Canada
g Summerville Family Health Team, Mississauga, Ontario, Canada
h Department of Family and Community Medicine, University of Toronto, Toronto, Ontario, Canada
i Department of Family Medicine, McMaster University, Hamilton, Ontario, Canada
j Centre for Evaluation of Medicines, St. Joseph's Healthcare Hamilton, Hamilton, Ontario, Canada
k College of Pharmacy and Nutrition, University of Saskatchewan, Saskatoon, Saskatchewan, Canada

Abstract

Objectives: The ADapting pharmacists' skills and Approaches to maximize Patients' drug Therapy effectiveness (ADAPT) online education program was developed using a cognitive apprenticeship model to enable practice change in interprofessional patient-focused care. The content included videos, case vignettes, real-world practice activities, and discussion boards. The evaluation objectives included assessing participant satisfaction, program effect on perception of the importance of performing skills, confidence in performing skills, reported learning, aspects contributing to learning, and reported use of new skills.

Methods: The mixed-methods evaluation included qualitative (e.g., discussion boards and open-ended survey questions) and quantitative (e.g., Likert-type scale) data. Analysis used immersion/crystallization in a convergent parallel approach.

Results: More than 83% of respondents indicated high or very high satisfaction with most modules; there was less satisfaction with the "Orientation" and "Making Decisions" modules. More than 60% of respondents indicated an increased perception of the importance of skills, and more than 68% felt increased confidence in performing skills. Qualitative results substantiated these findings and indicated gains in knowledge of skills and intention to change practice, providing many examples of initial behavior changes (e.g., incorporating a systematic approach to medication assessment).

Conclusion: The results speak to the effectiveness of using a cognitive apprenticeship model for adult professional learners. Expert demonstration, authentic activities that utilize new skills, and peer support promoted satisfaction, prompted learning, and facilitated behavior change. Systematic assessment of qualitative program materials and quantitative survey data provides a robust approach to program evaluation.

© 2013 Elsevier Inc. All rights reserved.

Keywords: Continuous professional development; Pharmacy education; Online learning

☆ Financial disclosure statement: Program development and evaluation were funded by Health Canada under the Health Care Policy Contribution Program in collaboration with the Canadian Pharmacists Association (CPhA). The evaluation budget was contracted from CPhA to the Bruyère Research Institute to ensure evaluation independent of the organization. Neither Health Canada nor CPhA was involved in the study design, collection, analysis, or interpretation of data, writing of the reports, or decisions regarding manuscript submission.

* Corresponding author: Barbara Farrell, PharmD, 43 Bruyère St., Ottawa, Ontario, K1N 5C8 Canada.
E-mail: [email protected]

1877-1297/13/$ – see front matter © 2013 Elsevier Inc. All rights reserved.
http://dx.doi.org/10.1016/j.cptl.2013.07.005

Introduction

Evolving scopes of practice and a higher level of interprofessional collaboration in primary health care are driving changes in roles and responsibilities for pharmacists.1 These new responsibilities require expanded knowledge and skills within a number of domains, including patient interviewing, making complex medication-related decisions, performing and documenting comprehensive medication assessments, collaborating with prescribers, and taking responsibility for drug therapy outcomes.2

The Accreditation Council for Pharmacy Education (ACPE) competencies were updated in 2007 to ensure pharmacists graduate with the skills necessary to embrace evolving scopes of practice and expanding responsibilities.3 Canadian accreditation standards were similarly updated in 2006.4 The Center for the Advancement of Pharmacy Education (CAPE) educational outcomes identify the skills pharmacists must be capable of performing upon graduation, which are similar to those emphasized by the ACPE competencies.5 As a result of these recent pharmacy school competency revisions, practicing pharmacists in North America who did not graduate within the last decade may lack the specialized knowledge and skills to confidently and competently practice in advanced patient care roles.

Our group sought to address the needs6–8 of practicing pharmacists by developing an e-learning program. The overall goal of the "ADapting pharmacists' skills and Approaches to maximize Patients' drug Therapy effectiveness" (ADAPT) program is to help pharmacists improve patient care and collaboration skills and give them confidence to use these skills to change their practices. We are not aware of similar curricular efforts targeting the broad range of skill needs of practicing pharmacists using an asynchronous e-learning approach.
Program design

The ADAPT program concept and initial framework were designed by a working group of the Primary Care Pharmacy Specialty Network (PC-PSN), a joint initiative of the Canadian Pharmacists Association (CPhA) and the Canadian Society of Hospital Pharmacists (CSHP). A management committee worked with the Centre for Extended Learning at the University of Waterloo and subject matter experts to design and develop program content for presentation within the UW–ACE (ANGEL) online learning environment.9

An initial environmental scan of the online pharmacy continuing education (CE) landscape revealed that programs tended to be based on a traditional transmission-oriented model in which participants are provided with content and demonstrate learning through assignments submitted for grading. In this model, learning is viewed as a passive, highly individual activity10 and as primarily cognitive. As Wenger notes, it is a form of learning that has a beginning and an end, is separated from the real-world context of practice, and is viewed as a result of formalized "teaching."11 We were not able to find publications describing the impact of online CE programs in terms of transfer of learning to practice.

Specific skill-based outcomes in support of the ADAPT program's goal include the following abilities of pharmacists:

- to provide comprehensive medication management,
- to collaborate with health care providers,
- to interview and assess patients,
- to make evidence-based clinical decisions,
- to document care, and
- to develop and implement care plans.

It was recognized that pharmacists would need assistance developing change management strategies to enable a cultural shift to accommodate the desired outcomes. Therefore, change management became an implicit desired outcome of the program.

Given the desired outcomes, it was evident that the pedagogical approach must facilitate both the development of knowledge and skills and the transfer of learning to practice. To create this learning environment, design decisions were influenced by the notion of cognitive apprenticeship, a process by which the novice learner engages in authentic (i.e., real-world) activities under the guidance of experts whose involvement diminishes over time as learners gain competency.9 It is a model of instruction that works to make expert thought processes clear and explicit within the expert–novice relationship.10–12 An overview of the ADAPT design approach is outlined in Table 1.

Table 1
Overview of ADAPT design approach

Subject matter experts:
- Provide presentations and simulated patient encounter videos focusing on specific aspects of the patient care process
- Provide tools, templates, and resources

Learners (cohorts of 18 each):
- Watch simulations and reflect on the pharmacist's role in encounters
- Complete real-world (authentic) practice activities using tools, templates, and resources
- Self-assess own practice using global rating scales describing practice competencies at an expert level
- Complete action plans
- Reflect on learning within discussion boards

Moderators (practicing pharmacists familiar with ADAPT's goals and with prior teaching experience):
- Monitor participation; provide assistance to those having difficulty
- Facilitate group development and communication on discussion boards
- Provide individual feedback on action plans

Work on building the course began by defining seven modules from the learning outcomes, along with related learning objectives. Table 2 provides the names and lengths of the seven modules and their main goals. The course prospectus, which describes each module's learning objectives, can be downloaded from http://www.pharmacists.ca/index.cfm/education-practice-resources/professional-development/adapt/. Unlike transmission-oriented learning objectives that treat objectives as an endpoint for gaining factual knowledge, ADAPT's learning objectives situate learning in the context of the types of practice sought by the program's goals. This process was aided by framing the development of the learning objectives within new competencies being

developed for existing practitioners in primary health care.13 These competencies defined specific roles and activities that are needed "for effective patient care in the evolving primary health care field and to meet the needs of populations of patients and their communities."13 Dreyfus and Dreyfus' work on skill acquisition, a model that views skill acquisition as a process of moving from rules-based thinking about knowledge and processes to a tacit understanding of situations, was used to establish common expectations around initial and terminal skill levels for each module.14 Once the program's structure was established, module authors began outlining key lecture content, collecting relevant real-world samples, and designing activities to

Table 2
ADAPT module goals

Module 1: Orientation (one week)
Goal: To introduce learners to the online learning environment and underlying philosophy, course structure, and learning approach.

Module 2: Medication assessment (two weeks)
Goal: To practice using a comprehensive medication assessment approach for a patient with multiple medications and conditions in order to prevent/solve medication-related problems.

Module 3: Collaboration (two weeks)
Goal: To work collaboratively with all members of the primary health care team to improve patient care, as well as develop a plan and take practical steps to move from one stage of a collaborative working relationship to the next stage with one or more physicians.

Module 4: Interviewing patients (two weeks)
Goal: To increase participants' comfort and ability in conducting focused and comprehensive interviews/assessments with patients.

Break (one week)

Module 5: Making decisions (three weeks)
Goal: To integrate information from patients and the literature to make decisions about individual patients, as well as make decisions in the face of uncertainty and take responsibility for those decisions through action, follow-up, and response.

Module 6: Documentation (two weeks)
Goal: To improve the effectiveness of written medication-related assessments and suggestions and to ensure that medication histories are accurate and comprehensive enough to facilitate subsequent decision-making.

Module 7: Putting it together (two weeks)
Goal: To review key skills covered in Modules 2–6 and create a plan to implement those skills in participants' practice settings.


facilitate knowledge transfer. Learning activities included small group discussions, polls, peer review and feedback, as well as activities to be conducted in the participant's practice site. Virtual practice experiences were designed, including video case studies, in which expert practitioners modeled specific skills on which the learners then reflected, and videos in which additional experts discussed challenges and successes with skill implementation. Three patient cases were presented via video and used as prompts for activities and discussions. The OSCAR Electronic Medical Record (EMR) platform was adapted as a learning and teaching tool.15 Using authentic tools like an existing EMR provided a low-risk method for participants to apply learning in practice, a key component within the cognitive apprenticeship model. The order and timing of activities were developed to approximate four hours of learning per week; each module was designed to take place over a one- to three-week period.

Participant assessment was formative in nature, in keeping with the pedagogical goal of providing support for practice change by focusing on clarifying (or making explicit) expert processes. The culminating activity at the end of each module was the ADAPT Action Plan, based on Specific, Measurable, Attainable, Relevant, and Time-Bound (SMART) goals. It provided participants with a space to record reflections on content while assisting them with planning and managing practice change. Participants were asked to explain how they would implement course concepts into existing practice by stating achievable practice change goals, deadlines for accomplishing goals, assistance needed, planning and resources required, and a tracking approach to determine goal attainment. Other than learners' participation on discussion boards, the Action Plans were the only activities assessed by moderators.
Based on guidelines developed by ADAPT subject matter experts, moderators provided timely, personalized, context-specific guidance for each participant based on how course concepts intersected with their own individual goals and practice environments. To illustrate how content and activities were used for each module, the next section provides an overview of Module 2.

An illustration of ADAPT module content: Module 2 (medication assessment)

The goal of Module 2 was to provide an opportunity for learners to practice performing a comprehensive medication assessment for a patient with multiple medications and conditions. It included an overview of and introduction to the patient care process and exposed learners to specific patient care skills, such as collaboration, patient interviewing, evidence-based practice, and documentation, which are the focus of subsequent modules. A breakdown of module content and activities is found in Table 3.


Module 2 began with narrated presentations providing introductory and background information on the key principles of the patient care process.16 Materials and tools used by some Canadian academic pharmacy programs were incorporated to expand on concepts (e.g., the Therapeutic Thought Process) and to allow consistency of approaches.17 Learning activities centered on one mock patient case. Three variations of simulated patient–pharmacist interviews and the EMR record (provided electronically and on paper) supported specific learning activities. These activities included reflecting on and discussing differences between comprehensive and focused medication assessments, selecting patient-specific information required to identify drug-therapy problems, and developing a care plan. The patient case focused on common medical problems (e.g., hypertension, dyslipidemia, and use of natural health products) as well as specific patient complexities (e.g., non-adherence).18 Learners were required to "work up" the patient case by identifying drug-therapy problems and developing a care plan, with the focus of the module being the use and integration of the patient care process into a medication assessment rather than specific therapeutic content.

Objectives

The objectives of our evaluation included (a) assessing participant satisfaction, (b) determining impact on participants' perception of the importance of skills and confidence in performing skills, (c) determining whether participants reported learning, (d) identifying aspects that contributed to learning, and (e) determining whether participants intended to, or were able to, begin incorporating new skills in practice.

Research design and methods

Mixed methods using quantitative and qualitative data were used. We followed a convergence model in which quantitative and qualitative data were collected and analyzed separately; these data were then compared, contrasted, and interpreted together.19 Quantitative methods included descriptive analysis (e.g., percentages) of Likert-type scale questions from embedded surveys at mid-point and after the final module (Appendix 1). Qualitative methods included deductive and inductive content analysis of open-ended survey questions, discussion board postings, module assignments, and participant action plans for each module. The research team was interdisciplinary, comprising members with backgrounds in pharmacy, research methodology, and education; all were involved in some aspect of program development. The study was approved by the Bruyère Continuing Care Research Ethics Board.


Table 3
Module 2 outline

Desired Primary Health Care Pharmacist Competencies: Care Provider Role(a)

A. Assess patients
2.2 Elicit and complete an assessment of required information to determine the patient's medication-related and relevant health needs.
2.3 Assess if a patient's medication-related needs are being met.
2.4 Determine if a patient has health needs that require management.

B. Plan care
2.5 Refer patients for management of priority health and wellness needs that fall beyond the scope of practice of pharmacists.
2.6 Develop a shared plan of care that addresses a patient's medication-therapy problems and priority health needs.
2.7 Implement the care plan.

C. Follow up and evaluate
2.8 Elicit clinical and/or lab evidence of patient outcomes.
2.9 Assess and manage patients' new medication-related needs.

Week 1

Learning Objective #1: Identify essential information to elicit for a medication assessment. (Analysis) Time: 2 hours
1. Watch PowerPoint presentation by subject matter expert summarizing the patient care process and essential information to elicit for a medication assessment.
2. Watch three videos of patient–pharmacist interviews with a standardized patient that focus on initial collection of patient information for a medication history, a focused assessment, and a comprehensive assessment.
3. Post a summary of reflective questions posed on the discussion board and respond to group member postings. Discussion board is moderated.

Learning Objective #2: Discriminate between available problem-solving frameworks and tools and apply them to assist in identifying drug-therapy problems. (Analysis) Time: 2 hours
1. Watch PowerPoint presentation by subject matter expert summarizing the use of two systematic processes for identifying drug-therapy problems and how to state drug-therapy problems.
2. Re-watch the focused medication assessment video and review a mock medical record. Using the systematic processes introduced, identify the drug-therapy problems for the case scenario. Identify and state drug-therapy problems in the Module 2 ADAPT Action Plan. Sample medication history form provided.
3. Once completed, watch the Sample Case Discussion PowerPoint presentation illustrating the patient work-up.
4. Reflect on the process of identifying drug-therapy problems in practice by recording observations in the Module 2 ADAPT Action Plan.

Learning Objective #3: Identify the components of a care plan. (Comprehension–Application) Time: 0.75 hours
1. Watch PowerPoint presentation by subject matter expert summarizing how to develop care plans. Sample care plan template provided.
2. Watch video of pharmacist discussing a care plan with a standardized patient.
3. Reflect on the video and on how to develop and implement care plans in practice; record observations in the Module 2 ADAPT Action Plan.
4. Watch PowerPoint presentation by subject matter expert summarizing components of a follow-up evaluation.

Week 2

Learning Objective #4: Produce the following components of a medication assessment: compile a medication history, identify drug-therapy problems, suggest solution-focused recommendations, and develop a follow-up/monitoring plan. (Creating) Time: 3 hours
1. Re-watch the comprehensive medication assessment video and review the mock medical record for the remainder of the drug-therapy problems for the case.
2. Develop a care plan, post it to the discussion board, and respond to group member postings. Discussion board is moderated.
3. Self-assess the care plan using the Care Plan – Global Rating scale provided.
4. Once the care plan is posted, review sample care plans that are released.
5. Reflect on the medication assessment process, considering challenges, useful evidence-based medicine resources, responsibility, and collaboration; record observations in the Module 2 ADAPT Action Plan.

Learning Objective #5: Identify steps for further learning to complete medication assessments efficiently and effectively, and develop a plan to address those learning needs. (Affective: receiving and responding to phenomena) Time: 0.75 hours
1. Watch PowerPoint presentation by subject matter expert summarizing ways to become more effective and efficient in providing medication assessments in practice.
2. Considering what was learned in the module, identify goals for future learning and develop a plan by completing the Module ADAPT Action Plan.

(a) Competencies from: Kennie N, Farrell B, Ward N, et al. Pharmacists' provision of primary health care: a modified Delphi validation of pharmacists' competencies. BMC Fam Pract. 2012;13:27.

Participants

The ADAPT pilot was open to all Canadian pharmacists, and program tuition was waived for those participating in the pilot evaluation. Interested pharmacists completed a self-assessment to ensure they had time for participation and space for private patient discussions. A purposive sampling strategy was utilized, as discussed by Jorgenson et al.20

Data collection

Mid-point and final surveys. Embedded surveys were administered to all participants at mid-point and within one month following program completion (Appendix 1) using open-ended questions with free-text responses and Likert-type scale questions. The survey links appeared as activities at the end of the fourth and seventh modules; a reminder was posted on a participant announcement board, and an e-mail reminder was sent for each survey.

Discussion boards, assignments, and action plans. Data collection occurred as the program unfolded. One week after the end of each module, a research assistant downloaded the discussion board, assignments, and action plan content (henceforth referred to as "activities"), blinded them, and distributed them to assigned research team members for analysis. An overview of the data collection and analysis process is presented in Figure 1.

Data analysis

Qualitative data analyses were conducted systematically and concurrently with data collection.21 Using the process of immersion/crystallization, analysts alternated between being immersed in the data and stepping away temporarily to identify themes or patterns noticed during the immersion period.22,23 This process was used to elicit a better understanding of the program and learner experience. Weekly subgroup and monthly analyst meetings facilitated ongoing

development of explanations and hypotheses throughout the research period.24

Activities were analyzed by at least two analysts using a standard template created in Microsoft Word (with space for satisfaction and learning, as well as further thoughts on the data). Analysts submitted completed templates to the group, and everyone met monthly over seven months. During these teleconferenced meetings, each analyst described their findings, summarizing and adding to what was captured on the templates. The co-analyst working on the same activity was then able to verify or dispute the analysis. Other team members contributed to the discussion; teleconferences were recorded and transcribed. Each module activity was reviewed by two analysis team members to improve rigor and reduce bias.

The core team (lead investigator, research associate, and research assistant) then further reduced the data from activity analysis templates and survey analysis by organizing them into module-specific matrices in Microsoft Word.25 Each matrix included categories identified deductively and inductively and presented summative statements for the data from each module activity. Matrices were compared against teleconference transcripts to ensure the summative statements represented the data accurately.

Quantitative survey data were recorded using Microsoft Excel and analyzed using descriptive statistics. Responses to questions ranking importance and confidence (much higher, somewhat increased, about the same, less, and much less) and satisfaction (not satisfied, neither satisfied nor dissatisfied, somewhat satisfied, satisfied, and very satisfied) were grouped, and percentages were calculated. Responses to open-ended questions were presented in relation to the question with which they were associated.
These survey findings were distributed, along with a synthesized analysis of the results, to evaluation team members for further analysis and discussion at two of the regular monthly meetings and then discussed together with a review of matrices at a final team retreat. Thus, the qualitative and quantitative data were analyzed separately and then considered together, following a convergence model to create overall themes.19
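The descriptive analysis described above (grouping Likert-type responses and calculating percentages of respondents per category) amounts to simple tallying. A minimal sketch follows; the scale labels come from the text, but the response data and function name are hypothetical, illustrative only, and not the study's data:

```python
# Sketch of the descriptive analysis described above: tally Likert-type
# satisfaction responses and convert them to percentages of respondents.
from collections import Counter

SCALE = [
    "not satisfied",
    "neither satisfied nor dissatisfied",
    "somewhat satisfied",
    "satisfied",
    "very satisfied",
]

def satisfaction_percentages(responses):
    """Return {category: % of respondents}, covering every scale category."""
    counts = Counter(responses)
    total = sum(counts[cat] for cat in SCALE)
    return {cat: round(100 * counts[cat] / total, 1) for cat in SCALE}

# Hypothetical responses for one module's satisfaction question
responses = (
    ["very satisfied"] * 20
    + ["satisfied"] * 15
    + ["somewhat satisfied"] * 4
    + ["not satisfied"] * 1
)
pcts = satisfaction_percentages(responses)
# "Satisfied or very satisfied" can then be grouped, mirroring how the
# evaluation reports "high or very high satisfaction"
high_satisfaction = pcts["satisfied"] + pcts["very satisfied"]
```

With these 40 hypothetical responses, `high_satisfaction` works out to 87.5%; the same tallying applies to the importance and confidence questions with their own scale labels.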


Fig. 1. Overview of data collection and analysis steps:
1. Module runs over two to three weeks.
2. Data downloaded and blinded one week after module completion.
3. Module activities each analyzed by two research team members using a template.
4. Data analysis teleconference to discuss activity findings (survey findings discussed after the Interviewing module and the last module).
5. Core team meeting to reduce module data to matrices.
These steps were repeated for each module, followed by a full team data analysis retreat to review module matrices and survey findings.

Results

All 86 pharmacists registered in the pilot ADAPT program (September 2010–January 2011) provided consent to participate in the evaluation. They represented a broad range of practice settings from across Canada. The average age was 40 years (range 24–62), more than 70% were female (63/86), and 77% (66/86) did not have additional training beyond entry-to-practice credentials.21 During the program, some participants withdrew, leaving 77 participants by the mid-point survey and 75 by the final survey. Those leaving the program were offered the opportunity to comment on the program and why they were leaving; all stated that they were leaving primarily because of personal time constraints. A companion paper will discuss participation and reasons for withdrawal.

The results are presented to correspond with the first three levels of Kirkpatrick's evaluation model for training.26 These levels include participant reaction (level 1), participant learning (level 2), and behavior change (level 3). Participant reaction is presented as satisfaction and includes what Alliger terms affective reactions (the extent to which learners enjoyed the material) as well as utility judgments (the perceived value of the learning for job performance).27 Participant learning and factors affecting learning, assessed through qualitative observation by the researchers, are supported by assessment of attitude and confidence changes. Behavior change, assessed on an ongoing basis, is presented as both intent to change behavior and examples of change attempts described as participants tried out new skills. For

each section, qualitative data are followed by available quantitative data.

Pharmacist satisfaction

Throughout the qualitative data analysis, participants consistently identified a number of program aspects as contributing to satisfaction with learning. These included watching how others demonstrated skills, and activities that emphasized peer interaction and real-world application. For example, the video simulations of pharmacist–patient interviews were routinely identified as being useful to practice. One participant noted: "I loved having the visual [videos] to learn from, and I will always be able to think back to these videos and remember what I learned."

Participants commented consistently on the usefulness of resources, tools, templates, and samples. They valued activities requiring them to interact with patients and other health care providers. The interactive discussions with peers were highlighted as fun and unique, with participants finding it interesting to see how others were trying to implement change in practice and to learn from their unique perspectives. This is illustrated as follows: "…the most useful aspect is the ability to network with pharmacists with these shared values and a variety of expertise."

Practice activities, which often forced learners out of daily routines to explore something different, made people feel uncomfortable but exultant when they succeeded. Comments such as this were common: "So gratified!!!! This 92 year old gentleman supplied me with almost every detail I


required for a medication assessment. This is the first time I have attempted a med assessment without a complete patient chart, I guess it is possible!" Finally, participants indicated the combination of presentations, videos, tools/samples, activities, and discussion provided reinforcement and would enable personal practice change.

Participants also identified issues that reduced satisfaction; this was most apparent in feedback about the "Making Decisions" module, which addressed evidence-based decision-making. Many participants found that the module fell at a busy time of year (preparing for December holidays) and had difficulty with activity pacing. They also struggled to connect the module's relevance to their own practices, found some activities repetitive, and were frustrated by needing access to resources requiring subscriptions. Additionally, many found the module content intimidating due to lack of knowledge of statistics or unfamiliarity with computer searching. One participant noted: "There's nothing I detest more than critically analyzing an article! Unfortunately, it is a necessary evil of evidence-based medicine." Although many participants expressed dissatisfaction with the material, many also found it valuable: "This exercise has given me new ideas – and likely more effective ones – to consider when answering drug-related problems."

The mid-point survey question regarding satisfaction with the first four modules (including Orientation) was answered by 45% of participants (35/77), while the final survey question regarding satisfaction with the last three modules was answered by 55% of participants (41/75). These response rates are consistent with the high end of usual response rates obtained with online surveys of education programs.29 Most respondents indicated high or very high satisfaction with most modules, with fewer satisfied with the "Orientation" and "Making Decisions" modules (Table 4).


Pharmacist learning

Evidence of pharmacist learning, as self-reported or observed in the various activities, is described below as increased awareness of new concepts and approaches, recognition of current knowledge and skill level, and perceptions of knowledge and skill improvement, as well as program aspects that contributed to learning.

Polls, practice activities, and subsequent self-reflection appeared to help learners establish baseline knowledge, self-assess, identify knowledge gaps, compare with others, and begin to recognize personal skill limitations. For example, one participant noted: “I certainly see now that I have not been providing adequate follow-up for my patients.” Initial changes in knowledge were apparent as participants moved through a discovery phase and described an increased awareness of concepts and approaches. These changes in awareness can be attributed to learners’ interactions with the presentations and simulations, both in the form of critique and through comparison with their own practices. We noted that participants seemed to learn as much from observing perceived errors in simulation videos as we had anticipated they would gain from watching exemplary videos.

Learners consistently indicated that the following key concepts were new and valuable: using a structured approach to identify drug-therapy problems and conduct patient interviews, using a head-to-toe assessment and physical assessment techniques, using focused clinical questions, having access to evidence-based resources, and considering guidelines in a critical fashion. Identifying differences between focused and comprehensive interviews, using lines of questioning to assess patients, asking more open-ended questions, and handling challenging scenarios were also identified as important new skills. The ability to practice and learn in a supportive environment, and to reflect on the experience within the safe environment of their cohort with peer feedback, seemed to be important to learning.

Participants gained knowledge from each other in terms of examples given and strategies used, commenting frequently along the lines of: “I got a lot of great ideas how to document from looking at the other participants’ work…” Learning needs identified early on were addressed in subsequent modules and evidence of increased knowledge

Table 4
Participant satisfaction with individual modules

Module           | Not satisfied, N (%) | Neither satisfied nor dissatisfied, N (%) | Somewhat satisfied, N (%) | Satisfied, N (%) | Very satisfied, N (%) | No response, N (%)
Orientation(a)   | 1 (2.9) | 9 (25.7) | 4 (11.4) | 17 (48.6) |  4 (11.4) | 0
Assessment(a)    | 1 (2.9) | 2 (5.7)  | 2 (5.7)  | 25 (71.4) |  5 (14.3) | 0
Collaboration(a) | 1 (2.9) | 3 (8.6)  | 1 (2.9)  | 19 (54.3) | 11 (31.4) | 0
Interviewing(a)  | 0       | 0        | 1 (2.9)  | 15 (42.9) | 15 (42.9) | 4 (11.4)
Decisions(b)     | 3 (7.3) | 3 (7.3)  | 9 (22)   | 23 (56.1) |  2 (4.9)  | 1 (2.4)
Documentation(b) | 0       | 3 (7.3)  | 1 (2.4)  | 17 (41.5) | 19 (46.3) | 1 (2.4)
Put together(b)  | 1 (2.4) | 3 (7.3)  | 2 (4.9)  | 18 (44)   | 16 (39)   | 1 (2.4)

(a) Data from mid-point survey, N = 35.
(b) Data from final survey, N = 41.



and skills gained became more apparent as modules progressed. Significant learning, in terms of the hierarchy of appraised evidence, available resources, developing focused clinical questions, and critiquing literature, was apparent in the action plan reflections for the “Making Decisions” module and was linked to application in subsequent modules, despite low satisfaction scores for this module. Learners noted that the quality of documentation improved using skills attained from the “Making Decisions” module, through the “Documentation” module, to the “Putting it Together” module, as described by one participant who commented that their most important learning was how to document “in a clear, effective manner that provides evidence-based recommendations for drug-related problems.” Learners routinely reflected on learning, identifying a number of skills gained. They seemed to synthesize learning in the last module, as we observed evidence of learners drawing on content from all modules in their final care plans.

Pharmacist attitude changes

Over the 15 weeks of the program, we observed, in the various activities, increased perception of the importance of skills, a greater feeling of professional responsibility for the outcomes of patient care, recognition of the need to listen to and assess patients (instead of just providing information), desire for further collaboration with other health care providers, and the desire to improve the quality of documentation and to support decisions with evidence. We also observed a shift from feeling limited by external pressures of time, exemplified as follows: “I sometimes feel I do not have enough time to document all of my patient-care related activities,” to a feeling of personal accountability in order to better care for patients, stated as follows: “I will need to make time management changes to incorporate detailed documentation in my daily practice.” In the final survey, more than 60% (25/41) of respondents indicated that the importance they assigned to skills taught in ADAPT had increased (Table 5).

Effect on pharmacists’ confidence

Activities initially revealed participants’ lack of confidence in many areas. Participants were preoccupied with getting a written copy of what to ask when performing a head-to-toe assessment, suggesting a lack of confidence in their ability to use this approach. Learners’ language was often imprecise, lacking conviction and certainty; terms such as “seems thorough” were commonly used. Learners demonstrated a lack of confidence in knowing what or how to document and discomfort in performing new skills (e.g., physical assessment and taking notes while interviewing patients). They appeared intimidated by the evidence-based learning module (“Making Decisions”).

Over the 15 weeks of the program, increasing confidence was observed. This type of comment was prevalent: “I gained confidence in my own abilities that I already possessed. I realized that I can do this.” Learners began noting that they felt more capable of handling difficult interview situations, that their ability to critically evaluate a study and accept or reject its implications had improved, and that feedback from physicians was increasing their confidence. Despite this, learners participating in the “Putting it Together” module still indicated a desire to feel even more confident, as highlighted in this quote: “I still need to become more confident in assessing patients, although I see improvement.” In the final survey, most respondents felt their confidence in performing the skills taught in ADAPT had improved, though by how much varied (Table 5). The only skill area in which some participants indicated their confidence had decreased was the one related to making decisions.
Pharmacist behavior change

Processes of undertaking behavioral change are described below in terms of contemplating or intending to change, grappling with the ability to change, sharing struggles with practice activities and strategies for change, and trialing larger practice changes with observation of the impact on patient care. Intention to change was apparent throughout, with people stating what they had learned as follows: “I have learned how to communicate my recommendations to the

Table 5
Participants’ perception of the importance they assign to skills and their confidence in performing skills as a result of having participated in ADAPT, N = 41(a)

Importance is…
Module        | Much less | Less  | About the same | Somewhat increased | Much higher | No response
Assessment    | 0 | 0     |  7 (17) | 16 (39) | 18 (44) | 0
Collaboration | 0 | 1 (2) |  6 (15) | 14 (34) | 20 (49) | 0
Interviewing  | 0 | 1 (2) |  9 (22) | 11 (27) | 20 (49) | 0
Decisions     | 0 | 0     | 12 (29) | 14 (34) | 15 (37) | 0
Documentation | 0 | 0     | 16 (39) | 11 (27) | 14 (34) | 0
Put Together  | 0 | 0     | 11 (27) | 10 (24) | 20 (49) | 0

Confidence is…
Module        | Much worse | Worse | About the same | Somewhat improved | Much improved | No response
Assessment    | 0     | 0     |  8 (20) | 11 (27) | 22 (54) | 0
Collaboration | 0     | 0     | 13 (32) |  9 (22) | 19 (46) | 0
Interviewing  | 0     | 0     |  6 (15) | 13 (32) | 22 (54) | 0
Decisions     | 2 (5) | 1 (2) |  5 (12) | 11 (27) | 22 (54) | 0
Documentation | 0     | 0     |  5 (12) | 18 (44) | 18 (44) | 0
Put Together  | 0     | 0     |  9 (22) | 10 (24) | 21 (51) | 1 (2)

(a) All data from final survey.


physician in a clear, concise manner and determine appropriate follow-up,” then followed by strong statements about their intention to use this new knowledge: “I will use this approach in practice to improve my contribution to patient care.” Though measurable goals were not always articulated in action plans, participants clearly desired to move forward with applying knowledge and skills in practice.

Struggles with making even small changes in practice were sometimes apparent. Though learners had become increasingly aware of the importance of certain skills, like physical assessment, they grappled with whether they should be performing these skills and were apprehensive about how people might react. One participant stated: “It was very uncomfortable for me [physical assessment] because pharmacy to my knowledge has never been considered a hands-on profession.” However, with time and practice, we observed more comments such as: “Now I have seen how physical assessments can easily be integrated into my practice, especially in small steps. I am willing to step outside my comfort zone.” Others similarly felt new skills were important but struggled with conceptualizing how to make time or change workflow to incorporate services such as comprehensive medication management or using evidence-based medicine approaches.

Practice change efforts appeared to be supported through peer encouragement in discussion boards and by positive patient and physician feedback. Participants often used discussion boards to share struggles and strategies to implement change. Here, three learners share strategies for addressing a common interviewing challenge:

I hope that by improving my listening skills, specifically the use of silence, slowing down, concentrating on what the patient is saying, listening to hear—not to respond, that better listening will help me in my practice to better understand patients’ medication experience….
Hi, I like your idea of silence—1–2–3 count—I tend to be a bit chatty sometimes myself, I think I’ll try this technique—& see if it helps me get more info from the patients.

Hi, that is a really neat trick you have to count to three before responding. I can imagine you have learned a great deal more information this way, and I might try it myself…

Common practice changes cited by participants during the program are outlined in Table 6. Participants expressed beliefs that changes to their interviewing approaches resulted in “more productive” interviews, thus improving their delivery of patient care and resulting in patients often expressing thanks. They indicated that changes in documentation approaches had improved their relationships with physicians, for example: “I have been using the SOAP format including my recommendations to fax requests.... They are responding much more quickly than before.”

Table 6
Common practice changes cited by participants
Using more structured approaches to patient interviewing (including incorporating “head-to-toe” and physical assessments)
Conducting more patient interviews
Using a systematic approach to medication assessment
More accurately identifying drug-related problems and developing care plans
Exploring collaborative practice with other health care professionals
Using a structured and more informative approach to documentation
Increasing efficiency in analyzing and applying evidence to guide recommendations

Discussion

The ADAPT online education program emphasized transfer of learning, with activities designed to have participants engage in authentic practice with guidance from experts, and by providing space for learning from each other and reflecting on knowledge and skills gained. Our results suggest this approach allowed participants to gain awareness of personal limitations, acquire knowledge and skills, change attitudes about the importance of skills, and develop personal confidence in performing them, and that this facilitated attempts to change behavior in practice. The following outlines lessons learned in terms of what worked and what did not, as well as limitations, implications, and suggested course improvements.

Activities that demonstrated skills, included at-work practice using program tools or templates, or emphasized peer interaction contributed strongly to satisfaction and learning. Polls, presentations, and video simulations helped increase awareness, reveal learning needs, and model new skills and approaches for participants. Both positive and negative modeling seemed to contribute to learning.
Such modeling allows learners to build a conceptual model of targeted outcomes, makes visible the processes required to accomplish outcomes, and, as a result, helps build learner confidence.11 Hearing how experts deal with problems is critical to learners developing a belief in their own capabilities; witnessing that experts struggle lets learners know that their struggles are neither unique nor a sign of incompetence.11 These results suggest that the cognitive apprenticeship model is effective for adult learners in a professional learning context. While undergraduate pharmacy programs have moved to include experiential learning (real-world practice situations) as a key curricular component28–30 to improve learning in the cognitive and affective domains, few continuing education programs do so. Having found that cognitive apprenticeship worked well within our e-learning approach, we propose that those responsible for constructing future e-learning programs give the model consideration.



Increased recognition of the importance of employing a patient-centered approach, interdisciplinary collaboration, and taking on a higher level of professional responsibility and accountability for the outcomes of patient care were key areas of pharmacist learning. These findings are also consistent with the benefits of experiential learning observed with pharmacy students.29 Discussion board feedback and reinforcement from peers also appeared to facilitate learning, underscoring the importance of the social dimension of learning. These findings echo work published by Hansson et al.,31 who found that online learning involving business students fosters ownership of course material, sense of community, and willingness to use learning in a new domain.

The discussion board design, which included a new board every few days and minimal instructions to moderators, seemed to result in “missed opportunities” for learning. These were felt to be situations where there was potential for learning that was not realized. Moderators did not always make connections between individuals’ isolated posts (e.g., that barriers to implementing care plans were similar across different practice settings), consistently provide enough detailed feedback, or routinely highlight excellent individual submissions for the entire cohort. Discussion boards were sometimes used only to submit work rather than to hold discourse, and there was variation in the amount and quality of commentary within posts, so depth seemed difficult to achieve in some conversations.

Limitations

The initial program evaluation did not follow up with participants after program completion to objectively determine how participation permanently changed their practice. However, data gathered through the initial evaluation provided substantial triangulated (convergent) self-reported evidence of concrete examples of behavior changes. Evaluation is ongoing to determine whether changes in practice have been sustained.
Most of the researchers involved in the evaluation were also involved in aspects of program conception and design, allowing for bias in interpretation of the results; therefore, responsibility for activity analysis was assigned to those not directly involved in content creation. Two researchers independently assessed and compared findings in group discussions, and research staff not involved in content creation or delivery were responsible for contributing to matrix development. The volunteer participants in the pilot were early adopters who were future-oriented and highly motivated to embrace change20; further work on the ADAPT program needs to consider how best to engage a wider spectrum of pharmacists in everyday clinical practice.

Implications

Evidence of satisfaction, learning, and attempts to change practice behavior, as well as participants’ positive reaction to modeling, real-world practice activities, and the social dimension of online learning, suggests that this mechanism can potentially be used to assist pharmacists in making the changes necessary to incorporate new patient care skills in practice. Pharmacy continuing educators should take note, focusing less on the development of transmission-oriented online learning programs and more on programs that incorporate principles of experiential learning and cognitive apprenticeship.

Course improvements

Subsequent course changes to improve satisfaction, learning process, knowledge acquisition, and application of content included a lengthened orientation module (with additional practice activities), added moderator training (using guided activities and a textbook resource),32 and a broader moderator role. The latter was modified to include time for expert guidance and feedback, including responding to participant questions and increased discussion board facilitation. A discussion board participation self-assessment tool was added to help participants build skills in effective discussion board use. Recommendations to address satisfaction and learning for those having difficulty with the “Making Decisions” module included more time for module completion (e.g., expanding from two to three weeks), engaging learners in an earlier discussion about the importance of evidence-based medicine approaches in practice to establish relevance, selecting evidence-based resources freely available to most learners, and reordering activities to focus on appraisal of clinical guidelines before randomized controlled trials and meta-analyses. Finally, a formal assessment method was recommended to facilitate evaluation of whether learners achieved expected course competencies.
Summary

This paper reports a systematic examination of an e-learning program designed to improve pharmacists’ skills and confidence in effectively managing medication therapy through collaborative patient-centered care. The ADAPT e-learning program used instructional strategies incorporating expert demonstration, authentic activities and resources, and feedback from peers and moderators, all of which emphasized the social dimension of learning in order to aid transfer of learning to practice. Pilot participants indicated high satisfaction with most program content, articulated perceived learning, increased confidence, and greater importance assigned to skills, and made attempts to improve practice skills. Feedback and evaluation results have been used to make changes in subsequent iterations to maximize learning. The accredited program is currently running nationally with an assessment component that allows granting of a Certificate in Patient Care Skills.33,34 Any pharmacist interested in upgrading and practically applying new patient care and collaboration skills will find this program beneficial.

The paper also presents an innovative and practical methodology for examining learner experiences in online education programs that differs from current methods of testing via information recall. The broad inclusiveness of materials collected and the systematic evaluation of such data should provide a feasible and detailed methodology that can be transferred to other program evaluations.

Acknowledgments

We gratefully acknowledge the input of Dr. Doug Archibald in reviewing this manuscript.

Appendix 1. Mid-point and final survey questions

Mid-point survey questions:

1. What is the single most important concept discussed in Modules 1–4 so far?
2. What single question would you have liked to ask the moderator?
3. What single change would have made Modules 1–4 better so far?
4. What single content/reading/activity has not been useful so far?
5. What changes have occurred to your practice as a result of what you’ve learned so far?
6. Please indicate your level of satisfaction with your learning so far (Modules 1, 2, 3, and 4)—from not satisfied to very satisfied.

Final survey questions:

1. What is the single most important concept discussed in Modules 5, 6, and 7?
2. What single question would you have liked to ask the moderator?
3. What single change would have made Modules 5, 6, and 7 better?
4. What single content/reading/activity in Modules 5, 6, and 7 has not been useful?
5. What changes have occurred to your practice as a result of what you’ve learned in Modules 5, 6, and 7?
6. Please indicate your level of satisfaction with your learning so far (for Modules 5, 6, and 7)—from not satisfied to very satisfied.
7. Please indicate whether you feel your confidence in your ability to competently perform the following skills (providing medication reviews, collaborating with health care providers, interviewing and assessing patients, making evidence-based clinical decisions, documenting care, developing and implementing care plans) has improved as a result of your participation in all 7 ADAPT online modules—from worse to much improved.
8. Please indicate whether you feel the importance you assign to these skills has changed as a result of your participation in all 7 ADAPT online modules—from less to much higher.

References

1. Task Force on a Blueprint for Pharmacy. Blueprint for Pharmacy: The Vision for Pharmacy. Ottawa, ON: Canadian Pharmacists Association; 2008. www.blueprintforpharmacy.ca. Accessed March 6, 2013.
2. Farrell B, Pottie K, Haydt S, Kennie N, Sellors C, Dolovich L. Integrating into family practice: the experiences of pharmacists in Ontario, Canada. Int J Pharm Pract. 2008;16:309–315.
3. Accreditation Council for Pharmacy Education. Accreditation Standards and Guidelines. Chicago, IL: Accreditation Council for Pharmacy Education; 2012. www.acpe-accredit.org. Accessed March 6, 2013.
4. Accreditation Standards and Guidelines for the First Professional Degree in Pharmacist Programs. Toronto, ON: Canadian Council for Accreditation of Pharmacy Programs; 2013. http://www.ccapp-accredit.ca/site/pdfs/university/CCAPP_accred_standards_degree_2006.pdf. Accessed March 6, 2013.
5. CAPE Educational Outcomes. Alexandria, VA: American Association of Colleges of Pharmacy; 2013. http://www.aacp.org/resources/education/cape/Pages/2004CAPEOutcomes.aspx. Accessed March 6, 2013.
6. Austin Z, Dolovich L, Lau E, et al. Teaching and assessing primary care skills in pharmacy: the family practice simulator model. Am J Pharm Educ. 2005;69:500–507.
7. Farrell B, Dolovich L, Austin Z, Sellors C. Mentoring pharmacists as they integrated into family practice: practical experience from the IMPACT project. Can Pharm J. 2010;143:28–36.
8. Pottie K, Haydt S, Farrell B, et al. Pharmacist’s identity development within multidisciplinary primary health care teams in Ontario: qualitative results from the IMPACT project. Res Social Adm Pharm. 2009;5:319–326.
9. Lave J, Wenger E. Situated Learning: Legitimate Peripheral Participation. Cambridge, UK: Cambridge University Press; 1991.
10. Collins A, Hawkins J, Carver S. A cognitive apprenticeship for disadvantaged students. In: Means C, Chelemer C, Knapp M, eds. Teaching Advanced Skills to Students at Risk. San Francisco, CA: Jossey-Bass; 1991:216–243.
11. Collins A, Brown J, Holum A. Cognitive apprenticeship: making things visible. Am Educ. 1991;15:6–11, 38–46.
12. Collins A. Cognitive apprenticeship. In: Sawyer R, ed. Cambridge Handbook of the Learning Sciences. Cambridge, UK: Cambridge University Press; 2006:47–60.
13. Kennie-Kaulbach N, Farrell B, Ward N, et al. Pharmacist provision of primary care: a modified Delphi validation of pharmacists’ competencies. BMC Fam Pract. 2012;13:27–45.
14. Benner P. Using the Dreyfus model of skill acquisition to describe and interpret skill acquisition and clinical judgement in nursing practice and education. Bull Sci Technol Soc. 2004;24:189–199.
15. OSCAR Electronic Medical Record. http://oscarmcmaster.org/?page_id=37; 2010. Accessed July 26, 2013.
16. Cipolle RJ, Strand LM, Morley PC. Pharmaceutical Care Practice: The Clinician’s Guide. 2nd ed. New York, NY: McGraw-Hill; 2004.
17. Winslade N, Bajcar J. Therapeutic thought process algorithm. http://www.napra.org/Content_Files/Files/algorithm.pdf; 1995.
18. Kennie N, Dolovich L. Reliability testing of a case-leveling framework for assigning level of difficulty of pharmacists’ initial patient medication assessments. J Am Pharm Assoc. 2008;48:640–647.
19. Creswell JW, Plano Clark VL. Designing and Conducting Mixed Methods Research. Thousand Oaks, CA: Sage Publications; 2007.
20. Jorgenson D, Gubbels A, Farrell B, Ward N, Dolovich L, Jennings B. Characteristics of pharmacists who enrolled in the pilot ADAPT education program: implications for practice change. Can Pharm J. 2012;145(6):260–263.
21. Gifford S. Analysis of non-numerical research. In: Kerr C, Taylor R, Heard G, eds. Handbook of Public Health Methods. Sydney, AU: McGraw-Hill Australia; 1998:543–554.
22. Borkan J. Immersion/crystallization. In: Crabtree BF, ed. Doing Qualitative Research. Thousand Oaks, CA: Sage Publications; 1999:179–194.
23. Schensul S, Schensul J, LeCompte M. Essential Ethnographic Methods: Observations, Interviews, and Questionnaires. Walnut Creek, CA: AltaMira Press; 1999.
24. Barley S. Images of imaging: notes on doing longitudinal fieldwork. In: Huber G, Van de Ven A, eds. Longitudinal Field Research Methods: Studying Processes of Organizational Change. Thousand Oaks, CA: Sage; 1995:1–37.
25. Miles MB, Huberman AM. Qualitative Data Analysis: An Expanded Sourcebook. Thousand Oaks, CA: Sage Publications; 1994.
26. Kirkpatrick D, Kirkpatrick J. Evaluating Training Programs: The Four Levels. 3rd ed. San Francisco, CA: Berrett-Koehler Publishers; 1994.
27. Alliger GM, Tannenbaum SI, Bennett W Jr, Traver H, Shotland A. A meta-analysis of the relations among training criteria. Personnel Psychol. 1997;50:341–358.
28. Katajavuori N, Lindblom-Ylanne S, Hirvonen J. The significance of practical training in linking theoretical studies with practice. High Educ. 2006;51:439–464.
29. Plake KS, Wolfgang AP. Impact of experiential education on pharmacy students’ perception of health roles. Am J Pharm Educ. 1996;60:13–19.
30. Turner CJ, Jarvis C, Altiere R. A patient focused and outcomes-based experiential course for first year pharmacy students. Am J Pharm Educ. 2000;64:312–319.
31. Hansson A, Friberg F, Segesten K, Gedda B, Mattsson B. Two sides of the coin—general practitioners’ experience of working in multidisciplinary teams. J Interprof Care. 2008;22:5–16.
32. Collison G, Elbaum B, Haavind S. Facilitating Online Learning: Effective Strategies for Moderators. Madison, WI: Atwood Publishing; 2000.
33. ADAPT patient care skills development. http://www.pharmacists.ca/index.cfm/education-practice-resources/professional-development/adapt/; 2012. Accessed July 26, 2013.
34. Canadian Council on Continuing Education in Pharmacy. Policy on the Accreditation of Continuing Education Certificate Programs. Saskatoon, SK: Canadian Council on Continuing Education in Pharmacy; 2012. Accessed March 6, 2013.