Getting Research Into Practice: Which Strategies Work?

Shannon D. Scott, RN, PhD

Shannon D. Scott, RN, PhD, is an assistant professor in the Faculty of Nursing, University of Alberta, Edmonton, Alberta, Canada. Address correspondence to: shannon.scott@nurs.ualberta.ca. DOI: 10.1111/j.1751-486X.2008.00324.x


Each year, an enormous number of scientific discoveries are made in health care. These discoveries may stem from rigorous, systematic research or from careful reflection on and evaluation of everyday practice. Yet only a small proportion of these discoveries are applied in everyday clinical practice. Despite the considerable amount of money spent on health care research and societal pressure for “best practices,” until recently little attention has been paid to ensuring that the findings of research are actually implemented in routine clinical practice. Demonstrating the magnitude of this gap, recent evidence suggests that research findings may take one to two decades to be translated into standard health care practice (Sussman, Valente, Rohrbach, Skara, & Pentz, 2006). In this column, we tackle this issue and attempt to shorten the time between the creation of research knowledge and the implementation of best evidence in clinical practice.

Up to this point in the Evidence and Outcomes column, we’ve been discussing the challenges and factors related to putting research into our clinical practices. We’ve discussed the implementation of research in relation to the Promoting Action on Research Implementation in Health Services (PARIHS) framework (Kitson, Harvey, & McCormack, 1998; Rycroft-Malone, 2004; Rycroft-Malone et al., 2002). The framework suggests that successful implementation of evidence into clinical practice is a function of three elements: (1) the evidence to be put into practice, (2) the environment where the research is going to be implemented, and (3) the type and extent of facilitation used to help the implementation process. While these three elements are essential to the implementation process, individuals charged with implementing research often seek advice about which strategies have been used in the past and which ones are effective.

Strategies for Implementing Research in Practice

The literature describes many different strategies for getting research into clinical practice. These strategies vary greatly, from sending printed materials to health care professionals to giving financial rewards for achieving the desired behavior. Many of them, such as educational materials, conferences and workshops, require little explanation; the focus of others is less obvious and warrants definition. Local consensus processes include the participating providers in discussions to ensure that they agree with the approach to managing the particular clinical issue. Outreach visits use trained individuals who meet with providers in their clinical practice venues to provide information. Audit and feedback involve providing information on clinical performance over a specified period of time (Grol & Wensing, 2005). Local opinion leaders are clinical providers nominated by their colleagues because of their ability to influence others; the process through which opinion leaders are recruited should be explicitly stated. Reminders are interventions, manual or computerized, that prompt the health care provider to perform a clinical action (Grol & Wensing). Organizational interventions involve a change in the structure or delivery of health care, that is, who delivers health care, how care is organized or where care is delivered (Cochrane Effective Practice and Organization of Care Group, 2008). Multiple interventions are the use of more than one strategy at a time.

The body of research that we discuss goes by several terms, including knowledge translation, research implementation and research utilization. The similarities and differences among these terms aside, the essence and assumptions underpinning these movements are the same: it’s believed that implementing the best research evidence will result in improved patient outcomes.

Systematic Reviews

Five key systematic reviews (Bero et al., 1998; Foxcroft & Cole, 2000; Grimshaw et al., 2001; Oxman, Thomson, Davis, & Haynes, 1995; Thompson, Estabrooks, Scott-Findlay, Moore, & Wallin, 2007) illustrate potential strategies for getting research into practice. Reviewing the results of systematic reviews is a useful way to acquire this knowledge. Systematic reviews can help clinicians and decision-makers keep abreast of the scientific literature by summarizing large bodies of evidence and helping to explain differences among studies that focus on the same question. The systematic review is itself a scientific method: conducting one involves the application of scientific strategies to the assembly, critical appraisal and synthesis of all relevant studies that address a specific question (Cook, Mulrow, & Haynes, 1997). Put simply, systematic reviews are a cornerstone of the implementation of research into clinical practice because they render the vast amount of scientific literature useful to decision-makers. It’s important to note that only two of the systematic reviews we discuss (Foxcroft & Cole; Thompson et al.) deal exclusively with nurses.


Applying Findings to Nursing Practice

Findings from reviews that are not specific to nursing must be generalized cautiously; nevertheless, useful lessons can be drawn from them. While health care professionals experience similar challenges in incorporating evidence, there are differences that shape how each group implements research in its practice. Two key factors are at the heart of these differences. First, the nature and structure of work differ among the health care professions. In nursing, the norm tends to be continuous patient care over a short period of time, whereas physicians tend to have more episodic contact with their patients but over a longer period of time. Second, each health care discipline has a unique social structure. For instance, nurses typically work in hierarchical structures as salaried employees, whereas physicians tend to work in more autonomous roles, and many are reimbursed through mechanisms other than salary. These differences can shape whether and how research is used in clinical practice.

Which Strategies Are Most Effective?



Thompson et al. (2007) explored studies that aimed to increase research use in nursing. While their literature search generated more than 8,000 titles, only four studies met the inclusion and exclusion criteria. The strategies evaluated were researcher-led educational meetings (ineffective in two studies), educational meetings led by a local opinion leader (effective at increasing research use in one study) and the formation of multidisciplinary committees to facilitate research use (effective at increasing research use in one study). The authors concluded that little is known about how to increase research use in nursing and that there is currently a lack of evidence to either support or refute the effectiveness of specific interventions.

Foxcroft and Cole (2000) conducted a Cochrane review (a type of systematic review) of the nursing literature to understand the extent to which organizational infrastructures (developments in the structure or delivery of health care) are effective at promoting evidence-based nursing practice. They were unable to identify any studies with methods rigorous enough to be included in the review; when the research design criteria were relaxed, however, seven case studies were identified. These case studies reported positive outcomes from the organizational infrastructure interventions. The interventions evaluated included developments such as consortia, nursing research and professional development divisions, the appointment of clinical chairs, reward systems for research productivity and the creation of nursing research committees. The outcomes assessed included increased awareness of nursing research, increased participation in nursing research, enhanced research productivity, research dollars awarded, publication counts, proportions of nurses with higher education, self-reported orientation to research and self-reported acquisition of new knowledge.

Oxman et al. (1995) explored the literature to determine the effectiveness of different types of interventions in improving health professional performance and health outcomes (e.g., general patient management, preventive services, diagnostic services or hospital utilization). Their review suggested that when dissemination-only strategies were used alone, there was little or no change in health professional behavior. However, the results were less clear and consistent for more complex strategies to change health care professionals’ behavior. For instance, outreach visits and local opinion leaders ranged from ineffective to highly effective (although an overall moderate effect was realized). Educational materials, didactic conferences, rounds and workshops failed to demonstrate change in performance or health outcomes. It’s important to note that more interactive interventions, such as outreach visits, tended to be more effective.

Next, Grimshaw et al. (2001) comprehensively explored 41 systematic reviews of strategies for changing professional behavior. They reported that passive dissemination strategies are generally ineffective, whereas reminders and educational outreach are more effective approaches to changing health professionals’ behavior. At the time of their review, these authors suggested that multiple strategies were more effective than single strategies for changing behavior; however, later work by Grimshaw and colleagues has called this assumption into question, suggesting that there is no relationship between the number of interventions used (multiple interventions) and effectiveness (effect size).

Finally, in 1998, Bero et al. explored systematic reviews of interventions to promote the implementation of research findings into clinical practice. They reviewed 18 systematic reviews on topics such as dissemination and implementation of clinical practice guidelines, continuing medical education, audit and feedback, computerized decision support systems and multifaceted interventions. They found that educational outreach visits, reminders, multifaceted interventions and interactive educational meetings were consistently effective strategies for getting research findings implemented. Educational materials and didactic educational meetings had little or no effect, and variable effectiveness was found with audit and feedback, local consensus processes, local opinion leaders and patient-mediated interventions.
Conclusions

When we consider the results of these reviews as a whole, several themes emerge about potentially promising approaches for getting research into clinical practice. First, passive approaches, such as the use of printed materials, do little to change health professionals’ use of research. More active or interactive approaches, however, are promising. Systematic efforts to promote evidence-based nursing practice are crucial if the benefits of research-informed practice are to become widespread in health care. It’s essential that nurse leaders and decision-makers respond at the micro- (individual), meso- (unit and organizational) and macro- (system and regional) levels, with strategies to facilitate the development of infrastructures that help health professionals provide evidence-based care. NWH

References

Bero, L., Grilli, R., Grimshaw, J., Harvey, E., Oxman, A., & Thomson, M. (1998). Closing the gap between research and practice: An overview of systematic reviews of interventions to promote the implementation of research findings. British Medical Journal, 317, 465–468.

Cochrane Effective Practice and Organization of Care Group. (2008). EPOC Web page. Retrieved March 24, 2008, from http://www.epoc.cochrane.org/en/index.html

Cook, D., Mulrow, C., & Haynes, R. (1997). Systematic reviews: Synthesis of best evidence for clinical decisions. Annals of Internal Medicine, 126, 376–380.

Foxcroft, D., & Cole, N. (2000). Organisational infrastructures to promote evidence-based nursing practice. Cochrane Database of Systematic Reviews, 3. Art. No.: CD002212. DOI: 10.1002/14651858.

Grimshaw, J., Shirran, L., Thomas, R., Mowatt, G., Fraser, C., Bero, L., et al. (2001). Changing provider behavior: An overview of systematic reviews of interventions. Medical Care, 39(Suppl. 2), II2–II45.

Grol, R., & Wensing, M. (2005). Selection of strategies. In R. Grol, M. Wensing, & M. Eccles (Eds.), Improving patient care: The implementation of change in clinical practice (pp. 122–134). New York: Elsevier.

Kitson, A., Harvey, G., & McCormack, B. (1998). Approaches to implementing research in practice. Quality in Health Care, 7, 149–159.

Oxman, A. D., Thomson, M. A., Davis, D. A., & Haynes, B. (1995). No magic bullets: A systematic review of 102 trials of interventions to improve professional practice. Canadian Medical Association Journal, 153, 1423–1431.

Rycroft-Malone, J. (2004). The PARIHS Framework—A framework for guiding the implementation of evidence-based practice. Journal of Nursing Care Quality, 19, 297–304.

Rycroft-Malone, J., Kitson, A., Harvey, G., McCormack, B., Seers, K., Titchen, A., et al. (2002). Ingredients for change: Revisiting a conceptual framework. Quality and Safety in Health Care, 11, 174–180.

Sussman, S., Valente, T., Rohrbach, L., Skara, S., & Pentz, M. (2006). Translation in the health professions: Converting science into action. Evaluation & the Health Professions, 29, 7–32.

Thompson, D., Estabrooks, C., Scott-Findlay, S., Moore, K., & Wallin, L. (2007). Interventions aimed at increasing research use in nursing: A systematic review. Implementation Science, 2, 15.
