A Quality Improvement Evaluation Case Study

Impact on Public Health Outcomes and Agency Culture

William C. Livingood, PhD, Radwan Sabbagh, MD, MPH, Steve Spitzfaden, MS, Angela Hicks, RN, Lucy Wells, MAOM, Suzannah Puigdomenech, Dale F. Kramer, PhD, Ryan Butterfield, MPH, William Riley, PhD, David L. Wood, MD, MPH

Background: Quality improvement (QI) is increasingly recognized as an important strategy to improve healthcare services and health outcomes, including reducing health disparities. However, there is a paucity of evidence documenting the value of QI to public health agencies and services.

Purpose: The purpose of this project was to support a QI project to increase immunization rates among children aged 2 years (4:3:1:3:3:1 series) within a large public health agency with a major pediatric health mission, and to assess its impact on outcomes and organizational culture.

Methods: The intervention consisted of a model-for-improvement approach to QI for the delivery of immunization services in public health clinics, utilizing plan-do-study-act cycles and multiple QI techniques. A mixed-method (qualitative and quantitative) model of evaluation was used to collect and analyze data from June 2009 to July 2011 to support both summative and developmental evaluation. The Florida Immunization Registry (Florida SHOTS [State Health Online Tracking System]) was used to monitor and analyze changes in immunization rates from January 2009 to July 2012. An interrupted time-series analysis of covariance was used to assess the significance of the change in immunization rates, and paired comparisons using parametric and nonparametric statistics were used to assess the significance of pre- to post-QI changes in culture items.

Results: Up-to-date immunization rates increased from 75% to more than 90% for individual primary care clinics and the overall county health department. In addition, QI stakeholder scores on ten key items related to organizational culture increased from pre- to post-QI intervention. Statistical analysis confirmed the significance of the changes.

Conclusions: The application of QI, combined with summative and developmental evaluation, supported refinement of the QI approach and documented the potential for QI to improve population health outcomes and public health agency culture. (Am J Prev Med 2013;44(5):445–452) © 2013 American Journal of Preventive Medicine. Published by Elsevier Inc.

Introduction

Quality improvement (QI) approaches and methods are increasingly being recognized as critical strategies for reducing health disparities and increasing the effectiveness of healthcare systems.1

From the Center for Health Equity & Quality Research (Livingood, Kramer, Butterfield, Wood), Department of Pediatrics (Wood), Department of Neurology (Kramer), University of Florida College of Medicine–Jacksonville; Duval County Health Department (Livingood, Sabbagh, Spitzfaden, Hicks, Wells), Bureau of Immunization (Puigdomenech), Florida Department of Health, Jacksonville, Florida; JPHsu College of Public Health (Livingood, Butterfield), Georgia Southern University, Statesboro, Georgia; and College of Public Health (Riley), University of Minnesota, Minneapolis, Minnesota. Address correspondence to: William C. Livingood, PhD, University of Florida, College of Medicine–Jacksonville, CHEQR, Tower II, 6th Floor, Suite 6015, Jacksonville FL 32209. E-mail [email protected]. 0749-3797/$36.00 http://dx.doi.org/10.1016/j.amepre.2013.01.011

The utility of QI has been demonstrated and reported for health care over several decades,2,3 but QI applications for public health or population health are a relatively recently reported phenomenon, with few documented effects.4 However, the use of QI for public health is increasingly being recognized for its potential to address many challenges, including the need to document success, improve performance, and increase effectiveness as the public sector is stressed by new and emerging threats amid dwindling resources.5–16

Immunizations are particularly relevant for public health QI applications. Immunizations have been credited with major increases in life expectancy and improvements in child survival during the 20th century, and they are recognized as one of the greatest public health accomplishments of that century.17 Healthy People 2020 reports18 that for each birth cohort vaccinated with the routine immunization schedule, 33,000 lives are saved, 14 million cases of disease are prevented, $9.9 billion is saved in direct healthcare costs, and $33.4 billion is saved in indirect costs. However, approximately 42,000 adults and 300 children in the U.S. die each year from vaccine-preventable diseases.18

With clearly identified evidence-based methods for increasing immunization rates,19–21 but less-than-optimal rates at state and local levels,22 it would appear that more-optimal immunization rates could be achieved through QI efforts to increase and improve implementation of well-documented evidence-based approaches. However, multicomponent immunization interventions are rarely tested as QI interventions, especially among disadvantaged populations.23

The Duval County Health Department (DCHD) is the largest county health department pediatric Medicaid service provider in Florida, but it serves a community with consistently low immunization rates and had consistently reported low immunization rates for its own clinics; it selected immunization as the focus of a Robert Wood Johnson Foundation (RWJF) program for building the evidence for QI in public health. Duval County has the highest proportionate African-American population of the larger counties in Florida (>30%), and the clinics are located in proximity to parts of the county with high proportions of low-income households. All seven DCHD primary care clinics and the dedicated immunization clinic participated in the QI during the course of this project.

The public health outcome goal for DCHD's RWJF-supported QI project was to increase the immunization rates for children aged 2 years (4:3:1:3:3:1 series) in the DCHD clinics from 75% to 90%. This goal was to be achieved through two major activities: (1) a QI initiative focused on the immunization rates for children aged 2 years and (2) summative and formative evaluation of the QI immunization project.

Methods

Quality Improvement Intervention
The Model for Improvement24 was selected as the QI approach to increase immunization rates; it involves the use of plan-do-study-act (PDSA) cycles for QI. Six formal PDSA cycles were completed, in addition to other less-formal rapid-cycle PDSAs, during the QI evaluation project, which was implemented from July 1, 2009, to June 30, 2011. The performance measures for monitoring the up-to-date immunization rates for children aged 19–35 months (four diphtheria, tetanus, and pertussis [DTaP]; three polio; one measles, mumps, and rubella [MMR]; three Haemophilus influenzae type B [HIB]; three hepatitis; one varicella) were rates generated from Florida SHOTS (State Health Online Tracking System), the state immunization registry. Immunization rates continued to be monitored following the end of the project. Table 1 displays the populations of children aged 2 years and the up-to-date immunization rates for each of the seven primary care clinics and the dedicated (drop-in) clinic. More-formal PDSA cycles included the enterprise PDSA, reminder/recall data review, improved communication with parents, and the missed-opportunity rapid-cycle PDSA (Appendix A, available online at www.ajpmonline.org). The pilot clinic was identified during the first month (July 2009), and the plan phase of the first PDSA cycle commenced in the pilot clinic in August 2009. Active engagement of the enterprise-level QI team in enterprise-level PDSA began in February 2010. QI techniques included the use

Table 1. Immunization up-to-date percentages for children aged 2 years (24–35 months) for the 4:3:1:3:3:1 series

DCHD clinics                  | July 2009 % (n) | July 2010 % (n) | July 2011 % (n) | July 2012 % (n)
DCHD totals                   | 75 (1509)       | 82 (1504)       | 90 (1369)       | 91 (1333)
Primary care clinic A         | 88 (41)         | 92 (50)         | 95 (75)         | 91 (67)
Primary care clinic B         | 80 (180)        | 79 (176)        | 90 (152)        | 94 (128)
Primary care clinic C         | 68 (349)        | 81 (399)        | 89 (364)        | 91 (394)
Pilot primary care clinic     | 66 (109)        | 89 (116)        | 91 (124)        | 93 (99)
Primary care clinic D         | 82 (220)        | 82 (192)        | 92 (180)        | 95 (195)
Dedicated immunization clinic | 73 (134)        | 77 (86)         | 82 (92)         | 88 (104)
Primary care clinic E         | 73 (278)        | 80 (270)        | 90 (213)        | 89 (204)
Primary care clinic F         | 79 (198)        | 84 (215)        | 88 (169)        | 90 (141)

Note: n shows the total number of children aged 2 years. The 4:3:1:3:3:1 series consists of the following: 4 diphtheria, tetanus, and pertussis (DTaP); 3 polio; 1 measles, mumps, and rubella (MMR); 3 Haemophilus influenzae type B (HIB); 3 hepatitis; and 1 Varicella zoster virus series. DCHD, Duval County Health Department (Florida)
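The up-to-date percentages in Table 1 are simple proportions over each clinic's population of 2-year-olds. A minimal sketch of that calculation (toy records and hypothetical field names, not the SHOTS extract):

```python
# Toy child-level records: (clinic, up_to_date) pairs, where the flag marks
# completion of the full 4:3:1:3:3:1 series -- illustrative data only.
records = [
    ("A", True), ("A", True), ("A", False), ("A", True),
    ("B", True), ("B", False),
]

def up_to_date_rates(records):
    """Return {clinic: (percent_up_to_date, n)}, rounded as in Table 1."""
    totals = {}
    for clinic, utd in records:
        done, n = totals.get(clinic, (0, 0))
        totals[clinic] = (done + utd, n + 1)
    return {c: (round(100 * done / n), n) for c, (done, n) in totals.items()}

rates = up_to_date_rates(records)
```

For clinic A this yields 75% of n=4, the same form of figure reported for each clinic and for the DCHD totals.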



of root cause analysis, Pareto charts, fishbone diagrams, variations of control charts,28 and data displays and process maps.29 Diffusion of the rapid-cycle PDSA to all clinics, with a focus on "missed opportunities," began with process mapping and training in Fall 2010, although pilot clinic initiatives were shared and replicated throughout 2010, as were enterprise-level organizational policy changes.

Evaluation Methods
The primary evaluation question was, Were the QI processes effective in improving immunization rates? Secondary evaluation questions included (1) What factors contributed to success of the QI initiative? (2) What factors impeded success of the QI initiative? and (3) What were the lessons learned for public health systems and for the DCHD? Another (emergent or grounded) evaluation question arose as the QI "culture" emerged as a critical issue. The emergent question was, Did the QI process change the organizational culture reflected in key characteristics of QI practice during normal operations?

Evaluation was intended to be both summative and formative. Summative evaluation focused on the primary research question and potential lessons learned for DCHD and possibly broader public health systems. The formative evaluation focused on using findings as data became available, to inform decision making for QI, similar to the concepts of developmental evaluation.30,31 In this context, formative/developmental evaluation is highly complementary to the data-driven decision making of QI, as both emphasize actively using data and results to improve practice during the intervention rather than maintaining fidelity to a predefined intervention.32 A mixed-method33,34 (quantitative and qualitative) research design was used to accomplish the multiple roles (summative, formative, and developmental) of evaluation.

Sample
Immunization records for all children eligible for the 2-year immunization who utilized any of the eight DCHD clinics were included in the calculation of up-to-date immunization rates. All DCHD clinics providing immunizations were included in QI interventions, with one clinic serving as a pilot QI project. QI culture survey data were collected from clinic staff participating on QI teams and DCHD leadership involved with enterprise-level QI.

Quantitative Data Collection
The Florida Immunization Registry (SHOTS) was the primary source for monitoring immunization rates and providing feedback on the impact of the QI project. For consistency, data were systematically extracted monthly (July 2009–July 2012) on the 4th day of the month, or the first weekday after the 4th if the 4th fell on a holiday or weekend. Immunization rates for an additional 5 months before the beginning of the project (starting January 1) were obtained from the registry retrospectively. In addition to immunization rates, SHOTS provided data for Reminder/Recall and, combined with the clinical record system, provided data on Missed Opportunities.

As the culture of QI became a clear issue from qualitative data early in the project, and with recognition of the limitations of qualitative methods in quantifying and assessing significance of the culture change, QI culture instrumentation was added as an exploratory research approach to complement instrumentation development that was already underway with practice-based QI research in Georgia.35 This instrument was concerned less with individual QI techniques, and was focused instead on underlying principles of quality improvement such as data-driven decision making, involvement of everyone in ownership of problems and solutions, and emphasis on positive, constructive improvement of process rather than blaming individuals. The ten items selected from Schouten's QI Culture Assessment Instrument36 reflected organizational characteristics associated with a QI culture, and pre–post perceptions of these characteristics were collected retrospectively through paper surveys in March through May 2011. The surveys were distributed to senior leadership (n=12) and clinical staff (n=28) identified by QI staff and QI immunization champions as having roles in the immunization QI project; the response rate was 90%.

Quantitative Analysis
Immunization rates of children aged 2 years from January 2009 to July 2012, from SHOTS-based DCHD-aggregated clinics and individual clinics, were the primary tool for monitoring and assessing impact of this QI project. A mixed-model, repeated-measures ANCOVA model was fit to these immunization data to assess the significance of changes over time before and after the implementation of the QI program (interrupted time-series design37). Because the full-scale implementation of the QI program across all clinics occurred during 2010, vaccination rates for that year were excluded from the analysis, creating an interrupted time series. Three autocorrelation covariance structures were fit to these data, and the optimal structure was determined using Akaike's corrected information criterion; the optimal covariance structure in this case was heterogeneous first-order autoregressive correlations. The outcome variable for the ANCOVA model was rate of vaccination, and the predictors were month (starting with January 2009 as Month 1 and ending with July 2012 as Month 43); clinic (eight clinics included); intervention ("before" intervention for 2009 and "after" intervention for 2011 and 2012); and the interaction of month and intervention. The year 2010, when the full QI project was implemented, was the time frame for the interrupted period. This ANCOVA model considers a trend over time that can shift and/or change slope (with the interaction of month and intervention) after the intervention commences. Given a significant difference among the clinics, all possible pairwise comparisons of clinic means were evaluated using differences in least squares means, applying Tukey-Kramer adjusted significance levels to control for multiple comparisons.
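The authors fit this model in a mixed-model framework with a heterogeneous AR(1) covariance structure across the eight clinics. As a simplified sketch of the same segmented-regression idea (a single aggregated series, ordinary least squares, and synthetic data standing in for the SHOTS extract), the interrupted time-series design might look like:

```python
import numpy as np

# Months 1-12 = pre-intervention year (2009); months 25-43 = post years
# (2011-2012); the 2010 implementation year (months 13-24) is excluded,
# mirroring the paper's interrupted period.
pre_months = np.arange(1, 13)
post_months = np.arange(25, 44)
months = np.concatenate([pre_months, post_months]).astype(float)
post = np.concatenate([np.zeros(12), np.ones(19)])

def fit_its(rates, months, post):
    """OLS fit of rate = b0 + b1*month + b2*post + b3*(month*post)."""
    X = np.column_stack([np.ones_like(months), months, post, months * post])
    coef, *_ = np.linalg.lstsq(X, rates, rcond=None)
    return coef

# Synthetic rates generated from coefficients close to those reported in the
# Results (pre-slope 0.00048/month, level shift 0.1412, slope change -0.0047).
true = np.array([0.75, 0.00048, 0.1412, -0.0047])
X = np.column_stack([np.ones_like(months), months, post, months * post])
b0, b1, b2, b3 = fit_its(X @ true, months, post)
```

On noiseless synthetic data the fit recovers the generating coefficients; with real registry extracts one would add the clinic factor and an autocorrelation-aware estimator, as the authors did.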

Analysis of Quality Improvement Culture
QI culture38 was evaluated using parametric and nonparametric matched-pair comparison tests (paired-sample t-test, sign test, and Wilcoxon signed-rank test) to compare pre- and post-intervention perceptions of QI culture on selected items adapted from the Schouten QI Collaboratives Instrument.36
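As an illustrative sketch of the three matched-pair tests (hypothetical Likert ratings, not the study data), using scipy for the t-test and Wilcoxon test and an exact binomial calculation for the sign test:

```python
import numpy as np
from math import comb
from scipy import stats

# Hypothetical paired Likert ratings (1-5) from the same respondents before
# and after the QI project -- illustrative values only, not the study data.
pre  = np.array([3, 4, 3, 4, 3, 2, 4, 3, 3, 4, 3, 4])
post = np.array([4, 4, 4, 5, 4, 3, 5, 5, 4, 5, 4, 4])
diff = post - pre

t_stat, t_p = stats.ttest_rel(post, pre)   # paired-sample t-test
w_stat, w_p = stats.wilcoxon(post, pre)    # Wilcoxon signed-rank test

# Exact two-sided sign test: drop zero differences, then double the smaller
# binomial tail probability under p = 0.5.
nonzero = diff[diff != 0]
k, n = int((nonzero > 0).sum()), nonzero.size
tail = sum(comb(n, i) for i in range(min(k, n - k) + 1)) / 2**n
sign_p = min(1.0, 2 * tail)
```

All three p-values fall well below 0.05 for these illustrative data, mirroring the pattern of significant pre–post differences reported in Table 2.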

Qualitative Data Collection
Qualitative data were obtained from three sources from July 2009 through July 2011: observation of QI meetings, interviews, and archival data. Evaluation staff observed the group dynamics, group interaction, and the processes of implementing QI initiatives, including pilot clinic team meetings, enterprise team meetings, and trainings. Observers used semistructured formats for notes that included group dynamics, communication patterns, and content. Semistructured interviews were performed by evaluation staff to obtain employees' perspectives of the QI process. Interviews were conducted with key DCHD stakeholders, pilot team members, QI technical staff, and clinic quality-assurance teams. Plan-do-study-act materials and results used to improve immunization rates, compiled by the Quality Improvement Office, were major parts of the archival records. Notes from the weekly evaluation workgroup meetings that were conducted between the evaluation workgroup and the QI project facilitators also were archived. Notes of these meetings, which included discussion of PDSA activities reported by the QI Office staff, reports of evaluation staff observations, and emerging issues and progress in implementing the QI initiatives, were summarized and stored electronically.

Qualitative Data Analysis
Three analytic approaches39 were applied: (1) theory-driven thematic analysis40 using communication and group process theory to guide observations and analysis of data; (2) data-driven (grounded theory) thematic analysis41 based on themes that emerged from the data; and (3) interpretive phenomenological analysis,42 concerned with the meanings that those experiences hold for the participants, facilitated by the participatory approach of involving QI staff, immunization program staff, and clinical staff with evaluators in reviewing and assessing observations, interview results, archival data, and immunization data. Analysis and interpretation of qualitative data was an ongoing participatory process beginning in August 2009 and continuing through August 2011, with more limited summative analysis by the current authors continuing through July 2012.

Results
Tracking and reporting of immunization rates for each of the public health clinics and the aggregated county health department clinics documented consistent improvement from the beginning of the project (Figure 1). The ANCOVA model yielded a trend for month (slope=0.000480, p<0.0001); a difference among the DCHD clinics (p=0.0442); an interrupted interval effect (an increase of 0.1412, p=0.0002); and a month-by-interval interaction (slope=-0.0047, p=0.0015) reflecting a declining month-by-month slope after the primary implementation year. Although there was a significant difference among the clinics overall, pairing of clinics for analysis did not yield significant differences for any pair when adjusted for multiple comparisons.

[Figure 1. DCHD primary care children aged 2 years who completed the 4:3:1:3:3:1 immunization series, January 2009–July 2012. The line chart plots monthly up-to-date percentages (50%–100%) for all DCHD clinics and for the pilot clinic, with the 90% target, the pilot period, and the full QI-evaluation implementation period marked within the funded total QI-evaluation project period. Note: n=1299–1744 children aged 2 years enrolled per month; mean population of children aged 2 years at the beginning of each month=1493. DCHD, Duval County Health Department (Florida); eval, evaluation; QI, quality improvement]

Changes in outcomes were complemented by perceived changes in QI culture, which was assessed with a retrospective pre–post assessment using selected Schouten QI Culture items. The selected items and the pre–post comparison of QI culture assessment are displayed in Table 2. All differences between perceived pre- and post-QI culture characteristics were significant.

Qualitative methods yielded important insights about both the challenges and supports for implementing QI within DCHD. Appendix B (available online at www.ajpmonline.org) summarizes selected challenges and supports. Many of the challenges were overcome through a combined QI and formative/developmental evaluation approach that included the use of (1) both qualitative and quantitative data to inform problem solving and (2) a participatory evaluation design, which complemented QI principles of engagement, particularly for the formative evaluation approach. This participatory evaluation approach/design was not originally planned, but it substantially informed the interpretation of data and findings. It involved key stakeholders in the project and reinforced the organization's prioritization of this effort. Most importantly, the barriers and challenges were identified as problems and became a source of formative feedback used to modify and refocus QI efforts. One example of how QI PDSA was integrated with mixed-method evaluation feedback is the weekly meetings of evaluators, QI TA staff, QI primary care staff, and immunization staff to discuss progress and review data and information.

The practice of delaying immunizations to encourage families to bring children back for scheduled visits was seen as complementing the medical home concept. When this practice was identified during the QI-evaluation meetings, the leadership quickly clarified agency policy that all immunizations should be given at the earliest opportunity within CDC guidelines. When it became apparent that the results of root cause analysis focused more on external factors outside of the control of the QI teams, the evaluation team worked with the QI team using logic models to integrate evidence-based approaches with root cause analysis. This was particularly

important to focus the QI efforts on evidence-based interventions that primary care staff could use to improve their performance, rather than finding fault with others for low performance.

Table 2. Pre–post comparison of organizational QI culture (n=36)

Item                                                                                      | Pre mean(a) | Post mean(a) | SD    | t-test p | Sign-rank p
1. My unit supports goals and activities for quality improvement.                         | 3.629       | 4.286        | 1.207 | 0.0002   | <0.0001
2. Management prioritizes success for quality improvement.                                | 3.8         | 4.229        | 1.236 | 0.0168   | 0.0089
3. Members of my unit were directly involved in changes for quality improvement.          | 3.8         | 4.143        | 1.198 | 0.0437   | 0.0359
4. Members of my unit are motivated in implementing changes for quality improvement.      | 3.629       | 4.057        | 1.117 | 0.0053   | 0.0068
5. Members of my unit are motivated in implementing changes for quality improvement.      | 3.429       | 4            | 1.027 | <0.0001  | <0.0001
6. Goals are readily measurable for quality improvement.                                  | 3.371       | 4.2          | 1.091 | <0.0001  | <0.0001
7. My unit uses measurements to plan changes for quality improvement.                     | 3.857       | 4.257        | 0.645 | 0.0009   | 0.0010
8. My unit considers continuous improvement as part of working process.                   | 3.857       | 4.257        | 0.645 | 0.0009   | 0.0010
9. My unit tracks progress continuously.                                                  | 3.571       | 4.229        | 0.990 | 0.0004   | 0.0001
10. Information, ideas, and suggestions are actively exchanged for quality improvement.   | 3.629       | 4.171        | 0.810 | 0.0004   | 0.0004

(a) Range of scores: 1 (strongly disagree) to 5 (strongly agree).

Discussion
The DCHD previously had success with using QI techniques to improve revenue and other administrative performance measures. This project was the first major, large-scale, systematic organizational QI initiative within DCHD to improve performance on a public health outcome. The use of QI resulted in what would be considered breakthrough improvement in immunization rates. DCHD achieved a 91% up-to-date overall rate, and all but one of the specific clinics achieved at least a 90% up-to-date immunization rate for the 4:3:1:3:3:1 series. The one sub-90% clinic achieved an 88% up-to-date immunization rate for the 4:3:1:3:3:1 series, a major improvement from its consistently lower immunization rates before the QI project.

The organization has greatly expanded the use of QI initiatives (from zero requested QI initiatives during the year before the project to 13 in the last calendar year of the project). Specific requests were for more-formal PDSA cycles, following and in conjunction with the reporting of improvements in performance measures related to immunizations. The intraorganizational approach of multiple-division collaboration on a common public health issue also has been expanded.

The developmental evaluation findings focused on monitoring and using performance measures to inform decision making for quality improvement as data became available. In addition to the formative and developmental evaluation approach, the various evaluation methods were not focused on assessing the impact of a specific QI technique or a specific PDSA cycle. With supporting quality improvement as the primary goal of the evaluation, the focus of assessing impact was on the overall effect of the multifaceted QI efforts on the main outcome (immunization rates) and the QI culture of the organization, frequently referred to as Big QI.38

Examples of QI organizational change include the following. The perceived resistance to QI at the beginning of the project was at least partially due to the use of extensive, large-scale, time-consuming applications of QI techniques; for example, multiple-day retreats to examine all possible causes of a problem, or involvement of large numbers of managers to correct a simple problem, were perceived negatively. Feedback on these negative perceptions was used to shape the QI focus to emphasize more small-scale, rapid-cycle PDSAs. A major barrier in using data to guide problem solving and decision making was the lack of linkage between Department of Health data systems for clinical record-keeping and the immunization registry that would permit the identification of Missed Opportunities.
Consequently, a software program was developed and utilized to bridge the two systems using Microsoft Excel. Progress was substantial throughout the project and continued beyond this QI evaluation project (Figure 1).
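The bridging logic can be sketched with pandas (hypothetical column names and toy records; the actual DCHD tool was Excel-based and linked the clinical record system to Florida SHOTS):

```python
import pandas as pd

# Hypothetical extracts with illustrative column names -- not the actual
# DCHD or Florida SHOTS schemas.
visits = pd.DataFrame({
    "patient_id": [101, 102, 103],
    "visit_date": pd.to_datetime(["2010-10-04", "2010-10-04", "2010-10-05"]),
})
shots = pd.DataFrame({
    "patient_id": [101, 103],
    "shot_date": pd.to_datetime(["2010-10-04", "2010-10-05"]),
})

# A "missed opportunity" here is a clinic visit with no immunization recorded
# in the registry on the same day (real logic would also check whether the
# child was actually due for a dose).
merged = visits.merge(
    shots,
    left_on=["patient_id", "visit_date"],
    right_on=["patient_id", "shot_date"],
    how="left",
    indicator=True,
)
missed = merged.loc[merged["_merge"] == "left_only", "patient_id"].tolist()
```

With these toy records, only patient 102 visited without a same-day immunization and is flagged for follow-up.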


Progress was accelerating as the QI project moved toward completion of the evaluation cycle, and data were still being monitored and displayed a year after the evaluation was completed. Statistical analysis affirmed improved outcomes (immunization rates) and improved QI culture. Because each term in the interrupted time-series ANCOVA model was significant, the model suggests there was a small increasing trend in vaccination rate before the intervention; over 1 year, the increase in vaccination rate was about 0.0058, or 0.58%. It should be noted that the application process, combined with the early stages of planning QI including the pilot testing, resulted in an organizational focus on immunization rates. This organizational focus, from the top to the front-line staff, is likely to have accounted for the improvement during this period, particularly since archival data did not show any improvement before the development of this QI project. The full-scale QI intervention led to an increase in the vaccination rate (0.1412, or about 14 percentage points), which was attenuated slightly in the year after the QI evaluation project (the attenuation was about 5.6% in the first year after the intervention). This project provides some clear answers to the question: Is there an association between local health department organizational and administrative factors and childhood immunization coverage rates?43 The QI project clearly had benefit, and the QI culture and outcomes were sustained after the project (rates continued to be monitored for more than 1 year after the project concluded), but the lack of additional reinforcement provided by formative/developmental evaluation feedback may have resulted in the relatively small post-project attenuation. The small but significant attenuation, if continued over 3 years, would revert rates to baseline levels and suggests that a single project does not a culture make.
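As a back-of-envelope check (using only the estimates reported in the text), the monthly coefficients convert to the quoted annual figures as follows:

```latex
\begin{align*}
\text{pre-intervention trend:}\quad & 0.000480 \times 12 \approx 0.0058 \quad (0.58\ \text{percentage points per year})\\
\text{level shift at intervention:}\quad & 0.1412 \quad (\approx 14\ \text{percentage points})\\
\text{post-intervention slope change:}\quad & -0.0047 \times 12 \approx -0.056 \quad (5.6\ \text{points per year})\\
\text{implied reversion time:}\quad & 0.1412 / 0.056 \approx 2.5\ \text{years, i.e., baseline within roughly 3 years}
\end{align*}
```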
Complementary efforts may be required to sustain and reinforce the QI organizational behaviors that would reflect institutionalization of behaviors associated with public health agency culture. The use of QI techniques within DCHD had not inspired confidence in QI in the past, nor was there support for QI activities. QI techniques that are perceived to be an imposition (e.g., extra work or bureaucratic requirements) may breed the same type of worker alienation that QI is intended to overcome. Particularly important to overcoming these challenges was the application of QI principles related to tracking and public reporting of organizational performance, encouraging participation of all staff involved with providing the service, and use of data to inform decision making related to internal clinic processes. These principles and practices of QI can be lost with an emphasis on specific QI techniques devoid of key QI principles. With the emphasis on the three QI principles, this immunization QI project appeared to have a very positive effect on the organization's use of and confidence in QI. The key QI practices of tracking and using data to inform decision making, support for QI processes, and involvement of staff in the QI process increased according to the responses to the culture items from the Schouten QI culture survey.

Limitations
Limitations of this study include the application of QI in a single county health department, restricting generalizability. The retrospective pre- and post-assessment of organizational culture was less than optimal but, as stated, emerged from the qualitative findings, resulting in more of a pilot use of the instrument than an optimal application. Conducting the survey at the beginning of the project and again at its conclusion would provide more-optimal results. The study also uses research and evaluation designs that are responsive to the complexity of organizational change, but they are innovative applications that have not been scrutinized for this purpose and would benefit from continued refinement and development.

Conclusion
Assessing aggregated QI impact, including QI culture, was a major focus of the evaluation design, in contrast to trying to assess the impact of individual PDSA cycles using a more classic intervention research design, such as an RCT. Using potentially intrusive research designs intended to control all variables but a single intervention and its outcome can run counter to the purpose of changing the QI culture, which involves multiple changes in organizational practices and may be the more important strategic outcome.44 The evaluation design used here, involving formative/developmental evaluation, an interrupted time-series design covering multiple PDSA cycles, and survey research on QI culture, provides a robust model that may be more sensitive to the interactive and dynamic nature of complex systems.45 This design also complements the primary focus of the project: improvement in practice rather than research. The research design and methods used in this study, which are focused on changes in organizational culture associated with QI, are very different from those reporting change in rates associated with implementation of specific Preventive Task Force recommendations that may be linked to specific QI techniques or interventions.23 Clarifying and evaluating both the outcomes and the characteristics of organizational culture that are desired impacts of QI may be important to achieve optimal results of QI in


public health settings. The use of evaluation designs that are responsive to that intended purpose may be critical.

The authors recognize the many QI and evaluation staff who contributed to the QI efforts and their evaluation, including Kimberly Pierce, Eulisa Morgan Murphy, Anita Davis, Violetta DeLoatch, Paula Burns, and Niketa Walawaka. The authors also acknowledge the clinical staff who engaged in the QI and who were ultimately responsible for the improved immunization rates, with special recognition of Margaret Varnedure, who directs the immunization clinic.

The Robert Wood Johnson Foundation provided funding for this article.

No financial disclosures were reported by the authors of this paper.

References

1. IOM. Crossing the quality chasm: a new health system for the 21st century. Washington DC: National Academies Press, 2001.
2. Institute for Healthcare Improvement. The Breakthrough Series: IHI's collaborative model for achieving breakthrough improvement. IHI Innovation Series white paper. Boston MA: Institute for Healthcare Improvement, 2003. www.ihi.org/knowledge/Pages/IHIWhitePapers/TheBreakthroughSeriesIHIsCollaborativeModelforAchievingBreakthroughImprovement.asp.
3. President's Advisory Commission on Consumer Protection and Quality in the Health Care Industry. Quality first: better health care for Americans. Washington DC: Agency for Healthcare Research and Quality, 1998. www.hcqualitycommission.gov/final/.
4. Dilley JA, Bekemeier B, Harris JR. Quality improvement interventions in public health systems: a systematic review. Am J Prev Med 2012;42(5S1):S58–S71.
5. Public Health Quality Forum. Consensus statement on quality in the public health system. Washington DC: DHHS, 2008. www.hhs.gov/ash/initiatives/quality/quality/phqf-consensus-statement.html.
6. Honoré PA, Wright D, Berwick DM, et al. Creating a framework for getting quality into the public health system. Health Aff 2011;30(4):737–45.
7. Riley WJ, Moran JW, Corso LC, Beitsch LM, Bialek R, Cofsky A. Defining quality improvement in public health. J Public Health Manag Pract 2010;16(1):5–7.
8. Honoré PA, Scott W. Priority areas for improvement of quality in public health. Washington DC: DHHS, 2010. www.hhs.gov/ash/initiatives/quality/quality/improvequality2010.pdf.
9. Honoré PA, Wright D, Koh HK. Bridging the quality chasm between health care and public health. J Public Health Manag Pract 2012;18(1):1–3.
10. Riley WJ, Parsons HM, Duffy GL, Moran JW, Henry B. Realizing transformational change through quality improvement in public health. J Public Health Manag Pract 2010;16(1):72–8.
11. Riley W, Brewer R. Review and analysis of quality improvement techniques in police departments: application for public health. J Public Health Manag Pract 2009;15(2):139–49.
12. Bender K, Halverson PK. Quality improvement and accreditation: what might it look like? J Public Health Manag Pract 2010;16(1):79–82.
13. Riley WJ, Beitsch LM, Parsons HM, Moran JW. Quality improvement in public health: where are we now? J Public Health Manag Pract 2010;16(1):1–2.
14. Morrow CB, Nguyen QV, Shultz RG, Murphy JM, Mignano MA. A local health department's journey to the summit: a case study of a decade of quality improvement. J Public Health Manag Pract 2012;18(1):63–9.
15. Lotstein D, Seid M, Ricci K, Leuschner K, Margolis P, Lurie N. Using quality improvement methods to improve public health emergency preparedness: prepare for pandemic influenza. Health Aff 2008;27(5):w328–w339.
16. Randolph GD, Lea CS. Quality improvement in public health: moving from knowing the path to walking the path. J Public Health Manag Pract 2012;18(1):4–8.
17. CDC. Achievements in public health, 1900–1999: control of infectious diseases. Morb Mortal Wkly Rep 1999;48(29):621–9.
18. DHHS, Office of Disease Prevention and Health Promotion. Healthy People 2020. Washington DC. www.healthypeople.gov/2020/topicsobjectives2020/overview.aspx?topicid=23.
19. Guide to Community Preventive Services. Universally recommended vaccinations: community-based interventions implemented in combination (abbreviated). www.thecommunityguide.org/vaccines/universally/communityinterventions.html.
20. Lemstra M, Rajakumar D, Thompson A, Moraros J. The effectiveness of telephone reminders and home visits to improve measles, mumps and rubella immunization coverage rates in children. Paediatr Child Health 2011;16(1):e1–e5.
21. McElligott JT, Roberts JR, O'Brien ES, et al. Improving immunization rates at 18 months of age: implications for individual practices. Public Health Rep 2011;126(S2):33–8.
22. Smith PJ, Singleton JA; National Center for Immunization and Respiratory Diseases; CDC. County-level trends in vaccination coverage among children aged 19–35 months—U.S., 1995–2008. MMWR Surveill Summ 2011;60(4):1–86.
23. Fu LY, Weissman M, McLaren R, et al. Improving the quality of immunization delivery to an at-risk population: a comprehensive approach. Pediatrics 2012;129(2):e496–e503.
24. Langley GL, Nolan KM, Nolan TW, Norman CL, Provost LP. The improvement guide: a practical approach to enhancing organizational performance. 2nd ed. San Francisco CA: Jossey-Bass, 2009.
25. Agency for Healthcare Research and Quality website. Rockville MD: Agency for Healthcare Research and Quality, 2011. www.ahrq.org.
26. Institute for Healthcare Improvement. Pareto diagram (IHI tool). Cambridge MA: Institute for Healthcare Improvement, 2011. www.IHI.org.
27. Institute for Healthcare Improvement. Cause and effect diagram (IHI tool). Cambridge MA: Institute for Healthcare Improvement, 2011. www.IHI.org.
28. American Society for Quality website. Control charts. Milwaukee WI: American Society for Quality, 2004. www.ASQ.org.
29. Institute for Innovation and Improvement website. Process mapping—an overview (Quality and Service Improvement Tools). 2012. www.institute.nhs.uk.
30. Patton MQ. Developmental evaluation. Eval Pract 1994;15(3):311–9.
31. Patton MQ. Developmental evaluation: applying complexity concepts to enhance innovation and use. New York: Guilford, 2011.
32. Woodhouse LD, Toal R, Nguyen T, et al. A merged model of quality improvement and evaluation: maximizing return on investment. Health Promot Pract 2013 [Epub ahead of print].
33. Plano Clark VL. The adoption and practice of mixed methods: U.S. trends in federally funded health-related research. Qual Inq 2010;16(6):428–40.
34. Creswell JW, Klassen AC, Plano Clark VL, Smith KC, for the Office of Behavioral and Social Sciences Research. Best practices for mixed methods research in the health sciences. NIH, August 2011. obssr.od.nih.gov/mixed_methods_research.
35. Livingood W, Marshall N, Peden A, et al. Health districts as quality improvement collaboratives and multijurisdictional entities. J Public Health Manag Pract 2012;18(6):561–70.
36. Schouten LM, Grol RP, Hulscher ME. Factors influencing success in quality-improvement collaboratives: development and psychometric testing of an instrument. Implement Sci 2010;5:84.
37. Biglan A, Ary D, Wagenaar AC. The value of interrupted time-series experiments for community intervention research. Prev Sci 2000;1(1):31–49.
38. Duffy G, McCoy K, Moran J, Riley W. The continuum of quality improvement in public health. Q Manag Forum 2010;35(4):1, 3–9.
39. Fereday J, Muir-Cochrane E. Demonstrating rigor using thematic analysis: a hybrid approach of inductive and deductive coding and theme development. Int J Qual Methods 2006;5(1):80–92.
40. Boyatzis RE. Transforming qualitative information: thematic analysis and code development. Thousand Oaks CA: Sage, 1998.
41. Strauss A, Corbin J. Basics of qualitative research: techniques and procedures for developing grounded theory. 2nd ed. Newbury Park CA: Sage, 1998.
42. Smith JA, Flowers P, Larkin M. Interpretative phenomenological analysis: theory, method and research. London: Sage, 2009.
43. Ransom J, Schaff K, Kan L. Is there an association between local health department organizational and administrative factors and childhood immunization coverage rates? J Health Hum Serv Adm 2012;34(4):418–55.
44. Walshe K. Understanding what works—and why—in quality improvement: the need for theory-driven evaluation. Int J Qual Health Care 2007;19(2):57–9.
45. Livingood WC, Allegrante JP, Airhihenbuwa CO, et al. Applied social and behavioral science to address complex health problems. Am J Prev Med 2011;41(5):525–31.

Appendix

Supplementary data

Supplementary data associated with this article can be found in the online version at http://dx.doi.org/10.1016/j.amepre.2013.01.011.
