Using Population Dose to Evaluate Community-level Health Initiatives


SPECIAL ARTICLE

Lisa T. Harner, MA,1 Elena S. Kuo, PhD,2 Allen Cheadle, PhD,2 Suzanne Rauzon, MPH,3 Pamela M. Schwartz, MPH,4 Barbara Parnell, PhD,5 Cheryl Kelly, PhD, MPH,1 Loel Solomon, PhD4

Successful community-level health initiatives require implementing an effective portfolio of strategies and understanding their impact on population health. These factors are complicated by the heterogeneity of overlapping multicomponent strategies and the availability of population-level data that align with the initiatives. To address these complexities, the population dose methodology was developed for planning and evaluating multicomponent community initiatives. Building on the population dose methodology previously developed, this paper operationalizes dose estimates of one initiative targeting youth physical activity as part of the Kaiser Permanente Community Health Initiative, a multicomponent community-level obesity prevention initiative. The technical details needed to operationalize the population dose method are explained, and the use of population dose as an interim proxy for population-level survey data is introduced. The alignment between the estimated impact from strategy-level data analysis using the dose methodology and the data from the population-level survey suggests that dose is useful for conducting real-time evaluation of multiple heterogeneous strategies, and is a viable proxy for existing population-level surveys when robust strategy-level evaluation data are collected.

Supplement information: This article is part of a supplement entitled Building Thriving Communities Through Comprehensive Community Health Initiatives, which is sponsored by Kaiser Permanente, Community Health. Am J Prev Med 2018;54(5S2):S117–S123. © 2018 American Journal of Preventive Medicine. Published by Elsevier Inc. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).

INTRODUCTION

Communities are increasingly addressing health issues through place-based initiatives aligned with the Social Ecologic Model at multiple levels (e.g., individual, family, community) across multiple sectors (e.g., school, worksite, neighborhood).1,2 This approach recognizes that health is driven by multiple behaviors, with complex cultural, economic, social, and environmental influences that are difficult to comprehensively understand and address.3–6 Key imperatives for addressing multicomponent community health initiatives (CHIs) include (1) selecting, prioritizing, and implementing the most effective portfolio of strategies in alignment with available resources; and (2) evaluating the impact of multiple overlapping strategies on population-level health. Addressing these imperatives is complicated by the heterogeneity of the strategies being implemented, from intensive programs targeting relatively few people to more broad-based policy and environmental changes.

Data are essential to assess trends, monitor the effectiveness of health interventions, and identify opportunities for improvement. Rather than attempt to evaluate individual strategies, evaluations of large-scale multisectoral initiatives typically use national and state health surveys. Although these surveys provide important information on population-level health indicators and health risks, they may not be sufficient to yield meaningful data at a local level. National- or state-level public health surveillance data may not align with the geography of a local initiative (e.g., neighborhood, census tract, ZIP code), making it difficult to sufficiently describe or make statistical inferences about the target population. They may also lack survey items that address local health concerns, and have a significant lag time between data collection and data availability.7,8 Additionally, national and state health survey questions and methodology may change over time, impacting the validity of data trends. These factors can impede real-time understanding of an initiative's progress and data-driven decisions targeting improvement. For example, in 2011 there were changes in data collection and processing for the Behavioral Risk Factor Surveillance System (BRFSS) and the Colorado Child Health Survey (CHS): the wording of the physical activity (PA) questions was modified, and the survey methodology was changed to include cell-phone numbers and an advanced data weighting method. Because of these changes, BRFSS and CHS data collected since 2011 cannot be accurately compared with data collected prior to 2011. Similarly, for the rural county discussed in this paper, the survey sample size of the CHS was too small to provide estimates with acceptable levels of statistical reliability for the years of interest. Therefore, data for children aged 5–14 years meeting PA guidelines were unavailable even with the data combined for years 2011–2013 and 2013–2015.9,10 Given these limitations, national, state, or county data are often inadequate to inform work being implemented at a community level.11–14

To address these challenges, the population dose methodology was developed and has become fundamental to the work of Kaiser Permanente's CHI, a multisector place-based initiative designed to promote obesity prevention, policy, and environmental change in communities.15 Although the term "dose" is being used more frequently in the public health literature, there is no widely accepted definition of dose or method for dose measurement.16–18 This paper focuses on a dose–response relationship found through measurement of a Kaiser Permanente CHI youth PA initiative, building on the population dose methodology previously described by Cheadle et al.19 Using one CHI community as an example, the technical details needed to operationalize the population dose method will be explored, as well as the potential use of population dose estimates as an interim proxy for population-level survey data.

The objective of the population dose methodology is to quantify community interventions to estimate their impact at the population level, particularly in the absence of timely population-level data aligned with the behavior change being targeted. The dose method relies on real-time understanding of how community work is implemented, including its reach (number of people exposed to an intervention and assumed to be influenced by it) and strength (relative change in behavior for each person exposed). More background and details regarding operationalizing dose are available in an online toolkit.20

From the 1Institute for Health Research, Kaiser Permanente Colorado, Denver, Colorado; 2Center for Community Health and Evaluation, Kaiser Permanente Washington Health Research Institute, Seattle, Washington; 3Nutrition Policy Institute, University of California, Berkeley, California; 4Kaiser Permanente, Oakland, California; and 5Private Consultant, Woodland Park, Colorado. Address correspondence to: Lisa Harner, MA, Institute for Health Research, Kaiser Permanente Colorado, 2550 S. Parker Road, Suite 200, Aurora, CO 80014. E-mail: [email protected]. 0749-3797/$36.00 https://doi.org/10.1016/j.amepre.2018.01.026

METHODS

Dose Defined

Population dose uses elements of the Reach Effectiveness Adoption Implementation Maintenance (RE-AIM) method of combining reach and effectiveness to estimate the likely impact of a community change strategy on population-level behavior.21 Dose is the product of reach (number of people affected by a strategy divided by target population size) and strength (the effect size, or relative change in behavior for each person exposed to the strategy). For example, if 50% of youth enrolled in a school district participate in a Safe Routes to School (SRTS) program that increases minutes of daily PA by an average of 2%, the population dose is 50% (reach) × 2% (strength) = 1%. Conceptually, population dose estimates the average effect size of a strategy across the whole target population.

Estimating Reach

Reach is defined as the number of people affected by a strategy divided by the target population size. For programmatic strategies, the numerator for reach is the number of participants in the program. For environmental and policy strategies, there are four options to calculate the numerator for reach:

1. observational data collected through a validated tool, such as the System for Observing Play and Leisure Activity in Youth22;
2. the number of people who regularly encounter an improved environment and are likely affected by it (e.g., number of residents living within 1/4 mile of a newly renovated park);
3. the population residing within the geographic extent of an adopted policy (e.g., a city-level complete streets policy); and
4. literature standards referencing the reach of an evidence-based strategy.23
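The core calculation can be sketched in a few lines. The function name and structure are illustrative, not code from the published dose toolkit:

```python
def population_dose(reach: float, strength: float) -> float:
    """Dose = reach (fraction of the target population exposed to a strategy)
    multiplied by strength (relative behavior change per exposed person)."""
    return reach * strength

# The paper's example: 50% of enrolled youth participate in a Safe Routes
# to School program that increases daily PA by an average of 2%.
dose = population_dose(reach=0.50, strength=0.02)
print(f"{dose:.1%}")  # 1.0%
```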

Estimating Strength

Three approaches to rating strategy strength were developed, depending on the data that are accessible. The first and preferred approach is to conduct strategy-level behavior change evaluations. For example, an SRTS strategy might collect pre- and post-intervention data on students' active transport habits, including distance to and from school and the number of times students actively transport throughout the school year. From this information, strength and dose can be estimated directly.

When strategy-level results are not available, a second approach is to assign a default strength rating to the strategy based on general characteristics of the strategy as typically implemented, similar to the interventions and associated ratings outlined in the Guide to Community Preventive Services (The Community Guide).24 Minimal- and low-strength strategies include estimated low-frequency and low-intensity programs, promotion, and environmental changes. Medium-strength strategies have greater estimated intensity or frequency of exposure. High strength ratings are given only to evidence-based strategies with demonstrated effect sizes of ≥10%. The following default effect sizes are assigned to the corresponding categories: high, 10%; medium, 5%; low, 2%; and minimal, 0.5%. The strength rating guide in the dose toolkit has further details on default ratings of specific strategies.25

The third approach for estimating strength is applied when strategy-level evaluation data are lacking, but detailed information about the frequency and intensity of strategy implementation is available. This information allows the refinement of default strength ratings to reflect actual implementation in the community. For example, SRTS strategies have a default low rating, but for a walking school bus that occurs daily, the default could be increased to medium. Conversely, SRTS strategies might be rated minimal if the strategy is primarily infrequent media and promotion. Decisions on strength ratings are based on these factors:

1. Frequency of exposure—the frequency at which the environment or program is encountered; for example, a walking school bus 1 day a week versus 5 days a week.
2. Intensity of exposure—the magnitude of environmental changes or the intensity of a given program. For example, children in a walking school bus program walk 1/4 mile to and from school versus 1/2 mile to and from school.
3. Degree to which the healthy choice is the only choice—the completeness of the exposure. For example, all school buses drop off children 1/4 mile from school versus just one school bus.
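As a rough sketch of the default-rating approach, the four categories can be encoded and shifted up or down a level based on implementation detail. The rating labels and default effect sizes come from the text above; the shifting logic and names are illustrative assumptions, not the toolkit's algorithm:

```python
# Default effect sizes from the text: high 10%, medium 5%, low 2%, minimal 0.5%.
LEVELS = ["minimal", "low", "medium", "high"]
DEFAULT_EFFECT = {"minimal": 0.005, "low": 0.02, "medium": 0.05, "high": 0.10}

def adjusted_strength(default_level: str, shift: int = 0) -> float:
    """Move a default rating up or down by `shift` levels, clamped to the scale."""
    i = LEVELS.index(default_level) + shift
    i = max(0, min(i, len(LEVELS) - 1))  # stay within minimal..high
    return DEFAULT_EFFECT[LEVELS[i]]

# SRTS defaults to "low" (2%); a daily walking school bus bumps it to medium,
# while promotion-only SRTS drops it to minimal.
print(adjusted_strength("low", +1))  # 0.05
print(adjusted_strength("low", -1))  # 0.005
```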

Calculating Dose

Dose is calculated by multiplying reach by strength. When strategy-level evaluation data are not available, the authors assign strategy effect sizes according to the high, medium, low, and minimal strength ratings.


Effect sizes are the percentage change in behavior resulting from exposure to the interventions; for example, a 10% change is interpreted as a 10% increase in a behavior over baseline. High-strength strategies have an effect size of ≥10%; medium-strength, 5%; low-strength, 2%; and minimal-strength, 0.5%. In alignment with a similar approach of summing individual domain scores to obtain an intensity score,26,27 the doses of multiple strategies targeting similar outcomes and target populations (e.g., minutes of PA among youth) are summed to form a dose cluster. In the case of PA, summing dose clusters assumes that there is no ceiling above which additional activity has no effect; neither synergy (the whole greater than the sum of the parts) nor saturation (lower-than-expected impact of adding additional strategies) is assumed. Dose clusters allow approximation of the collective impact of the multiple overlapping policy, programmatic, and environmental strategies that comprise most community initiatives.
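Summing a dose cluster is then a straight addition over strategies, under the no-synergy, no-saturation assumption stated above. The figures below are the Year 3 strategy doses reported later in Table 1; the dictionary layout is illustrative:

```python
# Year 3 strategy-level dose estimates for Routt County (see Table 1).
year3_dose = {
    "safe_routes_to_school": 0.018,
    "after_school_pa_programs": 0.005,
    "action_based_learning": 0.023,
    "school_wellness_policy": 0.005,
    "evidence_based_pe_curriculum": 0.020,
}

# Cluster dose: simple sum, assuming neither synergy nor saturation.
cluster_dose = sum(year3_dose.values())
print(f"{cluster_dose:.1%}")  # 7.1%
```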

Example: LiveWell Colorado Initiative in Routt County, Colorado

Routt County, a rural area of roughly 22,500 people, received LiveWell Colorado funding from 2009 to 2016. Four elementary schools in Routt County collectively serve approximately 1,250 students. The Routt County coalition selected coordinated school health as a focus, with overlapping policy, environmental, and programmatic strategies designed to increase the PA of students. Their efforts included incorporating active living language into school wellness policies (e.g., recommend that teachers provide 3–5-minute PA boosts to students during and between classroom times; review the physical education curriculum to ensure that programs are consistent with current standards and best instructional practices); adoption of an evidence-based physical education curriculum; action-based learning in classrooms (e.g., one 5-minute PA break during core classes); after-school PA programs; SRTS (e.g., signage, flashing lights at crosswalks, new sidewalks); and promotional efforts to raise awareness and educate students around active living.

Dose Computations, Population Surveys

Dose was computed for five individual strategies targeting youth PA that were fully implemented by the third year of the initiative in all elementary schools in Routt County. Table 1 lists the reach and strength calculations and the dose estimates for each strategy. A detailed example for action-based learning may be helpful for understanding the approach. Teacher surveys indicated that students received an additional 4.1 minutes of PA each day in classrooms. The strength calculation included these elements:


Table 1. Strategy-Level Dose Calculations for Physical Activity Behavior Change (Year 3 strategy-level calculations)

Safe routes to school (data source: teacher tallies)
  Reach = 251 of 1,257 elementary students in Routt County = 20%
  Strength = (20 minutes increase in daily PA[a] / 85 minutes baseline PA[b]) × (5/7 days per week) × (6/12 months per year) = 1.8%
  Dose = 100% × 1.8% = 1.8%

After-school PA programs (data source: attendance, class description)
  Reach = 476 of 1,257 elementary students in Routt County = 37%
  Strength = (30 minutes increase in daily PA[a] / 85 minutes baseline[b]) × (3/7 days per week) × (5 weeks / 52 weeks per year) = 1.5%
  Dose = 37% × 1.5% = 0.5%

Action-based learning (data source: teacher survey)
  Reach = 100% of students
  Strength = (4.1 minutes increase in daily PA[a] / 85 minutes baseline PA[b]) × (5/7 days per week) × (8/12 months per year) = 2.3%
  Dose = 100% × 2.3% = 2.3%

School wellness policy (data source: literature review and school policy review)
  Reach = all students, 100%
  Strength = 0.5%[c]
  Dose = 100% × 0.5% = 0.5%

Evidence-based PE curriculum (data source: class descriptions, PE teacher KII)
  Reach = 100% of students participated in PE
  Strength = (3 minutes per week increase in PA[a] / 85 minutes baseline PA[b]) × (36/52 weeks per school year) = 2.0%
  Dose = 100% × 2.0% = 2.0%

Total dose (sum of individual PA strategies' dose)
  Population PA dose = 1.8% + 0.5% + 2.3% + 2.0% + 0.5% = 7.1%

[a] For dose estimate calculations, it was assumed that school PA minutes engaged in as a result of the community interventions were in addition to established baseline minutes of PA.
[b] When strategy baseline data were not available, the national average daily minutes of PA as established through accelerometer measures for elementary and secondary school–aged youth was used.28
[c] Existence of a school wellness policy does not necessarily mean that it is implemented, but updating policy language to incorporate and encourage active living indicates a shifting cultural norm for schools, so this strategy was assigned a minimal strength.
KII, key informant interview; PA, physical activity; PE, physical education.

1. Amount of behavior change—4.1 additional minutes of daily PA (over the baseline national average of 85 daily PA minutes for elementary-aged youth).28
2. Frequency—elementary students engaged in action-based learning 5 of 7 days per week.
3. Duration—elementary students are in school 8 of 12 months per year.

As shown in Table 1, reach = 100% (all students); strength = (4.1/85) × (5/7) × (8/12) = 2.3%; and dose = 100% × 2.3% = 2.3%.

In addition to the strategy-level data, all elementary schools in Routt County collected population-level data from a yearly "5210" parent survey of first- through fifth-grade students for one baseline year and 3 years' follow-up. The 5210 initiative promotes evidence-informed recommendations about fruit and vegetable consumption, screen time, PA, and sugar-sweetened beverages.29,30 A survey was implemented in Routt County schools during the students' annual registration and consisted of four questions, one for each of the 5210 behaviors. The survey question for PA asked how many minutes per day the student was physically active outside of school time (walking, running, biking, swimming, playing outside, dancing). The percentage change in students' PA minutes from baseline was calculated for each of the three follow-up survey years. Average minutes of PA reported at baseline were compared with follow-up time periods using t-tests.
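The action-based learning calculation above can be reproduced directly. The function below is an illustrative sketch of the strength formula (behavior change × frequency × duration), not code from the evaluation:

```python
def strategy_strength(extra_minutes: float, baseline_minutes: float,
                      days_fraction: float, months_fraction: float) -> float:
    """Relative daily PA change, discounted by how often and for how much
    of the year students are actually exposed."""
    return (extra_minutes / baseline_minutes) * days_fraction * months_fraction

# Action-based learning: 4.1 extra minutes over an 85-minute baseline,
# 5 of 7 days per week, 8 of 12 months per year.
strength = strategy_strength(4.1, 85, 5 / 7, 8 / 12)
reach = 1.0  # all elementary students exposed
print(f"strength = {strength:.1%}, dose = {reach * strength:.1%}")
# strength = 2.3%, dose = 2.3%
```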

Alignment Between Estimated Dose and Measured Population Change

The estimated population-level dose was 3% after 1 year, which included three strategies: SRTS, after-school PA programs, and action-based learning. In Year 2, population dose was estimated at 3.5%, based on the continuing Year 1 strategies plus the strategy to update the school wellness policies. In Year 3, the population dose of the fully implemented strategies (SRTS, after-school PA programs, action-based learning, school wellness policy updates, evidence-based physical education curriculum) was estimated to be 7.1% (Table 2).

The 5210 parent survey was conducted in Fall 2011 (n=598 of 1,214 enrolled first- through fifth-grade students); Fall 2012 (n=536 of 1,214); Fall 2013 (n=495 of 1,214); and Fall 2014 (n=546 of 1,257 enrolled first- through fifth-grade students), with an overall average response rate of


Table 2. Dose Estimates and 5210 Survey Results for Physical Activity Behavior Change

Strategy                                Year 1 dose,[a] %   Year 2 dose,[a] %   Year 3 dose,[a] %
Safe routes to school                   0.8                 1.1                 1.8
After-school PA programs                0.2                 0.4                 0.5
Action-based learning                   2.0                 1.5                 2.3
School wellness policy                  N/A                 0.5                 0.5
Evidence-based PE curriculum            N/A                 N/A                 2.0
Total dose                              3.0                 3.5                 7.1

5210 population-level survey
Change in PA minutes from baseline, %   4.1                 2.6                 5.3
p-value                                 0.014               0.133               0.002

[a] Percent change in PA from baseline calculated using dose formulas,25 with examples for Year 3 presented in Table 1.
N/A, not applicable; PA, physical activity; PE, physical education.

44%. At baseline, survey results indicated an average of 54.7 minutes of PA per day. After 3 years, there was a statistically significant 5.3% increase in reported PA compared with baseline levels (p<0.05).

Comparing the dose estimate with the population-level survey results shows relatively close alignment. The 7% dose estimate is considered higher dose (i.e., a level at which the authors might expect to see a statistically significant change in a population-level survey), and the change in PA from baseline to Year 3 shown in the population-level 5210 survey was statistically significant (p<0.05). Therefore, the population change was consistent and aligned with the implementation of high-dose strategies in this LiveWell community.
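The year-by-year alignment can be laid out using the Table 2 figures. This is a rough consistency check, not a formal statistical test, and the data layout is illustrative:

```python
# (year, estimated dose, 5210 survey change from baseline, p-value), from Table 2.
results = [
    ("Year 1", 0.030, 0.041, 0.014),
    ("Year 2", 0.035, 0.026, 0.133),
    ("Year 3", 0.071, 0.053, 0.002),
]

for year, dose, survey_change, p in results:
    status = "significant" if p < 0.05 else "not significant"
    print(f"{year}: dose {dose:.1%} vs. survey {survey_change:.1%} ({status})")
```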

DISCUSSION

Routt County was unique among CHI communities in that it collected both detailed strategy-level data for multiple strategies to estimate dose and annual population-level data to assess initiative impact. This breadth of data allowed the authors to illustrate the utility of dose for conducting real-time evaluation of multiple heterogeneous strategies, allowing for process evaluation and course corrections throughout the initiative. For example, the action-based learning teacher survey results were reviewed each year with stakeholders to assess progress and refine strategies as needed to increase impact. In the second year, the surveys showed a slight drop in classroom PA. After the dose estimates from both years were shared with school district superintendents, the superintendents reviewed the action-based learning process and barriers to its use in the classroom, then implemented supportive actions to encourage its use (e.g., action-based learning training for teachers, incorporation of action-based learning into school wellness policy). The Year 3 teacher surveys showed an increase in classroom PA. Yearly review of population dose estimates and strategy implementation facilitates the understanding of progress, allowing for course corrections and strategic allocation of resources to ensure sustainability of successful strategies.

Having strategy reach, strength, and population-level behavior change data allowed the estimated population dose to be used for logic model confirmation. The measured population-level changes in PA aligned with the high-dose PA strategies implemented by Routt County. There was less likelihood of false-positive results because findings were corroborated with high-dose scores indicating that significant work was happening around PA. Therefore, attributing favorable population-level changes to the initiative was reasonable.31

Finally, given how well aligned the dose evaluation was with the 5210 population-level survey results, dose was a viable proxy for the survey. Population-level surveys often do not meet community evaluation needs because of resource constraints, geographic scale, survey questions, timing, or suppressed data. In the case of Routt County, relevant PA data from the annual Colorado Child Health Survey for children aged 5–14 years meeting PA guidelines were suppressed even with the data combined for years 2011–2013 and 2013–2015. Because of LiveWell funding, the schools were able to conduct an annual 5210 survey for all elementary school–aged youth in Routt County. Although the population-level surveys were not sustainable long term, the strategy-level evaluations were, allowing Routt County to continue monitoring PA strategies.

Limitations

The most important limitation of using dose for evaluation is that the components of dose (the reach and strength of strategies) and population-level change are difficult to measure with sufficient precision to draw firm conclusions about attribution or impact. Additionally, the quality of data collection can affect the accuracy and generalizability of information for other programs. Given the study design, the authors were unable to control for the variable 1–3 years of exposure that youth may have experienced.

Descriptions of strategy implementation required for reach and strength estimates rely largely on progress reporting from the community implementers and other institutions involved (e.g., schools). These self-reported accomplishments may have been biased in favor of making changes appear more comprehensive and sustainable than was true in practice. Where possible, progress reporting was supplemented with secondary data for verification, and with strategy-level evaluations involving direct observation and environmental assessments.

The ratings of the strength component of population dose are often subjective given the lack of information from the scientific literature or strategy-level evaluations, particularly for environmental and policy strategies. Additionally, although duration and amount of PA were considered, this method does not differentiate PA by intensity (e.g., moderate PA, vigorous PA) but rather counts all PA types equally. Multiple independent raters were used to standardize ratings as much as possible. In addition, several strategy-level evaluations were conducted to estimate effect sizes and were used to further refine and validate the ratings.

Quantifying the cumulative dose of strategies targeting the same outcomes requires assumptions about how overlapping strategies interact within a population. For example, is synergy occurring, whereby the combined effect of action-based learning, SRTS, and after-school PA programming results in a greater overall increase in PA than would be expected from the individual strategies alone? Simply adding up dose may not account for a potentially larger impact as strategies reinforce each other. The authors feel that this conservative approach is justified given the lack of empirical evidence to support the synergy hypothesis.

CONCLUSIONS

Despite limitations, the dose methodology provides a useful lens for interpreting population-level results and determining attribution to the initiative. In particular, the approach emphasizes real-time evaluation while implementing an initiative, and guards against chance findings in population-level changes at initiative end by attributing changes only where they are accompanied by high-dose strategies. Beyond schools, population dose has utility in workplaces, hospital systems, towns, or other defined populations. Beyond active living, dose has potential utility wherever multiple interventions are being employed to change the same behavior.

ACKNOWLEDGMENTS

We would like to acknowledge the contributions of the LiveWell Colorado organization, the LiveWell Northwest Colorado coalition, and community residents for their dedication to healthy eating and active living efforts in Routt County. Cheadle, Kuo, Harner, Rauzon, Schwartz, and Solomon contributed to development of the dose method. Harner, Kuo, and Cheadle wrote the manuscript, with input from Kelly, Rauzon, Schwartz, and Solomon. Harner, Kuo, and Parnell conducted the design, data collection, and analysis of the Routt County evaluation used as an example of the dose method. This project was funded by Kaiser Permanente and the Colorado Health Foundation. The authors acknowledge and thank Kaiser Permanente for financial support.

No financial disclosures were reported by the authors of this paper.

SUPPLEMENT NOTE

This article is part of a supplement entitled Building Thriving Communities Through Comprehensive Community Health Initiatives: Evaluations from 10 Years of Kaiser Permanente's Community Health Initiative to Promote Healthy Eating and Active Living, which is sponsored by Kaiser Permanente, Community Health.

REFERENCES

1. Cohen DA, Scribner RA, Farley TA. A structural model of health behavior: a pragmatic approach to explain and influence health behaviors at the population-level. Prev Med. 2000;30(2):146–154. https://doi.org/10.1006/pmed.1999.0609.
2. McLeroy KR, Bibeau D, Steckler A, Glanz K. An ecological perspective on health promotion programs. Health Educ Q. 1988;15(4):351–377. https://doi.org/10.1177/109019818801500401.
3. Ebbeling CB, Pawlak DB, Ludwig DS. Childhood obesity: public-health crisis, common sense cure. Lancet. 2002;360(9331):473–482. https://doi.org/10.1016/S0140-6736(02)09678-2.
4. Kumanyika SK, Obarzanek E, Stettler N, et al. Population-based prevention of obesity: the need for comprehensive promotion of healthful eating, physical activity, and energy balance: a scientific statement from American Heart Association Council on Epidemiology and Prevention, Interdisciplinary Committee for Prevention (formerly the Expert Panel on Population and Prevention Science). Circulation. 2008;118(4):428–464. https://doi.org/10.1161/CIRCULATIONAHA.108.189702.
5. U.S. Office of the Surgeon General. The Surgeon General's Call to Action to Prevent and Decrease Overweight and Obesity. Rockville, MD: U.S. DHHS, 2001.
6. Woodward-Lopez G. Obesity: Dietary and Developmental Influences. Boca Raton, FL: CRC/Taylor & Francis, 2006. https://doi.org/10.1201/9781420008920.
7. Shah SN, Russo ET, Earl TR, Kuo T. Measuring and monitoring progress toward health equity: local challenges for public health. Prev Chronic Dis. 2014;11:E159. https://doi.org/10.5888/pcd11.130440.
8. Puma JE, Belansky ES, Garcia R, Scarbro S, Williford D, Marshall JA. A community-engaged approach to collecting rural health surveillance data. J Rural Health. 2017;33(3):257–265. https://doi.org/10.1111/jrh.12185.
9. Shupe A. Colorado Department of Public Health and Environment. Behavioral Risk Factor Surveillance System (BRFSS) Dataset Details. www.chd.dphe.state.co.us/cohid/brfssdata.html. Accessed December 15, 2017.
10. Colorado Department of Public Health and Environment. VISION: Visual Information System for Identifying Opportunities and Needs website. www.colorado.gov/pacific/cdphe/vision-data-tool. Accessed June 14, 2017.
11. Luck J, Chang C, Brown E, Lumpkin J. Using local health information to promote public health. Health Aff (Millwood). 2006;25(4):979–991. https://doi.org/10.1377/hlthaff.25.4.979.
12. Shah SN, Russo ET, Earl TR, Kuo T. Measuring and monitoring progress toward health equity: local challenges for public health. Prev Chronic Dis. 2014;11:E159. https://doi.org/10.5888/pcd11.130440.
13. Barry P, Lee SJC, Kincheloe J, et al. Independent state health surveys: responding to the need for local population health data. J Public Health Manag Pract. 2014;20(5):E21–E33. https://doi.org/10.1097/PHH.0b013e3182a9c0ce.
14. Fielding JE, Frieden TR. Local knowledge to enable local action. Am J Prev Med. 2004;27(2):183–184. https://doi.org/10.1016/j.amepre.2004.04.010.
15. Cheadle A, Schwartz PM, Rauzon S, Beery WL, Gee S, Solomon L. The Kaiser Permanente Community Health Initiative: overview and evaluation design. Am J Public Health. 2010;100(11):2111–2113. https://doi.org/10.2105/AJPH.2010.300001.
16. McHugh M, Harvey J, Kang R, Shi Y, Scanlon P. Measuring the dose of quality improvement initiatives. Med Care Res Rev. 2016;73(2):227–246. https://doi.org/10.1177/1077558715603567.
17. Hasson H. Systematic evaluation of implementation fidelity of complex interventions in health and social care. Implement Sci. 2010;5:67. https://doi.org/10.1186/1748-5908-5-67.
18. Reed D, Titler M, Dochterman J, et al. Measuring the dose of nursing intervention. Int J Nurs Terminol Classif. 2007;18(4):121–130. https://doi.org/10.1111/j.1744-618X.2007.00067.x.
19. Cheadle A, Schwartz PM, Rauzon S, Bourcier E, Senter S, Spring R. Using the concept of "population dose" in planning and evaluating community-level obesity prevention initiatives. Am J Eval. 2012;34(1):64–77.
20. Center for Community Health and Evaluation. Healthy dose: a toolkit for boosting the impact of community health strategies, v.1.1. https://share.kaiserpermanente.org/article/dose-creating-measuring-impact/.
21. Glasgow RE, Klesges LM, Dzewaltowski DA, Estabrooks PA, Vogt TM. Evaluating the impact of health promotion programs: using the RE-AIM framework to form summary measures for decision making involving complex issues. Health Educ Res. 2006;21(5):688–694. https://doi.org/10.1093/her/cyl081.


22. McKenzie TL, Marshall SJ, Sallis JF, Conway TL. Leisure-time physical activity in school environments: an observational study using SOPLAY. Prev Med. 2000;30(1):70–77. https://doi.org/10.1006/pmed.1999.0591.
23. Brennan L, Castro S, Brownson RC, Claus J, Orleans CT. Accelerating evidence reviews and broadening evidence standards to identify effective, promising, and emerging policy and environmental strategies for prevention of childhood obesity. Annu Rev Public Health. 2011;32:199–223. https://doi.org/10.1146/annurev-publhealth-031210-101206.
24. Task Force on Community Preventive Services. Guide to Community Preventive Services. Atlanta, GA: CDC. www.thecommunityguide.org/topic/physical-activity. Accessed December 15, 2017.
25. Center for Community Health and Evaluation. Healthy Dose Toolkit Strength Rating Guide. https://share.kaiserpermanente.org/wp-content/uploads/2015/08/StrengthRatingGuide.pdf. Published March 2016. Accessed December 15, 2017.
26. Fawcett SB, Collie-Akers VL, Schultz JA, Kelley M. Measuring community programs and policies in the Healthy Communities Study. Am J Prev Med. 2015;49(4):636–641. https://doi.org/10.1016/j.amepre.2015.06.027.
27. Frongillo EA, Fawcett SB, Ritchie LD, et al. Community policies and programs to prevent obesity and child adiposity. Am J Prev Med. 2017;53(5):576–583. https://doi.org/10.1016/j.amepre.2017.05.006.
28. Troiano RP, Berrigan D, Dodd KW, Masse LC, Tilert TT, McDowell M. Physical activity in the United States measured by accelerometer. Med Sci Sports Exerc. 2008;40(1):181–188. https://doi.org/10.1249/mss.0b013e31815a51b3.
29. Let's Go! Annual Report 2016. https://mainehealth.org/-/media/lets-go/files/reports/lets-go-annual-report-year-10.pdf. Accessed December 15, 2017.
30. 5210 Gulf Coast. Healthcare resources. www.letsgogulfcoast.org/healthcare-resources/. Accessed December 15, 2017.
31. Cheadle A, Beery WL, Greenwald HP, Nelson GD, Pearson D, Senter S. Evaluating the California Wellness Foundation's Health Improvement Initiative: a logic model approach. Health Promot Pract. 2003;4(2):146–156. https://doi.org/10.1177/1524839902250767.