Evaluation and Program Planning 31 (2008) 145–159
Assessing local capacity for health intervention

Moya L. Alfonso a,*, Jen Nickelson a, David L. Hogeboom a, Jennifer French b, Carol A. Bryant a, Robert J. McDermott a, Julie A. Baldwin a
a Florida Prevention Research Center at the University of South Florida College of Public Health, 13201 Bruce B. Downs Blvd., Tampa, FL 33612, USA
b Sarasota County Health Department, 2200 Ringling Boulevard, Sarasota, FL 34237, USA

Received 14 March 2007; received in revised form 8 January 2008; accepted 15 January 2008

*Corresponding author. Tel.: +1 813 974 4867; fax: +1 813 974 5172.
Abstract

Because of their location within the practice realm, participatory, community-based public health coalitions offer many lessons about implementing and sustaining local interventions. This paper presents a case study of capacity assessment at the local level. Capacity evaluation methods are presented, with emphasis on the theoretical framework used to guide the evaluation. The capacity evaluation framework presented herein was theoretically based and designed to generate practical information to facilitate the adoption of a locally tailored youth obesity prevention program, VERB™ Summer Scorecard (VSS). Using multiple methods, four aspects of community capacity were assessed: community, knowledge and skills, resources, and power. Within each category, factors that facilitated or impeded program implementation were distinguished. The evaluation protocol was designed to generate information that would increase community capacity to sustain a community-based obesity prevention program. Capacity tables were used as a program-planning tool and as a system for sharing implementation and sustainability requirements with potential adopters. This case study also explores how to use capacity assessment results to empower coalitions to serve as catalysts for development of local programs in other communities.
© 2008 Elsevier Ltd. All rights reserved.

Keywords: Capacity; Community; Coalitions; Implementation; Interventions
1. Introduction

Minkler and Wallerstein (2002) conclude that a "major limitation of most community-organizing and community building efforts to date has been a failure to adequately address evaluation processes and outcomes" (p. 295). However, this dearth of formal evaluation initiatives and methods is changing. Participatory evaluation can indeed improve both the process and outcomes of health education and promotion initiatives. For instance, in their examination of a community-based participatory research initiative, Bryant et al. (2007) found that whereas rigorous and sophisticated evaluation designs
had obvious strengths, those presumed benefits had to be balanced against what community members perceived as unnecessary complexity. Community members preferred easy-to-understand, straightforward, and practical measures that could offer rapid feedback and facilitate their seeking funds to sustain interventions through local foundations and philanthropic organizations. Community members also offered an alternative lens for evaluating health education intervention results, thereby expanding the scope for evaluation, minimizing the ‘‘clinical trials’’ mindset that often resides with academic researchers, increasing the local relevance of some results versus others—thereby offering guidance for program re-design or providing focus for the next intervention, and diminishing the boundaries that separate ‘‘insiders’’ from ‘‘outsiders.’’ Finally, by allowing community members to make context relevant to the evaluation, academicians can gain insights for framing interventions and improving understanding of what Green (2006) refers to as ‘‘practice-based evidence’’ (p. 406).
Participatory, community-based public health coalitions play an important role in fostering improvement in public health outcomes (Pluye, Potvin, & Denis, 2004; Pluye, Potvin, Denis, & Pelletier, 2004). To serve in this role, coalitions must have both the generic "capacity for action needed to address any health problem" and the community capacity required to facilitate the prevention process (Chinman et al., 2005, p. 146). Brownson, Kreuter, Arrington, and True (2006) refer to capacity as an intermediate outcome: capacity comes between the intervention and the long-term health outcome (see also Chinman et al., 2005).

Many definitions of capacity have been offered. According to The American Heritage Dictionary of the English Language, Fourth Edition (2000), capacity refers to "the ability to receive, hold, or absorb" (http://www.bartleby.com). Capacity occurs at multiple levels: community, organizational, and individual (Chinman et al., 2005). This paper focuses on the community and organizational levels. Whereas community and organizational capacity are distinct entities, they are related. Chinman et al. provide a review of the community capacity literature, including definitions. Community capacity definitions emphasize the skills (e.g., ability to garner resources) necessary to effect change (Chinman et al.). Four core dimensions of community capacity have been identified (Chinman et al.): community (e.g., member involvement), skills (e.g., problem solving), resources (e.g., people power), and power (i.e., collective efficacy). In contrast, organizational capacity refers to the "adequacy of inputs (knowledge, financial resources, trained personnel, well-managed strategic partnerships, etc.) necessary to carry out a program and achieve desired outcomes" (Cassidy & Leviton, 2006, p. 149). These resources may include inputs needed to sustain as well as implement a program (Chinman, Imm, & Wandersman, 2004).

Like capacity, sustainability should be considered a process that begins with program planning (Johnson, Hays, Center, & Daley, 2004; Pluye, Potvin, & Denis, 2004; Pluye, Potvin, Denis, Pelletier, & Mannoni, 2005; Weiss, Coffman, & Bohan-Baker, 2002). From an evaluation perspective, sustainability should also be operationalized as an outcome, and it should be tracked (Pluye, Potvin, Denis et al., 2004; Weiss et al., 2002). Monitoring sustainability outcomes and providing timely feedback to evaluation stakeholders can increase community or organizational capacity to sustain effective programs (Weiss et al., 2002). Based on Pluye et al.'s work, a toolkit specifically for assessing sustainability processes and outcomes is available online (www.cacis.umontreal.ca/perennite/index_en.htm).

Much remains to be learned about the capacity required to implement and sustain local, evidence-based interventions (Chinman et al., 2005). Moreover, applied and innovative evaluation methods that allow for the assessment of such capacity are needed. The purpose of this paper is to describe the assessment methods used to determine the
capacity required to implement and sustain a local physical activity intervention program, the VERB™ Summer Scorecard (VSS) program. Capacity evaluation methods are described with emphasis on the theoretical framework used to guide them. Capacity tables are presented as a program planning tool and as a system for sharing implementation and sustainability requirements with potential program adopters (Brownson et al., 2006). Gauging the match between existing local capacity and necessary program capacity requirements is discussed as a data-based approach to disseminating locally derived programs to other communities (Table 1).

Community-level interventions that modify the social environment have the potential to influence health-related outcomes such as obesity (Cohen, Finch, Bower, & Sastry, 2006). In response to growing concerns about the increasing prevalence of obesity and the health risks associated with it, a community-wide coalition to address obesity prevention was organized in 2003 in Sarasota County, Florida. The coalition comprised representatives from government agencies (e.g., health department, school board), non-profit organizations, and businesses across the county offering products or services related to obesity prevention. After a year of strategic planning and goal setting, the coalition decided to replicate a community-based intervention developed by a similar group in Lexington, Kentucky (Courtney, Florida Prevention Research Center, & VERB™ Partnership Team, 2006). The program, called VSS, was designed to offer a wide variety of opportunities for tweens (youth ages 9–13) to be physically active in the community. Although VSS is a
Table 1
Estimated youth attendance at VERB™ Summer Scorecard events in 2005

Event observed (date): Estimated no. in attendance
Basketball bash (7/8): 30
Basketball free throw (7/8): 24
Bounce house/rock climb (7/28): 5
Cooking class for kids (7/13): 29
Dance class (7/20): 53
Fitness frenzy (6/17): 30
Harry Potter crafts (7/15): 35–38
Kids summer beach run (estimate of total, 5/31–8/2): 100–500
Library dance (6/3): 100–120
Magic workshop (6/13): 28
Pool party (7/22): 15
Retro games (6/28): 15
Splash party (6/8): 80
Swimming is fun (7/8): 30
Venice beach run (10 events): 30 per week
Venice beach run (7/27): 18
Yoga (6/16, 6/23, 6/30): 29

Notes: Numbers estimated by visual observation. Estimates are per event unless noted. Total attendance cannot be estimated because the same youth may have attended multiple events.
local, community-based initiative, it capitalized on the national social marketing campaign, VERB™ It's What You Do, implemented by the Centers for Disease Control and Prevention (CDC). VERB™ promotes participation in physical activity by tweens. The philosophy of VERB™ is to encourage tweens to try "new" activities, emphasizing fun and adventure rather than high-level performance or skill. (Additional information on how to start a VSS program is available in Courtney et al. (2006). The scorecard manual is available for download at http://publichealth.usf.edu/prc/downloads.html.)

In Sarasota county, the VSS program was the first project sponsored by the coalition; it was implemented in 2005 with funding through the Florida Prevention Research Center (FPRC) at the University of South Florida. In addition to funds, the FPRC provided technical assistance in social marketing, program planning, and evaluation.

2. Implementation 2005

The coalition began development of the VSS program in February 2005. Initial steps included recruiting community partners, developing a promotional brochure, and designing the scorecard with input from Sarasota county tweens and parents. Over 30 partners (vendors) agreed to participate in the VSS program and offer activities such as paintball, golf, roller-skating, skateboarding, swimming, kayaking, and numerous others. Vendors were divided almost equally between those who devoted between one and four staff members to the VSS program (n = 6) and those who devoted five or more staff members (n = 5). Most (n = 9) devoted 5 or fewer hours to the program.

The final scorecard was created and distributed at schools to students in grades 5–7 in May 2005. Scorecards were also available at many partner sites throughout the county. The VSS provided opportunities for tweens to try new activities and become eligible to win prizes, as well as a mechanism to measure the amount of activity tweens were doing. Through community partnerships, special events and discounts were available to tweens throughout the summer. When tweens attended an event or patronized a participating business to partake in an hour of physical activity, they received a stamp, sticker, or signature on their scorecard. When the scorecard was filled and submitted, tweens became eligible for prizes.

Approximately 10,000 scorecards were distributed to public elementary, middle, and charter schools in May 2005. Schools were instructed to distribute them to students in grades 5–7 or ages 9–13. As a parallel initiative, schools were provided a letter to send home to every middle school parent about the VSS program. Facts about the VSS program were included on the back of the May school lunch menu distributed to parents of youth in grades K–12. A description of the VSS program also was e-mailed to principals and teachers. Physical education teachers were encouraged to promote the program in their
schools. Scorecards also were distributed to all Sarasota County YMCAs and libraries, Sarasota County Health Department clinics, parks and recreation summer camps, and places of worship. Scorecards also were made available at various meetings of community organizations.

Students from a local high school media department developed a promotional video that was aired through closed-circuit television in elementary and middle schools. In addition, VSS was promoted via radio and television stations, including county government and public television channels, and in print media (community newspapers and magazines). A video also was aired in elementary and middle schools to promote a sponsored culminating event (the Grand Finale). Vendors (i.e., participating businesses) were provided with scorecards, cardholders, stamps or stickers, posters, and a letter explaining the program. A program website also was established and maintained: www.verbsarasota.com. Process, outcome, and impact evaluation results, along with the program logic model, are available for review at http://publichealth.usf.edu/prc/downloads.html.

3. Materials and methods

3.1. Introduction

The primary purpose of the capacity assessment was to evaluate the capacity needed to implement and sustain the VSS program in Sarasota county. The assessment focused on community capacity and was guided by two questions: (1) Which specific elements of capacity were required to implement the VSS program in Sarasota county? (2) Which capacity-related elements were required to sustain the VSS program in Sarasota county? Table 2 summarizes the capacity component of the overarching program evaluation used in the first year, including outcomes addressed, theoretical foundations, indicators assessed, questions used to assess each indicator, instrumentation, and sources of information. The University of South Florida Institutional Review Board, Social and Behavioral Sciences Division, reviewed and approved all study procedures and materials.

3.2. Logic model

The VERB™ national campaign's logic model (Huhman, Heitzler, & Wong, 2004) was adapted for use in evaluating the VSS program. Implementation and sustainability outcomes were included in the Sarasota county VSS program logic model, the specific components of which are illustrated in Fig. 1. Inclusion of these outcomes in the model was intended to facilitate implementation and increase sustainability and potential program transferability to other communities. The first step in increasing community capacity to affect obesity in Sarasota county was to engage the coalition in campaign implementation.
Table 2
Capacity assessment measurement and instrumentation

Outcome: Program created that can be implemented smoothly in Sarasota and other counties. Theoretical constructs: capacity (Chinman et al., 2005) and sustainability (Weiss et al., 2002).

Indicator: Community capacity required to implement and sustain program.
Questions: What aspects of the coalition (e.g., participation and involvement, leadership, connections among people and institutions, etc.) affected the implementation of the VERB™ Summer Scorecard program in Sarasota county? What aspects made it easy to implement? What aspects made it difficult to implement? (Telephone Interview Guide, Executive Committee Members.) Appropriate cross-section of members (items 9–10); members share a stake in both process and outcome (items 13–15); multiple layers of participation (items 16–17) (Wilder Collaboration Factors Inventory). Think about all the ways in which you participated in the VERB™ Summer Scorecard program, both minor details and major activities. How much time did you or your staff members spend (total) participating in the VERB™ Summer Scorecard program? How many staff members participated in the VERB™ Summer Scorecard program? How often out of the normal workweek did they devote time to participating in the program? What is the most important thing the Sarasota county Obesity Coalition could do to encourage your continued commitment to participating in the VERB™ Summer Scorecard program? (Vendor Interview; Vendor Survey.)
Other sources: Field notes and program documentation.

Indicator: Knowledge and skills required to implement and sustain program.
Questions: What information or knowledge was needed to implement the VERB™ Summer Scorecard program? What skills were needed to implement the VERB™ Summer Scorecard program? What types of technical assistance or support would have made it easier to implement the VERB™ Summer Scorecard program? (Telephone Interview Guide, Executive Committee Members.) Ability to compromise (item 12); development of clear roles and policy guidelines (items 20–21); appropriate pace of development (items 24–25); open and frequent communication (items 26–28); established informal relationships and communication links (items 29–30); concrete, attainable goals and objectives (items 31–33); skilled leadership (item 40) (Wilder Collaboration Factors Inventory). Probes: What information or knowledge did you need to participate? What skills did you need to participate? What types of technical assistance or support would make it easier to participate? (Vendor Interview.)
Other sources: Field notes and program documentation.

Indicator: Resources required to implement and sustain program.
Questions: What resources (e.g., person power, funding, time, expertise, etc.) were needed to implement the VERB™ Summer Scorecard program? What resources, if available, would have made it easier to implement the VERB™ Summer Scorecard program? (Telephone Interview Guide, Executive Committee Members.) History of collaboration or cooperation in the community (items 1–2); flexibility (items 18–19); adaptability (items 22–23); sufficient funds, staff, materials, and time (items 38–39) (Wilder Collaboration Factors Inventory). Probes: What resources (e.g., person power, funding, time, expertise, etc.) were needed to participate? What resources, if available, would make it easier to participate? (Vendor Interview.)
Other sources: Field notes and program documentation.

Indicator: Power required to implement and sustain program.
Questions: What were the strongest aspects of the VERB™ Summer Scorecard program in Sarasota county? What were the weakest aspects? What effect, if any, do you think the program had on physical activity among Sarasota county youth? What does this say about the coalition's ability to effect change? To what extent does Sarasota county (program stakeholders) feel a sense of ownership of the VERB™ Summer Scorecard program? Explain. What are some ways the coalition could maintain commitment to continue the VERB™ Summer Scorecard program? (Telephone Interview Guide, Executive Committee Members.) Collaborative group seen as a legitimate leader in the community (items 3–4); favorable political and social climate (items 5–6); members see collaboration as in their self-interest (item 11); shared vision (items 34–35); unique purpose (items 36–37) (Wilder Collaboration Factors Inventory). My organization/business has the power to help children reach and keep a healthy weight (Likert item). (Vendor Survey.)
Other sources: Field notes and program documentation.

Outcome: Belief that program should be sustained. Theoretical construct: sustainability (Weiss et al., 2002).
Questions: Based on your experiences with the VERB™ Summer Scorecard program, should it be sustained in Sarasota county? (Vendor Survey; Telephone Interview Guide, Executive Committee Members.)

Outcome: Future implementation (implementation in summer 06 and so on). Theoretical construct: sustainability (Weiss et al., 2002).
Questions: What will have to happen, if anything, to sustain the VERB™ Summer Scorecard program in Sarasota county? (Telephone Interview Guide, Executive Committee Members.)
Other sources: Field notes and program documentation.
[Fig. 1 appears here in the original: the Sarasota county VSS program logic model. Recoverable components: inputs (formative research, obesity coalition, youth advisory board, vendor contributions, USF PRC, interns, website, public relations specialist, SCHD staff hours); activities (scorecard distribution, kick-off events, summer events, ongoing discounts, Grand Finale, promotion through printed materials and a media campaign, partnerships, monitoring and tracking); and outcome boxes spanning short-, mid-, and long-term outcomes: engage coalition in campaign implementation; successful implementation of the summer 05 program; involvement of physical activity vendors and other community partners in the Summer Scorecard program; community partners have a stake in tweens' physical activity; community capacity to address physical activity; stakeholder commitment for future programs; booster activities throughout the school year; future implementation; a program created that can be implemented and sustained in Sarasota and other counties; and reduction in negative health outcomes related to inactivity and obesity.]

Fig. 1. Sarasota county VSS program logic model.
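For readers who wish to adapt this approach, the sketch below (our illustration, not software from the original evaluation) represents the logic model components recoverable from Fig. 1 as plain data, so that each outcome can later be tied to measurable indicators such as those in Table 2. The component names follow the figure; the grouping of outcomes into tiers follows the surrounding text and is partly an assumption.

```python
# A minimal sketch of the VSS logic model as data (illustrative only; the
# original evaluation did not publish code). Component names follow Fig. 1.
logic_model = {
    "inputs": [
        "formative research", "obesity coalition", "youth advisory board",
        "vendor contributions", "USF PRC", "interns", "website",
        "public relations specialist", "SCHD staff hours",
    ],
    "activities": [
        "scorecard distribution", "kick-off events", "summer events",
        "ongoing discounts", "Grand Finale", "promotion (print and media)",
        "partnerships", "monitoring and tracking",
    ],
    "outcomes": {
        "short_term": ["engage coalition in campaign implementation",
                       "successful implementation of the summer 05 program"],
        "mid_term": ["community capacity to address physical activity",
                     "stakeholder commitment for future programs"],
        "long_term": ["program implemented and sustained in Sarasota "
                      "and other counties",
                      "reduction in negative health outcomes related to "
                      "inactivity and obesity"],
    },
}

# Each outcome can then be linked to the indicators assessed in Table 2.
indicators = {
    "community capacity to address physical activity": [
        "community", "knowledge and skills", "resources", "power"],
}

print(len(logic_model["activities"]), "activities feed",
      sum(len(v) for v in logic_model["outcomes"].values()), "outcomes")
```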
Level of engagement in campaign implementation is associated with program ownership and sustainability (Johnson et al., 2004; Pluye, Potvin, & Denis, 2004; Pluye, Potvin et al., 2004). The more engaged the coalition is in campaign implementation, the more likely the program will succeed and become a part of regular programming. According to the logic model, successful implementation of the summer 2005 program would ultimately increase stakeholder commitment for future programs, which would lead to community partners having a stake in tweens' physical activity and, in turn, a program that could be implemented readily in Sarasota county as well as other venues.

The capacity-specific information that was collected identified aspects of capacity (community, knowledge and skills, resources, and power) that influenced the implementation and sustainability of the program in Sarasota county and informed the portability of the program elsewhere. Community refers to member involvement and bonding both to the community and "among its members" (Chinman et al., 2005, p. 146). Knowledge and skills refers to those required to lead an organization or coalition and implement interventions (Chinman et al., 2005). Resources comprise the ability to locate, marshal, and direct assets, as well as those inputs (e.g., staff) required to implement quality health promotion programming (Chinman et al., 2005). Power
refers to evidence of empowerment or collective efficacy (Chinman et al., 2005). Within each category, factors that made it easy or difficult to implement the program were distinguished. Each of these aspects has a demonstrated relationship with coalitions' ability to address community-based issues successfully (Mattessich, Murray-Close, & Monsey, 2001). Knowing the capacity required to implement and sustain VSS in Sarasota county was instrumental in identifying the means of creating a program that could be implemented seamlessly.

3.3. Sources/participants

The community, in this case, was defined as the individuals and organizations involved with implementing and sustaining the VSS program. These parties included members of the obesity prevention coalition (of which the VSS program was a part), including members of the coalition executive committee, and local vendors that provided action outlets. A range of 10–22 coalition members participated in 17 coalition meetings focusing on the planning of the VSS program and numerous activities throughout the county. Of these 17 meetings, 7 were about the VSS program and other coalition business, 4 were specific to VSS, and 6
centered on planning the Grand Finale event. Thirty-four vendors participated in the program. Secondary sources of information included program documents and field notes obtained from three FPRC interns.

3.4. Measurement

A combination of instruments and methods contributed to capacity measurement, including the Wilder Collaboration Factors Inventory (Mattessich et al., 2001), interview guides, field notes taken by program interns, and documents generated through coalition activities. General community capacity (i.e., the capacity of the coalition to address health issues and facilitate the prevention process) as well as community capacity specific to the implementation and sustainability of the VSS program was assessed.

The Wilder Collaboration Factors Inventory (Mattessich et al., 2001) was selected to measure general community capacity. The inventory consists of 40 Likert-type items grouped into 20 factors associated with successful collaboration (e.g., open and frequent communication). The 20 factors were identified through a meta-analysis of case studies of previous successful collaborations (Mattessich et al., 2001). The inventory was not designed to provide a total collaboration score, but rather descriptive factor scores for collaborative groups (e.g., coalitions) to use as "starting points for discussion" and markers of functioning (Mattessich et al., 2001, p. 35). Factor scores consist of the average rating for each factor (i.e., the total of all ratings for a factor's items divided by the total number of ratings) and are used to identify strengths (i.e., factors with an average score of 4.00 or higher), borderline areas that may need attention (i.e., factors with an average score between 3.00 and 3.99), and areas of concern (i.e., factors with an average score below 3.00). Because of the nature and purpose of the inventory, other psychometric research has not been conducted (Mattessich et al., 2001).

To measure community capacity specific to the implementation and sustainability of the VSS program, interview guides were developed for members of the coalition and for vendors. Capacity-related and sustainability-related items included in the interview guides are summarized in Table 2. Performance indicators were developed for each capacity-related and sustainability-related outcome included in the program logic model (see Fig. 1). Key indicators measured in this assessment corresponded to the framework discussed previously: the level of community, knowledge and skills, resources, and power required to implement and sustain the program (Chinman et al., 2005). Within each of these indicators, items were designed to answer the guiding evaluation questions: "What did it take to implement the program?" and "What would it take to sustain the program?" Emphasis was placed on identifying facilitators of and barriers to program implementation. In addition, the Belief that the Program Should be Sustained and considerations for Future
Implementations were measured. This information was collected for sustainability planning purposes (Chinman et al., 2004; Weiss et al., 2002). In addition to the questions from the interview guides, resource-related items from the Wilder Collaboration Factors Inventory were analyzed in relation to the qualitative findings. Field notes and program documentation (e.g., budget) also were analyzed for evidence of resources required to implement and sustain the VSS program.

As a part of the Getting to Outcomes process, which involves the adoption of evidence-based interventions, Chinman et al. (2004) provide a tool for assessing organizational capacity, which is similar to community capacity. Using their tool, Chinman et al. (2004) asked organizations to assess requirements for these types of capacity items to determine whether the existing capacity was sufficient, and to devise a plan to enhance capacity. For the present study, similar tables were developed for use in the FPRC capacity assessment (see Appendix: Capacity Evaluation Summary Tables). A table was created for each of Chinman et al.'s (2005) four core components of community capacity (see Appendix). Dimensions of each component, identified using the literature and expanded once data were analyzed, were summarized in the first column of each table, Capacity Assessment Dimension. Identified Implementation Requirements were summarized in the second column, and Sustainability Requirements were summarized in the third column. Finally, space for coalition members to record Program Planning Implications that emerged when reviewing and discussing the report, along with the tables, was provided in the fourth column. The completed tables (see Appendix) were provided to the coalition and are included in the "tool kit" distributed to other communities interested in implementing the program.

3.5. Analysis

A template analysis plan with multiple coders and an initial coding template was used to analyze data obtained from interviews and field notes (see Fig. 2 for the initial coding template). Partial transcripts of interviews were entered into a Microsoft Word table that could be sorted by participant ID number or question. Field notes were organized and distributed to evaluation team members using Microsoft Word. Evaluation team members reviewed data from each method and coded them independently. Evaluation team members then met to discuss necessary additions or revisions to the template. A second round of independent coding was conducted, after which team members discussed coding results and reconciled coding differences. Analysis results were then synthesized for each method. Results across methods were compared and contrasted, and a summary report was developed.

Inventory responses were entered into a Microsoft Excel spreadsheet. Descriptive statistics were calculated for each item, and scale scores were created for each scale using the formula presented by Mattessich et al. (2001).
Community: level of involvement (length of time, attendance frequency, contributions); range of involvement (i.e., diversity of organizations involved); participation in specific activities; maintenance of connections among people and institutions (see sustainability table); commitment to continue work started or supported under the initiative.

Knowledge and skills: knowledge required for program delivery; consensus building; goal setting; problem solving; ability to implement program (efficiently); ability to locate and allocate resources, funding; leadership (program champion).

Resources: funding; people power (staffing); time; equipment; materials; expertise; technical assistance; organizational support.

Power: shared vision; sense of ownership; strengths and barriers; perceived ability to effect change/address problem.

Notes: (1) Within each major code, the analysis team searched for initial implementation requirements and sustainability requirements. (2) The initial template was developed using delineations of capacity discussed in Chinman, Hannah, Wandersman, Ebener, Hunter, Imm, and Sheldon (2005) and sustainability discussed in Weiss, Coffman, and Bohan-Baker (2002). (3) The initial template was expanded to include codes that emerged from the data.

Fig. 2. Initial coding template.
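Analytically, the template in Fig. 2 functions as a nested codebook. The sketch below is our illustration of one way such a template could be represented in code (the evaluation team itself used Microsoft Word tables, not software): coded excerpts are filed under a sub-code and flagged as implementation or sustainability requirements, and new codes are appended as they emerge.

```python
# Illustrative only: a nested-codebook representation of the Fig. 2 template.
from collections import defaultdict

coding_template = {
    "Community": [
        "level of involvement", "range of involvement",
        "participation in specific activities",
        "maintenance of connections among people and institutions",
        "commitment to continue work started under the initiative",
    ],
    "Knowledge & Skills": [
        "knowledge required for program delivery", "consensus building",
        "goal setting", "problem solving", "ability to implement program",
        "ability to locate and allocate resources, funding",
        "leadership (program champion)",
    ],
    "Resources": ["funding", "people power (staffing)", "time", "equipment",
                  "materials", "expertise", "technical assistance",
                  "organizational support"],
    "Power": ["shared vision", "sense of ownership", "strengths and barriers",
              "perceived ability to effect change/address problem"],
}

# Coded excerpts keyed by (major code, sub-code, requirement type); per note 1
# in Fig. 2, each excerpt is classified as an implementation or a
# sustainability requirement. The excerpt text here is a hypothetical example.
excerpts = defaultdict(list)
excerpts[("Resources", "people power (staffing)", "sustainability")].append(
    "need more 'worker bees' to handle day-to-day activities")

# Per note 3 in Fig. 2, the template is expanded with emergent codes.
coding_template["Resources"].append("place for events")
```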
Graduate student assistants involved in the implementation of the program, but not in the capacity evaluation analysis process, reviewed analysis results and provided interpretation and verification support. In addition, coalition members participated in the evaluation process by providing interpretation and verification.
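The factor scoring described in Section 3.4 is simple enough to automate. The following sketch is ours, not the authors' software; the item-to-factor mapping and ratings are hypothetical. It applies the Mattessich et al. (2001) rule: average all ratings across a factor's items, then classify the factor as a strength (4.00 or higher), a borderline area (3.00–3.99), or an area of concern (below 3.00).

```python
# A minimal sketch of Wilder Collaboration Factors Inventory factor scoring.
# Assumptions: ratings are 1-5 Likert values; the items and responses below
# are hypothetical, not data from the Sarasota evaluation.
def factor_score(ratings_by_item: dict[str, list[int]]) -> float:
    """Total of all ratings across a factor's items divided by rating count."""
    all_ratings = [r for ratings in ratings_by_item.values() for r in ratings]
    return sum(all_ratings) / len(all_ratings)

def classify(score: float) -> str:
    """Apply the strength / borderline / concern cutoffs."""
    if score >= 4.00:
        return "strength"
    if score >= 3.00:
        return "borderline area"
    return "area of concern"

# Hypothetical responses from 11 coalition members for a two-item factor.
factor = {"item_26": [4, 5, 4, 4, 3, 5, 4, 4, 5, 4, 4],
          "item_27": [3, 4, 4, 3, 4, 4, 3, 4, 4, 3, 4]}
score = factor_score(factor)
print(f"open and frequent communication: {score:.2f} -> {classify(score)}")
```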
4. Results

4.1. Capacity assessment

The FPRC spearheaded the VSS program capacity evaluation (year one) in the fall of 2005. The Wilder Collaboration Factors Inventory was administered to coalition members (n = 11) during one of the scheduled meetings. In addition, telephone interviews were conducted with members of the coalition executive committee (n = 10) and vendors (n = 13). FPRC staff analyzed primary and secondary data. A summary report was developed and provided to key stakeholders. In addition, the results of this evaluation were used to complete the VERB™ Summer Scorecard Program Capacity Tables (see Appendix), including capacities required to implement and sustain the program.

4.2. Program revision and sustainability planning

The capacity assessment was designed to produce information useful for planning continuation of the program (Chinman et al., 2004). Administration of the Wilder Collaboration Factors Inventory (Mattessich et al., 2001), combined with qualitative data (i.e., interviews, documents, field notes), revealed strengths (e.g., "history of collaboration or cooperation in the community"),
borderline areas (e.g., "members share a stake in both process and outcomes"), and areas of concern (e.g., "sufficient funds, staff, materials, and time"). Facilitators of and barriers to implementation (i.e., the Implementation Requirements column) were identified that were specific to community, knowledge and skills, resources, and power. For example, results from the first implementation year suggested the following facilitating factors: having people from various demographic regions of the county (community); knowledge and understanding of the community; expertise and technical assistance (resources); and a group united around the topic (power). Examples of barriers to implementation included lack of representation of the priority audience on the coalition (community), being a first-year program (knowledge), and not enough "worker bees" (resources). The Capacity Evaluation Summary Tables were designed to assist the coalition in program revision and sustainability planning (i.e., continued implementation), assuming the coalition and other key stakeholders agreed the program should be sustained.

Capacity evaluation results informed sustainability planning. The VSS program was popular in Sarasota county. Tweens and their families participated in local events and expressed satisfaction with the program. Most key informants (i.e., coalition members and partners) believed the VSS program should be continued in Sarasota county. (For indicators of satisfaction and effectiveness, see the Comprehensive Evaluation Report for the 2005 Sarasota County VERB™ Summer Scorecard Program, available from http://health.usf.edu/publichealth/prc/downloads.html.) The "Program Sustainability" columns of the Capacity Evaluation Summary Tables were completed using capacity evaluation results. Strategic planning recommendations (i.e., factors to address) were made that, if followed, would enable Sarasota county to plan for future implementations (Chinman et al., 2004). These recommendations (and others) were discussed during coalition meetings and influenced planning and implementation of the VSS program for 2006.

5. Discussion

This paper presented an example of methods to use in assessing the capacity needed to implement and sustain a community-based intervention. The strengths and weaknesses of the capacity assessment described herein should be considered prior to adoption. Strengths include the engagement of community partners in the evaluation process, the use of a theoretical framework to guide data collection and share results with current and future program adopters, the use of mixed methods (i.e., interviews, observation, and inventory), and the collection of practical information that could inform program revisions and sustainability planning and could be communicated to other potential program adopters. Concomitant weaknesses include the failure to construct a more direct measure of program buy-in, the participation
of only one community, the inconsistent attendance by coalition members, and the modest cooperation of vendors. Future assessments should seek to improve on the methods presented herein by building on the strengths of this approach and addressing the weaknesses.

This case study offers insight in the form of "lessons learned" about operationalizing and evaluating capacity and the potential for program sustainability within a community. The following general capacity evaluation steps are suggested: (1) select a capacity level (i.e., individual, organizational, community), (2) define the unit of measurement, (3) pose research questions, (4) select or develop measures to address each research question, (5) specify data collection procedures and collect data, (6) analyze data, (7) share results, and (8) partner with community members at each step in the process.

The information derived from direct experience is contained in the capacity tables and systematizes the program planning and revision process. Additionally, the tables become tools for sharing implementation and sustainability requirements with potential adopters. This "technology transfer" alters the process of program implementation and refinement from one that is merely of academic and internal interest to one that is applied, practical, and portable. The sharing of capacity and sustainability elements is necessary if communities are to learn from one another (Brownson et al., 2006). Completed capacity tables enable the transfer of experiential data regarding implementation and sustainability requirements for the program (in this case, VSS) so as to permit potential adopters to gauge the match between the capacity required and their existing local capacity. Sharing this practical information with new adopters not only fosters advance consideration of whether existing local capacity supports program requirements, but also acts as a catalyst for brainstorming potential funding sources, community leaders, and organizations (Dino et al., 2006).

In retrospect, the inclusion of a more direct measure of program buy-in would have been an important evaluation component. Pluye, Potvin, and Denis (2004) and Pluye, Potvin, Denis et al. (2004) argue that coalitions consist of organizational structures, and that programs may be sustained in organizations. Evidence of sustainability then occurs when the program is "routinized" within coalition-related organizations (Pluye, Potvin, & Denis, 2004, p. 123; Pluye, Potvin, Denis et al., 2004). It is difficult to say whether the VSS has become routinized in Sarasota county. Evidence suggests that, rather than becoming housed at the coalition level, VSS has instead become housed within specific agencies of the coalition. Without championship and buy-in from these agencies at the coalition level, continued implementation of the VSS program in Sarasota county may be tenuous. This finding relates to individual coalition member buy-in. Even though the extent to which members shared a stake in both the process and outcome (at the coalition level, not the program
level) and the extent to which Sarasota county felt a sense of ownership of the VSS program were measured, individual coalition member and vendor buy-in to the VSS program was not assessed.

The fluidity of coalition attendance and the modest cooperation of vendors in the evaluation process made it challenging to obtain participation in the capacity assessment. Whereas most coalition executive committee members participated in interviews, only about half of coalition members attended the meeting at which the Wilder Collaboration Factors Inventory was administered. Only 38% of vendors who participated in the VSS program took part in capacity interviews. Many vendors were too busy during the workday to participate in an interview, and non-business telephone numbers were typically unavailable for contacts outside of the workday. In retrospect, providing vendors with an opportunity to complete an Internet survey at their convenience may have increased participation rates. Although efforts were made to obtain a representative sample of key informants (i.e., those with high and low levels of program participation), criteria for being interviewed included having devoted at least a moderate amount of time to planning and implementing the VSS program. Thus, whereas participants were able to speak to the capacity required to implement and sustain the VSS program, an incomplete portrait of what was required and deemed valuable may have been obtained.

Further limiting generalizability is the fact that the capacity assessment was conducted within only one implementation site, Sarasota county, which may differ from other sites. Moreover, Sarasota County is not an ethnically diverse community; thus, program success and determinants of community capacity may be better or worse in other venues where there is a broader representation of ethnic groups. These issues notwithstanding, the manner in which capacity for implementation and sustainability was determined, with the consent and participation of a locally derived coalition, is a strength of this study from which other communities can learn.
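As a worked illustration of the capacity-matching idea discussed above, the sketch below (hypothetical structure, with entries loosely paraphrased from the Appendix tables) shows how a potential adopter might compare a program's documented capacity requirements against a self-assessment of existing local capacity and flag gaps to address before adoption.

```python
# Illustrative only: comparing documented program requirements (drawn from
# completed capacity tables) against a new community's self-assessed capacity.
# Dimensions and requirement text here are hypothetical examples.
required = {
    ("Resources", "funding"): "approx. $65,000 plus donations and in-kind",
    ("Resources", "people power"): "dedicated staff plus 'worker bees'",
    ("Community", "range of involvement"): "key community players involved",
    ("Power", "program champion"): "spokesperson to champion the program",
}

# The adopting community's self-assessment: dimensions it believes it covers.
existing = {
    ("Resources", "funding"),
    ("Community", "range of involvement"),
}

# Any required dimension not covered locally is a gap to plan around.
gaps = {dim: req for dim, req in required.items() if dim not in existing}
for (component, dimension), requirement in sorted(gaps.items()):
    print(f"GAP - {component}/{dimension}: {requirement}")
```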
6. Conclusions

Participatory, community-based public health coalitions offer much to learn about the implementation and sustainability of local, evidence-based interventions because of their presence within the realm of practice. The VSS capacity evaluation (year one) has been conducted, and the results were used for program revision and sustainability planning. The second-year capacity evaluation is underway to coincide with the post-2006 VSS program implementation. Year two focuses on program revision, continued fit between the program and community needs (Chinman et al., 2004), and changes in community capacity across time. Implementation results from year one suggest that the VSS program was successfully implemented in Sarasota county. Evaluation results support a conclusion that the program shows promise in getting tweens to try new physical activities. The capacity evaluation framework and tables presented represent one way to facilitate adoption, examine implementation, enhance sustainability, and disseminate an innovative community-based intervention to new sites (Dino et al., 2006).

Acknowledgments

The authors would like to thank the members of the Sarasota County Obesity Prevention Coalition. This publication was supported by Cooperative Agreement Number 1-U48-DP-000062, funded by the Centers for Disease Control and Prevention, Prevention Research Centers Program. Its contents are solely the responsibility of the authors and do not necessarily represent the official views of the Centers for Disease Control and Prevention.

Appendix

Capacity evaluation summary tables for the Sarasota county VERB™ Summer Scorecard program are given in Tables A1–A4.
Table A1
Community

(In each appendix table, the fourth column, Program planning implications, was left blank for coalition use.)

Commitment to continue work started or supported under the initiative
Implementation requirements: Most coalition members and vendors stated the VSS should be continued; most vendors interviewed stated they would continue to offer free or low cost physical activities to youth.
Sustainability requirements: There was a need for more "worker bees" (people to do day-to-day activities).

Level of involvement (length of time, attendance frequency, contributions)
Implementation requirements: Varied across agencies, with a few agencies bearing the weight of implementation tasks; agencies involved in the kick-off and Grand Finale were the most involved.
Sustainability requirements: Greater involvement across agencies represented on the coalition.

Maintenance of connections among people and institutions
Implementation requirements: Few coalition members communicated with vendors.
Sustainability requirements: More and ongoing contact with vendors; coalition members need to be able to "sell" the program to vendors.

Participation in specific activities
Implementation requirements: Many coalition members assisted with program planning.
Sustainability requirements: More participation and follow-through is needed from more coalition members, especially for activities "on the ground."

Range of involvement (i.e., diversity of organizations involved)
Implementation requirements: Key players in the community were involved.
Sustainability requirements: Greater community involvement was desired.
Table A2
Knowledge and skills

Ability to implement program (efficiently)
Implementation requirements: Handling media; obtaining and retaining vendor involvement; public service skills.

Ability to locate resources, funding
Implementation requirements: Finding people to help; how to work with systems serving youth; locating equipment for physical activity (e.g., jump ropes for the library).
Sustainability requirements: Additional sources of funding and resources.

Communication
Implementation requirements: People skills; diplomacy; ability to sell the program to others; ability to explain the program to vendors "from the heart."
Sustainability requirements: More and ongoing communication with vendors; more communication of program goals to tweens in the schools.

Consensus building
Implementation requirements: Collaboration; being a team player.

Fund raising
Implementation requirements: Getting sponsorships; interacting with the business community.

Goal setting
Implementation requirements: Identifying coalition goals; identifying program goals.

Knowledge required for program delivery
Implementation requirements: A successful community example; social marketing (especially audience segmentation); program theory (logic model); costs; the community; key players in the community; coalition commitment; VERB™ at the national level; individuals' strengths and skills; what to expect from vendors and hosting locations.
Sustainability requirements: Timing considerations; individual roles and responsibilities.

Leadership (program champion and mechanism for continued leadership)
Implementation requirements: Limited number of agencies; motivation; energy; enthusiasm.
Sustainability requirements: Need fewer leaders and more "worker bees."

Networking and recruitment
Implementation requirements: Businesses/vendors; youth (e.g., youth at risk); coalition members.
Sustainability requirements: Need to recruit a broader base of community coalition participation, including parents, youth, and businesses.

Organization
Implementation requirements: Event planning and coordination; youth board.

Problem solving
Implementation requirements: Making materials youth friendly; brainstorming problems that might occur at events (e.g., space constraints, handling parents, supervising youth); flexibility.
Sustainability requirements: Improving the award distribution system for the Grand Finale and other events.

Social marketing
Implementation requirements: Formative research with youth and parents; evaluation; promotion (e.g., high school PSA).
Sustainability requirements: Formative research with parents.

Technical skills
Implementation requirements: Website development; graphic artist; printing.
Sustainability requirements: Webmaster to maintain the site.

Training
Implementation requirements: Youth researchers; vendors.
Table A3
Resources

Belief/enthusiasm/team spirit
Implementation requirements: Belief in the program and ability to spread enthusiasm to others.

Commitment
Implementation requirements: Vendors; youth board; coalition members.
Sustainability requirements: More active youth board; need someone committed to working with youth over the course of the program.

Communication
Implementation requirements: Frequent and regular communication; use of meetings, emails, calls, and so on; high school youth.
Sustainability requirements: More and ongoing communication with vendors; more communication of program goals to tweens in the schools.

Equipment
Implementation requirements: Printing; physical activity equipment (e.g., jump ropes).

Expertise
Implementation requirements: Graphics and printing; social marketing; evaluation.
Sustainability requirements: More formative research from parents; more and earlier outcome evaluation data.

Funding
Implementation requirements: $65,253, not including donations and in-kind contributions (see budget).
Sustainability requirements: "More" (exact amount TBD); a less complicated funding system.

Knowledge
Implementation requirements: Of the community; of youth and families.

Leadership/program champion
Implementation requirements: Community spokesperson; Youth Board (high school youth promote to tweens).

Materials
Implementation requirements: Promotional scorecard; posters for vendors; stickers for vendors; giveaways.
Sustainability requirements: More giveaways and prizes for youth; more stickers for vendors (some ran out); sign-in sheets for vendors' use may be helpful.

Media coverage
Implementation requirements: See table.
Sustainability requirements: Marketing, earlier and more; utilize high schools.

Organizational support
Implementation requirements: Major organizations in the community represented on the coalition.
Sustainability requirements: Broader program implementation base; greater buy-in from community vendors and schools.

People power (staffing)
Implementation requirements: Sarasota county health department and parks and recreation staff provided most person power; high school PSA development.
Sustainability requirements: More equal distribution of labor across coalition members/agencies; dedicated staff person; evaluation committee comprising coalition and community members.

Place for events
Implementation requirements: Facilities for holding events were key to offering free or low cost activities.
Sustainability requirements: Combining the program with existing programs, such as beach runs and summer camps; consider non-traditional locations for physical activity, such as libraries; more events needed closer to south county.

Support
Implementation requirements: Lexington's notebook and one-on-one contact; secretarial support.

Technical assistance
Implementation requirements: Web site; graphics.
Sustainability requirements: Use vendor training guide; having a webmaster.

Time
Implementation requirements: Coalition members (a "tremendous amount of person hours").
Sustainability requirements: More start-up time; more devoted time from some members.

Transportation
Implementation requirements: SCAT bus availability.
Sustainability requirements: Bus service functional in the evenings for events; hold events where youth already are.

Working relationships
Implementation requirements: Youth serving organizations; schools; universities; community partners.
Sustainability requirements: More free events; promotion of the program in summer camps.
Table A4
Power

Groundwork
Implementation requirements: The pilot laid the groundwork for community-based obesity prevention in Sarasota County.
Sustainability requirements: Build on success; learn from evaluation results and experiences.

Media
Sustainability requirements: Need more media coverage to influence the community.

Perceived ability to effect change/address problem
Implementation requirements: Medium to strong on the part of coalition members and vendors.
Sustainability requirements: Share evaluation results with vendors (some doubt as to their impact on outcomes).

Program champion
Sustainability requirements: Need someone to champion or be a spokesperson for the program.

Reaching the target audience
Implementation requirements: Power to reach those at risk for obesity.
Sustainability requirements: Improved tactics for reaching those youth who are not already physically active.

Sense of ownership
Implementation requirements: Present at least within individual agencies; depended on level of involvement, which varied across agencies and individuals.
Sustainability requirements: More buy-in across coalition members and the community (e.g., youth, parents, community members), especially involvement in field work (e.g., evaluation activities).

Shared vision
Implementation requirements: Strong desire to address obesity in Sarasota county.
Sustainability requirements: A processing meeting to assess evaluation results and agree upon program modifications.

Strengths and barriers
Implementation requirements: Designation of coalition partners' responsibilities (see table).
Sustainability requirements: A system for follow-through and responsibility; build on the strengths and weaknesses summarized in the table.
References

Brownson, R. C., Kreuter, M. W., Arrington, B. A., & True, W. R. (2006). Translating scientific discoveries into public health action: How can schools of public health move us forward? Public Health Reports, 121, 97–103.

Bryant, C. A., McCormack Brown, K. R., McDermott, R. J., Forthofer, M. S., Bumpus, E. C., Calkins, S. A., et al. (2007). Community-based prevention marketing: Organizing a community for health behavior intervention. Health Promotion Practice, 8, 154–163.

Cassidy, E. F., & Leviton, L. C. (2006). The relationships of program and organizational capacity to program sustainability: What helps programs survive? Evaluation and Program Planning, 29, 149–152.

Chinman, M., Hannah, G., Wandersman, A., Ebener, P., Hunter, S. B., Imm, P., et al. (2005). Developing a community science research agenda for building community capacity for effective preventive interventions. American Journal of Community Psychology, 35(3/4), 143–157.

Chinman, M., Imm, P., & Wandersman, A. (2004). Getting to outcomes: Promoting accountability through methods and tools for planning, implementation, and evaluation. Santa Monica, CA: RAND Corporation. Available from http://www.rand.org/pubs/technical_reports/TR101/.

Cohen, D. A., Finch, B. K., Bower, A., & Sastry, N. (2006). Collective efficacy and obesity: The potential influence of social factors on health. Social Science & Medicine, 62, 769–778.

Courtney, A., Florida Prevention Research Center, & VERB Partnership Team. (2006). Designing a successful VERB Scorecard campaign in your community. Atlanta, GA: Department of Health and Human Services, Centers for Disease Control and Prevention.

Dino, G., Eidson, P., Gray, B., Harris, J., Keller, H., McVay, J., et al. (2006). Summary report. Taking research to practice: An exploratory meeting. Atlanta, GA: Department of Health and Human Services, Centers for Disease Control and Prevention.

Green, L. W. (2006). Public health asks of systems science: To advance our evidence-based practice, can you help us get more practice-based evidence? American Journal of Public Health, 96, 406–409.

Huhman, M., Heitzler, C., & Wong, F. (2004). The VERB campaign logic model: A tool for planning and evaluation. Preventing Chronic Disease, 1(3). Available from www.cdc.gov/pcd/issues/2004/jul/04_0033.htm.

Johnson, K., Hays, C., Center, H., & Daley, C. (2004). Building capacity and sustainable prevention innovations: A sustainability planning model. Evaluation and Program Planning, 27, 135–149.

Mattessich, P. W., Murray-Close, M., & Monsey, B. R. (2001). Collaboration: What makes it work. A review of research literature on
factors influencing successful collaboration. Saint Paul, MN: Amherst H. Wilder Foundation.

Minkler, M., & Wallerstein, N. (2002). Improving health through community organization and community building. In M. Minkler, B. K. Rimer, & F. M. Lewis (Eds.), Health behavior and health education (3rd ed., pp. 279–311). San Francisco, CA: Jossey-Bass.

Pluye, P., Potvin, L., & Denis, J. L. (2004). Making public health programs last: Conceptualizing sustainability. Evaluation and Program Planning, 27(2), 121–134.

Pluye, P., Potvin, L., Denis, J. L., & Pelletier, J. (2004). Program sustainability: Focus on organizational routines. Health Promotion International, 19(4), 489–500.

Pluye, P., Potvin, L., Denis, J.-L., Pelletier, J., & Mannoni, C. (2005). Program sustainability begins with the first events. Evaluation and Program Planning, 28(2), 123–137.

Weiss, H., Coffman, J., & Bohan-Baker, M. (2002). Evaluation's role in supporting initiative sustainability. Cambridge, MA: Harvard Family Research Project, Harvard University Graduate School of Education.

Moya L. Alfonso is a Research Assistant Professor at the Florida Prevention Research Center and Senior Research Coordinator for the Center for Social Marketing at the University of South Florida (USF) College of Public Health.

Jen Nickelson is a doctoral student in the Department of Community and Family Health at the USF College of Public Health and a Graduate Research Associate for the Florida Prevention Research Center at the USF College of Public Health.

David L. Hogeboom is Coordinator of Research Programs and Services for Academic Affairs and for the Florida Prevention Research Center at the USF College of Public Health.

Jennifer French is a Healthy Living Coordinator for the Sarasota County Health Department.

Carol A. Bryant is a Professor in the Department of Community and Family Health and Co-Director of the Florida Prevention Research Center at the USF College of Public Health.
Robert J. McDermott is Professor in the Department of Community and Family Health and Co-Director of the Florida Prevention Research Center at the USF College of Public Health.
Julie A. Baldwin is a Professor in the Department of Community and Family Health and the Co-Director of the Research Methods and Evaluation Unit of the Florida Prevention Research Center at the USF College of Public Health.