The context of program implementation and evaluation: A pilot study of interorganizational differences to improve child welfare reform efforts


Children and Youth Services Review 33 (2011) 2273–2281


Thomas M. Crea a,⁎, David S. Crampton b

a Graduate School of Social Work, Boston College, 140 Commonwealth Ave., McGuinn 302, Chestnut Hill, MA 02467, United States
b Mandel School of Applied Social Sciences, Case Western Reserve University, 10900 Euclid Ave., Cleveland, OH 44106, United States

Article info

Article history: Received 21 April 2011; Received in revised form 14 July 2011; Accepted 15 July 2011; Available online 23 July 2011

Keywords: Program implementation; Organizational context; Child welfare reform

Abstract

This article contributes to the growing literature on evaluation and implementation science by examining the interaction between staff perceptions of organizational strength and perceptions and indicators of program fidelity. As part of a pilot project related to the evaluation of the Family to Family initiative, a survey was distributed to employees within two urban child welfare agencies, with a total of 410 respondents across both sites, for a combined response rate of 72.2%. Survey results were analyzed both in terms of respondents' perceptions of their agency and in relation to measures of program performance and workload. Multivariate models show that organizational indicators are the most significant and positive predictors of perceived program implementation. Specifically, staff who positively perceived the availability of information within their agency also believed that the programs were well implemented in their agency. These findings suggest that as the value of program changes is articulated within an organization, the implementation of the initiative is perceived to improve.

© 2011 Elsevier Ltd. All rights reserved.

1. Introduction

Evaluation researchers have recently paid increased attention to contextual influences on program implementation. This focus highlights a growing awareness of the influence of organizational factors on how a program is implemented at a particular site. The extent to which context interacts with program activities has been termed the “intervention–context interface” (Bissett, Daniel, & Potvin, 2009, p. 554), in which program implementation implies a “process of social change” (Berwick, 2008, p. 1183) within an organization. This process of implementation suggests both that programs will be adapted to fit specific contexts, and that evaluation approaches will likewise adapt to capture the interactive nature of programs and their contexts (Glasgow, Lichtenstein, & Marcus, 2003). Yet program implementation research has traditionally examined fidelity and program drift without a simultaneous exploration of contextual influences on implementation (Bissett et al., 2009).

The purpose of this study is to present findings from a pilot study in two sites, which explored the interaction of organizational factors and indicators of program fidelity of the Family to Family initiative, sponsored by the Annie E. Casey Foundation. Begun in 1992, Family to Family has been implemented in over 60 child welfare agencies across 17 states.

⁎ Corresponding author. Tel.: +1 617 552 0813. E-mail addresses: [email protected] (T.M. Crea), [email protected] (D.S. Crampton). doi:10.1016/j.childyouth.2011.07.012

Previous studies have examined aspects of Family to Family implementation, particularly with regard to model fidelity and variability (Crampton, Crea, Abramson-Madden, & Usher, 2008; Crea, Crampton, Abramson-Madden, & Usher, 2008; Crea, Usher, & Wildfire, 2009) and the extent to which practices influence proximal and distal outcomes (Crea et al., 2009; Usher, Wildfire, Webster, & Crampton, 2010). These studies included analyses of administrative data as well as qualitative explorations of contextual factors. The current study extends this body of work by pursuing a quantitative analysis of contextual factors and how these factors are related to indicators of Family to Family implementation in two large urban child welfare agencies, each with a multi-year history of pursuing the initiative.

2. The context of program implementation and evaluation

The extent to which programs are implemented as intended depends in large part on contextual differences that result in, or may necessitate, changes in how these models are practiced. These differences should be captured in implementation studies, and multiple methods exist to capture important indicators of program fidelity. Yet these studies are subject to the same organizational influences that may govern how data are collected and how findings are used to improve practice. The following sections explore the interrelationship among organizational context, the measurement of program fidelity, and how data are used to inform practice and organizational learning.


2.1. Organizational context, program implementation, and organizational learning

Researchers have long acknowledged the importance of measuring program fidelity – the extent to which practices adhere to program design – in interpreting outcomes (Elmore, 1980; Lipsky, 1980). Identifying program processes and indicators of implementation is increasingly seen as critical to identifying the success or failure of an intervention (Kalafat, Illback, & Sanders, 2007). Recently, scholars have focused increased attention on the factors that impede or facilitate program implementation. In a study of therapist adherence to the Incredible Years Parenting Program, Stern, Alaggia, Watson, and Morton (2008) found a number of factors that contribute positively to adherence, including administrative and program structure, ongoing monitoring efforts by the agency, and strong supervisory and leadership skills. Similarly, strong leadership skills and supervisory support have been shown to influence service effectiveness, specifically in the context of child welfare service systems (Yoo & Brooks, 2005), supporting earlier findings that the context of service provision and measurement of outcomes are interdependent constructs (Usher, Wildfire, & Gibbs, 1999).

The study of organizational factors is often informed by the assumptions of the sociotechnical model of program implementation. That is, program implementation catalyzes a process of social change in an organization, such that (a) effective implementation requires a certain set of organizational attributes that are amenable to positive change (Glisson, 2007); and (b) implementing a program changes and expands these attributes in an organization (Berwick, 2008; Bissett et al., 2009). From the perspective of the sociotechnical model, adopting a program implies adapting this program to the unique context of the organization. Capturing the interaction of program implementation and organizational context, in turn, implies a program evaluation which measures both model fidelity and features of the organization.

Contextual factors within agencies may also determine how evaluations are conducted and how information is used. In an agency committed to organizational learning, administrators and staff integrate evaluation activities within the agency's structure, communications, and overall vision (Torres & Preskill, 2001). Even so, this integration requires intentional efforts by organizational leaders and evaluators to engage stakeholders in the evaluation process, such that findings are used to improve practice (Preskill, Zuckerman, & Matthews, 2003). These evaluation efforts may be facilitated by factors such as staff knowledge and support of evaluation, and a belief that evaluation is worth the effort required (Botcheva, White, & Huffman, 2002). Barriers to evaluation efforts may include lack of evaluator competence; negative staff attitudes and lack of trust towards evaluation efforts; inappropriate evaluation instruments and methodology; and other contextual factors such as lack of agency resources and staff turnover (Taut & Alkin, 2003). The available evidence suggests that the need for evaluation should be clearly communicated by agency leadership, such that evidence is integrated closely with efforts to improve practice.

The importance of organizational leadership in promoting a learning culture is well documented.
Organizational “leaders”, as distinct from “managers”, use evaluation data strategically to make administrative decisions that affect agency practice (Owen & Lambert, 1998). In a related vein, Schweigert (2006) expands the traditional definition of effectiveness beyond demonstrating causal linkages to outcomes, to increasing the “understanding of the dynamics of communities and interventions” and pursuing “well-informed accountability” (p. 417). It is the task of the organizational leader to promote integration of these evaluation efforts within the culture of the agency.

One paradigm of integrating evaluation and practice is the self-evaluation approach. Here, evaluators work closely with agency leadership and staff to establish and measure outcomes, monitor performance indicators, and provide a system of continuous feedback to refine programs and practices (Kum, Duncan, & Stewart, 2009; Usher, 1995). Self-evaluation is one of the four core strategies of the Family to Family initiative, and was designed to encourage greater collaboration among agency and community partners around the use of data to improve practice in child welfare (Webster, Needell, & Wildfire, 2002). Not surprisingly, efforts at self-evaluation are part of the larger organizational context and will succeed only in the environment of a learning organization (Taut, 2007). While many agencies across a number of disciplines actively pursue self-evaluation, little research appears to have been conducted on how organizational culture interacts with self-evaluation, or with the pursuit of implementing programs more generally.

2.2. Measuring organizational culture and program implementation

Multiple standardized measures of organizational culture have been created for use in mental health and juvenile justice systems (Glisson et al., 2008), health care organizations (Scott, Mannion, Davies, & Marshall, 2003), child welfare systems (Westbrook, Ellett, & Deweaver, 2009), and state governments (Lauderdale, 1999). Glisson and James (2002) differentiated the concepts of climate (employees' shared perceptions of the psychological impact of their work environment) and culture (the beliefs, expectations and assumptions shared by employees about how work is done within an organization). The combination of culture, climate, and work attitudes such as job satisfaction and commitment to the organization forms the overarching concept of organizational context (Glisson et al., 2008). While many standardized measures of organizational factors exist, these measures have not often been used in tandem with efforts to capture indicators of program fidelity.

Other studies examine fidelity indicators across sites and hint at the importance of organizational influences on implementation. Zvoch (2009) used teachers' repeated observations of a childhood literacy program across 100 classrooms and found that teacher and school characteristics significantly predicted greater program fidelity. Kalafat et al. (2007) examined implementation of a school-based family support program in twenty sites and found that high implementation predicted more positive outcomes. In an implementation study of a socio-educational program for immigrant families in Spain, Rego, Otero, and Moledo (2009) found variation in staff support for the program, as well as some difficulties implementing the program as intended. Previous studies of Family to Family implementation, the program which is the subject of the current study, found significant variability in implementation fidelity across sites based on administrative self-evaluation data in four sites (Crea et al., 2008; Crea et al., 2009). These studies also found that contextual influences such as administrative support, the presence of adequate resources, and even physical space played a significant role in determining how, and to what extent, Family to Family was pursued by agencies (Crampton et al., 2008; Crea et al., 2008). The current study expands past research on Family to Family by exploring how organizational factors are related to specific aspects of implementation at two sites.
This study relies on staff and supervisor perceptions of organizational factors and program implementation, as practitioner attitudes have been shown to be important components of program implementation (Tuchman & Sarasohn, 2011). The present study is guided by five research questions:

(1) What differences emerge between sites, and between supervisors and workers within sites, on dimensions and constructs of organizational excellence, as measured by the Survey of Organizational Excellence (SOE)?
(2) What differences emerge between sites, and between supervisors and workers within sites, on indicators and measures of the implementation of Family to Family, using a Survey of Family to Family Implementation?


(3) Which organizational dimensions predict the perceived strength of Family to Family implementation?
(4) How is perceived implementation of Family to Family related to administrative measures of implementation in these sites?
(5) How might patterns of perceived implementation and organizational excellence be related to external demands, measured by a monthly average of referrals and average numbers of children in care?

2.3. Family to Family

The Family to Family initiative seeks to improve outcomes for children and families involved in the child welfare system, guided by four principles: (1) A child's safety is paramount; (2) Children belong in families; (3) Families need strong communities; and (4) Public child welfare systems need partnerships with the community and with other systems to achieve strong outcomes for children (Annie E. Casey Foundation [AECF], n.d.). The goals of the initiative are to provide a neighborhood-based, culturally sensitive network of care located where children live; to reduce reliance on institutional or congregate care; to recruit adequate numbers of foster families for those children needing out-of-home placements; to promote a team approach that includes birth and foster families in decision-making; and to provide screening services that preserve the family where possible while attending to the needs of children.

Family to Family comprises four interrelated strategies which are thought to operate together synergistically, such that all strategies are mutually interdependent (Annie E. Casey Foundation, n.d.): (1) Team Decisionmaking (TDM), which involves the inclusion of birth family, relatives, and community members in the placement decision-making process; (2) Building Community Partnerships (BCP), which involves building and sustaining relationships among community leaders and organizations in neighborhoods with high rates of child welfare involvement, and promoting ongoing support for families once their involvement with the child welfare system has ended; (3) Resource Family Recruitment, Development, and Support (RDS), which involves recruiting and retaining foster and kinship families that can support children and families in their own communities; and (4) Self-Evaluation (SE), which involves collaboration among analysts, agency staff and community members to evaluate data on key outcomes in order to assess progress and needed changes in policy or practice.

Agencies implementing Family to Family dedicate a number of staff to full-time facilitation of TDM meetings and frequently charge one staff member with serving as the self-evaluation analyst. Agencies also receive frequent technical assistance for each of the core strategies, by telephone, through on-site training, and through in-person consultation at agencies.

3. Method

3.1. Measures

3.1.1. Survey of organizational excellence
The Survey of Organizational Excellence (SOE) was developed by researchers at the University of Texas at Austin and is currently used statewide by government agencies, including child welfare agencies, in Texas and Missouri on a biennial basis. The purpose of the survey is to assist “leadership by providing information about workforce issues impacting the quality of service ultimately delivered those served” (Landuyt, 2000, p. 2). Data from the survey are entered into a longitudinal database housed at the University of Texas, and analysis of these data is part of an overarching strategic plan in Texas and Missouri to identify strengths and weaknesses in state human service agencies.


After undergoing a number of iterations, the current SOE consists of 86 Likert-scale items, each with the following response options: (1) Strongly Disagree, (2) Disagree, (3) Feel Neutral, (4) Agree, (5) Strongly Agree, and (6) Don't Know/Not Applicable. These items are organized under 20 constructs within 5 dimensions, and the SOE has strong reliability and validity. The SOE dimensions and constructs are as follows: (1) Work Group (constructs: Supervisor Effectiveness; Fairness; Team Effectiveness; Diversity); (2) Accommodations (constructs: Fair Pay; Physical Environment; Benefits; Employment Development); (3) Organizational Features (constructs: Change Oriented; Goal Oriented; Holographic; Strategic; Quality); (4) Information (constructs: Internal; Availability; External); and (5) Personal (constructs: Job Satisfaction; Time and Stress; Burnout; Empowerment).

3.1.2. Survey of Family to Family implementation
The SOE also provides 20 Likert-scale items that can be tailored to a specific organization. These items were used to create a scale measuring the implementation of the 4 core strategies of the Family to Family initiative (TDM; RDS; SE; and BCP) as well as overall opinions of Family to Family as a guiding framework for child welfare practice (F2F; see Table 3 for individual items).

3.1.3. Administrative data
Agencies in Fresno and Louisville collect administrative data on TDM implementation. These data were used to explore differences and similarities in participant attendance, a key factor related to TDM implementation fidelity (see Fig. 1). Both agencies also publicly post the number of children in out-of-home care (OHC) per month; these figures were used to examine changes in OHC over time as an additional measure of contextual influences (see Fig. 2). Fresno also provides data regarding monthly referral rates, but comparable data were not available from Louisville.

3.2. Sample

In 2005, the Annie E. Casey Foundation decided that the Family to Family initiative might be more effective if it targeted its resources in a smaller number of sites (Batterson et al., 2007). Using evaluation tools with questions similar to the Family to Family implementation questions used in this study, Family to Family technical assistants rated the implementation of the Family to Family strategies in over 50 sites. Eleven sites were eventually selected for enhanced work from this assessment (Usher et al., 2010). Two of those eleven sites participated in this research project, each having a substantial history of pursuing the Family to Family initiative: Jefferson County (Louisville), Kentucky Department of Community-Based Services (DCBS) began implementing Family to Family in March of 2002, and Fresno County, California Department of Children and Family Services (DCFS) began implementation in April of 2003. These two sites were selected because they have similar periods of time implementing Family to Family and were judged to be among the most successful Family to Family sites in the country. While the two study sites are very different demographically and geographically, both have succeeded in implementation; a comparison between the two may therefore identify common characteristics of successful implementation that can occur in very different organizations.
The SOE–F2F survey was administered by the first author over the course of two days at each of these sites, in December 2008 and in March 2009, respectively. At each site, administrative staff members helped coordinate scheduling of times for respondents to complete the survey on site, and posted flyers and other reminders of the survey prior to the data collection effort. Meals were provided as an incentive for respondents to complete the survey.


Of the 284 employees invited in Louisville, 181 completed the survey for a response rate of 63.7%. Of the 284 employees invited in Fresno, 229 completed the survey for a response rate of 80.6%. Combined, the 410 respondents across both sites comprised a response rate of 72.2%.

3.2.1. Analysis
Differences in demographic characteristics were tested using chi-square tests (see Table 1). Mean differences were tested between caseworkers and supervisors on the dimensions and constructs of the Survey of Organizational Excellence (SOE), within sites and between sites, using independent samples t-tests (see Table 2). Mean differences were also tested for Family to Family implementation indicators, between supervisors and workers within sites, and for mean differences between sites, using independent samples t-tests (see Table 3). Five linear regression models were specified, predicting perceived implementation of each of the four core Family to Family strategies (TDM; RDS; SE; BCP) and perceptions of overall Family to Family implementation (see Table 4).
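To make these comparisons concrete, the following is a minimal sketch, assuming a hypothetical survey extract and column names (none taken from the study's actual files), of how the chi-square and independent samples t-tests described above might be computed in Python.

```python
# Minimal sketch of the bivariate tests described above (hypothetical data file
# and column names): chi-square tests for demographic differences between sites
# and independent samples t-tests for SOE scores by site and by job position.
import pandas as pd
from scipy import stats

df = pd.read_csv("soe_f2f_responses.csv")  # hypothetical survey extract

# Chi-square test of a demographic characteristic by site (as in Table 1)
race_by_site = pd.crosstab(df["site"], df["race"])
chi2, p, dof, _ = stats.chi2_contingency(race_by_site)
print(f"Race by site: chi2 = {chi2:.2f}, df = {dof}, p = {p:.3f}")

# Independent samples t-test of an SOE dimension score between sites (Table 2)
louisville = df.loc[df["site"] == "Louisville", "soe_information"].dropna()
fresno = df.loc[df["site"] == "Fresno", "soe_information"].dropna()
t, p = stats.ttest_ind(louisville, fresno)
print(f"Information dimension, Louisville vs. Fresno: t = {t:.2f}, p = {p:.3f}")

# The same comparison within one site, caseworkers vs. supervisors
lou = df[df["site"] == "Louisville"]
t, p = stats.ttest_ind(
    lou.loc[lou["position"] == "caseworker", "soe_information"].dropna(),
    lou.loc[lou["position"] == "supervisor", "soe_information"].dropna(),
)
print(f"Louisville caseworkers vs. supervisors: t = {t:.2f}, p = {p:.3f}")
```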

Table 1
Respondent characteristics (N = 410)1. Values are percentages of respondents at each site.

                                      Louisville (n = 181)   Fresno (n = 229)
Gender
  Female                              68.5                   72.9
  Male                                14.4                   15.7
  Missing                             17.1                   11.4
Age
  29 or younger                       22.7                   15.3
  30–39                               30.4                   38.8
  40–49                               22.7                   16.6
  50 or older                         20.4                   26.2
  Missing                              3.8                    3.1
Race **
  African-American/Black              28.7                   10.0
  Hispanic/Mexican-American            0.6                   32.4
  Anglo-American/White                48.6                   26.2
  Asian-American/Native American       0.0                    5.2
  Multiracial/Other                    2.8                    5.7
  Missing                             19.3                   20.5
Hours per week **
  Less than 20                         0.0                    2.6
  20–39                               53.0                    1.7
  40 or more                          42.0                   93.4
  Missing                              5.0                    2.3
Job position
  Child welfare worker                76.8                   81.6
  Supervisor                          18.8                   15.3
  Missing                              4.4                    3.1
Years of service at organization
  0–2                                 22.1                   16.2
  3–5                                 17.1                   17.5
  6–10                                17.1                   23.6
  11–15                               16.0                   18.7
  16+                                 20.5                   19.2
  Missing                              7.2                    4.8
Education
  Bachelor's, or below                48.6                   47.2
  Master's, or above                  35.4                   38.4
  Missing                             16.0                   14.4
Salary
  $0–$35,000                          37.0                    7.4
  $35,001–$45,000                     33.7                   17.4
  $45,001–$50,000                     11.6                   19.2
  $50,001–$60,000                     11.6                   27.1
  $60,001 or higher                    1.7                   24.5
  Missing                              4.4                    4.4
Primary service area
  Intake/Investigation/Assessment     33.7                   38.9
  Ongoing                             32.6                   25.8
  Permanency                          23.2                   26.6
  Missing                             10.5                    8.7

1 Independent samples t-tests; ** p < 0.01.

Predictors included all dimensions of the SOE (Work Group; Accommodations; Organizational Features; Information; Personal), controlling for the following respondent characteristics: hours worked per week; supervisor (v. caseworker); age (categorized); service area (Intake/Investigation and Ongoing, v. Permanency); receipt of a merit increase in the last year (yes v. no); plans to work at the same agency in 2 years (yes v. no); salary (categorized); and site (Fresno v. Louisville).

As part of the full evaluation of Family to Family completed in 2009, site profiles were completed for each of the anchor sites (defined as those sites demonstrating the fullest commitment to implementing the initiative; Usher et al., 2010). These profiles included information and quantitative data on implementation of each of the four core strategies of Family to Family over time within each anchor site. For the purposes of this study, data were drawn from the Jefferson County, Kentucky (Louisville) and Fresno County, California (Fresno) site profiles to create figures comparing these sites' experiences in implementing Team Decisionmaking (TDM), the core strategy with the most complete administrative data between the sites (see Fig. 1). These comparisons included attendance of TDM meetings by community representatives (CR), service providers (SR), and family members or friends (FMF), and whether meetings were held in a community location (CL). These comparisons were completed for both Removal TDMs (meetings held to determine whether a child should be removed from his or her birth home) and Change of Placement TDMs (meetings held to determine whether a child should change placements once in foster care).

An additional analysis examined the monthly patterns of children in out-of-home care (OHC) and referrals to the child welfare agency for investigation, for which each agency collects monthly data (see Fig. 2). The purpose of this analysis is to create an approximate measure of the workload demands on the agency on a monthly basis, as an additional means of examining influences on organizational dynamics. To make meaningful comparisons between the two agencies, the time frame used for this analysis spanned July 1, 2008 to June 30, 2009; this window also covered the SOE–F2F data collection efforts in each site. The average number of children in OHC was calculated over these 12 months and subtracted from the actual number of children in care per month, such that Fig. 2 shows the extent to which monthly numbers exceeded or fell below the mean. This same analysis was conducted for the number of referrals to the agency for Fresno only; monthly averages were not available for Louisville.
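As an illustration of the regression analysis described above, the following is a minimal sketch, assuming hypothetical variable names, of how one of the five models (perceived TDM implementation regressed on the five SOE dimensions plus respondent controls and site) might be specified with pandas and statsmodels; it is not the authors' actual code.

```python
# Minimal sketch (hypothetical column names) of one of the five OLS models:
# perceived TDM implementation regressed on the five SOE dimensions,
# controlling for respondent characteristics and site.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("soe_f2f_responses.csv")  # hypothetical survey extract

formula = (
    "tdm_score ~ work_group + accommodations + org_features + information "
    "+ personal + hours_per_week + supervisor + C(age_group) "
    "+ C(service_area, Treatment(reference='Permanency')) "
    "+ merit_increase + plan_to_stay + C(salary_band) + C(site)"
)
model = smf.ols(formula, data=df).fit()
print(model.summary())   # B, SE, and p-values for each predictor
print(model.rsquared)    # variance explained, as reported in Table 4
```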

4. Results

Some demographic differences emerged between the two sites. Race of respondents significantly differed between the two sites (p < 0.01), with more White and Black respondents in Louisville and more Hispanic/Mexican-American respondents in Fresno. Hours per week also differed (p < 0.01), with over half of Louisville respondents reporting working between 20 and 39 hours per week, and 93.4% of Fresno respondents working 40 or more hours per week (see Table 1).

Respondents in Louisville rated organizational excellence more highly than respondents in Fresno on many dimensions and constructs; nearly all of these differences were statistically significant (p < 0.05; see Table 2). Overall internal consistency was high, with α = 0.93 for the full SOE instrument. The biggest differences between the two sites appeared to be in the Work Group, Organizational Features, and Personal dimensions, where respondents in Louisville rated all dimensions and related constructs more highly (p < 0.05 for all comparisons). Louisville respondents also rated the Accommodations dimension higher (p < 0.05), along with some of its constructs, although respondents in Fresno rated their pay as being more fair (p < 0.05). There were no significant cross-site differences for the constructs of Benefits and Availability of Information.


Table 2
Survey of Organizational Excellence (SOE) dimensions and constructs1. Values are M (SD).

                                         Louisville                                                Fresno
                                         Total (n=181)   Worker (n=139)   Supervisor (n=34)        Total (n=229)   Worker (n=187)   Supervisor (n=35)
Work group
  Supervisor effectiveness a,b           21.1 (5.3)      20.5 (5.2)       23.1 (5.2)               18.3 (5.4)      18.0 (5.4)       19.4 (4.9)
  Fairness a,b                           16.4 (4.3)      16.0 (4.2)       18.1 (4.2)               15.3 (3.8)      15.1 (3.8)       15.5 (3.2)
  Team effectiveness a,b                 18.5 (4.8)      18.1 (4.7)       19.9 (4.8)               15.9 (4.8)      15.7 (4.7)       16.3 (4.7)
  Diversity a,b                          13.0 (3.3)      12.2 (3.4)       13.9 (3.4)               11.5 (3.3)      11.3 (3.4)       11.9 (2.8)
  Mean (SD) score (26–110) a,b; α=0.91   69.0 (16.1)     66.9 (15.7)      75.1 (16.2)              61.0 (15.3)     60.1 (15.3)      63.0 (13.6)
Accommodations
  Fair pay a,b,c                         5.8 (2.7)       5.5 (2.5)        6.9 (3.1)                6.7 (2.7)       6.5 (2.6)        7.6 (2.8)
  Physical environment a,b               12.9 (3.3)      12.5 (3.4)       13.9 (2.6)               11.6 (3.0)      11.5 (2.9)       12.2 (2.7)
  Benefits                               9.7 (2.7)       9.7 (2.6)        10.2 (2.8)               9.7 (2.4)       9.5 (2.4)        10.1 (1.9)
  Employment development a,b             17.2 (3.9)      16.8 (3.9)       18.3 (3.2)               14.3 (3.9)      14.2 (4.0)       14.6 (3.0)
  Mean (SD) score (8–75) a,b; α=0.64     45.4 (8.6)      44.5 (8.7)       49.2 (7.1)               42.2 (9.2)      41.6 (9.0)       43.9 (7.9)
Organizational features
  Change oriented a                      15.7 (4.3)      15.4 (4.4)       16.5 (4.1)               13.7 (4.1)      13.6 (4.0)       13.8 (4.0)
  Goal oriented a                        13.1 (3.2)      12.9 (3.2)       13.7 (3.0)               11.3 (3.2)      11.3 (3.1)       11.1 (3.6)
  Holographic a,b                        21.9 (5.5)      21.4 (5.5)       23.7 (5.1)               19.4 (5.5)      19.2 (5.6)       20.0 (4.8)
  Strategic a,c                          30.3 (6.5)      30.0 (6.6)       31.5 (6.2)               28.1 (5.9)      27.6 (5.8)       30.5 (5.3)
  Quality a,b,c                          23.8 (5.2)      23.1 (5.0)       26.0 (4.9)               21.5 (4.9)      21.1 (4.8)       23.0 (4.9)
  Mean (SD) score (41–157) a,b; α=0.91   104.7 (21.6)    102.7 (21.4)     111.5 (21.2)             94.0 (20.8)     92.7 (20.5)      98.4 (19.2)
Information
  Internal a                             9.1 (2.6)       8.9 (2.6)        9.6 (2.6)                7.6 (2.6)       7.6 (2.6)        7.5 (2.4)
  Availability b                         19.2 (4.6)      18.8 (4.7)       20.8 (4.2)               18.6 (4.0)      18.5 (4.0)       18.7 (3.8)
  External a                             23.0 (5.0)      22.7 (4.9)       24.2 (5.2)               21.1 (4.8)      20.9 (4.9)       21.4 (4.2)
  Mean (SD) score (19–77) a; α=0.88      51.4 (11.4)     50.5 (11.3)      54.5 (11.3)              47.3 (10.4)     46.9 (10.5)      47.6 (9.3)
Personal
  Job satisfaction a,b                   11.7 (3.8)      11.2 (3.8)       13.4 (2.8)               10.1 (3.3)      9.9 (3.4)        10.3 (2.9)
  Time and stress a,b                    11.6 (3.6)      11.1 (3.6)       13.4 (2.7)               10.2 (3.1)      10.1 (3.1)       10.2 (2.8)
  Burnout a,b                            15.8 (4.3)      15.3 (4.4)       17.6 (3.2)               14.2 (3.9)      14.0 (3.9)       14.5 (3.3)
  Empowerment a,b                        19.0 (4.6)      18.5 (4.5)       20.6 (4.5)               16.9 (4.2)      16.7 (4.2)       17.5 (3.7)
  Mean (SD) score (22–95) a,b; α=0.93    58.2 (14.8)     56.0 (14.7)      65.0 (11.8)              51.4 (13.2)     50.7 (13.2)      52.4 (11.1)
Overall M (SD) score (150–512) a,b       328.7 (66.7)    264.9 (50.2)     294.2 (45.8)             295.8 (63.6)    292.1 (63.2)     305.3 (55.0)
Overall internal consistency α = 0.93

1 Independent samples t-tests.
a p < 0.05 for Louisville v. Fresno.
b p < 0.05 for Louisville (caseworkers v. supervisors).
c p < 0.05 for Fresno (caseworkers v. supervisors).

Supervisors in Louisville rated the overall SOE more highly than caseworkers (p < 0.05), as well as all dimensions except for Information; these supervisors also rated all organizational constructs higher than workers, and almost all of these differences were significant (p < 0.05). These patterns did not emerge in Fresno, however. Supervisors there rated most indicators more highly than workers, but only three of these differences were statistically significant (Fair Pay; Strategic nature of the organization; and Quality of the organization; p < 0.05).

Findings from the F2F implementation survey were more mixed between the sites (see Table 3). Overall internal consistency was acceptable for the scale, with α = 0.71. Fresno respondents rated the implementation of Team Decisionmaking (TDM) higher than Louisville overall and on 3 of 4 indicators (p < 0.05 for all). No cross-site differences emerged in Resource Family Development and Support (RDS) or related indicators, or in Self-Evaluation (SE) and related indicators except for one (agency staff are able to allocate more time to participate in Louisville; p < 0.05). Louisville respondents rated Building Community Partnerships (BCP) more highly than Fresno overall and for both indicators (p < 0.05). This finding reflects key practice differences between the sites, and will be discussed in more detail later in the paper. While respondents in both Louisville and Fresno generally rated Family to Family highly, those in Fresno rated it more highly (p < 0.05). Supervisors and workers rated some dimensions and constructs differently in both sites.

Supervisors rated the full Family to Family instrument more highly than workers in both Louisville and Fresno. In both sites, supervisors also expressed greater agency support for Family to Family overall compared with workers (p < 0.05 for the Family to Family overall subscale, and nearly all related indicators).

Linear regression models were also specified to predict the perceived implementation of each core strategy, and Family to Family overall, from demographic characteristics and each of the 5 dimensions of organizational excellence (see Table 4). Variance explained was lowest for the model predicting TDM implementation (R² = 0.190) and highest for the model predicting overall F2F implementation (R² = 0.414). While some demographic differences emerged across the regression models, the most consistent predictors were organizational in nature. Higher organizational Information scores predicted higher implementation for TDM (p < 0.05), RDS (p < 0.01), and overall opinions of F2F (p < 0.01). Fresno respondents (v. Louisville) reported higher perceived implementation of TDM (p < 0.01) and better opinions of F2F overall (p < 0.01), but lower implementation scores for BCP (p < 0.01). This last finding again reflects important practice differences between the sites that will be discussed further.

The next set of analyses examined attendance at Team Decisionmaking (TDM) meetings at each of the two sites, as a quantitative measure of implementation in CY 2008 (see Fig. 1). More Removal TDM meetings were held in Louisville (N = 1046) than Fresno (N = 554) during this period. Yet, in Fresno, a much higher percentage of meetings were attended by community representatives, service providers, and family members or friends; practically no meetings were held in community locations at either site.


Table 3 Survey of Family to Family (F2F) implementation1. Louisville

Fresno

Total (n=181) Worker (n=139) Supervisor (n=34) Total (n=229) Worker (n=187) Supervisor (n=35) M (SD) M (SD) M (SD) M (SD) M (SD) M (SD) Team Decisionmaking (TDM) Meetings led by skilled facilitator a, c Community partners invited a, c Meetings held in community locations Birth parents and relatives attend a,c Mean (SD) score (0–20) a, c α = 0.61

3.4 (1.3) 3.4 (1.2) 2.7 (1.3) 3.6 (1.1) 13.0 (3.4)

b

Resource development, recruitment and support (RDS) Community partners assist w/recruit. Training and support meetings held in community locations Mean (SD) score (0–10) α = 0.74 Self evaluation (SE) Self eval team meets regularly c Agency team members can allocate time to participatea Clear linkages b/t SE team and agency leadership a SE team meets with other strategy teams to monitor implementation b Mean (SD) score (0–20) c α = 0.90 Building community partnerships (BCP) Visitations b/t birth parents and children held in community locationsa Staff meetings held in comm. locations a Mean (SD) score (0–10) a α = 0.68 Family to Family overall F2F is integral part of practice a, b Adequate training in F2F a, b, c Clearly understand 4 core strategies a, b, c F2F is the best way to pursue practicea,c Agency supports F2F a, b, c Immediate supervisor supports F2Fa,b, c Mean (SD) score (0–25)a, b,c α = 0.89 Overall M (SD) score (0–90) a, b, c Overall internal consistency α = 0.71 1 a b c

(1.3) (1.3) (1.3) (1.2) (3.6)

3.4 (1.2) 3.5 (0.9) 2.2 (1.2) 3.5 (1.0) 12.6 (2.5)

3.7 (1.1) 3.8 (1.0) 2.6 (1.1) 4.0 (1.0) 14.1 (2.6)

2.9 (1.4) 3.1 (1.4) 6.0 (2.5)

2.8 (1.5) 3.0 (1.5) 5.7 (2.7)

3.1 (0.8) 3.5 (1.1) 6.6 (1.4)

2.7 (1.3) 3.1 (1.4) 5.8 (2.4)

2.9 (1.4) 2.7 (1.4) 2.6 (1.4) 2.6 (1.4)

2.8 2.6 2.5 2.5

(1.4) (1.5) (1.4) (1.5)

3.2 (1.4) 3.0 (1.2) 3.0 (1.3) 3.1 (1.3)

2.8 (1.4) 2.3 (1.4) 2.3 (1.3) 2.6 (1.4)

10.8 (5.0)

10.4 (5.2)

12.2 (4.5)

10.0 (4.8)

9.7 (4.8)

11.5 (4.4)

3.4 (1.2)

3.4 (1.2)

3.4 (1.0)

2.6 (1.2)

2.7 (1.2)

2.4 (1.0)

3.0 (1.3) 6.3 (2.1)

2.9 (1.4) 6.2 (2.3)

3.2 (1.1) 6.6 (1.5)

2.4 (1.1) 5.0 (2.0)

2.5 (1.1) 5.1 (2.0)

2.2 (0.9) 4.7 (1.6)

3.6 (1.2) 3.3 (1.2) 3.3 (1.2) 3.4 (1.2) 3.6 (1.2) 3.6 (1.4)

3.4 3.1 3.1 3.3 3.4 3.5

3.9 (1.0) 3.9 (0.9) 3.9 (1.0) 3.7 (1.0) 4.0 (0.9) 4.1 (1.2)

3.8 (1.1) 3.5 (1.1) 3.6 (1.1) 3.7 (1.2) 3.9 (1.0) 4.0 (1.0)

3.8 3.4 3.5 3.6 3.8 3.9

4.1 4.0 4.2 4.3 4.3 4.3

23.6 (5.0) 61.5 (11.0)

18.6 (4.5) 59.2 (13.0)

17.1 (5.3) 53.1 (14.0)

3.3 3.4 2.8 3.7 13.0

(1.2) (1.2) (1.2) (1.2) (1.3) (1.4)

19.7 (6.3) 55.1 (15.4)

3.7 3.8 2.6 3.9 13.9

(1.1) (1.0) (1.1) (0.9) (2.8)

4.1 4.0 2.5 4.3 14.8

2.8 (1.3) 3.0 (1.4) 11.4 (4.2)

2.7 2.3 2.3 2.5

(0.7) (0.6) (1.1) (0.5) (1.3)

2.5 (1.4) 3.5 (1.3) 11.8 (4.0)

(1.4) (1.4) (1.3) (1.4)

3.5 2.6 2.6 2.9

(1.1) (1.1) (1.1) (1.2) (1.1) (1.0)

18.2 (4.5) 58.3 (13.0)

(1.3) (1.4) (1.4) (1.3)

(0.9) (0.8) (0.8) (0.7) (0.7) (1.0)

20.5 (3.9) 63.3 (11.5)

1 Independent samples t-tests.
a p < 0.05 for Louisville v. Fresno.
b p < 0.05 for Louisville (caseworkers v. supervisors).
c p < 0.05 for Fresno (caseworkers v. supervisors).

Table 4
Predictors of core strategy implementation. Cells show B (SE), p from each linear regression model.

                                          TDM                  RDS                  SE                   BCP                  F2F
Intercept                                 7.9 (1.1), <0.001    0.9 (0.9), 0.315     −0.4 (1.8), 0.844    2.6 (0.8), <0.001    10.8 (4.3), 0.012
Hours per week                            −0.1 (0.2), 0.538    0.1 (0.1), 0.400     0.3 (0.2), 0.195     0.1 (0.1), 0.325     0.8 (0.6), 0.194
Supervisor (yes)                          −0.2 (0.5), 0.703    −0.5 (0.4), 0.230    0.1 (0.8), 0.892     −0.3 (0.4), 0.361    0.4 (1.9), 0.823
Age (categorized)                         0.1 (0.2), 0.671     0.2 (0.2), 0.359     0.2 (0.3), 0.498     0.0 (0.1), 0.819     0.3 (0.8), 0.732
Years of service at org. (categorized)    0.1 (0.2), 0.679     0.2 (0.1), 0.118     0.7 (0.3), 0.026     0.1 (0.1), 0.328     1.5 (0.7), 0.038
Service area (v. Permanency)
  Intake/Invest.                          0.4 (0.4), 0.278     −0.7 (0.3), 0.043    −1.0 (0.6), 0.134    0.3 (0.3), 0.377     −1.9 (1.5), 0.225
  Ongoing                                 −0.3 (0.4), 0.523    −0.5 (0.3), 0.148    −0.5 (0.7), 0.420    0.6 (0.3), 0.058     −1.2 (1.6), 0.459
Merit increase (yes)                      0.1 (0.3), 0.616     0.4 (0.3), 0.122     1.2 (0.5), 0.024     −0.1 (0.2), 0.718    1.9 (1.3), 0.139
Plan to work at agency in 2 years (yes)   −0.5 (0.5), 0.267    0.0 (0.4), 0.986     0.4 (0.8), 0.593     0.1 (0.3), 0.709     0.8 (1.9), 0.690
Salary (categorized)                      −0.2 (0.2), 0.374    0.1 (0.2), 0.681     0.0 (0.3), 0.910     −0.2 (0.1), 0.127    −0.1 (0.7), 0.939
Organizational dimensions
  Work group                              −0.0 (0.0), 0.182    0.0 (0.0), 0.388     0.1 (0.0), 0.014     −0.0 (0.0), 0.285    0.0 (0.1), 0.644
  Accommod.                               −0.0 (0.0), 0.820    −0.0 (0.0), 0.382    −0.0 (0.1), 0.760    −0.0 (0.0), 0.535    −0.1 (0.1), 0.654
  Org. features                           0.1 (0.0), 0.032     −0.0 (0.0), 0.559    −0.0 (0.0), 0.950    0.0 (0.0), 0.121     0.1 (0.1), 0.115
  Information                             0.1 (0.0), 0.017     0.1 (0.0), 0.002     0.1 (0.1), 0.116     0.0 (0.0), 0.096     0.4 (0.1), <0.001
  Personal                                −0.0 (0.0), 0.713    0.0 (0.0), 0.599     −0.0 (0.0), 0.420    0.0 (0.0), 0.572     −0.0 (0.1), 0.960
Fresno (v. Louisville)                    1.6 (0.5), <0.001    0.1 (0.4), 0.851     −0.5 (0.7), 0.468    −0.9 (0.3), 0.008    8.1 (1.8), <0.001
R-square                                  0.190                0.194                0.205                0.216                0.414


Fig. 1. Site-by-site comparison of attendance at Team Decisionmaking (TDM) meetings (CY 2008). Removal TDMs: Louisville N = 1,046, Fresno N = 554; Change of Placement (COP) TDMs: Louisville N = 738, Fresno N = 408. NOTE: CR = Community representative; SR = Service provider; FMF = Family member or friend; CL = Meeting held in community location.

For Change of Placement (COP) TDM meetings, community representatives attended a higher percentage of meetings in Louisville than in Fresno. Service providers and family members and friends attended higher percentages of meetings in Fresno, but these cross-site differences were not as great as for Removal TDMs.

The next analyses examined monthly deviations from the average numbers of children in OHC in both sites, and from the average number of referrals in Fresno (see Fig. 2). Over this 12-month period, the mean number of children in OHC in Louisville was 1064 (SD = 30). The mean number in Fresno during the same time period was 2261 (SD = 42). The difference in these means was statistically significant, with t = 144.84, df = 11, and p < 0.001. In Louisville, the monthly count exceeded one standard deviation above the mean in July, August, and September 2008, and fell more than one standard deviation below the mean in April 2009. In Fresno, the monthly count exceeded one standard deviation above the mean in July and August 2008, and fell more than one standard deviation below the mean in February 2009. For referrals to Fresno DCFS, the monthly deviation was highest in March 2009 (the time of SOE–F2F data collection at the site) and also exceeded one standard deviation above the mean in May 2009. Referral counts fell more than one standard deviation below the mean in July, August, and November 2008.
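For readers who wish to reproduce the workload measure, the following is a minimal sketch, assuming hypothetical file and column names, of the deviation-from-the-mean calculation described in the Method section and summarized above.

```python
# Minimal sketch (hypothetical file and column names) of the workload analysis:
# monthly counts of children in out-of-home care are centered on their
# 12-month mean so that, as in Fig. 2, each month shows how far the count
# falls above or below the site's average.
import pandas as pd

ohc = pd.read_csv("monthly_ohc_counts.csv", parse_dates=["month"])  # hypothetical
window = ohc[(ohc["month"] >= "2008-07-01") & (ohc["month"] <= "2009-06-30")]

for site, grp in window.groupby("site"):
    mean_count = grp["children_in_care"].mean()
    sd_count = grp["children_in_care"].std()
    deviation = grp["children_in_care"] - mean_count
    beyond_one_sd = grp.loc[deviation.abs() > sd_count, "month"].dt.strftime("%m/%Y")
    print(f"{site}: mean = {mean_count:.0f}, SD = {sd_count:.0f}, "
          f"months beyond one SD: {', '.join(beyond_one_sd)}")
```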

5. Discussion

The findings from the SOE and F2F surveys suggest significantly different perceptions of organizational excellence and of Family to Family implementation between the sites. Administrative data pertaining to Team Decisionmaking (TDM) support the differences in perceptions of F2F implementation from the survey; specifically, staff in Fresno reported that their agency had higher TDM fidelity, and attendance data confirmed this. In addition, the widely varying monthly patterns of children served between the agencies provide evidence that these contextual factors may be related to perceptions of organizational excellence. It is unclear from these data, however, whether organizational factors influence indicators such as the numbers of children in out-of-home care (Glisson, 2007), or, vice versa, whether the increased workload demands of serving more children influence employees' perceptions of their organization.

Fig. 2. Monthly deviations from average children in out-of-home care (OHC) and referrals. (July 1, 2008–June 30, 2009).


Either way, these cross-site differences reveal important information for child welfare reform efforts.

Respondents in Louisville indicated higher levels of organizational excellence; yet Fresno respondents tended to rate the implementation of Family to Family higher. Based on administrative data, a higher percentage of TDM meetings were attended by key stakeholders in Fresno, an important indicator of implementation fidelity (Crea et al., 2009). These findings suggest the possibility of an inverse relationship between organizational factors and Family to Family implementation between these two sites, a pattern which runs contrary to prevailing opinion on the relationship between organizational excellence and program implementation (Glisson, 2007; Zvoch, 2009). One explanation may be that, as an effort to reform child welfare services, Family to Family is accepted more willingly in resource-constrained environments. This explanation is supported by the high demand for services experienced by Fresno during the time of data collection (see Fig. 2). An alternative, but related, explanation for these differences is that the high workload demands in Fresno artificially depressed indicators of organizational excellence, such that these findings represent an artifact of above-average external pressures. Further research is required to replicate or refute this inverse relationship between sites.

The reason behind the fluctuating patterns of referrals in Fresno is unclear, but may be related to economic conditions. In March of 2009, at the time of data collection, Fresno experienced a record unemployment rate of 11.2% (Economic Development Department, 2009). Sedlak et al. (2010) found that parental unemployment contributed to an increased likelihood of child neglect, although the relationship between the economic recession and overall child maltreatment rates has yet to be strongly established (Millett, Lanier, & Drake, 2011). Thus, while economic factors in Fresno may play a role in shaping the organizational context of the agency, further research is needed to disentangle these effects.

Additional findings from this study highlight the importance of organizational dimensions as influences on Family to Family implementation. Using data combined from both sites, multivariate models showed organizational indicators to be the most common significant (and positive) predictors of core strategy implementation and of Family to Family overall. Thus, when controlling for cross-site differences, organizational factors emerge as important variables in explaining program implementation. Specifically, the Information dimension (i.e., the flow and availability of information within organizations) significantly predicted TDM and RDS implementation and opinions of Family to Family in general. These findings suggest that as the values and strategies of Family to Family are articulated within an organization, the perceived implementation of the initiative improves.

This study has limitations. Participant responses to survey items may be subject to recall bias. While the study also draws on administrative data to compare with survey findings, the survey itself is based on respondents' retrospective impressions of implementation, not on direct observation of program activities.
As is the case with any administrative data, these data are of unknown validity and reliability and thus should be interpreted with caution. The differential response rates between the two sites may have introduced bias into the study results. An additional limitation is that the survey items pertaining to Family to Family were created by researchers and experts in the field but were not vetted ahead of time with staff in Fresno. Administrators in Fresno pointed out that the Building Community Partnerships (BCP) strategy operates differently there than in other Family to Family sites, in that Fresno does not rely on community locations for meetings or visitations between birth parents and children. Thus, an important contextual difference was overlooked prior to data collection and is reflected only in the low indicators of BCP in Tables 3 and 4.

Future efforts at data collection will involve agency administrators in the creation of survey items before disseminating the survey at sites.

6. Conclusion and implications for research and practice

Findings from this study revealed significant cross-site differences in organizational excellence and program implementation. Despite these differences, however, organizational factors still emerged as strong predictors of perceived program implementation across these two sites. The results provide important information to agency administrators; in Louisville, administrators have used these findings to advocate for increased county funding to fill vacant positions, as well as to use trainings strategically to improve child welfare reform efforts (see Crea et al., in press). In some ways, the information provided by these surveys is an expansion of the self-evaluation approach used by agencies to inform practice (Webster et al., 2002); these surveys include information on implementation, but cover topics more broadly to help illuminate the inter- and intra-organizational dynamics which may influence implementation (Schweigert, 2006). Future research should involve data collection at a wider variety of sites with varying degrees of experience with Family to Family, and multiple points of data collection to examine organizational changes over time.

Previous research on Family to Family implementation showed in a qualitative study that leadership and resources were perceived as essential to effective child welfare reform (Crampton et al., 2008). A mixed-method study showed that strong leadership and adequate resources are correlated with desired changes in child welfare services (Crea et al., 2008). A related study of Family Group Decision Making in Pennsylvania showed that need for services was not associated with reform, but that strong leadership, both in initial start-up and in maintaining implementation, is critical (Rauktis, McCarthy, Krackhardt, & Cahalane, 2010). While we have evidence that leadership is important in child welfare reform, we know little about how it works. In the case of Family to Family, reform leaders conduct systematic training to educate agency staff and community partners early in the implementation process, so that key stakeholders understand the purposes and processes of the Family to Family practices (Crea et al., 2008). This study suggests that these methods of communicating the need for reform are successful when they change the organizational culture and staff members believe that they understand the need for reform and how the reform will improve services. This study also showed that supervisors, who presumably receive more briefings on the reform efforts, are more supportive than caseworkers, who may have less direct information about reform; perhaps more communication is needed on the front line. The practice implications of this study suggest that reform leaders need to communicate and provide information about the reforms so that all staff become part of the change effort and feel that their organization supports them and their work.

References

Annie E. Casey Foundation (n.d.). Family to Family: Core strategies. Retrieved from http://www.aecf.org/MajorInitiatives/Family%20to%20Family/CoreStrategies.aspx
Batterson, M., Crampton, D., Crea, T., Harris, F., Madden, A., Usher, L., & Williams, J. (2007). Implementing Family to Family. Chapel Hill, NC: The University of North Carolina at Chapel Hill.
Berwick, D. M. (2008). The science of improvement. Journal of the American Medical Association, 299(10), 1182–1184.
Bissett, S., Daniel, M., & Potvin, L. (2009). Exploring the intervention–context interface. American Journal of Evaluation, 30(4), 554–571.
Botcheva, L., White, C. R., & Huffman, L. C. (2002). Learning culture and outcomes measurement practices in community agencies. American Journal of Evaluation, 23(4), 421–434.
Crampton, D. S., Crea, T. M., Abramson-Madden, A., & Usher, C. L. (2008). Challenges of street-level child welfare reform: The case of Team Decisionmaking. Families in Society, 89(4), 512–520.
Crea, T. M., Crampton, D. S., Abramson-Madden, A., & Usher, C. L. (2008). Variability in the implementation of Team Decisionmaking (TDM): Scope and compliance with the Family to Family practice model. Children & Youth Services Review, 30, 1221–1232.

Crea, T. M., Crampton, D. S., Knight, N., & Paine-Wells, L. (in press). Organizational factors and Family to Family: Contextual elements of systems reform. Child Welfare.
Crea, T. M., Usher, C. L., & Wildfire, J. B. (2009). Implementation fidelity of Team Decisionmaking (TDM). Children & Youth Services Review, 30, 119–124.
Economic Development Department (2009). California's unemployment rate decreases to 11.0%. News release #09-30, May 21, 2009. Available from http://www.edd.ca.gov/About_EDD/pdf/urate200905.pdf
Elmore, R. F. (1980). Backward mapping: Implementation research and policy decisions. Political Science Quarterly, 94(4), 601–616.
Glasgow, R., Lichtenstein, E., & Marcus, A. (2003). Why don't we see more translation of health promotion research to practice? Rethinking the efficacy-to-effectiveness transition. American Journal of Public Health, 58, 1261–1267.
Glisson, C. (2007). Assessing and changing organizational culture and climate for effective services. Research on Social Work Practice, 17(6), 736–747.
Glisson, C., & James, L. R. (2002). The cross-level effects of culture and climate in human service teams. Journal of Organizational Behavior, 23, 767–794.
Glisson, C., Landsverk, J., Schoenwald, S., Kelleher, K., Hoagwood, K. E., Mayberg, S., & Green, P., The Research Network on Youth Mental Health (2008). Assessing the Organizational Social Context (OSC) of mental health services: Implications for research and practice. Administration and Policy in Mental Health, 35, 98–113.
Kalafat, J., Illback, R. J., & Sanders, D. (2007). The relationship between implementation fidelity and educational outcomes in a school-based family support program: Development of a model for evaluating multidimensional full-service programs. Evaluation and Program Planning, 30, 136–148.
Kum, H. C., Duncan, D. F., & Stewart, C. J. (2009). Supporting self-evaluation in local government via knowledge discovery and data mining. Government Information Quarterly, 26, 295–304.
Landuyt, N. (2000). The survey of organizational excellence. Available from http://www.utexas.edu/research/cswr/survey/site/prospect/promo_packet.pd
Lauderdale, M. (1999). Reinventing Texas government. Austin, TX: University of Texas Press.
Lipsky, M. (1980). Street-level bureaucracy: Dilemmas of the individual in public services. New York: Russell Sage Foundation.
Millett, L., Lanier, P., & Drake, B. (2011). Are economic trends associated with child maltreatment? Preliminary results from the recent recession using state level data. Children & Youth Services Review, 33(7), 1280–1287.
Owen, J. M., & Lambert, F. C. (1998). Evaluation and the information needs of organizational leaders. American Journal of Evaluation, 19(3), 355–366.
Preskill, H., Zuckerman, B., & Matthews, B. (2003). An exploratory study of process use: Findings and implications for future research. American Journal of Evaluation, 24(4), 423–442.
Rauktis, M. E., McCarthy, S., Krackhardt, D., & Cahalane, H. (2010). Innovation in child welfare: The adoption and implementation of Family Group Decision Making in Pennsylvania. Children and Youth Services Review, 32, 732–739.


Rego, M. A. S., Otero, A. G., & Moledo, M. D. M. L. (2009). Evaluation of the implementation of a socio-educational program with immigrant families: A case study. Evaluation and Program Planning, 32, 21–30.
Schweigert, F. J. (2006). The meaning of effectiveness in assessing community initiatives. American Journal of Evaluation, 27(4), 416–436.
Scott, T., Mannion, R., Davies, H., & Marshall, M. (2003). The quantitative measurement of organizational culture in health care: A review of the available instruments. Health Services Research, 38(3), 923–945.
Sedlak, A. J., Mettenburg, J., Basena, M., Petta, I., McPherson, K., Greene, A., & Li, S. (2010). Fourth National Incidence Study of Child Abuse and Neglect (NIS–4): Report to Congress. Washington, DC: U.S. Department of Health and Human Services, Administration for Children and Families.
Stern, S. B., Alaggia, R., Watson, K., & Morton, T. R. (2008). Implementing an evidence-based parenting program with adherence in the real world of community practice. Research on Social Work Practice, 18(6), 543–554.
Taut, S. (2007). Studying self-evaluation capacity building in a large international development organization. American Journal of Evaluation, 28(1), 45–59.
Taut, S. M., & Alkin, M. C. (2003). Program staff perceptions of barriers to evaluation implementation. American Journal of Evaluation, 24(2), 213–226.
Torres, R. T., & Preskill, H. (2001). Evaluation and organizational learning: Past, present, and future. American Journal of Evaluation, 22(3), 387–395.
Tuchman, E., & Sarasohn, M. K. (2011). Implementation of an evidence-based modified therapeutic community: Staff and resident perceptions. Evaluation and Program Planning, 34(2), 105–112.
Usher, C. L. (1995). Improving evaluability through self-evaluation. Evaluation Practice, 16(1), 59–68.
Usher, C. L., Wildfire, J. B., & Gibbs, D. A. (1999). Measuring performance in child welfare: Secondary effects of success. Child Welfare, 78(1), 31–51.
Usher, C. L., Wildfire, J., Webster, D., & Crampton, D. (2010). Evaluation of the anchor-site phase of Family to Family. Chapel Hill, NC: Jordan Institute for Families, School of Social Work, The University of North Carolina at Chapel Hill.
Webster, D., Needell, B., & Wildfire, J. (2002). Data are your friends: Child welfare agency self-evaluation in Los Angeles County with the Family to Family initiative. Children & Youth Services Review, 24(6–7), 471–484.
Westbrook, T. M., Ellett, A. J., & Deweaver, K. W. (2009). Development and validation of a measure of organizational culture in public child welfare agencies. Research on Social Work Practice, 19, 730–741.
Yoo, J., & Brooks, D. (2005). The role of organizational variables in predicting service effectiveness: An analysis of a multilevel model. Research on Social Work Practice, 15(4), 267–277.
Zvoch, K. (2009). Treatment fidelity in multisite evaluation. American Journal of Evaluation, 30(1), 44–61.