Journal of Applied Developmental Psychology 40 (2015) 57–62
Comprehensive community initiatives: The road ahead for research and practice

Emily S. Lin a, Jonathan F. Zaff b,⁎, Amy R. Gerstein c

a nFocus Solutions, United States
b America's Promise Alliance and Boston University, United States
c Stanford University, United States
Available online 8 August 2015

Keywords: Comprehensive community initiative; Evaluation; Research implications
Abstract

In this concluding article, we summarize the body of articles in this special issue to identify and explore themes for the next wave of research and practice in comprehensive community initiatives (CCIs). We suggest that these articles reveal that while evaluation, data, and assessments are a powerful way to frame clear, actionable tactics within complex systems, any effort to be data-driven in CCIs must be grounded in strong relationships built on trust between stakeholders. We propose that people who wish to use data more effectively in the work of CCIs should take the position not of the removed analyst, but of an engaged, data-informed leader, taking responsibility for enabling members of the CCI to build and maintain the relationships and collective identity of the CCI. To explore this proposal, we articulate a vision for what data-informed CCI leadership could look like, based on theories of sense-making grounded in developmental science. We then examine how these theories play out in real life, using examples of how three CCI leaders have tried to engage in data-driven work. Finally, we propose implications of this conception of leadership for collaborative, community-based efforts in the future.
The articles in this special issue describe and study multiple dimensions of multiple Comprehensive Community Initiatives (CCIs). While the points of analysis are different, and their results varied, one point is clear: implementing CCIs is hard. Indeed, compared to more narrowly focused policies to address social issues, CCIs are expensive, time-intensive, complex, and extremely difficult to implement. A reasonable question, then, is why CCIs would be presumed to be a worthwhile intervention to address social inequities and promote youth and community development. The short answer is that, despite our natural cognitive bias toward simple, reductionist explanations (e.g., Feltovich, Spiro, & Coulson, 1993; Grotzer, 2012), complex social problems do require complex solutions. As symptoms of poverty and economic inequality, many of the issues CCIs are designed to address share characteristics of "wicked problems" (Rittel & Webber, 1973): there is no definitive formulation of the problem (i.e., the problem looks different in different places, for different people, at different times); the causes of the problem are multiple, interconnected, and contested; and there are no static solutions (i.e., any intervention effectively changes the problem space). As such, attempts to resolve such issues should be responsive to the local, complex, and dynamic nature of wicked problems (Gibson, Smyth, Nayowith, & Zaff, 2013, September 19).
⁎ Corresponding author. E-mail address: [email protected] (J.F. Zaff).
http://dx.doi.org/10.1016/j.appdev.2015.07.002
The introductory article to this issue considers CCIs as such an attempt to respond productively to complexity, using a relational developmental systems perspective (e.g., Lerner, 2012; Overton, 2013) that understands young people as constantly interacting in mutually developmentally influential ways with multiple contexts over time (Bronfenbrenner, 1986; Bronfenbrenner & Morris, 2006). The dynamic, evolving, and emergent nature of developmental systems (Overton, 2013) means not only that young people are continuously adapting as they influence and are influenced by their contexts, but also that those who seek to intervene in those contexts must similarly adapt to the changing personal, social, physical, and political conditions around them. In addition to the mandate to align resources, needs, and strategies among multiple parties and levels of stakeholders within a CCI, then, those attempting to use CCIs to address "wicked problems" must recognize that the nature and needs of the stakeholders themselves, as well as the relationships between them, are negotiable and subject to constant change. Thus, it becomes clear that, beyond simply ensuring the provision of high-quality child- or family-facing services and actions (which, while a challenging task, can be and is being done by a number of individual organizations around the country), CCIs seeking to address inequities of opportunity should be prepared to respond to the continuously changing conditions of young people's lives and the multi-layered ecology within which they are embedded. As proposed in the introductory article, a "youth system" framework can serve as a helpful planning and evaluation heuristic, piecing together the ecology around the young person and leading communities to strengthen the supports around a young person to produce a "supportive youth system" (Zaff, Donlan, Jones, & Lin, 2015).
Using the idea of a supportive youth system, we summarize the body of articles in this special issue to identify and explore themes for the next wave of research and practice in CCIs. We suggest that these articles reveal that while evaluation, data, and assessments are a powerful way to frame clear, actionable tactics within complex systems, any effort to be data-driven in CCIs must be grounded in strong relationships built on trust between stakeholders. We propose that people who wish to use data more effectively in the work of CCIs should take the position not of the removed analyst, but of an engaged, data-informed leader, taking responsibility for enabling members of the CCI to build and maintain the relationships and collective identity of the CCI. To explore this proposal, we articulate a vision for what data-informed CCI leadership could look like, based on theories of sense-making grounded in developmental science. We then examine how these theories play out in real life, using examples of how three CCI leaders have tried to engage in data-driven work. Finally, we propose implications of this conception of leadership for collaborative, community-based efforts in the future.

Summary of special issue

The articles in this special issue address the idea of tackling the complex nature of large-scale community issues in multiple ways. Zaff et al. (2015) observed that the complexity of CCIs can lead to misalignment in perceived needs among stakeholders, and therefore divergence in strategies to address those needs. With families and direct service providers focusing on meeting basic needs, and with CCI leadership focusing on long-term academic and positive developmental outcomes for youth, a clear misalignment of goals emerges, which makes any effort to tame complexity even more difficult.

Osher et al. (2015) gave another example of a CCI in which intentions to align efforts in a community did not play out as planned on the ground. They proposed that, to the extent there was a mismatch between outcome and intention in Say Yes, that mismatch could be explained by a lack of thorough implementation, such as ensuring quality of programming, maintaining consistent goals during leadership transitions, and providing institutional support for changes to routines. In addition, they noted that plans for a comprehensive monitoring and evaluation system were not realized in practice, particularly at the individual level, suggesting that the lack of follow-through on this aspect was a barrier to the ultimate success of the CCI in producing positive outcomes for youth. As data tracking was one of the key components of Say Yes, Osher and colleagues note that while Say Yes Syracuse was able to develop many useful tools for data tracking across all levels of the CCI, technical, social, and structural issues (such as leadership changes and a lack of professional development time) made for unfavorable conditions for "constructive data use."

In contrast, with their emphasis on "homegrown" initiatives, Lanspery and Hughes (2015) made the case for an organic approach to systems change, in which local organizations and constituencies came together because of perceived mutual benefit, rather than being brought together by outside forces, such as a funder.
Their argument was that without a financial incentive to come together, partner organizations had to articulate—early on—clear, shared goals and benefits in order to get any action moving at all. Such clarity drove them through the inevitable moments of uncertainty that came later and also helped create common language across stakeholder types, as evidenced in their descriptions of the work in both Detroit and Philadelphia. As a result, potentially controversial individual-level tactics, like using assessments as a means of matching youth to targeted skill development and interventions, could be implemented on a groundwork of trust and understanding. The implication here, then, is that the work of reducing complexity can be accomplished through technical, data-driven solutions, but that solutions are best implemented when relational work is done prior to
moments of crisis in the system. Mancini and colleagues' (this issue) work on the role of social capital in helping young people deal with unexpected events emphasizes this perspective. As Kim, Oesterle, Catalano, and Hawkins (2015) show, a community can collect representative data on a youth population, and those data, if collected over time, can provide important insights into what development typically looks like, as well as where intervention points emerge. Altogether, this body of articles seems to reveal that while evaluation, data, and assessments are a powerful way to find a coherent throughline in complex systems, any effort to be data-driven in CCIs must be grounded in strong relationships built on trust between stakeholders.

Using data and evidence in CCIs

The need for both of these principles to work hand in hand is particularly important now, as the systematic use of data and evidence is increasingly touted as a key tool for promoting organizational learning in complex environments (Data Quality Campaign, 2011). Data and evidence can be broadly understood to be information about phenomena relevant to youth development, gathered from observations conducted in a systematic or otherwise documentable method, and interpreted by actors at multiple levels of a system. This broad definition has been concretized within a myriad of standards of evidence (e.g., Blueprints for Healthy Youth Development, 2014; Coburn & Turner, 2012; Data Quality Campaign, 2011; Marsh, Pane, & Hamilton, 2006; McLaughlin & London, 2013; Nutley, Walter, & Davies, 2007; Smyth & Schorr, 2009). Regardless of the specific standard, "data-driven decision-making," "continuous improvement," and "evidence-based practice" are all variations on a general belief that community-based work produces better outcomes for young people when the work is responsive to data and evidence at multiple steps along the way. This belief has captured the imaginations and priorities of practitioners and researchers alike, and the recent proliferation of tools and power to both capture and analyze large amounts of data has led to predictions of data use transforming fields ranging from health care to government to business (Manyika et al., 2011).

Yet, despite the clear value of this information to strategic learning and action, the body of research on the actual use of data and evidence in youth policy and practice at the community or school district level (see Honig & Coburn, 2008, for a review) has shown not only that many communities frequently lack the capacity to gather, interpret, and use evidence effectively, but also that deeper cultural, relational, structural, and belief barriers can impede data use (e.g., Diamond & Cooper, 2007; Nelson, Leffler, & Hansen, 2009; Nutley et al., 2007). Since the 1970s, evaluation researchers like Carol Weiss and Michael Patton have built a body of theoretical and empirical knowledge on the different ways research, evidence, and data are used, misused, and not used in educational and social change settings (Patton, 2008; Weiss, Murphy-Graham, & Birkeland, 2005). For example, formative evaluation data are often intended to be used to test and refine theories of action (conceptual use) or to make programmatic or strategic decisions (instrumental use). However, such intended uses can clash against the cognitive bias that predisposes us to seek information and adopt interpretations that confirm rather than challenge our pre-existing beliefs (e.g., Brandtstädter, 2006; Nickerson, Perkins, & Smith, 1985).
Prior research in education practice and policy has corroborated the existence of this bias when stakeholders engage with evidence (e.g., Spillane, 2000; Spillane & Callahan, 2000). As a consequence, evidence acquisition and evidence use in social settings tend most commonly to be political, used to support an argument or justify an action, even in cases where the evidence was not consulted in developing that argument or action in the first place (e.g., Corcoran, Fuhrman, & Belcher, 2001; Marsh, 2006; Robinson, 1988). Of particular concern for leaders of CCIs is the fact that they must make sense of data not only for themselves, but also for multiple stakeholders
with multiple perspectives and prior dispositions toward the data. For example, Honig and Coburn (2005, 2008) observe that those who conduct research, interpret research, and have the power to decide what evidence should be used in action are often different people, not to mention distinct from those who are actually "living the data." Consequently, stakeholders from these various perspectives enter data collection, interpretation, and use with very different beliefs about the usefulness of those data. The social landscape of CCIs is often contested and ambiguous, with discrepancies between different stakeholders' representations not easily reconciled. We argue, therefore, that people who wish to use data more effectively in the work of CCIs should take the position not of the removed analyst, but of an engaged leader, taking responsibility for enabling members of the CCI to build and maintain the relationships and collective identity of the CCI.

The role of leadership in acquiring, interpreting, and using evidence in the education field has been proposed to center largely on "sense-making," or the meaning that people give to experiences (Spillane, Reiser, & Reimer, 2002). Knapp, Copland, and Swinnerton (2007) describe data-informed leadership as having a moral as well as strategic dimension, "resting on a foundation of values and strategic thinking that guides the leaders' reach for data" (Fullan, 2001, p. 82). To promote system-wide adaptation and collective learning, then, CCI leaders must learn how to use evidence in a way that not only considers, but respects and integrates, the perspectives of multiple stakeholders. In doing so, we suggest, researchers are not only more likely to find success in having the products of their analysis taken up by CCI stakeholders, but the research will likely be of higher quality, and the enterprise itself more likely to succeed (London & McLaughlin, 2014). Next, we articulate a vision for what that kind of leadership could look like, based on theories of sense-making grounded in developmental science. We then examine how these theories play out in real-life CCIs using examples of how three CCI leaders have tried to engage in data-driven work. Finally, we propose implications of this conception of leadership for collaborative, community-based efforts in the future.

The nature of leadership in data-driven work

In this article, we are informed by the idea of leadership as taking responsibility for enabling others to achieve shared purpose in the face of uncertainty (Ganz, 2009). Typically, we think of data as providing an "objective" way of reducing uncertainty. We use data and research descriptively, to map a previously unknown landscape; predictively, to identify the most likely outcomes for a given trajectory; or prescriptively, to find interventions most likely to produce desired outcomes for a particular profile. The use of skills-based assessments in the Philadelphia Youth Network, of targeted, evidence-based practices in Communities That Care, and of data-driven intervention planning in Say Yes are all examples of this. Yet, while such approaches can reduce uncertainty (and therefore, complexity) in useful ways, uncertainty can never be completely removed. For example, the often-cited importance of "trust" in CCI work is a proxy for the ability to continue to move in partnership and alignment with other CCI stakeholders when the outcomes of such partnerships are unknown.
Lanspery and Hughes (2015) and Osher et al. (2015) show the importance of both foundational and ongoing work around relationship-building and creating a shared vision and goals, which help CCI stakeholders collectively stay the course in times of uncertainty. Bruner (1986) framed this interplay as the difference between "two ways of knowing": paradigmatic and narrative. Paradigmatic knowing is analytic and strategic, while narrative knowing focuses on the moral and emotional resources necessary to act under uncertain circumstances (e.g., those circumstances encountered when trying to integrate research and evidence into social change efforts). In other words, sense-making leadership is not just about convincing people that the data you
hold are generally true (i.e., paradigmatic knowing), but about interpreting the evidence, as well as the holes in the evidence, in a way that speaks to its "lifelikeness" (i.e., narrative; Bruner, 1986, p. 11), resonating with people's existing understanding of their own lived experience. If the role of narrative sense-making is to bring evidence into a "lifelike" interpretive frame, then CCI leaders must learn how to use evidence in a way that leverages the pre-existing emotional responses of multiple stakeholders to moral issues toward productive ends. Brandtstädter (2006) argued that while emotions in response to a mismatch between intended and actual outcomes can motivate developmental examination and regulation of goals, those emotions can also lead to feelings of hopelessness and despair, especially in the absence of resources to successfully take corrective action. The activation of these emotions is not intentional, nor is the activation of any associated "actional tendency" (Brandtstädter, 2000, p. 8). However, intentional processes, such as choosing how to represent or communicate information, coact with these unintentional processes.

As an example, consider a CCI in which a metrics-based assessment could potentially show that an individual program was not having the impact on young people that it was publicly perceived to have. The fear, then, is that uncovering the gap between the program's demonstrated and intended outcomes would lead to punitive consequences (e.g., the reduction or removal of funding or political support), instead of providing invaluable information to strengthen the work. The data leader's role in this situation is to find ways to elicit and build on an existing sense of trust and shared investment between the overall initiative and the program, then to interpret the performance measurement effort in a way that fits within a larger understanding that the data are used for learning and growth, not punishment.

To shape the emotional valence of this process toward hope and self-efficacy (which open up the developmental action space toward intended outcomes) rather than toward fear and despair (which close that space), we propose that a data leader should frame the interpretation of these data and subsequent actions as being specifically values-driven. Values can be differentiated from goals in that values are abstract and aspirational and do not prescribe a concrete outcome. These high-level, abstract aspirations for how to be or why to act are distinct from concrete goals for what to do. Although distinct, values and goals must be "semantically linked" (Brandtstädter, 2006, p. 51) in the data interpretation and elaboration process in order to realize developmental potential through action, especially given the ease with which actions may become habitual and thus dissociated from their original motivations (Brandtstädter, 2000). Brunstein, Schultheiss, and Maier (1999) provide empirical support for this values-goals-action integration in their discussion of the pursuit of personal goals. They highlight the importance of integrating specific action commitments with more abstract "life goals" (p. 189) in helping individuals move forward in a goal-directed way.
Sheldon and Emmons (1995) also found that individuals who feel successful in goal pursuit and committed to continuing those pursuits tend to operate within differentiated goal systems, in which goals are articulated as operationally distinct at a low level but in an integrated way, such that these low-level goals are linked to the achievement of a desired "possible self" (p. 40), an imagined and motivating representation of what one's future self might look like. In the above example, then, a productive leader would be doing exactly this: eliciting and building on existing trust and shared investment, and framing the performance measurement effort as serving learning and growth. This sense of trust and connection is not found in the data, nor can it be guaranteed; indeed, trust is characterized by commitment to a relationship despite uncertainty about the outcome of that relationship. Thus, even in the most "data-driven" of environments, CCI leaders must continuously engage with and move stakeholders through such uncertain situations.
We therefore argue that CCI leaders are most effective in using data and evidence when they are able to link their use to a desired vision of the future in a way that is values-driven and sense-making. That is, if leaders are able to frame the use of data and evidence using individual and organizational stories that convey an image of the future that is motivational and values-driven, they will be more successful in their efforts to integrate data and evidence use into their local practices. Such integration allows actors within a system to learn from and adapt to the dynamic, complex nature of that system in an informed, purposeful way, ultimately leading to more positive outcomes for youth.

Examples of leadership in data-driven CCI work

Here, we lay out three examples of how data leadership has played out in our own research. These examples are drawn from participants in a three-month-long professional development experience meant to develop leadership skills in data-driven CCI leaders, as well as from work we have conducted in communities using an integrated data system.

Example 1. Janice is the Director of Evaluation and Research at a "backbone organization" for a CCI that describes itself as a "place-based collaborative" working to ensure healthy outcomes for children in a neighborhood with the lowest health, educational attainment, economic self-sufficiency, and safety indicators in a city of about 100,000 people. At the time we spoke with Janice, her CCI had already, as a collaborative, identified indicators for progress in the areas of early childhood education and learning, education, health and wellness, community engagement and advocacy, and digital literacy. Their goal was to build consensus around these outcomes, not just among institutional partners in the CCI, but also among the residents themselves. As Janice engaged in this work, she realized that building consensus around existing indicators was not enough to create the sense of resident ownership that was consistent with the values of the CCI. Realizing that evaluation was seen as a process of extracting data and sending it "somewhere else," Janice's goal shifted from building consensus to "giving data back to the people."

In describing the subtle but important contrast between the "old" and "new" ways of doing evaluation, Janice said, "I think people end up using evaluation to do things that are more, you know, 'Were the materials clear, was the coffee hot,' you know, much more satisfaction-oriented. But this is much more, 'What was the goal of your meeting? And did you do that?'" As an example of the latter way of doing evaluation, she described a meeting that the CCI held in partnership with the local police department. At the beginning of the meeting, she said, they asked attendees to answer questions like, "Did they trust the police? Did they trust that they would be treated well by the police? If they called, did they trust that the police would help them with their problem?" They also asked questions about whether attendees felt their children were safe at school. The data were then presented back to the attendees and the police. According to Janice, "People were happy to see that people did feel safe, and people felt better about the role of the police on the school campuses. We're seeing that people do feel safe in the neighborhood. And in the group, we saw that there are very few people who didn't feel they'd be treated well by the police if they called. And, um, we didn't know the answer to that question.
And we felt good about seeing that, and I think the police felt good about seeing that.” Here, then, data were used not to prove a point or to get funding, but to inquire honestly into the experiences of the residents of the community. In this particular example, the result was that CCI stakeholders who could potentially have vastly different perspectives (i.e., residents, the police, and CCI backbone staff) were able to reveal common ground, creating a foundation of “feeling good” and trusting one another.
Example 2. Sarah is a leader of a community foundation in the South, focused on improving educational outcomes for children in that community. A scientist by training, Sarah thrives on numbers and data, and has always been a "background" person at her organization. Recently, her foundation partnered with the local school district to create a community dashboard, so that reports on youth academic outcomes could be pulled and analyzed in "real time." The specifics of the dashboard were not articulated beyond that, but it was Sarah's job to convince a number of community-based organizations (CBOs) to contribute to the effort.

In Sarah's case, the challenge was not necessarily one of creating trust. Representatives of the various CBOs were supportive of the idea of the dashboard in theory, but they seemed hesitant to commit resources beyond verbal approval. As Sarah said, it seemed they had "barriers" up. However, after learning and practicing an exercise called "public narrative" (Ganz, 2009), in which leaders use personal stories that reveal a values-driven meaning for their individual engagement in the work, then create collective stories that engage those same values and that relate to the current moment in time, Sarah began to approach her meetings with these CBOs differently. In Sarah's words: "I was just so in the meetings to drive right into the agenda. It was never make a story of it, it's just get through the agenda. That was just my work style, but I found that doing [public narrative] now before a meeting really, really helped. It allows them to be relaxed. They don't come in as rigid either. They begin to kind of listen, barriers down, to just have an honest conversation."

One of the consequences of this new, relational approach to her data work was that Sarah realized that a barrier to these CBOs committing was not that they did not believe in the mission of the CCI, but that they did not see how they, as individual organizations, fit in. In describing her interactions with representatives from a rape crisis and family crisis center, Sarah said, "They was like, 'I don't—we've been trying to figure out why you invited us to this meeting. We don't see how we fit in.'" However, after Sarah "told the story of why I'm involved and how I see this involvement and how I'm asking them to get involved, they got it and it made complete sense to them." She was then able to "show them how if the families deal with the crisis, most likely the child is not come to school, or if they do come they're gonna act out or shut down. You affect both attendance and behavior … this is the healthy side of the indicators. Then there's the smart side. They saw if the healthy isn't working, smart isn't gonna move … They realize 'Okay, we fix the healthy side' … and now they gotta go back and figure out a better way of monitoring their data to report back." Not only did this more relational leadership approach help Sarah create commitment toward the data work, it opened her eyes to the fact that she "went in probably a little naïve myself, thinking people just got it."

A major goal of a dashboard like the one Sarah's community was developing is to define and represent the landscape of work for the CCI, that is, the conceptual map that is meant to guide the stakeholders through uncertainty. However, a fundamental finding of cognitive research is the importance of engaging with prior conceptions before presenting new conceptions.
Sarah had been unaware of the extent to which she had not engaged her stakeholders' prior conceptions until she used a more narrative way of knowing that gave them the room to see how their perspectives on and approaches to the work fit into the larger picture.

Example 3. During a bi-monthly leadership committee meeting, Nancy, the Executive Director of the youth collaborative (the CCI), asked the partners to voice major concerns they were facing. Ray, one of the two superintendents in the group, said, "I'm really worried about truancy. I think we have a serious problem on our hands." The other ten partners gathered around the table quickly agreed. Julie, the elementary school district superintendent, spoke up: "I'm also concerned, and I'm afraid it costs us money." Others nodded; money was at stake, and this is not a wealthy city. "If truant residents receive our services they face serious sanctions! We need to
know more about this," said the Human Services Director. The Chief Physician of the local hospital also weighed in: "Perhaps there are health issues at the root of this. Maybe we can help." The head of the county health department also wondered about the nature of the challenge. Nancy noted agreement about a common challenge this city appeared to face. They all agreed they needed to know more in order to address their concerns. They turned to the research partner at the table and asked for support in understanding the actual problem with truancy in their community.

Over the next year, the research partner collaborated with the CCI members by analyzing longitudinal data on student attendance in the city. Additionally, data were analyzed regarding utilization of welfare services and a variety of educational outcomes. They quickly learned that chronic absence, not truancy, was the real concern. Truancy was too low a bar (missing 3 days), whereas chronic absence amounted to missing 10% or more of the school year (a distinction made concrete in the sketch following this example). Investigating these questions in partnership yielded many further questions and built capacity along the way. For example, the collaborative group first needed to understand the meaning of "regression adjustment," as the data analysis necessary for this study was somewhat more sophisticated than these partners were typically used to seeing. A variety of predictive factors and consequences of chronic absence were explored.

When the research partner shared the findings with the leadership committee, many of the results were surprising, and the implications for action pointed in multiple directions. For example, one of the grades with the largest number of chronic absences was kindergarten. This finding surprised the entire group. The high school also saw high levels of chronic absence, and the researchers noted gender differences that did not exist in earlier years. Once the consequences of chronic absence were laid out (e.g., lower standardized test scores in elementary school, lower GPA in secondary school, continued chronic absence), implications for action became clearer. For example, the elementary school superintendent hired an outside firm to send chronic absence and truancy letters to parents after only three absences. Additionally, a full-time attendance coordinator was hired and focused on exploring the persistent cases. One early "win" was her discovery that parents were keeping children home for lice because they didn't know how to address it. Further, the CCI set up an attendance working group to engage in a community-wide communications effort. They used one-pagers, targeted at different audiences, highlighting research and recommendations. The health care providers discovered that they were inadvertently contributing to the problem by not asking families about the number of days students had been absent when providing excuse notes for school.

Nancy and the leadership team of this CCI had a very successful experience using research to improve the outcomes of their youth. They enjoy describing this example, and others, to anyone interested in learning the benefits of using data for improvement.
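To make the truancy-versus-chronic-absence distinction concrete, the following is a minimal sketch, in Python, of how a research partner might flag both statuses from raw attendance records. Only the two thresholds (3 missed days for truancy; 10% or more of the school year for chronic absence) come from the example above; the record fields, the 180-day school year, and all names are illustrative assumptions, and the sketch does not attempt the regression-adjusted analysis the research partner actually conducted.

```python
from dataclasses import dataclass

# Assumed for illustration only: a 180-day school year and these record
# fields. Only the two thresholds come from the example in the text.
SCHOOL_YEAR_DAYS = 180
TRUANCY_THRESHOLD_DAYS = 3      # "too low a bar": missing 3 days
CHRONIC_ABSENCE_RATE = 0.10     # missing 10% or more of the school year

@dataclass
class AttendanceRecord:
    student_id: str
    grade: str
    days_absent: int

def is_truant(record: AttendanceRecord) -> bool:
    """Flag students who have missed at least the truancy threshold."""
    return record.days_absent >= TRUANCY_THRESHOLD_DAYS

def is_chronically_absent(record: AttendanceRecord) -> bool:
    """Flag students who have missed 10% or more of the school year."""
    return record.days_absent >= CHRONIC_ABSENCE_RATE * SCHOOL_YEAR_DAYS

# A kindergartner who misses 12 days is truant (12 >= 3) but not
# chronically absent (12 < 0.10 * 180 = 18), which is why the two
# measures can point a community toward very different problems.
example = AttendanceRecord(student_id="s-001", grade="K", days_absent=12)
print(is_truant(example), is_chronically_absent(example))  # True False
```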
Implications

Considering the previous examples and the description of the type of leadership needed to effectively use data and evidence in CCIs, we propose two implications for the work moving forward: 1) that we need new conceptions of the roles of leaders in data-driven CCI work, and 2) that the types of data and evidence considered relevant to CCI work may need to be rethought.

Implication 1: In order to facilitate coordinated action toward positive outcomes in CCIs, we must conceive of new roles for leaders engaging in data-driven work.

The importance of trust building cannot be overemphasized, nor can the process be rushed. Trusting relationships alone, however, cannot carry the arduous work of a CCI. Much more leadership is needed throughout to ensure success, from building a strong infrastructure to securing solid agreements among partners (data-use contracts that ensure confidentiality, for
example). Empathy and respect from the researchers for the community enabled these relationships to flourish. The Gardner Center's conceptualization of partnership, for instance, involves a commitment to parity. Additionally, the research products of the CCI need to have meaning for all parties. This means that the researchers are not the only partners to have a stake in the research findings. By ensuring that the data will have actionable results, which should include co-construction of research and evaluation initiatives, all parties will make an investment in the work. Frameworks for this kind of collective investment can be found in empowerment evaluation, which positions beneficiaries or primary stakeholders in social interventions to evaluate those interventions themselves (Fetterman, 1994), and developmental evaluation, which is meant to be more responsive and flexible to changing evaluation contexts than traditional approaches (Patton, 2011). Moving forward, the identification of clear roles that integrate trust-building and data interpretation would be of great value. These roles can originate in academia (Nelson, London, & Strobel, in press) or in practice (e.g., the social worker as "culture broker," as conceived by Palinkas, 2010), but the key is for this person not just to have influence over data, but also the ability to interpret and reconcile multiple perspectives on data-driven work.

Implication 2: The types of evidence that are used may need to be rethought, such that they are more useful both in providing guidance to strengthen CCIs and in gaining buy-in from the CCI on using the data to improve the effort, a process that privileges narrative knowing over paradigmatic knowing.

In general, the scientific process is one of adherence to an incremental and systematic process, rather than the generation of revolutionary outcomes (Kuhn, 1970). However, considering the intensely contextual and practical motivation behind CCIs, we wonder what the value of scholarly study could and should be to people directly working to implement and support CCIs in their own communities. Smyth and Schorr (2009) have provided an excellent review of methods for studying social interventions, arguing for a "more inclusive approach" (p. 17) to evaluation and research and laying out a number of lessons and guidelines that are very practically oriented, including privileging adaptation over showing impact and setting "good enough" standards for evidence. On the other side of the debate, the current evidence-based policy, program, and practice movement is continually pushing for wider use of randomized controlled trials as the "gold standard" for assessing impact.

To ground this productive tension between research and practice, we return to Bruner's (1986) two ways of knowing, which imply that there is a need for more useful evidence on CCIs. If the goal of CCI-related research is to generate knowledge that is not only paradigmatically true but also lifelike to particular CCI stakeholders, the field must produce more evidence that is directly and explicitly connected to educational outcomes in order to gain traction in the education research, policy, and practice communities. The growth of randomized controlled trials of educational interventions at the individual, classroom, and even school level, as promoted by organizations like the Institute of Education Sciences in the U.S.
Department of Education and by a burgeoning field of economists applying their methods to education (e.g., the Poverty Action Lab at M.I.T. and The Lab for Economic Applications and Policy at Harvard), is certainly to be applauded. However, to address large-scale educational inequities, the effectiveness of comprehensive approaches like Communities That Care could conceivably inspire similar interventions and outcomes in education, and such approaches should be studied in appropriate ways that generate evidence that is useful and lifelike beyond the scientific context. We therefore ask: what types of evidence would in fact be of use to implementers and supporters of CCIs? That is, what types of evidence would actually a) be read and used to influence CCI strategies and actions, and b) yield, through their use, more effective operation and achievement of CCI goals, outcomes, and impacts? Apart from issues
of representation and voice, Auspos (2012) reports that a gathering of policymakers and researchers on this very topic recommended "helping practitioners make better use of data" (p. 7). Further research on how CCI stakeholders are currently approaching and executing this task would certainly sharpen scholarly understanding of what kinds of evidence are useful for improving CCIs. Perhaps more importantly, though, given the repeatedly documented and discussed limitations of traditional, clinically rooted methodological approaches to studying complex, community-based interventions, as well as the small but growing body of evidence that community-based interventions can be rigorously studied in alternative ways and shown to have positive effects, it becomes particularly important to learn whether and how these new approaches can ultimately yield greater uptake of evidence by CCI practitioners to improve their work.

Conclusion

The articles in this special issue have provided us with theories, empirical evidence, and practice-oriented principles for understanding both the opportunities for and the challenges facing CCIs as a means of improving population-level outcomes for youth. The complexity of the social and developmental systems in which CCIs operate demands both data-informed strategy as a means of reducing complexity and relationship-oriented moral leadership that sustains collective momentum through times of uncertainty. To promote system-wide adaptation and collective learning, then, CCI leaders must learn how to use evidence in a way that not only considers, but respects and integrates, the perspectives of multiple stakeholders. By creating new roles in which these responsibilities are explicitly connected, as well as generating new kinds of evidence that are oriented toward both narrative and paradigmatic knowing, we suggest not only that CCIs will be more likely to succeed, but that researchers interested in CCIs will find greater success in having the products of their analysis taken up by CCI stakeholders.

References

Auspos, P. (2012). Developing and using data and evidence to improve place-based work. Proceedings from a meeting convened by The Aspen Institute Roundtable on Community Change with support from The Annie E. Casey Foundation. New York: The Aspen Institute Roundtable on Community Change.
Blueprints for Healthy Youth Development (2014). Blueprints database standards. Retrieved from http://www.blueprintsprograms.com/resources/Blueprints_Standards_full.pdf
Brandtstädter, J. (2000). Emotion, cognition, and control: Limits of intentionality. In W. J. Perrig, & A. Grob (Eds.), Control of human behavior, mental processes, and consciousness (pp. 3–16). Mahwah, NJ: Erlbaum.
Brandtstädter, J. (2006). Action perspectives on human development. In R. M. Lerner (Ed.), Handbook of child psychology (6th ed.). Theoretical models of human development, Vol. 1 (pp. 516–568). Hoboken, NJ: Wiley.
Bronfenbrenner, U. (1986). Ecology of the family as a context for human development: Research perspectives. Developmental Psychology, 22(6), 723–742.
Bronfenbrenner, U., & Morris, P. A. (2006). The bioecological model of human development. In W. Damon, & R. M. Lerner (Eds.), Handbook of child psychology (6th ed.). Theoretical models of human development, Vol. 1 (pp. 793–828). New York: John Wiley.
Bruner, E. M. (1984). Text, play, and story: The construction and reconstruction of self and society. American Ethnological Society.
Bruner, J. (1986). Actual minds, possible worlds. Cambridge, MA: Harvard University Press.
Brunstein, J. C., Schultheiss, O. C., & Maier, G. W. (1999). The pursuit of personal goals: A motivational approach to well-being and life adjustment. In J. Brandtstädter, & R. M. Lerner (Eds.), Action and self-development: Theory and research through the life span (pp. 169–196). Thousand Oaks, CA: Sage.
Coburn, C. E., & Turner, E. O. (2012). The practice of data use: An introduction. American Journal of Education, 118(2), 99–111.
Corcoran, T., Fuhrman, S. H., & Belcher, C. L. (2001). The district role in instructional improvement. Phi Delta Kappan, 83(1), 78–84.
Data Quality Campaign (2011). Data: The missing piece to improving student achievement. Retrieved from http://www.dataqualitycampaign.org/files/dqc_ipdf.pdf
Data Quality Campaign (2011). Data for action. Washington, DC.
Diamond, J. B., & Cooper, K. (2007). The uses of testing data in urban elementary schools: Some lessons from Chicago. Yearbook of the National Society for the Study of Education, 106(1), 241–263.
Feltovich, P. J., Spiro, R. J., & Coulson, R. L. (1993). Learning, teaching, and testing for complex conceptual understanding. In N. Fredriksen, & I. Bejar (Eds.), Test theory for a new generation of tests (pp. 181–217). Hillsdale, NJ: LEA.
Fetterman, D. M. (1994). Steps of empowerment evaluation: From California to Cape Town. Evaluation and Program Planning, 17(3), 305–313.
Fullan, M. (2001). Leading in a culture of change. San Francisco: Jossey-Bass.
Ganz, M. (2009). Why David sometimes wins: Leadership, organization, and strategy in the California farm worker movement. Oxford University Press.
Gibson, C., Smyth, K., Nayowith, G., & Zaff, J. (2013, September 19). To get to the good, you gotta dance with the wicked. Stanford Social Innovation Review Blog. Retrieved from http://www.ssireview.org/blog/entry/to_get_to_the_good_you_gotta_dance_with_the_wicked
Grotzer, T. A. (2012). Learning causality in a complex world: Understandings of consequence. Lanham, MD: Rowman & Littlefield.
Honig, M. I., & Coburn, C. E. (2005). When districts use evidence for instructional improvement: What do we know and where do we go from here? Voices in Urban Education, 6, 22–29.
Honig, M. I., & Coburn, C. E. (2008). Evidence-based decision making in school district central offices: Toward a policy and research agenda. Educational Policy, 22(4), 578–608.
Kim, B. K. E., Oesterle, S., Catalano, R. F., & Hawkins, J. D. (2015). Change in protective factors across adolescent development. Journal of Applied Developmental Psychology, 40, 26–37.
Knapp, M. S., Copland, M. A., & Swinnerton, J. A. (2007). Understanding the promise and dynamics of data-informed leadership. Yearbook of the National Society for the Study of Education, 106(1), 74–104.
Kuhn, T. S. (1970). The structure of scientific revolutions (2nd ed.). Chicago, IL: University of Chicago Press.
Lanspery, S., & Hughes, D. M. (2015). Homegrown partnerships making a difference for youth. Journal of Applied Developmental Psychology, 40, 38–46.
Lerner, R. M. (2012). Essay review: Developmental science: Past, present, and future. International Journal of Developmental Science, 6, 29–36.
London, R. A., & McLaughlin, M. (2014). The youth sector: Live and virtual. Paper prepared for edited volume published by the San Francisco Federal Reserve Bank and the Urban Institute.
Manyika, J., Chui, M., Brown, B., Bughin, J., Dobbs, R., Roxburgh, C., et al. (2011). Big data: The next frontier for innovation, competition, and productivity. McKinsey Global Institute, May 2011. Retrieved from http://www.mckinsey.com/~/media/McKinsey/dotcom/Insights%20and%20pubs/MGI/Research/Technology%20and%20Innovation/Big%20Data/MGI_big_data_full_report.ashx
Marsh, J. A. (2006). Democratic dilemmas. Albany: State University of New York Press.
Marsh, J. A., Pane, J. F., & Hamilton, S. (2006). Making sense of data-driven decision making in education: Evidence from recent RAND research. Santa Monica, CA: RAND.
McLaughlin, M., & London, R. A. (Eds.). (2013). From data to action: A community approach to improving youth outcomes. Cambridge, MA: Harvard Education Press.
Nelson, I. A., London, R. A., & Strobel, K. R. (in press). Reinventing the role of the university researcher. Educational Researcher.
Nelson, S. R., Leffler, J. C., & Hansen, B. A. (2009). Toward a research agenda for understanding and improving the use of research evidence. Portland, OR: Northwest Regional Educational Laboratory.
Nickerson, R., Perkins, D., & Smith, E. (1985). The teaching of thinking. Hillsdale, NJ: Lawrence Erlbaum Associates.
Nutley, S. M., Walter, I., & Davies, H. T. O. (2007). Using evidence: How research can inform public services. Bristol: The Policy Press.
Osher, D., Amos, L., Jones, W., & Coleman, V. (2015). Comprehensive community initiatives in education reform: The case of Say Yes To Education. Journal of Applied Developmental Psychology, 40, 47–56.
Overton, W. F. (2013). Relationism and relational developmental systems: A paradigm for developmental science in the post-Cartesian era. In R. M. Lerner, & J. B. Benson (Eds.), Advances in child development and behavior: Vol. 44. Embodiment and epigenesis: Theoretical and methodological issues in understanding the role of biology within the relational developmental system, Part A: Philosophical, theoretical, and biological dimensions (pp. 24–64). London: Elsevier.
Palinkas, L. A. (2010). Commentary: Cultural adaptation, collaboration, and exchange. Research on Social Work Practice, 20(5), 544–546.
Patton, M. Q. (2008). Utilization-focused evaluation (4th ed.). Thousand Oaks, CA: SAGE Publications.
Patton, M. Q. (2011). Developmental evaluation: Applying complexity concepts to enhance innovation and use. Guilford Press.
Rittel, H. W. J., & Webber, M. M. (1973). Dilemmas in a general theory of planning. Policy Sciences, 4, 155–169.
Robinson, C. M. (1988). Improving education through the application of measurement and research: A practitioner's perspective. Applied Measurement in Education, 1(1), 53–65.
Sheldon, K. M., & Emmons, R. A. (1995). Comparing differentiation and integration within personal goal systems. Personality and Individual Differences, 18(1), 39–46.
Smyth, K. F., & Schorr, L. B. (2009). A lot to lose: A call to rethink what constitutes "evidence" in finding social interventions that work. Cambridge, MA: Malcolm Wiener Center for Social Policy Working Paper Series. Retrieved from http://www.hks.harvard.edu/ocpa/pdf/A%20Lot%20to%20Lose%20final.pdf
Spillane, J. P. (2000). Cognition and policy implementation: District policymakers and the reform of mathematics education. Cognition and Instruction, 18(2), 141–179.
Spillane, J. P., & Callahan, K. A. (2000). Implementing state standards for science education: What district policy makers make of the hoopla. Journal of Research in Science Teaching, 37(5), 401–425.
Spillane, J. P., Reiser, B. J., & Reimer, T. (2002). Policy implementation and cognition: Reframing and refocusing implementation research. Review of Educational Research, 72(3), 387–431.
Weiss, C. H., Murphy-Graham, E., & Birkeland, S. (2005). An alternate route to policy influence: How evaluations affect D.A.R.E. American Journal of Evaluation, 26, 12–30.
Zaff, J. F., Donlan, A. E., Jones, E. P., & Lin, E. S. (2015). Supportive developmental systems for children and youth: A theoretical framework for comprehensive community initiatives. Journal of Applied Developmental Psychology, 40, 1–7.