Realizing an effectiveness revolution in environmental management

Journal of Environmental Management 92 (2011) 2130–2135. doi:10.1016/j.jenvman.2011.03.035

Matt Keene a,*, Andrew S. Pullin b

a United States Environmental Protection Agency, Office of Policy, Evaluation Support Division, MC 1807T, 1200 Pennsylvania Avenue, NW, Washington, D.C. 20460, United States
b Centre for Evidence-Based Conservation, School of the Environment, Natural Resources and Geography, Bangor University, Bangor, Gwynedd LL57 2UW, UK
* Corresponding author. Tel.: +1 202 566 2240. E-mail addresses: [email protected] (M. Keene), [email protected] (A.S. Pullin).
This work is not a product of the United States Government or the United States Environmental Protection Agency, and the author is not doing this work in any governmental capacity. The views expressed are those of the author only and do not necessarily represent those of the United States or the US EPA.

Article history: Received 8 December 2010; received in revised form 21 March 2011; accepted 25 March 2011; available online 22 April 2011.

Abstract

The environmental movement of the 20th century has evolved into a large, diverse and well-financed global community that is increasingly required to prove its worth. Though the environmental sector collects and uses data to determine the status of ecological and social systems, the effectiveness of the programs and policies it uses to affect this status remains largely untested. As governments and donor institutions insist on greater transparency, accountability and evidence of what works and what does not, much is being learned from other fields (e.g. health services, education, international development) and increasingly sophisticated approaches are emerging to manage effectiveness. For example, program evaluation, adaptive management, and systematic review provide frameworks and methods to collect and use information to measure and improve performance. However, the critical data and collaborations necessary for an effectiveness revolution are marginalized by technical, cultural and political obstacles. Learning from other fields, the environmental sector must exploit key leverage points, such as flows of information and self-organization, to overcome impediments and create incentives to initiate and realize an era of effectiveness in environmental management. Published by Elsevier Ltd.

Keywords: program evaluation; adaptive management; systematic review; performance measurement; policy effectiveness; collaboration; transparency and accountability; information sharing

It is only through reflection that we change our history. (Humberto Maturana)

1. Introduction

Visible and apparent environmental degradation catalyzed the public's demand for environmental protection in the latter half of the 20th century. Although the environmental movement can be traced back to much earlier origins, it perhaps began in earnest in the 1960s, with legislators passing laws and creating regulatory agencies and with citizen-centric advocacy organizations campaigning for stronger public health standards and biodiversity conservation (Chertow and Esty, 1997; Fiorino, 2001; Durant et al., 2004). Whereas environmental problems may once have appeared isolated, research reveals unbounded interactions between complex social and natural systems that give rise to dynamic and severe challenges requiring costly intervention (Lubchenco, 1998).

What was last century's environmental movement has rapidly evolved into a large, diverse and well-financed sector of organizations and individuals working (though not necessarily in coordination) to achieve the public's common goal of environmental health. However, increased activity is not the same as increased effectiveness, that is, an improvement in the degree to which an intended result is achieved. Increased effectiveness is accomplished by testing and evaluating activities, synthesizing available and relevant data, and communicating results so that future decisions more frequently achieve intended changes. Though effectiveness is not a new concept in the environmental sector (Stem et al., 2005), the sector has not embraced the 'evidence-based frameworks' that have been credited with catalyzing effectiveness revolutions in other fields (e.g. medicine and public health) (Stevens and Milne, 1997). Effectiveness revolutions have followed a process by which decision-making shifts from reliance on individual personal judgment to reliance on consensus derived from the best available scientific evidence (Pullin and Knight, 2009). It is generally accepted that there is a need for more evidence-based environmental management; program and policy evaluation, adaptive management and systematic review are approaches that the sector is adopting to meet this need. However, the relevant data, methodological sophistication and collaboration necessary for an effectiveness revolution are marginalized by technical, cultural and political obstacles. Learning from other fields, the environmental sector must exploit key leverage points, such as flows of information and self-organization, where small adjustments cause changes that overcome impediments and create incentives to hasten a transition into an era of effectiveness in environmental management.

2. The status of the environmental sector

In 1970 there were only 10 ministerial-level departments of environment in the world (Ausubel et al., 1995). Today most of the approximately 195 national governments have at least two agencies with environmental missions: one for environmental protection and one for natural resources management (Trzyna, 2008). In addition to the thousands of national and local government agencies worldwide, the number of non-profit environmental organizations in the US has grown almost every year since 1960, with an annual growth rate of 4.6 percent since 1995 compared with 2.8 percent for all other types of non-profits. By 2005 there were more than 26,000 US-based environmental organizations using multiple strategies to address myriad issues, including bird conservation (225), water resources and wetlands management (7291), pollution abatement (695), recycling (443) and environmental education (1213) (Straughan and Pollak, 2008).

Public investment in the applied environmental sciences has never been higher. The US federal government spent about $10 billion in 1970 on protection of natural resources and the environment (Ausubel et al., 1995). In 2006 the collective resources of the United States Environmental Protection Agency (USEPA) and the National Oceanic and Atmospheric Administration (NOAA), just two of the many US federal environmental agencies, totaled approximately $11 billion and 30,000 employees. The combined 2010 budgets of the two agencies were nearly $15 billion.
Overall, government agencies, environmental non-profits and other institutions may spend as much as $120 billion annually on ecosystem protection in the United States (Christensen, 2002). The combined revenue of all US-registered environmental non-profits was just over $8.2 billion in 2005, with 21% of revenues accruing to four of the largest non-profits (down from 41% in 1989) (Straughan and Pollak, 2008). According to the US Internal Revenue Service, in 2006 ten of the largest ENGOs spent $1.5 billion and employed about 7000 staff.

Specialization and unique research niches improve the large and diverse environmental sector's capacity to use interdisciplinary knowledge, expertise and strategies to address challenges in complex systems. Diverse challenges and increasing investment in specialized knowledge and skills have spawned a wide-ranging portfolio of problem-solving approaches: protected areas, regulations, international treaties, enforcement, public involvement, formal education, eco-fees and conservation payments, for instance. The environmental sector uses these and a plethora (at least 20 broad approaches and 40 specific strategies) of other context-specific approaches (Salafsky et al., 2002). Though their activities and objectives diverge depending on context, environmental organizations have similar fundamental missions, visions and goals related to principles such as clean air, clean water and biodiversity conservation (Redford et al., 2003; Heimlich, 2010) (Fig. 1).

The missions and goals of environmental organizations focus work within natural and social systems that are described in detail through many discrete initiatives that facilitate the collection and management of data. For example, USGS (http://geodata.com), New York State's Ecosystem-based Management Initiative (http://nyoglatlas.org), NatureServe, USEPA's Data Finder (http://www.epa.gov/data), DataBasin, Encyclopedia of Life, NEON, Woodland Trust, The Conservation Registry and LandScope provide access to large amounts of integrated environmental information.

Fig. 1. Overlapping missions, visions and goals of five large environmental organizations: National Oceanic and Atmospheric Administration (NOAA), Environmental Protection Agency (EPA), World Wildlife Fund (WWF), Wildlife Conservation Society (WCS) and The Nature Conservancy (TNC).

State of the Environment reports, such as those produced by national and sub-national government agencies, foundations and international institutions, also collect, manage and report on data and information of all kinds (NCSE, 2010). An entirely new dimension of data generation has arisen with the explosion of current and potential applications of sensor networks, where increasingly vast arrays of sensor nodes collect, integrate and communicate environmental and social information across scales (Chong and Kumar, 2003; Hart and Martinez, 2006). The Federation of Earth Science Information Partners is a broad-based community working to improve the sharing, access, management and use of potentially infinite quantities and types of environmental information (www.esipfed.org). Similarly, the Group on Earth Observations (GEO), with a membership of 85 countries and the European Commission, is improving data sharing and coordinating progress toward a common, transferable format for all Earth observation data (http://www.earthobservations.org).

3. Requiring evidence of effectiveness

Governments, foundations and international institutions invest hundreds of billions annually in environmental organizations and increasingly require them to present evidence of the effectiveness of their work. In early 2011, US President Barack Obama signed into law the Government Performance and Results Modernization Act of 2010 (GPRA), a bill to amend and refine the GPRA of 1993, which required agencies to measure program outputs and outcomes (GPRA, 1993). The new Act establishes Chief Operating Officers and Performance Improvement Officers for each federal agency and requires each agency to report quarterly on priority outcome-oriented goals and performance plans using a single, easily accessible public website (GPRA, 2010). In the years prior to GPRA modernization, the Program Assessment Rating Tool (PART) had been used extensively across US federal agencies to identify program strengths and weaknesses and to track progress toward GPRA goals (Radin, 2000; Gilmour, 2006). In the proposed 2011 and 2012 budgets, President Obama allocated significant funding ($100 million for FY2011) to federal agencies to conduct program evaluations that will assess impacts and provide evidence for planning, policymaking and budgeting (OMB, 2009). The US government's new Chief Performance Officer is integral to this work and is tasked with increasing effectiveness, efficiency and accountability and with requiring agencies to set goals and provide evidence of their performance (Zients, 2010).

Developed and developing countries are also crafting policies requiring more formal and systematic attention to program evaluation and performance measurement (Nielsen and Ejler, 2008). The European Commission has specifically emphasized the use of impact assessment and utility-focused evaluation, the need to integrate evaluation into the program cycle to achieve better decision-making, better regulation and coherence with strategic planning, and an improved focus on results and transparency (EC, 2005, 2007). The government of Canada approved a federal Policy on Evaluation in 2009 that emphasizes the need to create a base of evaluation evidence that is used to support policy and program improvement, decision-making, and transparency and accountability (STBC, 2009). Donors also want assurances that their investments are generating intended results (Ferraro and Pattanayak, 2006).
The Gordon and Betty Moore Foundation, the Doris Duke Charitable Foundation and other groups that contribute hundreds of millions of dollars to conservation projects are increasingly requiring grantees to provide evidence that explicit environmental and social outcomes have been achieved in a specific timeframe. Governments and other institutions that have pledged development assistance to help achieve the United Nations Millennium Development Goals are publicly held accountable to their commitments by over sixty measures of success (UN, 2009).

3.1. Measuring and managing effectiveness in the environmental sector

The frameworks and methods necessary to efficiently collect and use information to measure and improve the effectiveness of programs, policies and specific interventions are well established inside and outside of the environmental sector (Stem et al., 2005). Government agencies (e.g. USEPA, the European Union Environment Agency, the US Agency for International Development) and international institutions (e.g. the Global Environment Facility, the United Nations Environment Program, the World Bank) host units dedicated to program and policy evaluation, while national and regional professional evaluation societies are found worldwide (Love and Russon, 2000). The field of evaluation provides a theoretical foundation and practical tools for performance measurement, ongoing improvement, accountability, improved program and policy design, adaptive management and determinations of effectiveness. Recent advances in developmental and adaptive evaluation integrate complexity science and post-normal science to account for the unbounded evolution of nonlinear interactions between social and natural systems (Hildén, 2009; Patton, 2010).

The policy sciences, a field that uses collaboration to identify and solve difficult problems, contributes to the theoretical foundation for the practice of adaptive management applied to managing natural resources and conservation initiatives (Lasswell, 1970; Holling, 1978; Ascher, 1986; Clark, 2002). The Conservation Measures Partnership (CMP) is a joint venture of major NGOs and other collaborators that develops a set of open standards for adaptive management and works collectively to develop, test and promote common principles and tools to assess and improve the effectiveness of conservation practices (CMP, 2007). The field of program evaluation is fundamental to the theory and practice of assessing the management effectiveness of protected areas, which improves program planning, adaptive management and accountability (Patton, 1997; Hockings et al., 2000; Pomeroy et al., 2005; Hockings et al., 2006). Organizations and agencies responsible for the management of protected areas, as documented during the 2003 World Parks Congress, have made evaluating their effectiveness a top priority, which has, at least in part, driven the completion of over 6300 assessments of management effectiveness in 100 countries (Leverington et al., 2008).

The Collaboration for Environmental Evidence (CEE) is an open partnership of scientists, managers and practitioners who share the goals of a sustainable global environment and the conservation of biodiversity. The Collaboration uses the process of systematic review, as established by the medical community, to synthesize scientific evidence on environmental and conservation issues of greatest concern to policymakers and practitioners, and maintains an open-access library of completed reviews (http://www.environmentalevidence.org).
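To make the synthesis step of a systematic review concrete, the sketch below pools effect sizes from several primary studies using a DerSimonian-Laird random-effects model, one common way reviews combine quantitative evidence. It is illustrative only: the study effects, their variances and the Python implementation are invented for this example and are not drawn from the CEE guidelines or from any review cited here.

```python
# Illustrative only: a minimal DerSimonian-Laird random-effects meta-analysis,
# the kind of quantitative synthesis step a systematic review may include.
# All study effect sizes and variances below are invented for the example.

import math

def random_effects_pool(effects, variances):
    """Pool study effect sizes (e.g. log response ratios) and their variances."""
    k = len(effects)
    w = [1.0 / v for v in variances]                      # fixed-effect weights
    fixed_mean = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    # Cochran's Q measures between-study heterogeneity.
    q = sum(wi * (e - fixed_mean) ** 2 for wi, e in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                    # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]        # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), tau2

# Hypothetical effects of a management intervention reported by five studies.
effects = [0.30, 0.12, 0.45, -0.05, 0.22]
variances = [0.02, 0.05, 0.04, 0.03, 0.01]
mean, ci, tau2 = random_effects_pool(effects, variances)
print(f"pooled effect = {mean:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f}), tau^2 = {tau2:.3f}")
```

The pooled estimate and its confidence interval are the kind of summary that can inform policy or practice decisions, while the heterogeneity estimate signals how consistent the underlying studies are.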
4. Impediments to an effectiveness revolution

Though calls for accountability and transparency are becoming more frequent and specific, data are not consistently or systematically used to evaluate the effectiveness of programs and policies (Ferraro and Pattanayak, 2006). They are not used to test which interventions (actions taken to affect an outcome) accomplish goals and which do not, and so the effectiveness of those interventions remains unknown (Sutherland et al., 2004; Pullin and Knight, 2009). Consequently, governments, foundations and non-profits finance environmental research that generates growing amounts of fragmented and inaccessible data that do not underpin practice.

Complex environmental, social and economic systems create methodological challenges for measuring and evaluating the effectiveness of programs and policies, especially in attributing changes in condition across temporal and spatial scales to a particular intervention (Bruyninckx, 2009; Ferraro, 2009; Gullison and Hardner, 2009; Hildén, 2009; Kennedy et al., 2009). Partly because of these challenges, monitoring initiatives have not been explicitly tied to program goals, and evaluations of programs have typically focused on measuring outputs (i.e. the creation of program products and services) rather than outcomes (i.e. changes in environmental and social condition) (Knapp and Kim, 1998a; EPA, 2009). Output-oriented evaluation results in data management and use that meet only an individual's or an organization's needs, without establishing the baselines for impact evaluations that would be relevant to the sector as a whole. Lacking data suitable for demonstrating progress toward key environmental and social outcomes, the environmental sector struggles to provide cogent recommendations on key issues of policy or practice (Sutherland et al., 2006).
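To illustrate the counterfactual reasoning that attributing outcomes to a particular intervention requires, the sketch below computes a simple difference-in-differences estimate of a program's impact. It is a minimal, hypothetical example: the site groups, the water-quality index values and the parallel-trends assumption are all invented for illustration and are not taken from any study cited in this paper.

```python
# Illustrative only: a minimal difference-in-differences estimate, one simple way
# to attribute an outcome change to an intervention rather than to background
# trends. All site names and water-quality values are hypothetical.

# Mean outcome (e.g. a water-quality index) before and after a program,
# at sites that received the intervention and at comparison sites that did not.
treated = {"before": 62.0, "after": 71.0}   # program sites
control = {"before": 60.0, "after": 65.0}   # comparison sites

change_treated = treated["after"] - treated["before"]   # +9: raw change at program sites
change_control = control["after"] - control["before"]   # +5: change expected from the trend alone

# The difference of the two changes is the estimated program impact,
# assuming treated and comparison sites would otherwise have moved in parallel.
impact = change_treated - change_control
print(f"estimated program impact: {impact:+.1f} index points")   # +4.0
```

The point is that the raw before-after change at program sites overstates the impact; subtracting the change observed at comparison sites approximates what would have happened anyway.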
It is unclear whether the diversity of organizations, specializations, sources of investment, activities and data helps or hinders progress toward achieving key outcomes. The increasingly decentralized nature of the sector creates the space necessary for independent thought and action, as well as the deep issue-specific specialization and accumulation of local knowledge crucial to comprehending complex systems and crafting appropriate solutions (Surowiecki, 2004). However, the sector generates relatively little data that address the question of whether the steps taken to solve problems are effective. The data describe change over time and space independently of the question of whether the practices of the sector caused the change. Without actively seeking opportunities to collaborate across organizations and disciplines to find and use relevant data to answer these questions, the environmental sector is limited in its capacity to learn from past efforts and to share what works and what does not, and so to develop an evidence base and systematically increase its effectiveness in achieving fundamental and shared goals.

Cooperation and openness are considered necessary to avoid duplicating mistakes and effort and to learn from successes and failures in order to solve environmental problems more efficiently, though even here strong evidence is lacking (Mace et al., 2000; Koontz and Thomas, 2006). The risk that ineffective practices continue increases as reports of failed cooperative initiatives go unpublished, either because responsible institutions choose not to publish them or because poorly designed programs, or ones that fail to achieve desired results, are considered unworthy of publication (Rohrschneider and Dalton, 2002; Hobbs, 2003, 2009). Another factor contributing to insularity in the sector is the perception among organizations and individuals that their programs, data and strategies are proprietary. There has been little incentive for organizations to be open and to collaborate. Competition between agencies, NGOs and academics to own issues and attract investment, as well as public and political attention, further impedes collaborative initiatives (Kleiman et al., 2000; Trzyna, 2008).

The divergent evolution of the environmental movement has been beneficial in attracting significant investment and creating space for the rise of diverse disciplines, organizations and datasets. However, widely dispersed financing and effort, without comprehensive coordination and transparency, give the impression of a lack of resources and create inefficiencies detrimental to achieving shared goals. Without clear and powerful incentives for collaboration, organizations will continue acting independently, ensuring that datasets remain disconnected, and will work together only when it is mandated or convenient.

Even where cooperation does exist within the environmental sector, the success and structure of cooperative initiatives are subject to ideological and cultural tensions across sectors and geographic and organizational scales (Rohrschneider and Dalton, 2002; Trzyna, 2008). For example, environmental NGOs based in developed countries may place a lower priority on basic environmental quality issues (e.g. air and water) that remain a priority in poor and developing countries. Political and ideological frictions further impede cooperation between developed, developing and poor countries (Van Der Heijden, 1999), as is evident in ongoing climate change negotiations.

5. Leveraging for change

Commonly owned, open-access, centralized repositories of systematic reviews that synthesize evidence of effective practice have been widely credited as the basis of an effectiveness revolution in the health services (Stevens and Milne, 1997). The Cochrane Collaboration (http://www.cochrane.org) library now holds on the order of 5000 systematic reviews undertaken to specific guidelines ensuring rigor, objectivity and transparency. Health service providers recognize the losses of effectiveness and efficiency that stem from a dearth of research synthesis. Thus, the medical community invests heavily in developing evidence-based practices and increasing data accessibility (see http://www.cochrane.org, http://www.clinicalevidence.com and www.thecaptureproject.com). Government health ministries provide the majority of the Cochrane Collaboration's $24 million annual budget (2007–2008) for infrastructure, support staff and process facilitation (Starr et al., 2009).

Evidence-based approaches to management have also been adopted by the education, criminal justice and social welfare sectors (www.campbellcollaboration.org), as well as by international development (www.3ieimpact.org). Significantly, these initiatives emphasize impact evaluation of programs as well as of single interventions, and they go beyond the rigid synthesis of randomized controlled trials to include other quantitative and qualitative methodologies that attempt to account for the intricacy of the system studied. Professionals in these sectors are using the evidence aggregated and synthesized by these initiatives to improve decisions about the services provided, the practices used and the allocation of limited resources across variable scales. Yet the environmental sector is only just beginning to apply such approaches (www.environmentalevidence.org).

What are the forces that led to global change in the health services, and how do we animate them in the environmental sector? Reinforcing feedback loops, information flows, rules, self-organization, system goals and paradigms represent leverage points: key places to intervene in a system where small adjustments can lead to big changes (Meadows, 1999). Through reflection on the effectiveness revolution in the health services, the environmental sector can learn to more purposefully and systematically exploit key leverage points to encourage beneficial behaviors in a developing system of environmental management.

Initial incentives for concerted action in the medical community grew out of ethical and economic concerns: the variation in quality of service across hospitals, coupled with the lack of evidence on which to base the allocation of limited resources across a wide range of potential interventions (Cochrane, 1972). These concerns motivated the establishment of a centralized repository (library) for systematic reviews, which created the potential for unprecedented transparency and accountability in medical research and practice (Chalmers, 1993).
Gaps in information were identified and focused the demand for evidence of what was known about the effectiveness of practice, thus increasing flows of information from diverse sources. In response, individuals and groups within the medical community organized and collaborated to develop rules for data collection, critical appraisal of data quality, prioritization and completion of systematic reviews, and the use of evidence in healthcare decisions. The ongoing self-ordering within the community, in essence, began embedding the goal of "evidence-based practice" into a reinforcing feedback loop: the better the accessibility and transparency of evidence and gaps (information flows), the more systematic reviews are initiated and used (self-organization), the better the rules governing the quality and use of systematic reviews (rules), the better the practices of healthcare (goals), and the better the transparency and accountability (Starr et al., 2009).

Despite social, methodological and ethical challenges, this feedback loop is spinning off a new paradigm in the medical community, a pervasive change in the mental model and patterns of behavior, in which the community more systematically meets needs for issue-specific research and evidence as well as expectations of transparency and accountability (Clark, 2002; Clarke et al., 2007; Starr et al., 2009).

But are public demands for quality healthcare equivalent to public demands for quality environmental health? We believe they are merging, as it becomes increasingly evident that human well-being depends upon environmental quality (McMichael et al., 2008). The equivalence will become clearer still as measurement and evaluation and evidence-based practice mature in the environmental sector. As in the health services, it will take a widespread change in attitude across the environmental sector to shift toward an evaluative culture that documents the effectiveness (or ineffectiveness) of its practices (Sutherland et al., 2004).

Infrastructure and rules, including those governing the Cochrane Collaboration, that define scope, boundaries, protocols and incentives for information sharing and collaboration can support beneficial self-organization within a community (Meadows, 1999). As discussed earlier, efficient self-organization of the environmental sector toward an effectiveness revolution is hindered by diverse impediments: complex ecological and social systems that confound high-quality and systematic measurement and evaluation; dispersed and disjointed data, organizations, knowledge, skills, experience, effort and funding; insularity nurtured by a culture that is output-oriented, afraid of failure, proprietary and not committed to bridging its differences; and a lack of incentives for a decisive shift toward a collaborative and open community working toward common goals.

By identifying and intervening at similar leverage points, the sector can begin removing impediments to an effectiveness revolution and initiate a reinforcing feedback loop similar to the one that benefits the healthcare community today. For example, the sector's governing organizations can increase information flows by creating collaborative relationships with grassroots initiatives, such as the Collaboration for Environmental Evidence and the Center for Evidence-based Environmental Policies and Programs (www2.gsu.edu/wwwwcec/ceep), to mandate access to data for measurement, evaluation and synthesis of evidence. Progressive policies such as the US Freedom of Information Act and the Open Government Initiative (www.whitehouse.gov/Open) are making strides in this direction. Other cooperative initiatives, such as the Conservation Measures Partnership, the American Evaluation Association and the Environmental Evaluators Network (www.environmentalevaluators.net), can continue to create the space for the sector to self-organize around interests in effectiveness and around overlapping missions and goals that highlight incentives for collaboration and partnerships.
Government agencies and donors can coordinate with the aforementioned initiatives to hone the practical and theoretical infrastructure necessary to generate evidence, and to develop the standards and rules that govern its quality, access and use. Funders can design incentives that encourage organizations to fill gaps in evidence by aligning research priorities with requirements for evidence. All participants in the environmental sector (NGOs, donors, government agencies, financial institutions and universities) can coordinate to establish evidence-based practice as a goal for the sector and so drive the purposeful evolution of measurement and evaluation in complex systems.
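The reinforcing loop described above, in which better information flows lead to more reviews, better rules, better practice and greater accountability, can be illustrated with a toy simulation of how a push at a single leverage point compounds over time. The sketch below is purely schematic: the parameters, the adoption and evidence variables, and the two information-sharing scenarios are invented for illustration and do not model the Cochrane Collaboration or any real program.

```python
# Illustrative only: a toy simulation of a reinforcing feedback loop in which
# accessible evidence encourages adoption of evidence-based practice, and
# adoption in turn generates more accessible evidence. All parameters are
# invented to show the qualitative effect of a leverage point.

def simulate(information_flow, years=20):
    adoption = 0.05      # fraction of organizations using evidence-based practice
    evidence = 10.0      # stock of accessible, synthesized evidence (arbitrary units)
    history = []
    for _ in range(years):
        # More adopters produce more reviews; information_flow scales how much
        # of that output becomes accessible to everyone else.
        evidence += information_flow * 50.0 * adoption
        # Adoption grows with accessible evidence, saturating at 100%.
        adoption = min(1.0, adoption + 0.002 * evidence * (1.0 - adoption))
        history.append(adoption)
    return history

low = simulate(information_flow=0.2)
high = simulate(information_flow=0.6)   # a modest push at the leverage point
print(f"adoption after 20 years: low sharing {low[-1]:.0%}, high sharing {high[-1]:.0%}")
```

Even a modest increase in the fraction of evidence that is shared changes the adoption trajectory markedly, which is the essence of a leverage point.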

Because evidence may accumulate more slowly than conservation problems arise, there is a perception that transitioning to a more evidence-based sector will incur high short-term costs with unclear benefits and delay action on critical, time-sensitive issues (Griffiths, 2004). However, collaborative health service networks grew and now share a growing evidence base for long-term benefit, including cost-effectiveness (e.g. http://www.evidence.nhs.uk). This organizational culture change has succeeded because of basic societal and market principles: customers (patients) demand that practitioner (service provider) credibility be established, while funders (e.g. taxpayers) require a demonstrable return on investment.

The information age, enriched by the emergence of heterogeneous sensor networks and the use of network science across disciplinary fields (Barabasi, 2009), gives the environmental sector additional capacity to expose leverage points around the access, credibility, analysis and dissemination of current data. These recent adjustments to information flows and analytical infrastructures create the potential for organizations and individuals to self-organize more efficiently and to answer questions about the sector's effectiveness at addressing complex, Earth-scale social, economic and ecological issues.

The public, or some segment thereof, with its desire for social well-being, is the ultimate client of each member of the environmental community. The public can choose whether to support environmental organizations, either directly through donations and taxes or indirectly through votes. The public's non-negotiable demands may be the strongest incentive for an effectiveness revolution and a shift in paradigm. Perhaps sufficient leverage to realize an effectiveness revolution lies in the environmental sector's confession that it is not yet there: that the organizations to which the public pays and donates billions of dollars cannot yet demonstrate their effectiveness at providing future generations with a healthy environment.

6. Conclusion

The frameworks and methodologies necessary for a transition to evidence-based environmental management are already in place, and much has been learned from other sectors. However, the transition is proceeding slowly and disjointedly, one organization and one sector at a time. A shared vision is necessary, but reaching consensus on the best path to effectiveness is only possible if the community views all its members as legitimate partners and operates on the principle that no one has the complete answer but everyone might. Considering the complexity, scale and scope of environmental problems, a truly concerted revolution, starting now, is in the best interest of the public and of the environmental community that it created.

Acknowledgements

This manuscript was greatly improved by the conversation, comments and work provided by EPA's Evaluation Support Division, Dr. Andrew Knight and other participants in the Environmental Evaluators Network, three anonymous reviewers, and former interns Brett Wertz and Lena Moffitt. We are grateful to Chris Metzner for his art.

References

Ascher, W., 1986. The evolution of the policy sciences: understanding the rise and avoiding the fall. Journal of Policy Analysis and Management 5 (2), 9.
Ausubel, J.H., Victor, D.G., Wernick, I.K., 1995. The environment since 1970. Consequences 1 (3).
Barabasi, A.L., 2009. Scale-free networks: a decade and beyond. Science 325 (5939), 412.

Bruyninckx, H., 2009. Environmental evaluation practices and the issue of scale. New Directions for Evaluation 122 (Environmental program and policy evaluation: addressing methodological challenges), 31–39.
Chalmers, I., 1993. The Cochrane Collaboration: preparing, maintaining, and disseminating systematic reviews of the effects of health care. Annals of the New York Academy of Sciences 703, 156.
Chertow, M.R., Esty, D.C., 1997. Thinking Ecologically: The Next Generation of Environmental Policy. Yale University Press.
Chong, C.Y., Kumar, S.P., 2003. Sensor networks: evolution, opportunities, and challenges. Proceedings of the IEEE 91 (8), 1247–1256.
Christensen, J., 2002. Fiscal accountability concerns come to conservation. New York Times, 5.
Clark, T.W., 2002. The policy process: a practical guide for natural resource professionals. Policy Sciences 35 (3), 325–330.
Clarke, L., Clarke, M., et al., 2007. How useful are Cochrane reviews in identifying research needs? Journal of Health Services Research & Policy 12 (2), 101.
CMP, 2007. Open Standards for the Practice of Conservation, Version 2.0.
Cochrane, A.L., 1972. Effectiveness and Efficiency: Random Reflections on Health Services. BMJ and The Nuffield Provincial Hospitals Trust, London.
Durant, R.F., Fiorino, D.J., O'Leary, R., 2004. Environmental Governance Reconsidered: Challenges, Choices, and Opportunities. MIT Press.
EC (European Commission), 2005. Impact Assessment Guidelines. SEC 791.
EC (European Commission), 2007. Communication to the Commission from Ms Grybauskaité in Agreement with the President: Responding to Strategic Needs: Reinforcing the Use of Evaluation. SEC 213. http://ec.europa.eu/dgs/information_society/evaluation/data/pdf/sec_2007_0213_en.pdf
EPA (U.S. Environmental Protection Agency), 2009. Evaluation of EPA's Temporally Integrated Monitoring of Ecosystems (TIME) and Long-term Monitoring (LTM) Programs. http://www.epa.gov/evaluate/TIME-LTM%20Final%20Report.pdf
Ferraro, P.J., 2009. Counterfactual thinking and impact evaluation in environmental policy. New Directions for Evaluation 122, 75–84.
Ferraro, P.J., Pattanayak, S.K., 2006. Money for nothing? A call for empirical evaluation of biodiversity conservation investments. PLoS Biology 4 (4), 0482–0488.
Fiorino, D.J., 2001. Environmental policy as learning: a new view of an old landscape. Public Administration Review 61 (3), 322–334.
Gilmour, J.B., 2006. Implementing OMB's Program Assessment Rating Tool (PART). IBM Center for The Business of Government.
GPRA (Government Performance and Results Act of 1993), 1993. P.L. 103-62. U.S. Congress.
GPRA (Government Performance and Results Modernization Act of 2010), 2010. H.R. 2142. U.S. Congress.
Griffiths, R.A., 2004. Mismatches between conservation science and practice. Trends in Ecology & Evolution 19 (11), 564–565.
Gullison, R., Hardner, J., 2009. Using limiting factors analysis to overcome the problem of long time horizons. New Directions for Evaluation 122, 19–29.
Hart, J.K., Martinez, K., 2006. Environmental sensor networks: a revolution in the earth system science? Earth Science Reviews 78 (3-4), 177–191.
Heimlich, J.E., 2010. Environmental education evaluation: reinterpreting education as a strategy for meeting mission. Evaluation and Program Planning 32 (4).
Hildén, M., 2009. Time horizons in evaluating environmental policies. New Directions for Evaluation 122, 9–18.
Hobbs, R., 2009. Looking for the silver lining: making the most of failure. Restoration Ecology 17 (1), 1–3.
Hobbs, R.J., 2003. Ecological management and restoration: assessment, setting goals, and measuring success. Ecological Management and Restoration 4, S2–S3.
Hockings, M., Stolton, S., Dudley, N., 2000. Evaluating Effectiveness: A Framework for Assessing the Management of Protected Areas. IUCN-WCPA/University of Cardiff, Cardiff.
Hockings, M., Stolton, S., Leverington, F., Dudley, N., Courrau, J., 2006. Evaluating Effectiveness: A Framework for Assessing Management Effectiveness of Protected Areas. World Conservation Union.
Holling, C., 1978. Adaptive Environmental Assessment and Management. Blackburn Press.
Kennedy, E.T., Balasubramanian, H., Crosse, W.E.M., 2009. Issues of scale and monitoring status and trends in biodiversity. New Directions for Evaluation 122, 41–51.
Kleiman, D.G., Reading, R.P., Miller, B.J., Clark, T.W., Scott, J.M., Robinson, J., Wallace, R.L., Cabin, R.J., Felleman, F., 2000. Improving the evaluation of conservation programs. Conservation Biology 14 (2), 356–365.
Knapp, G.J., Kim, T.J., 1998a. Environmental program evaluation: framing the subject. In: Knapp, G.J., Kim, T.J. (Eds.), Environmental Program Evaluation: A Primer. University of Illinois Press, Chicago, pp. 1–20.
Koontz, T.M., Thomas, C.W., 2006. What do we know and need to know about the environmental outcomes of collaborative management? Public Administration Review 66 (s1), 111–121.
Lasswell, H.D., 1970. The emerging conception of the policy sciences. Policy Sciences 1 (1), 3–14.
Leverington, F., Hockings, M., Costa, K.T., 2008. Management Effectiveness Evaluation in Protected Areas: A Global Study. University of Queensland, IUCN-WCPA, TNC, WWF, Gatton, Australia.
Love, A.J., Russon, C., 2000. Building a worldwide evaluation community: past, present, and future. Evaluation and Program Planning 23 (4), 449–459.
Lubchenco, J., 1998. Entering the century of the environment: a new social contract for science. Science 279 (5350), 491–497.
Mace, G.M., Balmford, A., Boitani, L., Cowlishaw, G., Dobson, A.P., 2000. It's time to work together and stop duplicating conservation efforts. Nature 405 (6785), 393.
McMichael, A.J., Friel, S., Nyong, T., Corvalan, C., 2008. Global environmental change and health: impacts, inequalities, and the health sector. British Medical Journal 336 (7637), 191.
Meadows, D., 1999. Leverage Points: Places to Intervene in a System. The Sustainability Institute, Hartland, Vermont, USA.
NCSE (National Council for Science and the Environment), 2010. State of the Environment Reports. Retrieved 5/04/2010 from http://ncseonline.org/cms.cfm?id=518
Nielsen, S.B., Ejler, N., 2008. Improving performance? Exploring the complementarities between evaluation and performance management. Evaluation 14 (2), 171.
OMB (U.S. Office of Management and Budget), 2009. Increased Emphasis on Program Evaluations. M-10-01. http://www.whitehouse.gov/omb/assets/memoranda_2010/m10-01.pdf
Patton, M.Q., 1997. Utilization-Focused Evaluation: The New Century Text. Sage Publications, London.
Patton, M.Q., 2010. Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use. The Guilford Press.
Pomeroy, R.S., Watson, L.M., Parks, J.E., Cid, G.A., 2005. How is your MPA doing? A methodology for evaluating the management effectiveness of marine protected areas. Ocean and Coastal Management 48 (7-8), 485–502.
Pullin, A.S., Knight, T.M., 2009. Doing more good than harm: building an evidence-base for conservation and environmental management. Biological Conservation 142 (5), 931–934.
Radin, B.A., 2000. The Government Performance and Results Act and the tradition of federal management reform: square pegs in round holes? Journal of Public Administration Research and Theory 10 (1), 111–135.
Redford, K.H., Coppolillo, P., Sanderson, E.W., Fonseca, G.A.B., 2003. Mapping the conservation landscape. Conservation Biology 17 (1), 116.
Rohrschneider, R., Dalton, R.J., 2002. A global network? Transnational cooperation among environmental groups. The Journal of Politics 64, 510–533.
Salafsky, N., Margoluis, R., Redford, K.H., Robinson, J., 2002. Improving the practice of conservation: a conceptual framework and research agenda for conservation science. Conservation Biology 16 (6), 1469–1479.
Starr, M., Chalmers, I., Clarke, M., Oxman, A.D., 2009. The origins, evolution, and future of the Cochrane Database of Systematic Reviews. International Journal of Technology Assessment in Health Care 25 (S1), 182–195.
STBC (Secretariat Treasury Board of Canada), 2009. Policy on Evaluation. http://www.tbs-sct.gc.ca/pol/doc-eng.aspx?id=12309&section=text
Stem, C., Margoluis, R., Salafsky, N., Brown, M., 2005. Monitoring and evaluation in conservation: a review of trends and approaches. Conservation Biology 19 (2), 295–309.
Stevens, A., Milne, R., 1997. The effectiveness revolution and public health. In: Progress in Public Health. Royal Society of Medicine Press, London, pp. 197–225.
Straughan, B., Pollak, T., 2008. The Broader Movement: Nonprofit Environmental and Conservation Organizations, 1989–2005. The Urban Institute. http://www.urban.org/publications/411797.html
Surowiecki, J., 2004. The Wisdom of Crowds: Why the Many Are Smarter than the Few and How Collective Wisdom Shapes Business, Economies, Societies, and Nations. Doubleday Books.
Sutherland, W.J., Armstrong-Brown, S., Armsworth, P.R., Brereton, T., Brickland, J., 2006. The identification of 100 ecological questions of high policy relevance in the UK. Journal of Applied Ecology 43 (4), 617–627.
Sutherland, W.J., Pullin, A.S., et al., 2004. The need for evidence-based conservation. Trends in Ecology & Evolution 19 (6), 305–308.
Trzyna, T., 2008. About Environmental Organizations and Programs. California Institute of Public Affairs. http://www.interenvironment.org/wd1intro/aboutorgs.htm
UN (United Nations), 2009. The Millennium Development Goals Report. http://www.un.org/millenniumgoals/pdf/MDG_Report_2009_ENG.pdf
Van Der Heijden, H.-A., 1999. Environmental movements, ecological modernisation and political opportunity structures. Environmental Politics 8 (1), 199–221.
Zients, J.D., 2010. The Accountable Government Initiative: An Update on Our Performance Management Agenda. U.S. Office of Management and Budget. http://www.whitehouse.gov/sites/default/files/omb/memoranda/2010/AccountableGovernmentInitiative_09142010.pdf