Coming to grips with scientific ignorance in the governance of endocrine disrupting chemicals and nanoparticles


environmental science & policy 38 (2014) 154–163


Nina Honkela a,*, Arho Toikka a, Janne Hukkinen a, Timo Honkela b

a Department of Social Research, University of Helsinki, P.O. Box 54, 00014 Helsinki, Finland
b Department of Information and Computer Science, Aalto University School of Science, P.O. Box 15400, 00076 AALTO, Espoo, Finland

article info

Article history:
Received 15 May 2013
Received in revised form 5 November 2013
Accepted 8 November 2013
Available online 19 December 2013

Keywords:
New technologies
Ignorance
Double bind
Phronesis

abstract

New technologies are characterized by various forms of incertitude that challenge both scientific expertise and regulatory action. In this paper, we argue that these incertitudes place experts in irreducible double bind situations, which may end in paralysis. Double binds emerge when primary injunctions are contradicted by secondary injunctions at a different logical level, which affects the interpretation of the primary injunction. Adequately addressing the challenges posed by new technologies requires phronesis, or pragmatic, context-dependent and action-oriented knowledge grounded in value deliberation. Using endocrine disruptors and carbon nanotubes as empirical examples, we argue that in relation to new technologies involving various kinds of incertitude, being phronimos—the person who can do phronesis—involves synthetically and simultaneously enacting parts of the three interrelated domains of knowledge, ethics and institutions, also across different logical levels. The special kind of experience-based phronesic skill that is required in the regulatory appraisal of new technologies is thus fundamentally related to the human capacity of pattern recognition. Finally, we argue that being aware of and making full use of practical wisdom thus conceptualized enables a new operationalization of the precautionary principle.

© 2013 Elsevier Ltd. All rights reserved.

1. Introduction

New technologies are characterized by multiple kinds of ignorance that challenge scientific expertise (Baker and Simon, 2002; Hansen et al., 2008; Jasanoff, 1999; Renn, 2008; Wynne, 1996, 2001). Confronting this ignorance involves the blending of objective scientific facts with value-laden policy advice (Flyvbjerg, 2001; Jasanoff, 1999; McCarthy and Kelty, 2010; Wynne, 1996, 2001). Scientific experts responsible for the governance of new technologies are ill-equipped for the task,

because the current risk paradigm frames out ignorance (Stirling and Gee, 2002; Wynne, 2001). Experts end up in double bind situations (Bateson G., 1972; Bateson M.C., 2005), where a primary operational injunction (risk assessment manages ignorance) is contradicted by a secondary constitutive injunction at a different logical level (ignorance prevents risk assessment), thus changing the interpretation of the primary injunction. Such unresolvable tensions are highly distressing and may paralyze the policy process. We argue that addressing the challenges posed by new technologies requires practical wisdom or phronesis, that is,

* Corresponding author. Tel.: +358 400936693. E-mail address: [email protected] (N. Honkela).
1462-9011/$ – see front matter © 2013 Elsevier Ltd. All rights reserved. http://dx.doi.org/10.1016/j.envsci.2013.11.006


pragmatic, context-dependent and action-oriented knowledge grounded in value deliberation (Flyvbjerg, 2001; Hargreaves, 2012). One concrete manifestation of phronesis is the ability to recognize complex patterns of a non-symbolic nature through non-algorithmic decision-making. We also argue that doing so enables a new operationalization of the precautionary principle (Stirling and Gee, 2002) that is especially well suited for the regulatory appraisal of new technologies.

Phronesis takes as its starting point the view that knowledge and values are inextricably intertwined in expert action (Flyvbjerg, 2001). Phronesis is the basic intellectual capacity that enables human beings to make wise, context-dependent choices. As such, it allows for explicitly 'framing in' fundamental scientific ignorance at the outset of the policy process. Part of doing phronesis is the ''prudent blurring and disaggregation'' of the three substantive domains within which scientific experts, regulators and professional actors operate: knowledge, ethics and institutions (Michael et al., 2007: 392).

Our central aim in this paper is to clarify how scientific experts do this blurring and disaggregation. We ask: What is the phronesic skill that the phronimos or practically wise person should manifest? We argue that a crucial aspect of such skill is integrative pattern recognition across the three domains of knowledge, ethics and institutions at two different logical levels. We analyze expertise on endocrine disrupting chemicals (EDCs) and carbon nanotubes (CNTs) to uncover the form and substance of the practical wisdom used by and required of experts on these substances. We argue that in both cases scientists and regulators are struggling to derive regulations from universal analytical arguments. The way forward is to use tacit expertise to recognize settings where the lack of rule-based understanding is holding back the fulfillment of fundamental goals.
We detail the role and actions of individual scientific experts in the emergence of EDCs and CNTs as foci of risk governance since the 1990s. We identify the mechanisms by which the individual experts have recognized patterns as interlinkages across epistemic, ethical and institutional domains, while interacting with other experts. We begin by articulating our argument for understanding precaution as phronesis, which in turn is understood as pattern recognition (Section 2). We then continue with our case analyses of EDCs and CNTs based on this conceptualization (Sections 3–5). We discuss our results in Section 6.

2. Theoretical and empirical background

2.1. Risk appraisal, ignorance and the precautionary principle

The limitations of the existing science-based risk paradigm have been the focus of intense critique (e.g., Jasanoff, 1999; Levidow and Carr, 2006; O'Malley, 2004; Wynne, 2001). A common theme in this discussion is the place and significance of ignorance, which is characterized by 'unknown unknowns', or things and issues that we do not yet know to be of relevance for the specific phenomenon under study. Many authors in this discussion point to the helplessness and lack of orientation that researchers and policy actors experience

Table 1 – Stirling and Gee (2002) on formal definitions for risk, uncertainty, ambiguity, and ignorance (all four are forms of incertitude).

                                 Knowledge about outcomes
Knowledge about likelihoods      Outcomes well defined     Outcomes poorly defined
Some basis for probabilities     Risk                      Ambiguity
No basis for probabilities       Uncertainty               Ignorance

when they are not even in principle able to approach an issue of concern with the tools of the risk assessment and management regime (e.g., Godduhn and Duffy, 2003; Vogel, 2004; Wynne, 2001). Stirling and Gee (2002) have approached the terminologically confusing risk appraisal debate via the broad notion of ‘incertitude’ (see Table 1). Incertitude incorporates both colloquial usages of terms such as ‘risk’ and ‘uncertainty’ and technical definitions as functions of outcomes and their likelihood in various scientific and regulatory contexts. The well-established formal definition of risk is a condition under which it is possible to define a comprehensive set of possible outcomes and to resolve a discrete set of probabilities for each outcome. Different fields of risk science use different definitions for the parts of the equation, but what is shared and central to all conceptualizations is the quantifiable nature of the potential harm as well as the mechanism through which it is realized. Uncertainty refers to the condition under which there is confidence in the completeness of the defined set of outcomes, but no valid basis to confidently assign probabilities to these outcomes. However, the multidimensionality, complexity and scope of risks as well as different ways of framing and prioritizing them can render the characterization of outcomes ambiguous despite their well-defined probabilities. Finally, when these difficulties with ambiguity are combined with the problems of uncertainty and compounded with the prospects of unknown unknowns, we face ignorance (Stirling and Gee, 2002). 
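Read as a decision table, Stirling and Gee's scheme reduces to two yes/no questions: is there a basis for probabilities, and are the outcomes well defined? The following sketch is purely our illustration of that reading (the function name and boolean framing are ours, not Stirling and Gee's):

```python
def classify_incertitude(probabilities_defined: bool, outcomes_defined: bool) -> str:
    """Place a situation in Stirling and Gee's (2002) fourfold scheme (Table 1)."""
    if outcomes_defined:
        # Outcomes well defined: the split is on the basis for probabilities.
        return "risk" if probabilities_defined else "uncertainty"
    # Outcomes poorly defined: well-defined probabilities yield ambiguity;
    # without them we face ignorance.
    return "ambiguity" if probabilities_defined else "ignorance"
```

The point of the sketch is that ignorance is not merely a larger amount of risk but a different cell of the table, reached when both questions are answered in the negative.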
As many risk scholars point out, despite the plethora of operational, tactical and strategic alternatives to the probabilistic methods associated with conventional risk appraisals, the latter continue to hold prominence in most regulatory settings (Levidow and Carr, 2006; Renn, 2008; Stirling and Gee, 2002; Wynne, 1996, 2001), thus effectively hiding the often daunting incertitudes involved in even the apparently most straightforward of cases (e.g., Stirling and Gee, 2002: 523). To respond to the incertitudes, it is necessary to have a way to apply the precautionary principle (Hansen et al., 2006; Harremoes et al., 2001; O’Riordan et al., 2001), especially when understood as humility in the face of the many sources of incertitude. Often, the operationalizations of the precautionary principle have focused on broadening participation in regulatory appraisal (Callon et al., 2001; Stirling and Gee, 2002) and dealt less with the expert skill still necessary for wise decisions. We suggest a practical way of addressing these imperatives of precaution (Stirling and Gee, 2002) that recognizes the existence of incommensurable viewpoints and the fundamental human capacity to judge wisely using pattern


recognition. We suggest that these basic human capacities allow us to operationalize the precautionary principle in a way that enables broadening out regulatory appraisal (Flyvbjerg, 2001).

2.2. An alternative view: phronesis as pattern recognition

The practical incorporation of humility, completeness, participation and deliberation requires phronesis or practical wisdom. Experience leads to a type of knowledge that is neither exclusively concerned with universals invariable in time and space (episteme) nor the concrete activity that leads to the instrumental application of knowledge (techne). Instead, there is a type of practical wisdom, called phronesis, concerned with the ethics, the values, and the action of the situation (Flyvbjerg, 2001).

Phronesic things are highly controversial objects that incorporate and blur the domains of knowledge, ethics and institutions (Michael et al., 2007). For knowledge, such objects involve a tension between being plain 'technical' objects for ordinary research manipulation and fuzzier 'epistemic' objects for intellectual exploration (Rheinberger, 1997). As for ethics, such objects are simultaneously clear-cut regulatory objects and objects of strong personal concern and commitment. Finally, with regard to institutions, such objects are both a site of translation between the languages of scientists and other professional groups (such as clinicians or regulators) and of cross-professional collaborations (Michael et al., 2007). EDCs and CNTs, we suggest, are such phronesic things.

EDCs are chemicals that interfere with the hormonal systems of animals, including humans. They challenge scientific and regulatory frameworks with non-traditional response curves and the cumulative effects of multiple low exposures. Using scientific methodology built for strong levels of proof, issues such as the widespread nature of potential harm are ignored (Harremoes et al., 2001). Regulators find it difficult to define these materials as legitimate objects of regulation (see e.g., Tørsløv et al., 2011a,b).

The empirical data for the EDC analysis were collected between 2010 and 2012. Three types of data were collected:

(1) Thematic interviews with central EDC experts from Finland (n = 15), Denmark (n = 12), and the US (n = 6).
(2) Recordings of three Nordic expert workshops (EDC scientists, policy makers, and other stakeholders) on EDCs held in Copenhagen, Denmark in 2010; of three deliberative Finnish expert workshops on EDCs held in Helsinki, Finland in 2011–2012; and of a Nordic expert workshop on EDCs and nanomaterials held in Helsinki, Finland in 2012.
(3) Secondary literature on the history and regulatory challenges of EDCs in the form of books and scientific articles (n = 78) as well as administrative publications, popular media and publications by NGOs such as TEDx and Chemsec (n = 54).

The thematic interviews were recorded and transcribed.

CNTs are among the most promising engineered nanomaterials, with unique electrical properties, extraordinary strength, and efficient heat conduction. Recently, CNTs have become a source of controversy, stemming from their similarity to asbestos fibers in size and shape and the associated fears of widespread health problems. There is a strong sense—a gut feeling (Maynard, 2011c)—that

unusual or unanticipated risk is present; however, the identification and definition of the problem remains elusive (Maynard et al., 2011).

The CNT case is based on document analysis. The case study database includes scientific articles, administrative publications, popular media, and blog posts. The scientific literature was identified by keyword searches for 'carbon nanotube' together with terms such as risk, toxicology, and governance, and the results were narrowed down to 25 key articles that discuss problems and solutions rather than simply reporting experiments. The administrative publications are EU documents related to REACH. Popular media and nano risk blogs were included if they explicitly discussed the issues at hand. In total, 53 documents form the basis of the CNT case; many of them are cited in the analysis. Although thoroughly anchored in empirical data (for a summary, see Table 2), the main function of the two case studies is illustrative.

From the perspective of phronesic things, what becomes important is not so much the three different domains taken separately, as their integration into the unitary or singular capacity of being phronimos—the person who can do phronesis. A phronesic actor is one who can commit oneself to a particular epistemic perspective, enact oneself as an ethical actor and engage in a particular politics (the management of collaboration) that enables translational research, all at the same time (Michael et al., 2007: 391–392). Thus, there is a special kind of experience-based phronesic skill associated with a specific kind of integrative phronesic actor synthetically and simultaneously enacting parts of the three interrelated domains of knowledge, ethics and institutions. Flyvbjerg (2001) and Michael et al.
(2007) are not very explicit in their analysis of the concrete form of phronesic skill, and phronesis has been criticized for being too slippery (Collins and Evans, 2002: 291), even though integrative capacity has been extensively elaborated as expertise (Dreyfus et al., 1986; Dreyfus and Dreyfus, 2005).

Our analysis shows that the phenomenon of fundamental constitutive incertitude places actors—who are deeply embedded in the current paradigm of risk assessment—in insoluble yet manageable double binds (Bateson, 1972; Tognetti, 1999). In a double bind, a primary injunction is contradicted by a secondary injunction at a different logical level, which affects the interpretation of the primary injunction. According to Bateson, there is no possibility of resolution or withdrawal from the problem (Bateson, 1972). Phronesis can be understood as a capacity to discern and manage contradictory demands at different levels (Janasik et al., 2010). In risk governance, a double bind can be identified between the constitutive level injunction and the operational level injunction within each domain (Table 3).

Table 2 – Summary of empirical data for the EDC and CNT case studies.

                           1 – EDC    2 – CNT
A – Scientific articles       78         25
B – Other documents           54         53
C – Interviews                33         N/A
D – Workshops                  7         N/A

Table 3 – Levels and domains of phronesic skill in environmental risk governance.

                   Constitutive level                        Operational level
Risk knowledge     Scientists shall obtain knowledge of      Determine risk:
                   probabilities and consequences of         risk = probability × consequences
                   different courses of action
Risk regulation    Public officials shall safeguard the      Decide on regulatory action:
                   public interest on the basis of           if risk > permissible level, then regulate
                   scientific knowledge
Risk ethics        The public shall determine criteria for   Act on your best knowledge not to harm
                   evaluating public benefit and harm

In the domain of risk knowledge, the operational level logic requires that the quantitative risk of EDCs and CNTs be determined. This requirement contradicts the reality of the constitutive level of logic: in the absence of conclusive knowledge about environmental consequences or probabilities of EDCs and CNTs, their quantitative risk cannot be determined. In the institutional (i.e., regulatory) domain, the operational level logic of regulation stipulates that if the quantitative risk of a substance exceeds the regulatory limit, then regulatory rules will apply. This contradicts the constitutive level logic, which states that if the quantitative risk is unknown, then regulation is impossible. Finally, in the domain of risk ethics, the operational level ethical principle states: 'Act on your best knowledge not to harm.' Yet the principle cannot be followed as such, because the constitutive level ethical principle recognizes that if the consequences of an action are unknown, then they can do good or harm.

The capacity to manage these double binds is manifested in pattern recognition. An increasing body of evidence shows that expert-level decision making works through pattern recognition (see e.g., Baron and Ensley, 2006; Ellis, 2011). Pattern recognition is a fundamental cognitive skill that involves the assessment of a large number of aspects simultaneously. An expert holistically perceives what is relevant in a situation and what is not. This pattern recognition skill is based on experience. The implicit knowledge behind expert-level pattern recognition skills enables making refined distinctions and recognizing patterns that are difficult or impossible to describe in linguistic terms. Human pattern recognition skills are, at their best, fine-tuned to contextual variation. In many real world problem solving situations, the complexity of context is so high that it remains infeasible to explicate the underlying knowledge.
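The collision between the operational rules of Table 3 and the constitutive reality of ignorance can be made concrete in a few lines of Python. This is purely our illustrative sketch: the threshold value and all names are hypothetical and imply no actual regulatory limit.

```python
from typing import Optional

# Hypothetical threshold for illustration only; no real regulatory limit is implied.
PERMISSIBLE_LEVEL = 0.01

def quantitative_risk(probability: Optional[float],
                      consequence: Optional[float]) -> Optional[float]:
    """Operational rule of Table 3: risk = probability x consequences.
    Under ignorance, one or both inputs are unknown (None) and the
    product cannot be formed at all."""
    if probability is None or consequence is None:
        return None
    return probability * consequence

def regulatory_decision(risk: Optional[float]) -> str:
    """Operational rule: if risk > permissible level, then regulate.
    With risk undefined, the rule can fire neither way: the double bind."""
    if risk is None:
        return "undecidable"
    return "regulate" if risk > PERMISSIBLE_LEVEL else "no action"
```

For a substance with known inputs the rule behaves as intended: `regulatory_decision(quantitative_risk(0.2, 0.5))` yields "regulate". For an EDC- or CNT-like case with unknown probabilities it yields "undecidable": the operational injunction still stands, yet cannot be obeyed.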
In the regulation of new technologies, phronesic pattern recognition includes, among others, various kinds of translation work (e.g., Star and Griesemer, 1989), boundary spanning through holding multiple roles (e.g., van Egmond and Bal, 2011 on boundary ‘configurations’), reframing the issue under consideration (e.g., Goffman, 1974), and strategies of network expansion and contraction (e.g., McCarthy and Kelty, 2010). However, while such strategic processes form a central part of phronesis, they are not identical to it. Phronesis, as we see it, is a distinct, higher-level concept that encompasses these and other processes, with the explicit aim of reaching a nuanced view of the current double bind, as well as a doable way of proceeding on the basis of this view. Take, for instance, McCarthy and Kelty’s (2010: 426) crystallization of the response by scientists and policymakers to the demand of responsibility: ‘‘the practical, technical and affective orientation that the actors have adopted in order to become more responsible’’. Phronesis, we argue, is precisely such a

responsive orientation, based on the fundamental capacity of pattern recognition across the domains of knowledge, ethics and institutions, that enables the successful use of strategies such as translation, boundary spanning and various kinds of networking.

We focus here on the process by which the capacity for integrative pattern recognition emerges in science-policy interaction and how it differs from rule-based reasoning. Previous studies do not address and articulate the actual process by which experts recognize and integrate patterns. In what follows, we suggest that experts do this by recognizing double binds between the two levels in the three domains. EDC and CNT experts deal with the double bind (1) by expanding the field of applicability of the constitutive level injunction and (2) by recognizing novel patterns of action that benefit from the expanded constitutive rules and cut across the domains. The rich language of phronesis makes sense of the actions of the individuals and organizations under study in a way that remains fully out of reach for the poor language of the current risk paradigm.

3. Three double binds in EDC science and regulation

The story of EDCs began in the late 1980s with the groundbreaking work of Theo Colborn. Before Colborn, there was scattered evidence of disturbances in reproductive function due to the use of chemicals among animals as well as humans. Colborn's major achievement was the recognition that the separate cases had something in common: the problems they manifested appeared all to be caused by a diverse group of industrial and agricultural chemicals with the capacity to mimic and/or obstruct the hormone function of biological organisms. This key idea was to become known as 'the environmental endocrine hypothesis' (Colborn et al., 1996; Krimsky, 2000).

For decades, hazardous chemicals had been equated with cancer-causing chemicals. The hypothesis Colborn was developing amounted to a new theory of environmental disease, a new paradigm based on the guiding concept that some chemicals can interfere with the body's natural hormones (Krimsky, 2000: 2). At this time, however, traditional toxicologists were generally ill-informed about the dose-response effects of hormones. Endocrine systems are based on self-regulating feedback systems, and thus standard monotonic dose-response curves (the higher the dose, the greater the response) do not apply (Krimsky, 2000: 23). Due to these characteristics, establishing relationships between EDCs and health effects is very difficult. The hardships have not deterred dedicated researchers from


Fig. 1 – The phronesic pattern and pattern recognition process of the EDC case.

trying, however, as evidenced by the most recent documentation of the current scientific work in the State of the Art Assessment on Endocrine Disruptors Part 1 Summary of the State of the Science ordered by the European Commission in 2011. Despite these scientific endeavors, the majority of the conclusions formulated in the report both echo and articulate the current general understanding: so much remains unknown that no decisive regulatory action can be taken.

This, then, is the knowledge double bind: The self-regulating feedbacks of endocrine disruption cannot be expressed in terms of probabilities for adverse consequences along the dose-response curve. This constitutes a persistent ignorance in risk knowledge, which also makes it impossible to operationalize risk (number 1 in Fig. 1). And without risk, it is not possible for science to establish the kind of causation needed for regulatory purposes (Vogel, 2004).

Colborn's personal concern led her to convene, in 1991, together with John Peterson Myers, what was to become the first Wingspread session. This represents a move from the constitutive-level ignorance (number 1 in Fig. 1) to constitutive-level ethical deliberation (number 2 in Fig. 1). Participating in this historical meeting were, among others, environmental estrogen researchers, reproductive physiologists, and wildlife toxicologists. The event gave rise to a four-page consensus document. In addition to providing the first detailed formulation of the environmental endocrine hypothesis, it also represents a significant step forward in collaboration (Krimsky, 2000). In terms of Fig. 1, it represents a move from constitutive-level ethical deliberation (number 2) over to constitutive-level attempts to come to terms with ambiguity (number 3).

The consensus statement provided an effective tool for raising the issues before a non-scientific audience (Krimsky, 2000: 29). It was cited in journalistic accounts and taken up by environmental activists. Within three months after the meeting, the statement was reported to a Senate committee investigating reproductive hazards (Krimsky, 2000: 29).

From the point of view of institutions, 1991 also became the year in which the hitherto epistemic and collaborational endeavor turned openly political. A Senate hearing entitled Government Regulation of Reproductive Hazards featured testimony by Colborn, who reported forcefully on the conclusions and recommendations of the Wingspread consensus statement. The response was very positive. In Fig. 1, this represents a move from constitutive-level attempts at coming to grips with ambiguity (number 3) to constitutive-level regulatory considerations (number 4).

Already at the hearing in 1991, Colborn had put forth a series of policy recommendations (Colborn et al., 1996). The idea was that EDCs were to pass through the same kind of risk assessment process as any other potentially problematic issue. They were to be tested and screened, with the associated subprocesses.

Things did not turn out the way of ordinary risk assessment, however. The Environmental Protection Agency, which was to take responsibility for implementing the screening process, accomplished only preparatory work on the Endocrine Disruptor Screening Program through 1998. After legal processes initiated by stakeholders, new deadlines for test development and testing were established in March 2000, with EPA accepting responsibility for communicating on progress and on reasons for potential delay (Vogel, 2004). In Fig. 1, this represents a move from constitutive-level regulatory considerations (number 4) to operational-level attempts to implement regulation (number 5). As of 2013, the process of turning the environmental endocrine hypothesis into a technical test for endocrine disrupting capacity remains globally mainly at the early stage of validating test methods.

These developments can be summarized in terms of an institutional and an ethical double bind. In the institutional domain, the constitutive level rule states that public officials shall safeguard the public interest on the basis of scientific knowledge. Yet the persistent lack of scientific risk knowledge prevents regulators from translating this into an operational-level protective rule ''regulate if risk is over threshold.'' In Fig. 1, this represents a continuous oscillation between constitutive-level attempts at coming to terms with ambiguity (number 3) and operational-level attempts to implement regulation (number 5).

Simultaneously, the question of whether EDCs represent a real danger or not remains disputed, resulting in a double bind in the domain of ethics: if the exact consequences of EDCs are unknown, then the chemicals can do good or harm. Thus, in Fig. 1, this represents a return to constitutive-level ethical deliberation (number 2).

4. Three double binds in nanoscience and nanotechnology

CNTs are among the most promising nanomaterials—they are extraordinarily hard, stiff objects with very useful electric properties. Some products are already on the market and great growth is forecast (Thayer, 2007). Still, CNTs are among the


most debated nanomaterials due to their risk potential. This is because CNTs are similar in form to asbestos fibers, raising concerns that CNTs might result in cancers and other adverse impacts.

CNTs were first described in a paper in the Russian Journal of Physical Chemistry (Radushkevich and Lukyanovich, 1952); however, their discovery is now most often credited to Iijima (1991). This reattribution has puzzled scientists (Monthioux and Kuznetsov, 2006). There is more to explaining this puzzle than simple language barriers, namely, a tension between CNTs as epistemic things versus technical objects (see Section 2.2). At first, there was no machinery for CNT manufacturing or even observation, and there was no scientific readership to adopt them. While they were undoubtedly material things, they were also in the domain of the unthinkable: no discipline had the vocabulary for understanding and categorizing them. The Iijima paper served to bring CNTs to a wider scientific audience, whose imagination was now captured. The technical conditions were right: instruments were available, as were theories and research paradigms.

The differentiation between epistemic things and technical objects poses a double bind: at the constitutive level, science should define materials only by their properties, but at the operational level, such definition is possible only with the help of technical measurement devices, metrology practices and an understanding community. This double bind between epistemic versus technical objects generated a situation of ignorance in the domain of risk knowledge (number 1 in Fig. 2). The double bind arises from the similarities in shape with asbestos and the issue of toxicity. The operational rules involving dose-response assessment are based on defining probabilities and assigning values to outcomes. Nanomaterials challenge the traditional operational rules, such as using chemical composition as the only determinant in particulate exposure (Maynard et al., 2011: S109). When probabilities cannot be accurately determined, the care and responsibility implied by the higher-level logic of toxicology cannot be applied. The lack of established methods turned CNT toxicity into a debated issue, with teams reaching very different conclusions (Lam et al., 2006; Warheit et al., 2004).

An ethical double bind results from interactions between personal concerns over ignorance and rulebooks operating with risk (number 2 in Fig. 2). At the operational level, the development of novel technologies poses the question of whether actions are likely to result in harm toward others. When the pathways of incremental technological developments are at the


more complex levels of incertitude, simple moral guidelines are inadequate. McCarthy and Kelty (2010: 411) studied the actions of scientists at the Center for Biological and Environmental Nanotechnology, Rice University in Texas, and the controversies and public discussions they faced. The problem the scientists faced was a moral dilemma: uncertainty about the implications of nanotechnology diluted the methods of moral responsibility that are normally associated with science and technology development.

When the scientific and ethical dilemmas are left unresolved and pushed into institutional processes, new double binds come up. Regulatory agencies expect science-based institutions to manage simple risk, but the scientific evidence and regulatory rules do not fit each other. In the EU, this has manifested itself as an institutional double bind over whether or not nanomaterials are covered by REACH. REACH attempts to define the harmful potential as risk (number 3 in Fig. 2). The starting point is that obviously they are: since REACH covers all substances and nanomaterials are substances, REACH applies even without any nano-specific rules. Still, a methodology needs to be developed, validated, and standardized for exposure measurement as well as hazard identification (SCENIHR, 2009: 52). Thus, regulators now wonder what actually constitutes a nanothing.

The Competent Authorities for REACH and Classification and Labeling (CARACAL) set up a subgroup for nanomaterials, which in turn set up a series of REACH Implementation Projects on Nanomaterials, RIP-oNs. The first one, Substance Identity of Nanomaterials, produced an advisory report on the issues (RIP-oN 1, 2011). Unfortunately, the RIP-oN experts concluded that they needed policy decisions in order to give technical advice for policy suggestions, ending in a vicious circle where the constitutive rules of risk regulation and risk ethics (numbers 4–6 in Fig. 2) collide in the attempt to base the ethical commitment to safety in a purely technical setting. Still, the European Commission managed to adopt a definition (EC, 2011)—even if it might not actually be grounded in science (Maynard, 2011a) (number 7 in Fig. 2).

5. Phronesic pattern recognition in the persistent interim

5.1. EDCs

From the point of view of phronesic pattern recognition, the EDC story goes as follows. It all started with clear-cut ignorance in the sense of Stirling and Gee (2002) (see Section 2.1): what on Earth was happening to the wildlife of the Great Lakes (number 1 in Fig. 1)? The element of surprise characterizing instances of ignorance is clearly visible here. Inextricably intertwined with it was the deep ethical concern of Colborn and colleagues (number 2 in Fig. 1). On the basis of this relation between ethics and knowledge, Colborn eventually articulated the environmental endocrine hypothesis, which for decades to come cast the issue under the emblem of ambiguity (number 3 in Fig. 1). Colborn also quickly recognized that in order to raise her ethical concerns in a way that was both credible and influential with regulatory institutions, she needed to mobilize the scientific community through the Wingspread sessions. This is also the point at which the pattern recognition first crosses over into the regulatory domain (number 4 in Fig. 1). This novel action pattern also establishes a triangular relationship between constitutive-level rules for knowledge, regulation and ethics (formed by numbers 2-4 in Fig. 1). Based on these constitutive-level relations and action patterns, the divide between the constitutive and operational levels was finally crossed in the early 1990s in the form of congressional hearings and the establishment of the EDC screening program (number 5 in Fig. 1). This also represents a first attempt to operationalize the constitutive-level ethical concern about the possible risks of EDCs. During the years that followed, however, irresolvable ambiguity again forced the actors to confront the knowledge double bind (i.e., to revisit number 3 in Fig. 1). This is essentially the situation in the US today. However, there are some signs of activity at the constitutive level of risk ethics (number 6 in Fig. 1).

Fig. 2 - The phronesic pattern and pattern recognition process of the CNT case.
In 2007, the Chemical Heritage Foundation, based in Philadelphia, organized a conference on human biomonitoring and EDCs, gathering experts from academia, government, industry and NGOs working in fields as diverse as endocrinology, chemistry, sociology, history, and law to gather perspectives on current understandings of these issues (Roberts, 2008: 1). This, in our view, is a paradigmatic example of how experts broaden the constitutive-level rules of risk ethics by creating forums of focused deliberation with professional and stakeholder communities to ease the establishment of decision criteria (number 6 in Fig. 1; see also Callon et al., 2001). Recent developments in Europe also deserve mention. The REACH legislation contains a possibility to address the EDC concern (Art. 57(f)) and represents another attempt to operationalize the constitutive-level pattern (number 5 in Fig. 1). REACH claims to do this on the basis of precautionary considerations. However, there are indications that this attempt will face exactly the same double bind problems that currently beset the US situation (see e.g., Tørsløv et al., 2011a,b).

5.2. CNTs

With CNTs, scientists use different tactics to unravel the pattern of scientific, regulatory and ethical dilemmas. Some examples can be drawn from the work on CNTs by the US-based scientist Andrew Maynard. The first practical move (from number 1 to 2 in Fig. 2) is to explicitly leave the world of science and its operational rules. As a physicist involved in developing nanotechnology health and safety regulations in both the UK and the US, Maynard has recently entered the "alternative reality of science policy and communication" (Maynard, 2008) in social media with his blog 2020 Science and his YouTube channel RiskBites. Such role-play challenges the conventional operational rules for scientists, but it does not lead one to discard the role of a scientist: there are clearly separate roles for reporting scientific results and for making other, more ethically grounded arguments. The use of multiple roles rests on pattern recognition, because it crosses domains.

Maynard has been active in an exercise in which governments and scientists seek the elusive definition of nanomaterials. The regulatory debate over nanotechnology has led to a 'Tower of Babel' in which different framesets of discussion breed confusion concerning imagined nano-futures, which misleads regulators (Maynard et al., 2010) (number 3 in Fig. 2). For a phronesic actor engaged in pattern recognition, this is not a puzzle that must be solved before proceeding, but an issue to be acknowledged elsewhere in the system in order to proceed. The response is two-way communication between science and society, a participatory regime with enhanced transparency and trust. It is not clear what the inclusive "evidence-informed and socially responsive decision-making" (ibid. 583) entails, but what is clear is that there are "many hurdles—few of which are ever overcome to everyone's satisfaction" (ibid. 580). This approach both acknowledges the double bind in the situation and manages it by making it an ethical struggle instead of a scientific one (number 4 in Fig. 2). During the definition process, professional regulators are likely to fear that general but strict definitions of nanomaterials might accidentally cover homogenized milk as a nanomaterial (Maynard, 2011b) or miss important materials, as happened with asbestos legislation (Maynard, 2011a).
The phronesic pattern recognition strategy, as employed by Andrew Maynard, has been to argue against strict definitions and for replacing them with flexible trigger rules that start a regulatory process (Maynard, 2011a). Number 5 in Fig. 2 builds on this: a regulatory regime that acknowledges incertitude and manages it on the basis of an ethical commitment to respond to a public concern. The evidence-informed but socially responsive framework needs to be adaptive in a novel manner.

Some CNT scientists are taking new positions to enable pattern recognition (from number 6 to 7 in Fig. 2). When Andrew Maynard moved from jobs in academia and government to the Project on Emerging Nanotechnologies, he was told to choose between roles: either remain a scientist or become "a science policy wonk" (Maynard, 2008). But Maynard pushed on along his own middle way, managing both roles. Instead of succumbing to sectoral double binds, he managed them by switching between roles (Jamison, 2001) while engaging in explicit self-reflection (see, for example, Maynard, 2009a). Maynard moves with ease between being a scientist and a science communicator (Maynard, 2009b), or a scientist and an activist (Economist, 2007). Venturing outside traditional roles does not mean giving up on science. Maynard et al. (2011) defend the primacy of science in nanodefinitions for regulatory purposes. The science lies not in the simple application of old rules, but builds on a problem formulation based on the principles of emergent risk, plausibility of scenarios, and impact analysis as a qualitative reality check (Maynard et al., 2011: S119-S120). Although this new science remains elusive, the McCarthy and Kelty (2010) case study is nonetheless illustrative. To escape the double binds, two separate organizations were needed: the Center for Biological and Environmental Nanotechnology for the science, and the International Council on Nanotechnology for governance. The council aims to react to the double binds by working as a neutral forum for exploring risk and producing a knowledge base for the environmental health and safety of CNTs. Such a forum enables actors to discuss issues outside the operational settings of their organizations, hopefully making pattern recognition possible.

6. Discussion and conclusions

We have shown how experts facing the challenge of governing the risks of new materials have the capacity to discern and manage contradictory double binds across the epistemic, institutional and ethical domains. The case studies on EDCs and CNTs indicate that some experts have not been paralyzed by the double binds. What we see instead is an effort to deal with them by recognizing the patterns of ethical judgment raised by the fact that an unknown can do both good and harm. In the epistemic domain, this realization leads the expert to generate scientifically plausible alternatives for the future, with explicit consideration of the potential goods and bads of the alternative outcomes. In the institutional domain, the alternatives provide the basis for specifying the principles for dealing with the different outcomes.

We have addressed the irresolvable double binds resulting from the confrontation of the science-based risk paradigm with the challenges posed by new technologies. The double binds result from the tendency in current risk thinking to 'frame out' ignorance and uncertainty and the value considerations they necessitate (Wynne, 2001). Using EDCs and CNTs as empirical examples, we have argued that these challenges can still be managed by resorting to the human capacity for phronesis, or the skill to understand objects of knowledge also as objects of personal commitment. Phronesis explicitly 'frames in' fundamental scientific ignorance, as well as the other forms of incertitude identified by Stirling and Gee (2002), at the outset of the policy process. Such an articulation of what it is we actually do in our current governance also has the merit of giving credit to the extraordinary human capacity for finding creative yet imperfect solutions to ignorance.
Our approach specifies the imperatives of the precautionary principle (Stirling and Gee, 2002) by (1) broadening the regulatory appraisal process with procedures for integrating often incompatible viewpoints, (2) identifying the domains and levels of logic that constitute such viewpoints, and (3) discerning tensions and promising new action patterns across the domains and logical levels. Indeed, the two cases of double bind identification and management are in themselves examples of the precautionary principle in action. Colborn and Maynard may have been ignorant, but they were not helpless, and in this paper we have developed a vocabulary to help experts deal with their ignorance. The new vocabulary could also be used to formulate a novel industry policy for working with materials with persistent uncertainties and likely risks.

In this paper, risk has been defined as a condition under which it is possible to define a set of consequences and to resolve a set of probabilities for each consequence (i.e., risk = probability × consequences; see Section 2). However, the risk assessment sciences typically define risk as the product of hazard and exposure. With EDCs and CNTs, many of the existing knowledge gaps and uncertainties are at the level of hazards (in risk = hazard × exposure). Arguably, to be able to apply phronesis to EDCs or CNTs (and more broadly to nanomaterials), we have to accept that the hazard concept is not fully applicable, and that we therefore have to shift toward accepting concern as a substitute (i.e., for EDCs and CNTs, 'risk' = concern × exposure). Even in cases of ignorance it would then be possible to generate numerical estimates of concern and exposure, which would enable the calculation of "risk". The notion of concern-based risk assessment could function as a pragmatic operationalization of phronesic precautionary processes in the business world, where cost considerations tend to overrule issues of precaution when quantitative estimates of hazard or exposure are not available.

Finally, we would like to suggest some avenues for future research emerging from this reinterpretation of precaution as phronesis as pattern recognition. First, as Figs. 1 and 2 suggest, it is highly likely that phronesis as pattern recognition is an essentially distributed process involving the experience-based expertise of multiple actors. Better understanding the dynamics and mechanisms of this process is important especially for the foreseeable cases in which the patterns recognized by the actors do not form a coherent whole and are irresolvable by conventional procedures.
Such cases involve negotiations that resemble processes of substantial ethical problem solving more than anything else. Second, participating in such problem-solving processes with an awareness of the entanglement of domains and logical levels poses significant self-reflexive challenges for scientists and regulators alike. Learning more about how individual actors cope with and manage these challenges is valuable for all cases involving phronesic rather than epistemic or technical objects.
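As a purely illustrative sketch of how the concern-based formulation above ('risk' = concern × exposure) might be operationalized in a screening tool, consider the following. The concern categories, their numerical mapping, and the example inputs are hypothetical assumptions of ours, not part of any established assessment method:

```python
# Hypothetical sketch: concern-based "risk" screening for materials whose
# hazard profile is unknown ('risk' = concern x exposure, standing in for
# risk = hazard x exposure). All scales and example values are assumed.

# Map qualitative expert concern onto a 0-1 scale (assumed values).
CONCERN_SCORES = {"low": 0.1, "moderate": 0.5, "high": 0.9}

def concern_based_risk(concern_level: str, exposure: float) -> float:
    """Return a screening score: concern score times normalized exposure.

    concern_level: elicited expert judgment ("low", "moderate", "high").
    exposure: exposure estimate normalized to the interval [0, 1].
    """
    if concern_level not in CONCERN_SCORES:
        raise ValueError(f"unknown concern level: {concern_level!r}")
    if not 0.0 <= exposure <= 1.0:
        raise ValueError("exposure must be normalized to [0, 1]")
    return CONCERN_SCORES[concern_level] * exposure

# Two hypothetical materials: a widely used CNT coating (high concern,
# substantial exposure) and a lab-contained EDC candidate (moderate
# concern, low exposure).
for name, concern, exposure in [("CNT coating", "high", 0.6),
                                ("EDC candidate", "moderate", 0.2)]:
    print(f"{name}: {concern_based_risk(concern, exposure):.2f}")
```

Even such a crude score would only rank concerns for further deliberation; it cannot substitute for the value deliberation that phronesis requires.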

References

Baker, T., Simon, J., 2002. Embracing Risk: The Changing Culture of Insurance and Responsibility. University of Chicago Press, Chicago.
Baron, R.A., Ensley, M.D., 2006. Opportunity recognition as the detection of meaningful patterns: evidence from comparisons of novice and experienced entrepreneurs. Management Science 52 (9) 1331-1344.
Bateson, G., 1972. Steps to an Ecology of Mind. Ballantine Books, New York.
Bateson, M.C., 2005. The double bind: pathology and creativity. Cybernetics and Human Knowing 12, 11-21.
Callon, M., Lascoumes, P., Barthe, Y., 2001. Acting in an Uncertain World: An Essay on Technical Democracy. The MIT Press, Cambridge, MA.


Colborn, T., Dumanoski, D., Myers, J.P., 1996. Our Stolen Future: Are We Threatening Our Fertility, Intelligence, and Survival? A Scientific Detective Story. Penguin, New York.
Collins, H.M., Evans, R., 2002. The third wave of science studies: studies of expertise and experience. Social Studies of Science 32 (2) 235-296.
Dreyfus, H.L., Dreyfus, S.E., Athanasiou, T., 1986. Mind Over Machine: The Power of Human Intuition and Expertise in the Era of the Computer. The Free Press, New York.
Dreyfus, H.L., Dreyfus, S.E., 2005. Expertise in real world contexts. Organization Studies 26 (5) 779-792.
EC, 2011. What is a "nanomaterial"? European Commission breaks new ground with a common definition. Press release, European Commission, October 18.
Economist, 2007. A little risky business. The Economist, November 22.
Ellis, R., 2011. Jizz and the joy of pattern recognition: virtuosity, discipline and the agency of insight in UK naturalists' arts of seeing. Social Studies of Science 41 (6) 769-790.
Flyvbjerg, B., 2001. Making Social Science Matter: Why Social Inquiry Fails and How It Can Succeed Again. Cambridge University Press, Cambridge.
Godduhn, A., Duffy, L.K., 2003. Multi-generation health risks of persistent organic pollution in the far north: use of the precautionary approach in the Stockholm Convention. Environmental Science and Policy 6 (4) 341-353.
Goffman, E., 1974. Frame Analysis: An Essay on the Organization of Experience. Harper and Row, London.
Hansen, S.F., Carlsen, L., Tickner, J.A., 2006. Chemical regulation and precaution: does REACH really incorporate the precautionary principle? Environmental Science and Policy 10, 395-404.
Hansen, S.F., Maynard, A., Baun, A., Tickner, J.A., 2008. Late lessons from early warnings for nanotechnology. Nature Nanotechnology 3, 444-447.
Hargreaves, T., 2012. Temporality and prudence: on stem cells as "phronesic things". Geoforum 43, 315-324.
Harremoes, P., Gee, D., MacGarvin, M., Stirling, A., Keys, J., Wynne, B., et al., 2001. Late Lessons from Early Warnings: The Precautionary Principle 1896-2000. Environmental Issue Report No. 22/2001. Office for Official Publications of the European Communities, Copenhagen.
Iijima, S., 1991. Helical microtubules of graphitic carbon. Nature 354, 56-58.
Jamison, A., 2001. The Making of Green Knowledge: Environmental Politics and Cultural Transformation. Cambridge University Press, Cambridge.
Janasik, N., Salmi, O., Castán Broto, V., 2010. Levels of learning in environmental expertise: from generalism to personally indexed specialisation. Journal of Integrative Environmental Sciences 7 (4) 297-313.
Jasanoff, S., 1999. The songlines of risk. Environmental Values 8, 135-152.
Krimsky, S., 2000. Hormonal Chaos: The Scientific and Social Origins of the Environmental Endocrine Hypothesis. Johns Hopkins University Press, Baltimore/London.
Lam, C.-W., et al., 2006. A review of carbon nanotube toxicity and assessment of potential occupational and environmental health risks. Critical Reviews in Toxicology 36, 189-217.
Levidow, L., Carr, S., 2006. GM crops on trial: technological development as a real-world experiment. Futures 39, 408-431.
Maynard, A., 2008. About 2020science. 2020 Science, http://2020science.org/about/.
Maynard, A., 2009a. Confessions of a "media hog". 2020 Science, http://2020science.org/.
Maynard, A., 2009b. Asbestos-like nanomaterials - should we be concerned? 2020 Science, http://2020science.org/.
Maynard, A., 2011a. Don't define nanomaterials: comment. Nature 475, 31.
Maynard, A., 2011b. EC adopts cross-cutting definition of nanomaterials to be used for all regulatory purposes. 2020 Science, http://2020science.org/.
Maynard, A., 2011c. Why we don't need a regulatory definition for nanomaterial. University of Michigan Risk Science Center, http://umrscblogs.org.
Maynard, A., Bowman, D.M., Hodge, G.A., 2010. Conclusions: triggers, gaps, risks and trust. In: Hodge, G.A., Bowman, D.M., Maynard, A. (Eds.), International Handbook on Regulating Nanotechnologies. Edward Elgar, Cheltenham, pp. 573-586.
Maynard, A., Warheit, D.B., Philbert, M.A., 2011. The new toxicology of sophisticated materials: nanotoxicology and beyond. Toxicological Sciences 120, S109-S129.
Michael, M., Wainwright, S., Williams, C., 2007. Temporality and prudence: on stem cells as "phronesic things". Configurations 13 (3) 373-394.
McCarthy, E., Kelty, C., 2010. Responsibility and nanotechnology. Social Studies of Science 40 (3) 405-432.
Monthioux, M., Kuznetsov, V.L., 2006. Who should be given credit for the discovery of carbon nanotubes? Carbon 44, 1621-1623.
O'Malley, P., 2004. Risk, Uncertainty and Government. Cavendish Press/Glasshouse, London.
O'Riordan, T., Cameron, J., Jordan, A.J. (Eds.), 2001. Reinterpreting the Precautionary Principle. Cameron May, London.
Radushkevich, L.V., Lukyanovich, V.M., 1952. O strukture ugleroda, obrazujucegosja pri termiceskom razlozenii okisi ugleroda na zeleznom kontakte [On the structure of carbon formed by thermal decomposition of carbon monoxide on an iron contact]. Zurn Fisic Chim 26, 88-95.
Renn, O., 2008. Risk Governance: Coping with Uncertainty in a Complex World. Earthscan, London.
Rheinberger, H.-J., 1997. Toward a History of Epistemic Things: Synthesizing Proteins in the Test Tube. Stanford University Press, Stanford.
RIP-oN 1, 2011. REACH Implementation Project: Substance Identification of Nanomaterials. European Commission Joint Research Centre.
Roberts, J., 2008. New Chemical Bodies: A Conversation on Human Biomonitoring and Endocrine-Disrupting Chemicals. Chemical Heritage Foundation, Philadelphia.
SCENIHR, 2009. Risk Assessment of Products of Nanotechnologies. Scientific Committee on Emerging and Newly Identified Health Risks Opinion.
State of the Art Assessment on Endocrine Disruptors, 2011. Part 1: Summary of the State of the Science, 2nd Interim Report.
Star, S.L., Griesemer, J.R., 1989. Institutional ecology, 'translations' and boundary objects: amateurs and professionals in Berkeley's Museum of Vertebrate Zoology, 1907-39. Social Studies of Science 19 (3) 387-420.
Stirling, A., Gee, D., 2002. Science, precaution and practice. Public Health Reports 117, 521-533.
Thayer, A.M., 2007. Carbon nanotubes by the metric ton: anticipating new commercial applications, producers increase capacity. Chemical & Engineering News 85, 29-35.
Tognetti, S., 1999. Science in a double-bind: Gregory Bateson and the origins of post-normal science. Futures 31, 7.
Tørsløv, J., Slothus, T., Christiansen, S., 2011a. Endocrine disrupters - developing criteria. TemaNord 2011, 536.
Tørsløv, J., Slothus, T., Christiansen, S., 2011b. Endocrine disrupters - combination effects. TemaNord 2011, 537.
van Egmond, S., Bal, R., 2011. Boundary configurations in science policy: modeling practices in health care. Social Studies of Science 38 (1) 108-130.


Vogel, J., 2004. Tunnel vision: the regulation of endocrine disrupters. Policy Sciences 37 (3-4) 277-303.
Warheit, D.B., Laurence, B.R., Reed, K.L., Roach, D.H., Reynolds, G.A.M., Webb, T.R., 2004. Comparative pulmonary toxicity assessment of single-wall carbon nanotubes in rats. Toxicological Sciences 77, 117-125.


Wynne, B., 1996. May the sheep safely graze? A reflexive view of the expert-lay knowledge divide. In: Lash, S., Szerszynski, B., Wynne, B. (Eds.), Risk, Environment and Modernity: Towards a New Ecology. Sage, London, pp. 27-43.
Wynne, B., 2001. Creating public alienation: expert cultures of risk and ethics of GMOs. Science as Culture 10 (4) 445-481.