Comprehensiveness of scenarios in the safety assessment of nuclear waste repositories




Reliability Engineering and System Safety 188 (2019) 561–573



E. Tosoni⁎,a,b, A. Saloa, J. Govaertsc, E. Ziob,d,e

a Department of Mathematics and Systems Analysis, Aalto University, Finland
b Energy Department, Politecnico di Milano, Italy
c SCK.CEN, Studiecentrum voor Kernenergie - Centre d'Étude de l'Énergie Nucléaire, Belgium
d MINES ParisTech/PSL Université Paris, Centre de Recherche sur les Risques et les Crises (CRC), Sophia Antipolis, France
e Eminent Scholar, Department of Nuclear Engineering, Kyung Hee University, South Korea

ARTICLE INFO

Keywords: Safety assessment; Scenario analysis; Comprehensiveness; Uncertainty; Bayesian networks

ABSTRACT

The comprehensiveness of scenarios for the safety assessment of nuclear waste repositories has been studied from many perspectives. In this paper, we offer a generalized interpretation thereof through an explicit quantification of residual uncertainty, which makes it possible to derive conclusive statements about the safety of the repository. To support the operational evaluation of comprehensiveness, we develop a probabilistic methodology which uses set inclusion to capture epistemic uncertainties and propagates these in Bayesian networks to obtain bounds on the probability that the reference safety threshold is violated. We also present analyses which can help in achieving comprehensiveness and describe a novel algorithm for the efficient selection of computer-simulation inputs. The methodology is illustrated with a case study on a near-surface repository in Belgium.

1. Introduction

Nuclear waste repositories can be either near-surface or deep-geological, depending on waste radioactivity [1][2]. The risk associated with the construction and operation of the repository must be quantified by a safety assessment. This requires that the large uncertainty about the evolution of the repository be considered. To this aim, the safety assessment is typically carried out by scenario analysis. This begins by identifying the so-called Features, Events and Processes (FEPs) that affect the repository, such as groundwater flows, chemical concentrations and contaminant-transport properties. These are structured by building a model that represents the FEPs and their dependences. Finally, scenarios are generated and analyzed as joint evolutions of the FEPs [3].

A major challenge in the scenario analysis of nuclear waste repositories is comprehensiveness, partly because the interpretation of this concept has varied across alternative approaches. Referring to the recent review on scenario analysis [3], these interpretations have differed in terms of the extent to which the residual uncertainty about the risk of the repository is quantified. Specifically, the pluralistic approach relies on expert judgments in the formulation of scenarios, which are analyzed without quantifying the residual uncertainty. Thus, the analysis is considered comprehensive if the scenarios can be assumed to represent



all possible futures. In contrast, probabilistic approaches [4][5] quantify the residual uncertainty about the risk estimate primarily by sampling scenarios from a probability space, whereby comprehensiveness can arguably be attained by ensuring that the sample size is sufficiently large. This paper advocates a generalized interpretation of comprehensiveness, arguing that comprehensiveness is achieved if the residual uncertainty about risk is sufficiently small to assess conclusively whether the repository is safe or not. To help implement this interpretation, we present a probabilistic scenario-analysis methodology which is based on Bayesian networks. The novelty of the methodology consists of i) the characterization and propagation of epistemic uncertainty through feasible regions of probability values as a means of quantifying the residual uncertainty about risk, and ii) the use of the Adaptive Bayesian Sampling algorithm in deciding which computer simulations should be performed to reduce residual uncertainty and pursue comprehensiveness efficiently. We also carry out a sensitivity analysis to identify those expert judgments for which the attainment of a greater degree of consensus would lead to more conclusive results. The paper is structured as follows. Section 2 introduces the generalized interpretation of comprehensiveness and develops the Bayesian-network-based methodology with reference to a simplified repository. Section 3 applies the proposed methodology to the

Corresponding author. E-mail address: [email protected] (E. Tosoni).

https://doi.org/10.1016/j.ress.2019.04.012 Received 14 January 2019; Received in revised form 29 March 2019; Accepted 3 April 2019 Available online 04 April 2019 0951-8320/ © 2019 Elsevier Ltd. All rights reserved.


Fig. 1. Interpretation of comprehensiveness in probabilistic approaches when risk (here expressed as the probability p_vio of violating a reference safety threshold) is bounded by μ_l and μ_u, i.e., the endpoints of the dashed interval or the quantiles of the solid-line distribution. Comprehensiveness is achieved in (a) and (c), because the system can be deemed safe and unsafe, respectively, regardless of residual uncertainty. In (b), instead, the risk limit ε_vio is within the interval [μ_l, μ_u].
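As a minimal illustration of the decision logic depicted in Fig. 1 (our sketch, not part of the original paper), the snippet below classifies a pair of risk bounds against a risk limit; the numerical values of the limit are hypothetical, while the bounds [0.007, 0.084] are those reported later for the example of Fig. 2.

```python
def comprehensiveness(mu_l: float, mu_u: float, eps_vio: float) -> str:
    """Classify the risk bounds [mu_l, mu_u] against the risk limit eps_vio (cf. Fig. 1)."""
    if mu_u < eps_vio:
        return "comprehensive: repository can be deemed safe (Fig. 1a)"
    if mu_l >= eps_vio:
        return "comprehensive: repository can be deemed unsafe (Fig. 1c)"
    return "not comprehensive: result is inconclusive (Fig. 1b)"

# Hypothetical risk limits, purely for illustration.
print(comprehensiveness(0.007, 0.084, 0.1))   # safe
print(comprehensiveness(0.007, 0.084, 0.05))  # inconclusive
```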

prospective near-surface repository in Dessel, Belgium. Section 4 discusses the benefits and limitations of the methodology. Section 5 concludes.

2. Methodology for scenario analysis

2.1. Comprehensiveness

Comprehensiveness is required at different stages of scenario analysis. At the FEP level, comprehensiveness means that all FEPs which significantly influence the repository are identified and included [3]. However, ensuring this through systematic methods is very hard. Rather, comprehensiveness in scenario generation refers to the requirement of exploring exhaustively the scope of the FEPs that are retained. Because it is possible to develop methods for the quantitative evaluation of this exploration, we hereafter focus on the comprehensiveness of the scenarios.

In pluralistic scenario analysis, comprehensiveness is understood in terms of representativeness. Specifically, it is assumed that the repository evolution will be similar to one of the formulated scenarios. In contrast, in probabilistic approaches (adopted, e.g., for the WIPP and Yucca Mountain repositories [4][5]), scenarios are events in a probability space, and comprehensiveness refers to the broad coverage of the future through a large sample [3].

The underlying challenge is that completeness, i.e., the precise estimation of the risk of the repository, cannot be attained [3]. Indeed, scenario analysis involves some residual uncertainty, because there is an infinite number of possible futures and inevitable epistemic uncertainty about these futures. We therefore posit that comprehensiveness is achieved if this residual uncertainty is sufficiently small to assess conclusively whether the repository is safe or not.

Quantifying residual uncertainty is, thus, a prerequisite for evaluating comprehensiveness. While pluralistic approaches may not be effective in this regard (as we discuss in Section 3.2), residual uncertainty can be quantified in probabilistic approaches. Specifically, let μ_l and μ_u denote lower and upper bounds on risk, corresponding to the endpoints of an interval (see Section 2.3.3) or to the quantiles of a probability distribution (as in the assessments of the WIPP and Yucca Mountain repositories [4][5]). Regardless of how μ_l and μ_u are obtained, the difference between them is an indicator of residual uncertainty. In probabilistic approaches, comprehensiveness is achieved if the bounds on risk do not contain the predefined risk limit ε_vio, i.e., if either

μ_u < ε_vio  or  μ_l ≥ ε_vio.    (1)

These inequalities indicate whether the risk limit is respected (Fig. 1a) or breached (Fig. 1c), whereas the result μ_l < ε_vio ≤ μ_u (Fig. 1b) is inconclusive¹. This interpretation of comprehensiveness is aligned with the assessments of the WIPP and Yucca Mountain repositories [4][5], because large sample sizes tend to lead to tighter distributions. In what follows, we present a probabilistic methodology for quantifying residual uncertainty and evaluating comprehensiveness based on Bayesian networks.

¹ Conservatively, ε_vio is already unacceptable.

2.2. Bayesian networks

In a Bayesian network (BN) [6][7], the nodes represent random variables which correspond to the FEPs and the safety target, while the arcs indicate the causal dependences between these variables. Fig. 2 shows the FEPs (white nodes) Chloride concentration, Groundwater flow and Canister breach, and the safety target Radionuclide discharge (black node) for a simplified repository concept used for illustration.

Let the set V = {i | i = 1, ..., n_FEP, t} include the nodes for the n_FEP FEPs and the safety target t = n_FEP + 1. The set of directed arcs A = {(j, i) | i, j ∈ V, i ≠ j} denotes causal dependences, so that (j, i) ∈ A if the outcome at node i depends on j. The nodes V_i = {j | (j, i) ∈ A} from which there is an arc to node i ∈ V are the parents of i, which is their child. The sets V^I = {i | i ∈ V, V_i = ∅} and V^D = {i | i ∈ V, V_i ≠ ∅} = V \ V^I contain the independent and dependent nodes, without and with parents, respectively. In Fig. 2, Chloride concentration and Groundwater flow are the independent nodes, whereas Canister breach and Radionuclide discharge depend on them.

Each node i ∈ V is associated with a random variable whose realization s_i ∈ S_i is one of the possible states. These discrete states represent values, ranges or intensities of the variable, such as low, medium, high [8][9]. For the safety target t ∈ V, the state s^t_vio indicates the violation of the selected reference threshold (i.e., the regulatory limit or a more conservative value). Tab. 1 shows the states of the nodes of Fig. 2, obtained by discretization of ranges of values [10][11][12].

Uncertainties about the FEP states and the safety target are modeled as probabilities. Formally, p_{s_i} is the probability that an independent node i ∈ V^I is in state s_i ∈ S_i. For a dependent node i ∈ V^D, p_{s_i|s_{V_i}} is the conditional probability of state s_i when its parents j ∈ V_i are in the states s_{V_i} ∈ S_{V_i} = ×_{j∈V_i} S_j (where × denotes the Cartesian product). Tab. 2 shows illustrative state probabilities assigned by five experts to the FEP Groundwater flow in Fig. 2.


Fig. 2. Bayesian network of a simplified deep geological repository [1]. The canister can be breached due to the flow of groundwater transporting high concentrations of chloride. The size of the breach and the magnitude of the flow determine the extent of the discharge of radionuclides to the environment. The probabilities of more or less severe breaches are obtained from simulations, whereas those of Chloride concentration, Groundwater flow and Radionuclide discharge are obtained from expert judgments as described in Section 2.3.

Table 1. FEPs and safety target of Fig. 2 with states derived from the discretization of continuous ranges. The reference threshold for the normalized safety target is 1.

Node (FEPs)             | Units  | Range     | States (discretization)
Chloride concentration  | g l⁻¹  | [0, 75]   | Dilute [0, 23); Brine [23, 46); Saline [46, 75]
Groundwater flow        | l y⁻¹  | [0, 100]  | Low [0, 33); Medium [33, 66); High [66, 100]
Canister breach         | mm     | [0, 950]  | Damage [0, 170); Severe [170, 450); Complete [450, 950]
Node (safety target)    |        |           |
Radionuclide discharge  | –      | [0, +∞)   | Respect [0, 1); Violation [1, +∞)

Table 2. Illustrative probabilities for the states of the FEP Groundwater flow in Fig. 2.

State  | Expert 1 | Expert 2 | Expert 3 | Expert 4 | Expert 5
Low    | 0.660    | 0.880    | 0.570    | 0.720    | 0.505
Medium | 0.220    | 0.080    | 0.250    | 0.270    | 0.492
High   | 0.120    | 0.040    | 0.180    | 0.010    | 0.003

For nuclear waste repositories, a set of states of all FEPs constitutes a scenario [3]. This can be modeled as a vector s = (s_1, ..., s_{n_FEP}), so that the set of all scenarios is the Cartesian product S = ×_{i∈V^FEP} S_i, where V^FEP = V \ {t} is the set of FEPs. For example, in a scenario of Fig. 2, Chloride concentration, Groundwater flow and Canister breach may have the states saline, high and complete, respectively. The state of FEP i ∈ V^FEP in the scenario s is denoted by s_i. Similarly, we call a combination s_{V_i} of states for all parents V_i of a dependent node i ∈ V^D a subscenario. For instance, the subscenarios of Radionuclide discharge are made of states of Groundwater flow and Canister breach, whereas each combination of states of Chloride concentration and Groundwater flow is a subscenario of Canister breach. The subscenario of i in the scenario s is denoted by s_{V_i}.

In safety assessments, the event of interest is the violation of the reference threshold [13]. Following the global semantics of BNs [7], the total probability of the threshold-violation state is

p_vio = p(s^t_vio) = Σ_{s∈S} p(s, s^t_vio) = Σ_{s∈S} Π_{i∈V^I} p_{s_i} · Π_{j∈V^D, j≠t} p_{s_j|s_{V_j}} · p_{s^t_vio|s_{V_t}}.    (2)

The threshold violation probability is here taken as an estimate of the risk of the repository. This risk can be considered acceptable if this probability is below the predefined risk limit ε_vio. For example, for the WIPP repository [4], the United States Environmental Protection Agency has defined thresholds of cumulative radioactive discharge to the environment over 10,000 years which shall not be violated with a probability higher than 0.001 (or 0.1 for less hazardous radionuclides)².
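To make Eq. 2 concrete, the following sketch (ours, not the authors' code) enumerates all scenarios of the simplified BN of Fig. 2 and sums their probabilities weighted by the conditional violation probability at the safety target. All numerical entries are illustrative placeholders, except the Groundwater flow probabilities (Expert 1 in Tab. 2) and the (brine, low) frequencies of Canister breach reported in Section 2.3.1.

```python
import itertools

# Illustrative state spaces for the BN of Fig. 2.
states = {
    "Chloride": ["dilute", "brine", "saline"],
    "Flow": ["low", "medium", "high"],
    "Breach": ["damage", "severe", "complete"],
}

# Unconditional probabilities of the independent nodes (Chloride values are placeholders;
# Flow values are Expert 1 from Tab. 2).
p_chloride = {"dilute": 0.5, "brine": 0.3, "saline": 0.2}
p_flow = {"low": 0.660, "medium": 0.220, "high": 0.120}

def p_breach(breach, chloride, flow):
    """Conditional probability of Canister breach given (Chloride, Flow); placeholder CPT."""
    table = {("brine", "low"): {"damage": 0.30, "severe": 0.70, "complete": 0.0}}
    default = {"damage": 0.6, "severe": 0.3, "complete": 0.1}
    return table.get((chloride, flow), default)[breach]

def p_violation(flow, breach):
    """Conditional probability of threshold violation given (Flow, Breach); placeholder CPT."""
    base = {"damage": 0.01, "severe": 0.05, "complete": 0.4}[breach]
    factor = {"low": 0.5, "medium": 1.0, "high": 2.0}[flow]
    return min(1.0, base * factor)

# Eq. 2: sum over all scenarios of the product of state probabilities
# times the conditional probability of the violation state.
p_vio = 0.0
for c, f, b in itertools.product(states["Chloride"], states["Flow"], states["Breach"]):
    p_vio += p_chloride[c] * p_flow[f] * p_breach(b, c, f) * p_violation(f, b)

print(f"Threshold violation probability p_vio = {p_vio:.4f}")
```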

2.3. Epistemic uncertainty

The probabilities in Eq. 2 are not precisely known [14][15][16][17], and the characterization of their epistemic uncertainty [18][19][20][21] can be done differently for different sources of information [10][22]. Here, we partition the nodes into the sets V^sim and V^exp, for which the sources of information to obtain the state probabilities are simulations and expert judgments, respectively (more generally, for any given node, multiple sources of information can be aggregated [23]). For the time being, we also assume that simulations are restricted to dependent nodes, so that V^I ⊆ V^exp. In Fig. 2, V^sim includes Canister breach, whereas Chloride concentration, Groundwater flow and Radionuclide discharge belong to V^exp. In what follows, we describe epistemic uncertainty by feasible probability regions instead of crisp probability values. BNs with imprecise probabilities are usually referred to as credal networks or enhanced BNs [22][24].

2.3.1. Simulations

Simulations are extensively employed in risk assessments [25][26][27]. In BN-based analyses, simulations reproduce the relationship between a node i ∈ V^D ∩ V^sim and its parents V_i, i.e.³,

x_i = φ(x_{V_i}, θ),  i ∈ V^sim,    (3)

where x_{V_i} is a vector of values x_j ∈ [a_j, b_j], ∀ j ∈ V_i, θ ∈ Θ is a vector of simulation-model parameters, and the deterministic function φ: (×_{j∈V_i} [a_j, b_j]) × Θ → [a_i, b_i] is evaluated via the simulation. Specifically, the value x_j of each parent j ∈ V_i is sampled from the interval of values defining the state s_j in a given subscenario s_{V_i} ∈ S_{V_i} (e.g., from [0, 23 g l⁻¹) if Chloride concentration is dilute in Tab. 1). The resulting x_i corresponds to a state s_i of the child. Simulations can, thus, be performed for any subscenario to estimate the conditional state probabilities p_{s_i|s_{V_i}}, s_i ∈ S_i. However, simulations can be affected by epistemic uncertainties [28]. For instance, the parameter values θ may not be known, in which case a probability distribution F_Θ has to be defined. If these parameters are calibrated using experimental observations, measurement errors also have to be taken into account [29]. Further uncertainties arise from the sampling of the values x_j, j ∈ V_i, from the corresponding intervals (see the end of this section). Due to these uncertainties, repeated simulations of the same subscenario can result in different states s_i.

² Code of Federal Regulations, Title 40, Chapter I, Subchapter F, Part 191, Paragraph 191.13.
³ If states do not derive from discretization, the relationship is s_i = φ(s_{V_i}, θ).

Assume that n_{s_i|s_{V_i}} out of n_{s_{V_i}} simulations for the subscenario s_{V_i} lead to the state s_i, so that

Σ_{s_i∈S_i} n_{s_i|s_{V_i}} = n_{s_{V_i}}.    (4)

In a Bayesian setting, crisp estimates of the corresponding conditional state probabilities

p̂_{s_i|s_{V_i}} = (σ·π_{s_i|s_{V_i}} + n_{s_i|s_{V_i}}) / (σ + n_{s_{V_i}}),  s_i ∈ S_i,    (5)

are obtained as the expected values of the distributions Beta(σ·π_{s_i|s_{V_i}} + n_{s_i|s_{V_i}}, σ·(1 − π_{s_i|s_{V_i}}) + n_{s_{V_i}} − n_{s_i|s_{V_i}}). The π_{s_i|s_{V_i}} ≥ 0 are the prior probabilities and the parameter σ > 0 represents the strength of the belief in these priors (for larger σ, the estimate is closer to the prior). The uncertainty of these estimates can be described by lower and upper probability bounds p̲_{s_i|s_{V_i}} and p̄_{s_i|s_{V_i}}, which can be obtained in a variety of ways. For instance, instead of anchoring to the priors, we use the Imprecise Dirichlet Model (IDM) [30] to account for all π_{s_i|s_{V_i}} ∈ (0, 1), s_i ∈ S_i. Then, the feasible region for the vector of conditional probabilities p_{s_i|s_{V_i}}, s_i ∈ S_i, can be approximated by the set in which these are within their intervals and sum up to 1, i.e.,

{ p_{s_i|s_{V_i}} | p_{s_i|s_{V_i}} ∈ [p̲_{s_i|s_{V_i}}, p̄_{s_i|s_{V_i}}], Σ_{s_i∈S_i} p_{s_i|s_{V_i}} = 1, s_i ∈ S_i },  s_{V_i} ∈ S_{V_i}, i ∈ V^D ∩ V^sim.    (6)

If the probabilities at the safety target are obtained by simulations (i.e., t ∈ V^sim), the feasible region is the interval for the conditional probability of the violation state. As more simulations are performed, the probability intervals get narrower and the feasible region shrinks. Thus, the size of the feasible region reflects the amount of epistemic uncertainty.

Out of 1,000 simulations for the subscenarios of Canister breach in Fig. 2, 111 were performed for the subscenario in which Chloride concentration and Groundwater flow are in the states (brine, low). Using these, 33 and 78 resulted in the states damage and severe of Canister breach, respectively. The FEP values were sampled from uniform distributions over the intervals defining their states in a given subscenario, as uniform distributions generally help cover the entire intervals widely [31][32]. When simulating the subscenario (brine, low), we sampled the values of Chloride concentration from U(23 g l⁻¹, 46 g l⁻¹) and of Groundwater flow from U(0, 33 l y⁻¹), as per the discretization of Tab. 1. Fig. 3a shows the feasible region for the state probabilities conditioned on the subscenario (brine, low) of Canister breach.
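A compact sketch (ours) of how simulation counts for one subscenario can be turned into the crisp estimates of Eq. 5 and into IDM-style probability bounds. The counts follow the (brine, low) example above, and σ = 1, σ_IDM = 2 and the priors follow the illustrative settings of Section 2.3.4; the specific IDM bounds shown here are one common choice and stand in for the paper's exact interval construction.

```python
from typing import Dict, Tuple

def bayes_point_estimates(counts: Dict[str, int], priors: Dict[str, float],
                          sigma: float = 1.0) -> Dict[str, float]:
    """Crisp estimates as in Eq. 5: expected values of the Beta/Dirichlet posteriors."""
    n_total = sum(counts.values())
    return {s: (sigma * priors[s] + counts[s]) / (sigma + n_total) for s in counts}

def idm_bounds(counts: Dict[str, int], s_idm: float = 2.0) -> Dict[str, Tuple[float, float]]:
    """Lower/upper probabilities in the spirit of the IDM [30]
    (assumed form; the paper additionally uses a credibility level gamma)."""
    n_total = sum(counts.values())
    return {s: (counts[s] / (n_total + s_idm), (counts[s] + s_idm) / (n_total + s_idm))
            for s in counts}

# (brine, low) subscenario of Canister breach: 111 simulations,
# of which 33 -> damage, 78 -> severe, 0 -> complete.
counts = {"damage": 33, "severe": 78, "complete": 0}
priors = {"damage": 0.5, "severe": 0.5, "complete": 0.0}

print(bayes_point_estimates(counts, priors, sigma=1.0))
print(idm_bounds(counts, s_idm=2.0))
```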

2.3.2. Expert judgments

Let E_i, |E_i| > 1, be the set of experts for node i ∈ V. For independent nodes, the experts' subjective state probabilities, or beliefs, can be elicited by conventional methods [33]; for dependent nodes, parametric elicitation methods [34][35] may be more appropriate. Let the unconditional and conditional beliefs be denoted by p^e_{s_i}, ∀ e ∈ E_i, s_i ∈ S_i, i ∈ V^I ∩ V^exp, and p^e_{s_i|s_{V_i}}, ∀ e ∈ E_i, s_i ∈ S_i, s_{V_i} ∈ S_{V_i}, i ∈ V^D ∩ V^exp, respectively. As an example, experts' beliefs for Groundwater flow in Fig. 2 are shown in Tab. 2. As in aggregation methods [36][37], we define experts' weights w^e, ∀ e ∈ E_i, i ∈ V^I ∩ V^exp, for the independent nodes. However, instead of using exact weights, we form the convex hull of the experts' beliefs [38] as the feasible region

{ p_i | p_{s_i} = Σ_{e∈E_i} w^e · p^e_{s_i}, w^e ≥ 0, ∀ e ∈ E_i, Σ_{e∈E_i} w^e = 1, s_i ∈ S_i },  i ∈ V^I ∩ V^exp,    (7)

where p_i is the vector of state probabilities p_{s_i}, s_i ∈ S_i. For dependent nodes, the feasible regions are defined analogously, except that the weights are subscenario-specific. This is because experts may have more knowledge on some subscenarios and because imposing the same set of weights to the different feasible regions may lead to a less conservative quantification of epistemic uncertainty. Fig. 3b shows the feasible region for the state probabilities of Groundwater flow.

When the experts' beliefs are more divergent, the feasible region becomes larger, so that its size can be viewed as a measure of how much epistemic uncertainty there is [39]. In particular, the feasible region tends to become large if one expert strongly disagrees with the others (which may appear overly conservative [38] and not very "democratic" [40]). Moreover, if there are intra-expert uncertainties [39], experts may be reluctant to provide precise probabilities. In this case, one could elicit a feasible region from each expert and build a larger feasible region by enveloping these individual regions. In general, if experts are viewed as relevant information sources in the absence of more reliable ones, it is plausible to approximate uncertainty about probabilities by forming the convex hull of their beliefs.
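The feasible region in Eq. 7 is the convex hull of the experts' probability vectors. A minimal sketch (ours) that evaluates one point of this region for a given weight vector, using the Tab. 2 beliefs for Groundwater flow:

```python
import numpy as np

# Experts' beliefs for Groundwater flow (rows: experts; columns: low, medium, high), Tab. 2.
beliefs = np.array([
    [0.660, 0.220, 0.120],
    [0.880, 0.080, 0.040],
    [0.570, 0.250, 0.180],
    [0.720, 0.270, 0.010],
    [0.505, 0.492, 0.003],
])

def hull_point(weights):
    """Convex combination of the experts' beliefs, i.e., a point of the region in Eq. 7."""
    w = np.asarray(weights, dtype=float)
    if np.any(w < 0) or not np.isclose(w.sum(), 1.0):
        raise ValueError("weights must be nonnegative and sum to one")
    return w @ beliefs

print(hull_point([0.2, 0.2, 0.2, 0.2, 0.2]))  # equal weights: the average belief
print(hull_point([1, 0, 0, 0, 0]))            # a vertex: Expert 1's belief
```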

2.3.3. Residual uncertainty

Eq. 2 can be concisely written as

p_vio = p_vio(p^sim, p^exp(w)),    (8)

where p^sim represents all probability vectors obtained from simulations (p_{s_j|s_{V_j}}, s_j ∈ S_j, s_{V_j} ∈ S_{V_j}, j ∈ V^D ∩ V^sim) and p^exp(w) represents all probability vectors obtained from expert judgments (p_i, i ∈ V^I ∩ V^exp, and p_{s_j|s_{V_j}}, s_j ∈ S_j, s_{V_j} ∈ S_{V_j}, j ∈ V^D ∩ V^exp). Note that the latter depend on the vector of experts' weights w. Eq. 8 highlights that, for each feasible combination of state probability values, there is a corresponding probability p_vio of violating the reference safety threshold. Specifically, in the optimization model of Appendix A, the constraints ensure that the state probabilities belong to their feasible regions, and the threshold violation probability is the objective function to be minimized and maximized, with respect to the state probabilities, for estimating the lower and upper extremes p̲_vio and p̄_vio, respectively. Because the threshold violation probability is a multilinear function of the state probabilities (i.e., it is linear in each state probability if the others are fixed), the optimization model is a multilinear programming problem (MLP). Its solution is therefore obtained at some combination of extreme points of the feasible regions. Yet, there can be so many such combinations that solving the MLP by explicit enumeration is computationally intractable [41][42]. To examine the extreme points efficiently, we developed an algorithm that combines the simplex [43] and the reduced-gradient [44] methods for solving linear and nonlinear programming problems.

The result is an interval [p̲_vio, p̄_vio] for the threshold violation probability (e.g., [0.007, 0.084] in the example of Fig. 2), whose extremes can be taken as the risk bounds μ_l and μ_u, respectively. Thus, this interval quantifies the residual uncertainty on risk and supports decision making [22][41] through inequality (1). Yet, some (difficult to quantify) uncertainty about the extremes themselves should be acknowledged. In our methodology, the interval


Fig. 3. Feasible regions (gray areas) for the state probabilities of Canister breach, conditioned on the subscenario (brine, low) of states of Chloride concentration and Groundwater flow (a), and of Groundwater flow (b) in Fig. 2. The region on the left is the intersection of the hyperrectangle where the conditional state probabilities belong to their intervals and the hyperplane where they sum up to 1 (dash-dotted triangle), whereas that on the right is the convex hull of the experts' beliefs.

for the threshold violation probability depends on the feasible probability regions, which approximate reality [45]: the uncertainty from expert judgments can be larger than what is modeled by their convex hull; the probability intervals obtained from simulations vary with the parameters of the chosen model (e.g., the interval credibility γ and the strength of the belief in the priors σIDM in the IDM [30]). Moreover, in discretized BNs there are approximation errors in that the state probabilities are essentially integrals of probability density functions over the variables’ values [46][47]. Therefore, in real applications, especially if on the safe side as in Fig. 1a, there should be some conservative safety margin between the upper risk bound and the risk limit. In order to pursue comprehensiveness efficiently, we suggest that the BN is initialized with the experts’ beliefs, and that simulation results are incorporated progressively. Therefore, we develop an algorithm for identifying which subscenarios are most relevant for reducing residual uncertainty (Section 2.3.4). We also identify those state probabilities for which a greater consensus among experts would reduce residual uncertainty most (Section 2.3.5). The rationale for reducing uncertainty is that, unless the unknown pvio is very close to ɛvio, a narrower interval is less likely to contain the risk limit.
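The paper solves the interval bounds with a tailored simplex/reduced-gradient algorithm (Appendix A). The brute-force sketch below (ours, with hypothetical interval bounds) only illustrates the underlying idea, namely that the multilinear objective attains its extremes at combinations of extreme points of the feasible regions; it is tractable only for very small models.

```python
import itertools
import numpy as np

# Expert beliefs for one independent FEP (vertices of the convex hull in Eq. 7).
expert_beliefs = np.array([
    [0.660, 0.220, 0.120],   # P(low), P(medium), P(high)
    [0.880, 0.080, 0.040],
    [0.570, 0.250, 0.180],
])

# Interval bounds (e.g., from the IDM) on the conditional probability of violating
# the threshold given each FEP state; hypothetical numbers.
cond_bounds = [(0.00, 0.02), (0.03, 0.10), (0.20, 0.45)]

best_low, best_high = 1.0, 0.0
for p_fep in expert_beliefs:                         # vertices of the expert region
    for corners in itertools.product(*cond_bounds):  # endpoints of the intervals
        p_vio = float(np.dot(p_fep, corners))        # multilinear objective (cf. Eq. 2)
        best_low, best_high = min(best_low, p_vio), max(best_high, p_vio)

print(f"[p_vio_lower, p_vio_upper] = [{best_low:.3f}, {best_high:.3f}]")
```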

Table 3. The 1,000 simulations for the subscenarios of Canister breach in Fig. 2 by uniform sampling (US, third column), Adaptive Bayesian Sampling (ABS) in the basic version (fourth column) and in the improved version with n_CR = 10 (fifth column).

Chloride concentration | Groundwater flow | US  | ABS | ABS (n_CR = 10)
Dilute                 | Low              | 105 | 178 | 178
Brine                  | Low              | 111 | 120 | 121
Saline                 | Low              | 114 | 32  | 33
Dilute                 | Medium           | 108 | 238 | 239
Brine                  | Medium           | 114 | 119 | 118
Saline                 | Medium           | 108 | 44  | 36
Dilute                 | High             | 116 | 167 | 171
Brine                  | High             | 103 | 101 | 99
Saline                 | High             | 121 | 1   | 5

2.3.4. Adaptive Bayesian Sampling

Tab. 3 (third column) shows that the 1,000 simulations for the subscenarios of Canister breach in Fig. 2 were first performed for one subscenario at a time, with equal probability. This uniform sampling may lead to few simulations for all subscenarios, hence to large feasible probability regions and, ultimately, leave considerable residual uncertainty. For more efficient decisions on which subscenarios to simulate, similarly to algorithms such as AK-MCS [48] and Meta-AK-IS [49], we present Adaptive Bayesian Sampling (ABS).

If there is a single node whose probabilities are obtained from simulations, as in the cases of Fig. 2 and of Section 3, the set of eligible subscenarios is S^sim = S_{V_i}, i ∈ V^sim, |V^sim| = 1 (see the comments on |V^sim| > 1 at the end of this section). For brevity, we indicate each subscenario s_{V_i} ∈ S^sim by k = 1, ..., |S^sim|, and each state s_i ∈ S_i of the child node i by j = 1, ..., |S_i|. The number of simulations for the k-th subscenario is n_k and, among these, the number of those which have resulted in the j-th child's state is n_{j|k}. In each simulation round, ABS solves a decision tree through the following procedure (Fig. 4 and Tab. 4). The simulation round l = 1, ..., n_max supports the decision about the simulation l out of the maximum number n_max. If the k-th subscenario is selected, which is denoted by (l: k), a temporary simulation counter is set to

n_k^(l:k) = n_k + 1,    (9)

where n_k ≤ l − 1 (at the first round, n_k = 0). This assumption corresponds to the path (l: k) branching from the decision node (square) in Fig. 4. If this simulation gives the j-th state of the child, which is denoted by (l: k → j), we set temporary result counters to

n_{j|k}^(l:k→j) = n_{j|k} + 1    (10)

and

n_{h|k}^(l:k→j) = n_{h|k},  ∀ h ∈ S_i, h ≠ j,    (11)

where n_{s|k} ≤ n_k, ∀ s ∈ S_i. The assumptions corresponding to Eqs. 10 and 11 mean that the path (l: k → j) branches from the k-th chance node (circle) in Fig. 4. Through the IDM, for instance, these counters give the bounds p̲_{s|k}^(l:k→j) and p̄_{s|k}^(l:k→j), s ∈ S_i, on the state probabilities in the k-th subscenario. Through the MLP of Appendix A, these lead to the interval [p̲_vio^(l:k→j), p̄_vio^(l:k→j)], whose width

ω_vio^(l:k→j) = p̄_vio^(l:k→j) − p̲_vio^(l:k→j)    (12)

is a measure of the residual uncertainty. Note that this width is calculated before the l-th simulation is performed, based on assumptions about the next simulation and its result (as denoted by the superscripts). Indeed, by resetting the temporary


Fig. 4. Decision tree for selecting the subscenario for the l-th simulation, l = 1, ..., n_max. If it is performed for the k-th subscenario, k = 1, ..., |S^sim|, as denoted by (l: k), and it results in the j-th state of the child, j = 1, ..., |S_i|, as denoted by (l: k → j), the corresponding width of the interval for the threshold violation probability is ω_vio^(l:k→j). Taking into account the probabilities p̂_{j|k} of the results, the l-th simulation is performed for the subscenario for which the expected residual uncertainty E[ω_vio^(l:k)] is the smallest.

ABS focuses on those subscenarios si for which the associated probabilities are particularly relevant for the threshold violation probability. Fig. 5 shows the correlation between the number of simulations performed for the various subscenarios of Canister breach and the interval-valued sensitivities of the threshold violation probability to the associated probabilities. As the expression of these sensitivities is similar to Eq. 2 except that not all scenarios are included, their extremes can be calculated through the MLP of Appendix A. Fig. 6 shows the evolution of the residual uncertainty (i.e., the width of the interval for the threshold violation probability) when simulations are performed by uniform sampling (solid line) and ABS (dashed). ABS finally provides a slightly narrower interval (i.e., [0.007,0.082]) than uniform sampling, but it is outperformed between simulations 32 and 184. Indeed, ABS has selected the same subscenarios for many rounds, namely (dilute,medium) between 1 and 55, (dilute,high) between 56 and 95, (brine,medium) between 96 and 140, (brine,high) between 150 and 176. If a subscenario has been simulated many times, the information contributed by further simulations is marginal in that the feasible region for the associated conditional state probabilities does not shrink much. Thus, the reduction of residual uncertainty is slow, as shown by the repeated flattening of the dashed curve. ABS keeps selecting the same subscenarios if the prospects of just one simulation (Eq. 9) for a “new” subscenario are not expected to reduce residual uncertainty substantially. The switch to new subscenarios can be fostered by increasing the counters in Eqs. 9 and 10 by nCR > 1 instead of 1. This parameter represents the assumption that nCR consecutive simulations for the k-th subscenario would all result in the j-th state of the child. This information would not be significant for extensively simulated subscenarios, whereas it would shrink the feasible regions for unexplored subscenarios considerably. Note that nCR is only used for selecting the next subscenario, but just one simulation per round is performed. The revised ABS (nCR = 10 ) uses simulations somewhat more widely over the subscenarios (Tab. 3, fifth column) and proves more efficient than uniform sampling in reducing residual

counter n_{j|k}^(l:k→j) to n_{j|k}, the assumption denoted by (l: k → h) and the procedure from Eq. 10 to Eq. 12 can be iterated over all states h ∈ S_i, h ≠ j. The expected residual uncertainty from the l-th simulation for the k-th subscenario is, then,

E[ω_vio^(l:k)] = Σ_{j∈S_i} p̂_{j|k} · ω_vio^(l:k→j).    (13)

The point estimates p^j | k of Eq. 5 (hence “Bayesian” in ABS) are taken as the probabilities of different simulation results, i.e., of the paths branching from the k-th chance node (at the first round, they coincide with the priors). Now, by resetting the temporary counter nk(l :k ) to nk, the assumption that the m-th subscenario is selected, denoted by (l: m), and the procedure from Eq. 9 to Eq. 13 can be iterated over all subscenarios m S sim, m k . Because the goal is to reduce residual uncertainty, the next simulation is performed for the subscenario k* such that

k* = argmin_{k∈S^sim} E[ω_vio^(l:k)].    (14)

If there are multiple such subscenarios, the next simulation can be performed for any of them. Once the simulation result j′ ∈ S_i has been observed, the counters n_{k*} and n_{j′|k*} are incremented by 1 before proceeding to the simulation round l + 1, so that the Bayesian point estimates, the intervals for the conditional state probabilities and the interval for the threshold violation probability can be updated accordingly (hence "adaptive" in ABS). Tab. 3 (fourth column) reports the number of simulations performed for the subscenarios of Canister breach by ABS with the following illustrative parameters: σ = 1; σ_IDM = 2; γ = 0.99; the prior conditional probabilities for the states damage, severe, complete of Canister breach were set to (1, 0, 0), respectively, for the subscenario (dilute, low), to (0.5, 0.5, 0) for (brine, low) and (dilute, medium), to (0, 0.5, 0.5) for all subscenarios with saline Chloride concentration, and to (0, 1, 0) for (brine, medium), (dilute, high) and (brine, high).

Table 4. The Adaptive Bayesian Sampling algorithm.

for l = 1 : n_max do
    for k = 1 : |S^sim| do
        n_k^(l:k) = n_k + 1
        for j = 1 : |S_i| do
            n_{j|k}^(l:k→j) = n_{j|k} + 1
            n_{h|k}^(l:k→j) = n_{h|k}, ∀ h ∈ S_i, h ≠ j
            Compute p̲_{s|k}^(l:k→j) and p̄_{s|k}^(l:k→j), ∀ s ∈ S_i (e.g., with the IDM [30])
            Compute [p̲_vio^(l:k→j), p̄_vio^(l:k→j)] by solving the MLP of Appendix A
            Compute the imprecision ω_vio^(l:k→j) = p̄_vio^(l:k→j) − p̲_vio^(l:k→j)
            Reset n_{j|k}^(l:k→j) = n_{j|k}
        end
        Compute the expected imprecision E[ω_vio^(l:k)] = Σ_{j∈S_i} p̂_{j|k} · ω_vio^(l:k→j)
        Reset n_k^(l:k) = n_k
    end
    Perform simulation l for subscenario k* s.t. k* = argmin_{k∈S^sim} E[ω_vio^(l:k)]
    Observe the child's state j′ resulting from simulation l
    Increment the counters n_{k*} = n_{k*} + 1 and n_{j′|k*} = n_{j′|k*} + 1
    Update the Bayesian point estimates p̂_{j|k}, ∀ j ∈ S_i (Eq. 5)
    Update p̲_{s_i|k} and p̄_{s_i|k}, ∀ s_i ∈ S_i
    Update the threshold violation probability interval [p̲_vio, p̄_vio] by solving the MLP of Appendix A
end
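A condensed sketch (ours) of one ABS selection round (Eqs. 13 and 14). The function interval_width() is an assumed stand-in for solving the MLP of Appendix A on the hypothesized counts; here a fake width function that merely rewards exploring under-sampled subscenarios is used so that the snippet runs on its own.

```python
import numpy as np

def expected_width(counts, k, point_est, interval_width, n_cr=1):
    """Expected residual uncertainty E[omega^(l:k)] of Eq. 13 if the next
    simulation were allocated to subscenario k (with the n_CR variant)."""
    exp_w = 0.0
    for j in range(counts.shape[1]):            # hypothesize each child state j
        trial = counts.copy()
        trial[k, j] += n_cr                      # temporary counters, Eqs. 9-10
        exp_w += point_est[k, j] * interval_width(trial)   # widths of Eq. 12
    return exp_w

def next_subscenario(counts, point_est, interval_width, n_cr=1):
    """Eq. 14: pick the subscenario with the smallest expected width."""
    widths = [expected_width(counts, k, point_est, interval_width, n_cr)
              for k in range(counts.shape[0])]
    return int(np.argmin(widths))

# Toy usage: 3 subscenarios x 2 child states.
counts = np.array([[5, 3], [0, 1], [10, 2]], dtype=float)
point_est = (counts + 0.5) / (counts.sum(axis=1, keepdims=True) + 1.0)  # Eq. 5 with sigma=1
fake_width = lambda c: 1.0 / (1.0 + c.min())    # placeholder for the real MLP-based width
print("simulate subscenario:", next_subscenario(counts, point_est, fake_width, n_cr=10))
```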


Fig. 7. Computational effort (in hours) per unit reduction of residual uncertainty (width of the interval for the threshold violation probability) by Adaptive Bayesian Sampling (nCR = 10 ) with respect to uniform sampling as a function of the simulations-to-subscenarios ratio in the example of Fig. 2.

Fig. 5. Correlation in Adaptive Bayesian Sampling between (i) the number of simulations (circles, right vertical axis) for each subscenario of Canister breach in Fig. 2 (horizontal axis) and (ii) the sensitivity (segments, left vertical axis) of the threshold violation probability to the state probabilities (for graph clarity, only of the state complete) conditioned on the subscenarios.

satisfactorily and ABS is unnecessary. When there are intermediate resources for simulation, the computational effort required by ABS can be relatively low. If |Vsim| > 1, the implementation of ABS depends on the specific problem structure. For instance, it can be hard to treat “overlapping” subscenarios defined by combinations of states of V i and V j such that Vi V j for nodes i, j ∈ |Vsim| whose values are calculated by the same computer code. Furthermore, if there are multiple codes, the maximum number of simulations nmax may apply to all of them, or each may have its own. More radically, BNs may not be fully adequate in problems as the WIPP and Yucca Mountain repositories [4][5], in which all dependent nodes are calculated by simulations, i.e., V sim = V D (see also Section 4.2).

uncertainty (dash-dotted line in Fig. 6). The computation time of ABS at each simulation round is tMLP ·|S i|·|S sim |, where tMLP is the computation time of the MLP of Appendix A. This must be solved for each branch of the decision tree in Fig. 4. After l simulations, the computational effort of ABS per unit reduction of residual uncertainty compared to uniform sampling is

c_ABS = (t_MLP · |S_i| · |S^sim| · l) / (ω_vio^US − ω_vio^ABS),    (15)

where ω_vio^US and ω_vio^ABS are the widths of the interval for the threshold violation probability obtained by the two algorithms. Fig. 7 shows that the required computational effort is large when the ratio between the number of simulations and subscenarios is very small, because ABS cannot bring noticeable improvements; it is even larger at the opposite extreme, because uniform sampling covers all subscenarios

ABS vio

2.3.5. Contraction Residual uncertainty is also driven by the degree of divergence in experts’ beliefs. In order to quantify this contribution, it is possible to employ the contraction [50][51] of the feasible regions for the state probabilities obtained by expert judgments. This means that, in Eq. 7, each expert is given a minimum weight 0 < λ ≤ 1/|Ei|. This contraction

Fig. 6. Evolution of the residual uncertainty (width of the interval for the threshold violation probability) as a function of the number of simulations in uniform sampling (solid line), Adaptive Bayesian Sampling (dashed) and its revised version with nCR = 10 (dash-dotted).

Fig. 8. Dose curves for the pluralistic scenarios of Tab. 2 of the Data article [53]. The time horizon is cut at 50,000 y as dose rates become negligible. 568

Reliability Engineering and System Safety 188 (2019) 561–573

E. Tosoni, et al.

Fig. 9. Bayesian network for the near-surface repository taken as the case study.

helps in exploring the implications of state probabilities which do not differ much from the average belief. The corresponding shrinkage of the feasible region would reduce the residual uncertainty. Specifically,

Δ_vio^λ = (ω_vio − ω_vio^λ) / ω_vio    (16)

measures the sensitivity of the threshold violation probability to the λ-contraction of the feasible regions, where the width ω_vio^λ is calculated by adjusting the constraints in the MLP of Appendix A to ensure that the corresponding weights are not less than λ. Based on these sensitivities, the state probability vectors can be ranked. In the example of Fig. 2, the largest reduction (more than 40%) of residual uncertainty would result from taking the average (λ = 1/|E_i|) of the experts' beliefs about Groundwater flow. Nevertheless, there are no guarantees that the experts, after further discussion, would agree on the average belief. Still, this kind of analysis identifies those probabilities for which the reduction of epistemic uncertainty would contribute to the attainment of comprehensiveness.
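A small sketch (ours) of the λ-contraction: every expert keeps at least weight λ, so the admissible weight vectors shrink toward the average belief. The beliefs are those of Tab. 2; the function returns the resulting range of one state probability.

```python
import numpy as np

beliefs = np.array([   # Tab. 2: P(low), P(medium), P(high) per expert
    [0.660, 0.220, 0.120],
    [0.880, 0.080, 0.040],
    [0.570, 0.250, 0.180],
    [0.720, 0.270, 0.010],
    [0.505, 0.492, 0.003],
])

def contracted_range(beliefs, state, lam):
    """Range of P(state) when every expert weight is at least lam (0 <= lam <= 1/m).
    Extremes of a linear function occur at the vertices, where one expert
    takes all the remaining weight."""
    m = beliefs.shape[0]
    values = []
    for e in range(m):
        w = np.full(m, lam)
        w[e] = 1.0 - (m - 1) * lam
        values.append(float(w @ beliefs[:, state]))
    return min(values), max(values)

# lam = 0 recovers the uncontracted hull; lam = 1/|E_i| = 0.2 collapses to the average belief.
for lam in (0.0, 0.1, 0.2):
    print(lam, contracted_range(beliefs, state=2, lam=lam))
```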

3. Case study

3.1. Near-surface disposal

Near-surface repositories can host low- and intermediate-level short-lived nuclear waste [1]. Longer-lived radionuclides are also admitted, but in limited amounts. These repositories can be found in, e.g., the Czech Republic, France, Norway, Slovakia, the United Kingdom and the United States. Bulgaria, Germany, Italy, Romania and Slovenia, among others, are at different stages of the licensing and construction process. Here, we consider a case study on the repository planned for the site of Dessel, Belgium [52], although the following is not an actual safety assessment.

A possible design for near-surface repositories is illustrated in Fig. 1 of the Data article [53] associated with this work. Nuclear waste is placed into concrete monoliths, in the form of bulks of individual pieces such as tools and clothing, or drums of compacted material. Monoliths are piled within modules, also made of concrete, which rest on a sand-cement embankment laid on the ground. These barriers are finally covered with soil and concrete components. Even though the repository is designed to confine the radionuclides, some release may occur over time. The main driver of this process is water, which seeps through the top cover, penetrates the barriers and triggers the leaching of radionuclides. These are subsequently discharged into the unsaturated zone and, hence, they start migrating in groundwater. Eventually, radionuclides may reach humans through the ingestion of contaminated water, for instance. Here, we made the conservative assumption that household water is consumed from a well downstream of the repository. The resulting time-dependent dose rate to the public in [Sv y⁻¹] can be computed with COMSOL Multiphysics [54], as described in the Data article [53].

3.2. Pluralistic scenario analysis

As in the previous Dessel study [55], we first adopted a pluralistic approach. Scenarios were formulated in Tab. 2 of the Data article [53] through COMSOL parameter values reflecting different assumptions about boundary conditions, physical-chemical degradation of the barriers, etc. The remaining parameters in Tab. 1 of [53], which do not characterize any particular scenario, were kept fixed. For each scenario, Fig. 8 shows the dose rate over the time horizon 0-100,000 y due to Iodine-129 (129I) and Plutonium-239 (239Pu). These long-lived radionuclides are significant contributors to the radiological impact, and behave differently in terms of very weak and very strong sorption onto the solid phase, respectively. The curves are checked against the unit value, as they were normalized with respect to the confidential dose-rate reference threshold (which was nevertheless taken considerably below the regulatory limit of 1 mSv y⁻¹ [56]). The reference threshold (bold dashed line) is violated by the scenarios of Low sorption (solid curve), Fast chemical degradation (dashed) and Low infiltration (dash-dotted). In all other scenarios (represented by dotted curves) the reference threshold is not violated. However, this scenario analysis does not permit conclusive statements about safety. Are the violating scenarios based on very conservative assumptions? How likely are they? Should even more conservative scenarios be added? Questions on the need for additional information could be answered through the evaluation of comprehensiveness, which requires residual uncertainty to be quantified.

3.3. Scenario analysis with Bayesian networks

We revisited the pluralistic scenario analysis using the BN-based methodology (Fig. 9). The safety target is the Dose rate to the public, normalized by the confidential safety threshold. Because BNs do not enable explicit time modeling [57], we take the maximum dose rate between 0 and 100,000 y. In turn, the states of the FEPs refer to relevant time instants (e.g., Water flux) or are invariant with time (e.g., Barrier degradation) [55]. A given FEP can have strongly correlated model parameters, which vary together and consistently in, e.g., "low" or "high" pluralistic scenarios (see the distribution coefficients in Tab. 2 of the Data article [53], for instance). The remaining parameters in Tab. 1 of the Data article [53] were kept fixed as in the pluralistic analysis, and modeled as the parameters θ in Eq. 3. Had there been some uncertainty about them, we would have defined a distribution F_Θ over their values (the uncertainty about a parameter should be



Fig. 10. The 1,000 simulations (horizontal axis) for the subscenarios (vertical axis) of the case study of Section 3 by uniform sampling (left) and Adaptive Bayesian Sampling (right).

considerable for it to be modeled as a FEP [3]). We also included the FEP Earthquake, which affects the speed of barrier degradation [55]. The arc from Crack aperture to Hydraulic conductivity explains the decreased effective conductivity of fissured concrete. Tab. 3 of the Data article [53] reports the FEPs and the safety target of Fig. 9 (second column), their consistent model parameter (third column) and their discrete states (fourth to sixth columns). The illustrative expert judgments of Tabs. 4 to 12 in [53] were taken as the state probabilities for the FEPs. Subsequently, the conditional probabilities of violation at the safety target were estimated through COMSOL simulations. We performed 1,000 simulations for the 576 subscenarios of Dose rate made by the states of Water flux, Crack aperture, Diffusion coefficient, Distribution coefficient, Chemical degradation, Barrier degradation, Monolith degradation and Hydraulic conductivity. In keeping with Section 2.3.1, we sampled the parameter values from uniform distributions over the intervals for the FEP states in the given subscenario (or loguniform in case of very wide intervals). For example, when simulating a subscenario in which Diffusion coefficient is low (see Tab. 3 of [53]), we sampled the consistent parameters Initial diffusion coefficient in concrete and Initial diffusion coefficient in mortar from LU(3.83E-11 m² s⁻¹, 8.82E-11 m² s⁻¹) and LU(8.46E-12 m² s⁻¹, 3.85E-11 m² s⁻¹), respectively (where LU stands for loguniform). Consistent parameters were sampled using the same seed.

The rather low ratio of 1,000 simulations for 576 subscenarios suggested that ABS could be beneficial. Specifically, we kept the settings of Section 2.3.4 with n_CR = 10 and used the frequencies observed in past numerical simulations as priors (Tab. 13 in [53]). Regardless of the specific subscenarios, Fig. 10 highlights that, unlike uniform sampling (a, left), ABS directed the simulations towards the most relevant ones (b, right). In Fig. 11, ABS (dashed line) outperformed uniform sampling (solid) for the entire simulation experiment. After 1,000 simulations, the intervals for the threshold violation probability obtained by the two algorithms became [0.033, 0.856] and [0.003, 0.990], respectively, representing an absolute reduction by 16.4% of residual uncertainty in ABS with respect to uniform sampling (difference between the interval widths). The simulation rounds took approximately 113 hours with uniform sampling and 199 hours with ABS on a standard computer. Thus, the computational effort for reducing residual uncertainty with respect to uniform sampling was roughly 4 days.

The residual uncertainty, that is, the width of the interval for the threshold violation probability, remained large. We did not set the risk limit ε_vio, which is to be provided by the safety authority. However, unless lower than 0.033 (or, unrealistically, higher than 0.856), the risk limit may be contained in the interval for the threshold violation probability, so that comprehensiveness would not have been achieved. In this sense, in our methodology the quantification of residual uncertainty permits the evaluation of comprehensiveness. Part of the residual uncertainty comes from the diversity between the experts' beliefs. For instance, the contraction-based sensitivity analysis showed that taking the average of the beliefs about Water flux would reduce residual uncertainty by nearly 4% (whereas seeking consensus about the other FEPs would lead to minor reductions). Yet, this limited sensitivity suggests that residual uncertainty can be attributed mostly to the limited number of simulations. The same simulations-to-subscenarios ratio as in the example of Fig. 2 (in which the residual uncertainty was small) would be attained for approximately 60,000 simulations in the Dessel case study. The fact that 1,000 simulations, even if used efficiently, were not sufficient confirms that few illustrative scenarios can lead to comprehensiveness only in the sense of representativeness.

Fig. 11. Evolution of the residual uncertainty (width of the interval for the threshold violation probability) as a function of the number of simulations for obtaining the conditional state probabilities for Dose rate by uniform sampling (solid line) and Adaptive Bayesian Sampling (dashed).
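One possible way (ours, not the authors' implementation) to draw consistent simulation inputs for a given FEP state is loguniform sampling over the state's interval with a shared seed, so that correlated parameters move together. The interval values are those quoted above for the low Diffusion coefficient state; their pairing with concrete and mortar is an assumption.

```python
import numpy as np

def loguniform(rng, low, high, size=None):
    """Sample from a loguniform distribution on [low, high]."""
    return np.exp(rng.uniform(np.log(low), np.log(high), size=size))

# Illustrative intervals (m^2/s) for the 'low' state of the diffusion-coefficient FEP.
intervals = {
    "D0_concrete": (3.83e-11, 8.82e-11),
    "D0_mortar":   (8.46e-12, 3.85e-11),
}

seed = 42   # the same seed for each parameter keeps consistent parameters aligned
for name, (lo, hi) in intervals.items():
    rng = np.random.default_rng(seed)
    print(name, loguniform(rng, lo, hi))
```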



Table 5. Multilinear programming (MLP) problem to estimate the lower and upper bounds of the interval for the threshold violation probability.

Objective function (lower bound):  p̲_vio = min over p^sim, p^exp(w) of p_vio
Objective function (upper bound):  p̄_vio = max over p^sim, p^exp(w) of p_vio

Constraints:
p_{s_i} = Σ_{e∈E_i} w^e p^e_{s_i},  w^e ≥ 0, ∀ e ∈ E_i,  Σ_{e∈E_i} w^e = 1,  s_i ∈ S_i,  i ∈ V^I ∩ V^exp
p_{s_j|s_{V_j}} = Σ_{e∈E_j} w^e_{s_{V_j}} p^e_{s_j|s_{V_j}},  w^e_{s_{V_j}} ≥ 0, ∀ e ∈ E_j,  Σ_{e∈E_j} w^e_{s_{V_j}} = 1,  s_j ∈ S_j,  s_{V_j} ∈ S_{V_j},  j ∈ V^D ∩ V^exp, j ≠ t
p_{s^t_vio|s_{V_t}} = Σ_{e∈E_t} w^e_{s_{V_t}} p^e_{s^t_vio|s_{V_t}},  w^e_{s_{V_t}} ≥ 0, ∀ e ∈ E_t,  Σ_{e∈E_t} w^e_{s_{V_t}} = 1,  s_{V_t} ∈ S_{V_t},  if t ∈ V^exp
p_{s_j|s_{V_j}} ∈ [p̲_{s_j|s_{V_j}}, p̄_{s_j|s_{V_j}}],  Σ_{s_j∈S_j} p_{s_j|s_{V_j}} = 1,  s_j ∈ S_j,  s_{V_j} ∈ S_{V_j},  j ∈ V^D ∩ V^sim, j ≠ t
p_{s^t_vio|s_{V_t}} ∈ [p̲_{s^t_vio|s_{V_t}}, p̄_{s^t_vio|s_{V_t}}],  s_{V_t} ∈ S_{V_t},  if t ∈ V^sim

4. Discussion

4.1. Comparison with the pluralistic approach

Because our methodology quantifies the residual uncertainty on risk, it is possible to evaluate comprehensiveness as interpreted in Section 2.1. Without quantification, it is difficult to conclusively assess whether the repository is safe or not. This is why the interpretation of comprehensiveness in the pluralistic approach relies primarily on the representativeness of the scenarios [3]. From the pluralistic perspective, the most desirable result is when the scenario with the largest radiological impact does not violate the reference threshold at the safety target. However, the interactions between the FEPs are so complex (i.e., nonmonotonic) that the worst-case scenario may not have been formulated [58]. Moreover, if the reference threshold is violated in some scenarios (as in Fig. 8), probabilistic considerations are needed to demonstrate that these are so unlikely that they can be neglected. In turn, probabilistic approaches, like ours, may be black boxes, which provide risk values without clarifying the role of specific scenarios [59]. Insights can be gained through, for instance, risk importance measures [60][61][62][63]: because they connect assumptions on the occurrence of specific system states to increased or decreased risk [64], risk importance measures can inform strategies of risk monitoring and/or reduction.

4.2. Comparison with other probabilistic approaches

Different probabilistic approaches can be employed in scenario analysis. A classical one is based on fault trees [65], in which a scenario is a sequence of events. In fault trees, uncertain probability values can be addressed, for instance, with evidence theory to estimate an interval for the probability of the top event (i.e., for nuclear waste disposal, the violation of the reference safety threshold) [66]. This notwithstanding, the Boolean logic of fault trees may not be flexible enough for modeling multi-state variables with complex dependences. Another option is the probabilistic framework adopted for the WIPP and Yucca Mountain repositories [4][5]. This framework could be applied to, e.g., the example of Fig. 2 if all dependent nodes were examined by simulations (i.e., V^sim = V^D). Probability distributions could be built over the values of the independent nodes Chloride concentration and Groundwater flow. The epistemic uncertainty about the means and variances of these distributions could also be described by distributions. Then, a double-loop Monte Carlo sampling [67] could be carried out to simulate the Canister breach and the Radionuclide discharge repeatedly, and to estimate a distribution for the threshold violation probability (the bell-shaped curves of Fig. 1). Thus, whereas definite comparisons between former approaches and our methodology are not straightforward, we argue that probabilistic scenario analyses may be suitable for characterizing epistemic uncertainty, quantifying residual uncertainty and evaluating comprehensiveness.
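A schematic sketch (ours) of the double-loop idea mentioned above: an outer loop samples epistemically uncertain distribution parameters, an inner loop propagates aleatory variability, yielding a distribution of threshold-violation probabilities. All distributions and the physical model below are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
threshold = 1.0                     # normalized reference threshold

def violates(chloride, flow):
    """Placeholder physical model standing in for the repository simulations."""
    return 0.0005 * chloride * flow > threshold

p_vio_samples = []
for _ in range(200):                          # outer loop: epistemic uncertainty
    mean_chloride = rng.uniform(20.0, 50.0)   # uncertain mean of an input distribution
    mean_flow = rng.uniform(10.0, 60.0)
    inner = [violates(rng.normal(mean_chloride, 5.0), rng.normal(mean_flow, 10.0))
             for _ in range(500)]             # inner loop: aleatory variability
    p_vio_samples.append(np.mean(inner))

print("5th-95th percentile of p_vio:", np.percentile(p_vio_samples, [5, 95]))
```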

5. Conclusions

In this paper, we have considered the comprehensiveness of scenarios for the safety assessment of nuclear waste repositories, noting that the prevailing interpretations of comprehensiveness reflect specific approaches to the quantification of residual uncertainty about the risk of the repository. Against this backdrop, we have sought to clarify the principles and procedures in the evaluation of comprehensiveness. Specifically, we provided a general interpretation of comprehensiveness, which is less dependent on the specific methodological approach. Recognizing that scenario analysis does not typically yield complete knowledge about the risk of the repository, comprehensiveness is achieved if the residual uncertainty on risk is sufficiently small to conclusively assess whether the repository is safe or not. This interpretation makes it necessary to quantify residual uncertainty, which implies that pluralistic scenario analysis is not entirely adequate. We have therefore operationalized the interpretation of comprehensiveness for probabilistic approaches, which provide a framework for quantifying residual uncertainty. If this is done by estimating a lower and an upper bound on risk, comprehensiveness can be said to be achieved if the risk limit is not within these bounds (otherwise statements on safety cannot be conclusive).

On these grounds, we have presented a methodology for scenario analysis based on Bayesian networks. The Features, Events and Processes (FEPs) and the safety target are modeled as nodes, which assume discrete states with different probabilities. Scenarios and subscenarios are defined as combinations of FEP states, which are used for calculating risk as the probability that the safety target violates the reference safety threshold. The state probabilities can be obtained by simulations and expert judgments, whose epistemic uncertainty is characterized through regions of feasible probability values and propagated into an interval for the threshold violation probability. Residual uncertainty is quantified by the width of this interval, which, together with the interval location, offers the basis for evaluating


comprehensiveness. The interpretation of comprehensiveness also helps assess what additional information may reduce residual uncertainty. To do this efficiently, we have developed the Adaptive Bayesian Sampling algorithm to focus the simulation resources on those combinations of FEP states (more precisely, subscenarios) which are most relevant for narrowing the interval for the threshold violation probability. We have also carried out a sensitivity analysis based on the concept of contraction, showing that the selection of probability estimates near the average of the experts’ beliefs would reduce residual uncertainty. These analyses may help achieve comprehensiveness. We have illustrated our methodology with a case study on the planned near-surface repository at Dessel, Belgium. Adaptive Bayesian Sampling made it possible to allocate simulation resources efficiently. The contraction-based sensitivity analysis identified those experts

whose beliefs are particularly important for achieving comprehensiveness. Overall, our results suggest that residual uncertainty can be quantified in our methodology more readily than in pluralistic approaches, thus helping in the evaluation of comprehensiveness.

Acknowledgements

This research was funded by the Finnish Research Program on Nuclear Waste Management KYT 2018 (project TURMET). The authors are indebted to Dr. Suvi Karvonen from the Technical Research Centre of Finland for her support to the research work, and thank the anonymous reviewers for their valuable suggestions. Edoardo Tosoni is personally grateful to Dr. Suresh Seetharam from the Belgian Centre of Nuclear Studies for both his practical and scientific help.

Appendix A. Optimization-based estimation of the interval for the threshold violation probability

Supplementary material

Supplementary material associated with this article can be found, in the online version, at https://doi.org/10.1016/j.ress.2019.04.012.
