Uncertainties in smart grids behavior and modeling: What are the risks and vulnerabilities? How to analyze them?


Energy Policy 39 (2011) 6308–6320, doi:10.1016/j.enpol.2011.07.030


Enrico Zio (a, b), Terje Aven (c)

(a) Ecole Centrale Paris – Supelec, Paris, France
(b) Politecnico di Milano, Milano, Italy
(c) University of Stavanger, Stavanger, Norway

Corresponding author: E. Zio, Ecole Centrale Paris – Supelec, Paris, France, and Politecnico di Milano, Italy. E-mail addresses: [email protected], [email protected], [email protected] (E. Zio), [email protected] (T. Aven).

Article history: Received 11 December 2010; Accepted 14 July 2011; Available online 6 August 2011

Abstract

This paper looks into the future world of smart grids from a rather different perspective than usual: that of uncertainties and the related risks and vulnerabilities. An analysis of the foreseen constituents of smart grids and of the technological, operational, economic and policy-driven challenges is carried out to identify and characterize the related uncertainties and associated risks and vulnerabilities. The focus is on the challenges posed to the representation and treatment of uncertainties in the performance assessment of such systems, given their complexity and high level of integration of novel technologies. A general framework of analysis is proposed. © 2011 Elsevier Ltd. All rights reserved.

Keywords: Smart grids; Uncertainty; Risk

1. Introduction

In today's world, a number of critical infrastructures have become fundamental to national and international social development, economy, quality of life and security. The energy transmission and distribution network, the telecommunication network and the transportation system are common representative examples. Their high degree of inter- and intra-connectedness makes them quite vulnerable to global disruption when exposed to hazards of various natures, from random mechanical/physical/material failures to natural events, intentional malevolent attacks and human errors. This broad spectrum of hazards and threats calls for an all-hazards approach to understanding the failure behavior of such systems, in view of their consequent protection.

One associated underlying motif is the complexity of these systems and the related emergent behaviors, which may arise in collective ways that are difficult to predict from the superposition of the behavior of the individual elements of the system. This complex, emergent behavior has been demonstrated by system breakdowns, often triggered by small perturbations followed by accelerating cascades with large-scale, border-crossing consequences, stressing the importance of (inter)dependencies. As a consequence, the analysis of these systems cannot be carried out with classical analytical methods of system decomposition and logic analysis; a framework is needed to integrate a number of methods capable of
viewing the problem from different perspectives (topological and functional, static and dynamic, etc.).

The focus of this paper is on the networks for energy supply (the so-called electric power grids) and their foreseen future "smart" development in response to the challenges ahead. These systems are pervasive in our everyday life, as they reach virtually every home, school, hospital, office, factory and institution in developed countries and are quickly penetrating developing ones. They are made of a large number of interconnected elements (wires and machines), which link the electricity generators to the customers for the satisfaction of their diverse needs. Originally developed as loosely interconnected networks of local systems, electric power grids have extended over large scales, across regional and national boundaries. Recently, distributed resources, mostly small power generators, have been increasingly connected to the existing backbone. The extent of the interconnectedness and the number and variety of power sources, generators, controls and loads make electric power grids among the most complex engineered systems.

Another relevant complexity attribute relates to the increased integration of the electrical power grid with other critical infrastructures, e.g. driven by the pervasive use of computer-based communication and control systems, which is beneficial in many respects but, on the other hand, introduces additional vulnerabilities due to the increased interdependence, which can lead to surprising behaviors in response to perturbations. The re-conceptualization of the electric power grid to allow for the integration of large shares of electricity produced by harvesting solar and wind energies at the most suitable sites (e.g. desert solar and offshore wind farms) calls for a "smarter" system with decentralized
generation, smart metering, new devices for increased controllability, self-healing, etc., which will convert the existing power grid from a static infrastructure, operated as designed, into a flexible, adaptive infrastructure operated proactively.

Besides the above-mentioned technological challenges (e.g. creation of distribution management through distributed intelligence and sensing; integration of renewable resources), a number of emerging issues are confronting the electric power grid systems and increasing the stress of the environments in which these are to be operated. These are:

– the deregulated energy market, which has resulted in the systems being operated closer to their capacity and limits, i.e., with reduced safety margins, and consequently in the need for new and balanced business strategies;
– the projected demand for electricity in the next 25–50 years, which results in the need to respond technically by increased capacity and efficiency;
– the sensed increase in the exposure to malevolent attacks, which are no longer only hypothetical and call for effective protection against a type of hazard/threat much more difficult to predict than random failures.

In this scenario of increased stress on the electric power grid, concerns naturally arise about the vulnerability of the systems and the risks associated with their future development and operation, i.e., about the possibility that capacities and safety margins will be designed so as to support the anticipated growing demand, in a scenario of greater integration with other critical infrastructures and of market deregulation, without diminishing security and reliability of operation.

In this paper, we look at the 'stressful' scenario envisioned for the future electric power grids and reflect on its associated uncertainties, from the point of view of the vulnerabilities and hazards/threats which they imply. We propose the adoption of a categorization of uncertainties typical of risk analysis, and coherently offer some views on the proper mathematical frameworks for their representation and treatment (Aven and Zio, 2011; Dubois, 2010). In doing so, we question to what extent the different approaches serve the purpose, with an eye on the decision makers, who must be given a clearly informed picture of the uncertainties, risks and vulnerabilities, faithful to the supporting information and knowledge available, upon which they can confidently reason and deliberate in terms of protection measures and tolerability. We keep the presentation of the different frameworks for uncertainty representation and treatment at the minimum level of technical detail sufficient to challenge and judge how the different frameworks contribute to a fair and informative representation of the risks, vulnerabilities and associated uncertainties.

The remainder of the paper is organized as follows. In Section 2, we examine the emerging issues of future electric grids, highlighting the associated uncertainties. Then, we frame the general concept of uncertainty analysis in Section 3, and specify its nature in Section 4. On this basis, Section 5 proposes some possible extended approaches for treating the identified uncertainties. In Section 6, we bring together the complexity and uncertainty characteristics related to the characterization and modeling of future smart grids and offer a framework of guidelines for addressing them systematically. Section 7 summarizes the key issues and questions raised, and provides some conclusions.

2. Electric power grids: issues and challenges

The electric power grid is the ensemble of elements that connect the sources of electricity (i.e., the power plants) with
the users or customers, allowing for the satisfaction of their needs. The objective of providing the demanded supply with high reliability calls for a large number of tightly synchronized generators connected in a complex pattern for transmission and provision to the customers. In this way, any failure in generation is recovered from other parts of the system. To give a hint of the dimension of the system, it is sufficient to look at a grid like that of the United States, containing around 10,000 power plants sending power through more than 300,000 km of high-voltage transmission lines.

Currently, the electrical power grid has a multi-level hierarchical structure, which encompasses the generation level, the high-voltage transmission level, the medium-voltage sub-transmission level and the medium- and low-voltage distribution level. This constitutes a multi-state, multi-mode system capable of responding to disturbances by specific operational and control actions and reactions. For example, normal states and modes involve economic dispatching, load frequency control and forecasting; disturbances include faults, instability and load shedding; restorative states and modes involve rescheduling, resynchronization and load restoration. From the point of view of the time domain, the operation of electrical power grids is multi-scale, with a range which extends from the nanoseconds of the fast wave dynamic effects to the decades for expansion by installation of new power plant units.

One technological issue which strongly influences the problem of electricity generation and supply is the difficulty in storing electricity, which calls for a continuous matching of generation and use. Simplistically, there are baseload generators, which are run continuously to supply the minimum demand level, complemented by peaking generators, which intervene only to meet power needs at maximum load, and intermediate generators, which take care of all the other situations. Sophisticated controls are used to dispatch generators as needed, accounting also for customer demand variations from day to night and from season to season.

The current electric power grids have grown on a huge, continental scale, somewhat in a patchwork manner. This raises significant concerns of reliability and vulnerability, exacerbated by the deregulated market. In theory, deregulation is aimed at offering the lowest service price through increased competition. On the contrary, the current market reality shows a business-oriented drive to maximum efficiency for highest profit. Opening the power industry to private, independent operators has reduced the safety margins. In the past, extra generation capacity was used as a 'shock absorber' to reduce the risk of generation shortages in case of equipment failures or of unusually high demands for power (e.g. on very hot or very cold days). As a result of this reduction in 'shock absorption' capability, the grid is becoming increasingly stressed and exposed to risks and vulnerabilities. Indeed, a number of cascading failures leading to large blackouts have been experienced in the last years throughout the world, calling for a deep understanding of the complexities of power network systems under stressed environments and for the development of new emergency controls and restoration strategies. In addition to mechanical failures and damaging attacks, line overloads can create power-supply instabilities, e.g. phase or voltage fluctuations.
Transmission and distribution losses have at least doubled in the past 40 years as a result of increased utilization and congestion, costing billions of Euros per annum. For example, the economic losses of the blackout that occurred in the Northeastern area of the United States and in Canada on August 14, 2003 were about $4–$10 billion, with approximately 50 million people affected (US–Canada Power System Outage Task Force, 2004).
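To make the dispatch logic described above concrete, the following minimal Python sketch allocates a demand level to baseload, intermediate and peaking units in merit order and reports any shortfall. It is purely illustrative: the unit names and capacities are invented, and it is not a real dispatching algorithm.

```python
# Purely illustrative merit-order dispatch: unit names and capacities are invented.
UNITS_MW = [
    ("baseload", 900.0),       # runs continuously to cover the minimum demand
    ("intermediate", 600.0),   # covers the normal daily variation
    ("peaking", 300.0),        # intervenes only near maximum load
]

def dispatch(demand_mw):
    """Allocate demand to units in merit order and report any shortfall."""
    schedule, remaining = {}, demand_mw
    for name, capacity in UNITS_MW:
        output = min(capacity, remaining)
        schedule[name] = output
        remaining -= output
    schedule["unserved"] = remaining   # > 0 means demand exceeds installed capacity
    return schedule

for label, load in [("night", 700.0), ("day", 1400.0), ("peak", 1900.0)]:
    print(label, dispatch(load))
```

The last case, in which demand exceeds the total installed capacity, is precisely the situation that the 'shock absorber' of extra generation capacity was meant to avoid.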

The analysis of recent major blackouts from 2003 to 2006 (Table 1) leads to some conclusions on the main underlying causes and common patterns of behavior, which are bound to threaten and shape the future of smart grids as well (Kroger and Zio, 2011):

– Technical failures (Denmark/Sweden, two independent failures), external impacts (Tokyo, construction work; Brazil, extreme weather conditions) and adverse behavior of protective devices (London) are important triggering events, when not protected against by the N-1 security criterion (see the note to Table 1) and/or in combination with high load conditions (Moscow).
– Outstanding causes are found also in organizational factors like market liberalization and short-term contracting, which lead to system operation beyond the original design parameters (e.g., Great Lakes, Italy) and stressful operating conditions due to weakening maintenance work and/or inadequate integration of intermittent power generation (e.g., Western Europe).
– As the transmission system operators (TSOs) play a decisive role with regard to contingency management, lack of situational awareness and short-term preparedness, as well as limited real-time monitoring beyond control areas and poor timely cross-border coordination (e.g., Great Lakes, Italy, Switzerland (rail)), build up as aggravating factors.
– The inadequacy of the N-1 security criterion and, even more importantly, its inadequate evaluation/implementation in various cases have prompted attempts to make it more stringent and legally binding.
– Also, lack of investment due to increasing economic pressure, public resistance, etc. can be observed in many countries and areas, leading to insufficient system monitoring, control and automation, grid extension and maintenance (including tree cutting programs (Great Lakes, Switzerland/Italy)), and thus contributing significantly to past blackouts.

Table 1
Recent major blackouts of electric power supply systems.

Blackout | Date | Duration (h) | Load loss (GW) | People affected | Main causes
Great Lakes, NYC | August 14, 2003 | ~16 | ~60 | 50 Mio. | Inadequate right-of-way maintenance, EMS failure, poor coordination among neighboring TSOs
London | August 28, 2003 | 1 | 0.72 | 500,000 | Incorrect line protection device setting
Denmark/Sweden | September 23, 2003 | 7 | 6.4 | 4.2 Mio. | Two independent component failures (not covered by the N-1 rule)
Italy | September 28, 2003 | up to 18 | ~30 | 56 Mio. | High load flow CH–I, line flashovers, poor coordination among neighboring TSOs
Athens | July 12, 2004 | 3 | 9 | 5 Mio. | Voltage collapse
Moscow | May 25, 2005 | 4 | 2.5 | 4 Mio. | Transformer fire, high demand leading to overload conditions
Switzerland (railway supply) | June 22, 2005 | 3 | 0.2 | 200,000 passengers | Non-fulfillment of the N-1 rule, wrong documentation of line protection settings, inadequate alarm processing
Tokyo | August 14, 2006 | 5 | ? | 0.8 Mio. households | Damage of a main line due to construction work
Western Europe ("controlled" line cut-off) | November 4, 2006 | 2 | ~14 | 15 Mio. households | High load flow D–NL, violation of the N-1 rule, poor inter-TSO coordination
Brazil, Paraguay | November 10, 2009 | 4 | ~14 | 60 Mio. | Short circuit on a key power line due to bad weather; Itaipu hydro plant (18 GW) shut down

Note: the N-1 security criterion specifies that "any probable single event leading to a loss of a power system element should not endanger the security of the interconnected operation, that is, trigger a cascade of tripping or the loss of a significant amount of consumption. The remaining network elements, which are still in operation, should be able to accommodate the additional load or change of generation, voltage deviation or transient stability regime caused by the initial failure." (UCTE, 2008). UCTE is now ENTSO-E.

The complexity of the problem is made even harsher by the fact that critical infrastructures are highly interconnected and mutually dependent in complex ways, both physically and through a host of information and communications technologies (so-called cyber-based systems), so that failures in one system can cross borders and cascade onto another with significant consequences, as shown by experienced events (Kroger and Zio, 2010). The Italian electric power blackout of September 28, 2003 at 3.01 am (Saturday/Sunday night) serves well as an example.

One of the main north–south transit lines through Switzerland – the Lukmanier transmission line – shut down following a flashover between a conductor cable and a tree. This resulted in a physical redistribution of the electrical flow, with subsequent overload (110%) of another north–south transit line, the San Bernardino transmission line, which also shut down at 3.25 am due to another flashover. As a consequence, a series of cascading failures of other transmission lines spread through the border region, leaving the Italian grid completely separated from the UCTE (now ENTSO-E) network. Despite primary frequency control, automatic disconnection of the pump storage plants in Northern Italy and load shedding (for a total
of 10 GW), the voltage and frequency drops within the Italian grid could not be mastered and generation plants started to trip. This in turn gave rise to a total blackout throughout Italy at 3.27 am (except Sardinia). After 3 h, energy was restored in some regions connected to France (such as Liguria). Nine hours later, in the afternoon of 28 September, electricity was restored gradually in most places, including Turin, Milan, Venice and Rome. The energy not supplied due to the rotating outages totaled 12.9 GWh. Rolling blackouts continued to affect about 5% of the population for the next two days (29–30 September) as the electricity company ENEL continued its effort to restore supply. Restoring power to the whole country took 18 h. As a consequence, other infrastructure sectors were affected, showing their strong dependence on electricity supply. The effects on the population, the economy and other infrastructures are given in Table 2. The impact of failure cascades across inter-dependent infrastructures is highlighted by the real events listed in Table 3.

The answer to the grand challenges posed by the design, operation and management of the current and future electrical grids must rely on rational, risk-informed and cost-effective strategies for investment in expansion and modernization, supported by revolutionary engineering, materials science and technology. From the point of view of reliability, vulnerability, risk and security, all hazards and threats to the electric power grid relate to the hardware network itself or to its companion communication and information systems, and include:

– mechanical failures;
– natural events (storms, earthquakes, natural fires, etc.);
– structural attacks (by conventional means, including truck bombs, small airplane hits, gunshots);
– control failures;
– control hijacking;
– cyber/internet attacks.

Some of these hazards and threats bring into the picture the issue of dependencies of the energy infrastructure on adjacent power grids, telecommunications markets and computer networks.
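The N-1 security criterion quoted in the note to Table 1 can be illustrated by a deliberately simplified screening sketch: each single corridor outage is checked against the remaining (invented) transfer capacity. A real implementation would re-run a load-flow calculation after each contingency rather than simply summing capacities.

```python
# Simplified N-1 screening sketch: corridor names, capacities and the transfer
# level are invented; no actual power-flow model is used.
CORRIDOR_CAPACITY_MW = {"line_A": 1200.0, "line_B": 1000.0, "line_C": 800.0}
TRANSFER_MW = 1900.0   # power that must flow from the generation area to the load area

def n_minus_1_secure(capacities, transfer):
    """True if the transfer can still be carried after any single line outage."""
    for outaged in capacities:
        remaining = sum(c for name, c in capacities.items() if name != outaged)
        if remaining < transfer:
            print(f"loss of {outaged}: only {remaining:.0f} MW left for {transfer:.0f} MW")
            return False
    return True

print("N-1 secure:", n_minus_1_secure(CORRIDOR_CAPACITY_MW, TRANSFER_MW))
```

In this toy case the loss of the largest corridor already violates the criterion, mirroring the overload-and-cascade pattern of the Italian event described above.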

Table 2
Effects of the Italian blackout, 28 September 2003 (Kroger and Zio, 2011).

Impact on the population: strong
– About 56 million people affected, five elderly persons dead.
– Hundreds of people trapped in elevators.

Economic losses: moderate
– About €120 million due to spoiled foodstuff.
– Several hundred thousand € due to the interruption of continuously working industries (e.g., steel, cement or plastic factories); no effects on the financial markets.

Impact on dependent critical infrastructures: varying
– About 110 trains with more than 30,000 passengers stopped, as well as subways in Rome and Milan. Flights were canceled or delayed. Outage of traffic lights partly led to chaotic situations in major cities, but with no severe accidents.
– In some southern regions, interruptions of water supply for up to 12 h.
– Telephone and mobile networks in a critical state but operable; internet providers shut down their servers (data transfer rate went down to 5% of nominal).
– Hospitals with no serious problems, due to the use of emergency power generators.

This panorama is rendered technologically even more challenging by the fact that, in the end, the reliability of the electrical service relies on the ability of the system to instantaneously respond to changed conditions. Undoubtedly, operating a heterogeneous, widely dispersed but globally highly interconnected system such as the electric power grid is a serious challenge, even more so when accounting for all the diverse objectives which must be achieved: optimal service efficiency and maximum consumer benefit, but also business gains in a fair and free, competitive market.

In this respect, the current structure of electric power grids suffers from the fact that the 'intelligence' for effectively responding to the challenges of the dynamic environment is applied only locally, by protection systems centrally controlled through the Supervisory Control and Data Acquisition (SCADA) system. In certain cases, the central control system is too slow, and the protection systems are designed to protect only specific components. In the present structure of completely centralized control, situations of rapid change demand multiple, high-data-rate, two-way communication links, a powerful central computing facility for data elaboration and a sophisticated operations control center: all these elements are subject to disruption at the very time when they would be needed to face unusual and sudden system stresses caused by failures, natural events, malevolent attacks, etc.

This unsatisfactory situation justifies the innovative efforts pushing in the direction of self-healing 'smart grid' concepts. These concepts foresee the possibility of distributing intelligence in the electrical power grid by means of independent processors located in each component (e.g. breaker, switch, transformer, busbar, etc.) and at each substation and power plant. These processors, equipped with a robust operating system, are expected to be able to act as independent agents that can communicate and cooperate with others in a large, distributed computing platform. Besides containing permanent information on the component characteristics and parameters, each agent is connected to sensors associated with the component, whose analog measurements can be used to assess the component operating conditions; these are transmitted to the neighboring agents via the communication pattern set up. The network of local controllers can act as a distributed, parallel computer, communicating via microwaves, optical cables or the power lines themselves and intelligently limiting their messages to only the information relevant for achieving global optimization and failure recovery.

Such local, autonomous control would make the system resilient to multiple contingencies. For this, control components would need to be reconfigurable power electronic devices with a distributed controller, functioning as high-speed switches to redirect power flow; they must also be fault-tolerant and equipped with agent-based collaborative intelligence to expedite the response to contingency events. Smart grids so made would be able to absorb the isolation of islands following multiple failures in various locations of the network; indeed, the intelligence distributed throughout the components acting as independent agents would allow them to reorganize themselves and rapidly rejoin the network by making efficient use of the available local resources.
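A minimal sketch of the component-agent idea described above is given below. The component names, ratings and trip rule are invented; real agents would carry much richer component models, operating systems and communication protocols.

```python
# Invented, minimal component-agent sketch: local sensing plus small status
# messages exchanged with neighbouring agents.
class ComponentAgent:
    def __init__(self, name, rated_current_a):
        self.name = name
        self.rated_current_a = rated_current_a   # permanent component parameter
        self.neighbours = []                     # agents it can communicate with
        self.inbox = []                          # status messages received

    def sense(self, measured_current_a):
        """Assess the local operating condition and inform the neighbours."""
        status = "overload" if measured_current_a > self.rated_current_a else "normal"
        for other in self.neighbours:
            other.inbox.append((self.name, status))
        return status

    def act(self):
        """Toy local decision rule based only on locally available messages."""
        return "open breaker" if any(s == "overload" for _, s in self.inbox) else "stay closed"

breaker = ComponentAgent("breaker_12", rated_current_a=400.0)
transformer = ComponentAgent("transformer_3", rated_current_a=400.0)
breaker.neighbours, transformer.neighbours = [transformer], [breaker]

breaker.sense(450.0)        # a local overload is broadcast to the neighbourhood
print(transformer.act())    # -> "open breaker"
```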
One key aspect for the success of such 'smart' operation relates to the possibility of providing grid operators with increased look-ahead capabilities and foresight for scenario anticipation. Current schemes achieve a delay of more than 30 s in assessing system behavior. On the contrary, today's advanced simulation and modeling techniques can provide faster-than-real-time look-ahead simulations for what-if analyses of regional power systems, which integrate market, policy and risk analyses within a holistic framework capable of quantifying the integrated effects on system reliability and security, thus avoiding previously unforeseen disturbances through greater grid self-awareness and resilience.

The technological challenges behind the development of such a framework of distributed, intelligent control and operation extend to: advanced sensors, with a focus on (i) the development of digital control systems to replace the far less accurate analog and pneumatic controls, (ii) the detection and measurement of a wide range of chemical species, e.g. CO2, NOx and SO2, and (iii) devices resistant to the harsh temperatures and chemical environments typical of power plants; smart materials allowing self-recovery, with fast response in milliseconds, under outage events or terrorist attacks; advanced hardware components (meters, sensors and monitors, motors, transformers, solid-state breakers, switchgear, computers and networks, mobile devices, appliances) incorporating smart materials and structures into the grid; nanotechnology for energy storage and saving (for example, there is an expectation that nanotechnology will enable the development of power storage systems with energy densities much larger than those of current batteries); and the use of fullerene carbon molecules in nanotubes for highly conductive wires and cables (Amin and Stringer, 2008).

Table 3
Examples of the importance of interdependencies between critical infrastructures (based on ETH-LSA, 2009). CRE: cascade resulting event; CIE: cascade initiating event; CCIE: common cause initiating event.

Failure of the Galaxy IV satellite, May 19, 1998 (CRE)
– Duration: two days; Galaxy IV was not restored.
– Cause: failure of the satellite's primary control processor (technical).
– Affected infrastructures: ICT; finance services.
– Consequences: 45 million customers lost pager service; loss of TV/radio service; disruption of bank card service.
– Vulnerabilities: almost all satellites are not well protected, due to their inherent designs and operating environment (design); the replacement of a failed satellite is extremely slow and expensive (operational).

Baltimore Howard Street Tunnel Fire, July 18, 2001 (CIE)
– Duration: the tunnel entrance was blocked for about 5 days.
– Cause: a freight train derailed while passing through the Howard Street Tunnel in Baltimore and caused a fire and explosion due to the subsequent ignition of the flammable liquid carried (technical).
– Affected infrastructures: transport (rail) – water and food (geospatial); transport (rail) – ICT (radio) (physical); transport (rail) – energy (power supply) (geospatial); water and food (water supply) – ICT (internet/telecommunication) (geospatial); water and food (water supply) – energy (power supply) (geospatial).
– Consequences: 12 million USD associated with the incident; 23 bus lines and several train services suspended or delayed; delays of coal and limestone services; extremely heavy road congestion in Baltimore; 14 million gallons of water lost (water supply system); 1200 Baltimore buildings lost electricity; service disruptions for phone/cell phone and slowed internet service.
– Vulnerabilities: the significance of this rail route was not fully recognized (organizational); many structures, services and private utility lines coexisted within a relatively compact area (design).

Hurricane Katrina, August 23–31, 2005 (CCIE)
– Duration: Katrina, a category 4 hurricane, dissipated 7 days after it formed; repair and recovery lasted more than four years.
– Cause: Hurricane Katrina caused severe damage along the Gulf coast from Central Florida to Texas, especially in New Orleans, Louisiana (natural).
– Affected infrastructures: energy (power supply) – water and food (water supply) (physical); ICT (telecommunication) – public safety, rescue and emergency services (physical); ICT (telecommunication) – energy (power supply) (logical); energy (power supply) – ICT (radio) (physical); transport (road) – public health (medical care and hospitals) (physical); energy (power supply) – public health (medical care and hospitals) (physical); ICT (radio) – public health (medical care and hospitals) (physical).
– Consequences: damages cost more than 100 billion USD; 1836 fatalities; 80% of New Orleans flooded; drinking water contamination; 890,300 customers in Louisiana lost power; 30 oil drilling platforms destroyed or damaged; 24% of annual oil production and 18% of annual gas production reduced; 3 million customers lost phone lines; 2000 cell sites out of service; most major roads in the New Orleans area damaged; 7 million gallons of oil and 1–2 million gallons of gasoline spilled into southeast Louisiana; the population of New Orleans reduced by half after Katrina.
– Vulnerabilities: the original levees in New Orleans were not designed for a category 4 or 5 hurricane (design); massive inoperability problems between communication and power supply systems (design/operational); lack of backup communication devices, such as satellite phones (organizational); responsibilities (city, state, federal) turned out to be impeding (organizational).

Kocaeli Earthquake, August 17, 1999 (CCIE)
– Duration: the restoration process lasted from a few days (power supply) to months (transport infrastructure) and up to many years for total recovery.
– Cause: a Mw 7.4 earthquake along the North Anatolian Fault in northwestern Turkey (natural).
– Affected infrastructures: ICT (internet/telecommunication), energy (power supply, oil supply), chemical industry, transport (road/rail), water and food, public safety, rescue and emergency services, financial services; energy (power supply) – ICT (telecommunication) (physical); energy (power supply) – transport (rail/road) (physical); energy (power supply) – water and food (water and food supply) (physical); energy (oil supply) – transport (road) (physical); transport (road) – public safety, rescue and emergency services (physical).
– Consequences: around 15 billion USD of losses; official death toll of 17,127 (unofficial figure closer to 40,000), 43,959 injured, around 500,000 homeless; loss of electricity due to damaged transmission lines and substations; oil sector affected due to damage to industrial plants; various industrial facilities affected, including petrochemical plants, pharmaceutical firms, car manufacturers, tire companies, paper mills, steel fabrication plants, etc.; loss of water supply services due to failures of pipelines; motorways and railway tracks damaged; telephone service disruptions.
– Vulnerabilities: some heavily populated districts were built on ground mainly composed of sea soil, which made them vulnerable to any earthquake (organizational/technical); inadequate seismic design and construction practices led to collapses (technical); inadequate earthquake risk management (organizational); food, clothing and sheltering efforts generally lagged behind the needs of the people (organizational).

Mini Telecommunication Blackout in Rome, January 2, 2004 (CIE)
– Duration: the communication blackout lasted around 2 h.
– Cause: in the major Telco node, the breakage of a metallic pipe carrying cooling water caused the node failure and cascading failures in the other infrastructures (technical).
– Affected infrastructures: ICT (geospatial); energy (cyber); transport (cyber); finance services (cyber).
– Consequences: the ACEA power grid lost the monitoring and control of all the remote substations managed by the unmanned control center; major node failure, with all connections to the node failed; delays and service perturbations at banks; delays and check-in troubles at Fiumicino airport; delays and service perturbations at the ANSA press agency and post offices.
– Vulnerabilities: the facilities and apparatus in the major Telco node were not checked and maintained carefully and regularly (operational); because of a maintenance operation, two redundant communication links between two control centers of the ACEA power grid were actually routed through the same Telco node, and both links failed due to geospatial common-cause failures (design/operational); emergency power supply (diesel generators) was not kept safely enough and failed to start due to the presence of water (operational).

World Trade Center Attack, September 11, 2001 (CCIE)
– Duration: part of the energy supply and telecommunications were restored within one week; more services were restored within one year; the complete rebuilding of the WTC site is still underway.
– Cause: a terrorist attack with two hijacked planes caused the collapse of the WTC (malevolent acts).
– Affected infrastructures: ICT (geospatial and physical); energy (geospatial and cyber); transport (physical and logical); finance services (geospatial, physical and cyber); water and food (geospatial); public safety, rescue and emergency services (geospatial, physical and cyber); public health (logical).
– Consequences: 4 airplanes crashed; the Twin Towers collapsed, with destruction of nearby buildings; in total 2993 people were killed and more than 6000 injured; loss of power supply in a large area; loss of gas and steam supply; many phone lines and data lines damaged, with most communication traffic rerouted; the New York Stock Exchange closed and the international economy was affected; loss of water supply; all air and subway services suspended.
– Vulnerabilities: geospatial concentrations of critical infrastructure may be distinctly vulnerable to catastrophic disruption (design and operational); lack of interoperability resulted in police and fire services not being able to communicate with one another during the emergency (organizational and operational); the event revealed shortfalls in the ability of the emergency services sector to respond to large-scale terrorist incidents and other catastrophic disasters requiring extensive cooperation among local, state and federal emergency response organizations (organizational and operational); communications and business continuity will further deteriorate after the initial impact of a disaster if backup power generation facilities are not provided with guaranteed access to fuel and maintenance (organizational).

3. Basic concepts of uncertainty analysis

Technology investors and policy decision makers are faced with the challenge of carrying out the program of development of future smart electric power grids. They are facing an uncertain future scenario of evolution of technology and society. Future smart grid operators will then be faced with the challenge of operating the system with the required stability and reliability of service, in spite of the variability of power generation and utilization involved, and all the associated hazards and threats. Evaluation of these uncertain situations is required, by proper modeling with due account of all exogenous and endogenous uncertainties.

By definition, modeling provides a representation of the system behavior, based on a number of hypotheses and parameters whose values need to be estimated. However good a model may be, it will never provide a perfect replica of reality, due to the intrinsic uncertainty in, and incomplete knowledge of, the system behavior. This leads to uncertainty about the hypotheses of the model (model or structural uncertainty) and about the values of its parameters (parameter uncertainty). Such uncertainty propagates within the model to its outputs. In the literature, a number of aspects/factors/causes of uncertainty have been identified (Armacost and Pet-Edwards, 1999; Zimmermann, 2000):

– Lack of information (or knowledge): lack of information, knowledge and/or data on the phenomena, systems and events to be analyzed is the main source of uncertainty. Obviously, this cause of uncertainty can be reduced by gaining more information and data about the problem at hand.
– Approximation: approximation means, for example, using a specific value for an uncertain parameter, or replacing a physical phenomenon by a simple model. Actually, any modeling implies some degree of approximation. It takes place when the analyst does not have enough information to describe the phenomenon of interest exhaustively, or when he/she deliberately wishes to simplify the analysis (e.g. due to computational constraints). In some cases the approximation is declared explicitly; in other cases it is hidden.
– Abundance of information (or knowledge): this aspect is due to the human incapacity to assimilate and elaborate many pieces of data and information at the same time. In this situation, the analyst usually focuses attention only on those parameters and those pieces of data and information that he/she considers more important, neglecting the others. The identification of a rigorous (and possibly automated) procedure to select (among hundreds or thousands) the relevant data, information and parameters for the application at hand is the most critical issue. The analyst has to face this aspect of human incapacity when, for example, he/she has to choose among different models for simulating a given phenomenon. This issue is closely linked to biases and heuristics in subjective probability assignments (see Kahneman and Tversky (1974) and Aven (2003)).
– Conflicting nature of pieces of information/data: it may happen that some pieces of available information and data suggest a given behavior of the system, while others suggest a different one. In this case, increasing the amount of available information and data would not decrease the uncertainty but rather increase the conflict among the different pieces of information and data. There can be many reasons for such a conflict, for example that some pieces of information are affected by errors which the analyst cannot identify, or that a model used by the analyst is poor.
– Measurement errors: the measurement of a physical quantity (e.g. temperature, weight, length, etc.) is always affected by the precision of the instrument adopted.
– Linguistic ambiguity: an expert may state that a quantity is large, but the meaning of "large" is ambiguous and can be interpreted in different ways.
– Subjectivity of analyst judgments: different analysts may provide different interpretations of the same piece of information and data, depending on their cultural background and competence in the field of analysis.

The aim of uncertainty analysis is to describe the uncertainty in the modeling due to the different aspects/factors/causes and to propagate its effects onto the output. In simple terms, uncertainty analysis determines the uncertainty in the model results y = f(x) that comes from uncertainty in the model inputs x (Helton et al., 2006). This requires an assessment of the uncertainties in x and their propagation through the model f. Typically, the uncertainty about x and the uncertainty related to the model structure f, i.e., the uncertainty due to the existence of alternative plausible hypotheses and to the completeness of the representation offered by the model as to all variabilities of the real system, are treated separately. Aven (2010a) distinguishes between model inaccuracies (the differences between the actual output y and the predicted output obtained through f(x)) and model uncertainties due to alternative plausible hypotheses on the phenomena involved. Actually, while the first source of uncertainty has been widely investigated, and more or less sophisticated methods have been developed to deal with it (Aven, 2010a), research is still ongoing to obtain effective and agreed methods to handle the uncertainty of model structure (Parry and Drouin, 2009).

In the context of quantitative risk analysis, it is common to distinguish between "aleatory" and "epistemic" uncertainties (Apostolakis, 1990; Helton and Oberkampf, 2004; USNRC, 2009). The former refers to phenomena occurring in a random way: probabilistic modeling offers a sound and efficient way to describe such occurrences. The latter captures the analyst's confidence in the model by quantifying the degree of belief of the analysts on how well it represents the actual system. It is typically expressed as subjective probabilities of the parameters of the probability models. It is also referred to as
state-of-knowledge or subjective uncertainty and can be reduced by gathering information and data to improve the knowledge of the system behavior.

The different aspects/factors/causes of uncertainty may be understood, and consequently treated, in different ways. For example, in the view of subjective Bayesian probability, there is only one type of uncertainty, stemming from lack of knowledge about unknown quantities (including events), and subjective probabilities are used to express this uncertainty conditional on the available knowledge of the assessor. These assigned probabilities need to be seen in relation to the approximations made and the properties (including the limitations and weaknesses) of the assignment tool (abundance of information, conflicting nature of pieces of information/data, subjective interpretations of information). Measurement errors are usually taken into account using an appropriate probability model. Ambiguous information is part of the background knowledge that the subjective probabilities are based on. In general, a Bayesian analyst would seek to express knowledge and judgments through probabilities and try to reduce ambiguity.

However, in certain situations the probability-based approaches are being challenged. This is due to the belief that there are situations for which the scarce available knowledge that the probability assignments are based on would make the corresponding numerical values meaningless, if not misleading. It is true that, adopting the subjective probability approach, probabilities can always be assigned, but the information basis supporting the assignments is not reflected by the numbers produced. One may assign a low probability of health problems occurring as a result of some new chemicals, but these probabilities could produce poor predictions of the actual number of people that experience such problems. Or one may assign a probability of fatalities occurring on an offshore installation based on the assumption that the installation structure will withstand a certain accidental load; in real life the structure could, however, fail at a lower load level: the assigned probability did not reflect this uncertainty.

In the context of the present work, the argument put forward is that the quantity and type of information available on the future scenarios of development and operation of smart grids does not provide a sufficiently strong basis for specific probability assignments, as the uncertainties related to the occurrence of the relevant events, the associated consequences and their modeling are too large. In this scenario, it is doubtful that the many stakeholders involved in the problem can be satisfied with a precise probability-based assessment of the situation for decision-making, which may require a broader description inclusive of all the social, political and financial aspects, as well as the technical and environmental ones.

As an exemplary case, we mention wind generation, which has experienced enormous growth in recent years, with an increasing impact on power system dynamic performance (Van Huylle, 2005; Wiser and Bolinger, 2007), but whose inclusion in the analysis of dynamic behavior and generating capacity adequacy is difficult because many characteristic parameters are not well known. The reasons for this are varied (Rose and Hiskens, 2008): non-disclosure of information by manufacturers, in protection of their intellectual property (lack of information); disappearance of some manufacturers with loss of information, whereas their manufactured turbines continue to operate (lack of information); complexity of manufacturers' models, which are difficult to scale down to the level of simplification needed for the studies of generating capacity adequacy and grid stability (approximation); and others. Obviously, not all parameter values must be known with the same accuracy, as their influence on the system dynamics may change depending also on
the type of disturbances. In this respect, sensitivity analysis helps in identifying the relevant parameters, which then need to be estimated from measurements, while uncertainty analysis becomes essential in propagating the uncertainty associated with the estimation (measurement errors). Adequate sensitivity and uncertainty analyses should support the selection of the wind turbines best suited for a particular application site, in order to obtain the maximum capacity benefit with the required reliability level (Billinton and Chen, 1997).
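A minimal sketch of the propagation y = f(x) discussed in this section, applied to the wind generation example, is given below. It assumes, purely for illustration, a Weibull model for the hour-to-hour wind variability (aleatory), treats the poorly known Weibull parameters as epistemic uncertainties sampled over invented intervals, and propagates them through a simplified power curve. Repeating the exercise with one parameter frozen at a time gives a crude sensitivity indication of which parameters most deserve estimation from measurements.

```python
# Illustrative Monte Carlo propagation of mixed aleatory/epistemic uncertainty
# through a simplified wind-turbine power curve; all numbers are invented.
import random
import statistics

def power_kw(v, rated_kw=2000.0, v_cut_in=3.5, v_rated=12.0, v_cut_out=25.0):
    """Very simplified power curve of a single turbine."""
    if v < v_cut_in or v > v_cut_out:
        return 0.0
    if v >= v_rated:
        return rated_kw
    return rated_kw * ((v - v_cut_in) / (v_rated - v_cut_in)) ** 3

random.seed(0)
outputs = []
for _ in range(10_000):
    shape = random.uniform(1.8, 2.2)          # epistemic: Weibull shape not well known
    scale = random.uniform(7.0, 9.0)          # epistemic: site wind resource poorly characterized
    v = random.weibullvariate(scale, shape)   # aleatory: wind-speed variability
    outputs.append(power_kw(v))

print("mean output [kW]:", round(statistics.mean(outputs), 1))
print("standard deviation [kW]:", round(statistics.stdev(outputs), 1))
```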

4. Uncertainties in the future development and operation of electric power grids

Clear statements have been given on the potential benefits of decentralized energy systems including significant shares of renewable sources, while recognizing the technological challenge (Buchholz et al., 2006; SmartGrids, 2006; Chebbo, 2007; Czisch and Giebel, 2007; ABB, 2008; Battaglini et al., 2008a, 2008b). Actual development is also a policy and financial problem (Barras and Gordon, 2008). In this respect, a number of issues are open and need to be addressed from the perspective of investors, decision- and policy-makers.

One concern refers to the sheer magnitude of the investments associated with the expansion of electrical power grids into distributed smart grids of conceivable sizes. This would require highly intensive capital investments in the range of billions of Euros, with significant economies of scale. Also, as usual, the risk of project cost overruns during construction and of project delays is real, with additional technological uncertainties due to obsolescence before the investment can turn into net profit. Prediction of the financial scenario is quite challenging, given the uncertainties in the technology and policy developments over time. Undoubtedly, financial uncertainties influence the profitability of even the seemingly most attractive smart grid projects. Competitiveness and political uncertainties play a role, which must be given due account when studying the future development and operation of electric power grids. Interestingly enough, the inter- and intra-dependencies of the system are such that its successful deployment rests on the development of all generation, utilization, control and telecommunication technologies involved, and the related breakthroughs needed.

The uncertainty in the fossil fuel price is also relevant in defining the future financial scenarios for electricity production. Electricity production by fossil fuels incurs high variability and is vulnerable to the hazard of oil-price shocks. This results in volatility of the operating costs, which is charged on the electricity prices. In this sense, renewable energy, at high fixed but low variable costs, can provide price stability and a good safety margin against fuel price volatility.

National politics uncertainties need to be addressed, to the extent possible. The risk that a government nationalizes power production is a deterring factor, due to the possible loss of the invested capital. Also, local political obstacles to transmission are a concern: obtaining rights of way to build long-distance transmission lines crossing several national and local jurisdictions is not trivial, and the legal boundaries of the issues are not clearly set yet. At a more global level, political issues related to terrorist actions against elements of the electric power grid (generation plants, power lines, etc.) are an uncertain concern of risk which cannot be neglected, and whose cost for protection must be accounted for. In all, questions on the reliability and security of electricity production or import, and of transmission, must be addressed in a multi-scale political setting, to guarantee electricity supply.

Energy and environmental policy uncertainties are at present quite large. Long-term climate targets are somewhat defined but
the effects on the future energy policy and the associated incorporation of environmental costs into electricity prices are not known. For this reason, in the short term, the development of projects involving renewable technologies will need to rely on public support schemes of an amount which can be variable.

In the future development and operation of electric power grids, demand and supply uncertainty will play a crucial role, the latter depending on the former, which is the true driver of the system. On a long-term perspective of several years, electricity demand is expected to grow significantly around the world; uncertain forecasts of the future energy needs will dictate the margins of additional resources required to satisfy them. This perspective is fundamental for the timing of new capacity installation, and accounting for the associated uncertainty is crucial for proper capacity management. On the other hand, on a short-term perspective of a few hours/days, the demand of electricity is rather predictable and proper arrangements can be made to meet the anticipated demand levels. Yet, unexpected peak loads may occur, and the anticipation of such shock events may be highly uncertain; electrical power grids must be designed in such a way as to be protected also from these events, to avoid the demand exceeding the available capacity, with consequent failure to deliver the required amount of electricity (Stalon, 1997).

The uncertainties of operation of a liberalized electricity market also affect the overall picture. In a competitive environment, it is difficult for electricity producers to charge consumers with any increase in cost which is not directly related to the industry or to the specific production method used.

Overall, the decision to build and operate a smart electrical power grid will be influenced by a number of uncertain parameters pertaining to technological, environmental, economic, social and political factors. The uncertainties must be identified, represented in adequate mathematical forms, quantified in the system assessment and communicated to the involved stakeholders in an intelligible way, so as to allow rational decision-making. In this variety of sources and types of uncertainties, challenges are foreseen for uncertainty analysts in devising the proper framework of uncertainty representation and the tools for uncertainty propagation in the system models.
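As a sketch of one of the uncertainties listed above, namely the possibility that demand exceeds the available capacity, the following toy Monte Carlo adequacy check estimates a probability of unserved peak demand. The generation fleet, the unit unavailability and the demand distribution are all invented for illustration.

```python
# Toy capacity-adequacy check under uncertain peak demand and unit availability;
# all numbers are invented.
import random

random.seed(1)
UNITS_MW = [400.0] * 6 + [150.0] * 4   # hypothetical generation fleet
UNAVAILABILITY = 0.08                  # chance that a unit is out at peak time

def shortfall_probability(n_samples=50_000):
    shortfalls = 0
    for _ in range(n_samples):
        capacity = sum(u for u in UNITS_MW if random.random() > UNAVAILABILITY)
        demand = random.gauss(2400.0, 200.0)   # uncertain peak demand [MW]
        shortfalls += demand > capacity
    return shortfalls / n_samples

print("estimated probability of unserved peak demand:", shortfall_probability())
```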

5. Extended approaches to uncertainty analysis

As the events of interest in the analysis of the future development and operation of smart grids are extremely unlikely, and the knowledge of their modalities of occurrence and consequences very poor, the uncertainties may be too large to justify a purely probabilistic treatment for all aspects/factors/causes. The innovative character of the technologies and policies involved, and the variety of hazards and threats to which the system is exposed (e.g. malevolent structural and cyber attacks, natural events, etc.), lead to unique characteristics of the events and phenomena shaping the scenarios, so that attempting to refer to probability models representative of fictional populations of non-existing similar situations seems difficult to justify. For example, it seems senseless to define a frequentist probability of a terrorist attack on a node of an electric power grid (Aven and Renn, 2009). Thus, for the 'unique and rare' context of smart grids, suitable probability models may be difficult to define. Subjective (knowledge-based, judgmental) probabilities, on the other hand, can always be assigned, but in situations like this they would lack a rigorous basis.

The foundational problems of the probability-based assessment of complex systems have been addressed in the literature from the conceptual point of view and from the side of the multiple stakeholders (Aven and Zio, 2011). Reid (1992) argues that there is a common tendency of underestimation of the uncertainties. The
disguised subjectivity is potentially dangerous and open to abuse if it is not recognized. According to Stirling (2007), abuse of the probabilistic treatment when strong knowledge about the probabilities and outcomes does not exist is irrational, unscientific and potentially misleading. Tickner and Kriebel (2006) stress the tendency of decision makers and agencies not to talk about the uncertainties underlying the outcome numbers. Acknowledging uncertainty can weaken the authority of the decision-maker and agency, by creating an image of being unknowledgeable. Precise numbers are used as a facade to cover up what are often political decisions. Renn (1998) summarizes the critique drawn from the social sciences over many years and concludes that technical, quantitative analyses represent a narrow framework that should not be the single criterion for identification, evaluation and protection management of risks and vulnerabilities of complex systems.

Given the difficulties of probability frameworks in handling the peculiarities of complex systems like the electrical power grid, alternative approaches have been suggested for representing and describing uncertainties. For a review, see Helton and Oberkampf (2004). In probability bound analysis (Ferson and Ginzburg, 1996), interval analysis is used for those components whose aleatory uncertainties cannot be accurately estimated; for the other components, traditional probabilistic analysis is carried out. However, this often results in very wide intervals, and the approach has been criticized for not providing the decision-maker with specific analyst and expert judgments about epistemic uncertainties (Aven, 2010b). Other frameworks, like imprecise probability (after Walley (1991) and the robust Bayes statistics area (Berger, 1994)), random sets (in the two forms proposed by Dempster (1967) and Shafer (1976)) and possibility theory (which is formally a special case of the imprecise probability and random set theories (Dubois and Prade, 1992; Dubois, 2006)), allow for the incorporation and representation of incomplete information (some theoretical details concerning imprecise probabilities are provided in Appendix A at the end of the paper; for further knowledge, the interested reader should proceed to the specialized literature, possibly starting from the overview and bibliography in Aven and Zio (2011)). Their motivation is to be able to treat situations where there is more information than an interval, but less than a single specific probability distribution would imply. The theories produce epistemic-based uncertainty descriptions, and in particular probability intervals, but they have not been broadly accepted. Much effort has been made in this area, mostly with a mathematical orientation, but no convincing framework for system assessment is presently available in practice. Further research is required to make these alternatives operational. The same conclusions can be drawn also for other approaches that have recently been suggested, for example the probabilistic inference with uncertain and partial evidence developed by Groen and Mosleh (2005) as a generalization of Bayes' Theorem.

Work has also been carried out to combine different approaches, for example probabilistic analysis and possibility theory. Here the uncertainties of some parameters are represented by probability distributions and those of some other parameters by means of possibilistic distributions.
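To convey the flavor of the interval-based treatments mentioned above (probability bound analysis, imprecise probabilities), the following minimal sketch bounds the failure probability of an invented series system whose component failure probabilities are only known as intervals; since the structure function is monotone, the bounds are attained at the interval endpoints.

```python
# Interval (probability-bound) sketch for an invented three-component series system.
INTERVALS = {                       # hypothetical epistemic intervals [p_low, p_high]
    "transformer": (0.01, 0.05),
    "protection_relay": (0.002, 0.02),
    "communication_link": (0.005, 0.10),
}

def series_failure_probability(ps):
    """Series system: it fails if any component fails (independence assumed)."""
    survival = 1.0
    for p in ps:
        survival *= (1.0 - p)
    return 1.0 - survival

lower = series_failure_probability(p_low for p_low, _ in INTERVALS.values())
upper = series_failure_probability(p_high for _, p_high in INTERVALS.values())
print(f"system failure probability lies in [{lower:.4f}, {upper:.4f}]")
```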
An integrated computational framework has been proposed for jointly propagating the probabilistic and possibilistic uncertainties (Baudrit et al., 2006). This framework has previously been tailored to event tree analysis (Baraldi and Zio, 2008) and fault tree analysis (Flage et al., 2009), allowing for the uncertainties about event probabilities (chances) to be represented and propagated using both probability and possibility distributions. The work has been extended in Flage et al. (2010) by comparing the
results of the hybrid approach with those obtained by purely probabilistic and purely possibilistic approaches, using different probability/possibility transformations. See also Helton and Oberkampf (2004) for additional examples of such integrative work.
Within the context of the present work, it has been argued that distribution network operators are faced with different kinds of uncertainties, some of which can be appropriately modeled or described within a probabilistic framework whereas others cannot. The effect of these uncertainties can be quite significant: the generation pattern of distributed generating units changes the power flow in the lines, and this changes the active losses, which the distribution network operator is responsible for compensating. The generation pattern depends on the generating technology and also on the decisions of the operators of the distributed generating units, who are active entities different from the network operators. For wind turbine generating units, for example, this pattern is often described using a Weibull distribution for the wind speed at the site of installation and a power curve for the wind turbine; for other, controllable distributed generation technologies like gas turbines, it is not so simple to provide a probabilistic representation of the power generation schedule (a crude numerical sketch of such a hybrid treatment is given at the end of this section). On the other hand, loads are also uncertain and in many cases cannot be adequately described probabilistically (Soroudi and Ehsan, 2011; Gouveia and Matos, 2009; Matos and Gouveia, 2008).
All the approaches discussed above are quantitative. Another direction is based on a mixture of quantitative and qualitative methods. An example is the semi-quantitative approach outlined by Aven (2008a, 2008b, 2010b). Following this approach, uncertainty factors "hidden" in the background knowledge that the subjective probabilities are based on are identified and assessed in a qualitative way. The motivation for the qualitative analysis is the acknowledgment and belief that the full scope of the analysis and its uncertainties cannot be transformed into a mathematical formula, using probabilities or other measures of uncertainty. Numbers can be generated, but they alone would not serve the purpose of the assessment, which is to capture and describe the variability in system behavior with the associated risks, vulnerabilities and uncertainties, as a basis for rational decision-making.
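As anticipated above, the following is a crude numerical sketch of a hybrid probabilistic/interval treatment of distributed generation. All figures are illustrative assumptions: the Weibull parameters, the simplified power curve and the interval for the controllable gas-turbine schedule are placeholders, not data from this paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Aleatory part: wind speed described probabilistically by a Weibull distribution
shape, scale = 2.0, 8.0                      # assumed Weibull k and lambda [m/s]
wind_speed = scale * rng.weibull(shape, size=50_000)

def wind_power(v, cut_in=3.0, rated_speed=12.0, cut_out=25.0, rated_power=2.0):
    """Very simplified turbine power curve [MW]: cubic ramp between cut-in and rated speed."""
    p = np.where((v >= cut_in) & (v < rated_speed),
                 rated_power * ((v - cut_in) / (rated_speed - cut_in)) ** 3, 0.0)
    p = np.where((v >= rated_speed) & (v <= cut_out), rated_power, p)
    return p

p_wind = wind_power(wind_speed)

# Epistemic part: controllable (gas-turbine) schedule only known as an interval [MW]
p_gas_low, p_gas_high = 0.5, 1.5

# Propagation: for each Monte Carlo wind sample, total generation is an interval
total_low = p_wind + p_gas_low
total_high = p_wind + p_gas_high

# Lower/upper bounds on the probability of exceeding a given total generation
threshold = 2.0
print("P(total generation >= %.1f MW) in [%.3f, %.3f]" %
      (threshold, (total_low >= threshold).mean(), (total_high >= threshold).mean()))
```

The width of the resulting probability interval directly reflects the epistemic uncertainty on the controllable unit, kept distinct from the aleatory variability of the wind.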

6. Discussion and practical guidelines

The future "smart" electric power grid is a complex, uncertain system, whose complexity and uncertainty stem from a number of its characteristic aspects. One such relevant aspect is the context of system definition, design, development, deployment, operation, adaptation and evolution, which includes technical issues of feasibility, sustainability, etc., societal issues such as globalization and competitiveness, financial issues, political issues, etc. The functioning of the components and systems implemented will depend on the operational context in the large sense (technical, environmental, social, economic, political), whose boundaries are difficult to frame with certainty and are dynamic on the spatial and time scales considered. Given the nature of the system, setting context and system boundaries that do not exceed one's span of authority is unimaginable; on the other hand, defining broad boundaries that include virtually everything is intractable, also in view of the diversity of the multiple stakeholders that act on, control and use the system, driven by many diverse interests and objectives. Diffuse, uncertain boundaries that continually evolve to capture emerging influences, originally hidden, pose conceptual and mathematical challenges.
The uncertainties in the development and operation of future "smart" electric power grids are of technological, social, financial, political and environmental nature. They are generally large, dynamic (including emergent phenomena) and difficult to characterize. These uncertainties are reflected in the modeling of such systems, which is quite a challenging task per se, as it involves multiple hierarchical levels of different natures, multiple spatial and temporal scales, intricate, dynamic and often not completely known relationships in the network of constitutive elements, and multiple perspectives by the stakeholders involved. In these conditions of pervasive uncertainties on the behavior of the system, and of the associated vulnerabilities and risks (from both known and unknown sources), systematic assessments corroborated by proper uncertainty analyses are in order, for rational decision support.
The purely probabilistic and the extended non-probabilistic methods provide ways of performing the uncertainty analysis, but their separate and independent application is difficult to justify for a system as complex as the future "smart" electric power grid, given its poor characterization and the scarce information available. In practice, a somewhat more pragmatic approach, such as the one proposed below, may be adopted in order to say something about the relevance of the uncertainties on the system, for informing development and/or operation decisions. This approach is based on systems engineering good practice for identifying and characterizing "drivers", "limiters" and "effectors" on the multiple levels and scales of the system and its models, and it relies on probability intervals, in terms of both standard prediction intervals and imprecise probabilities.
At the outset, the context of the analysis must be clearly and unarguably defined, in terms of the aims with which one is looking at the system. This drives the identification of the multiple stakeholders and of their objectives associated with the system's development, deployment and operation. From this phase, corresponding sets should emerge (as known and/or dynamically predicted at the time of analysis) of:
– observable targets (the "drivers"), e.g. costs (D1) and profitability (D2), but also targets that are difficult to quantify, e.g. related to sustainability (D3), resilience (D4), adaptability (D5), etc.;
– constraints (the "limiters"), e.g. technical limitations in technological deployments and advancements (L1), resources availability (L2), etc.;
– influencing events and phenomena (the "effectors"), e.g. technical ones like failures (E1), environmental ones like meteorological conditions for some types of generation from renewables (E2), economic ones like the price of fuel (E3), societal ones like the evolution of user behavior (E4), but also terrorist attacks (E5).
The elements of analysis so identified need to enter a modeling architecture capable of underpinning the representation of the physical, environmental, economic and social phenomena of interest. The aim here is to provide a crude structure for the scientific explanation of the phenomena of interest, in terms of conceptual modeling frameworks at the different levels needed and at the different depths of detail allowed by the information, knowledge and understanding available. The structure is supported by the links of relationship {R} among the drivers {D}, limiters {L} and effectors {E} defining the system.
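By way of illustration only, the sets and relationships introduced above might be organized as in the following Python sketch. The relationship map and the probability intervals attached to the effectors are hypothetical placeholders, not assessments made in this paper.

```python
from dataclasses import dataclass

@dataclass
class Element:
    code: str                            # e.g. "D1", "L2", "E5"
    description: str
    prob_interval: tuple | None = None   # imprecise probability [lower, upper], if assessed

drivers = [Element("D1", "costs"), Element("D2", "profitability"),
           Element("D3", "sustainability"), Element("D4", "resilience"),
           Element("D5", "adaptability")]

limiters = [Element("L1", "technical limitations of deployments and advancements"),
            Element("L2", "resources availability")]

effectors = [Element("E1", "technical failures", (0.10, 0.50)),          # placeholder interval
             Element("E2", "meteorological conditions", (0.30, 0.70)),   # placeholder interval
             Element("E3", "fuel price shocks", (0.05, 0.40)),           # placeholder interval
             Element("E4", "user behavioral evolution", (0.20, 0.60)),   # placeholder interval
             Element("E5", "terrorist attack", (0.00, 0.01))]            # "unlikely: < 1%"

# Relationship map {R}: which limiters/effectors influence which drivers (illustrative)
relationships = {"D1": ["L1", "E1", "E3"],
                 "D2": ["L2", "E3", "E4"],
                 "D4": ["E1", "E2", "E5"]}
```

In a real assessment the relationship map {R} would be derived from the transdisciplinary models discussed below, rather than stated by hand.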
The overall architecture ({D}, {L}, {E}, {R}) is intended to represent the system at a macro-level, and the modeling architecture it supports must be capable of describing and representing distinguishing characteristics of behavior emergence and adaptability, robustness and resilience, flexibility and agility, and evolvability.
The need here is to understand what can be fixed and what must be left changeable (and how), in terms of the structure of the relationships, the dynamics of the constraints and the control actions, etc. Given the diverse nature of the (dynamic) aspects at play, including technical and environmental phenomena, financial and political issues, and human and social behaviors, transdisciplinary modeling is required, supported by a proper common taxonomy providing the necessary ontological basis of discussion.
The model must be evaluated prior and subsequent to development and deployment, relative to the drivers and to the intended uses by the multiple stakeholders. This must be complemented by an assessment of the uncertainties in all elements of the modeling architecture ({D}, {L}, {E}, {R}) and in the scenarios that its evaluation provides. For the uncertainty analysis, we suspect that the information on the model constitutive elements {L}, {E}, {R} mainly allows for the use of imprecise knowledge-based/judgmental probabilities to assess the uncertainties involved; e.g. for the event of a terrorist attack (E5), intervals such as: unlikely, <1%; moderately likely, 1–10%; likely, >10%; very likely, >50%, where the assessor is not willing/able to provide more accurate judgments based on the available knowledge. These assessments are conditional on the different constraints (Ls) and on other influencing events and phenomena (Es), which should be specified to the extent possible, based on the modeling architecture and the relationships ({R}) it represents. Uncertainties on the constraints (Ls) and on the influencing events and phenomena (Es) should also be assessed, for example in terms of probability intervals, e.g. how probable a specific development in society is, whether a given type of technology will be developed, etc. As most uncertainty assignments are expected to be subjective, it is important to identify and explicitly state which aspects/causes/factors in the background knowledge could strongly change the assignments, and to assess their importance by some crude categorization. This will allow control of the assignments and their adaptation to new information and knowledge.
Once assessed, the uncertainties in {L}, {E}, {R} must be propagated through the model architecture to predict and assess the uncertainties on the targets {D}, in terms of conditional probabilities; e.g. P(a ≤ D1 ≤ b | L, E) = 0.90 expresses that the analyst is 90% confident, with respect to his/her knowledge-based/judgmental probability assessment, that the value of the cost driver D1 (for example) lies in the interval of values [a, b], given a set of constraints L and influencing events E (a crude numerical illustration is sketched after this paragraph). The mathematical and computational challenges of propagating the uncertainty representations within an integrated framework of uncertainty analysis still stand.
The results of the model evaluation with the uncertainty assessment are then given in terms of intervals of values of all elements of the modeling architecture ({D}, {L}, {E}, {R}), with associated levels of "confidence" for quantities on the real line. These results should be provided in an intelligible form to the multiple stakeholders involved in the system development, deployment and operation, for rational decision-making.
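The conditional confidence statement above can be illustrated by a crude numerical sketch. The cost model, the monetary figures and the interval probability assigned to the effector are all hypothetical assumptions introduced for illustration only.

```python
# Hypothetical cost model: D1 equals a base cost, plus a penalty if effector E1 (failure) occurs.
base_cost, penalty = 100.0, 40.0          # assumed monetary units
a, b = 90.0, 120.0                        # target interval for the cost driver D1

p_e1_low, p_e1_high = 0.10, 0.50          # imprecise probability assigned to E1 (placeholder)

def prob_d1_in_interval(p_e1):
    """P(a <= D1 <= b) for a given precise probability of E1."""
    p = 0.0
    if a <= base_cost <= b:               # scenario: E1 does not occur
        p += 1.0 - p_e1
    if a <= base_cost + penalty <= b:     # scenario: E1 occurs
        p += p_e1
    return p

bounds = [prob_d1_in_interval(p) for p in (p_e1_low, p_e1_high)]
print("P(%.0f <= D1 <= %.0f | L, E) in [%.2f, %.2f]"
      % (a, b, min(bounds), max(bounds)))
```

With these numbers the result is the interval [0.50, 0.90]. The bounds are obtained by evaluating the model at the endpoints of the effector's probability interval, which suffices here because the assumed model is monotone in that probability; in general the propagation is computationally far more demanding, as noted above.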

7. Conclusions

We have looked into the development and operation of future smart grids for electric power production and distribution. By way of summarizing the challenges ahead for their development and the conditions for their operation, we have identified a number of uncertainties which are expected to play an influential role. Challenges and conditions are of technological, environmental, financial, social and political nature; the associated uncertainties are quite diverse in nature, as are the information and knowledge available to describe them.
This calls for appropriate methods of uncertainty representation in the analysis of the system, which go beyond a purely precise probabilistic approach. In this view, we have recalled some alternative methods of uncertainty representation. Concrete pilot case studies are needed to identify the proper method of representation for each source and type of uncertainty. Then, proper methods for the integrated propagation of these uncertainties must be devised, together with clear ways of interpreting the outcomes of the uncertainty analysis for use by the decision makers involved. We have provided some guidelines to address these issues, based on probability intervals (covering both standard prediction intervals and imprecise probabilities). Alternative approaches, for example those based on possibility theory, should also be investigated. Smart grids, with all their uncertainties, remain an exciting technological challenge, as well as a modeling and analysis challenge.

Acknowledgments

The work of E. Zio has been partially funded by the "Fondation pour une Culture de Sécurité Industrielle" of Toulouse, France, under the Research Contract AO2009-04. The work of T. Aven has been funded by The Research Council of Norway through the SAMRISK and PETROMAKS research programmes. The financial support is gratefully acknowledged. The authors are indebted to the anonymous referees for their valuable comments, which have enabled significant improvements to the paper.

Appendix A. Imprecise (interval) probability (based on Aven (2010a))

The theory of imprecise (interval) probability generalizes probability by using an interval [P̲(A), P̄(A)] to represent the uncertainty about an event A, with lower probability P̲(A) and upper probability P̄(A), where 0 ≤ P̲(A) ≤ P̄(A) ≤ 1. The imprecision in the representation of the event A is defined by

ΔP(A) = P̄(A) − P̲(A).

Peter M. Williams developed a mathematical foundation for imprecise probabilities, based on de Finetti's (1974) betting interpretation of probability. This foundation was further developed independently by Vladimir P. Kuznetsov and Peter Walley (the former published only in Russian); see Kuznetsov (1991) and Walley (1991). Following de Finetti's betting interpretation, the lower probability is interpreted as the maximum price for which one would be willing to buy a bet which pays 1 if A occurs and 0 if not, and the upper probability as the minimum price for which one would be willing to sell the same bet. If the upper and lower probabilities are equal, we are led back to precise probabilities. These references, and Walley (1991) in particular, provide an in-depth analysis of imprecise probabilities and their interpretations, with a link to applications in probabilistic reasoning, statistical inference and decisions. It is, however, also possible to interpret the lower and upper probabilities with reference to an uncertainty standard. Such an interpretation is indicated by Lindley (2006), p. 36. Consider the subjective probability P(A) and say that the analyst states that his/her assigned degree of belief is greater than the urn chance of 0.10 (the degree of belief of drawing one particular ball out of an urn containing 10 balls) and less than the urn chance of 0.5. The analyst is not willing to make any further judgments. Then, the interval [0.1, 0.5] can be considered an imprecision interval for the probability P(A).
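A simple way of arriving at lower and upper probabilities in practice is to take the envelope of a set of candidate probability distributions (a credal set). The following sketch uses a hypothetical three-outcome example, not taken from the paper; the credal-set construction shown is one common route to such bounds, consistent with, though not identical to, the betting interpretation described above.

```python
# Lower and upper probabilities of an event A as the envelope of a small credal set
# of candidate probability mass functions over three outcomes {o1, o2, o3}.
candidate_pmfs = [
    {"o1": 0.10, "o2": 0.20, "o3": 0.70},
    {"o1": 0.25, "o2": 0.15, "o3": 0.60},
    {"o1": 0.05, "o2": 0.40, "o3": 0.55},
]
event_A = {"o1", "o2"}

probs_of_A = [sum(pmf[o] for o in event_A) for pmf in candidate_pmfs]
p_lower, p_upper = min(probs_of_A), max(probs_of_A)

print(f"lower P(A) = {p_lower:.2f}, upper P(A) = {p_upper:.2f}, "
      f"imprecision = {p_upper - p_lower:.2f}")
```

With the numbers above the envelope is [0.30, 0.45]; the betting interpretation gives these two bounds their operational meaning as buying and selling prices.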
Of course, even if the assessor assigns a probability P(A) = 0.3, one may interpret this probability as having an imprecision interval [0.26, 0.34] (as any number in this interval is equal to 0.3 when displaying one digit only), interpreted analogously to the [0.1, 0.5] interval. Hence imprecision is always an issue in a practical uncertainty analysis context. This imprecision is commonly viewed as a result of measurement problems (refer to item (iii) in the criteria list presented in Section 1). Lindley (2006) argues that the use of interval probabilities confuses the concept of interpretation with the measurement procedures (in his terminology, the concept of measurement with the practice of measurement). The reference to the urn lottery provides a norm, and measurement problems may make the assessor unable to behave according to it. See also the discussion in Bernardo and Smith (1994), p. 32. However, other researchers and analysts have a more positive view on the need for such intervals; see the discussions in Ferson and Ginzburg (1996) and Aven and Zio (2011): imprecision intervals are required to reflect phenomena such as those discussed above, for example when experts are not willing to express their knowledge more precisely than by probability intervals.
Imprecise probabilities are also linked to the relative frequency interpretation of probability (Coolen and Utkin, 2007). The simplest case reflects that the "true" frequentist probability p is in the interval [P̲(A), P̄(A)] with certainty. More generally, and in line with the above interpretations of imprecision intervals based on subjective probabilities P, a two-level uncertainty characterization can be formulated (see e.g. Kozine and Utkin, 2002): [P̲(A), P̄(A)] is an imprecision interval for the subjective probability P(a ≤ p ≤ b), where a and b are constants. In the special case that P̲(A) = P̄(A) (= q, say), we are led to a q × 100% credibility interval for p (i.e. with subjective probability q, the true value of p lies in the interval [a, b]).
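As a hypothetical numerical illustration of this two-level characterization (the Beta distribution used to express the analyst's knowledge about p is an assumption made purely for the sake of the example):

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed state of knowledge about the unknown frequentist probability p
p_samples = rng.beta(a=2.0, b=8.0, size=100_000)

# 90% credibility interval [a, b]: P(a <= p <= b) = 0.90 in the analyst's judgment
a, b = np.quantile(p_samples, [0.05, 0.95])
print(f"90% credibility interval for p: [{a:.3f}, {b:.3f}]")
```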

References

ABB, 2008. The Corporate Technical Journal of the ABB Group. ABB, Zurich, www.abb.com/abbreview.
Amin, M., Stringer, J., 2008. The electric power grid: today and tomorrow. MRS Bulletin 33, 399–406.
Apostolakis, G., 1990. The concept of probability in safety assessments of technological systems. Science 250, 1359–1364.
Armacost, R., Pet-Edwards, J., 1999. Integrative risk and uncertainty analysis for complex public sector operational systems. Socio-Economic Planning Sciences 33, 105–130.
Aven, T., 2003. Foundations of Risk Analysis. Wiley, NJ.
Aven, T., 2008a. Risk Analysis. Wiley, NJ.
Aven, T., 2008b. A semi-quantitative approach to risk analysis, as an alternative to QRAs. Reliability Engineering and System Safety 93, 768–775.
Aven, T., 2010a. Some reflections on uncertainty analysis and management. Reliability Engineering and System Safety 95, 195–201.
Aven, T., 2010b. On how to define, understand and describe risk. Reliability Engineering and System Safety 95 (6), 623–631.
Aven, T., Renn, O., 2009. The role of quantitative risk assessments for characterizing risk and uncertainty and delineating appropriate risk management options, with special emphasis on terrorism risk. Risk Analysis 29, 587–600.
Aven, T., Zio, E., 2011. Treatment of uncertainties in risk assessment for practical decision making. Reliability Engineering and System Safety 96, 64–74.
Baraldi, P., Zio, E., 2008. A combined Monte Carlo and possibilistic approach to uncertainty propagation in event tree analysis. Risk Analysis 28 (5), 1309–1325.
Barras, L., Gordon, R., 2008. Stakeholder Perceptions of a SuperSmart Grid. Preliminary Results of the Survey. Potsdam Institute for Climate Impact Research, European Climate Forum, Potsdam.
Baudrit, C., Dubois, D., Guyonnet, D., 2006. Joint propagation of probabilistic and possibilistic information in risk assessment. IEEE Transactions on Fuzzy Systems 14, 593–608.
Battaglini, A., Lilliestam, J., Bals, C., Haas, A., 2008a. The SuperSmart Grid. European Climate Forum.
Battaglini, A., Lilliestam, J., Haas, A., Patt, A., 2008b. Development of supersmart grids for a more efficient utilization of electricity from renewable sources. Journal of Cleaner Production 17, 911–918.
Berger, J., 1994. An overview of robust Bayesian analysis. Test 3, 5–124.
Bernardo, J.M., Smith, A.F.M., 1994. Bayesian Theory. Wiley, Chichester.
Billinton, R., Chen, H., 1997. Effect of wind turbine parameters on the capacity adequacy of generating systems using wind energy. In: Proceedings of the Conference on Communications, Power and Computing, WESCANEX'97, Winnipeg, MB, pp. 47–52.
Buchholz, B., Engler, A., Hatziargyriou, N., Scholtes, Schluecking, U., Furones Fartos, I., 2006. Lessons learned: European pilot installations for distributed generation. An Overview by the IRED Cluster, CIGRE, paper C6-302.
Chebbo, M., 2007. Electricity networks of the future 2020 and beyond. In: Proceedings of the Power Engineering Society General Meeting, IEEE.
Coolen, F.P.A., Utkin, L.V., 2007. Imprecise probability: a concise overview. In: Aven, T., Vinnem, J.E. (Eds.), Risk, Reliability and Societal Safety: Proceedings of the European Safety and Reliability Conference (ESREL), Stavanger, Norway. Taylor & Francis, London, pp. 1959–1965.
Czisch, G., Giebel, G., 2007. Realisable Scenarios for a Future Electricity Supply Based 100% on Renewable Energies. Universität Kassel, Risø National Laboratory, Kassel, Roskilde.
de Finetti, B., 1974. Theory of Probability. Wiley, New York.
Dempster, A.P., 1967. Upper and lower probabilities induced by a multivalued mapping. Annals of Mathematical Statistics 38, 325–339.
Dubois, D., Prade, H., 1992. When upper probabilities are possibility measures. Fuzzy Sets and Systems 49 (1), 65–74.
Dubois, D., 2006. Possibility theory and statistical reasoning. Computational Statistics and Data Analysis 51, 47–69.
Dubois, D., 2010. Representation, propagation and decision issues in risk analysis under incomplete probabilistic information. Risk Analysis 30, 361–368.
ETH-LSA, 2009. ETH Laboratory for Safety Analysis, Interdependencies. Report for FOCP.
Ferson, S., Ginzburg, L.R., 1996. Different methods are needed to propagate ignorance and variability. Reliability Engineering and System Safety 54, 133–144.
Flage, R., Baraldi, P., Ameruso, F., Zio, E., Aven, T., 2009. Handling epistemic uncertainties in fault tree analysis by probabilistic and possibilistic approaches. In: Bris, R., Guedes Soares, C., Martorell, S. (Eds.), Reliability, Risk and Safety: Theory and Applications, Proceedings of the European Safety and Reliability Conference 2009 (ESREL 2009), Prague, Czech Republic. CRC Press, London, pp. 1761–1768.
Flage, R., Baraldi, P., Zio, E., Aven, T., 2010. Possibility–probability transformation in comparing different approaches to the treatment of epistemic uncertainties in a fault tree analysis. In: Proceedings of the European Safety and Reliability Conference 2010 (ESREL 2010), Rhodes, Greece, 5–9 September 2010.
Gouveia, E.M., Matos, M.A., 2009. Symmetric ac fuzzy power flow model. European Journal of Operational Research 197 (3), 1012–1018.
Groen, F.J., Mosleh, A., 2005. Foundations of probabilistic inference with uncertain evidence. International Journal of Approximate Reasoning 39, 49–83.
Helton, J.C., Oberkampf, W.L. (Eds.), 2004. Alternative representations of epistemic uncertainty. Special issue of Reliability Engineering and System Safety 85 (1–3), 1–369.
Helton, J.C., Johnson, J.D., Sallaberry, C.J., Storlie, C.B., 2006. Survey of sampling-based methods for uncertainty and sensitivity analysis. Reliability Engineering and System Safety 91, 1175–1209.
Kahneman, D., Tversky, A., 1974. Judgment under uncertainty: heuristics and biases. Science 185, 1124–1131.
Kozine, I., Utkin, L., 2002. Processing unreliable judgements with an imprecise hierarchical model. Risk Decision and Policy 7, 1–15.
Kuznetsov, V.P., 1991. Interval Statistical Models. Radio i Svyaz, Moscow (in Russian).
Kröger, W., Zio, E., 2010. In: Proceedings of the European Safety and Reliability Conference ESREL 2010, Rhodes, Greece.
Kröger, W., Zio, E., 2011. Vulnerable Systems. Springer.
Lindley, D.V., 2006. Understanding Uncertainty. Wiley, Hoboken, NJ.
Matos, M., Gouveia, E., 2008. The fuzzy power flow revisited. IEEE Transactions on Power Systems 23 (1), 213–218.
Parry, G., Drouin, M.T., 2009. Risk-informed regulatory decision-making at the US NRC: dealing with model uncertainty. US Nuclear Regulatory Commission.
Reid, S.G., 1992. Acceptable risk. In: Blockley, D.I. (Ed.), Engineering Safety. McGraw-Hill, New York, pp. 138–166.
Renn, O., 1998. Three decades of risk research: accomplishments and new challenges. Journal of Risk Research 1 (1), 49–71.
Rose, J., Hiskens, I.A., 2008. Estimating wind turbine parameters and quantifying their effects on dynamic behaviour. In: Proceedings of the 2008 American Control Conference, Seattle, WA.
Shafer, G., 1976. A Mathematical Theory of Evidence. Princeton University Press, Princeton.
SmartGrids, 2006. European SmartGrids Technology Platform. Vision and Strategy for Europe's Electricity Networks of the Future. DG Research, European Commission, Brussels.
Soroudi, A., Ehsan, M., 2011. A possibilistic–probabilistic tool for evaluating the impact of stochastic renewable and controllable power generation on energy losses in distribution networks—a case study. Renewable and Sustainable Energy Reviews 15, 794–800.
Stalon, C.G., 1997. Electric industry governance: reconciling competitive power markets and the physics of complex transmission interconnections. Resource and Energy Economics 19, 47–83.
Stirling, A., 2007. Science, precaution and risk assessment: towards more measured and constructive policy debate. European Molecular Biology Organisation Reports 8, 309–315.
Tickner, J., Kriebel, D., 2006. The role of science and precaution in environmental and public health policy. In: Fisher, E., Jones, J., von Schomberg, R. (Eds.), Implementing the Precautionary Principle. Edward Elgar Publishing, Northampton, MA, USA.
UCTE, 2008. Union for the Co-ordination of Transmission of Electricity. Operational Handbook. www.ucte.org (page visited in 2008).
US–Canada Power System Outage Task Force, 2004. Final Report on the August 2003 Blackout in the United States and Canada: Causes and Recommendations.
USNRC, 2009. Guidance on the Treatment of Uncertainties Associated with PRAs in Risk-Informed Decision Making. NUREG-1855. US Nuclear Regulatory Commission, Washington, DC.
Van Hulle, F., 2005. Large Scale Integration of Wind Energy in the European Power Supply: Analysis, Issues and Recommendations. European Wind Energy Association (EWEA).
Walley, P., 1991. Statistical Reasoning with Imprecise Probabilities. Chapman and Hall, New York.
Wiser, R., Bolinger, M., 2007. Annual Report on US Wind Power Installation, Cost and Performance Trends: 2006. US Department of Energy, DOE/GO-102007-2433.
Zimmermann, H.J., 2000. An application-oriented view of modeling uncertainty. European Journal of Operational Research 122, 190–198.