Dissonance oriented stability analysis of Cyber-Physical & Human Systems

V. Jimenez*, F. Vanderhaegen**

* Universidad Camilo José Cela, Castillo de Alarcón, 49 Urb. Villafranca del Castillo, 28692 Madrid, Spain (e-mail: [email protected])
** Univ. Valenciennes, LAMIH CNRS UMR 8201, 59313 Valenciennes, France (e-mail: [email protected])

2nd IFAC Conference on Cyber-Physical & Human-Systems, Miami, FL, USA, Dec. 14-15, 2018. IFAC PapersOnLine 51-34 (2019) 230–235.

Abstract: Resilience is defined as the capacity of a system to control successfully its stable or unstable states. The states are defined by gaps between Cyber-Physical Systems (CPS) and humans or between Cyber-Physical & Human Systems (CPHS) in terms of factors such as perception, intention, belief, behavior or emotion. These gaps are interpreted as discrepancies called dissonances. The successful control of dissonances makes the CPHS resilient. The paper proposes a dissonance oriented stability analysis approach. It is based on dissonances between CPS and humans, i.e. on conflicts of perception, of intention, of behavior, or of use for instance. The impact analysis of dissonances focuses on the identification of possible positive or negative, stable or unstable consequences of dissonances. Examples about dissonances in transportation and healthcare illustrate the interest of the proposal, which is capable of detecting possible dangerous dissonances due to an excess of stability or a breakdown of stability of assessed gaps.

© 2019, IFAC (International Federation of Automatic Control) Hosting by Elsevier Ltd. All rights reserved.

Keywords: human reliability, human-centered design, safety analysis, user interfaces, behavior, CPHS

1. INTRODUCTION

Recent fatal accidents involving autonomous driving systems demonstrate the high level of interdependency between the human drivers, the Cyber-Physical Systems (CPS) of the car, and the road infrastructure configurations. These undesirable events occur after a long period of CPS use without any problem occurrence. This increases the integrity of CPS and the trust human drivers can allocate to them. However, after about two hours of manual driving, it is recommended to take a break in order to keep the human driver vigilant. As a matter of fact, there is no such recommendation about the maximum time of use of a car CPS to guarantee the capacity of the human driver to recover from any technical problem. Human drivers are supposed to be always vigilant and capable of reacting to automated system alarms at any time. Interpretation errors made by CPS such as autonomous vehicles are sometimes explained by the presence of non-compliant safety barriers on the road (Della Cava, 2018). Does it mean that, to make an autonomous car driving system dangerous, it is enough to modify the characteristics of its environment? A barrier is defined as a technical or human support to protect against the occurrence or the consequence of undesirable events such as accidents (Vanderhaegen, 2010). However, human operators are capable of removing them in order to optimize the compromise between several criteria such as safety, workload, or quantity or quality of production (Sedki et al., 2013; Yahia et al., 2015).

Such threats have to be analyzed in terms of impacts. When a CPS or humans are in a wrong environment or at a wrong time whereas they are persuaded not to err, this conflict is called a dissonance. Dissonances are threats or opportunities about individual, collective or organisational knowledge, or about elements of knowledge such as facts, attitudes, beliefs, behaviours, intentions, perceptions or emotions (Vanderhaegen, 2014).

Resilience can be defined as the system capacity to recover from unprecedented situations due to events such as dissonances (Ouedraogo et al., 2013; Enjalbert, Vanderhaegen, 2017). A lack of stability, an excess of stability or a breakdown of stability can generate hidden, latent, or patent dissonances that make the system vulnerable. Resilience relates then to the successful control of these dissonances (Vanderhaegen, 2017). The detection or the control of dissonances can produce discomfort or overload and generate new dissonances. Therefore, it is easier not to try to control a dissonance and to maintain the initial knowledge in order to reduce these negative impacts. However, a dissonance can also have positive impacts such as pleasure or well-being for instance. The impact analysis of dissonances has to take into account both positive and negative consequences (Vanderhaegen, Carsten, 2017). The results of a dissonance control may modify the CPHS knowledge by applying cumulative, merged, redundant, or hybrid reinforcement processes (Vanderhaegen, Zieba, 2014; Enjalbert, Vanderhaegen, 2017). A lack of autonomy in terms of competences, availability and possibilities of actions, or contradictions between scientific results, lead to considering dissonances as a key point for the design of future CPHS (Vanderhaegen, Jimenez, 2018).


Contributions such as (Yue et al., 2010; Wang et al., 2013; Khaitan, McCalley, 2015; Quadri et al., 2015; Sadiku et al., 2017) aim at modelling and analyzing CPHS by implementing dynamic event-based models. However, the role of humans and the impact of their behaviours on CPHS are rarely taken into account, because these contributions focus mainly on autonomous CPS used for delivering services to humans.

Considering the definition of resilience as the capacity of the CPHS to control successfully its stability and instability whatever the operational context, the paper aims at defining the bases of a support to analyse dissonance occurrence and consequences. The occurrence and the treatment of dissonances are seen as possible vulnerability scenarios for the CPHS. The possible impact of these scenarios is analyzed in terms of stable or unstable gaps and of benefits, costs and dangers. Their successful control makes the CPHS resilient. Section 2 focuses then on dissonance oriented stability and impact. Sections 3 and 4 propose examples to illustrate the interest of such an approach.

2. THEORETICAL FRAMEWORK FOR DISSONANCE ORIENTED STABILITY ANALYSIS

The identification of dissonances depends on baselines or frames of reference related to required, desired, perceived, interpreted or activated data about a situation that occurs at a given iteration. The lack of a baseline or the presence of an erroneous baseline can also be a source of dissonance (Vanderhaegen, 2017). Regarding a given baseline, gaps can occur. These gaps between the required behaviours given by a baseline and the real behaviours in terms of desire, perception, interpretation or action are potential dissonances, i.e. conflicts for the human operator (H), the CPS or the CPHS, Figure 1.

Fig. 1. Examples of gaps assessment for dissonance oriented stability analysis. The figure relates the required behaviors for the Human, Hout(i), and for the CPS, CPSout(i), to the real behaviors of the Human, Hin(i), and of the CPS, CPSin(i); the resulting gaps define the dissonance oriented stability of the CPHS at iteration i.

These gaps are noted CPS(i), H(i), CPHS_in(i) or CPHS_out(i):

CPS(i) = |CPSout(i) – CPSin(i)|    (1)

H(i) = |Hout(i) – Hin(i)|    (2)

CPHS_in(i) = |Hin(i) – CPSin(i)|    (3)

CPHS_out(i) = |Hout(i) – CPSout(i)|    (4)

An assessment of a gap can also depend on differences between expected behaviours from H or CPS and real behaviours from CPS or H respectively, or between expected and real behaviours from different CPHS (e.g., CPHS1 regarding CPHS2):

CPS/H(i) = |CPSout(i) – Hin(i)|    (5)

H/CPS(i) = |Hout(i) – CPSin(i)|    (6)

CPHS1/CPHS2(i) = |CPHS1(i) – CPHS2(i)|    (7)

Formulas (1) and (2) are behavioral dissonances between the required and real behaviors. Formula (3) refers to situational dissonances when behaviors differ to treat the same situation, and formula (4) relates to referential dissonances when baselines differ. Each value of the gaps can be compared with the value of the previous iterations or with the values of other humans, CPS or CPHS.
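As a minimal illustration of formulas (1) to (7), the following Python sketch computes the gaps once the required and real behaviours have been encoded as numbers on a common scale. The encoding itself is an assumption left open by the paper; the function names are illustrative, not part of the original framework.

```python
# Minimal sketch of the gap formulas (1)-(7); behaviours are assumed to be
# encoded as numbers on a common scale, which the paper does not prescribe.

def gap_cps(cps_out, cps_in):
    """Formula (1): gap between required and real CPS behaviours."""
    return abs(cps_out - cps_in)

def gap_h(h_out, h_in):
    """Formula (2): gap between required and real human behaviours."""
    return abs(h_out - h_in)

def gap_cphs_in(h_in, cps_in):
    """Formula (3): situational gap between real human and real CPS behaviours."""
    return abs(h_in - cps_in)

def gap_cphs_out(h_out, cps_out):
    """Formula (4): referential gap between required human and required CPS behaviours."""
    return abs(h_out - cps_out)

def gap_cps_h(cps_out, h_in):
    """Formula (5): gap between expected CPS behaviour and real human behaviour."""
    return abs(cps_out - h_in)

def gap_h_cps(h_out, cps_in):
    """Formula (6): gap between expected human behaviour and real CPS behaviour."""
    return abs(h_out - cps_in)

def gap_cphs1_cphs2(cphs1, cphs2):
    """Formula (7): gap between the assessments of two different CPHS."""
    return abs(cphs1 - cphs2)
```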

The assessment or the evolution of gaps can be done by different sources of assessment, Figure 2. Regarding the same kind of assessment (e.g., CPS or H), when gaps are equivalent, they are stable, and when they evolve over time, they are unstable. Comparing several kinds of assessment, when they are equivalent or evolve similarly, they converge, and when their assessment or their evolution differs, they diverge. These convergences or divergences can then concern the assessment at a given iteration i or the evolution of this assessment between two iterations i and j.

Fig. 2. Examples of convergent, divergent, stable and unstable gaps. The figure plots the levels, from low to high, of the CPHS(t) and H(t) gaps over iterations i and j, giving examples of convergent gaps, divergent gaps, stable gaps and unstable gaps.

The gap assessment can give positive or negative values but its assessment on Figures 1 and 2 relates to its absolute value. An extended assessment to distinguish positive and negative values consists in applying the so-called Benefit-Cost-Deficit (BCD) model regarding different criteria (Vanderhaegen, 2004). The BCD model was adapted for predicting barrier removals based on subjective data (Vanderhaegen et al., 2009), on quantitative data (Vanderhaegen et al., 2011) and on simulated data (Sedki et al., 2013) about several criteria such as workload, safety or comfort for transportation systems.


It can be adapted to the analysis of the dissonance oriented stability, depending then on the assessment of the benefits, costs and deficits or dangers, to identify beneficial or hazardous consequences of CPS, H or CPHS behaviours and to detect dissonances. It relates to the instantaneous values or to the evolution of the BCD model parameters. Regarding the comparison between two situations, the parameters B, C and D can be assessed. These gap assessments consist in comparing the consequences of a behavior b that occurs at the iteration j with those of a behavior a that occurs at the iteration i. Considering a positive function sa(k, i) that returns the impact level on the criterion k of a behavior a that occurs at the iteration i, the benefit gap noted B can be defined as follows:

B(k,i,j) = |sb(k,j) – sa(k,i)| if sb(k,j) – sa(k,i) ≤ 0, and B(k,i,j) = 0 otherwise    (8)

Related to the definition of the function sa(k, i), the lower the value of sa(k, i) is, the better it is. The cost gap noted C assesses an acceptable loss regarding a given threshold C:

C(k,i,j) = |sb(k,j) – sa(k,i)| if 0 < sb(k,j) – sa(k,i) ≤ C, and C(k,i,j) = 0 otherwise    (9)

A deficit gap noted D occurs when there is a degradation of the severity with a gap over the threshold C:

D(k,i,j) = |sb(k,j) – sa(k,i)| if sb(k,j) – sa(k,i) > C, and D(k,i,j) = 0 otherwise    (10)

It is also possible to assess a global BCD gap by integrating a probability P(b) of the failure of the behavior b. The beneficial gap B and the acceptable loss C then relate to the probability of success, whereas the unacceptable loss D relates to the probability of failure (Vanderhaegen et al., 2010):

BCD(k,i,j) = (1 – P(b)).[B(k,i,j) – C(k,i,j)] – P(b).D(k,i,j)    (11)

Another example of assessment of the BCD gap allocates transformation functions fB, fC and fD to the B, C and D parameters respectively, in order to adapt the calculation regarding human preference or experience for instance (Polet et al., 2012):

BCD(k,i,j) = fB(B(k,i,j)) – fC(C(k,i,j)) – fD(D(k,i,j))    (12)

Regarding the gaps of Figure 1, formulas (8) to (12) may be adapted to assess additional gaps and to compare the impacts of different sets of required or real behaviors of CPS, H or CPHS at the same iteration (i.e., i = j) or between two distinct iterations (i.e., i ≠ j). A matrix of the evolution of the different BCD gaps can then be built to support the analysis of dissonance oriented stability.

This generic framework dedicated to gap assessments for the dissonance oriented stability analysis process is applied to transportation and to healthcare coaching. The gap assessment is determined empirically and qualitatively on different criteria.
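Before turning to the examples, a minimal sketch of formulas (8) to (12) may help fix ideas. The impact function s is represented by two already evaluated values on a given criterion; the threshold value and the transformation functions given as defaults are illustrative assumptions, not values from the paper.

```python
# Sketch of the BCD gap assessment, formulas (8)-(12). s_a and s_b are the impact
# levels s(k, i) and s(k, j) of behaviours a and b on criterion k; the lower the
# impact, the better. threshold_c and the default transformations are assumptions.

def bcd_gaps(s_a, s_b, threshold_c):
    """Return (B, C, D) following formulas (8)-(10)."""
    delta = s_b - s_a
    benefit = abs(delta) if delta <= 0 else 0.0                 # formula (8)
    cost    = abs(delta) if 0 < delta <= threshold_c else 0.0   # formula (9)
    deficit = abs(delta) if delta > threshold_c else 0.0        # formula (10)
    return benefit, cost, deficit

def bcd_global(s_a, s_b, threshold_c, p_failure):
    """Formula (11): weight B and C by the probability of success, D by failure."""
    b, c, d = bcd_gaps(s_a, s_b, threshold_c)
    return (1.0 - p_failure) * (b - c) - p_failure * d

def bcd_weighted(s_a, s_b, threshold_c,
                 f_b=lambda x: x, f_c=lambda x: x, f_d=lambda x: x):
    """Formula (12): apply transformation functions to B, C and D (identity by default)."""
    b, c, d = bcd_gaps(s_a, s_b, threshold_c)
    return f_b(b) - f_c(c) - f_d(d)
```

Evaluating these functions for each criterion k and each pair of iterations (i, j) yields the matrix of BCD gap evolutions mentioned above.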

3. EXAMPLE ON TRANSPORTATION

The first example concerns a CPHS configuration in transportation, Figure 2. It is a vehicle that contains several CPS such as an Adaptive Cruise Control (ACC), a Navigation Support System (NSS), and driver-vehicle interaction systems that make interactions with the human driver possible.

Fig. 2. Example of CPHS configuration for transportation. The CPS (Adaptive Cruise Control, Navigation Support System, driver-vehicle interaction systems) interact with the drivers.

The dissonance oriented stability analysis process focuses here on the uses and the functioning of the NSS. The NSS gives oral and graphical information to the car drivers, who can make a parallel between the environment they perceive and the advice given by the system in terms of navigation, Figure 3.

Fig. 3. Example of dissonance of CPHS perception (adapted from Vanderhaegen, 2012). At iteration 1 (NSS: « go to the left side and turn to the left ») and iteration 2 (NSS: « turn to the left »), the benefits are an increase of human trust in the NSS use, a reduction of navigation errors, a reduction of workload and stress, and a safety improvement; the cost is the time spent understanding the system functioning; there is no danger and no deficit. At iteration 3 (NSS: « go straight »), there is no benefit and no cost, and the dangers or deficits are a degradation of human trust in the NSS use, an increase of workload and stress, and a safety degradation. Possible resilient behaviors are: send alerts to nearby vehicles, stop using the NSS, go back slowly, escape carefully from the road, or change the road infrastructure.


Gap assessment then concerns a comparison between the impacts of the CPS and of the human perception in terms of benefits, costs and deficits on criteria such as human trust, workload, stress, error, or the time to understand the system functioning, Figure 3. As the advice from the NSS and the human perception of the environment converge, the car drivers may trust the NSS during the first two iterations. At the third iteration, a dissonance occurs between the NSS advice and the human perception. This may seriously degrade several criteria such as human trust or safety.

A breakdown of stability occurs at this iteration when the drivers realize they are driving towards an incorrect direction with a risk of collision with other cars. At that time, the NSS may not perceive this incoherency and may maintain its advice because the movement of the car respects what it expects. Dangerous situations may then occur and possible resilient behaviours have to be suggested. On Figure 3, several examples of solutions are proposed to make the CPHS resilient facing this breakdown of stability.
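For illustration only, the divergence detected at the third iteration could be expressed with the convergence test sketched in Section 2. The numeric encoding of advice and perception below is hypothetical and not part of the example.

```python
# Purely illustrative encoding of the NSS scenario of Figure 3: 0 stands for an
# advice/perception consistent with the intended route, 1 for one that contradicts
# it. The values are hypothetical; the paper describes the scenario qualitatively.

nss_gap    = [0, 0, 0]   # the NSS keeps trusting its planned route
driver_gap = [0, 0, 1]   # at iteration 3 the driver perceives a wrong direction

for i, (cps, h) in enumerate(zip(nss_gap, driver_gap), start=1):
    status = "convergent" if cps == h else "divergent: possible dissonance of perception"
    print(f"iteration {i}: {status}")
```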

4. EXAMPLE ON HEALTHCARE COACHING

Below is an example associated with healthcare coaching with fitness and physiological measures, Figure 4. Several CPS such as an elliptical bike, a treadmill, an indoor bike or a rowing machine can be used for defining a daily sport programme by using a dedicated smart phone application. Other CPS connected to the users deliver information about human health on another smart phone application by using data from supports such as a heart rate sensor, a body temperature sensor or a body motion sensor.

Fig. 4. Example of CPHS configuration for healthcare coaching. The smart physical supports (elliptical bike, treadmill, indoor bike, rowing machine) and the physical measure supports (heart rate sensor, body temperature sensor, body motion sensor) are CPS connected to the users through two smartphone applications: a sport coaching application (user profile adapted daily programme) and a healthcare application (online physiological data).

Table 1 gives scenarios of such a CPHS use during three successive iterations. For each iteration and considering data from the associated scenario, possible gaps can occur in terms of benefits, costs or deficits or dangers on criteria such as trust, health, time of CPS use, or physical effort. Gaps relate to comparisons between the baselines and the impacts of the real behaviours. On iterations 1 and 2, the advice of the daily sport programme application is determined and adapted regarding online physiological data about the user physical state, based for instance on heart rate or body temperature related to the real human physical activity. Benefits and costs may be stable on iterations 1 and 2, but a breakdown of this stability may occur on iteration 3, which can present a serious dissonance of CPS use due to the occurrence of an anxiety crisis.

Table 1. Examples of dissonance of CPHS use

Iteration 1. Scenario: 30 min of daily sport programme; normal user state; convergence between physiological measures and human feelings. Gaps: benefits are pleasure increasing, health improvement, trust improvement on CPS use; costs are time for application use and physical effort increasing; no danger, no deficit.

Iteration 2. Scenario: 30 min of daily sport programme; normal user state; convergence between physiological measures and human feelings. Gaps: benefits are pleasure increasing, health improvement, trust improvement on CPS use; costs are time for application use and physical effort increasing; no danger, no deficit.

Iteration 3. Scenario: 30 min of daily sport programme; anxiety crisis of the user; divergence between physiological measures and human feelings. Gaps: no benefit; cost is physical effort increasing; dangers or deficits are health degradation and trust degradation on CPS use. Possible resilient behaviors: stop using the CPS, include subjective indicators on the smart phone application, use voice and face recognition systems to recognize human discomfort or weakness.

The healthcare application may make a parallel between the heart rate acceleration for instance and the physical effort due to the required sport programme to be achieved by the user. However, the scenario of iteration 3 indicates that the user has an anxiety crisis that can give similar symptoms on heart rate. Therefore, this dissonance has to be treated because it can affect the user health or trust, and produce an unadapted sport programme for the next iterations. Some solutions are then proposed in order to make the CPHS resilient facing this breakdown of stability. For instance, they relate to the improvement of the CPS ability to detect the physical, physiological and mental human state.
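As a purely illustrative sketch of such an improvement, a rule of the following kind could cross-check physiological arousal with the measured activity and a subjective indicator. The thresholds and variable names are assumptions introduced for illustration, not data or design choices from the paper.

```python
# Hypothetical rule for the healthcare example: a heart-rate rise alone cannot
# distinguish physical effort from an anxiety crisis, so the application could
# cross-check it with the measured physical activity and a subjective indicator.
# Thresholds and field names are assumptions, not taken from the paper.

def detect_use_dissonance(heart_rate, activity_level, self_reported_ok,
                          hr_high=140, activity_low=0.2):
    """Flag a possible dissonance of CPS use (e.g., anxiety mistaken for effort)."""
    physiological_arousal = heart_rate >= hr_high
    effort_observed = activity_level > activity_low
    if physiological_arousal and not effort_observed:
        return "divergent: possible anxiety crisis, suspend the sport programme"
    if physiological_arousal and not self_reported_ok:
        return "divergent: user discomfort reported, adapt the programme"
    return "convergent: physiological measures match the physical activity"

print(detect_use_dissonance(heart_rate=150, activity_level=0.05, self_reported_ok=False))
```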


5. CONCLUSION

The paper proposed a new and original approach to analyse dissonance oriented stability in terms of gaps between assessment criteria. Positive or negative gaps, stable or unstable gaps can be identified and analysed. The challenge of such an analysis approach is its capacity to determine possible dissonances when convergent gaps or divergent gaps occur between CPS and humans. Examples illustrated the interest of this dissonance oriented stability analysis approach. The first example showed that, due to the convergence of the perception of the environment between CPS and humans, the human confidence towards the CPS may increase. However, this situation can be or become dissonant when both CPS and humans do not detect a dangerous situation or when their perceptions suddenly differ. The second example is a dissonance of the use of mobile applications when conflicts occur between the real individual state and the quantitative physiological indicators. Here again, an excess of stability or a breakdown of stability of the assessed gaps can lead to the occurrence of dangerous dissonances. The successful control of these identified dissonances regarding the characteristics of stable or unstable gaps will make the CPHS resilient and will increase knowledge about their use and functioning characteristics. A new challenge consists then in detecting the undetectable or in expecting the unexpected about CPHS use and functioning.

Future works will develop cooperative features of CPHS in order to solve dissonance occurrence and impact by jointly involving CPS and humans. The knowledge of a CPHS is not static but dynamic. This requires the adaptation of real-time and off-line learning and cooperation principles (Vanderhaegen, 2012; Vanderhaegen, Zieba, 2014; Enjalbert, Vanderhaegen, 2017). The development of supports such as a shared work space (Jouglet et al., 2003) or the multilevel cooperative assessment of gaps (Vanderhaegen, 1997, 1999; Vanderhaegen et al., 2004) will facilitate dissonance detection and control. The analysis of dissonance oriented stability will also be based on several points of view of a given situation by applying principles such as multi-point of view diagnosis (Vanderhaegen, Caulier, 2011), situation awareness, mindfulness or sense-making (Carsten, Vanderhaegen, 2015). The proposed examples were limited to a qualitative and empirical assessment of gaps. Another approach will consist in integrating probabilities of dissonance occurrence or of gaps by taking into account technical, human or organisational factors (Rangra et al., 2017).

ACKNOWLEDGEMENT

The present research work has been supported by the International Research Network on Human-Machine Systems in Transportation and Industry (GDR I HAMASYTI), by the Scientific Research Network on Integrated Automation and Human-Machine Systems (GRAISyHM), and by the Regional Council of "Hauts-de-France" (Regional Council of Nord – Pas de Calais – Picardie, France). The authors gratefully acknowledge the support of these institutions.

REFERENCES

Carsten, O., Vanderhaegen, F. (2015). Situation awareness: valid or fallacious? Cognition, Technology & Work, 17, 157-158.

Della Cava, M. (2018). Tesla stock dives as feds investigate deadly Calif. crash. News Center Main, March 27, 2018.

Enjalbert, S., Vanderhaegen, F. (2017). A hybrid reinforced learning system to estimate resilience indicators. Engineering Applications of Artificial Intelligence, 64, 295-301.

Jouglet, D., Piechowiak, S., Vanderhaegen, F. (2003). A shared workspace to support man–machine reasoning: application to cooperative distant diagnosis. Cognition, Technology & Work, 5(2), 127-139.

Khaitan, S. K., McCalley, J. D. (2015). Design techniques and applications of cyberphysical systems: a survey. IEEE Systems Journal, 9(2), 350-365.

Ouedraogo, A., Enjalbert, S., Vanderhaegen, F. (2013). How to learn from the resilience of Human–Machine Systems? Engineering Applications of Artificial Intelligence, 26(1), 24-34.

Polet, P., Vanderhaegen, F., Zieba, S. (2012). Iterative learning control based tools to learn from human error. Engineering Applications of Artificial Intelligence, 25(7), 1515-1522.

Quadri, I., Bagnato, B., Brosse, E., Sadovykh, A. (2015). Modeling methodologies for Cyber-Physical Systems: research field study on inherent and future challenges. Ada User Journal, 36(4), 246-252.

Rangra, S., Sallak, M., Schön, W., Vanderhaegen, F. (2017). A graphical model based on performance shaping factors for assessing human reliability. IEEE Transactions on Reliability, 66(4), 1120-1143.

Sadiku, M. N. O., Wang, Y., Cui, S., Musa, S. M. (2017). Cyber-Physical Systems: a literature review. European Scientific Journal, 13, 52-58.

Sedki, K., Polet, P., Vanderhaegen, F. (2013). Using the BCD model for risk analysis: an influence diagram based approach. Engineering Applications of Artificial Intelligence, 26(9), 2172-2183.

Vanderhaegen, F. (1997). Multilevel organization design: the case of the air traffic control. Control Engineering Practice, 5(3), 391-399.

Vanderhaegen, F. (1999). Toward a model of unreliability to study error prevention supports. Interacting with Computers, 11, 575-595.

Vanderhaegen, F. (2004). The Benefit-Cost-Deficit (BCD) model for human analysis and control. Proceedings of the 9th IFAC/IFORS/IEA Symposium on Analysis, Design, and Evaluation of Human-Machine Systems, Atlanta, GA, USA, September 7-9, 2004.

Vanderhaegen, F. (2010). Human-error-based design of barriers and analysis of their uses. Cognition, Technology & Work, 12, 133-142.

Vanderhaegen, F. (2012). Cooperation and learning to increase the autonomy of ADAS. Cognition, Technology & Work, 14(1), 61-69.

Vanderhaegen, F. (2014). Dissonance engineering: a new challenge to analyse risky knowledge when using a system. International Journal of Computers Communications & Control, 9(6), 750-759.

Vanderhaegen, F. (2016). A rule-based support system for dissonance discovery and control applied to car driving. Expert Systems with Applications, 65, 361-371.

Vanderhaegen, F. (2017). Towards increased systems resilience: new challenges based on dissonance control for human reliability in Cyber-Physical&Human Systems. Annual Reviews in Control, 44, 316-322.

Vanderhaegen, F., Carsten, O. (2017). Can dissonance engineering improve risk analysis of human–machine systems? Cognition, Technology & Work, 19(1), 1-12.

Vanderhaegen, F., Cassani, M., Cacciabue, P. (2010). Efficiency of safety barriers facing human errors. Proceedings of the 11th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design, and Evaluation of Human-Machine Systems, Valenciennes, France, pp. 16.

Vanderhaegen, F., Caulier, P. (2011). A multi-viewpoint system to support abductive reasoning. Information Sciences, 181(24), 5349-5363.

Vanderhaegen, F., Jimenez, V. (2018). The amazing human factors and their dissonances for autonomous Cyber-Physical&Human Systems. First IEEE Conference on Industrial Cyber-Physical Systems, Saint-Petersburg, Russia, May 14-18, 2018.

Vanderhaegen, F., Jouglet, D., Piechowiak, S. (2004). Human-reliability analysis of cooperative redundancy to support diagnosis. IEEE Transactions on Reliability, 53, 458-464.

Vanderhaegen, F., Zieba, S. (2014). Reinforced learning systems based on merged and cumulative knowledge to predict human actions. Information Sciences, 276(20), 146-159.

Wang, J., Cheng, L., Liu, J. (2013). A new spatio-temporal event model based on multi-tuple for Cyber-Physical Systems. International Journal of Control and Automation, 6(6), 51-62.

Yahia, W. B., Vanderhaegen, F., Polet, P., Tricot, N. (2015). A2PG: alternative action plan generator. Cognition, Technology & Work, 17(1), 95-109.

Yue, K., Wang, L., Ren, S., Mao, X., Li, X. (2010). An adaptive discrete event model for cyber-physical system. Proceedings of the 1st Analytic Virtual Integration of Cyber-Physical Systems Workshop, San Diego, USA, 9-15.