Bisimulation based design of user-interface for discrete event systems


Copyright © IFAC Discrete Event Systems, Reims, France, 2004

IFAC Publications, www.elsevier.com/locate/ifac

Masakazu Adachi, Toshimitsu Ushio, Yoshitaka Ukawa

Graduate School of Engineering Science, Osaka University, 1-3 Machikaneyama, Toyonaka, Osaka, 560-8531, Japan

Abstract: In human-machine systems, a user predicts the machine's behavior using partial or abstracted information on the system's behavior through a user-interface. If the user-interface cannot display sufficient information for the user's prediction, an automation surprise may occur. In this paper, we consider interaction between a user and a machine through a user-interface, where both the user and the machine are modeled by automata and the user-interface is represented by a binary relation. First, automation surprises are classified into three cases: a blocking state, a mode confusion, and a refusal state. Next, we derive necessary and sufficient conditions for the nonexistence of the automation surprises using simulation and bisimulation relations between the machine model and the user model. Finally, we show that a user-interface and a user model without automation surprises can be designed by utilizing the bisimulation algorithm. Copyright © 2004 IFAC

Keywords: Discrete event systems, user-interface, human-machine system, automation surprise, bisimulation.

1. INTRODUCTION

When a user controls and supervises mode-rich systems, a user-interface displays partial or abstracted information on the current state of a system or a machine, and the user operates the machine according to the information. In such human-machine systems, the user usually tracks or anticipates the machine's behavior based on both information provided by the user-interface and a mental model of the underlying machine. But due to the incomplete information, the machine's actual behavior may differ from the user's awareness of the system configuration, so that the user will be surprised by the difference. Such a faulty interaction is called an automation surprise (Sarter et al., 1997). As automated systems become more complex and large-scaled, a large number of mode indications is required in the user-interface, and the machine's modes may change indirectly, without the user's intention, according to internal variables or the environment. Consequently, if the user-interface and the associated user's awareness are not adequate, the possibility of automation surprises becomes higher. For example, Palmer reported two automation surprises in aircraft (Palmer, 1995). In real-life situations, an automation surprise may cause human errors leading to serious accidents. Hence, we need a methodology to realize concise and reliable human-machine interaction.

Human-machine systems consist of the machine, the user, and the user-interface, where the machine is designed a priori. So, in order to obtain an adequate human-machine system without automation surprises, we need a model-based co-design of both a user-interface and a user model.

Recently, systematic approaches to the analysis of human-machine systems have been studied. Degani proposes a model-based analysis of human-machine systems (Degani, 1996). Parasuraman et al. propose a four-stage model (Parasuraman et al., 2000). Moreover, techniques developed in computer science have been applied to verification of automation surprises. Rushby demonstrates the applicability of model checking to the exploration of automation surprises (Rushby, 2002). Degani and Heymann model both the machine's behavior and the user's actions as discrete event systems, and propose an algorithm for generating an adequate user-interface which enables the users to perform specified tasks correctly (Heymann and Degani, 2002; Degani and Heymann, 2002). Their approach is based on a reachability problem in a composite model of the machine model and the user model. Oishi et al. introduce a concept of immediate observability in discrete event systems and apply it to reduce the number of states in a user-interface (Oishi et al., 2003).

In this paper, we address an alternative procedure for constructing human-machine systems without automation surprises by introducing a bisimulation relation between the machine model and the user model. First, we classify automation surprises into three cases: a blocking state, a mode confusion, and a refusal state. Then we characterize them by using simulation or bisimulation relations over the machine model and the user model. Second, we derive some properties which show the relationship between the automation surprises and the bisimulation relation. Finally, we show that both a user-interface and a user model without automation surprises can be generated by utilizing the bisimulation algorithm.


2. MODEL OF HUMAN-MACHINE SYSTEMS

A mathematical model of a human-machine system is formally represented by the composition of a machine model, a user-interface (an information display), and a user model (a user manual) for operation, as shown in Fig. 1. Usually, the machine model is given a priori, and both the user-interface and the user model are suitably designed in order to avoid automation surprises so that a friendly human-machine system is realized. First, we present mathematical representations of these components.

Fig. 1. Human-machine system.

2.1 Machine model

We assume that a machine considered in this paper is a discrete event system modeled by a deterministic finite automaton G_M:

    G_M = (Q_M, Σ_M, δ_M, q_{M,0}),    (1)

where Q_M is the set of states, Σ_M is the finite set of events, the partial function δ_M : Q_M × Σ_M → Q_M is the transition function, and q_{M,0} ∈ Q_M is the initial state. Σ_M is partitioned into two subsets Σ_M^o and Σ_M^uo as follows:

    Σ_M = Σ_M^o ∪ Σ_M^uo.    (2)

Σ_M^o is the set of observable events, which correspond to commands given by the user. In other words, the occurrence of every event in Σ_M^o is triggered by the user's decision, which is based on the user model. In this paper, events in Σ_M^o will be called observable events or commands interchangeably. Σ_M^uo is the set of unobservable events, which are triggered by qualitative changes of internal states of the machine. An example of a machine model is shown in Fig. 2. In this figure, events α, β ∈ Σ_M^o are commands given by the user, and transitions caused by their occurrences are denoted by solid arrows, while events μ, η ∈ Σ_M^uo represent qualitative changes of the machine's states, and transitions caused by them are denoted by dashed arrows. The user can observe occurrences of all observable events since they are triggered by the user, but occurrences of unobservable events can be neither observed directly nor triggered.

Fig. 2. An example of machine model.

Throughout this paper, we shall write δ_M(q, σ)! (resp. ¬δ_M(q, σ)!) for any state q ∈ Q_M and event σ ∈ Σ_M if δ_M(q, σ) is defined (resp. undefined). We also write q -σ-> q' if δ_M(q, σ) = q'. These notations will also be used for other automata.
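To make the definitions concrete, a machine model such as the one in Fig. 2 can be encoded as a partial transition map. The sketch below is ours, not the paper's; since the figure's arrows are not recoverable from the text, this particular transition layout is an assumption, chosen only to be consistent with the surrounding description (μ is invisible to the interface, η leaves HIGH, and δ_M(H1, β) is undefined).

```python
# Hypothetical encoding of a Fig. 2-style machine model G_M (Eq. (1)).
# The exact arrows of the figure are an assumption, not taken from the paper.
SIGMA_O = {"alpha", "beta"}   # observable events (commands), Sigma_M^o
SIGMA_UO = {"mu", "eta"}      # unobservable events, Sigma_M^uo

# delta_M as a partial function: (state, event) -> next state.
DELTA_M = {
    ("L1", "alpha"): "H1",   # command raises LOW -> HIGH
    ("L1", "mu"):    "L2",   # unobservable, stays inside LOW
    ("L2", "alpha"): "H2",
    ("H2", "beta"):  "L2",   # command lowers HIGH -> LOW
    ("H2", "eta"):   "L1",   # unobservable, leaves HIGH
}                             # note: delta_M(H1, beta) is left undefined

def is_defined(q, sigma):
    """delta_M(q, sigma)!  (True iff the transition is defined)."""
    return (q, sigma) in DELTA_M

def step(q, sigma):
    """Return q' with q -sigma-> q'; raises KeyError when undefined."""
    return DELTA_M[(q, sigma)]
```

A partial function encoded as a dictionary makes "defined" versus "undefined" a simple membership test, matching the δ_M(q, σ)! notation.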

2.2 User-interface

The user-interface displays abstracted information on the current state of the underlying machine, and the user gives adequate commands based on both the information and the user model. Obviously, commands given by the user are useful information for the user's perception of the machine's current state. So, for simplicity, we assume that the event set Σ_M^o of observable events (commands) is also included in the information provided by the user-interface.

For example, in the machine model of Fig. 2, states L1 and L2 are abstracted as LOW, and H1 and H2 as HIGH. With this user-interface, the user knows whether the user-interface displays HIGH or LOW, but cannot identify the machine's internal state exactly. Thus, a state of the user-interface corresponds to a subset of the machine's state set. We represent the correspondence as a binary relation as follows:

    I ⊆ Q_M × Q_U,    (3)

where Q_U is the set of states of the user-interface. Note that Q_U will also be used as the set of states of the user model, since the user gives commands based on states of the user-interface. In Fig. 2, the correspondence is given by I = {(L1, LOW), (L2, LOW), (H1, HIGH), (H2, HIGH)}. Note that, when a state transition caused by the occurrence of an unobservable event changes the state of the user-interface, the user can recognize that some unobservable event occurs, though in general the user cannot identify which event occurs. In Fig. 2, the user cannot know the occurrence of event μ at all, while the user can know that some event occurs when event η occurs, since the occurrence causes a change of the state of the user-interface.

2.3 User model

From the practical point of view, it is a difficult problem how to construct a user model (Norman, 1988). But, in this paper, we assume that the user can get an exact user model from a user manual and gives commands which are correct in the user model. All information the user can get consists of both the abstracted information of the machine's states and the commands through the user-interface. So, the user's decision on given commands is based on states of the user-interface, which belong to the set Q_U introduced in the previous section, and the user's prediction of the next state of the user-interface after giving a command in Σ_M^o is based on the user model. Moreover, since occurrences of unobservable events may cause changes of states of the user-interface, the user model must include such transitions, which can be described by nondeterministic ε-moves. Thus, the user model is described by the following nondeterministic finite automaton with ε-moves and the state set Q_U:

    G_U = (Q_U, Σ_M^o ∪ {ε}, δ_U, q_{U,0}),    (4)

where ε is the empty string, the partial set-valued function δ_U : Q_U × (Σ_M^o ∪ {ε}) → 2^{Q_U} is the transition function, and q_{U,0} ∈ Q_U is the initial state. Since the unobservable events are not observed by the user, we introduce a projection Π : Σ_M → Σ_M^o ∪ {ε}:

    Π(σ) = σ if σ ∈ Σ_M^o, and Π(σ) = ε if σ ∈ Σ_M^uo.    (5)

Note that occurrences of unobservable events are classified into the following two cases:

(1) The occurrence of an unobservable event does not cause a change of the state of the user-interface; that is, for (q_M, q_U) ∈ I, q'_M ∈ Q_M, and σ_u ∈ Σ_M^uo with q_M -σ_u-> q'_M, we have (q'_M, q_U) ∈ I. In this case, the user cannot know the occurrence of the unobservable event at all.
(2) The occurrence of an unobservable event causes a change of the state of the user-interface; that is, for (q_M, q_U) ∈ I, q'_M ∈ Q_M, and σ_u ∈ Σ_M^uo with q_M -σ_u-> q'_M, we have (q'_M, q_U) ∉ I. In this case, the user can know that some event surely occurs.

Usually, the user manual does not mention case (1), while it may mention case (2). We assume that, for any q_U ∈ Q_U, the transition q_U -ε-> q_U is defined in G_U. But a transition q_U -ε-> q'_U with q_U ≠ q'_U is defined only if the user manual notices it, and is undefined otherwise. In the following, for simplicity, we omit all self-loop transitions q_U -ε-> q_U from figures of the user model. In the user model, for simplicity, we assume that the user can predict the next state uniquely after giving a command (Degani and Heymann, 2002; Heymann and Degani, 2002). In other words, |δ_U(q_U, σ_o)| ≤ 1 for any q_U ∈ Q_U and σ_o ∈ Σ_M^o, where |·| denotes the cardinality. However, transitions triggered by the empty string ε may be nondeterministic by Eq. (5).

Figure 3 shows an example of the user model. As mentioned before, the machine's states are abstracted as LOW and HIGH in the user model of Fig. 3, where all self-loop transitions related to the empty string are omitted in G_U. Recall that α, β ∈ Σ_M^o are commands (observable events) and μ, η ∈ Σ_M^uo are unobservable. In this example, the user manual is assumed to notice that the ε-move from HIGH to LOW, δ_U(HIGH, ε) = LOW, can occur; otherwise it would be deleted. In the following, a three-tuple (G_M, G_U, I) of the machine model, the user model, and the binary relation will be called a human-machine system.
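As a small illustration in our own encoding (not the paper's), the projection Π of Eq. (5) and the relation I of Eq. (3) for the LOW/HIGH interface can be written directly:

```python
# Projection Pi of Eq. (5): commands pass through, unobservable events
# are erased to the empty string epsilon. Encoding is ours.
SIGMA_O = {"alpha", "beta"}
EPSILON = ""

def project(sigma):
    """Pi(sigma) = sigma if observable, epsilon otherwise."""
    return sigma if sigma in SIGMA_O else EPSILON

# The relation I of Eq. (3), exactly as given in the text for Fig. 2.
I = {("L1", "LOW"), ("L2", "LOW"), ("H1", "HIGH"), ("H2", "HIGH")}

def display(q_m):
    """The interface state(s) related to a machine state q_M under I."""
    return {q_u for (qm, q_u) in I if qm == q_m}

def observed(trace):
    """What the user sees of an event string: Pi applied, epsilons dropped."""
    return [project(s) for s in trace if project(s) != EPSILON]
```

For instance, the user watching the event string α μ β observes only α β, since μ projects to ε.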

Fig. 3. An example of user model.

3. SIMULATION RELATION IN HUMAN-MACHINE SYSTEMS

In this section, we classify automation surprises into three cases, which will be defined formally by the binary relation I. By applying the concept of a simulation relation, we characterize each automation surprise. We make the following assumptions.

Assumption 1. In the human-machine system (G_M, G_U, I), for all (q_M, q_U) ∈ I and q_M -σ_o-> q'_M (σ_o ∈ Σ_M^o), there exists q'_U ∈ Q_U such that q_U -σ_o-> q'_U and (q'_M, q'_U) ∈ I.

Assumption 2. In the human-machine system (G_M, G_U, I), for every q_M ∈ Q_M, there exists q_U ∈ Q_U such that (q_M, q_U) ∈ I.

Intuitively, Assumption 1 means that, if a command σ_o ∈ Σ_M^o is defined at a state in the machine model, the user can give the command at the corresponding state in the user-interface. Moreover, Assumption 2 means that the user-interface can work no matter which state the machine is in.

Automation surprises arise when some state transition in the machine is not transmitted to the user through the user-interface correctly, or when the machine cannot follow the user's command. In the following, the inconsistencies of the state transitions caused by the commands and by the unobservable events in the machine are characterized by the pairs (q_M, q_U) ∈ I and (q_U, q_M) ∈ I^{-1}, respectively.

Definition 1. The human-machine system (G_M, G_U, I) has a blocking state if the following condition holds: if (q_M, q_U) ∈ I and q_M -σ_u-> q'_M for some σ_u ∈ Σ_M^uo, then there does not exist q'_U ∈ δ_U(q_U, ε) such that (q'_M, q'_U) ∈ I.

Definition 2. (Heymann and Degani, 2002; Degani and Heymann, 2002) The human-machine system (G_M, G_U, I) has a mode confusion if the following condition holds: if (q_U, q_M) ∈ I^{-1} and q_U -σ_o-> q'_U for some σ_o ∈ Σ_M^o, then there exists q'_M such that q_M -σ_o-> q'_M and (q'_U, q'_M) ∉ I^{-1}.

Definition 3. (Ukawa et al., 2003) The human-machine system (G_M, G_U, I) has a refusal state if (q_U, q_M) ∈ I^{-1} and δ_U(q_U, σ_o)! ∧ ¬δ_M(q_M, σ_o)! for some σ_o ∈ Σ_M^o.

Intuitive interpretations of the above definitions are as follows:
• A blocking state represents a situation in which, when an unobservable event occurs in the machine, the corresponding ε-move is not defined adequately in the user model.
• We say that a mode confusion occurs when the updated state of the user-interface after giving a command is different from the state predicted by the user model.
• A refusal state represents a situation in which, when the user gives some command to the machine, it is prohibited from occurring in the machine. Practically, the machine beeps when the prohibited command is given.

For example, in Fig. 3, we consider a pair (HIGH, H1) ∈ I^{-1} of states in G_U and G_M. Suppose that the user gives the command β in order to change the state from HIGH to LOW in the user-interface. In the user model, the transition HIGH -β-> LOW is defined, but the corresponding transition δ_M(H1, β) is not defined in the machine model. Thus, the command β is sometimes refused in the state HIGH. Obviously, the user-interface and the user model in Fig. 3 are inadequate.

Here, we define a simulation relation (Milner, 1989; Barrett and Lafortune, 1998) of the machine model by the user model.

Definition 4. For the human-machine system (G_M, G_U, I), a simulation relation of G_M by G_U is a binary relation I such that:
1) for every q_M ∈ Q_M, there exists q_U ∈ Q_U such that (q_M, q_U) ∈ I;
2) if (q_M, q_U) ∈ I and q_M -σ-> q'_M (σ ∈ Σ_M), then there exists q'_U ∈ Q_U such that q_U -Π(σ)-> q'_U and (q'_M, q'_U) ∈ I; and
3) if (q_{M,0}, q_U) ∈ I, then q_U = q_{U,0}, and if (q_M, q_{U,0}) ∈ I, then q_M = q_{M,0}.

If I is a simulation relation of G_M by G_U, we will say that G_U simulates G_M by I, denoted by G_M ≤_I G_U.
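On finite models, Definitions 1-3 can be checked mechanically. The sketch below is our own brute-force rendering of the three conditions, not code from the paper; the machine is a partial deterministic map {(q, σ): q'}, the user model maps (q, σ-or-ε) to a set of successors, and the ε self-loop assumed at every state in Section 2.3 is added explicitly inside the blocking-state check.

```python
# Brute-force checks of Definitions 1-3 (our encoding, not the paper's code).
def has_blocking_state(delta_m, delta_u, I, sigma_uo):
    """Def. 1: some unobservable move has no matching eps-move in G_U."""
    for (q_m, q_u) in I:
        for (q, s), q2 in delta_m.items():
            if q == q_m and s in sigma_uo:
                # every state carries the implicit eps self-loop (Sect. 2.3)
                succ = set(delta_u.get((q_u, ""), set())) | {q_u}
                if not any((q2, v) in I for v in succ):
                    return True
    return False

def has_mode_confusion(delta_m, delta_u, I, sigma_o):
    """Def. 2: the display after a command differs from the prediction."""
    for (q_m, q_u) in I:
        for s in sigma_o:
            if (q_u, s) in delta_u and (q_m, s) in delta_m:
                q2_m = delta_m[(q_m, s)]
                for q2_u in delta_u[(q_u, s)]:
                    if (q2_m, q2_u) not in I:
                        return True
    return False

def has_refusal_state(delta_m, delta_u, I, sigma_o):
    """Def. 3: delta_U(q_U, sigma_o)! but not delta_M(q_M, sigma_o)!."""
    for (q_m, q_u) in I:
        for s in sigma_o:
            if (q_u, s) in delta_u and (q_m, s) not in delta_m:
                return True
    return False
```

On the paper's own example, the pair (HIGH, H1) with command β, defined in the user model of Fig. 3 while δ_M(H1, β) is undefined, makes has_refusal_state return True.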


Definition 5. For the human-machine system (G_M, G_U, I), a bisimulation relation of G_M and G_U is a binary relation I such that G_M ≤_I G_U and G_U ≤_{I^{-1}} G_M. If I is a bisimulation relation of G_M and G_U, we will say that G_M and G_U are bisimilar by I, denoted by G_M ≅_I G_U.

Remark 1. Without loss of generality, we can assume that G_M and G_U have the same initial state (e.g., "the machine does not work") in order to guarantee that the third condition of Definition 4 always holds.

Lemma 1. The human-machine system (G_M, G_U, I) has no blocking states if and only if G_M ≤_I G_U.

For a given machine model G_M and a user model G_U, let G_M^o and G_U^o be subautomata of G_M and G_U, respectively, obtained by eliminating all transitions labeled by unobservable events or the empty string:

    G_M^o = (Q_M, Σ_M^o, δ_M^o, q_{M,0}),    (6)
    G_U^o = (Q_U, Σ_M^o, δ_U^o, q_{U,0}),    (7)

where the transition functions δ_i^o : Q_i × Σ_M^o → Q_i (i = U, M) are defined by

    δ_i^o(q_i, σ) = δ_i(q_i, σ) if σ ∈ Σ_M^o, and undefined otherwise.

Since all transitions labeled by observable events are deterministic, both G_M^o and G_U^o are deterministic. Both a mode confusion and a refusal state depend on commands, and we will show that the existence of such automation surprises can be checked by using both G_M^o and G_U^o.
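In the dictionary encoding used in the earlier sketches, computing the subautomata of Eqs. (6)-(7) is a one-line restriction of the transition map (again our own rendering, not the paper's):

```python
# Eqs. (6)-(7): keep only transitions labelled by observable events.
# Works for both the machine map {(q, s): q'} and the user-model map
# {(q, s): set_of_q'}; since epsilon is not in sigma_o, eps-moves drop out.
def observable_subautomaton(delta, sigma_o):
    return {(q, s): nxt for (q, s), nxt in delta.items() if s in sigma_o}
```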

Lemma 2. The human-machine system (G_M, G_U, I) has neither mode confusions nor refusal states if and only if G_U^o ≤_{I^{-1}} G_M^o.

Since G_M^o and G_U^o are subautomata of G_M and G_U, it is clear that if G_M ≤_I G_U, then G_M^o ≤_I G_U^o. From Lemmas 1 and 2, we have the following theorem.

Theorem 1. The human-machine system (G_M, G_U, I) has no automation surprises if and only if G_M ≤_I G_U and G_U^o ≅_{I^{-1}} G_M^o.

4. DESIGN OF USER-INTERFACE

In the previous section, we showed necessary and sufficient conditions for the nonexistence of automation surprises. Based on these conditions, we propose a design method for both an adequate user-interface and a user model. In order to construct an adequate user-interface and a user model for a given machine model, it is sufficient to compute both a binary relation I and an automaton G_U satisfying Theorem 1. So, we propose a design algorithm for them by utilizing a quotient system (Henzinger, 1995).

Definition 6. For a machine model G_M = (Q_M, Σ_M, δ_M, q_{M,0}) and an equivalence relation ~ ⊆ Q_M × Q_M, a quotient system of G_M is defined as G_M/~ = (Q_M/~, Σ_M^o ∪ {ε}, δ_M~, q̃_{M,0}/~), where
• Q_M/~ is the set of equivalence classes of Q_M by ~,
• for every Q, Q' ∈ Q_M/~, Q' ∈ δ_M~(Q, Π(σ)) if there exist q_M ∈ Q and q'_M ∈ Q' such that δ_M(q_M, σ) = q'_M, and
• q̃_{M,0}/~ = {q ∈ Q_M | q ~ q_{M,0}}.
Note that q̃_{M,0}/~ = {q_{M,0}} by Remark 1. Let G_U = G_M/~; then we introduce a binary relation I~ over Q_M × Q_U induced by ~ as follows:

    I~ = {(q_M, q_U) ∈ Q_M × Q_U | q_M ∈ q_U}.

Then we know that the user model G_U always simulates the machine model G_M by I~.

From Lemma 1, the binary relation I~ and the automaton G_U provide a user-interface and a user model with no blocking states. Moreover, G_M^o ≤_{I~} G_U^o is also true. Thus, from Theorem 1, an adequate user-interface and a user model without automation surprises can be obtained by a bisimulation relation I~*, finer than I~, such that G_M^o ≅_{I~*} G_U^o. For σ_o ∈ Σ_M^o, let

    Pre_{σ_o}(Q') = {q_M ∈ Q_M | q_M -σ_o-> q'_M, q'_M ∈ Q'}.

Then, by applying the bisimulation algorithm (Henzinger, 1995), we can construct an adequate user-interface and a user model without automation surprises (Fig. 4):

    input: machine model G_M, initial equivalence relation ~0
    set Q_M/~0 = {Q_M \ {q_{M,0}}, {q_{M,0}}}
    set i = 0, G_U^0 = G_M/~0
    while there exist Q, Q' ∈ Q_M/~i and σ_o ∈ Σ_M^o such that ∅ ≠ (Q ∩ Pre_{σ_o}(Q')) ≠ Q
        set Q1 = Q ∩ Pre_{σ_o}(Q'), Q2 = Q \ Pre_{σ_o}(Q')
        refine Q_M/~(i+1) = (Q_M/~i \ {Q}) ∪ {Q1, Q2}
        set G_U^(i+1) = G_M/~(i+1)
        set i = i + 1
    end while
    return G_U^i

Fig. 4. Bisimulation based design algorithm.

In general, a user model may be nondeterministic when we construct it as a quotient system of the machine model. It is undesirable that it is nondeterministic with respect to an observable event, since the user cannot predict the next state
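The Fig. 4 procedure is a standard partition-refinement loop: split any block Q that Pre_{σ_o}(Q') cuts properly, until the partition is stable. The sketch below is our reading of it (deterministic machine encoded as in the earlier snippets; the returned blocks play the role of the user-interface states Q_U):

```python
# Sketch of the bisimulation-based design algorithm of Fig. 4 (our reading).
def pre(delta, sigma, block):
    """Pre_sigma(block) = {q | q -sigma-> q' for some q' in block}."""
    return {q for (q, s), q2 in delta.items() if s == sigma and q2 in block}

def refine(states, q0, delta, sigma_o):
    """Refine the initial partition {Q_M \\ {q0}, {q0}} until no command
    splits any block; each split strictly grows the partition, so the
    loop terminates."""
    partition = [b for b in (frozenset(states) - {q0}, frozenset({q0})) if b]
    changed = True
    while changed:
        changed = False
        for block in partition:
            for target in partition:
                for s in sigma_o:
                    cut = block & pre(delta, s, target)
                    if cut and cut != block:
                        # split block into cut and block \ Pre_s(target)
                        partition.remove(block)
                        partition += [cut, block - cut]
                        changed = True
                        break
                if changed:
                    break
            if changed:
                break
    return partition
```

At termination, every block B satisfies B ∩ Pre_σ(B') ∈ {∅, B} for all blocks B' and commands σ, which is exactly the stability condition the while-loop of Fig. 4 tests.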

uniquely after executing a command. The user model computed by the algorithm is deterministic with respect to any observable event.

We consider the machine model shown in Fig. 5, where Σ_M^o = {α, β, γ} and Σ_M^uo = {η, μ}. First, we set an initial binary relation as I~0 = {(1, A), (2, A), (3, A), (4, A), (5, A), (6, A)}, where A = {1, 2, 3, 4, 5, 6}. We obtain the initial user model G_U^0 = G_M/~0 as shown in Fig. 5(a). By Lemma 1, (G_M, G_U^0, I~0) has no blocking states. However, for the transition A -γ-> A, we have A ≠ (A ∩ Pre_γ(A)) = {3, 4, 5, 6}, which means G_M and G_U^0 are not bisimilar. Next, the state A is partitioned into two states A1 = {1, 2} and A2 = {3, 4, 5, 6}, as shown in Fig. 5(b). After iterations, we finally have G_M ≤_{I~*} G_U* and G_M^o ≅_{I~*} G_U^o* (Fig. 5(d)). Thus, an adequate user-interface and a user model are obtained.

Fig. 5. Computing the user-interface and the user model.

5. CONCLUSIONS

In this paper, we studied interaction between a user and discrete event systems through a user-interface and classified automation surprises into three cases. First, we derived necessary and sufficient conditions for the nonexistence of the automation surprises using simulation and bisimulation relations between the machine model and the user model. Next, we proposed a design algorithm for a user-interface and a user model without automation surprises. It is future work to extend the proposed algorithm to timed systems and hybrid systems.

REFERENCES

G. Barrett and S. Lafortune (1998). Bisimulation, the supervisory control problem and strong model matching for finite state machines. Discrete Event Dynamic Systems 8, 377-429.
A. Degani (1996). Modeling Human-Machine Systems: On Modes, Error, and Patterns of Interaction. Ph.D. dissertation, Georgia Institute of Technology.
A. Degani and M. Heymann (2002). Formal verification of human-automation interaction. Human Factors 44(1), 28-43.
T. A. Henzinger (1995). Hybrid automata with finite bisimulations. In: Proceedings of the 22nd International Colloquium on Automata, Languages, and Programming (ICALP), pp. 324-335. LNCS 944, Springer-Verlag.
M. Heymann and A. Degani (2002). On abstraction and simplification in the design of human-automation interaction. NASA Technical Memorandum 211397, NASA Ames Research Center, Moffett Field, CA.
R. Milner (1989). Communication and Concurrency. Prentice Hall, New Jersey.
D. A. Norman (1988). The Psychology of Everyday Things. Basic Books, New York.
M. Oishi, I. Hwang and C. Tomlin (2003). Immediate observability of discrete event systems with application to user-interface design. In: Proceedings of the 42nd IEEE Conference on Decision and Control, pp. 2665-2672.
E. Palmer (1995). "Oops, it didn't arm." - a case study of two automation surprises. In: Proceedings of the 8th International Symposium on Aviation Psychology, pp. 227-232.
R. Parasuraman, T. B. Sheridan and C. D. Wickens (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics-Part A: Systems and Humans 30(3), 286-297.
J. Rushby (2002). Using model checking to help discover mode confusions and other automation surprises. Reliability Engineering and System Safety 75, 167-177.
N. Sarter, D. Woods and C. Billings (1997). Automation surprises. In: Handbook of Human Factors and Ergonomics (G. Salvendy, Ed.), pp. 1926-1943. Wiley, New York.
Y. Ukawa, T. Ushio and M. Adachi (2003). Formal detection of mode confusion in human-machine interaction. In: Proceedings of the 1st International Symposium on Systems & Human Science, Osaka, Japan, pp. 309-314.