A view of human-machine communication and co-operation


Int. J. Man-Machine Studies (1983) 19, 309-333

HORST OBERQUELLE, INGBERT KUPKA AND SUSANNE MAASS

Fachbereich Informatik, Universität Hamburg, Schlüterstraße 70, D-2000 Hamburg 13, Federal Republic of Germany

(Received 16 August 1982)

Nowadays computers are increasingly used for communication purposes and less for mere calculation. User-friendly dialog design for non-computer-professional users is becoming an important research issue. The discussion has already shown that human-machine systems have to be studied as a whole: apart from the machine and its users they include the designers and those persons responsible for the system's application as well. Computers just play a special role as one element in a highly complex communication network with several human agents linked in space and time. In order to characterize communication between humans and machines the concept of formal communication is introduced and related to natural communication. Communicating behaviour and its determining factors are represented by a model which is based on psycholinguistic concepts of communication and which uses high-level Petri net interpretations. Formal communication can be observed among humans as well as with machines; often it is caused by delegation. Programming of computer systems can be conceived as a special form of delegation. The view of computer systems as communication media with formal communicating behaviour permits an explanation of problems arising from computer applications, especially at the human-machine interface, and shows directions for future research.

1. Introduction

Human-computer communication has become an increasingly important research issue during the last decade. The results of practical systems development, however, are satisfactory merely from a technical point of view. Users are confronted with highly non-transparent systems that are far from easy to use. The problems of today's computer use mainly result from difficulties in the communication between the human and the machine. Provided that adequate concepts must be based on theoretical models, the weakness of present systems development could be due to the apparent lack of an underlying communication model. The existing models of the real structure of the machine or of its virtual behaviour, as well as models of user needs and skills, are not appropriate. Obviously, the old computer paradigm of a calculating machine is not sufficient any more. Computer systems are rather to be considered as media for strictly organizable communication or information flow, an idea first stated by C. A. Petri. This new paradigm, which accentuates the notions of communication and information, will be further elaborated below. The main ideas are as follows. Computer systems are communication media with formal communicating behaviour, to which formal communication actions can be delegated by persons.



A comprehensive analysis of the communicating behaviour of all persons involved in the processes of delegation and use can lead to a better understanding of the effects of computer application and to an improved design. A careful comparison between interpersonal and formal human-machine communication is needed to understand the impact of increasing computer application. This paper develops and uses a model for human-computer communication in four basic steps. First, after a historical review, communication is described by six pragmatic characteristics which apply to both interpersonal and human-computer communication. Secondly, a systematizing step leads to communication schemes represented by means of high-level Petri nets. Thirdly, referring to these schemes, the notions of formal and algorithmic communication are introduced. Finally, different forms of human-computer co-operation are discussed.

2. The evolution of human-computer communication

Computers have been developed as tools for performing calculations and other data manipulations which can be done automatically. Once in use, however, they turned out to be devices with a complex communicating behaviour, able to follow instructions, to accept questions and to produce answers and requests as well as results and error messages.

2.1. THE HISTORICAL DEVELOPMENT

From the communication point of view four stages can be distinguished: the early stage of computer development, batch processing, time-sharing and the present-day development. Different kinds of communication can already be found in the early stage of computer development.

1940  A non-programmable computer with three terminals is shown at Bell Telephone Laboratories.
1941  G. R. Stibitz and S. B. Williams demonstrate remote data processing via telephone and typewriter at the American Mathematical Society.

For comparison:

1941  K. Zuse completes his Z3.
1944  H. H. Aiken constructs the Mark I.

Computers of the so-called first generation were used directly. The user could interrupt the machine and control program runs by giving commands and receiving messages concerning the status of the machine. For the user's understanding these were elementary messages. There was no message exchange concerning the communication process itself, i.e. no metacommunication, like asking for input formats or changing them.

During the era of batch processing the direct contact to the machine disappeared. The messages were buffered in queues. Software as a new and growing layer was put between the machine and its users, thus opening a new field of communication. Declarations concerning input/output conventions for the languages to be used in a job were exchanged between user and machine. The user took these specifications as schematically constructed technical means without realizing that they formed some kind of metacommunication. The use of prepared job-control cards in card decks for batch processing was a typical example.

Time-sharing brought back the original immediate contact between the machine and its users. Working quasi-simultaneously at the same computer, users could additionally communicate with each other. The main characteristics of this kind of usage, however, are determined by the increase of processing speed and storage capacities, which allow a complex communicating behaviour to be programmed. Milestones of system development for time-shared computers are the systems designed at M.I.T. (Massachusetts Institute of Technology):

1961  CTSS (Compatible Time-Sharing System), first version.
1967  MULTICS (Multiplexed Information and Computing Service).

Numerical calculation was the main purpose of the early interactive application systems:

1962  K. E. Iverson's APL, later on extended to the widespread version APL\360.
1965  BASIC (Beginner's All Purpose Symbolic Instruction Code).
1966  JOSS (Johnniac Open Shop System).

These, and other, conversational languages (cf. Klerer & Reinfelds, 1968; Kupka & Wilsing, 1980) enriched the inventory of linguistic elements for programming purposes. The user's mode of working changed from passive to active and furthermore to an interactive mode with increasing control facilities (Hoffmann, 1977). The increased division of labour allowed the exchange of information at the latest possible moment. The level of metacommunication in time-sharing systems is higher than in batch processing. Examples of metacommunication are log-in and log-off commands, conversational declarations concerning the mode of processing (immediate execution or collection or both), help facilities and interactive error handling.

The present-day development of human-machine dialogs reflects the general trends in information technology. The rapid progress of integrated circuit technology has led to new technical facilities for storing large amounts of data, accessing them in various ways, and for applying information processing under quite different circumstances in all kinds of technical, scientific, industrial or administrative domains. For instance, the number of elementary functions on one chip has increased exponentially, from about 10³ in 1970 to about 10⁶ in 1980, whereas the amount of energy needed for one operation, which can be taken as a measure of the costs, is decreasing exponentially (Stein, 1978). Times for main storage access may fall below the limit of 10⁻⁹ s (Turn, 1974). Obviously the machine part in man-computer dialogs now enters a new dimension.

As a consequence, the recent expansion of computer applications has reached the field of public and commercial administration as well as many industrial workplaces. Thus human-machine dialogs no longer have to be designed for computer specialists only. Now many naive users work with mass-produced computers and terminals.


For systems design this means that emphasis shifts away from purely technical considerations to the problem of workplace design. The invention of new transmission devices, e.g. based on glass fibre optics, with high transmission rates and a large bandwidth, affects the propagation of information. The same holds for new technological concepts for portable storage of high capacity, e.g. floppy disks. Increasing communication via networks leads to human-machine dialogs for performing distributed information processing and interpersonal communication via computers rather than isolated communication between an individual user and a machine. New telecommunication techniques revolutionize technical and social communication. For details see Kaiser, Lange, Langenbucher, Lerche & Witte (1978). They are based on an increasing integration of information processing and transmission (Tietz, 1980; Ganzhorn, 1979). The technical interface for future human-machine communication will combine the input/output media of computers, including new systems for image and speech recognition and generation (see Fellbaum, 1980), with transmission devices like telephone, TV set and telefax services (cf. Naffah, 1979).

2.2. TYPES OF HUMAN-COMPUTER DIALOGS

The prototype of human-computer dialogs is the alternating sequence of user actions and machine reactions. This quasi-symmetry is reflected by the syntactical structure. As stated in Kupka (1974), the syntax of a dialog consists of the user's "local" syntax for (isolated) inputs, a corresponding "local" syntax for (isolated) outputs of the machine, and a "global" syntax combining both. Unlike batch processing, dialog allows a subdivision of actions and reactions, leading to a mode of working where human and computer co-act in parallel. If the user is allowed to control the work process immediately by interrupt facilities, the mode of operation is called interactive (Hoffmann, 1977).

The semantics of dialogs is multi-layered: there is a primary (inner) layer of problem-oriented data structures and data-specific operations and statements. A second layer concerns the manipulation of objects like storing program parts, editing or file handling. Dialog-specific features and user-to-user communication facilities form further (outer) layers. Conversational systems like APL\360 combine several layers in one language, whereas non-conversational languages have to be supplemented by an editing language and a command language in order to cover the different layers. For a detailed discussion see Kupka & Wilsing (1980). Frequently, user and machine can refer to the dialog itself, its preconditions and its components. This kind of reflexivity of human-computer dialogs, together with the layered structure of semantics, reveals the affinity to natural communication.

A reasonable co-operation in a human-computer dialog depends on two facts: users have to understand the machine they are working with, at least to a certain extent, and the designers of the hard- and software must have based their decisions upon realistic ideas about the future users.

A user's system model can normally only be a rough image of the real or virtual machine because of its high structural complexity. In the user's understanding the system's function is often reduced to pure data processing.


The communication aspect, however, may be at least as important for a good understanding and successful co-operation as the operational aspect. Users may internally build up system models by experience and without referring to any formal description. Existing formal models frequently cover isolated layers of semantics only. Comprehensive and comprehensible models are more likely to be derived from descriptions of conversational systems like APL\360 which integrate all layers of language semantics.

Many user models have been used or proposed for the design of systems. On an elementary level such models define different types of users according to certain classes of activities. Dehning, Essig & Maass (1981) present the following comparative overview.

[Figure 1 (not reproducible here) compares user typologies from Codd/Date, Altshuler/Plagman, IBM, Palme, Dolotta and Martin, covering categories such as casual, parametric, naive, untrained and trained users, structurally dependent, independent and parametric users, end-users, dedicated users, application programmers, systems programmers and nonprogrammers.]

FIG. 1. Different user types and their range.

General preconditions of human-computer dialogs are the determined, algorithmic behaviour of the machine and the user's freedom of behaviour, which is only restricted by communication conventions. Hence different dialog forms are possible. For a detailed systematic representation see Dehning et al. (1981). An important distinctive characteristic is the guidance of the dialog. According to the role of the computer as a tool, the user may have the initiative. If the instructions for the machine are unambiguously determined, we have a command dialog. This dialog form is provided by desk-calculator-like systems and by conversational languages generalizing this concept. Quasi-natural language dialogs allow ambiguous instructions. It is also possible to program the machine to take the initiative in dialog. This may lead to different kinds of restrictions for user actions. In the case of form filling the user's choice consists of filling in one of several possible input values. In menu conversation the user may select from a list of actions displayed by the machine, as sketched below.
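As an illustration only (our own minimal sketch, not taken from the paper; all names are invented), a single menu-conversation step can be pictured as a machine that displays a fixed list of actions and accepts nothing but a selection from that list:

```python
# Minimal sketch of one menu-conversation step (illustrative names only): the
# machine displays a list of actions and restricts the user's choice to it.

def menu_step(actions: list[str], user_input: str) -> str:
    """Return the selected action, or an error reaction for an invalid input."""
    for number, action in enumerate(actions, start=1):
        print(f"{number}) {action}")
    if user_input.isdigit() and 1 <= int(user_input) <= len(actions):
        return actions[int(user_input) - 1]
    return "invalid selection"          # an error situation in the sense discussed below

choice = menu_step(["edit file", "print file", "log off"], "2")
print("selected:", choice)              # -> print file
```

The point of the sketch is only that the admissible inputs are fixed in advance by the designer; the user's freedom is reduced to choosing among them.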


Less restriction characterizes dialogs based on system requests. The user's inputs are either restricted by a syntactical scheme, often indicated to the user by a generalized example, or they can be chosen freely in quasi-natural language. Further dialog forms arise from combinations of dialog steps of the types mentioned above. This may lead to dialogs where the actual initiative or the kind of restriction for the inputs change systematically. Corresponding strategies which support successful communication depend on the underlying user models. The development of models for dialog forms which explicitly define metacommunication facilities is a problem to be solved in the future.

The design problems of human-computer dialogs become evident when considering so-called critical dialog situations. An error situation occurs if the actual input given to the machine violates the currently valid syntactic or semantic conventions. Contradictory or incomplete inputs may lead to errors as well. Unexpected system reactions are defined with respect to the user's view. They need not occur only in error situations. The user reacts to them as in an error situation. Demand situations generated by computed input requests and situations of suspense belong to the normal behaviour of interactive systems but turn out to be difficult for the user to handle. A situation which may confuse the user is the semantic deadlock, which can generally be described as follows: for a meaningful continuation of the dialog the user first needs some information which could only be given to him by the system, but he can only get this information after a correct completion of the current dialog step. All the situations mentioned here can consistently be understood as special kinds of communicating behaviour. Semantic deadlocks, for instance, result from a lack of metacommunication facilities.

2.3. SOME ANALOGIES AND THEIR LIMITATIONS

Human-computer communication is comparable with communication between persons at least as far as the human is concerned. Consequently, many design proposals for human-machine communication aim at a superficial adaptation to natural dialogs. Examples are graphic and acoustic communication as well as certain linguistic metacommunication techniques, like abbreviations, or more generally the development of high-level programming languages which partially follow natural languages (if...then...else, while...do, etc.). The apparent analogies with natural dialog can mislead the unskilled user to assume a more fundamental similarity. The system ELIZA of J. Weizenbaum was frequently misunderstood as a demonstration of real human behaviour by a machine (cf. Weizenbaum, 1976).

The tool character may be considered an essential limit of the analogy between a dialog with a machine and a natural dialog. Any behaviour of the machine, be it ever so natural in its appearance, results from a planning process outside the computer, nondeterminacy included. Unlike humans, a computer does not communicate of its own free will but performs a communicating behaviour entrusted to it. The complexity of this programmed behaviour may be high and intransparent; its limits, however, are defined by computability. [Turing believed that human behaviour ultimately underlies the same limitation (Turing, 1950).] This limitation of machine behaviour by computability is a fundamental but abstract one. In addition, much effort is needed to get a clearer idea of the role of the computer.
The traditional paradigm of a calculating machine is too narrow and should be extended to a paradigm of a communication medium. An analysis of its role should not only refer to the hard- and software but has to include all persons involved in building, maintaining and using the computer. Dialogs between humans and machines refer to local processes and isolated applications. A characteristic of the communication point of view is the integration of the analysis of application problems, the design of real and virtual machines and their conversational or non-conversational use.

3. Communication as a general phenomenon

Communication is a general phenomenon in human life. So there is a great variety of definitions and ideas about it [see, for example, the 160 definitions reported in Merten (1977)]. According to its respective concepts and structures, every special discipline puts an emphasis on some specific aspects which are at best considered marginally by other disciplines. We will briefly characterize some of them.

3.1. MESSAGE TRANSMISSION

In 1945 C. E. Shannon and W. Weaver developed their Mathematical Theory of Communication (Shannon & Weaver, 1949). The theory reduces communication to generation and coding, transmission (including noise), decoding and reception of messages. This mathematical approach characterizes every single message by the probability of its occurrence. It allows exact statements about codes, error discovery, properties of transmission channels, etc. Problems of message interpretation are explicitly excluded.

3.2. COMMUNICATION AND ORGANIZATION

Organization theory views communication as some means for the organization of functional units with regard to some management objectives [see Grochla (1973) and his references]. Communication means the network of informational relations among active units (people, machines) which is the necessary precondition for co-operation. There is a strong relationship between information and decision making inasmuch as information is obtained as a result of decisions and is needed for further decisions. A distinction is made between formal and informal communication. Compared with informal communication, in formal communication some components are missing; it is restricted to the explicit and intentional communication of information and thereby becomes authentic and can be attributed to the sender. Informal communication can be indirect and ambiguous; it may, for example, serve for emotional expression or for social encouragement. Formal communication is again divided into bounded and free communication. In contrast to free communication, in bounded communication the paths, partners, frequencies, forms or means of communication are restricted.

C. A. Petri was one of the first informaticians to study computer use under the aspect of communication. Already in the ambiguous title of his doctoral dissertation "Kommunikation mit Automaten" (Communication with/by automata) (Petri, 1962) he addresses a problem without a generally accepted answer in informatics: is the automaton to be seen as a tool or a partner in communication?


Petri considers all appearances of the actual information flow as communication. To him information flow means the structural properties of effects, as studied in physics. In his attempt to develop a conceptual basis for an exact theory of communication, Petri's main tools are nets, at first especially place/transition nets. Later he, and other researchers, started to develop a general systems theory for information and communication systems (including a conceptual, a descriptive and a deductive component): the so-called general net theory (see Petri, 1980; Holt, 1976). According to Petri there will be a transformation of communication in society following from the introduction of interlinked computer-based information and communication systems. Automation is considered to be a communication problem insofar as tasks are delegated to other agencies: the computer serves as a medium for strictly organizable information flow. In this connection Petri believes mathematical information (transmission) theory and traditional automata theory to be insufficient and the ideas of communication science to be too human-oriented. So he proposes an informatics-specific approach. He tries to understand the phenomenon of communication by studying the 12 so-called "communication disciplines" given in Fig. 2. They are listed by increasing difficulty; disciplines on the same line are considered to be closely related to each other (Petri, 1977).

1. Synchronization      2. Identification
3. Addressing           4. Naming
5. Copying              6. Cancelling
7. Composition          8. Modelling
9. Authorization        10. Valuation
11. Delegation          12. Reorganization

FIG. 2. Communication disciplines.

For the majority of researchers in informatics the communication issue has been a minor point of interest up to now. A survey just from a technical point of view can be found in Steinbuch (1977).

3.3. HUMAN COMMUNICATING BEHAVIOUR

In the context of psychological stimulus-response schemata, communication is considered as a stimulus which, depending on given predispositions, produces effects in the recipient, which are reflected in his behaviour. Relevant predispositions are, for example, the recipient's self-image, his or her psycho-physical condition, expectations, internalized norms of conduct, etc. Sociology takes into special consideration that senders and recipients belong to certain social groups, which shape their norms and expectations. Psycho-linguistics and socio-linguistics combine concepts and ideas of different disciplines. Therefore they are more successful in approaching the communication issue than the respective single disciplines. Today there is no standardized concept to describe communication. But the number of people who believe that communication means the transmission of meaning or information (as something objective and well-defined) which can be determined without any reference to the individual sender or recipient is certainly decreasing.


Human communication does not only include statements about facts (semantic level). Concurrently, processes are going on at a pragmatic level which may at times be even more important than the factual content of a message (cf. Watzlawick, Beavin & Jackson, 1967; Fittkau, Müller-Wolf & Schulz von Thun, 1977). The more recent literature shows a clear trend to include pragmatic aspects in the description of communication (e.g. Gloy & Presch, 1975; Laing, Phillipson & Lee, 1966; Lewis, 1969).

Communication is seen as a complex of (social) actions for the purpose of mutual understanding and for allowing co-ordinated actions (co-operation). In the communication process everyone has to participate actively. The sender wants to achieve some effect in the recipient and plans his statement accordingly. In order to be comprehensible for the partner he orients himself by certain conventions and a partner model. In natural communication we follow social norms of conduct and take social roles which are then reflected in the applied syntactic, semantic and pragmatic linguistic conventions. In addition to these general conventions there are also special conversational conventions which develop during dialog: a certain choice of words, abbreviations, etc. Partner models help to predict behaviour. They also contain assumptions about the partner's expectations and lead to the so-called "expectability of expectations" (cf. Baacke, 1973). The recipient tries to reconstruct the planning process for the message: in order to comprehend its meaning fully he has to find out the factors of influence the partner has considered in formulating his statement. He also relies on the existing conventions and his partner model: what could be the partner's intentions, how would he normally express himself for this purpose? Obviously, mutual understanding in communication is based upon the observance of norms and the consideration of partner models. On the other hand, norms and models are modified in communication and new conventions and expectations evolve. This is partly due to the fact that human communication always includes metacommunication, i.e. the communication process itself can be made a topic of conversation.

3.4. A GENERAL CHARACTERIZATION OF COMMUNICATION

Communication in general happens in systems with several interacting agents performing specific acts: the exchange of messages by means of a medium. In contrast to real actions aiming at a modification of the real world, we call these actions constituting communication symbolic actions; after all, the messages refer to other, real, actions or facts. There is an obvious first difference between natural communication and human-machine communication with regard to the medium: the interaction language and the communication paths between agents have to be fixed precisely. The following six characteristics can easily be shown to be valid for natural communication; in the literature they have been discussed with varying degrees of specialization and emphasis (see Baacke, 1973; Merten, 1977; Watzlawick et al., 1967; Fittkau et al., 1977). For human-computer communication they will have to be validated in detail or interpreted differently. The six statements are not intended to cover every aspect of communication but they help to get a clearer impression of the similarities and differences between natural and what we will later call "formal" communication.


(1) Communication serves to co-ordinate (real and symbolic) actions of several agents.
(2) Communication is determined by the objectives of all participants (intentions).
(3) Communication depends on comparable premises for understanding (knowledge and conventions).
(4) Communication can refer to the communication process itself and to its preconditions (metacommunication).
(5) Communication is always coupled with expectations concerning the partner (partner model).
(6) In communication there is a trend towards economical behaviour.

A first application to human-machine communication reads as follows.

(1) Communication serves to co-ordinate (real and symbolic) actions of several agents

Participants in human-computer communication are, to begin with, systems designers and users who co-operate over a distance in space and time. The computer takes over well-defined tasks by delegation. So it appears as an agent of its own which follows a fixed procedure (virtual machine) and which can in principle perform real and symbolic actions. According to the kind of delegated action the computer can take the part of a dialog partner or of a complex medium which mediates between different users. In both cases communication takes place via a formalized language for input and output. The dialog types mentioned in section 2 are examples of different kinds of co-ordination between a user and a computer in its role as a dialog partner.

(2) Communication is determined by the objectives of all participants (intentions)

In their problem-solving processes designers and users try to get help from the system for the solution of sub-problems. These human intentions can be traced back to certain actual human needs. The computer as a machine does not have any needs and thus no intentions either. Strictly following programmed procedures, it only mirrors its designers' intentions. For a user the intentions of all people involved in the design of a complex computer system may melt into one another and form one (fictive) single intention; by its nature this intention may appear inconsistent and incomprehensible.

(3) Communication depends on comparable premises for understanding (knowledge and conventions)

Successful communication is based on a similar use of language and world knowledge by all participants. This does not mean that everyone has to have identical premises for understanding. Knowledge applied in natural communication is unformalized or not completely formalized. In human-computer communication the relevant knowledge is totally formalized. In contrast to the joint stepwise development and modification of dialog conventions in natural communication, language conventions in human-computer communication are exclusively defined by the systems designers.


They can only be modified if this has been foreseen and designed for by the designers: another one-sided decision.

(4) Communication can refer to the communication process itself and to its preconditions (metacommunication)

This characteristic is of special importance if we talk about computers as dialog partners. It refers, for example, to HELP functions, declarations, implicit default values and interrupts. Metacommunication is necessary to improve communication conditions; from the user's point of view it makes communication more efficient. As already mentioned, metacommunication with a computer is restricted to what the designers allow.

(5) Communication is always coupled with expectations concerning the partner (partner model)

The respective partner models of all participants influence the design and interpretation of statements. A user as the human partner in human-machine communication plans his own statements and interprets the system's output in the current dialog. He often has an anthropomorphic partner model (sometimes called "system model") and thus does not see the limitations of the system's behaviour that are a consequence of delegation. The information about the designers' intentions the user assembles in his system model may be very inconsistent and contradictory. The computer's "user model" consists of the accumulated user models of the designers; it may be modified within predefined limits. In natural communication the participants can mutually influence their partner models. To avoid misunderstanding in human-computer communication the user is forced to adapt to the computer's user model. The computer's statement design follows implemented rules: the designers have done the planning long before it becomes necessary in dialog. There is no real attempt to reconstruct the user's planning process: only a limited number of input alternatives have been implemented. The user can only modify the system's partner model if he can either communicate directly with the designers or if he can slip into the designer's role and reprogram the system.

(6) In communication there is a trend towards economical behaviour

The delegation of repeatable communication acts can itself be explained by economical reasons: the computer can perform these actions in a cheaper, quicker and more reliable way. The use of abbreviations, preformulated commands and implicit defaults are further examples. The designers' efforts towards economical behaviour, however, may turn out to be counterproductive, since to them it may be more important to complete the communication planning process (systems design) in time than to design good communication patterns for future system use.

4. Models of communicating behaviour

The preceding sections have shown that communication has many aspects and can be studied on several levels of abstraction. For a uniform treatment of communication phenomena in human-computer systems it is necessary to have a general model and a method which allow one to abstract from details (for example from the type of agents) and to interrelate different views. As indicated in section 3, C. A. Petri's general net theory provides a useful framework. It allows one to use nets with different interpretations and to interrelate them by net morphisms. Starting with a short introduction to interpreted nets we will develop a set of communication schemes which reflect the general characterization given in the last section. Based on these schemes the notions of formal communication and algorithmic communication are introduced and related to delegation and programming.

4.1. NETS, INTERPRETATIONS AND MORPHISMS

A net is a mathematical structure consisting of two non-empty disjoint sets of nodes, called S-elements and T-elements, respectively, and a binary relation F, called the flow relation. F connects only nodes of different types and leaves no node isolated.† Usually these components are graphically represented as in Fig. 3. An example is given in Fig. 4.

† A net can be formally defined as a structure N = (S, T; F) with (1) S ∪ T ≠ ∅, (2) S ∩ T = ∅, (3) F ⊆ (S × T) ∪ (T × S) and (4) domain(F) ∪ range(F) = S ∪ T.
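Purely as an illustration (our own sketch; the function name and the example net are invented, not taken from the paper), the footnote's definition translates directly into an executable check:

```python
# Minimal sketch of the footnote's net definition N = (S, T; F): the four
# conditions are checked explicitly over finite sets of nodes and arcs.

def is_net(S: set, T: set, F: set) -> bool:
    """Return True iff (S, T; F) satisfies the four net conditions."""
    nodes = S | T
    return (
        len(nodes) > 0                                          # (1) S ∪ T is non-empty
        and not (S & T)                                         # (2) S and T are disjoint
        and all((x in S and y in T) or (x in T and y in S)
                for (x, y) in F)                                # (3) F connects only nodes of different types
        and {x for (x, _) in F} | {y for (_, y) in F} == nodes  # (4) no node is left isolated
    )

# A two-node example (the net of Fig. 4 itself is not reproduced here):
assert is_net({"s1"}, {"t1"}, {("s1", "t1"), ("t1", "s1")})
```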

FIG. 3. Representation of net elements. S-elements: circle or ellipse; T-elements: square or rectangle; F-elements: arrow.

FIG. 4. Graphic representation of a net.

Nets can be interpreted by using a suitable pair of concepts for the sets S and T and a suitable interpretation for the flow relation F. The meaning of individual S- or T-elements is indicated by inscriptions added to the net. The interpretations as condition/event systems or place/transition nets (often called "Petri nets") are well known in the literature on nets (see Brauer, 1980). In this paper we use two higher level interpretations, channel/agency nets and means/activity nets, as introduced in Oberquelle (1980). The channel/agency interpretation allows one to describe the static structure of a system with several active and passive functional components. The following concepts are used.


S (circle): channel = passive functional unit able to store some means of the system;
T (square): agency = active functional unit consuming, transforming or producing means in the system;
F (arrow): access relation;
  A → C: agency A may put means into channel C;
  C → A: agency A may take means out of channel C.

This interpretation is closely related to means/activity nets where the interpretation is as follows:

S (circle): means = real or informational entity;
T (square): activity = (repeatable) action of a system;
F (arrow):
  a → m: activity a produces means m;
  m → a: activity a uses means m.
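As a small, hedged illustration (the node names USER, MACHINE and C are ours, not the authors'), the channel/agency reading can be written down for a trivial system in which a user agency puts messages into a channel and a machine agency takes them out:

```python
# Illustrative channel/agency net (invented names): one channel, two agencies,
# and the access relation read as "put into" / "take out of".

channels = {"C"}                        # S-elements: passive, may store means (messages)
agencies = {"USER", "MACHINE"}          # T-elements: active functional units
access   = {("USER", "C"),              # USER may put means into C
            ("C", "MACHINE")}           # MACHINE may take means out of C

for x, y in sorted(access):
    if x in agencies:
        print(f"{x} may put means into {y}")
    else:
        print(f"{y} may take means out of {x}")
```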

The means/activity interpretation is especially useful for the description of complex tasks independent of their distribution among channels and agencies. Both interpretations may be combined since each agency performs one or more activities, and each channel may contain one or more means. However, not every collection of means or activities corresponds to a meaningfully abstracted channel or agency, respectively.

The main operations (relations) between two nets are abstraction, embedding and folding. They can be formally defined as special net morphisms (see Genrich, Lautenbach & Thiagarajan, 1980). For this paper it will suffice to explain these concepts verbally and by examples. We say net N1 is an abstraction of net N2 if the S-elements (T-elements) of N1 are the image of a region of N2 with only S-elements (T-elements) as border elements and if the F-relation of N1 is induced by the F-relation of N2. N2 may then be called a refinement of N1. Both nets can be graphically represented in one figure; the F-elements of the image can be omitted (see Fig. 5).

FIG. 5. Example for abstraction/refinement. -, Refined net; - - -, abstract net (image).


The embedding of a net N1 in a net N2 is an operation which identifies N1 with a subnet of N2. Read in the other direction it may be called an extraction (see Fig. 6).

FIG. 6. Example for embedding/extraction. -, Embedded net; - - -, rest of image.

Folding a net means to map S-elements onto S-elements and T-elements onto T-elements while keeping the F-structure. This operation is useful to point out the relation between a process (e.g. a dialog) and a system which shows this behaviour (see Fig. 7).

FIG. 7. Example for folding/unfolding. -, Unfolded net; . . . ., folded net (F omitted).
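A folding can be sketched in code as well (our own illustration; the nets and the mapping are invented): a mapping of nodes qualifies as a folding if it sends S-elements to S-elements, T-elements to T-elements, and maps every arc of the flow relation onto an arc of the image net.

```python
# Sketch of a folding check between two nets (S1, T1; F1) and (S2, T2; F2).

def is_folding(S1, T1, F1, S2, T2, F2, phi):
    """phi: dict mapping every node of net 1 onto a node of net 2."""
    return (
        all(phi[s] in S2 for s in S1)                     # S-elements go to S-elements
        and all(phi[t] in T2 for t in T1)                 # T-elements go to T-elements
        and all((phi[x], phi[y]) in F2 for (x, y) in F1)  # the F-structure is kept
    )

# Folding a two-step dialog process onto one repeatable exchange:
S1, T1 = {"m1", "m2"}, {"a1", "a2"}
F1 = {("a1", "m1"), ("m1", "a2"), ("a2", "m2")}
S2, T2, F2 = {"m"}, {"a"}, {("a", "m"), ("m", "a")}
phi = {"m1": "m", "m2": "m", "a1": "a", "a2": "a"}
assert is_folding(S1, T1, F1, S2, T2, F2, phi)
```

This mirrors the use of folding described above: the unfolded net is the process (a dialog), the folded net the system which shows this behaviour.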

4.2. COMMUNICATION SCHEMES

Using interpreted nets and net morphisms we can construct a sequence of schemes which reflect the six characteristics of communication (see section 3.4). They also include important special cases which have been treated independently up to now. By means of channel/agency nets and refinements we will first illustrate the structure of the system in which communication takes place. The means relevant for a communication act are then represented by means/activity nets in further refinements.

SCHEME 1. Two partners (agencies) P1, P2 are connected by the surrounding reality R (channel) through which they can interact.

SCHEME 2. Separation of a communication channel C from the environment E to which actions primarily refer (refinement of R).

The models for information transmission of Shannon & Weaver (1949) and for feedback via the environment (see below) can be embedded in this net.

SCHEME 2A. Refinement showing sender, channel, receiver and environment.

SCHEME 3. Separation of a storage component from an active functional unit inside the two partners.

This refinement includes the dialog processing system (see Oberquelle, 1980) which becomes apparent as a subsystem if we refine channel C.

SCHEME 3A. Refinement of channel C (dialog processing subsystem).

SCHEME 4. Representation of the fact that P1 and P2 access only partially overlapping environments.


The scheme includes the practically relevant cases of disjoint environments (Ec missing) and of environments contained in each other (e.g. Ec = E2).

SCHEME 5. Switching from system structure to the corresponding means and activities. ai, activity of Pi (i = 1, 2); zi, state of Pi; m, message; s, situation (refinable into s1, s2, sa, sb, sc according to Scheme 4).

SCHEME 6. Refinement of the states zi (i = 1, 2) into components which reflect the pragmatic characteristics. ki, conventions for messages m; sii, self-image of Pi; ii, intentions; pmi, partner model; wi, other knowledge, containing knowledge about s.

The conventions of the two partners must be sufficiently similar to make communication possible.

4.3. FORMAL AND ALGORITHMIC COMMUNICATION

The schemes of the last section form a basis for the discussion of communication between two agents of a system. Communication between more than two agents can also be treated but makes further refinements necessary. In informatics we are especially interested in the case where one agent communicates strictly according to predefined rules. We define: an agent shows formal communicating behaviour if all components relevant for his behaviour (cf. Schemes 5 and 6) can be described by mathematical models (well-defined sets and functions). In the most general case formal communicating behaviour means that all state components of Scheme 6 may be accessed and modified (as far as allowed by the models). In special cases some components may be constant or missing.

Formal communicating behaviour of one agent influences the communicating behaviour of his partners. If they do not want to run the risk of misunderstanding they are forced to communicate formally as well: they cannot leave the formal model and are restricted to a formal role. If at least one of the communicating agents shows formal communicating behaviour we call the whole process formal communication.

To understand the reasons for formal communicating behaviour we have to look at the social process in which it is established. The central activity is delegation, one of the twelve "communication disciplines" of Petri (1977). To delegate something normally means to hand over some function to a person (or a group of persons). This usually leaves a margin for performing the delegated task. A basic requirement for delegation is that the delegated behaviour and its preconditions can be sufficiently (in the extreme case: formally) described. Otherwise it would be impossible to control it. Controllability is necessary since the responsibility for the delegated behaviour in principle remains with the delegator. Delegation of actions is normally restricted to so-called standard cases. For exceptional cases it must be clear who is responsible for their detection and what has to happen. The delegation of tasks means the definition of some role. The agent who takes this role is restricted by it in his behaviour, which in the extreme case becomes formal as defined above.

A typical example of formal, delegated communication between persons is the communication between a citizen and an inflexible civil servant. If the latter follows his instructions exactly he shows formal behaviour. If the citizen has a very special problem (exceptional case) the civil servant who sticks to his role will not be able to help him but will direct him to his superior.

A special case of delegation is the programming of behaviour. Here the behaviour and its preconditions do not only have to be formally describable but must be explicitly described algorithmically. We define: an agent shows algorithmic communicating behaviour if all his relevant components are explicitly described by enumerable sets and computable functions. Formal communication based on algorithmic communicating behaviour is called algorithmic communication.

Human-computer dialogs are a typical example. The machine strictly shows the programmed functional communicating behaviour delegated to it by its designers and programmers. Unlike the case of delegation to humans, all possible reactions must be precisely defined (margin = zero). The machine is unable to cope appropriately with events unforeseen at delegation time. In contrast to formally communicating humans the machine is unable to leave its role, for example, to call for first aid if the user becomes unconscious. Especially if the mechanism for reacting to exceptional cases is incomplete and if communication with the delegator is impossible, the user is forced to adapt to the programmed algorithmic communicating behaviour of the machine. The user cannot overcome the actual limitations by influencing the relevant components of the partner (e.g. conventions or partner model): usually they are fixed and hidden in the programs and can hardly be modified.
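To make the definition concrete, here is a deliberately small sketch (our own invented names and data, not the authors' notation): an agent whose Scheme 6-style components (conventions k, partner model pm, other knowledge w) are explicit, enumerable structures and whose reaction is a computable function. The fixed error reply for inputs outside the conventions reflects the "margin = zero" property of delegation to a machine.

```python
# Illustrative sketch only: an agent with algorithmic communicating behaviour.
# All state components are explicit, finite structures; the reaction is a
# computable function; inputs outside the conventions get a fixed error reply.

from dataclasses import dataclass, field

@dataclass
class AlgorithmicAgent:
    k: set = field(default_factory=lambda: {"help", "date"})        # conventions: admissible inputs
    pm: dict = field(default_factory=lambda: {"user": "novice"})    # partner model (fixed at design time)
    w: dict = field(default_factory=lambda: {"date": "1983-01-01"}) # other knowledge

    def react(self, message: str) -> str:
        """Computable response function over an enumerable set of messages."""
        if message not in self.k:
            return "ERROR: input violates the current conventions."  # no margin, no exception handling
        if message == "help":                                        # a built-in metacommunication step
            return "Admissible inputs: " + ", ".join(sorted(self.k))
        return self.w["date"]

agent = AlgorithmicAgent()
print(agent.react("help"))
print(agent.react("my child is ill"))   # outside the delegated role: only the fixed error reply
```

A formally (but not algorithmically) communicating human in the same role could at least leave the role in an exceptional case; the sketch cannot.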


This situation is the result of structuring development and usage of dialog systems in a way which neglects some communication necessities: the future users are hardly ever engaged in the development and cannot effectively influence conventions and partner models of the designers. The results are incomprehensible programs and documentation (see Fig. 8).

FIG. 8. Design and implementation as a step without communication between designer and future user.

Conventions, partner model and intentions of the designers are not all stated explicitly and are partially contained only in the documentation. The documentation cannot be fully accessed in the dialog in a situation-dependent manner. During the usage of the programmed dialog system there is often no communication link to the designers or delegators. This prevents the user from improving conventions or partner model by having programs, data or documentation modified (see Fig. 9).

FIG. 9. Using the dialog system.

The situation is still more complex than in interpersonal formal communication: dialog systems are usually designed, programmed and reprogrammed by several people who quite independently develop, for instance, an operating system, compilers, editors, database management routines, application packages, etc. They all have different user models, employ different conventions and follow different intentions. They design different roles which are often inconsistent in their communicating behaviour. Since users hardly ever work with one of these programs alone, they are confronted with a very complex, intransparent "partner", a model of which is hard for them to develop by themselves. Only in recent years have systems been discussed and designed which have some simple kind of self-image (system model) which is used to explain the system's behaviour to the user.

To improve the situation in human-machine dialogs it seems necessary: to structure dialog systems as agents with algorithmic communicating behaviour starting from Schemes 5 and 6; to study a larger system including designers, delegators and users; and to review the whole communication process, including delegation and reorganization at the beginning and modification during usage, against the background of a general concept of communication.

5. Human-machine co-operation

Initially, information processing systems were developed to perform algorithmic calculations. This includes the ability to communicate algorithmically, as was realized in practice very soon. Theoretically it is a consequence of the universality of the concept of an algorithmic machine: by programming, algorithmic communicating behaviour of arbitrary complexity can be delegated to a universal machine. In practice this leads to the attempt to approximate communicating behaviour, for example of humans, algorithmically. The better the approximation, the more the machine looks like an autonomous agent. Usually many programs of different authors work together and are modified during their lifetime. This produces a communicating behaviour without a human prototype. We call this phenomenon the virtual autonomy of the machine. It is reinforced by the fact that complex software systems can hardly be modified by their users. The inability of humans to store complex information safely themselves leads to a further increased virtual autonomy of the machine in the user's perception.

According to our characterization of communication in general, human intentions cannot be described algorithmically. Consequently we cannot talk about the communication intention of the machine. But it makes sense to associate a virtual communication intention with a virtually independent machine. It is the accumulation of all intentions of its programmers. A quite new role of the machine follows from this. It is not a simple medium or a tool but an agent with a function, complicated in many regards, in a communication process interconnecting several persons and machines in time and space. Under a temporal aspect the actual communicating behaviour of the machine can be seen as the result of a preceding communication process. We call such long-term communication processes with embedded virtually autonomous computer systems human-machine co-operation (Kupka, 1981).


5.1. DELEGATION OF COMMUNICATING BEHAVIOUR: SOME SIMPLE CASES

Delegation is an important part of human-machine co-operation. We discuss some simple cases, starting with a simplified scheme for a human agent which does not explicitly contain partner model and intentions.

SCHEME 7. Simplified model of a human agent.

One principally important case of delegation is the delegation of storing. Externally stored knowledge is often called "data". A first step is storing on directly accessible external media, for example on paper. Using a storage medium only accessible by the machine leads to a situation where accesses are embedded in a communication process with that machine. The conventions for this communication are set, and can be modified, by the programmers of the machine. For other users they are invariable constraints.

SCHEME 8. Delegation of knowledge storing. a, a', a'', activities of the human agent; am, activity of the machine.

The second step of the above scheme illustrates the situation after delegation has taken place. The act of delegation as yet another communication process has not been represented in this scheme. The decomposition of the processor into conventions and machine activity is a virtual one and shows the relation to Scheme 7.


The main benefit of storing inside a machine stems from its larger capacity, quicker access and sharability. A precondition for information sharing is a communication process between the original user and new users in which at least the conventions for using the machine must be transmitted. The same knowledge (data) may be accessed by different users according to different conventions which can be algorithmically transformed into each other. Algorithmic data transformation is an important task of the "computing" machine. It is based on the delegation of functional activities.
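As a hedged, invented example of such an algorithmic transformation between conventions (the date formats and function names are ours), the same stored knowledge can be presented to two users under different conventions that are interconvertible by simple computable functions:

```python
# Invented example: the same stored knowledge (a date) accessed under two
# different user conventions, with algorithmic transformations between them.

stored = "1983-08-16"                       # convention A: YYYY-MM-DD

def a_to_b(date_a: str) -> str:
    """Transform convention A (YYYY-MM-DD) into convention B (DD.MM.YYYY)."""
    year, month, day = date_a.split("-")
    return f"{day}.{month}.{year}"

def b_to_a(date_b: str) -> str:
    """Transform convention B back into convention A."""
    day, month, year = date_b.split(".")
    return f"{year}-{month}-{day}"

assert b_to_a(a_to_b(stored)) == stored     # the two conventions are interconvertible
print(a_to_b(stored))                       # user B reads: 16.08.1983
```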

SCHEME 9. Delegation of a functional activity. f, functional activity; g, supplement of f to get the full activity a of HA; a', activity remaining with HA; d, data.

The delegation of functional activities again requires the definition of conventions for the exchange of messages. Objectification, improved efficiency and easy multiplication are the benefits. The main problem is the need for algorithmic descriptions of f. Usually standard cases can be formally described; the recognition of exceptional cases mostly cannot be formulated in advance. The delegation of functional activities often produces more general systems which are of economic interest but less comprehensible.

Structurally similar to the situation of Scheme 9 is the delegation of accessing the environment (e.g. in process control systems). It can be described as a transformation of a system with one control circuit into a system with two circuits, one of which is open: control of the environment by the human agent is substituted by automatic control of the environment by the computer, which in turn is controlled by the human agent. By viewing this process as a communication process we stress the importance of the necessary conventions for message exchange. The main benefit is the relief from very fast reactions.

The delegation of algorithmic communicating behaviour is shown in our third example, where we have an algorithmic dependency between knowledge, conventions and messages.

SCHEME 10. Delegation of algorithmic communicating behaviour.

Here we have two coupled communication processes. One is concerned with external communication (k, m), the other one controls the delegated behaviour (k', m').

5.2. COMMUNICATION NETWORKS

An important characteristic of the schemes in the previous section is the constancy and equality of the conventions of the two partners. This describes the quite special situation where the machine is programmed exactly according to the intentions of the programmer (and need not be reprogrammed or modified) and where the user masters the conventions perfectly. In practice these assumptions are only seldom valid, due to errors and forgetfulness. The situation is far from ideal in the following important cases: the user is not the programmer and has to learn the conventions for human-machine communication in a separate communication process; or several programmers successively modify the communicating behaviour of the machine. In both cases several communication processes take place in a network with several agents. Metacommunication plays an important role in these processes.

SCHEME 11. Network of three agents.


Scheme 11 shows two human agents and a machine. Both humans may work as delegators (programmers) and users. They may use the machine for co-operation and may modify its communicating behaviour. Additionally they can communicate directly. The scheme shows some typical network phenomena. At least one communication channel with corresponding conventions for its use belongs to each communication link. One communication link may be the subject of a communication on another link. For example, the conventions for the communication of HA2 (as user) and the MACHINE may be talked about in a dialog between HA1 (as the programmer of the MACHINE) and HA2. This is a kind of metacommunication. It is also possible to connect several agents by one single channel only. For example, HA1 and HA2 may use the same terminal. If the messages and conventions are insufficient to identify the individual users then there is only one (virtual) partner from the machine's point of view.

SCHEME 12. Dialog with metacommunication. 1st phase: building up the complete convention (metacommunication); 2nd phase: normal usage.


An increased complexity of such networks makes it harder to transmit the conventions for machine usage via a separate channel. In this situation the self-explaining machine becomes important. This concept means that all conventions (and further information) can be learned by the user in a metadialog with the system; the user only has to know some minimal convention in advance. This situation is illustrated in Scheme 12, which can be folded into a scheme with two agents who resemble the agent of Scheme 7.

For many fields of application the embedding of information processing by machines into complex communication networks is much more important than the development of algorithms. Software engineering is a good example: portability and adaptability, standardization and documentation (four communication tasks) are of great interest. Practitioners criticize that these areas are seldom studied in software engineering research (see Schnupp, 1980). Also in the field of interactive software, the long-term co-operation of designers, systems programmers, application programmers and application specialists must be carefully studied in addition to the dialogs with the users. Taking the characteristics of communication seriously may help to make the right design decisions.
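As a concrete, if anachronistic, illustration of the self-explaining machine described above, the following sketch (hypothetical present-day Python; the commands "help", "store" and "recall" and the convention table are invented for illustration) lets the user build up the complete convention in a metadialog, knowing only the minimal convention "help" in advance, before normal usage begins, as in the two phases of Scheme 12.

```python
# Sketch of a self-explaining machine (cf. Scheme 12).
# Phase 1 (metacommunication): the user builds up the complete convention,
# knowing only the minimal convention "help" in advance.
# Phase 2 (normal usage): the learned conventions are applied.

CONVENTIONS = {
    "help":           "list all available commands (minimal convention)",
    "help <command>": "explain the convention for <command>",
    "store <x>":      "store the value <x>",
    "recall":         "return the stored value",
}

store = []

def machine(message: str) -> str:
    if message == "help":                               # metadialog
        return "known commands: " + ", ".join(CONVENTIONS)
    if message.startswith("help "):                     # metadialog
        cmd = message[len("help "):]
        return CONVENTIONS.get(cmd, CONVENTIONS.get(cmd + " <x>", "unknown command"))
    if message.startswith("store "):                    # normal usage
        store.append(message[len("store "):])
        return "stored"
    if message == "recall":                             # normal usage
        return store[-1] if store else "nothing stored"
    return "unknown message; type 'help'"

# 1st phase: metacommunication
print(machine("help"))
print(machine("help recall"))
# 2nd phase: normal usage
print(machine("store 42"))
print(machine("recall"))
```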

References

BAACKE, D. (1973). Kommunikation und Kompetenz. München: Juventa.
BRAUER, W., Ed. (1980). Net Theory and Applications. Lecture Notes in Computer Science, vol. 84. Berlin, Heidelberg, New York: Springer.
DEHNING, W., ESSIG, H. & MAASS, S. (1981). The Adaptation of Virtual Man-Computer Interfaces to User Requirements in Dialogs. Lecture Notes in Computer Science, vol. 110. Berlin, Heidelberg, New York: Springer.
FELLBAUM, K. (1980). Entwicklungstendenzen zukünftiger Text- und Sprachkommunikation. In Österreichische Gesellschaft für Informatik, Ed., Informationssysteme für die 80er Jahre, pp. 424-451. Linz: Johannes Kepler Universität Linz.
FITTKAU, B., MÜLLER-WOLF, H.-M. & SCHULZ VON THUN, F. (1977). Kommunizieren Lernen (und Umlernen). Braunschweig: Westermann.
GANZHORN, K. (1979). Beziehungen zwischen Informationsverarbeitung und Kommunikation. In ENDRES, A. & SCHÜNEMANN, C., Eds, Informationsverarbeitung und Kommunikation, pp. 9-23. München, Wien: Oldenbourg.
GENRICH, H. J., LAUTENBACH, K. & THIAGARAJAN, P. S. (1980). Elements of general net theory. In BRAUER, W., Ed., Net Theory and Applications, pp. 21-163. Lecture Notes in Computer Science, vol. 84. Berlin, Heidelberg, New York: Springer.
GLOY, K. & PRESCH, G. (1975). Sprachnormen, vols 1-3. Stuttgart: Frommann-Holzboog.
GROCHLA, E., Ed. (1973). Handwörterbuch der Organisation. Stuttgart: Poeschel.
HOFFMANN, H.-J. (1977). Betrachtungen zum Entwurf interaktiver Systeme. In BLASER, A. & HACKL, C., Eds, Interactive Systems. Lecture Notes in Computer Science, vol. 49, pp. 38-91. Berlin, Heidelberg, New York: Springer.
HOLT, A. W. (1976). Formal methods in systems analysis. In SHAW, B., Ed., Computers and the Educated Individual. Proceedings Joint IBM University of Newcastle upon Tyne Seminar, pp. 135-179. University of Newcastle upon Tyne, Computing Laboratory.
KAISER, W., LANGE, B. P., LANGENBUCHER, W. R., LERCHE, P. & WITTE, E. (1978). Kabelkommunikation und Informationsvielfalt. München, Wien: Oldenbourg.
KLERER, M. & REINFELDS, J., Eds (1968). Interactive Systems for Experimental Applied Mathematics. New York, London: Academic Press.
KUPKA, I. (1974). Über die Syntax von Dialogsprachen. In SCHLENDER, B. & FRIELINGHAUS, W., Eds, GI, 3. Fachtagung über Programmiersprachen. Lecture Notes in Computer Science, vol. 7, pp. 45-54. Berlin, Heidelberg, New York: Springer.


KUPKA, I. (1981). Pragmatics of information and man-machine cooperation. In LASKER, G. E., Ed., Applied Systems and Cybernetics, vol. V, pp. 2303-2309. New York: Pergamon Press.
KUPKA, I. & WILSING, N. (1980). Conversational Languages. Chichester, New York, Brisbane, Toronto: John Wiley & Sons.
LAING, R. D., PHILLIPSON, H. & LEE, A. R. (1966). Interpersonal Perception. London, New York: Tavistock.
LEWIS, D. (1969). Convention: A Philosophical Study. Cambridge, Massachusetts: Harvard University Press.
MERTEN, K. (1977). Kommunikation. Opladen: Westdeutscher Verlag.
NAFFAH, N. (1979). Exemple d'un poste de travail burotique à interface universelle. MEV.2.503, Projet Pilote KAYAK. Paris: IRIA.
OBERQUELLE, H. (1980). Nets as a tool in teaching and terminology work. In BRAUER, W., Ed., Net Theory and Applications, pp. 481-506. Lecture Notes in Computer Science, vol. 84. Berlin, Heidelberg, New York: Springer.
PETRI, C. A. (1962). Kommunikation mit Automaten. Dissertation, Universität Bonn, Institut für instrumentelle Mathematik.
PETRI, C. A. (1977). Communication disciplines. In SHAW, B., Ed., Computing System Design. Proceedings Joint IBM University of Newcastle upon Tyne Seminar, pp. 171-183. University of Newcastle upon Tyne, Computing Laboratory.
PETRI, C. A. (1980). Introduction to general net theory. In BRAUER, W., Ed., Net Theory and Applications, pp. 1-19. Lecture Notes in Computer Science, vol. 84. Berlin, Heidelberg, New York: Springer.
SCHNUPP, P. (1980). Softwaretechnologie für den kommerziellen Anwender: Bringen die 80er Jahre einen Paradigmenwechsel? In NIEDEREICHHOLZ, J., Ed., EDV-Anwendungen in den 80er Jahren, pp. 37-50. Universität Frankfurt am Main, Institut für Wirtschaftsinformatik.
SHANNON, C. E. & WEAVER, W. (1949). The Mathematical Theory of Communication. Urbana: University of Illinois Press.
STEIN, K. U. (1978). Grenzen der Grossintegration durch deterministische und stochastische Prozesse. Proceedings GI 8. Jahrestagung, Informatik-Fachberichte, vol. 16, pp. 149-174. Berlin, Heidelberg, New York: Springer.
STEINBUCH, K. (1977). Kommunikationstechnik. Berlin, Heidelberg, New York: Springer.
TIETZ, W. (1980). Datenkommunikation im Umfeld digitaler Technologien. In Österreichische Gesellschaft für Informatik, Ed., Informationssysteme für die 80er Jahre, pp. 481-491. Linz: Johannes Kepler Universität Linz.
TURING, A. M. (1950). Computing machinery and intelligence. Mind, vol. 59, pp. 433-460. Reprinted 1963 in FEIGENBAUM, E. A. & FELDMAN, J., Eds, Computers and Thought, pp. 11-35. New York, San Francisco, Toronto, London, Sydney: McGraw-Hill.
TURN, R. (1974). Computers in the 1980s. In ROSENFELD, J. E., Ed., Information Processing 74, pp. 137-140. Amsterdam, New York: North-Holland.
WATZLAWICK, P., BEAVIN, J. H. & JACKSON, D. D. (1967). Pragmatics of Human Communication. A Study of Interactional Patterns, Pathologies, and Paradoxes. New York: Norton & Company.
WEIZENBAUM, J. (1976). Computer Power and Human Reason. From Judgment to Calculation. San Francisco: Freeman.