Int. J. Man-Machine Studies (1982) 16, 287-299
Towards self-adaptive interface systems

P. R. INNOCENT
School of Mathematics, Computing and Statistics, Leicester Polytechnic, Leicester, U.K.

(Received 10 June 1981, and in revised form 21 September 1981)

This paper follows a trend towards more user-oriented design approaches to interactive computer systems. The implicit goal in this trend is the development of more "natural" systems. Design approaches should aim at a system capable of continuous change by means of suitable agents. The term "soft facade" is used to describe a modifiable interface to a system. Facades vary in softness, and agents for change can be the systems designer, the user or an expert system in the facade. In the latter case, the system is called a self-adaptive interface system. The conditions under which a self-adaptive interface system is desirable are briefly considered and discussed in relation to a simple example. Recent research in artificial intelligence is mentioned as a possible source of proposed components for a self-adaptive interface system.
1. Introduction

The ideas presented in this paper emphasize the needs of the user rather than the availability of technology in the production of reliable and usable systems. The argument is based on a brief, selective history of design approaches and the main factors influencing them. Lewis (1973) suggested that a useful way of thinking about a system, in order to design a reliable and trustworthy one, is to consider an analogy with a building. The building has a sound structure which must support its functions reliably. In addition, it has a facade: the necessary additions to the structure that make the building usable. Figure 1 shows a representation of this situation.
FIG. 1. A system viewed as a building.

The facade is the system as it appears to a user, and must be designed "for ease of use and understanding, simple order amid complexity, clarity of paths of control and a high standard of response--or apparent intelligence--on the part of the system" (Lewis, 1973). The facade of a system contains all those aspects such as documentation and user-help facilities, as well as those aspects which protect privacy and system integrity,
etc. The consideration of the man-computer interface as an integral part of the facade is a response by designers to cope with the larger variety in users, tasks and systems that the direct user interactive system facilitates. It is part of the trend towards systems design methods which are predominantly user-oriented. Technological developments such as computer networks lead the designer to conceptualize the facade of a system as a functionally distinct entity of the total system which can be designed as such. Figure 2 shows this view.
FIG. 2. A facade as a functionally distinct sub-system.
Early methods for designing "dialogues" (e.g. Martin, 1973) had the purpose of producing stable facades on the basis of rigorous analysis of the user's purpose, tasks and inherent abilities as a general information processor. Little emphasis was given to the need for implementing a design in such a way that it could easily be modified, particularly by users. Systems have nevertheless been designed, in response to users' needs for extensible and personalized interfaces, which allow users to re-program the system in a number of ways. The following list is not exhaustive but shows the response of designers over the last decade.

(i) The conversational mode can be altered to allow varying degrees of user control. For example, the naive user can change from a computer menu-driven dialogue to a user-dominated command-driven dialogue as he gains experience.

(ii) The channel of communication can be altered to suit the speed/accuracy characteristics required by the user's tasks. For example, the user can employ a VDU and light-pen for menu selection rather than a teletype or similar input/output devices.

(iii) Convenient clusters of user requests can be grouped together in such a way that they can be quickly and easily invoked by the user. For example, a "macro" or command file facility can be used to run particular programs in a particular sequence by specifying the name of a file containing the program names (a sketch of such a facility follows below).

(iv) Users can change the system interface so that the same terminal can be attached to different host systems. For example, the terminal can be made into a network node by setting up appropriate communications parameters and loading suitable emulators.

(v) Users can change particular parameters of information display and presentation. For example, a particular print font can be changed for printer output.

The tasks necessary for users to personalize a system, described above, vary in complexity and hence require different user characteristics and effort. The extra effort to personalize a system may be considered too great for the occasional user, even though a personalized system is desired. Put another way, the extra effort may be considered as an initial price to pay for long-term benefits, and has to be evaluated by the user.
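To make item (iii) of the list above concrete, here is a minimal sketch of a command-file ("macro") facility. It is an illustration only: the one-program-name-per-line file format, the comment convention and the function name are invented for this sketch and do not describe any particular system of the period.

```python
import subprocess

def run_command_file(path):
    """Run the programs named in a command file, one per line, in order.

    A minimal sketch of the 'macro' facility in item (iii); the file
    format (one program name per line, '#' for comments) is an assumption.
    """
    with open(path) as f:
        for line in f:
            name = line.strip()
            if not name or name.startswith("#"):
                continue  # skip blank lines and comments
            subprocess.run([name], check=True)  # invoke each program in sequence

# Usage: a file "analysis.cmd" listing the programs to run in sequence
# run_command_file("analysis.cmd")
```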
This evaluation may be considered as a continuous process carried out by the more-frequent user. A useful concept was proposed by Carbonell et al. (1968): a designer should consider "the total work accomplished with respect to a variety of tasks (one or more of them on-line) that are competing for man's attention. Assume that the human operator weights the different tasks and thus derives what we may call a set of 'costs' of not performing them."
It appears that designers should produce systems which minimize the effort required to personalize a system, so that the user weights the associated tasks (i)-(v) lightly compared with how they would be weighted on a less easily personalized system. This will result in a set of costs which favours the personalized system over the less-personalized one. This may be achieved by providing the user with suitable tools which are matched to his expertise and the tasks in hand.

For the computer-oriented, more expert user, terminals can be used to produce facades with degrees of "smartness". They can be programmed independently to perform various functions, such as emulating other terminals or simply acting as a stand-alone micro-computer. Sophisticated specialist input/output components, such as programmable keyboards and speech modules, are available as sub-units to these smart terminals. Programmable terminals with "soft" input/output modules present a solution for the designer to the problem of coping with the variety in users, tasks and systems, by allowing a facade to be tailored to suit a particular context. However, easy-to-use tools for tailoring are necessary for use in suitable design approaches. Edmonds & Guest (1978) describe such a tool and how it has been and could be used; Newell (1980) describes some approaches to the design of interactive systems wherein such tools will be useful. In this paper, the term "soft facade" is used to mean a facade that can be easily tailored.

However, expert computer users are a minority of the potential user population. A majority of users may be expected to require facades which can be changed with little or no expert computer skill. In these cases a personalized system can be achieved by making a computer system adapt itself. Soft facades are a necessary pre-requisite for developing adaptive systems. Stability of the facade may interact with the stability of the user, in terms of knowledge and experience, in such a way that the total man/computer system is unstable. That is, each change to the facade may upset the confidence of the user, who expects a facade to be consistent and uniform rather than adaptive and changeable. The balance between the need for a consistent and uniform facade and the need for change to suit user, task and system environments is a delicate one. Little is known about the factors affecting this balance and how it affects the ease of use and learning of an interactive system. Some investigations have been carried out (e.g. Innocent, 1977) which give some indication of the likely problems with soft facades. L. A. Miller & Thomas (1977) discuss some of the relevant behavioural issues, and there are many similar discussion papers and reports in this area.

A problem for the designer of "natural" systems such as these (see Fitter, 1979) is how, when and by whom should the soft facade be tailored? Human factors engineers such as Shackel (1969) and Chapanis (1976) are experimenting to discover how a facade affects man-computer interaction, and this will be useful for assessing
how and when a facade should be changed. For example, R. B. Miller (1968) is often used as a source for guidelines on system response times. There are three possible agents through which a soft facade can be tailored; this section has concentrated on the systems designer, but both the user and the system (including the facade) can also effect changes.
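Carbonell et al.'s notion of a cost set can be made concrete with a small illustration: the same task weights applied to two systems, one easy and one hard to tailor. This is a hedged sketch only; the weights, effort figures and function name are invented for illustration and do not come from the original study.

```python
def total_cost(task_weights, efforts):
    """Sum of weight * effort over competing tasks (after Carbonell et al., 1968).

    Weights express the cost of not performing a task; efforts are the
    user's estimates of what each task demands. All numbers here are
    illustrative assumptions.
    """
    return sum(task_weights[t] * efforts[t] for t in task_weights)

# Hypothetical figures: tailoring is cheap on the easily personalized system,
# so its cost set favours personalization over the plain system.
weights      = {"solve_problem": 1.0, "tailor_facade": 0.3}
plain        = {"solve_problem": 8.0, "tailor_facade": 6.0}  # hard-to-tailor system
personalized = {"solve_problem": 5.0, "tailor_facade": 1.0}  # easy-to-tailor system

assert total_cost(weights, personalized) < total_cost(weights, plain)
```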
2. Facade modification by external agents

2.1. USER MODIFIED FACADES

Tasks (i)-(v) outlined in section 1 are examples of how a user can re-program a facade. These are additional to the tasks that the user actually wants to carry out in an interactive session, such as solving a problem or obtaining sales information. The tasks vary in difficulty and require different levels of expertise on the part of the user. To this extent they may be thought of as an overhead cost of personalizing the system so that it suits the user and his tasks better. The cost/benefit of tailoring therefore seems to depend on, for example, the ease of use of tools for tailoring and the benefits of using them. The former may depend on both task and user factors, such as repetition of tasks and competence of the user.

2.2. DESIGNER MODIFIED FACADES

Most changes to a facade are initiated or triggered by requests or complaints to the designer. Since this is not an unusual occurrence, Shneiderman (1979, 1980) has emphasized and argued a case for an experimental approach to the development of usable systems. Since the product of an experiment is a modification, emphasis should also be given to the production of easy-to-use tools for the quick modification and implementation of dialogues (e.g. Edmonds & Guest, 1978). This approach is a response to the recognition that user behaviour on an interactive system is necessarily partially unpredictable, and that an appropriate design strategy is therefore required. The implication is that designers need sophisticated models of users which allow them to evaluate user information and behaviour. The growth in related study areas such as psychology has played a major part in influencing the development of thinking about users as information processors, and has provided suitable concepts for considering users as people with particular ways of thinking and behaving. For example, G. A. Miller, Galanter & Pribram (1960) suggested that people develop images which are used to form plans which, in turn, structure behaviour. The images are necessary for building a "cognitive map" which is used in guiding behaviour. For the user of an interactive system, a suitable image is used to generate part of this map, which is continuously monitored and changed to suit experience. Other parts of the map are concerned with the user's goals and the context in which he is working. Ideally a designer should attempt to build a facade which allows users to have images which can be easily integrated within a particular context of other images and so lead to harmony. This implies that the designer needs to build an image of the cognitive map of a user, which can then determine the designer's behaviour in the production of an acceptable facade. This approach is mainly successful for those stable parts of the cognitive map which can be isolated and characterized.
The map contains procedural and declarative knowledge which allows users to make decisions and use the system. If the designer can make this map explicit, then the facade can be constructed in such a way that it incorporates the user model of the system. If this occurs, then a good match can be obtained between user expectations of the system and the appearance of the system through the facade. This is more than a cosmetic exercise which simply alters the appearance of the system. The "user model" in the facade is to some extent a model of the user, in that its contents and structure at any particular time during interaction will reflect the knowledge, skill and purposes of the user. The problem for a designer is that the user is likely to change both within and between interactive sessions with a computer. Hence, there is a need for modification of the model at different times. Some tools have been developed in psychology for eliciting user models and tested as a design aid for interactive systems (Pearce & Easterby, 1973). Other tools have been developed by computer scientists which enable easy modification of dialogue in an implemented system between sessions (e.g. Vlaeminke, 1981). However, the changes required during an interactive session (related to the form of a conversation) need different tools and approaches. The approach taken in this paper is to suggest that the system itself should contain adaptive mechanisms which are triggered by user responses in a conversational context. In this sense, the system may be thought of as self-adaptive. Other approaches to this problem would include incorporating adaptive mechanisms outside the human-computer system which are triggered by the user implicitly or explicitly. These are not explored in this paper.
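As an illustration of a user model that is modified between sessions, the following sketch persists a simple profile and shifts the dialogue style with experience, echoing facility (i) of section 1. The file name, fields and session threshold are illustrative assumptions, not a description of any implemented tool.

```python
import json
from pathlib import Path

PROFILE = Path("user_profile.json")  # hypothetical store for the user model

def load_profile():
    """Load the per-user model that the facade keeps between sessions."""
    if PROFILE.exists():
        return json.loads(PROFILE.read_text())
    return {"sessions": 0, "dialogue_style": "menu"}  # naive users start menu-driven

def update_profile(profile):
    """Shift from menu-driven to command-driven dialogue as experience grows."""
    profile["sessions"] += 1
    if profile["sessions"] > 10:  # threshold is an illustrative assumption
        profile["dialogue_style"] = "command"
    PROFILE.write_text(json.dumps(profile))

profile = load_profile()
update_profile(profile)  # called once per interactive session
```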
3. The self-adaptive user interface (SAUI)

The purpose of an SAUI is to reduce the overheads to the user of tailoring a facade to a system to suit his needs, and to optimize the communication, where necessary, in this process. An SAUI is particularly necessary where the user's needs and expectations are likely to change within an interactive session as well as frequently between sessions.

Firstly, instability can be seen during problem solving. The user's cognitive map is continuously changing at some level appropriate to the problem in hand. The changes are observed only through elicited behaviour following the formation of suitable plans. However, the user can be altering plans as they are being executed, and so behaviour appears to be discontinuous and unpredictable. This idea has been proposed to explain some aspects of interactive problem solving studied by Innocent (1977).

Secondly, another level of instability, related to the first, is at the conversational level. This arises from the variations in formality and "naturalness" of language used for problem solving. A view of communication proposed by Thomas (1978) suggests that people treat the generation of suitable communications as a separate task during interactive problem solving. Each communication is generated in such a form as to induce in the receiver the best possible response. Since this task is an overhead on top of the tasks in the problem-solving plan, it is likely that effort on it will be minimized and tasks interrupted as necessary. This produces an observed discontinuity in task structure at the interface. Where the receiver does not have
appropriate decoding mechanisms which are triggered in this process, the source of the information has the additional overhead of co-operating in their development when required during the problem-solving process. This further exacerbates the observable discontinuity in behaviour at the interface. Chapanis (1976) presents information which supports this view. The magnitude of the conversational overhead is related to the measure of "dialogue determination" proposed by Thimbleby (1980): a dialogue is over-determined if the computer restricts and unnecessarily controls the user; it is under-determined if the user does not know what to do. These are some sources of instability in interactive communication and problem solving which allow the roles of an SAUI to be identified.

3.1. ROLES AND GOALS OF AN SAUI
Firstly, the SAUI should enable the user to maximize effort on problem solving. Secondly, the user's effort in conversation should be minimized. The goals of an SAUI are therefore briefly as follows (the list is not exhaustive).

(i) Present advice on how to use system facilities, and give estimates of availability and speed/error characteristics when appropriate.

(ii) Allow users easily to minimize effort expended on conversation in a session by, for example, providing appropriate feedback to the user during a session.

(iii) Maintain conversational continuity by, for example, balancing the rates of conversation between the user and the system.

(iv) Ensure that the richest communication channels are used by, for example, presenting advice based on the speed/accuracy characteristics of the input/output modules in use. A sketch of how this goal might be mechanized follows this list.

Section 3.2 presents an example to show how a dialogue may result from an SAUI which achieves these goals.
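As promised in item (iv), the following sketch shows one way that goal might be mechanized: choose the richest available channel whose speed and accuracy meet the needs of the task in hand. The channels, scores and thresholds are invented for illustration; they are not measurements.

```python
# Each channel is scored against the task's needs; all numbers are illustrative.
CHANNELS = {
    "speech":    {"richness": 0.9, "speed": 0.8, "accuracy": 0.6},
    "keyboard":  {"richness": 0.5, "speed": 0.4, "accuracy": 0.9},
    "light_pen": {"richness": 0.4, "speed": 0.7, "accuracy": 0.8},
}

def choose_channel(need_speed, need_accuracy):
    """Pick the richest channel meeting the task's speed/accuracy needs
    (goal (iv)); fall back to the richest channel overall if none qualifies."""
    ok = [c for c, p in CHANNELS.items()
          if p["speed"] >= need_speed and p["accuracy"] >= need_accuracy]
    pool = ok or list(CHANNELS)
    return max(pool, key=lambda c: CHANNELS[c]["richness"])

print(choose_channel(need_speed=0.5, need_accuracy=0.7))  # -> "light_pen"
```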
3.2. AN EXAMPLE OF A DIALOGUE

The following hypothetical dialogue should be thought of as occurring through a speech channel of communication between a single user (U) and a local computer system (C) which connects itself, when necessary, to a larger system ("system"). While the use of natural language speech input may appear unlikely, recent and planned research shows that it is not unlikely in the near future, at least in a restricted form (Lea, 1980). The dialogue is intended to be illustrative of a range of properties and has been designed for that purpose. This range, while illustrated by considering natural language in a speech channel (for ease of comprehension), does not depend on either the channel or language flexibility. In the following dialogue, messages enclosed in ( ) are sent between the facade and the system, and vice versa. Assume that the user is known to the system from previous interactions, which have set a suitable context for the tasks being carried out.

Dialogue begins:

U. Hello, I've got a problem for you.

C. Hello, John. You haven't picked up your data from our last discussion yet, you know?
U. Forget that. This is urgent, I need some answers quickly. What's the situation?

C. (sends status request to system) I'm not too busy right now. Do you think you need more than I've got? (receives status from system)

U. Of course.

C. System is not too busy right now but anticipates a heavy load in about 15 minutes.

U. O.K. That's long enough. Take a look at this. (places a sheet of text on a reader)

C. Interesting... Hold on while I get adjusted. (slight pause) O.K.

U. I've got two sets of text like this and I want to know some statistics about them.

C. Hmm... My reader says some 10% of that text can't be read in. Is that O.K.? (interrogates system for alternatives)

U. I could do with it all.

C. (system returns one main alternative) You could send it up to the main department and that will take them about 2 days. Do you want to do that?

U. No! I haven't time for that. Just take this text (places second sheet on reader) and analyse it, please.

C. Please speak more clearly. Did you say take this and analyse it?

U. Yes.

C. How? (requests service from system)

U. Give me a comparative digram table for the text.

C. (sends request for sample output from previous session with John) You mean like this? (displays output on soft graphics screen)

U. Yes, that will do. Hurry up.

C. (sends off data and request to system for analysis) O.K. Is there anything else I can do? Would you like a listing of the text that I could read?

U. Yes.

C. (text is printed; results returned from system; when printing finished, computer continues) Your analysis is complete and the results are on the screen. Do you want a copy?

U. Hmm... O.K.

C. (system goes down) Anything else I can do?

U. No. Goodbye.

C. Goodbye.

End of Dialogue.

3.3. OBSERVATIONS ON THE DIALOGUE
(a) There are multiple channels of communication in use, sometimes simultaneously. Thus the richness (as measured by the variety in signals allowed) of the communication varies as required by the needs of the user. The natural language speech channel is, of course, the richest channel; it dominates the others and is the main channel for prescriptive information.

(b) The communication channels vary in smartness, capacity, direction, mode, speed and error rates. Channel selection during the dialogue is made dynamically on the basis of the trade-offs in user needs for speed and accuracy, as well as the availability of resources.

(c) Information within the dialogue is at two levels: "proximal", which is fast, local but non-specific, and "distal", which is slow and possibly physically disjoint from the initiating action, but specific to it. The proximal feedback has not been characterized
within the dialogue, but consists of such things as noise from the operating machinery of an activated input/output channel.

(d) The flow of the dialogue is maintained by the system through anticipating user requests which are time-consuming, and then introducing "displacement" activities for the user as necessary.

(e) The user is informed of time and resource constraints as they are known, both at the start of the dialogue and as it progresses. This allows the user to make decisions about how to proceed and hence distribute his remaining effort effectively.

(f) The distribution of effort by the user is dependent on his model of the dynamics and other characteristics of the system. Two simple and well-known generalizations are proposed to determine the distribution of effort: first, work expands to fill the time available (Parkinson's Law) and, second, the user expends minimal effort in carrying out the work [Zipf's Law--Zipf (1949)]. Since the system is providing information about the available time in which it can effectively operate for the user, this is a kind of "time frame" within which the user minimizes his effort and expands his work. The time frame is a dynamic part of the user model. (A toy illustration of observations (d)-(f) follows at the end of this section.)

(g) The system does not dominate the conversation but allows users to make choices, with little overhead, which are designed to minimize user effort. A balance of control is maintained and the facade seems to be uniform and consistent.

(h) The computer-initiated redistribution of effort does not force user tasks to be left open (i.e. user task closure is preserved). Neither is the user forced to change his main objective in using the system (i.e. the user's "set" or problem definition is preserved).

(i) The conversation has characteristics similar to those of normal human conversation in terms of the following properties (from Nickerson, 1976): it is bi-directional, has mixed initiative, there is a peer status, there is a structure, a characteristic time-scale, a speech/hearing match, a sense of presence, a shared situational content and a non-linguistic element, and informal language is allowed.

From these observations, the main implication is that an SAUI may be built to support the simpler functions, such as conversational control, while the more complex functions like natural language comprehension can be a long-term objective. Therefore, apart from the user's need for an SAUI, two other interested groups can be identified: first, the researcher who is interested in understanding human linguistic behaviour and, second, the designer of complex general-purpose problem-solving aids. This paper continues with concepts which may be useful to the latter rather than the former.
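The toy illustration of observations (d)-(f) promised above: while a slow request is pending, the facade offers "displacement" tasks that fit inside the announced time frame, so the user's work expands to fill the available time with minimal wasted effort. The task names and durations are invented for illustration.

```python
def displacement_tasks(pending_seconds, side_tasks):
    """While a slow request is pending, offer side tasks that fit the
    announced time frame (observation (d)); durations are assumptions."""
    chosen, remaining = [], pending_seconds
    for name, duration in sorted(side_tasks.items(), key=lambda kv: kv[1]):
        if duration <= remaining:
            chosen.append(name)       # fits inside the time frame
            remaining -= duration
    return chosen

# e.g. an analysis takes ~120 s; printing a listing (90 s) fits,
# proofreading (300 s) does not
print(displacement_tasks(120, {"print_listing": 90, "proofread": 300}))
```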
4. Overview of a possible SAUI architecture

Figure 3 shows the basic components of an SAUI. These are a soft facade and an "expert" modifier. As interaction between the user and the facade proceeds, the expert system monitors and evaluates both user and system behaviour with the purpose of reshaping the soft facade as and when required.

4.1. THE SOFT FACADE

Figure 4 shows the entities that can be described at a high level in a soft facade. The principal components are briefly described below.
[Figure: the soft facade and the expert modifier, interposed between the user and the system.]
FIG. 3. Basic architecture of an SAUI.
[Figure: user input/output modules and multiplexer, connected through a linking net to a system-side multiplexer and input/output modules.]
FIG. 4. Components of a soft facade.
4.1.1. User input/output modules
When active, these modules accept information from the user and may process it before handing it on to the linking net via a controlling multiplexer. More than a single module will normally be active at any time, and this defines the richness of information between the user and the facade. Input modules may be simple ("dumb") and simply transfer user information without any pre-processing (e.g. a QWERTY keyboard). Alternatively, they may be complex ("smart") and carry out a great deal of pre-processing (e.g. a voice input device). Hence, the smartness of the modules also contributes to the richness of communication. Output modules may also vary in smartness and make a contribution to the overall richness of communication. The (de)activation of a module may be initiated by a user or by the expert system, and is associated with the activity of the input/output multiplexer.

4.1.2. The user input/output multiplexer
The function of the multiplexer is to allow communication between selected user input/output devices and the linking net. It allows the expert system both to monitor and to censor communication between the user and the linking net, and so adapt the soft facade by variation between input/output modules. Variation of communication within an input/output module is achieved through the operation of a linking net.

4.1.3. The linking net
The linking net is the central part of the intelligence in the soft facade and is the part wherein the model of the user resides. The model may be thought of as consisting of computational instances of a context, or "frames" (Minsky, 1962). Frames are user, system and task dependent and have some correspondence with user images in terms
of their structure. That is, they are likely to be nested in some kind of hierarchy and to vary in complexity and content. A convenient means for representing frames is a transition network. A transition network consists of nodes which are interconnected through links. Each node represents a state of the system and may be reached from another state via a link. When a new state is reached, certain processes may be carried out, such as sending a message to the user or to the system, or initiating a system task. On completion of these processes, a new state is defined which causes a transfer to another node. The process may be a continuous one involving a net supervisor which can recognize state patterns and tidy up unused or used links, nodes and processes. Alternatively, the process may be discontinuous and operate on parts of the net as if they were sequential programs. Where a facade has limited intelligence, the net is augmented by the inclusion of process descriptions in the node representation. In some situations, where links are formed dynamically during interaction, a node can re-link into its own part of the net for further processing. In this recursive process, the net is more than a simple connector between the user input/output modules and the system input/output modules--it is evaluating and controlling user interaction directly. (A minimal sketch of such a network appears at the end of this sub-section.)

It may be expected that, since human behaviour is unpredictable to a certain degree, the net will be a heterarchy rather than a hierarchy at any particular time. However, for any given user, task and system, there may be identifiable stable parts of the net where links and processes are fixed. Various parts of the net may be associated with a stability measure according to their lifetimes in a given context. Where suitably stable sub-nets can be identified, they may be called "agents", which can be brought into existence by an expert system when the context is suitable. Such an expert system may be "rule-based" in the sense that Waterman (1978) describes. A useful scheme for classifying and then clustering rules in a system could be based on R. B. Miller's (1969) archetypes of man-computer problem solving. The term "user agent" is used by Waterman (1978) to mean "a small program that sits between the user and the system he is interacting with and is capable of performing a variety of tasks for the user. Programs of this sort... are sometimes referred to as interactive transaction (IT) agents." User agents interface the user to external systems, i.e. they provide him with help in learning and using complex systems. System agents may also be proposed which act on behalf of a system to protect it and censor bad data. These would act mainly through conversations with the system. Most system agents would be very stable. Waterman (1978) and Edmonds & Scrivener (1980) have gone some way towards achieving a soft facade along these principles. Although in the early stages of development, their work is a major part of developing an SAUI.
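The sketch promised above: a minimal transition network in which each node is a state, a link is followed on an event, and a process may be run on entering a node (such as messaging the user or starting a system task). It is deliberately simplified; a real linking net would be built, re-linked and tidied dynamically by a supervisor or the expert modifier.

```python
class Node:
    def __init__(self, name, process=None):
        self.name = name
        self.process = process  # action carried out on entering this state
        self.links = {}         # event -> next Node

def run(node, events):
    """Walk the network: on each event follow a link, then run the new
    node's process. A minimal sketch of the linking-net idea only."""
    for event in events:
        node = node.links[event]
        if node.process:
            node.process()
    return node

# A two-state fragment: idle -> busy on a user request, back on a reply.
idle = Node("idle")
busy = Node("busy", process=lambda: print("request sent to system"))
idle.links["user_request"] = busy
busy.links["system_reply"] = idle

run(idle, ["user_request", "system_reply"])  # ends back in "idle"
```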
4.1.4. System input/output modules and multiplexer
These modules function in the same way as the user input/output modules, except that their construction is suited to system interfaces and protocols rather than to user behaviour.

4.2. THE EXPERT MODIFIER

The role of the expert modifier is to mimic a systems designer engaged in optimizing a facade. Consequently, the expert system needs a triggering mechanism and a means
for capturing and using the expert's model of the user/system/task facade. Figure 5 shows some of the components necessary to perform these tasks.

FIG. 5. Components of an expert modifier.
4.2.1. The soft expert-modifier
The principal component here is an easily modifiable link net which embodies the expert knowledge for controlling the facade. The argument for ease of modification is similar to that for developing a soft facade: facade modification is likely to be a complex, and initially not a stable, process. That is, agents in the expert net are likely to vary in stability, but it is necessary to the success of the SAUI that a greater proportion of stability is eventually achieved than in the facade net. Initially, the least stable agents will be those concerned with evaluating and controlling user/facade/system interactions, since little is known about these. The most stable agents will be those concerned with setting up the stable agents in the facade. The net interactions through control and evaluation need careful investigation and background research. Recent work by Waterman (1980) is encouraging in that the aim is to build a system for capturing expertise from users in a system.
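A minimal sketch of a rule-based modifier in the spirit of Waterman (1978): each rule pairs a condition on monitored metrics with a facade change. The thresholds, metric names and actions are invented for illustration; they are not Waterman's rules.

```python
# Each rule pairs a condition on monitored metrics with a facade change.
# All thresholds and actions below are illustrative assumptions.
RULES = [
    (lambda m: m["error_rate"] > 0.3,
     "switch to menu-driven dialogue"),
    (lambda m: m["error_rate"] < 0.05 and m["sessions"] > 10,
     "switch to command-driven dialogue"),
    (lambda m: m["mean_response_delay"] > 5.0,
     "announce load and offer a displacement task"),
]

def fire(metrics):
    """Return the facade changes whose conditions match the current metrics."""
    return [action for cond, action in RULES if cond(metrics)]

print(fire({"error_rate": 0.4, "sessions": 3, "mean_response_delay": 6.2}))
```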
4.2.2. Monitor modules
The function of these modules is to collect and transport information about the facade components to evaluation agents in the expert modifier. Since the facade net is a dynamic and complex entity, large volumes of information will require rapid collection and processing if facade changes are to be made quickly and effectively. Hence, monitor modules may include a degree of intelligence and be selective (a sketch of such a module is given at the end of this section).

4.2.3. Control modules
The function of these modules is to take the information from control agents in the expert net and change the components of the facade accordingly. Again, control modules may have intelligence and may specialize their control functions. In particular, a degree of specialization would be expected to suit the principal components of the facade.
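The sketch of a selective monitor module promised in section 4.2.2: it records events at the facade and reduces them to the summary metrics that evaluation agents (and rules such as those sketched in section 4.2.1) consume. The event kinds and metrics are illustrative assumptions.

```python
import time

class Monitor:
    """Selective monitor module: records events at the facade and reduces
    them to summary metrics for the evaluation agents (an illustration)."""

    def __init__(self):
        self.events = []

    def record(self, kind):
        self.events.append((kind, time.time()))

    def metrics(self):
        kinds = [k for k, _ in self.events]
        turns = kinds.count("user_turn") or 1
        return {
            "error_rate": kinds.count("error") / turns,  # errors per user turn
            "turns": turns,
        }

m = Monitor()
m.record("user_turn")
m.record("error")
print(m.metrics())  # {'error_rate': 1.0, 'turns': 1}
```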
5. Conclusion

This paper has presented some concepts for a self-adaptive user interface system. Work has already been done on many aspects of the self-adaptive system, but there are many problems for researchers to contemplate. For example, there is a major problem in evaluating interactive systems, in that suitable unobtrusive methods based on sound statistical principles have to be developed. Since the experimental approach to the design of interactive systems is still in its infancy and is constrained by economic pressures, it is likely that one of the first research goals should be the development of these methods.
Baldwin & Siklossy (1977) have developed a monitor which may be adapted for use, and Innocent (1979) has considered some alternative experimental procedures which could result in suitable statistical evaluation methods. Work on conversational analysis, such as that of Coombs & Alty (1980), should be useful for developing agents in the expert and facade systems. In the long term, the hope is that suitable methods can be learned by an expert-modifier system and a complete self-adaptive interface produced.
References

BALDWIN, J. T. & SIKLOSSY, L. (1977). An unobtrusive computer monitor for multi-step problem solving. International Journal of Man-Machine Studies, 9, 322-349.
CARBONELL, J. R., ELKIND, J. I. & NICKERSON, R. S. (1968). On the psychological importance of time in a time-sharing system. Human Factors, 10(2), 135-142.
CHAPANIS, A. (1976). Interactive human communication. NATO ASI Conference on Man-Computer Interaction Proceedings, Mati, Greece. Loughborough: Department of Human Sciences, Loughborough University of Technology.
COOMBS, M. J. & ALTY, J. L. (1980). Face-to-face guidance of university computer users--2: Characteristic user interactions. International Journal of Man-Machine Studies, 12, 407-429.
EDMONDS, E. & GUEST, S. (1978). SYNICS--a FORTRAN subroutine package for translation. Man-Computer Interaction Research Report No. 6, Leicester Polytechnic.
EDMONDS, E. & SCRIVENER, S. (1980). The use of graphical soft keyboard devices. Man-Computer Interaction Research Group Report No. 11, Leicester Polytechnic.
FITTER, M. (1979). Towards more "natural" interactive systems. International Journal of Man-Machine Studies, 11, 339-350.
INNOCENT, P. R. (1977). Investigations of the effects of different computer input methods on man-computer interaction. Ph.D. Thesis, Loughborough University of Technology.
INNOCENT, P. R. (1979). An experimental approach to man-computer interaction. Man-Computer Interaction Research Group Report No. 20, Leicester Polytechnic.
LEA, W. A. (1980). Trends in Speech Recognition. Englewood Cliffs, New Jersey: Prentice-Hall.
LEWIS, M. (1973). The design of reliable and trustworthy systems. BCS Datafair Conference Proceedings.
MARTIN, J. (1973). The Design of Man-Computer Dialogues. Englewood Cliffs, New Jersey: Prentice-Hall.
MILLER, L. A. & THOMAS, J. C., JR (1977). Behavioural issues in the use of interactive systems. International Journal of Man-Machine Studies, 9, 509-536.
MILLER, G. A., GALANTER, E. & PRIBRAM, K. H. (1960). Plans and the Structure of Behavior. New York: Holt, Rinehart & Winston.
MILLER, R. B. (1968). Response times in man-computer conversational transactions. AFIPS Conference Proceedings, 33, 267-277.
MILLER, R. B. (1969). Archetypes in man-computer problem solving. Ergonomics, 12, 559-581.
MINSKY, M. (1962). Steps towards artificial intelligence. In FEIGENBAUM, E. & FELDMAN, J., Eds, Computers and Thought, p. 447.
NEWELL, H. E. (1980). Towards a design methodology for interactive systems. In GUEDJ, R. A., TEN HAGEN, P. J. W., HOPGOOD, F. R. A., TUCKER, H. A. & DUCE, D. A., Eds, Methodology of Interaction. Amsterdam: North-Holland.
NICKERSON, R. (1976). Modelling for better design. NATO ASI Conference on Man-Computer Interaction Proceedings, Mati, Greece. [See also Chapanis (1976).]
PEARCE, M. S. & EASTERBY, R. S. (1973). The evaluation of user interaction with a computer-based management information system. Human Factors, 15(2), 16.
SHACKEL, B. (1969). Man-computer interaction--the contribution of the Human Sciences. Ergonomics, 12, 485-489.
SHNEIDERMAN, B. (1979). Human factors experiments in designing interactive systems. IEEE Computer, 9-19 (December).
SHNEIDERMAN, B. (1980). Software Psychology. New Jersey: Winthrop Publishers.
THIMBLEBY, H. (1980). Dialogue determination. International Journal of Man-Machine Studies, 13, 395-404.
THOMAS, J. C. (1978). A design interpretation analysis of natural English with application to man-computer interaction. International Journal of Man-Machine Studies, 10, 651-668.
VLAEMINKE, I. (1981). ASK--a software tool for implementing a user interface to APL systems. Man-Computer Interaction Research Report No. 35, Leicester Polytechnic.
WATERMAN, D. A. (1978). A rule-based approach to knowledge acquisition for man-machine interaction. International Journal of Man-Machine Studies, 10, 693-711.
WATERMAN, D. A. (1980). User-oriented systems for capturing expertise: a rule-based approach. In MICHIE, D., Ed., Expert Systems in the Micro-electronic Age. Edinburgh: Edinburgh University Press.
ZIPF, G. (1949). Human Behaviour and the Principle of Least Effort. New York: Hafner.