Propaedeutics of Decision-making: Supporting Managerial Learning and Innovation *

Raymond G. HUNT and G. Lawrence SANDERS
State University of New York at Buffalo, Buffalo, NY 14260, USA

This paper considers the DSS development process as an exercise in mutual and concurrent learning by an analyst and a user. The paper draws upon the theoretical and empirical foundation of cognitive science to view the systems development process and the roles of its participants in a new context. Several strategies are proposed for developing systems so as to ensure their continued effectiveness in dynamic environments.
Keywords: DSS development, Cognitive science, Problem solving, Decision support, Learning, Schema.
Raymond G. Hunt is a Professor and Chairman in Organizational Behavior, School of Management at SUNYAB. He has a Ph.D. from the University of Buffalo. His research has lately concentrated on descriptive analyses of decision-making and joint management strategies for multi-party projects. He has published numerous books and articles in social psychology and organizational behavior and has consulted widely in these areas with business and government.

G. Lawrence Sanders is Assistant Professor of Management Science and Systems in the School of Management at the State University of New York at Buffalo. His research interests are the impact of organizational factors on computer based systems implementation, DSS evaluation, and applications of learning theory to systems development. He has published articles in several journals including Decision Sciences, Information and Management, and MIS Quarterly.

* This paper is a revision of one originally presented under a slightly different title at The Institute of Management Sciences XXVI International Meeting, June 20, 1984, Copenhagen, Denmark.
North-Holland Decision Support Systems 2 (1986) 125-134
1. Introduction

This paper is about the general problem of designing and developing support systems for managerial decision-making, especially in so-called ill-structured or semi-structured situations, but also in the development of transaction processing systems. It focuses on the relationship between two principal parties to this enterprise - the analyst-designer and the user-decision-maker - as these two actors go about devising workable decision support systems (DSS). We plan to show how certain concepts from behavioral learning theory and cognitive science are helpful in thinking about these matters. The paper promotes a view of the relationship between DSS user and analyst as a problem in mutual concurrent learning, and it encourages analytic attention to problem-identification or pre-decision phases of managerial choice. Our argument is that learning is both a fundamental descriptive feature and a normative objective of DSS development. An analyst-designer's special tasks in DSS development consist primarily of gaining an understanding of a prospective user's information-decision requirements and how to satisfy them. The user-manager, meanwhile, is occupied with understanding and coping with her or his own operational tasks, some or all of which may become objects of support. In addition to these separate tasks, however, each of the actors is confronted with common needs to comprehend their working 'interface', to explore alternative constructs of this interface (and of their individual tasks), and, ultimately, to devise a satisfactory joint product - i.e., a workable DSS.
0167-9236/86/$3.50 © 1986, Elsevier Science Publishers B.V. (North-Holland)

2. Learning and the cognitive basis of performance

Now, what can cognitive science tell us about these tasks that may be helpful in planning and doing them? Well, let us note to begin with that whatever its scope and content, a 'task' is (1) a more or less precise specification (2) of some more or less complex way of dealing with (3) more or less well-defined environmental conditions so as to produce (4) more or less reliably determinate outcomes that have (5) more or less predictable effects. Tasks, then, vary in the precision of their specification, as we of course know. They are also, in every case, learned. And the essential 'product' of this learning is a mental or cognitive construction: a 'model' of the task that, whether it is explicit or remains implicit, serves as an individual's ultimate guide to overt performance [24].

Learning is a somewhat ambiguous notion. A rubric, really, the term loosely denominates those cases of behavior change that are relatively durable but that somehow result from experience. Learning, then, is any process by which an individual modifies his or her (or its) knowledge about event contingencies, and adjusts and integrates behavior in terms of this contextual 'understanding'. Traditionally among psychologists, learning research concentrated on the acquisition of new responses, although the concept has also encompassed cases of refinement or improvement of response patterns (i.e., performance) up to some criterion at which learning is judged to be complete [9]. In recent years, however, psychologists have concentrated more and more on the complicated ways by which people receive, process, store, retrieve, and use information rather than on behavioral 'acquisition' processes [22,6,35]. Instead of learning, per se, attention, perception, memory, and meaning now largely define the subject matter of the new multi-disciplinary field of 'cognitive science'.
And the scientists who populate this field incline more to a 'rationalist' view of human behavior as a process of coping with problematical circumstances, than to the once popular 'behavioristic' one of mechanically linking specific stimuli and responses. Spurred by the seminal work of Miller, Galanter and Pribram [30], contemporary cognitive behavior theory tends to conceive of human behavior as unfolding according to a 'strategy'. Individuals are thought to behave on the basis of global patterns of expected 'products' that will result from actions they take, which then are variously evaluated by themselves and others in their environment [32]. A convenient way of conceptualizing these complicated matters is in terms of ideas such as 'schema' and 'scripts'. Schema (or schemata) are 'intermediate' neural processes that link sensory (or other) inputs to behavioral outputs. They define the assumptions made by actors in concrete cases of learning and remembering; and, as Bransford [7, p. 182] has put it, they represent the 'kinds of information [one would] need to build into a computer in order that it might simulate the comprehension and inference processes of human beings'. Depending on one's metaphysics, schemas may be thought of literally as 'structural clusters of knowledge' in the mind [22, p. 386], or nominally as 'convenient fictions' postulated to explain certain hows and whys of human action, specifically the unknown processes that organize prior learning (knowledge), guide its use, and orient actors to their environments. In any case, 'schema' somehow control what is noticed about a situation, how it is interpreted, and how responses to it are organized. When confronted with new information, the human tendency is to assimilate it to existing schemas. Memory, for instance, now is widely thought of as a 'reconstructive' process of 'inferring the past' from contemporaneous schemas rather than as a literal act of reproducing it [22]. 'Scripts' [1], meanwhile, are a species of flexible schema that refer explicitly to the linkage of cognitive or mental structures with overt behavioral expressions or action. In one of the very few discussions of script ideas in the managerial literature, Gioia and Poole [18, p. 450], for example, describe a script as 'a schema held in memory that describes events (or sequences of events or behaviors) appropriate for a particular context'. Gioia and Poole mention various applications of the script idea to managerially interesting problems such as performance appraisals, selection interviews, meetings, and decision-making.
A decision-maker, for instance, is presumed to interpret the performance implications (choices) of a current situation by reference to past experience in the form of a 'recalled decision script'. This 'program' then provides actionable guidance on current problems and choices, i.e., an experience-based definition of the situation and a general plan for dealing with it. Thus, scripts are 'heuristic knowledge structures' that reduce the cognitive complexity of decision-making and narrow the interpretive and behavioral possibilities of managerial situations. They are pertinent to understanding 'automatic' cognitive processing [33], as in transaction processing, but their truest metier is in explaining interpretive discretionary actions - responses to ill-structured, ambiguous circumstances where 'fuzzy logic' reigns [44]. In these situations, 'understanding is ... accomplished by means of metaphor and that metaphor is a primary vehicle in the social construction of reality' [18, p. 453]. Scripts (or something like them) are, therefore, indispensable to integrated human action, especially in ambiguous circumstances. In managerial contexts they contribute to efficient decision-making. But as Gioia and Poole [18] point out, scripts cannot guarantee the quality of decisions. For one thing, they oversimplify situations and thereby induce operational error. For another, since their relation to an immediate situation is generally loose (i.e., metaphorical), they may prompt seemingly sensible actions that are nonetheless ill-advised. Moreover, cognitive structures incline to rigidity. Naylor et al. [32] point out, for example, that schemas, scripts, and cognitive structures generally 'are initially determined by learning of contingencies, but [later] become fixed 'template' strategies that are rarely updated or evaluated'. In an older terminology, they become habitual. Left to themselves, over long spans of time behavioral systems tend to freeze into place conventional definitions of task characteristics, requirements, and modes of management that reduce human capacities for handling novelty and stifle innovation. Organizations fossilize because they lack effective ways of countering these entropic tendencies. What they need, Karl Weick [43, p. 69] has suggested, is 'mechanisms for generating new [ideas and] structures that complicate their existence' - problem generators, as it were, that stimulate skepticism of the status quo and impel adaptive learning.
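The 'recalled decision script' idea can be made concrete with a small sketch. Everything here - the `Script` class, the contexts, the event sequences, the retrieval rule - is our own hypothetical illustration, not a construct taken from Gioia and Poole [18].

```python
# Illustrative sketch: a 'script' as a context-keyed schema that supplies an
# expected event sequence for a situation. All names and data are hypothetical.

class Script:
    def __init__(self, context, events):
        self.context = context      # the situation the script applies to
        self.events = list(events)  # expected sequence of events/behaviors

    def interpret(self, situation):
        # Assimilation: a new situation is read through the stored sequence.
        return (f"'{situation}' read as '{self.context}': "
                f"expect {self.events}")

# A memory of scripts, retrieved by a (loose, metaphorical) context match.
memory = {
    "performance appraisal": Script("performance appraisal",
                                    ["review goals", "rate outcomes",
                                     "set new goals"]),
    "selection interview": Script("selection interview",
                                  ["greet", "probe background", "assess fit"]),
}

def recall_script(cue):
    # Retrieval is reconstructive: an unfamiliar cue still evokes the
    # best available script rather than returning nothing.
    return memory.get(cue, next(iter(memory.values())))

print(recall_script("performance appraisal").interpret("annual review"))
```

Note how the fallback in `recall_script` mirrors the text's point that scripts are applied metaphorically: a novel situation is still assimilated to an existing schema, which is precisely how they induce both efficiency and error.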
3. The problems of learning, action, and retaining organizational complexity

Agreement is general that learning involves an interaction of an individual with an environment. It is an adaptive transactional process by which individuals acquire performance competencies, and other important things such as goals and decision rules [28], and become functionally integrated with their environments, including, of course, with other people. V.H. Brix [8] has suggested treating this adaptive 'action learning' process from the standpoint of cybernetic 'control theory'. Brix perceives learning to be in the service of control - control of people as well as inanimate conditions. Like the cognitive theorists to whom we have alluded, he conceives of learning as a process of building up 'mental precepts [schema] in the mind of what is going on outside; continually improving on the precepts by ... assimilating or rejecting data coming in' [8, p. 495]. Much like the classic TOTE model of Miller, Galanter and Pribram [30], Brix [8, p. 495] argues that 'the performance of controlling is ... a process of making trials [tests in Miller et al.'s terminology], measuring the effects of errors projected as differences between the real position and the desired (goal) positions, [and] so progressively reducing error on subsequent trials'. A point to note is that the action learning process sketched in fig. 1 is not blind trial and error. Rather it entails a 'delicate synergy between theory and practice' [8, p. 497] (see also Hedberg [20] and Bandura [3]). Action, in short, is theory- (or model- or schema-) guided. But social action produces experience - tests - which may prompt revisions of its cognitive directors, i.e., learning. Thus, a door is opened for an analyst (or a decision aid) to serve not only as a technical handmaiden to manager-decision-makers, but as a kind of 'learning prod', helping to counter the entropic tendencies endemic to both cognition and organizations. Opportunities exist for assisting 'creative' human efforts by devising 'cognitive support systems' to assist the acquisition, activation, retrieval, and application of knowledge [7].
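Brix's error-reducing trial cycle can be sketched in a few lines. This is a minimal numerical caricature of the TOTE-like loop (test against a goal, operate to reduce the error, test again), not an implementation of Brix's System Beta; the proportional step size, goal, and tolerance are hypothetical choices of ours.

```python
# Sketch of an error-reducing 'action learning' loop in the spirit of
# Brix [8] and the TOTE unit of Miller, Galanter and Pribram [30].
# Goal, step size, and tolerance are illustrative assumptions.

def action_learning(goal, position, step=0.5, tolerance=1e-3, max_trials=100):
    trials = 0
    while abs(goal - position) > tolerance and trials < max_trials:
        error = goal - position   # test: difference between real and goal position
        position += step * error  # operate: act so as to reduce the error
        trials += 1               # each trial is an experience learned from
    return position, trials      # exit: error is within tolerance (or we gave up)

final, n = action_learning(goal=10.0, position=0.0)
print(f"reached {final:.4f} in {n} trials")
```

The loop is not blind trial and error: the update rule (`step * error`) is the agent's 'theory' of how actions move it toward the goal, and each trial's measured error is the experience that drives the next, smaller correction.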
Computer-based DSS, for example, may be regarded as cases of more general cognitive support ideas; and an intriguing prospect exists for designing systems that provide efficient, creative managerial memory enhancement, so to speak, by linking decision-makers with current information and relevant knowledge bases. The basic idea is a simple but important one: connecting new information with prior knowledge will interactively influence what is remembered and, it appears, the greater the prior knowledge,
the more so [22].

[Fig. 1. Illustrative action learning/control models: Miller, Galanter, and Pribram's [30] basic TOTE unit, and Brix's [8] System Beta.]

What we can see here is a theoretical basis for the development of so-called 'expert systems'. As Gevarter [17] has described them, these interactive, user-oriented knowledge-based systems, on the one hand, separate knowledge base and program and, on the other hand, include procedures for linking the two. Such a system can change fundamentally by modification of its knowledge base consequent upon experience, which is to say by learning [19]. Moreover, it can even get better at doing this: it can 'learn to learn' (cf. Bransford [7]).
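The architectural point - knowledge base separated from the inference procedure, so that learning amounts to modifying the knowledge base alone - can be shown in miniature. The rules, facts, and function names below are our own hypothetical illustrations, not Gevarter's [17] examples.

```python
# Minimal sketch: an expert-system-style separation of knowledge base and
# inference engine. All rules and facts are hypothetical illustrations.

# Knowledge base: (conditions, action) pairs, held apart from the program.
knowledge_base = [
    ({"sales_falling", "inventory_rising"}, "cut_production"),
    ({"sales_rising"}, "raise_forecast"),
]

def infer(facts, kb):
    # Inference engine (forward chaining): fire every rule whose
    # conditions are a subset of the observed facts.
    return [action for conditions, action in kb if conditions <= facts]

def learn(kb, conditions, action):
    # 'Learning' = modifying the knowledge base; the engine is untouched.
    kb.append((set(conditions), action))

print(infer({"sales_falling", "inventory_rising"}, knowledge_base))
learn(knowledge_base, {"cash_short"}, "delay_capital_spending")
print(infer({"cash_short"}, knowledge_base))
```

Because `infer` never changes, the system's behavior 'changes fundamentally' only through what `learn` adds to the knowledge base - the point the text attributes to such architectures.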
4. Organizational learning, problem-solving, and task-modeling Dery [14] has advanced a provocative elaboration of these ideas in his view of organizations as learning entities. Noting a traditional bias in favor of improving institutional means (as distinct, that is, from ends), Dery stresses instead the importance of the goals and organizational premises (schemes) that usually remain untouched by customary choice selection-based treatments of organization decision-making. He points out that
most analyses of choice behavior take goals and values as given. Arguing to the contrary that goals are not pre-existent, Dery advances an interpretive model of managerial action-learning centered on continuous goal setting and resetting. He thus concentrates his attention on 'pre-decision' situations, i.e., on problem derivation, on the interpretive premises of action, or on processes of task modeling, as one of us has called it in another connection [24]. Dery's fundamental quarrel with contemporary management information and decision support systems has to do with their practice of neglecting the learning of new goals and values. An instrumentalist bias toward performance-oriented decision rules, Dery contends, has led DSS designers to take goals and values for granted while assuming the objective existence of problems and decisions. This customary attitude of the information systems community is counterproductive in the long run, Dery thinks, because it does little to stimulate the organization learning necessary for creative management. Called for, however, is not so much lessened attention to designing S.O.P.s for decision-making, as sometimes is argued, as the development of complementary interest in the analysis and critique of the judgmental premises of operational systems. What we are talking about is the question of how tasks are 'modeled' prior to decision-making, per se.
5. Problem-solving and task-modeling

Researchers, even psychologists working within the so-called stimulus-response behavioral science paradigm, have typically neglected the role of the 'stimulus' in behavioral research. Dery [14] and Dery and Mock [15] have complained that this analytic neglect of 'front-end' matters has been especially conspicuous in decision-making research. Focusing on pre-decision definitions and conceptions (i.e., on task modeling), however, necessarily entails a shift of perspective from 'responses' to 'stimuli' and, hence, to perceptions and their schematic bases. If one divides 'decision-making' into broad phases such as pre-decision, decision, and post-decision (or implementation); and, if one considers a decision phase to begin with a 'presentation of choice', i.e., with a normally tacit 'model' of a decision task, then one may define pre-decision analysis as a search for specification of this presentation (plus, perhaps, an explanation of it). Pre-decision analysis thus involves descriptions of two things: the 'state of the system' in which the decision is embedded and the immediate stimulus and conceptual conditions that give rise to the 'evocation of choice' [2]. The first of these is contextual. It refers to an organization's (or other system's) immediate 'capacity for decision' (e.g., skills, available resources) as well as its local context (organizational size, structure, etc.). Meanwhile, the 'evocation of choice' is essentially a signal detection and diagnostic or sense-making process which is rooted in the cognitive processes of its human agents and is conditioned by the state of the system in which it occurs. It, and hence the pre-decision process, results in a definition of a situation - a 'presentation of choice' - which may constitute a specification of decision forms, alternatives, scenarios, or what have you. In any event, it is the conceptual output of the pre-decision process - i.e., a model of the decision (prior to choice) and its context. It is not, however, a 'copy' of some reality. Rather, it is an interpretation - an 'epistemological achievement', as Weick [43] would say. These cognitive constructions - models, schemas, scripts - we have emphasized, are the bases of subsequent action, including actual decision-making and choice. They are, furthermore, the essential given of decision support systems, as these usually are understood, which is to say that DSS typically concentrate on the efficient implementation of these cognitive decision models.
6. Support for managerial task modeling

It must be apparent, however, that the development of task models by manager-decision-makers is itself a supportable process, just as is their implementation. Moreover, pre-decision support is arguably more important to effective management than is literal decision-making. For in a real task setting, decision-makers are not simply given lists of problems and solutions which they then must match up [40]. Instead, they must identify and conceptualize problems and solutions and associate them in interpretive models of their larger (managerial) tasks. They must learn effective ways of coping with complexity - ways that ideally avoid the excessive simplifications fatal to innovative action. It follows, or so it seems to us, that analysts can play an important part in this problem-construction/solution process and in the learning that may result from and enhance it. They can, for instance, help design systems which facilitate search and scanning for decision opportunities, and can sensitize decision-making managers to the definitional (modeling) requirements of task performance. They can help to devise the 'complexity preservers' that Weick has called for, and the 'problem generators' both he and Dery seek in order to provoke thought and promote organization innovation. The most important thing to notice about this idea of assisting managerial learning and imagination is how it concentrates attention not on decision-making, per se, or 'how or why managers choose a certain course of action' [14, p. 321], but more broadly on pre-decision problem-defining, problem-solving, and planning. Learning and the redefinition of pre-existing action premises, March [28, p. 223] points out, may in general be adaptively rational, but in particular cases it can be 'superstitious', leading to 'local optima quite distant from global optima'. The strategic challenge to the system analyst and DSS designer lies in devising effective means of supporting imaginative managerial task modeling - problem-solving - as well as ordinary routine decision-making. Meeting this challenge requires consideration of the relationship between system analyst-designers and manager-decision-makers.
7. The psycho-social environment of decision support

Cognitive and experimental approaches to organizational development and managerial performance imply a reorientation of systems analysis and management information systems away from their usual concentration on 'technical' problems or decisions and mechanical rules for making them. Focus is instead placed on working with, developing, and supplementing human managers' models of organizational reality. Such a reorientation to pre-decision matters of problem recognition, definition, and strategic choice would carry with it a significant redefinition of the system analyst role: as Dery puts it, from 'operational' tasks of learning mazes to 'engineering' problems of learning how those mazes are built - in other words, from a preoccupation with performance of largely fixed routines to an emphasis on flexible problem-solving. In this role, an analyst would be expected to help a manager consider (discover) alternative models or 'maps' of his mazes as well as to provide technical assistance in using models to explore them. If analyst-designers are to function as effective 'complexity preservers' and 'learning prods', they will need to approach their maze-building tasks with sensitivity to their social-psychological as well as their technical dimensions. Theirs is a demanding collaborative venture, and the managers and systems they seek to help have their own cognitive styles and learning preferences [31,38]. These need to be recognized and accommodated, even if one eventually hopes to change them in order to facilitate learning and innovation. How managers look at the world and the assumptions they make about it, we have pointed out, affect the problems they perceive, the decisions they recognize, and the answers they find acceptable. It is the same with support systems. As Mason and Mitroff [29] have said, managers need information geared to their problems and psychology, not to those of analyst-designers (see also Davis [12]). At the same time, however, analysts need to bear in mind the danger implied by March's observation that 'learning what is likely may inhibit discovery of what is possible' [28, p. 223].

8. DSS development guidelines
So what, then, does the previous discussion suggest about the process of developing DSS? That is, what should guide users, analysts, and researchers in selecting tools and procedures for DSS applications? Simply put, the question that should be asked about DSS is whether a system encourages learning and innovation. For example, prototyping has been widely proposed as a means for developing DSS applications. We feel that the underlying appeal (perhaps subconscious) of prototyping is that it facilitates the mutual and concurrent learning process of the analyst and user, an idea that contrasts with the typical system life cycle approach.

Consider the evolution of structured analysis and design methods: early techniques, in addition to being thorough and systematic, were also often quite rigid. This sometimes resulted in premature freezing of system specifications, which was disadvantageous for users and developers alike, particularly in a DSS context. The problem was that any learning that might be necessary within and between users and developers was usually overlooked. Certain systems analysis methodologies have recently begun to evolve that are more consonant with human action learning requirements: for example, stepwise refinement and iterative design methodologies (whether intentional or accidental); Davis' [12] contingency approach; and the use of prototyping. Prototyping, in particular, appears to be well suited for DSS applications. Prototypes facilitate collaborative schema development by providing users and analysts with an opportunity to use (and learn) more about a target application [10]. To be sure, the construction of prototypes can be an expensive proposition (acquiring a prototyping language, coding time, demand on hardware, etc.), but from the perspectives we have outlined it makes sense, and the field could probably benefit from more of it, just as it could from using more sophisticated information acquisition strategies (cf. Davis [12]). Thus, since a primary motivation for developing new techniques of system analysis and design is to enhance the systems development process, proponents of new methodologies should ask themselves how their new technique will increase the learning capability of users and developers. Many of the existing methodologies are useful in managing the data acquired in the development process, but their capacity to increase the ability of users and analysts to understand their target application is commonly small (cf. Davis [12]).
Because of the inherent time requirements of the learning process, a certain minimal amount of time is required to analyze and design a system for a given task that is to meet a user's need. Learning time may be affected by various factors, such as familiarity with the task, scope of the project, and complexity, but time is an inescapable requirement of learning [9]. Complex learning (which typifies many systems projects) is, of course, a difficult process which, in addition to requiring time, has many other characteristics which we cannot go into here. Our point is simply that it is reasonable in judging systems analysis and design techniques to ask whether they facilitate the kinds of learning we have outlined and to consider how well they accommodate its essential requirements. This criterion is distinct from other criteria that may be used in assessing methodologies, which may involve judgments on data consistency, systems completeness, etc., but it is not necessarily apart from them. In order to develop useful system analysis and design techniques, academicians and practitioners need to develop greater sophistication and competence in the application of cognitive theory (and, in particular, learning theory) to their tasks. This will be difficult, of course, and will require a change in analysts' customary mindsets. It may also lead to some new kinds of problems; but the increased understanding of the systems development process that would result from solving these problems will outweigh the difficulties they induce.

What, then, can an analyst, user, or user-analyst incorporate into the development of decision support applications to facilitate learning and innovation? We suggest the following general approaches:

(1) Help the decision-maker sort out, simplify, and interpret the 'booming, buzzing confusion' out there, perhaps via such technical devices as query languages, DBMS, and fourth-generation languages that empower stimuli-stricken managers to engage in an active search for meanings and 'handles' on their task environments. And s(he) can
(2) try to develop management reporting and control systems that stress positive feedback mechanisms. Bad news may need to be known, but systems that emphasize noxious feedback can reduce the ability of problem solvers to elaborate on their cognitive premises and instead induce stereotypy [21]. Also, in order to counter entropic tendencies, an analyst might

(3) consider the use of creativity techniques to improve not only decision-making (i.e., choice selection) but also pre-decision processes (i.e., specification of choices and selection criteria). Table 1 outlines several methods which may be useful in promoting creativity. An analyst can also

Table 1
Creativity techniques a

Free-association
- Brainstorming
- Synectics
- Gordon or Little technique
- Phillips 66 buzz session
- Organized random search
- Black box technique

Forced-relationship
- Catalogue technique
- Listing technique
- Grid-analysis, matrix analysis

Analytical
- Attribute listing
- Input-output
- Grid-analysis, matrix analysis

Eclectic approaches
- Combination or extension of the other techniques

Other approaches (see Beer [5])
- Nominal group process
- Delphi

a Adapted from Summers and White [41].

(4) use automated systems which assist problem recognition. Industries, organizations, groups, and individuals often experience what can be called generic problems. For example, a prototype system has been described that can be used to identify critical success factors for particular applications from a superset of what were referred to as generic success factors [37]. In addition, however, an analyst might try

(5) designing 'semi-confusing' information systems [21]. Designing systems to deal with changing environments presumes an organizational climate that tolerates pluralism, diversity of perspectives and evaluation measures, and a certain level of ambiguity and uncertainty in its information systems. Given such conditions an analyst may profitably seek ways of discouraging institutional over-reliance on standard routines and generally try to loosen things up organizationally. The goal in this is an organizational environment which

(a) fosters experimentation, where actions are situation-dependent and the organization is not over-committed to a given solution for all problems and where, in fact, 'surprise' or random solutions are occasionally generated [34],
(b) builds in obsolescence for strategies, actions, and procedures, possibly via 'sunset' rules on information systems that would stop them running on a certain date, (c) involves individuals with different cognitive and learning styles in problem solving, because indications are that groups that include individuals with different styles perform better, and (d) utilizes multiple objectives when evaluating information systems (cf. Sanders [36]).
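Guideline (b)'s 'sunset' rule is simple enough to sketch directly. The report name and dates below are hypothetical illustrations of ours, not examples from the text.

```python
# Toy sketch of a 'sunset' rule: an information system simply stops running
# after a fixed date unless it is deliberately re-justified and renewed.
# The system name and dates are hypothetical.

from datetime import date

def is_active(sunset, today):
    # Once the sunset date passes, the routine is retired by default,
    # forcing a fresh evaluation rather than indefinite habit.
    return today < sunset

weekly_report_sunset = date(1988, 1, 1)
print(is_active(weekly_report_sunset, date(1986, 6, 1)))  # still runs
print(is_active(weekly_report_sunset, date(1989, 6, 1)))  # forced re-evaluation
```

The design point is that obsolescence is the default and continuation requires an explicit decision - a mechanical counter to the entropic 'template' tendencies discussed earlier.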
9. Conclusions
Our purpose in this paper has been to view the system development process as an exercise in mutual and concurrent learning by an analyst and user. We have described the learning process as a dynamic adaptive process of constructing and modifying cognitive models (schemata) for dealing with complex problems. Systems design tools are by plan (whether explicitly stated or not) meant to support the learning process by assisting the analyst and user in modifying their initial cognitive models of the tasks involved. Systems development tools play a critical role in action learning because they aid in the experience-based construction of systems models via progressive cycles of definition and redefinition. This discussion of learning and schemas suggests an integrated architecture of schematic cognitive structures undergirding the development of operational computer-based systems (see fig. 2). Level 1 is a fundamental schema or paradigm (in Kuhn's sense [26]) incorporating an array of beliefs and assumptions about the 'real' world within which a particular system-to-be-modeled is situated. Level 2 comprises representations of the relevant cognitive task models of both the analyst and user. Level 3 is an integrative abstraction (model) of the target system-to-be-modeled. Systems development tools such as data flow diagrams, Warnier/Orr diagrams, the HIPO technique, PSL/PSA (see Couger, Colter and Knapp [11]), and knowledge representation techniques (production rules, frames, and semantic networks) can be used to accomplish this conceptual development. Level 4 is an adaptive model of the user-oriented, computer-based operating system that emerges from the dynamic interplay of the
R.G. Hunt, G.L. Sanders / Propaedeutics o/Decision.making
133
SYSTEM TO BE MODELED (LI)
l I User Cognitive Schema (L2) ~ =
Development
Tool Schema (L3) o
Knowledge
Representation • Semantic Networks . Frames
• Production Rules o Protocol Analysis ComputerSystem (L4)
o Warnier/Orr Diagrams o HIPO
o Prototypes o PSL/PSA l
Analyst Cognitive Schema (L2) ~-~
o Data Flow Diagrams
I Fig. 2. Levels of schema.
other three schema that occurs during a suitably regulated development process. For DSS designers, a major challenge of the future lies in developing systems in level 4 which satisfy some of the wish lists of artificial intelligence researchers (cf. Davis and Lenat [13]). An ideal system would be able to understand itself, describe what it knows, explains its behavior, and change and adjust its behavior (i.e., learn). Technology at level 3 alone will not permit this. Research and development will have to be directed to the intertwining cognitive, developmental and technological facets of each of the schematic levels. As Harmon and King [19] suggest in contrasting conventional programming with the strategies neco essary to develop expert systems, it is plain that development strategies will 'combine a large measure of cognitive psychology with symbolic programming'. They will as well, if they are to be successful, use highly interactive techniques in an iterative user-focused process.
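Of the level-3 knowledge representation techniques named above, production rules are perhaps the simplest to illustrate. The following minimal sketch (in Python, which postdates the paper; the rules and fact names are purely illustrative, not drawn from the original) shows how a small rule base and naive forward chaining can capture analyst-elicited knowledge as condition-action pairs:

```python
def forward_chain(initial_facts, rules):
    """Naive forward chaining: repeatedly fire any rule whose
    condition set is satisfied by the known facts, asserting its
    consequent, until no rule adds a new fact."""
    facts = set(initial_facts)
    changed = True
    while changed:
        changed = False
        for condition, consequent in rules:
            # A rule fires when all its condition facts are known
            # and its consequent is not yet asserted.
            if condition <= facts and consequent not in facts:
                facts.add(consequent)
                changed = True
    return facts

# Hypothetical rules an analyst might elicit from a user:
rules = [
    ({"task is recurrent", "data are structured"},
     "candidate for automation"),
    ({"candidate for automation", "outcomes are judgmental"},
     "consider a DSS"),
]

result = forward_chain(
    {"task is recurrent", "data are structured",
     "outcomes are judgmental"},
    rules,
)
# 'consider a DSS' is derived via the intermediate fact
# 'candidate for automation'.
```

Because each rule is an explicit, inspectable unit of knowledge, a system built this way can at least enumerate what it "knows" and trace which rules fired, a modest step toward the self-describing level-4 systems discussed above.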
Exciting visions of the future of computerized decision support systems are justified. Achieving them, however, will require major reconceptualization not only of methodologies, but of basic ideas about the essential tasks of analyst-designers and the orientations of support systems.

References

[1] Abelson, R.P., Psychological status of the script concept, American Psychologist 36 (1981) 715-729.
[2] Bahl, H., and R.G. Hunt, A framework for systems analysis for decision support systems, Information and Management 7 (1984) 121-131.
[3] Bandura, A., Social learning theory, Prentice-Hall, Englewood Cliffs, NJ (1977).
[4] Boar, B.H., Application prototyping, Wiley, New York (1984).
[5] Beer, M., Organization change and development: A systems view, Goodyear, Santa Monica, CA (1980).
[6] Bower, G.H., Cognitive psychology: An introduction, in: W.K. Estes, ed., Handbook of Learning and Cognitive Processes, Erlbaum, Hillsdale, NJ (1975) 25-80.
[7] Bransford, J.D., Human cognition: Learning, understanding and remembering, Wadsworth, Belmont, CA (1979).
[8] Brix, V.H., Action learning and control theory, Omega 11 (1983) 491-500.
[9] Bugelski, B.R., Principles of learning and memory, Praeger, New York (1979).
[10] Cerveny, R.P., E.J. Garrity and G.L. Sanders, The application of prototyping to systems development: A rationale and model, forthcoming in Journal of Management Information Systems.
[11] Couger, J.D., M. Colter and R. Knapp, Advanced system development/feasibility techniques, Wiley, New York (1982).
[12] Davis, G.B., Strategies for information requirements determination, IBM Systems Journal 21 (1982) 4-30.
[13] Davis, R., and D. Lenat, Knowledge-based systems in artificial intelligence, McGraw-Hill, New York (1982).
[14] Dery, D., Decision-making, problem solving and organizational learning, Omega 11 (1983) 321-328.
[15] Dery, D., and T.J. Mock, Information support systems for problem solving, Decision Support Systems 1 (1985) 103-109.
[16] Estes, W.K., The state of the field: General problems and issues of theory and metatheory, in: W.K. Estes, ed., Handbook of Learning and Cognitive Processes, Erlbaum, Hillsdale, NJ (1975) 1-24.
[17] Gevarter, W.B., Expert systems: Limited but powerful, IEEE Spectrum, Aug. (1983).
[18] Gioia, D.A., and P.P. Poole, Scripts in organizational behavior, Academy of Management Review 9 (1984) 449-459.
[19] Harmon, P., and D. King, Expert systems: Artificial intelligence in business, Wiley, New York (1985).
[20] Hedberg, B., How organizations learn and unlearn, in: P.C. Nystrom and W.H. Starbuck, eds., Handbook of Organizational Design, Oxford University Press, London (1981) 3-27.
[21] Hedberg, B., and S. Johnsson, Designing semi-confusing information systems for organizations in changing environments, Database 13, Winter/Spring (1982) 12-24.
[22] Horton, D.L., and C.B. Mills, Human learning and memory, Annual Review of Psychology 35 (1984) 361-394.
[23] Hunt, R.G., Cross-purposes in the federal contract procurement system: Military R&D and beyond, Public Administration Review 44 (1984) 247-256.
[24] Hunt, R.G., Technology and organization: A psychological interpretation, Proceedings: Eastern Academy of Management, May (1982).
[25] Hunt, R.G., J.M. Magenau and V.N. Fails, A method for coding and analyzing decisions in organizations, Working Paper 512, State University of New York, Buffalo, NY, Oct. (1981).
[26] Kuhn, T.S., The structure of scientific revolutions, University of Chicago Press, Chicago, IL (1970).
[27] Kumar, K., Participant values in systems development, Unpublished doctoral dissertation, McMaster University, Hamilton, Ont. (1984).
[28] March, J.G., Decision making perspectives, in: A.H. Van de Ven and W.F. Joyce, eds., Perspectives on Organization Design and Behavior, Wiley, New York (1981) 205-244.
[29] Mason, R., and I. Mitroff, Challenging strategic planning assumptions, Wiley, New York (1981).
[30] Miller, G.A., E. Galanter and K.H. Pribram, Plans and the structure of behavior, Holt, Rinehart and Winston, New York (1960).
[31] Mitroff, I., Stakeholders of the organizational mind, Jossey-Bass, San Francisco, CA (1983).
[32] Naylor, J.C., R.D. Pritchard and D.R. Ilgen, A theory of behavior in organizations, Academic Press, New York (1980).
[33] Reed, S.K., Cognition: Theory and applications, Brooks/Cole, Monterey, CA (1982).
[34] Robey, D.R., and W. Taggart, Human information processing in information and decision support systems, MIS Quarterly 6 (1982) 61-73.
[35] Rosenthal, T.L., and B.J. Zimmerman, Social learning and cognition, Academic Press, New York (1978).
[36] Sanders, G.L., MIS/DSS success measure, Systems Objectives and Solutions 4 (1984) 29-34.
[37] Sanders, G.L., J.F. Courtney and J.R. Burns, CSF-DSS: A decision support system for identifying critical success factors, Proceedings: American Institute of Decision Sciences, Boston, MA (1981).
[38] Sims, R.R., Kolb's experiential learning theory: A framework for assessing person-job interaction, Academy of Management Review 8 (1983) 501-508.
[39] Sprague, R.H., and E.D. Carlson, Building effective decision support systems, Prentice-Hall, Englewood Cliffs, NJ (1984).
[40] Stabell, C.B., Integrative complexity of information environment perception and information use, Organizational Behavior and Human Performance 22 (1978) 116-142.
[41] Summers, I., and D. White, Creativity techniques: Toward improvement of the decision process, Academy of Management Review 2 (1976) 99-107.
[42] Wingfield, A., Human learning and memory: An introduction, Harper and Row, New York (1979).
[43] Weick, K.E., Cognitive processes in organizations, in: B.M. Staw, ed., Research in Organizational Behavior, JAI Press, Greenwich, CT (1979) 41-47.
[44] Zadeh, L.A., Coping with the imprecision of the real world, Communications of the ACM 27 (1984) 304-311.