Information & Management 25 (1993) 207-213
North-Holland
Briefing
Expert systems and organizational decision-making

Jos Benders a and Frank Manders b

a University of Nijmegen, Nijmegen, Netherlands
b Tilburg University, Tilburg, Netherlands

Expert systems incorporate decision-making processes, and can be considered as the mechanization of human thinking. Whereas they are attributed many advantages, such as improved decision-making and productivity increases, managers need to realize that such gains are unlikely to be realized unless ample attention is paid to the organizational embedding of expert systems. Knowledge erosion, manipulation of data, and high maintenance costs are but some of the phenomena that endanger the successful use of expert systems. Unless managers are aware of such potential threats and unless actions are taken, expert systems' effectiveness can be seriously damaged.

Keywords: Expert systems; Knowledge-based systems; Organizational decision-making; Advice; Human factors; Process organization; Effectiveness.

1. Introduction
Although expert systems have been in development for over 25 years now, they can still be seen as rather "new". As with earlier technologies, much is expected from this latest type. The implementation of new technologies in organizations is often accompanied by a rearrangement of tasks and responsibilities among employees. However, mechanisation and automation are frequently seen as solely technical issues which, in practice, are delegated to technicians and automation consultants [13]. Technicians especially seem to have a blind faith in technology. Beer et al. [1] state that not many corporations or vendors of the new information technology are incorporating human considerations in the design or installation of new technology. Technical people in the client or vendor organization are simply not attuned to the long-term implications of the technology.
Jos Benders (MBA Tilburg University, 1988; PhD University of Nijmegen, 1993) is employed as Junior Research Fellow at the University of Nijmegen, the Netherlands. His research interests include the impact of various forms of automation on organizational functioning and the division of labor around technical systems in general and, more specifically, manufacturing automation (see Optional Options: Work Design and Manufacturing Automation, Avebury, Aldershot, 1993).

Frank Manders (MSc in Information Science and MBA Tilburg University, 1988) is employed as management consultant at Interpolis Insurances in Tilburg, the Netherlands. He used to be assistant professor at the Centre for Personnel Management and Research of Tilburg University, the Netherlands.

Correspondence to: J. Benders, Department of Business Administration, University of Nijmegen, P.O. Box 9108, NL-6500 HK Nijmegen, The Netherlands. Present address F. Manders: NV Interpolis, P.O. Box 90156, NL-5000 LA Tilburg, The Netherlands.

0378-7206/93/$06.00 © 1993 Elsevier Science Publishers B.V. All rights reserved

Substitution of people by perfectly functioning technical systems is, in the opinion of technicians, the one and only solution to many problems. Van de Poel [28] even bluntly states that organizational control still causes problems "because the human factor is still not completely replaceable by machines". Such a technocratic point of view implies that the embedding of the technology in the organization needs little attention. Although most publications about expert systems are indeed restricted to technical aspects, gradually some attention is being paid to organizational aspects [e.g. 4, 23, 24, 26, 29, 31, 34]. This concern for the internal environment is necessary as larger organizations begin to show an interest in expert systems; in the long run, this technology will be diffused into small and medium-sized organizations. This paper gives an overview of organizational problems that may arise after the implementation of these systems. Furthermore, it seeks to create a greater awareness of the existence of problems that are likely to influence the performance of these systems negatively. It is based on a review of relevant literature.
2. Expert systems

Lee [14] defines expert systems as "computer programs that attempt to imitate the reasoning process and knowledge of experts in solving specific types of problems". An expert should be an individual who is widely recognized as being able to solve problems in a particular field where most other people are less effective or efficient. An expert can make good guesses based on incomplete information, using heuristics to fill in the gaps. Developing a good set of heuristics is a necessity in building an expert system. An expert system should have the expert's knowledge. If it only contains factual information, it is a knowledge-based system, using knowledge to draw conclusions and/or give diagnoses or advice. An expert system is a knowledge-based system in which at least part of the knowledge includes that of one or more experts. However, in the literature both terms are frequently used interchangeably. Many organizational consequences are the same for both systems.

2.1. The architecture

An expert system generally consists of [11, 32]:
1. knowledge base,
2. inference engine,
3. working memory,
4. explanatory facility,
5. user interface.

The knowledge base contains the knowledge of one or more persons, usually in the form of rules (if <condition> then <conclusion>). The inference engine solves problems by making interpretations or drawing conclusions using this knowledge. The expert system must give users an explanation, when required. It must answer the following questions:
- why a certain question is asked,
- what the answer could be (the possibilities),
- how the system came to a certain conclusion.

The user interface is the "face" of the expert system. Communication, usually by means of a dialogue, is the main function of the interface. Figure 1 shows the relationships between the components of the expert system and between the components and the user.

Fig. 1. The architecture of an expert system.
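The rule format and the role of the components can be made concrete with a minimal sketch of a forward-chaining inference engine. All rule names, facts and conclusions below are invented for illustration and do not come from any system discussed in this paper; real expert-system shells of the period were considerably richer.

```python
# Minimal sketch of the components described above: a knowledge base of
# if-then rules, a working memory of facts, a simple inference engine,
# and a trace that lets an explanatory facility answer "how" a
# conclusion was reached. All rule and fact names are illustrative.

KNOWLEDGE_BASE = [
    # (rule name, conditions that must all hold, conclusion)
    ("R1", {"engine_cranks": False, "battery_dead": True}, "replace battery"),
    ("R2", {"engine_cranks": True, "fuel_empty": True}, "refuel"),
]

def infer(working_memory):
    """Fire the first rule whose conditions all match the working
    memory; return (conclusion, trace of fired rules)."""
    trace = []
    for name, conditions, conclusion in KNOWLEDGE_BASE:
        if all(working_memory.get(k) == v for k, v in conditions.items()):
            trace.append(name)
            return conclusion, trace
    return None, trace  # no applicable rule: report inability to decide

advice, how = infer({"engine_cranks": False, "battery_dead": True})
print(advice, how)  # -> replace battery ['R1']
```

The trace covers only the "how" question; a fuller explanatory facility would also answer "why" a question is asked (by naming the rule a pending condition serves) and "what" the possible answers are.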
2.2. Applications

There are many possible applications of expert systems, but the main ones are in the fields of diagnosing, planning and design, consulting, controlling and legislation. In contrast to decision support systems, applications of expert systems mainly concern the operational level of decision-making. Expert systems can be "attractive" for organizations where human knowledge is scarce, when the demand for such knowledge is high, and/or when human experts are difficult to train. The problem has to be both solvable and definable, but must have a certain degree of complexity [8, 11, 27].
3. Organizational decision-making and expert systems
Expert systems are applied as aids to human decision-making in organizations. They may simulate or even replace human thinking and decision-making. There are many reasons to use expert systems, but all seem to be related to either the prevention of human shortcomings or the improvement of human characteristics. Among the disadvantages of human experts that can be prevented by the use of expert systems are that humans get tired, are inconsistent and slow, and get emotional. Apart from these cognitive shortcomings, humans are expensive, and if they leave the organization their scarce knowledge is lost. Expert systems can obviate these disadvantages. Moreover, the broader application of knowledge inside the organization is seen as a major advantage of expert systems compared to humans. A non-expert or outsider can, by using an expert system, make decisions regarding issues of which he has little or no knowledge. Through expert systems this scarce knowledge is no longer restricted to one or a few persons, but can be made available to all system users. This is also likely to economize on labor costs, as less experienced persons are generally paid less than experts.

3.1. Prevention of human shortcomings
Although the use of expert systems may help to prevent imperfections of human decision-making due to emotionality and/or tiredness, both factors do not seem to occur very frequently. From the standpoint of human shortcomings, inconsistency seems the major problem, possibly due to emotionality and/or tiredness, but also due to the human inability to assess all relevant criteria to reach a decision. Expert systems are designed not to have these problems. However, there is no guarantee that consistent decisions will actually be reached. The GIGO-effect (garbage in, garbage out) illustrates the importance of input data quality for the quality of the output. The input may be manipulated by the user of the system. An illustration is given by Noë [19]. Despite the fact that the criteria to grant social security payments are the same throughout the Netherlands, it was observed that different decisions were made for similar situations in different towns by different employees. An expert system was developed to make the decisions more consistent. However, as the following (translated) quote will show, consistency is not always attained:

The social worker asks the applicant whether he shares his household with anyone or not. The initial answer is "Yes". The social worker knits his brows and asks whether the applicant is sure. The tone presumes that the desired answer is "No". The applicant immediately responds with "Well, eh actually . . . No".

When the user, in this case the social worker, wishes a certain outcome from the expert system, he may manipulate the input, provided he knows what input corresponds with this desired output. Thus, the hypothesis that the use of expert systems eliminates the "bias" in the decision-making process is quite naive. It is a typical technological solution that has to be supplemented by organizational measures in order to perform effectively. Perhaps knowledge engineers think in terms of "bias" elimination, but as long as the user takes the final decision such a goal is "wishful thinking", especially when the user has a personal interest in the decision. This is quite possible in the case of the evaluation of personnel [9], managerial tasks [15] and credit granting [30]. The expert system can, in such cases, fulfil a legitimizing function: people have a tendency to expect a computer (in this case an expert system) to perform better than humans [7]. The user can, however, make incorrect decisions but blame the expert system. A blind faith in technology can work in reverse.

3.2. Preservation of knowledge
Rose [22] provides an interesting example of an expert system that was built to retain the knowledge of a maintenance engineer in Southern California. Based on decades of experience, this engineer was the only person who knew how to maintain the dam of a storage lake. The valley created by this project was and is densely populated. Before this expert retired, he helped to construct an expert system. Although Rose does not mention it, the question arises: who will be responsible for the maintenance of this system?
Especially as this is an area where earthquakes are quite common, this question is more than just academic. The implication of this example is that when maintenance is needed, the organization will be dependent on new experts. When such new experts are not available, the organization's dependency on human experts is merely substituted for dependence on the expert system [24]. Moreover, after some period of time no one may be able to tell whether or not the system's knowledge is still valid. Van Steenis [31] expects that the maintenance costs of expert systems will be higher than those of conventional information systems, and therefore higher than the usual 70% of total costs. The Dutch insurance company Centraal Beheer even decided to abandon an expert system after its development, because it turned out to be too expensive to maintain [20].

3.3. Wider availability of knowledge

One of the first and at the same time one of the most difficult activities in developing an expert system is the acquisition of the knowledge from the human "experts". This is especially hard because much decision-making is based on intuition (cf. Polanyi's concept of "tacit knowledge" [21]). However, this formalisation of human intuition is needed for an expert system. After its incorporation into an expert system, the expert's scarce knowledge, being the expert's base of power in the organization, is no longer the exclusive property of the expert. Socha [25] mentions an expert who refused to cooperate in building an expert system for precisely this reason. In addition, a non-expert may never be sure whether or not the expert has provided the correct knowledge to be incorporated into the expert system. Full cooperation may only be expected if the expert expects an improvement of his/her situation after the system has been implemented. Furthermore, "to err is human" also holds for experts. According to Fox [5], the statement "Expert systems do not make mistakes" is a myth.
Humans make mistakes, and knowledge acquisition is the most difficult stage in developing an expert system. Mistakes made in developing expert systems can be reduced by using the knowledge of more persons as input knowledge. This, however, means higher development costs. A second disadvantage of this approach is that there may be disagreement between experts, which is especially likely in the case of less structured fields. Finally, there is also the risk of the "representational mismatch" [18], i.e. the expert system is not an adequate reflection of the expert's knowledge due to mistakes made in the process of formalizing that knowledge. Such problems can of course be largely prevented by extensive prototyping and testing. However, the only persons who can judge the system's merits are the experts themselves, which again makes the entire process dependent on the cooperation of the experts. A major, and generally acknowledged, shortcoming of expert systems is that their application is usually restricted to a narrow area. At "the borders" of the knowledge-domain, the quality of the output may be low, and the system's output may need adjustment [26]. In such cases, the critical judgment of the user remains essential. However, the use of expert systems can lead to the erosion of human expert knowledge, because they may allow the user to ignore changes in the knowledge based on newly evolved insights. Just as the introduction of pocket calculators tended to erode users' ability to do mental arithmetic, the users of expert systems may become less proficient. Karlsen and Oppen [10] concluded, on the basis of case-studies, that knowledge and skills are lost when not used for a long period of time. Goeranzon and Josefson [6] concluded that newly hired employees, trained with the help of an expert system, adjust their knowledge to the expert system's knowledge. In the case of XCON, a similar danger was foreseen [26]. Apparently, people have an intense faith in the outcomes of the system. The risk that users will start to "think mechanically" is not imaginary [31]. Can a layman use an expert system to make decisions as well as the human experts? Is the quality of the decisions or decision-making at stake here?
A certain amount of knowledge remains necessary, especially when decisions are on the borders of the knowledge-domain and/or the knowledge quickly becomes obsolete. Expert systems may make knowledge more broadly available in the organization, but the examples mentioned above show that this may be achieved at the expense of the quality of decision-making. Finally, the availability of the explanatory facility does not overcome the effect of blind faith. Zahedi [35] expects that the facility will provide a "constant education". However, the fact that users can use this explanatory facility does not guarantee that they will actually use it. Its use is optional [2], and incurs a loss of productive time. When the non-expert user's pay is related to productivity, the explanatory facility is not likely to be used frequently.
4. A typology of expert systems

Expert systems can be used in at least four different ways (see also [30] and [3]). Firstly, as a problem solver: the system prescribes a decision that the user follows blindly. Secondly, as an adviser, giving the user advice on one or more alternative decisions. Thirdly, as a provider of a second opinion: the system checks whether or not the decision of a human is correct. Fourthly, as a tutor, where it is used to educate human beings so that they become experts in the field. This mode is excluded in the following discussion. Expert systems can be categorised by the level of knowledge needed from their users. When the expertise is low, the user must be able to rely on the expert system, which is only possible if the system:

(1) completely covers the knowledge-domain. This means that, even on the borders of the knowledge-domain, no mistakes are made. When the system is incapable of making an accurate decision, this must be reported to the user.
(2) is fully up-to-date. Up-to-date means that the outcomes/decisions must be in conformity with the current situation and rules. When the knowledge-domain is varying, the knowledge base must be adjusted continuously and rapidly. When maintenance costs are high, the savings may not exceed the costs.

Problems can occur when the present level of knowledge is not equal to the required or desired knowledge-level. This is likely to occur in the case of non-routine problem areas [4, 12]. Figure 2 illustrates the applicability and the application of expert systems.
Fig. 2. The applicability and the application of expert systems. (A 2 x 2 matrix crossing the applicability of an expert system, which follows from whether the problem domain is complete and up-to-date, with its actual application; both axes carry the categories "problem solver" and "adviser/second opinion", and the resulting quadrants are numbered I to IV.)
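The decision logic behind the matrix can be sketched as follows. This is an illustrative reading of the typology, not code from the original article; the function names and string labels are invented.

```python
# Sketch of the Fig. 2 logic: applicability follows from the state of
# the knowledge-domain, while application is an organizational choice.
# Labels and function names are illustrative.

def applicability(domain_complete: bool, up_to_date: bool) -> str:
    # Only a complete, current knowledge base justifies blind reliance.
    if domain_complete and up_to_date:
        return "problem solver"
    return "adviser/second opinion"

def quadrant(domain_complete: bool, up_to_date: bool, used_as: str) -> str:
    fit = applicability(domain_complete, up_to_date)
    if fit == "problem solver":
        return "I" if used_as == "problem solver" else "III"
    return "II" if used_as == "problem solver" else "IV"

# A system that is no longer up-to-date but is still followed blindly
# lands in the risky quadrant II:
print(quadrant(True, False, "problem solver"))  # -> II
```

The point of the sketch is that only the first function is a property of the system; the second depends on how the organization chooses to deploy it.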
Both axes of the matrix are divided into the categories "adviser/second opinion" and "problem solver". Expert systems can only be applied as problem solvers (quadrant I) when they are up-to-date and cover the complete knowledge-domain. Generally, the expert system can only be used as an adviser (quadrant IV). Obviously, expert systems in quadrants II and III are applied incorrectly: the actual state differs from the desired state. For example, quadrant III shows the situation where the expert system could be used as a problem solver, but is used as an adviser/second opinion. Quadrant II seems to occur more often, judging from case-studies in the literature. All disadvantages mentioned in the third section are located in this quadrant. Although the output of the expert system can only be seen as advice, it is de facto a conclusion. When a non-expert uses such a system as a problem solver, labor costs may be low, but the number of (potential) mistakes might be significant. An interesting problem that may arise in quadrant III is the question of responsibility for mistakes. Who is responsible? Probably the user will be blamed. But could this person have acted otherwise? Was he, as a non-expert, able to prevent the mistake? Or should the organization have prevented the wrong decision by means of organizational measures? An alternative "victim" might be the knowledge engineer, who might have made a mistake when designing the system. And the expert could be blamed if the knowledge base is found to be invalid. Interesting lawsuits are already beginning to arise.

Organizations must divide tasks, competencies and responsibilities between man and machine, taking into account the shortcomings of expert systems. When a certain level of user knowledge is necessary, the organization must take measures to provide enough knowledge. Other organizational measures include:
1. separating the person who enters the data into the knowledge base from the person who makes the decision;
2. having (part of) the decisions made by the expert system validated by a human expert.
There may be financial and social risks in making incorrect decisions.

5. Conclusions and discussion

Certainly, many improvements may be expected from expert systems, but these gains are by no means automatically attained by technology alone. To quote Liebowitz [16]:

As executives consider the development and the use of expert systems in the company, executives should not be fooled by the technology. Expert systems are not the end-all to company problems. [. . .] If they realize the limitations, as well as the advantages, of expert systems, then the possibility for the executive's overexpectations of expert systems will be minimized.

An expert system is nothing more than an aid or resource for achieving organizational goals. According to Williamson [33], an expert system "may be a very good one, but don't get fooled into thinking you're going to buy the superior skills and reasoning ability of a true expert just because a vendor calls its product an expert system". As was the case with other "new" technologies in organizations, it is not sufficient to pay attention only to technical aspects in order to reach these goals. The use of any technical system has to be supplemented by proper organizational measures [17, 24, 34]. Especially when mistakes have severe consequences and when the expert system is used as a problem solver, the organizational aspects need thorough attention. The investment in expert systems' technology has to be supplemented by examination of the effect of this technology on the organization in order to realize a socio-technical approach to expert systems. Anthropocentric expert systems [4, 12] may prove to be a useful step in this direction. Finally, one may ask whether an investment in expert systems instead of in the education of employees can be seen as a choice of machines in lieu of people. Such a choice may reflect management's priorities.

Acknowledgement

An earlier version of this article was presented at the Third International Production Management Conference on Management and New Production Systems in Gothenburg, Sweden, May 27-28, 1991. The authors are indebted to Willem de Lange, Ad Feelders, an anonymous referee, and the editor of this journal for their useful comments and suggestions. The usual disclaimer applies.
References

[1] M. Beer, B. Spector, P.R. Lawrence, D. Quinn Mills, and R.E. Walton. Managing Human Assets, The Free Press, New York, 1984.
[2] J. Benders. Optional Options: Work Design and Manufacturing Automation, Avebury, Aldershot, 1993.
[3] T.A. Byrd. "Implementation and Use of Expert Systems in Organizations: Perceptions of Knowledge Engineers", Journal of Management Information Systems, Vol. 8, No. 4, 1992, pp. 97-116.
[4] O. Danielsen. Human Centredness and Expert Systems, FAST-paper 268, Commission of the European Communities, Brussels, 1991.
[5] M.S. Fox. "AI and Expert Systems: Myths, Legends, and Facts", IEEE Expert, Vol. 5, No. 1, 1990, pp. 8-19.
[6] B. Goeranzon and I. Josefson (eds.). Knowledge, Skills and Artificial Intelligence, Springer, Berlin, 1987.
[7] W. Hartman. "Expert Systemen: Toepassingen - Ontwikkeling - Gevolgen voor de organisatie (deel III)", Maandblad voor Accountancy en Bedrijfseconomie, Vol. 64, No. 4, 1990, pp. 149-152.
[8] D. Hertz. The Expert Executive: Using AI and Expert Systems for Financial Management, Marketing, Production and Strategy, J. Wiley and Sons, New York, 1988.
[9] B. Humpert, B. Teel, E.S. Najar, L.R. Medsker, and M.Z. Cadez. "PEOPL: A Knowledge-based System for the Evaluation of Personnel", Expert Systems, Vol. 6, No. 2, 1989, pp. 60-72.
[10] T. Karlssen and M. Oppen. Informationstechnologie im Dienstleistungsbereich, Sigma, Berlin, 1985.
[11] H. Keus. "Aanpak van expertsysteem-projecten", Kennissystemen, Vol. 1, No. 3, 1987, pp. 22-27.
[12] J. Kirby. "On the Interdisciplinary Design of Human-Centered Knowledge-Based Systems", International Journal of Human Factors in Manufacturing, Vol. 2, No. 3, 1992, pp. 277-287.
[13] A.M. Koopman-Iwema (ed.). Automatiseren is reorganiseren; Richtlijnen voor Personeelsmanagement, Kluwer/NVP, Deventer, 1986.
[14] D. Lee. "Expert Decision Support Systems for Decision Making", Journal of Information Technology, Vol. 3, No. 2, 1988, pp. 85-94.
[15] F. Lehner. "Expert Systems for Organizational and Managerial Tasks", Information & Management, Vol. 23, No. 1, 1992, pp. 31-41.
[16] J. Liebowitz. "Approaches for Learning about Expert Systems; A Management Introduction", Management Decision, Vol. 26, No. 5, 1988, pp. 53-57.
[17] D. Leonard-Barton. "The Case for Integrative Innovation: An Expert System at Digital", Sloan Management Review, Vol. 29, No. 1, 1987, pp. 7-19.
[18] V. Mital. "Knowledge Systems for Financial Advice", The Knowledge Engineering Review, Vol. 7, No. 3, 1992, pp. 215-249.
[19] F. Noë. "Expertsysteem helpt voordeurdeler", Automatisering Gids, Vol. 24, No. 12, 1990, pp. 1-2.
[20] F. Noë. "Kennissysteem Aegon valt in de prijzen", Automatisering Gids, Vol. 26, No. 44, 1992, p. 5.
[21] M. Polanyi. Personal Knowledge: Towards a Post-Critical Philosophy, Routledge and Kegan Paul, London, 1962.
[22] F. Rose. "Een gekloonde ingenieur", Intermediair, Vol. 24, No. 44, 1988, pp. 17-19.
[23] J.L. Ryan. "Expert Systems in the Future: The Redistribution of Power", Journal of Systems Management, Vol. 39, No. 11, 1988, pp. 18-21.
[24] P. Schefe. "The Impacts of Expert Systems on Working Life - An Assessment", AI and Society, Vol. 4, No. 2, 1990, pp. 183-195.
[25] W.J. Socha. "Problems in Auditing Expert Systems Development", EDPACS, Vol. 15, No. 9, 1988, pp. 1-6.
[26] J.J. Sviokla. "An Examination of the Impact of Expert Systems on the Firm: The Case of XCON", MIS Quarterly, Vol. 14, No. 2, 1990, pp. 127-140.
[27] K. Terplan. "Performance Evaluation and Expert Systems", EDP Performance Review, Vol. 15, No. 9, 1987, pp. 1-9.
[28] J.H.R. van de Poel. "Contract en beheersing; Een theoretische analyse", Maandblad voor Bedrijfsadministratie en -organisatie, Vol. 93, No. 1107, 1989, pp. 150-157.
[29] P. van den Besselaar. "Aangrijpingspunten voor technology assessment", Proceedings AI Applications '89, SCI, [s.l.], 1989, pp. 503-516.
[30] R. van der Spek (ed.). Kennissystemen in de financiele wereld, Werkgroep Expertsystemen, The Hague, 1992.
[31] H. van Steenis. "Kennissystemen steeds breder toepasbaar", Computable, Vol. 23, No. 23, 1990, pp. 41-43.
[32] P. Webster. "Kennissystemen en expertsystemen; Hoe staat het ermee?", Financieel Overheidsmanagement, Vol. 2, No. 10, 1989, pp. 4-7.
[33] M. Williamson. Artificial Intelligence for Microcomputers: The Guide for Business Decision Making, Brady Communications Company, New York, 1986.
[34] E.E. Woherem. "Human Factors in Information Technology: The Socio-Organisational Aspects of Expert Systems Design", AI and Society, Vol. 5, No. 1, 1991, pp. 18-33.
[35] F. Zahedi. "Artificial Intelligence and the Management Science Practitioner; The Economics of Expert Systems and the Redistribution of MS/OR", Interfaces, Vol. 17, No. 5, 1987, pp. 72-81.