information management

Implementing a methodology

by RICHARD VERYARD
Abstract. Many good information systems methodologies are not used to full effectiveness by the organizations that acquire them. This is often due to inadequate planning of the implementation of the methodology and inadequate consideration of the opportunities for organizational change that the methodology creates. This paper discusses the implementation of a methodology as a special case of the management of change.

Keywords: data processing, information systems, information technology, systems design methodologies, implementation.
You are the manager of an information systems (IS) department. You decide that the tools and techniques used in your department are out-of-date, and you consider adopting a structured and automated methodology. You talk to various vendors, who provide some technical information for your staff to evaluate, along with some general (and usually unquantified) claims about the potential increase in quality and/or productivity. Various members of your department lobby for their favourite methodology. You may talk to existing users of the methodologies under consideration, or to independent consultants. Eventually the decision is made to go with one selected methodology, with selected supporting software tools. At this stage, all you have to do is send a few key people on a short training course, get the software installed, make the manuals available to the entire department, and you are ready to begin your first project using the new methodology. In no time at all, the promised benefits (reduced development costs, reduced maintenance, etc.) will be forthcoming. Wrong!

This paper takes the standpoint that the implementation of a new methodology is much more difficult, and more critical to the success of the methodology within an organization, than the selection. Assuming that you have selected a good methodology, for the right reasons, this paper explains what you need to do next: how to set detailed and measurable objectives for the methodology, and how to plan and control the implementation. It also discusses some organizational and technical issues: how to fit the methodology, the organization and the technical environment together.

James Martin Associates, Littleton Road, Ashford, Middlesex, TW15 1TZ, UK

vol 29 no 9 november 1987

Planning
For the purposes of this paper, a methodology is a system of tasks and techniques, supported by automated tools and/or direct experience, for carrying out some or all of the following IS activities:

• planning
• analysis
• design
• development
• product selection
• implementation
• operations
• maintenance
• project management
• project/system coordination
The automated tools will probably be in the form of hardware and software. The remainder of the methodology will be in the form of clerical procedures to be performed by IS and non-IS staff. A methodology is therefore a socio-technical system, with both automated and manual components, integrated into a flexible and coherent structure [1, 2].

Drift
First, let us consider what happens in many organizations where the implementation is not properly planned and controlled. The introduction of the methodology may be highly visible with statements of commitment and enthusiasm from senior management. Soon, however, attention shifts to other day-to-day concerns. Management roles are redefined or reorganized so that those individual managers originally committed to the methodology are no longer directly responsible for it. To discover who
0950-5849/87/090469-06 $03.00 © 1987 Butterworth & Co (Publishers) Ltd.
now has responsibility for it may be difficult. Technical staff may also be reorganized or reassigned. The methodology starts to drift. The symptoms of drift are as follows:

• Many staff are given formal training in the methodology but few, if any, are given the opportunity to try it out.
• A pilot project using the methodology is started but never completed.
• A pilot project is completed, but nobody seems interested in following up the results, or learning from them.
• No new projects are started using the methodology.
• New projects pay lip service to the methodology, but turn out not to be adhering to its principles and guidelines.
• The organization abandons dialogue with the methodology vendor and/or with other users of the methodology.

So what goes wrong? There are three possible failure scenarios.

First, the methodology may have been poorly chosen for the organization. An unbridgeable gap may exist between the requirements of the methodology and the expertise of the staff who are to practise it. Or perhaps the principles of the methodology clash irreconcilably with the style and culture of the organization. With a good methodology, however, such incompatibility is rare.

Second, the methodology may be appropriate for the organization, but insufficient start-up resources are allocated to it, and the necessary expertise is not injected. Most methodologies call for a greater change in staff practices and skills than can be imparted by formal training alone. Staff new to structured methods may 'freeze' when required to follow a methodology without proper support, thus reducing their productivity, sometimes to nothing.

Third, the methodology may overcome these hurdles, get off the ground, and then drift. This drift is due to a failure to plan, and a failure to control. The implementation has been badly managed.

Of course, every organization will have some document it can point to, which it calls a plan.
Often, this will be little more than a training schedule (but with the training objectives not spelled out), or an allocation of resources to rewrite the standards manual. Bureaucratic organizations commonly do the latter. There is more to planning than that.
Plan contents

In a plan it is important to ensure that everybody's expectations are realistic. A plan must make the following explicit. First, what will be achieved by using the methodology? (If productivity, whose productivity, and how is it measured? If quality, how is it defined? etc.)
Second, how much will be achieved, and by when? For example, there may be a 30% reduction in elapsed development time in all projects starting after the first six months of methodology use. This may or may not be an achievable target in a given situation.

Third, what implementation strategy will be adopted? In particular, will a pilot project be commissioned, and how will the use of the methodology be expanded when the pilot is complete?

Fourth, what resources are needed, when and why? Unless the reason for needing the resources is explained, they may well be reallocated or squeezed to cover other eventualities.

Fifth, how will the plan be controlled? Control procedures should be specified, and responsibilities assigned to named individuals, job positions or groups.

The following sections address these planning issues.
Benefit justification

Justifying a methodology by its expected benefits, as for any investment, may be done in three ways.

Traditional: an investment of $X will give a return of $Y, thus yielding a satisfactory net present value, internal rate of return and/or payback period.

Relative: case studies, examples and other anecdotal evidence are documented, showing how substantial benefits were achieved elsewhere. Alternatively, past situations within this organization may be demonstrated, where substantial benefits would have been achieved had the methodology been in use then.

Strategic: a significant weakness in the organization is perceived to exist, which prevents a flexible yet stable response to environmental opportunities and threats, and which the methodology is expected to alleviate or remove.

The benefits of a new methodology are often intangible, for example: improved quality and reliability of systems and software; improved quality and reliability of the systems development process; greater end-user control and faster response to new business opportunities. Other benefits may be quantifiable, e.g. reduced development/maintenance costs for systems and software, but hard to estimate accurately. If the benefits of a methodology are long-term and/or indirect, traditional justification may be hard to demonstrate. In retrospect, it may be too complex to separate those achievements that can be attributed to the methodology from those that should be attributed to some other factor. Also, it may be too complex to estimate accurately what the situation would have been without the methodology. Top management may prefer investments whose justification is traditional, but should be aware that quantification of the intangible benefits of improving infrastructure, including information services, is difficult and inexact. Relative and strategic arguments should be allowed, to supplement or supplant the traditional cost-benefit justification, but must be well documented and detailed. It is important to be clear what the expected benefits of the new methodology are.
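The 'traditional' justification is plain discounted-cash-flow arithmetic. As a minimal sketch (all figures are invented for illustration, not drawn from this paper), the net present value and payback period of a hypothetical methodology investment might be computed as follows:

```python
def npv(rate, cashflows):
    """Net present value of a series of yearly cash flows.
    cashflows[0] is the initial (year-0) amount, usually negative."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def payback_period(cashflows):
    """First year in which the cumulative cash flow turns non-negative,
    or None if the investment never pays back."""
    total = 0.0
    for t, cf in enumerate(cashflows):
        total += cf
        if total >= 0:
            return t
    return None

# Hypothetical example: $100k up front (training, tools, support staff),
# then growing yearly savings from reduced development/maintenance cost.
flows = [-100_000, 20_000, 40_000, 60_000, 60_000]

print(round(npv(0.10, flows)))   # NPV at a 10% discount rate → 37299
print(payback_period(flows))     # years until cumulative break-even → 3
```

A positive NPV at the organization's discount rate, or a short enough payback period, is the traditional case; the difficulty discussed above is that the yearly savings figures are exactly what is hard to estimate for a methodology.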
Benefit quantification and timing

Benefits should be defined and measurement procedures specified. Measures that depend on the organizational structure or on the methodology itself should be avoided. For example, a measure of productivity that includes department A's effort, but not department B's, will be invalidated by a redistribution of work between departments A and B. The most common instance of this is when A is the information systems department and B is the end-user department: IS productivity is 'increased' by offloading work to other departments! And if another measure is the development/maintenance ratio, this will be distorted if the methodology changes what counts as development rather than maintenance.

Targets should be set by comparison with results in similar organizations using the same or a similar methodology, and negotiated with appropriate managers. No methodology is completely new; any new methodology will be based on prior experience with similar methods. There should always be some historical basis, however rough, for planning and estimating.

A fast and furious implementation will demand more resources than a slow and cautious one, and may involve more risk, but will provide quicker benefits. That is a trade-off to be decided by management according to the prevailing culture and organizational preferences.

Management generally underestimates the length of time needed for any major technical or organizational innovation to be fully 'bedded in'. A new methodology is both a technical and an organizational innovation, and is therefore particularly subject to this danger. This is compounded by tactical misrepresentation, whereby top management exaggerates its impatience for any development, in the belief that this will impart a sense of urgency and productivity to staff. These factors often lead to overoptimistic and foreshortened schedules.
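The department-boundary pitfall can be made concrete with a numerical sketch. The function points and person-day figures below are invented: a productivity measure scoped to IS effort alone appears to improve when work is offloaded to end users, while a measure over total effort shows that nothing has changed.

```python
# Illustrative sketch of the measurement pitfall; all figures invented.
# 'fp' = function points delivered; effort measured in person-days.

def productivity(fp, effort_days):
    return fp / effort_days

fp = 50  # the same system is delivered in both scenarios

# Before the methodology: IS does all 100 days of work.
before_is, before_user = 100, 0
# After: 30 of those days have been offloaded to the end-user department.
after_is, after_user = 70, 30

# A measure scoped to the IS department alone appears to improve...
is_only_before = productivity(fp, before_is)   # 0.5 fp/day
is_only_after  = productivity(fp, after_is)    # ~0.71 fp/day

# ...while a measure over total effort is unchanged.
total_before = productivity(fp, before_is + before_user)  # 0.5 fp/day
total_after  = productivity(fp, after_is + after_user)    # 0.5 fp/day
```

The defence is the one recommended above: define the measure over a boundary that the methodology and the organization chart cannot move.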
Unnecessary haste should be resisted, although this should not be regarded as an excuse for indefinitely prolonging the implementation. Even a small organization may take a couple of years before all IS projects are managed according to the methodology. Large organizations should plan for several years of transition. It is therefore important to set intermediate and quantitative (or at least measurable) milestones against which to manage the implementation. Not all the benefits should be expected at once; they should be ranked. It may be possible to focus on areas of the business where the implementation of the methodology will be most beneficial. Some IS
methodologies include a top-down planning exercise which will help to address these issues. Due to the expected gradual implementation, it will be possible to compare methodology-driven and other projects in parallel. However, this comparison should be interpreted intelligently. There may be many reasons, including political and psychological ones, for the success or failure of an individual project or its comparative showing against other projects.
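The intermediate, measurable milestones recommended above can be recorded explicitly, each with a metric, a target value and a due date, so that reviews compare actuals against targets rather than impressions. A minimal sketch; the milestone names and figures are hypothetical:

```python
# Hypothetical milestone records for a phased methodology implementation.
from dataclasses import dataclass

@dataclass
class Milestone:
    description: str
    metric: str          # how achievement is measured
    target: float
    due_month: int       # months after methodology adoption

    def achieved(self, actual: float) -> bool:
        return actual >= self.target

plan = [
    Milestone("Pilot project completed",
              "deliverables signed off (%)", 100, 6),
    Milestone("Trained analysts practising on live projects",
              "share of trained staff (%)", 60, 12),
    Milestone("New projects started under the methodology",
              "share of project starts (%)", 80, 24),
]

# At each review, compare actuals against targets milestone by milestone.
actuals = [100, 45, 0]
for m, actual in zip(plan, actuals):
    print(m.description, "met" if m.achieved(actual) else "not yet met")
```

The point of the structure is not the code but the discipline: every milestone names its measurement procedure in advance, in line with the measurement-validity cautions above.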
Implementation strategy
The pilot project
Having decided what benefits are to be achieved overall, and set intermediate milestones, you then need to decide where to start, and how to proceed. The traditional approach to implementation is the pilot project: a small, short project that will show off the methodology and indicate any possible problems. In addition, the implementation strategy must define whether the early stages of the implementation are to be regarded as experimental, how the experiment will be evaluated and its results used, and how the transition is to be made from experiment to commitment. In some large organizations, the experimental phase may last as much as one or two years.

There is usually a dilemma in choosing an area suitable for a pilot project [3]. The project must be small enough, and simple enough, to have a good chance of success despite the lack of skills and experience within the organization. It must also be large enough, and complex enough, to provide a real test of the methodology and to provide some useful experience for the future. If the benefits expected from the project are important and/or urgent, this may place the team members under too much pressure to learn properly what they are intended to learn. On the other hand, if the project would not have been undertaken at all, except for the need for a pilot, it will not be taken seriously. Some project managers may argue that their project should be used as the pilot; others may argue that their project is not suitable. In both cases, the motivation may be suspect.

Similar difficulties arise with the staffing of a pilot. If you put your best people onto the pilot team, it will not demonstrate the ability of your average people to practise the methodology. In-house standards and interpretations of the methodology may be developed such that only the best people can understand them. The average people may lack the confidence they will need when they come to use the methodology later.
However, if you do not put your best staff onto the pilot team, does it have any chance of success? The pilot project will be more difficult than later projects because the methodology is not yet tuned to the organization, nor is the organization attuned to the methodology. Expertise will
be scarce, and the requisite support structures will not yet be in place.

What is the purpose of a pilot project? If the purpose is to demonstrate the methodology, a 'safe' area should be chosen. If the purpose is to test and tune the methodology, a 'typical' area should be chosen. It is practically impossible to satisfy both purposes with a single pilot project.

Having completed the pilot project, it will be necessary to draw the appropriate lessons, revise the methodology implementation plan if necessary, and continue with a broader exploitation of the methodology. It is advisable to be prepared for this review. If the pilot project has been a success, there will be much impatience for further exploitation. It is important to maintain this momentum and interest. The evaluation criteria, and some alternative courses of action depending on the outcome, should be mapped out in advance so that the review is not rushed.

Think ahead. What will you give the pilot participants to do when the pilot is complete? You will want to achieve the following goals: make maximum use of the expertise they have acquired, publicly reward them for their successes, and reaffirm management commitment to the methodology. It is often difficult to achieve all these simultaneously. Without forward planning it may be impossible to escape the dilemma: either give the pilot participant a less challenging assignment than he/she deserves, or assign responsibilities and tasks to him/her that are not directly related to the methodology, thus losing the continuity of expertise. Plan for the continuity of staffing so that staff can develop their expertise in particular tasks and techniques across several projects.
Resource quantification and timing

The level of resources that will have to be devoted to detailed planning of the methodology implementation, and to training and supporting the methodology users, depends on the implementation strategy, as discussed above. It is often advisable to group those responsible for the coordination and support of the methodology into a project team or task force. You can expect these staff to be overloaded with work, while they themselves may still be learning the methodology. Their responsibilities may include the following tasks:

• ensuring the correct balance of expertise on project teams,
• managing and/or conducting formal training sessions,
• supporting inexperienced colleagues and those using the methodology directly, e.g. programmer/analysts on design/development projects,
• providing clarification and extension of standards and guidelines,
• establishing the scope of various projects, and the
impact of one project on another, which may apply to projects not using the methodology as well as those that do,
• implementing and supporting the software that supports the methodology, and building interfaces to other software, e.g. existing data dictionary or program catalogues,
• establishing central functions prescribed by the methodology, e.g. data administration,
• maintaining contact with similar organizations using the same or similar tools and techniques, and also with the methodology vendor, to see how the methodology is developing elsewhere.

Some of this work may be carried out by external consultants, but much of it will need to be done by your own staff. Priorities must be clearly set. Perfectionism is often a danger; to get workable standards and guidelines available for use on projects, you may need to encourage a rough-and-ready approach for the first version. Polishing and cosmetic work is a much lower priority. This may discomfort staff who are used to an ethos of 'get it right first time', but will not seem so strange to staff with software prototyping experience.
Monitor and control

Determine who is to control the implementation of the methodology, how this is to be done and how regularly. The following aspects should be reviewed and the plans changed if appropriate.

Quality: is the quality control process really happening, is it effective, is the level of resources consumed by it appropriate, do the deliverables have the expected level of quality?

Estimating: were the estimates of timescales and resources accurate; do plans and schedules need to be adjusted? Was the utilization of project and support staff at the expected level?

Commitment: has the methodology been given the required level of commitment?

Talent: is the expertise required by the methodology being developed in-house, or is the organization still dependent on external help even for simple and basic tasks?

Evolution: what is the cumulative effect of the changes being made to the methodology? Is the methodology becoming simpler or more complex, more or less disciplined? What are the implications of any differences between the in-house version of the methodology and the standard version?
Organizational issues
Methodology ownership

At first, the methodology may be perceived as an external innovation. Any enthusiasm for it may be taken as
criticism of the existing situation and practice. IS staff often perceive a new methodology as a set of fixed and inviolable standard procedures imposed on the IS department. Such perceptions must be altered if the IS staff are to feel good about the methodology. Programmers and analysts may well concede that their current tools and techniques are less than perfect. They should not be given the impression, which overenthusiastic supporters of the methodology may convey, that their current skills and methods are worthless. Otherwise, the IS staff may do one of three things.

First, they may resist the methodology, avoid making genuine use of its concepts and principles, and exploit any excuse to ignore its guidelines.

Second, they may carry out the methodology pedantically and bureaucratically, with dogged adherence to the letter rather than the spirit, in the hope that the methodology will be proven wrong and their previous methods thereby vindicated.

Third, they may completely forget their previous expertise, put total trust in the methodology, and follow its every recommendation. They will carry out counterintuitive or even absurd tasks without question, because they have lost confidence in their own judgement; the methodology, being 'scientific', cannot be wrong.

Although the psychology of the third scenario is different from that of the second, the effect is the same. No methodology can survive the absence of intelligent interpretation by its users, because no methodology can be 100% right all the time. IS staff must be allowed to make a gentle transition from their old methods to the new methodology. They need to build a bridge, so that they can evaluate and, if appropriate, retain their current skills and procedures, using the methodology as a framework. They should be allowed, perhaps even encouraged, to compare what the methodology tells them to do with what they would otherwise have done.
In some areas, the existing practice may be superior to the new methodology because it has been tailored over many years to the particular organizational context. A good methodology will provide a framework into which such organization-specific knowledge can be incorporated, thus allowing itself to be adapted to fit the organization within which it is being implemented. Changes to the methodology must be carefully justified and controlled, to ensure that the benefits and disciplines of the methodology are maintained and not lost. But it is essential to provide an open channel through which users of the methodology can provide feedback. As the methodology users, both within the IS department and elsewhere in the enterprise, see that they have real power to influence the content and direction of the methodology, to adapt and improve it to better fit their
particular needs, then they will start to feel 'ownership' of it. As the methodology starts to be regarded as 'ours', commitment to it will grow, and the benefits and disciplines will become more secure.

To embed the change in culture and attitudes that the new methodology may call for, heroes should emerge: individuals whose careers can be seen to have been enhanced by their own personal commitment to the methodology. A hero must be prepared to battle for his/her project despite apparent setbacks. He/she must be seen to be closely involved in the project, and must not use political stratagems to distance himself/herself from potential failure. Clearly a hero must be seen to be competent, and must not alienate non-heroes. Once such heroes have emerged, the ownership of the methodology by the organization will be secure.
Staff management

The criteria used for recruiting, managing, rewarding and promoting staff within the information systems department should be reviewed. Are staff motivated to learn, use, adhere to and develop the methodology? Is the overall staff profile suited to exploiting the methodology? Are the structure and culture of the department still appropriate?

A methodology whose benefits are longer-term may clash with a performance evaluation system that only takes recent work into consideration. Staff may then only take seriously those tasks and opportunities that provide instant feedback into their paycheque or job grade. This would significantly distort many methodologies.

There is also the question of perceived risk. Many programmers and analysts believe that they will lose more status if a project does not succeed than they will gain if it succeeds. Furthermore, no methodology can provide a 100% guarantee of success, so they are taking a risk with their own careers by using the new methodology. Under the current performance evaluation system, the gamble may seem unattractive. The performance evaluation system must be retuned so that the penalties of failure (reduced future opportunities, reduced salary, etc.) are not so high as to discourage the employee from taking the risk, yet not so low as to encourage the employee not to take the project seriously. By rewarding staff for the longer-term consequences of work done in previous years, staff will be encouraged to produce higher-quality, longer-lasting, more stable systems. And longer-term reward structures should reduce the turnover of good staff.
Training

Training should be provided to staff as near as possible to the start of the project on which they
are expected to use the methodology. This allows staff being trained to bear in mind the specific roles they will be playing during the project, and to get maximum benefit from the training session. If general information about the methodology is provided to the entire IS department, this should be regarded as education rather than training.

The right amount of material should be included in each training session. Wherever possible, the training should present small chunks that can be easily grasped and 'internalized'. The overall shape of the methodology should not be lost sight of, however. It does not make sense to ask staff to learn to perform a particular subtask without understanding how the output will be used in subsequent subtasks. Each technique should be taught in context, so that staff understand the reasons for doing it and the level of detail that must be devoted to it.

When staff are trained to use a new methodology, the training objectives are often unclear. Some training sessions are run with the apparent objective of persuasion: to change the attitudes of the staff being trained. (There is substantial evidence that training alone is not an effective way of changing attitudes.) Other training sessions spend most of the time exploring the full power of the modelling languages and logic built into the methodology. Staff are often confused by being shown rarely used features of the methodology before they are comfortable with the basics, and are often given the impression that they are expected to memorize a plethora of symbols and rules. If a methodology expert is going to be available during the project, either as full-time support or as a part-time consultant, it is better for the rarely used features to be introduced by the expert as and when the need arises on a particular project.
The right objective for a formal training session is to enable the individual to make a positive contribution to a project, to solve simple problems with manuals, tools and personal support, and to understand (and know how to extend) the limits of his/her own abilities.
Technical issues

A methodology can be viewed as an information system. It involves the creation, communication and interpretation of information and decisions, in the form of models and specifications. It will probably be supported by hardware and software. To implement the methodology requires implementation of the entire system including, but not restricted to, the hardware and software.

The following issues are relevant, and can be addressed using the standard approaches of information system implementation: conversion of existing work, including program specifications and system documentation, to the new methodology, and/or building bridges to enable existing and future work to be integrated; controlling the
technical efficiency and reliability of the specific tools and techniques of the methodology, individually and as a system; and implementing the software and hardware into the technical environment and establishing interfaces with other key components of the technical architecture.

It is advisable to let the organizational and administrative innovations drive the purely technical innovations, rather than be driven by the hardware and software [5]. The direction should be as shown in Figure 1.

organizational context → task structure → formal manual procedures → software → hardware

Figure 1. The direction of innovation
Conclusions

The implementation of an IS methodology into an organization is a major innovation, and should be managed according to the established principles of change management. A methodology carries its own culture and structure, and care should be taken to ensure that these do not clash with the culture and structure of the host organization. A methodology is a system (of procedures, communications, etc.) and should be implemented according to established guidelines of good system implementation. Attention to these principles and guidelines will reduce the chances of the methodology becoming stagnant, and increase the chances of the promised benefits being both achievable and achieved.
Acknowledgements

Thanks are due to Ernie Akemann, Richard Frankel and John Wyatt for their detailed comments.
References

1 Veryard, R 'What are methodologies good for?' Data Processing Vol 27 No 6 (July/August 1985) pp 9-12
2 Iivari, J 'A methodology for IS development as an organizational change: a pragmatic contingency approach' Proc. IFIP TC8 WG 8.2 Conference, Information Systems Development for Human Progress in Organizations, Atlanta, GA, USA (May 1987)
3 Leonard-Barton, D and Kraus, W A 'Implementing new technology' Harvard Business Review (November/December 1985)
4 Mayon-White, B (Ed) Planning and Managing Change Open University/Harper & Row, London, UK (1986)
5 Damanpour, F and Evan, W M 'Organizational innovation and performance: the problem of organizational "lag"' Administrative Science Quarterly Vol 29 (September 1984) pp 392-409