Computer Audit Update
November 1994
people underestimate the likely requirement and actually propose a machine that will be too small, too soon. Occasionally, it is better to spend more now than to repeat the exercise in the near future.

Disaster recovery

One organization spent millions of pounds on a hot-start standby centre capable of taking the largest of its many data processing centres in the event of disaster. A VFM review revealed that the software at the standby centre had not been kept up-to-date with the other centres. The money spent was effectively wasted, as the standby centre would not have been able to fulfil its role. The outcome of the review was the establishment of a procedure to keep both installations in line with each other. Although this actually cost more money than had been budgeted, it was considered that this was the only way that the original intention of the standby centre could be realized.

Communications cost

The Leeds Building Society in the UK recently received a refund of £200 000 from BT for the rental of lines that were not required, but which had been recommended by BT. Just how much use does your organization make of privately leased lines? Sometimes an analysis can be revealing.

One company had its own line to Egypt which was hardly used. It had been obtained at a time when a promising contract had seemed to be forthcoming, but the contract never materialized and staff changes meant that the existence of the line had been generally overlooked. The company saved itself £17 000 a year as a result.

Facilities management

The company had gone out to tender for the facilities management of its total IT requirements. The internal IT department was also allowed to tender for the job. The internal price was 30% lower than the next lowest tender. A VFM review established that the internal IT department had underestimated its costs by some 20%! Although it was eventually agreed to give the contract to the internal IT department, the organization gained a far better understanding of the cost of its IT operation.

Conclusion

The actual conduct of a VFM review is no different from that of any in-depth review. The difference is usually in the preparation before the review commences, which often means that many potential reviews are never carried out in full, as the preliminary work indicates that a full scope investigation will not in itself provide good value for money!

John Mitchell is Managing Director of LHS - the Internal Audit Consultancy. He is chairman of the British Computer Society's Computer Audit Specialist Group, a visiting Professor at the University of Luton and a regular contributor to international conferences and journals. He can be contacted on +44 (0) 1707 654040.

BENCHMARKING

Gerry Penfold

Benchmarking has become a widely used management tool. As such, computer security specialists and auditors need to understand what it is, how it can be used and what benefits and pitfalls to watch out for.

What is benchmarking?

Benchmarking is a systematic process for evaluating the business practices of other organizations in order to improve performance and ultimately profitability. It can be applied to almost anything that can be measured, but it is more than just measuring for its own sake. Measurement helps to identify performance gaps. Analysis is required to identify the action needed to move from current
©1994 Elsevier Science Ltd
performance to potential performance, based on the best practice that can be found. Whatever definition is used, the whole purpose of benchmarking is to find ways of improving. Without a willingness to change, benchmarking is reduced to an information gathering exercise.
Types of benchmarking

There are three types of benchmarking: internal, competitive and generic.
Internal benchmarking involves analysing similar practices in other companies, divisions or departments within a group or company. A good example is Xerox, which benchmarked its manufacturing processes in different countries. The main advantages of internal benchmarking are that the data is relatively easy to obtain and that it is a good practice area prior to venturing into the outside world. The main disadvantage is that internal best practice may not be significantly better than current performance.

Competitive benchmarking takes the analysis to competitors within the same industry. The main advantages of competitive benchmarking are greater relevance (because competitors are delivering similar products and services to the same market) and more prospect of identifying significantly better practices and therefore performance. The main difficulties are willingness to share information and loss of competitive advantage (if you happen to be the best). However, competitors do share information through independent third parties such as trade associations or consultants. In the UK, the Department of Trade and Industry is supporting the creation of benchmarking clubs through its Enterprise Initiative.

Generic benchmarking considers organizations in other industries which have similar processes and which are considered 'best in class'. Moving towards generic benchmarking increases the opportunities for finding completely new ideas and the potential for really significant improvements in performance. A well publicized example is Xerox, which was looking for a leader in order processing
and warehouse management and found a company called L.L.Bean. Another is American Airlines, who benchmarked their aircraft turnaround against an IndyCar racing pit-stop team. Willingness to share information is not normally a problem with generic benchmarking, and useful networks of contacts often develop. However, generic benchmarking can be a very time-consuming exercise, especially when trying to identify industries with similar processes and then trying to find individual organizations which are regarded as 'best in class'. In addition, transferring radically new ideas from one company to another can prove difficult, often involving process re-engineering, retraining and redundancy.
Misconceptions

Benchmarking is not a simple, one-off exercise. Those that have gained most from benchmarking have made it a permanent feature of managing their business. The conclusions drawn from analysing trends, rather than one-off measurements, are likely to be more accurate and lead to better decisions on what to change in order to improve. Best practice is a moving target, so benchmarking needs to be a continuous exercise, often integrated with quality improvement initiatives.

Solutions do not simply emerge as a result of benchmarking. The early stages of benchmarking produce information which must be used in the analysis and decision-making process along with existing information about an organization. Neither is benchmarking a 'copycat' process. It is much more concerned with learning and applying the lessons learnt to similar but not identical circumstances.

It is rarely quick and easy, and often more time-consuming than expected. A typical benchmarking exercise could involve three to five people spending 10-20% of their time for three or four months, depending on scope, criticality and experience. Some short cuts are possible, for example, where an independent third party has
already gathered relevant information. But 'quick and dirty' exercises will rarely achieve very much.

In summary, benchmarking requires commitment at a senior level, a willingness to learn and change, as well as properly trained staff with sufficient time for the process. With this approach, benchmarking is more than just a buzzword; it is an effective management tool.
The benchmarking process

In simple terms, the benchmarking process involves the following steps:

• Determine what to benchmark (What?)
• Identify resources (Who?)
• Identify information sources (Where?)
• Collect and analyse (How?)
• Implement improvements (= Action)
What to benchmark?

Determining what to benchmark is a vital first step. It is essential to focus on the factors that are critical to the business rather than just 'nice to know' factors. This requires a clear understanding of existing business processes. This can be achieved in a number of ways, including looking at:

• the key measures used by senior management to control the business;
• known problem areas;
• parts of the business under the greatest competitive pressures;
• processes that generate the greatest income or the greatest costs;
• factors that have the greatest potential for differentiation in the marketplace.

It will also involve asking customers (external and internal) what they regard as important about a product or service. This approach will avoid wasting time on gathering information which is not relevant and will not be used. Speaking to customers or users can also help to avoid misunderstandings, generate support (even funding!) from them and identify the timeframe for completing the process.

When identifying critical factors, it is important to be as specific as possible. If only broad areas are identified, then measurement becomes difficult. This can then lead to comparing 'apples and pears' later in the benchmarking process. For example, IT security might be identified as a critical factor for benchmarking. This might then be narrowed down to access controls, and then down to a measurable factor such as the time taken to set up or amend user profiles, or the number of access violations per day by staff.

Building a team

Benchmarking is usually a team effort due to the various roles and responsibilities needed to make it a success. These include project sponsor(s), project managers, data collectors and analysers, and support staff. External specialists are often useful for providing training, project management skills or access to existing databases of information. Building the right team involves considering the knowledge of the individuals (especially in the processes being benchmarked), their availability and their communication skills.

Finding the facts

When it comes to identifying reliable sources of information, there are two areas to consider: first, general information about critical factors;
second, the identity of organizations considered 'best in class' for the critical factors. General information can be found via trade associations, industry or technical publications, academic and research institutes and consulting organizations. For example, for IT controls and security, useful sources include ISACA (formerly the EDPAA), the larger auditing and consulting firms, the European Security Forum, the special interest groups of the British Computer Society and the IT Faculty of the ICAEW. Members of the above associations are likely to be a good starting point for finding organizations that are considered 'best in class' and who would therefore be good benchmarking partners. Other indicators of potential benchmarking partners include awards and certificates (for example, the Baldrige quality award in the US or possibly ISO 9000 certification), frequent comment in the media (which can often be identified via Textline searches) and auditors/consultants who see best practice at their clients.
Collecting and analysing

Once the above three planning stages have been completed, the process of data collection can start with a good prospect of success. A key issue here is ensuring the reliability of the data. Auditors are rather good at this, having had many years' experience of gathering audit evidence. Comparing information from different sources is particularly helpful for weeding out inconsistencies. Techniques for obtaining data from benchmarking partners range from face-to-face meetings (most effective but time-consuming) to telephone interviews and surveys (less effective but relatively inexpensive).

When the information has been gathered, the analysis can begin. This is a learning process and focuses on the 'how', not merely the 'what', in order to identify the changes necessary to obtain improvement. For example, it is not enough to know that another organization can process and approve new user profiles significantly faster than you. The process used, and how it can be adapted and applied to your circumstances, needs to be understood. So there is often a need to go back to benchmarking partners for clarification during analysis to ensure that the right conclusions are reached. At this point, decisions can be made about what actions to take, priorities can be set and targets for improvement agreed.

Benefits

Organizations that have benefited from benchmarking refer to a number of improvements:

• better understanding of themselves and their customers;
• higher quality of service or product;
• improved efficiency and productivity;
• enhanced reputation;
• improved staff motivation;
• greater innovation.

All of which lead to improved profitability.
Benchmarking IT controls and security

How can this help us as IT security specialists and auditors? Well, for those organizations that have become heavily dependent on IT for running their business, control over IT resources and their use is increasingly regarded as a critical factor by management. This is particularly true in the financial sector, as well as the public sector, where public accountability brings an added pressure. As a result, IT managers, internal audit managers and IT security managers can make use of the benchmarking process to improve performance, reduce risk and even to justify budgets in a climate of cost reduction. The approach to benchmarking adopted by computer audit and security specialists at KPMG is worth sharing as an illustration of how
benchmarking can be applied to IT controls and security, especially as this is an area where many factors are difficult to measure (e.g. security awareness or adequacy of documentation).
A systematic approach

We recognized that collectively we had many years of experience of good and bad practice at a wide range of clients. However, no single specialist had access to all the experience. So the UK firm started a project to record our knowledge and experience in a more formal way. The result is a relational database which is used to store a range of factors which are important both to our clients and to us as external auditors. These factors cover areas such as:

• penetration of IT
• management of IT
• security
• continuity
• change management
• system development
• internal audit
Scoring

For each area there are a number of factors which are scored on a scale of one to five, where 'one' represents 'no control' and 'five' represents 'optimal control'. The system provides examples of controls that would score one to five for each factor. It also prompts the user to record examples of control procedures, especially for factors scoring four or five, along with absolute measures where available (e.g. the percentage of users aware of the security policy, or the number of hours taken to restore systems following a system failure).
This approach to scoring provides a consistent measure for comparison and avoids the 'apples versus pears' problems that would otherwise arise across a wide range of clients. The record of examples and absolute measures provides a knowledge base of how best practice is achieved. A concise amount of standing information about the organization is also recorded on the database so that the results can be analysed and reported by, for example, industry, size of company or machine environment.
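The scoring scheme described above can be sketched in code. The following is a minimal, illustrative Python sketch only: the record and field names are assumptions for illustration, not the actual database schema, but it shows the one-to-five scale, the prompt for examples on high-scoring factors, and a simple analysis of results by industry.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class FactorScore:
    """One scored control factor ('1' = no control, '5' = optimal control)."""
    area: str        # e.g. 'security', 'continuity' (illustrative area names)
    factor: str      # e.g. 'user profile administration'
    score: int       # must lie on the one-to-five scale
    example: str = ""  # control example, required for scores of four or five
    measure: str = ""  # absolute measure where available, e.g. '85% of users aware'

    def __post_init__(self):
        if not 1 <= self.score <= 5:
            raise ValueError("score must be on the one-to-five scale")
        if self.score >= 4 and not self.example:
            raise ValueError("scores of four or five need a recorded example")

@dataclass
class ClientRecord:
    """A client plus the standing information used to analyse results."""
    name: str
    industry: str
    size: str
    environment: str             # machine environment
    scores: list = field(default_factory=list)

def report_by_industry(clients, area):
    """Average scores for one control area, grouped by industry."""
    groups = {}
    for c in clients:
        vals = [s.score for s in c.scores if s.area == area]
        if vals:
            groups.setdefault(c.industry, []).extend(vals)
    return {ind: round(mean(v), 2) for ind, v in groups.items()}
```

The validation in `__post_init__` mirrors the system's prompting: a consistent scale for comparison, with concrete examples captured wherever best practice (a four or five) is claimed.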
Pilot results

The system was piloted at 65 clients as part of our normal statutory audit approach and generated a great deal of interest and comment. Audit teams liked the approach as it added weight to our opinions and recommendations. Clients liked the approach as they saw that it added value to the basic audit service. We liked the approach as it brought requests from clients for more work.

However, a number of problems were encountered during the pilot which are worth noting. First, great care had to be taken over the wording of questions, prompts and examples to avoid confusion and misunderstanding. This was addressed when the supporting documentation was revised after the pilot. Second, some individuals displayed a tendency to make allowances when scoring, rather than scoring against an absolute measure: for example, making allowance for the fact that the organization was small and adjusting a score upwards from, say, two to three. This was picked up when validating the results and has been addressed by giving more guidance on scoring in the training and the supporting documentation. Third, the database structure has to be flexible enough to handle multiple responses to certain factors, for example, where there is more than one machine environment at one location. This confirmed the need for a relational database.

In summary, this process is helping us to identify the most likely areas where the client can
make significant improvements in IT controls and security. In other words, we can help clients with:

• 'what' to benchmark, by introducing them to a range of potential factors;
• 'who' should benchmark, by providing trained staff; and
• 'where' to get information, by drawing on our database.
This is then followed by the more important step of using our experience to help the client identify 'how' to improve.
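The 'where' question above could be answered by querying a factor database of the kind described. The sketch below is hypothetical (the record layout and field names are assumptions, not the actual system): it simply picks out the highest-scoring organizations for a given factor, optionally within the same industry, as candidate benchmarking partners.

```python
def best_in_class(records, factor, industry=None, min_score=4):
    """Return (client, score, example) tuples for the top scorers on a factor.

    records: iterable of dicts such as
        {"client": "A", "industry": "banking",
         "factor": "change management", "score": 5, "example": "..."}
    Only scores of min_score or above (four or five by default, i.e.
    recorded best practice) are considered.
    """
    hits = [
        (r["client"], r["score"], r.get("example", ""))
        for r in records
        if r["factor"] == factor
        and r["score"] >= min_score
        and (industry is None or r["industry"] == industry)
    ]
    # Highest scorers first: these are the candidate benchmarking partners.
    return sorted(hits, key=lambda h: h[1], reverse=True)
```

Because examples of control procedures are recorded alongside scores of four or five, such a query returns not just who is 'best in class' but also how their performance is achieved, which is the starting point for the 'how' step.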
IT security

One particular area that is more often regarded as a critical IT factor is IT security. So, to support the benchmarking service at a greater level of detail, we have developed an IT security 'healthcheck' service. This is based on the Code of Practice for Managing IT Security produced by the Department of Trade and Industry, which is expected to become a British Standard in November 1994 and may then become an International Standard. The main areas or 'aspects' covered during the healthcheck are the same as those addressed by the code of practice, namely:

• information security policy
• security organization
• asset classification and control
• personnel security
• physical and environmental security
• computer and network management
• system access control
• system development and maintenance
• business continuity planning
• compliance

The code of practice provides a generally accepted benchmark of good practice in IT security. The healthcheck service applies this at two levels: an 'entry level' service, which quickly examines key aspects of IT security against the benchmark to establish the degree of compliance, and a more detailed service, which identifies what the client needs to do to reach full compliance with the code/standard and then to achieve 'best in class' status if this is appropriate. Information gathered during the healthcheck service is also recorded in the benchmarking database for analysis at a more detailed level.

For consistency, the healthcheck service uses the same approach as the benchmarking service for scoring, but tackles each area from two different angles. These are capability (how good is a control or procedure in what it aims to achieve?) and effectiveness (what does it actually achieve?). For capability, examples of controls that fit the 'one to five' scoring system are again provided. For effectiveness, three key factors are considered: frequency or availability; coverage; and contribution to the organization's current objectives. The relationship between capability and effectiveness can then be presented graphically as a way of showing where an organization must direct its efforts in order to achieve improvement.

Conclusion

If you have a role in benchmarking, what are the secrets of success? I would highlight five key factors:

• Be prepared to change
• Be open minded
• First understand yourself
• Focus on 'how' not 'what'
• Get the right resources
If these factors are in place, there is every chance that benchmarking will help you to deliver real performance improvements within your organization.

Gerry Penfold is a committee member of the EDPAA London Chapter, a member of KPMG's Computer Audit Services Committee and a regular lecturer on control and security issues. This paper was first presented at Compsec '94, London, UK. The views expressed in it are those of Gerry Penfold and not necessarily those of KPMG.
NEWS
Malfunctioning baggage system delays Denver airport

The opening of Denver International Airport is expected to be delayed until the end of the year due to a faulty computerized baggage handling system, reports Computing. The system, built by Texas-based BAE Automated Systems, uses IBM-compatible PCs to direct 400 baggage carts over a 20 mile automated network, but was found to damage baggage and be unreliable, sending luggage to the wrong destinations. The system, which cost nearly £130 million to build, has been designed to provide a unified baggage handling service for all airlines.

To prevent any further delay in the opening of the airport, which is running up costs of approximately £0.65 million per day, Denver's city authorities have decided to install a temporary manual system at a further cost of around £30 million whilst the computerized system is corrected by consultants. Denver's new airport was due to be opened in March and has cost about £2.0 billion so far, not including airline and Federal Aviation Authority costs for constructing the facilities. The temporary conveyor-driven system, which relies on humans rather than computers, will serve as a useful stopgap until the automated baggage handling system has been completely debugged and brought online.

Audit Commission report highlights need for effective computer audit

A recently published report by the Audit Commission, Opportunity Makes A Thief, has been welcomed by the IIA-UK as it stresses the need for effective computer auditing in the fight against fraud, a view which has been persistently upheld by the Institute. One in four organizations surveyed in the Audit Commission's report on computer fraud were found to have no internal audit function, and only half of them had computer audit skills.

The lack of an internal audit function and of competent internal computer auditors clearly places organizations at increased risk from computer fraud, according to the IIA-UK. David Bentley, spokesman for the IIA-UK, highlighted that "effective internal audit can help management to understand the risks of fraud and the measures needed to specifically reduce the risk of computer fraud and abuse." In agreement, the report proposes that internal auditors are well placed to highlight deficiencies in procedures, making management and users more aware of the risks they face, and states that the design and maintenance of control systems, with the help of internal audit, is critical to minimizing risk.

Mr Bentley acknowledged that the report underlines that there is a problem but added, "management must wake up to the reality that as organizations become more dependent upon information technology there is increased opportunity for the misuse of those facilities." The