Integrated computer security


June 1990

Although organized crime was a chilling spectre at this conference, other speakers chose to concentrate on the 63% area of insider embezzlement. Anton Valukas was until recently the US Attorney for the Northern District of Illinois. During his tenure his office achieved national prominence for its thorough investigations and successful prosecutions of white collar crime. He felt that the modern wave of embezzlement was triggered by two factors, both of which influence the 'Risk of Prosecution vs. Potential Gain' equation that fraudsters work by. Firstly, the economic slump and difficulties within the industry increase the Potential Gain side of the equation. This in turn tempts the more senior and influential executives to embezzle, rather than only the less affluent employees. Along with this there is the increasing automation of the banking process, which often removes the traditional checks and balances, leaving little or nothing by way of replacement. This combination, claimed Valukas, led to massive frauds which often remained undetected for a considerable time.

Valukas called on auditors to change their priorities and pay more attention to fraud and critical loans, and less to business viability. Traditionally, auditors merely note inconsistencies for senior officers to look into; this may no longer be sufficient. He also called on security officers to work to limit the power of senior executives. In his experience, financial institutions were often completely dominated by two or three individuals, and sometimes just by the CEO. Security officers should also ensure that the checks and balances of a system actually work, particularly if the process is an automated one.

The reality of electronic banking was a background to all the sessions in the conference, with special sessions devoted to crime around ATMs and securing EFT and EDI networks. It was noticeable throughout that regardless of the technology, security solutions remained dependent on the attitudes and effort of the management. As Brian McGinley put it in his concluding remarks: "In this country of gadgets, the emphasis is still on management roles."

INTEGRATED COMPUTER SECURITY

Key areas of management control

Mike Timms
Baring Investment Services Limited

Most organizations in the business world produce and use large amounts of data. Apart from the actual business needs, legislation and regulatory bodies demand that information is created and protected and is available when needed. Emerging legislation and strong competition demand that this data is protected not only from the casual eavesdropper, but also from the industrial spy. Sadly, the majority of businesses seem ill equipped to satisfy these requirements. For example, the City of London police recently found that "more than half of all banks and financial institutions in the City have been victims of fraud." Perhaps unsurprisingly, it also stated that "65% do not have a firm policy on the problem".

This article takes a look at data security from a management viewpoint, discusses some of the more pressing problems and recommends action which should be taken. It stresses the need for security to be addressed from a management viewpoint, integrating it into the fabric of management strategy. The problem is now, it is real and it has heavy penalties for the unwary. I will not therefore tread lightly with glass slippers, but will tackle the problems both bluntly and directly.

Before progressing further, I should like to destroy the myth that hacking is today's biggest computer security issue. It is not. Hacking is certainly receiving a disproportionately large amount of publicity. The real issue is that errors, omissions and interference with computer input account for more than half of all reported computer misuse.


A look at any survey of computer misuse or so-called computer fraud will reveal that:

1. Insiders are the most likely group to misuse computing resources.

2. Misuse is usually associated with input.

3. Misuse is usually unsophisticated.

A closer look will reveal that misuse, either accidental or deliberate, exploits the weakest parts of computer systems. By and large, if controls to prevent accidental misuse can be devised, deliberate misuse will be prevented, or at least deterred. So the problem is known. What can be done to resolve it?

A Corporate View

Data is a corporate asset; that is, it belongs to the organization, not to any one department or individual. It is

• time consuming and expensive to create,

• expensive to maintain,

• vulnerable to loss and corruption, either accidentally or by design.

For these reasons, data is now seen as an asset that needs protection, particularly when it is stored and processed by computers. Many countries have developed, or are developing, legislation to cover data protection. For example, the UK has its Data Protection and Copyright Acts, and Michael Colvin's Computer Misuse Bill is currently working its way through the various processes of the House of Commons. As more computers are linked by increasingly sophisticated telecommunications networks, the vulnerability of data becomes accentuated to the point where loss and corruption can occur without the knowledge of the owning organization.


Technologists tell us that the answer is in the technology. But is this true? Technology is often a 'two-edged sword'. Its use enables organizations to achieve their aims, but often at the expense of security. As an example, networks are now generally self-learning about any new nodes that are added, but this attribute brings with it the problem that unauthorized nodes will be accepted along with the authorized. These built-in problems must be overcome; a sketch of one simple countermeasure follows.
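By way of illustration, here is a minimal sketch, in modern Python (the article itself describes no code), of the node-admission check a self-learning network lacks. The node names and the registry are invented for the example.

    # Sketch: admit only pre-registered nodes to a self-learning network.
    # Node identifiers and the registry below are invented examples.
    REGISTERED_NODES = {"host-accounts", "host-payroll", "gateway-01"}

    def admit_node(node_id: str) -> bool:
        """Accept a newly discovered node only if it was registered in advance."""
        if node_id in REGISTERED_NODES:
            return True
        print(f"ALERT: unregistered node {node_id!r} tried to join; refused.")
        return False

    admit_node("host-payroll")  # accepted: known node
    admit_node("rogue-pc")      # refused and logged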

Too much technology tends to baffle the managers, who would prefer to use their skills conducting the business, not trying to understand the jargon and complexity of modern computer science. Is there a way for managers to address the problems at a corporate level, to ensure that controls are integrated with the business and to retain management control of the technologists? I believe that to some extent this can be achieved and I will devote the rest of this article to five key areas of management control. However, I must stress that corporate managers cannot ignore the technology issue completely. For example, the driver of a car does not need to know the intricacies of its mechanics, but nevertheless does need to arrange regular maintenance. Managers who use technology to achieve their aims also need to develop an understanding of what it can and cannot do, its strengths and its weaknesses. They do not need to become, and should not become, technologists.

Integrated Computer Security

The control over an organization's computerized data rests largely on controlling the access to it. To do this successfully it is essential to devise and agree the ground rules that will be used. Without this, security will inevitably be relegated to second place or will be left out altogether, and when this happens a valuable corporate asset is put at risk. It is said that the cost of introducing a new control is approximately:

• ×1 at system specification,

• ×4 after system specification,

• ×8 after program specification,

• ×16 after system implementation.

From this it is evident that controls must be built as part of the computer system's development. Adding them as an afterthought is an expensive business.

There are five key areas which should be considered by management. These are:

• separation of duties,

• user authentication,

• control of access rights,

• encryption,

• monitoring.

Separation of duties is the primary safeguard which is applied throughout businesses everywhere. As a control it has been used for something in excess of two thousand years; thus, it is proven by time. In addition to its traditional uses, the control can be applied to both networks and data.

One way is to segment the network into zones, where each zone corresponds to a computing or business function. As examples, the traditional separation of production and development must be enforced, and networks with access from the Public Switched Telephone Network or Public Switched Data Network should certainly be separated from any private Local Area Networks. Zones may be connected by 'bridge' or by processor. Either device should be configured to filter messages and discard any that originate from an unauthorized source, as in the sketch below.
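A minimal sketch of that filtering rule follows, again in Python with invented node names, zones and policy table; a real bridge would of course work at the protocol level rather than on dictionaries.

    # Sketch of the bridge/processor filtering rule described above.
    # The zone map and the authorization table are hypothetical.
    ZONE_OF = {"dev-01": "development", "prod-01": "production",
               "pstn-gw": "public", "lan-07": "private"}

    # Which (source zone, destination zone) pairs may exchange traffic.
    ALLOWED = {("private", "private"), ("development", "development"),
               ("production", "production")}

    def forward(message: dict) -> bool:
        """Forward a message only if its zone pair is authorized."""
        src = ZONE_OF.get(message["source"], "unknown")
        dst = ZONE_OF.get(message["destination"], "unknown")
        if (src, dst) in ALLOWED:
            return True
        print(f"Discarded: {src} -> {dst} traffic is not authorized.")
        return False

    forward({"source": "lan-07", "destination": "lan-07"})    # forwarded
    forward({"source": "pstn-gw", "destination": "prod-01"})  # discarded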

For data, the control can be applied to users, who should be categorized as:

• data owners

• data users

• data custodians

• data protectors.


Data owners are the primary users of data. They have the right to input data and to change it. Data owners are allowed to access only their own data. For example, within a pay-roll system, certain staff are responsible for entering rates of pay, while others are responsible for entering the number of hours worked.

Data users are secondary users of data. They have the right to read data but not to enter or change it. Access is allowed only to staff who need it. For example, our pay-roll clerk may be authorized to access certain pay-roll functions but will be denied access to the financial accounting systems. Many staff are both data owners and data users, and their access rights can be controlled accordingly. All security software allows this concept to be applied, although rarely do the vendors provide any practical advice on how to set it up. It is management's task to determine who has access to what, and to develop the administration necessary to make it work.

Data custodians are responsible for the provision of data files as and when they are required. They are usually operations staff who should be denied direct access to data but allowed physical access to the media on which data is stored, usually removable disks and magnetic tapes. Data custodians share the responsibility for the safe keeping of data with the data protectors.

Data protectors are responsible for the security of data. They have no rights to access data, but administer the access rights of others on the authorization of data owners. They also carry out regular monitoring to detect violators and loopholes in control procedures. This task should not be left to technical staff, who have the skills and tools to circumvent the rules, but should be part of the regular management function.
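To make the owner/user distinction concrete, here is a minimal sketch of such an access-rights table in Python. The staff names, data sets and rights shown are invented examples, not a scheme prescribed by the article.

    # Sketch of an access-rights table for the four categories above.
    RIGHTS = {
        "owner":     {"read", "write"},  # primary users: enter and change data
        "user":      {"read"},           # secondary users: read only
        "custodian": set(),              # physical media only, no direct access
        "protector": set(),              # administers rights, no data access
    }

    # Who holds which role over which data set (granted by data owners).
    GRANTS = {("jones", "payroll-rates"): "owner",
              ("smith", "payroll-rates"): "user"}

    def may(staff: str, data: str, action: str) -> bool:
        """Check a staff member's right to perform an action on a data set."""
        role = GRANTS.get((staff, data))
        return role is not None and action in RIGHTS[role]

    print(may("jones", "payroll-rates", "write"))  # True  - data owner
    print(may("smith", "payroll-rates", "write"))  # False - read only
    print(may("smith", "ledger", "read"))          # False - no grant at all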


User authentication. Networks generally cannot do much to ensure that a user is authorized, but where possible, parts of the network should be configured to selectively allow messages from certain nodes to be transmitted to other nodes, discarding others. This also has the extra benefit of reducing unnecessary network traffic.

Authentication most frequently takes place at the processing nodes. Users identify themselves in some way and the computer either accepts or rejects this claim. There are a lot of devices on the market which provide sophisticated authentication techniques, but most organizations continue to use passwords.

The use of passwords tends to be badly controlled. If users are allowed to select their own, they invariably select simple names which can be easily guessed. If complicated or difficult-to-remember passwords are provided, users write them down and are generally careless with them. Passwords are only as good as the care taken by the persons who use them, therefore training in good security practice is essential. Passwords should also be changed regularly. All of this is people related, and should be under management's control. But once authenticated, users should not be allowed unlimited access to the computing resource.
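As an illustration of the password point above, a minimal sketch of a quality check follows. The specific rules (minimum length, a small dictionary of guessable words, mixed character classes) are my own illustrative choices, not rules the article prescribes.

    # Sketch: reject passwords that are short, guessable, or single-class.
    COMMON_WORDS = {"password", "secret", "london", "baring"}  # invented list

    def acceptable(password: str) -> bool:
        """Apply simple, illustrative password-quality rules."""
        if len(password) < 8:
            return False
        if password.lower() in COMMON_WORDS:
            return False
        has_alpha = any(c.isalpha() for c in password)
        has_digit = any(c.isdigit() for c in password)
        return has_alpha and has_digit

    print(acceptable("london"))    # False - easily guessed name
    print(acceptable("kx7mQ2pw"))  # True  - long enough, mixed classes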


Control of access rights. This should occur at the computer, after authentication. Users should be allowed access only to data, systems and system functions which have been previously authorized. If knowledge of other systems can be denied, so much the better. Taking the pay-roll clerk example again, access would be granted to only those parts of the pay-roll system that the clerks need to do their work. Access would not be granted to any other business system.

One of the current buzz phrases about which a lot is written is "user friendly". Whilst it is undoubtedly necessary for computer systems to be reasonably easy to use, if too friendly, the systems themselves can encourage poor security. We should think of the buzz phrase as meaning: user friendly = abuser friendly. With this in mind, systems access rights can be controlled in a way which causes regular users the minimum of delay whilst not disclosing any more information to the would-be abuser than is absolutely necessary. For example, no help at all should be available before a user has been authenticated.

Encryption is often claimed to be the panacea for good security, which it is not. Encryption sensibly used will dramatically improve security, but it always carries an overhead in system resource which slows response. Encryption can be used to protect data held in files and databases, and also to protect data being transmitted over the telecommunications network. However, it relies on the designer to put it where it is effective, not leaving any gaps which can be exploited. Certainly it can be a waste of time encrypting files when the data from that file is decrypted immediately before transmission over the network. And yet how many manufacturers produce terminals with built-in encryption features? The result is user IDs and passwords transmitted in clear form across the network. Hardly secure! The subject of encryption method I'll leave to those who are equipped with a more devious or mathematical brain than mine. For the moment I'll go along with DES, but I prefer RSA!
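To illustrate the encrypt-before-transmission point, here is a minimal sketch in Python. Fernet, from the modern 'cryptography' package, stands in here for the DES or RSA schemes the article mentions, and key distribution is deliberately omitted.

    # Sketch: encrypt credentials *before* they cross the network, so that
    # user IDs and passwords never travel in clear form.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()   # in practice, shared out of band
    channel = Fernet(key)

    token = channel.encrypt(b"userid=clerk01;password=kx7mQ2pw")
    # 'token' is what travels over the network - unreadable without the key.

    assert channel.decrypt(token) == b"userid=clerk01;password=kx7mQ2pw"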

What has been said so far relates mainly to preventive controls. This is correct; as the old saying goes, "prevention is better than cure". However, in today's world this is not really enough. We need to know that our preventive methods are working and need to be able to correct the situation should they fail. Monitoring should occur at the places where preventive controls are applied, that is, at the network, at authentication, and at system, function and data access.

Network monitoring can work on the principle that data within a zone may be transmitted without prior approval, whilst data transmitted between zones requires prior approval. Monitoring on this basis focuses on the identification and investigation of unauthorized activity whilst minimizing the effect on network performance. It also has the added advantage of dramatically reducing the amount of data to be monitored. It is essential to get the network design correct from the start. In fact, it is essential to incorporate security requirements at the very earliest stages of any new development. Trying to add security to an existing system is expensive, disruptive and usually produces inferior results.

System monitoring, where possible, should make use of the features available in the operating system. For example, DEC's VMS provides reasonable monitoring facilities. However, most operating systems are somewhat deficient in this area, so layered software such as RACF, ACF2, SECURE or SESAME must be used instead. Typically, monitoring should concentrate on activities such as (a sketch of one such check follows the list):

• multiple authentication failures,

• the use of powerful privileges,

• attempts to access unauthorized directories and files,

• attempts to access unauthorized systems,

• attempts to access unauthorized functions,

• comparing production object code with independently held object code.
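As a minimal sketch of the first check in that list, flagging multiple authentication failures, the following Python fragment scans an invented log format with an invented threshold:

    # Sketch: count authentication failures per user and raise alerts.
    from collections import Counter

    log = [("clerk01", "LOGIN_FAIL"), ("clerk01", "LOGIN_FAIL"),
           ("clerk01", "LOGIN_FAIL"), ("clerk01", "LOGIN_FAIL"),
           ("admin02", "LOGIN_OK")]

    THRESHOLD = 3  # failures before an alert is raised (illustrative)

    failures = Counter(user for user, event in log if event == "LOGIN_FAIL")
    for user, count in failures.items():
        if count >= THRESHOLD:
            print(f"ALERT: {count} authentication failures for {user!r}")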

Monitoring, both in real time and subsequently to detect trends, is a powerful deterrent as well as providing effective detection facilities. Naturally, all security breaches must be followed up and newly discovered loopholes closed.

Sophisticated integrated security controls such as those I have outlined are expensive to develop and implement. The objective should be to develop suitable security tools as part of an 'umbrella' of security as the application systems are developed. After implementation, these tools can be steadily improved to provide the level of security needed, coupled with the minimum of system resource overhead and operational difficulty for users.

Finally, a word about awareness. By far the biggest obstacle to successful computer security is the indifference, or even antipathy, of management. It is essential that security measures have the positive backing and active support from the top. With positive support, managerial and technical controls can be supported by integrated awareness programmes to result in an improved level of security suitable to the needs of the business. Without management support, security will always be compromised.

There is no doubt that the security of computers and computer systems will become of increasing importance to businesses as they realise the importance of data to their survival. Management cannot leave the security of their business data in the hands of a few technologists, where it can be reasonably easily compromised. Management must retain control, for good security is the product of an integration of technology and management procedures.

Five components are common to all users of computer systems who have an interest in keeping them secure:

1. Division of duties. Perhaps the oldest control around, it remains the most important. It is essential to keep development and production apart, and to keep users in only those work areas they are entitled to access.

2. Authenticate users at initial access. Provide no helpful information before authentication and ensure that authentication is meaningful. Passwords are not ideal, but with the right controls, coupled with security awareness education, they can work reasonably well.

3. Control access rights. Allow users to work only in those systems and system functions which they need to do their job and keep them in those areas. Allow access to only those files and databases which users need to do their job and keep them out of other users' files.

4. Do not encrypt anything unless you have good reason. Blanket encryption slows down response and, worse, gives a false sense of security. When you encrypt, do so knowing that you are doing it for the right reasons and make sure that encryption is watertight from data source to file.

5. Monitor system use and follow up all breaches of security. I have found this to be most effective in reducing opportunistic experimenting.

In conceptual terms these controls are similar to those used by management in the non-computing parts of their business. Therefore it should not be too daunting a task to use them in the computing parts. As I said at the beginning, data is a valuable corporate asset. It is up to management to ensure that it is properly protected.

Concepts, ideas and views expressed in this article are my own, and do not necessarily represent the policies of Baring Investment Services Limited or the Baring Asset Management group.

COMPUTER CONTROL SELECTION

The Standard of Due Care approach

Charles Cresson Wood
Information Integrity

Given the myriad of different security products and the many different viewpoints about appropriate information security activities, how is a prudent manager to make defensible decisions about security measures? The answer, at least in part, is to employ the most common method of selecting computer security measures (controls). This article explores the often underappreciated advantages and disadvantages of this method, known as the 'Standard of Due Care'.

The Standard of Due Care method involves making decisions based on what others in similar circumstances are now doing. This approach is employed primarily to avoid future accusations of negligence. The Standard of Due Care method is particularly attractive because it can be used expediently at only minor cost, yet it can yield important and substantive results. Although there are compelling reasons to use this method when first beginning to address information security issues, the method has many deficiencies that are not generally acknowledged. To make prudent decisions that truly reflect the complex realities of the information security area, a variety of decision-making methods, such as checklist completion, threat scenario construction, tiger-team attack, and quantitative risk analysis, must be deployed.

The notion of Standard of Due Care has its origin in legal principles. Also called the 'Duty to Exercise Reasonable Care', this principle obliges organizations to use sufficient internal controls for information handling activities, whether or not a computer is involved. The courts have not been sympathetic to defendants who claim they acted in good faith but nonetheless caused a plaintiff harm. For example, in Ford Motor Credit Company vs. Swarens [447 S.W.2d 53 (Ky. 1969)], a finance company repossessed the plaintiff's car even after he had proven on two occasions that he was current in his payments by showing cancelled cheques to agents of the defendant. The defendant claimed the action was based on a computer error, a defense the court rejected. In this case, the court ruled that excessive reliance on computer-generated data, without proper
