FEATURE

Data loss prevention – not yet a cure

Computer Fraud & Security, September 2011
Tracey Caldwell, freelance journalist

Data Loss Prevention (DLP) technologies have matured rapidly and now offer a sophisticated means of classifying and monitoring sensitive data. Yet data breaches continue unabated. A March 2011 survey by Check Point and the Ponemon Institute, ‘Understanding Security Complexity in 21st Century IT Environments’, found that 75% of UK organisations had experienced data loss in the past year, compared with an average of 77% internationally. Customer information was the most common type of data to be compromised (52%), along with intellectual property (36%), employee information (36%) and consumer information (35%).

Private sector woes

In the UK, the Information Commissioner, Christopher Graham, has called for businesses to undergo data protection audits. The annual report from the Information Commissioner’s Office (ICO) indicated that private companies reported the most data security breaches of any sector in 2010/11. The ICO operates a voluntary scheme under which serious breaches are brought to the office’s attention. Figures from the annual report show that of the 603 data security breaches reported to the ICO in 2010/11, 186 occurred in the private sector. Despite this, only 19% of businesses contacted by the ICO accepted the offer of a free data protection audit. In contrast, 71% of public sector organisations that were contacted agreed to be audited.

Research company Gartner’s ‘Magic Quadrant for Content-Aware Data Loss Prevention’, published in June 2010, identifies four market leaders in the DLP sector – McAfee, Symantec, RSA (EMC) and Websense – with start-ups such as Palisade and GTB Technologies playing catch-up. DLP technologies allow companies to monitor data across their networks and analyse it with sophisticated techniques to determine what sensitive data it includes, who is using it and how, and to enforce policies governing who can access what information, what data can be shared, and with whom.

“Gartner defines DLP technologies as those that perform content inspection of data at a sophistication beyond simple keyword matching and regular expressions”

Gartner defines DLP technologies as those that perform content inspection of data at a sophistication beyond simple keyword matching and regular expressions. It points out that, “used to its full capacity DLP is a non-transparent control which means it is intentionally visible to the end user with a primary value proposition of changing user behaviour” – in contrast to the likes of firewalls and anti-virus software.

Employee buy-in

Information assets likely to be lost or stolen. Source: Check Point/Ponemon.


Employee buy-in is needed at all levels of an organisation, as DLP implementations raise potentially sensitive issues. “Non-transparent controls represent a culture shift for many organisations and it is critical to get business involvement in the requirements planning and implementation of DLP controls,” states Gartner. The Information Security Forum (ISF) warns: “In the eyes of the law, DLP technology could be used by organisations to monitor employee behaviour – the role of DLP administrators and the technology itself should be assessed by a legal department to ensure no privacy or employment laws are broken.”

Christian Toon, head of information risk at Iron Mountain, also highlights the human element in achieving a successful DLP implementation: “Data loss prevention technologies are only as good as the employees using them,” he says. “Most technology these days supports a ‘plug and play’ approach for implementation. It’s a shame this isn’t the same for organisational integration. A comprehensive change-management programme is required to ensure that every type of employee is taken through the evolution and is best trained to use this new technology in the manner it is intended.”

Perceptions of complexity

As the technology has matured, DLP has suffered from perceptions that it is overly complex, and a failure to address that complexity may render the implementation ineffective. “Many customers believe that DLP is a complicated project because this is how many of the vendors in the DLP market positioned it,” says Lior Arbel, director of strategic data security solutions at Websense, which has worked with several NHS bodies in the UK to create specific policies around healthcare. “They told customers that in order to protect data they would need to classify the data first and then run discovery on that data.”

Websense offers bespoke policies as part of its product offering, and it has more than 1,100 predefined templates that allow customers to gain value from the DLP product without the need for big investments and changes to current policies and infrastructure.

Guy Helmer, CTO at Palisade Systems, believes DLP is experiencing renewed growth. Palisade Systems has focused on mid-size healthcare and financial services companies dealing with Personally Identifiable Information (PII), Personal Health Information (PHI), and US and international data security regulations, including HIPAA, SOX, GLBA and PCI. Many vendors are integrating DLP technologies into existing anti-virus and information security solutions, which a number of people in the industry are calling ‘DLP-lite’ solutions, according to Helmer. “To further distinguish the differences of full DLP from DLP-lite, DLP includes the ability to use a wide variety of analysis approaches beyond just regular expressions – such as lexicons, structured data fingerprints, unstructured document fingerprints, and proximity rules – as different forms of data require different forms of analysis,” he says. “It also has extensive reporting and alert capabilities, with the ability to dive deeply into an incident and review it, plus case management to enable managing groups of incidents under corporate procedures, legal and regulatory requirements.”
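The proximity rules Helmer mentions can be illustrated with a short sketch: a pattern match only counts as a hit when a lexicon term appears nearby. The pattern, lexicon and 50-character window below are assumptions for illustration, not any vendor’s actual rules:

```python
import re

# Hypothetical rule: a US Social Security-style number is only flagged when a
# triggering lexicon term appears within `window` characters of the match.
SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
LEXICON = re.compile(r"\b(ssn|social security|patient|account)\b", re.IGNORECASE)

def proximity_hits(text, window=50):
    """Return SSN-style matches that have a lexicon term within `window` chars."""
    hits = []
    for match in SSN.finditer(text):
        start = max(0, match.start() - window)
        context = text[start:match.end() + window]
        if LEXICON.search(context):
            hits.append(match.group())
    return hits

print(proximity_hits("Patient SSN: 123-45-6789"))        # flagged: lexicon term nearby
print(proximity_hits("Order ref 123-45-6789, aisle 7"))  # ignored: no lexicon term
```

A real engine would combine many such analysers – lexicons, fingerprints and proximity windows – but the principle is the same: context around a match decides whether it is sensitive.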


Helmer often finds a DLP implementation is an early step in defining data security policies in small to medium-size organisations. “Running the DLP system in monitoring mode for a time results in an understanding of the business’s exposure and where changes can be made in operational procedures to reduce risks and improve compliance,” he says.

Email risk

Nick Lowe, Check Point’s head of sales for Western Europe, believes that it is preferable to involve the user more, rather than rely on increasing complexity in traditional DLP technologies. “Traditional DLP technologies are hard to set up and ‘train’. Users don’t get the ROI benefit immediately,” he says.

“The approach taken in most DLP solutions is to use artificial intelligence to try and index and classify data, and then monitor its movements via email, but it’s a huge task that requires a hefty amount of personalisation in a given organisation”

He points out that email is the biggest risk for data loss because of the volume of emails leaving an organisation daily. He says that, “99.9% of these breaches are not malicious, they’re just mistakes as the user will have auto-filled the wrong address, or attached the wrong file. This is a massive problem for traditional DLP solutions – they need time to set them up and get them working effectively.” He adds that, “The approach taken in most DLP solutions is to use artificial intelligence to try and index and classify data, and then monitor its movements via email, but it’s a huge task that requires a hefty amount of personalisation in a given organisation.”

As it is often an accidental click by an individual user that causes data breaches, Check Point’s DLP solution spots pre-determined file types and keywords in emails.


Most frequently cited root causes of data loss or theft. Source: Check Point/Ponemon.

It triggers a pop-up dialogue to the user at the point when they click ‘send’, asking if they really intend to send the attached file to that person’s address, and getting the user to check and confirm. This helps to educate users in security awareness.
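In outline, such a send-time check amounts to matching the outgoing message against predefined file types and keywords, then prompting the user rather than silently blocking. A minimal sketch – the extensions and keywords are invented for illustration, not Check Point’s actual rules:

```python
# Illustrative send-time DLP check; the file types and keywords below are
# assumptions for the sketch, not any vendor's real policy set.
SENSITIVE_EXTENSIONS = {".xlsx", ".csv", ".mdb"}
KEYWORDS = {"confidential", "customer list", "payroll"}

def needs_confirmation(body, attachments):
    """Return True if the client should ask 'do you really mean to send this?'"""
    lowered = body.lower()
    if any(keyword in lowered for keyword in KEYWORDS):
        return True
    return any(name.lower().endswith(ext)
               for name in attachments for ext in SENSITIVE_EXTENSIONS)

# At the point the user clicks 'send', the client raises a dialogue
# instead of quietly dropping or delivering the message:
if needs_confirmation("Please see the payroll summary.", ["summary.xlsx"]):
    print("This message appears to contain sensitive data. Send anyway?")
```

The design choice here matches Lowe’s argument: because most incidents are mistakes rather than malice, a confirmation dialogue both prevents the accidental send and trains the user.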

Fine tuning

A number of vendors are offering solutions that work alongside DLP technologies, to fine-tune them in one way or another. HP TippingPoint, for example, provides an additional layer of control that can be used to enhance existing DLP solutions. Simon Leech, manager of solution architects EMEA at HP TippingPoint, says: “There is no silver bullet as far as DLP is concerned. No one vendor today is able to provide a single solution that will fill in all of the exit points that an organisation has in terms of data loss. Any DLP solution is only as good as the way it is configured and the way that an organisation views the classification of their data. As good as some of the DLP solutions that are out there are, an organisation will almost be guaranteed a failed deployment if they haven’t thought about how they can integrate DLP into their organisational culture.”

“As good as some of the DLP solutions that are out there are, an organisation will almost be guaranteed a failed deployment if they haven’t thought about how they can integrate DLP into their organisational culture”

A hurdle that all organisations face is the classification of data and defining which data is genuinely sensitive. “In my opinion, one of the biggest reasons for failure has been the inability of an organisation to adequately classify all of the data that they own,” says Leech. “It’s a simple policy change to get an employee to tag a Word document as it is saved, but what about all the historical and archived files? And should it really be up to the employee to determine whether a document is tagged as classified, confidential, or unrestricted?”

He adds: “Before organisations even start to think about a product choice, they need to understand what they want to achieve from DLP, how current confidentiality policies support this, and what it would take to migrate the policies into a working solution.”

A further issue that has arisen with DLP implementation is addressing too-high rates of false positives and sorting important alerts from the rest. “The biggest problem is that, sometimes, it can take months to scan an entire network and you will keep getting hundreds and hundreds of quite useless alerts, so it is very difficult to interpret the information and find the critical alerts that could have a real impact on your bottom line,” says Marc Lee, sales director EMEA at Courion. Before running DLP, Courion’s Compliance Manager for Fileshare looks for high-risk areas, as opposed to the data itself, at a more granular level. The DLP tool can then be focused towards these high-risk areas.
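A tag-on-save hook of the kind Leech describes could, in a minimal sketch, suggest one of those three labels from simple content rules, leaving the employee to confirm or override. The rules below are invented for illustration; as he points out, adequate classification in practice is much harder than this:

```python
import re

# Hypothetical rules mapping content patterns to suggested labels; a real
# scheme would draw on many more signals than two regular expressions.
RULES = [
    (re.compile(r"\b\d{4}[ -]?\d{4}[ -]?\d{4}[ -]?\d{4}\b"), "classified"),      # card-like numbers
    (re.compile(r"\b(salary|payroll|contract)\b", re.IGNORECASE), "confidential"),
]

def suggest_label(text):
    """Suggest a classification tag at save time; the employee still confirms."""
    for pattern, label in RULES:
        if pattern.search(text):
            return label
    return "unrestricted"

print(suggest_label("Q3 salary review"))  # confidential
print(suggest_label("Lunch menu"))        # unrestricted
```

Even a crude suggestion like this shifts the default away from leaving every document untagged, while keeping the final decision with a human – which is exactly the gap Leech identifies for historical and archived files.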

DLP in the cloud

Gartner points out that “many [DLP] vendors are experimenting with alternative delivery methods such as SaaS and cloud for monitoring some types of network traffic”. The research group counsels: “Organisations should approach this cautiously and understand that detecting sensitive data in the cloud has data propagation issues that must be addressed, such as notifying third parties of the presence of sensitive data outside the organisation’s boundaries.”

“Organisations should approach this cautiously and understand that detecting sensitive data in the cloud has data propagation issues”

Steve Latchem, vice-president of The Solutions Group at Mastek, says: “DLP solutions work, in the majority, by perimeter/inhibition techniques: they physically stop the hijack, extraction or removal of targeted data elements using software technologies and monitors. However, the issue is that with the increasing use of cloud computing technologies, data needs to be shared and if it’s locked by whole data encryption, or barred from movement, sharing is impossible.”

Mastek’s Kameleon solution replaces critical data within a dataset but keeps it processable, so that cloud computing solutions can continue to work on the data while critical data elements with real values are secure. Kameleon enables these pseudonymised fields to revert to their original values once returned.

Another limitation of DLP has been its inability to classify unstructured and informal data, although as the intelligence of true DLP solutions improves, this issue may be declining. Mike Smart, product and solutions director EMEA at SafeNet, advocates encrypting all data in any case. “DLP is a great way of data monitoring and uncovering dangerous practices that can result in security breaches,” he says. “However, it works mainly with content classified as security-sensitive data and falls short of protecting high-value information that is difficult to categorise. To ensure that everything valuable is protected, businesses need to adopt a data-centric approach based on encrypting all data as it progresses through its lifecycle from the moment of data generation, to securing transactions and the process of data exchange, to controlling access to information.”

Starting points

For many businesses looking at protecting sensitive data across the network for the first time, the nuances of the artificial intelligence within DLP technologies are the least of their concerns. For them, DLP technology can be a valuable starting point for managing sensitive data. Marcus Ranum, CSO of Tenable Network Security, says: “I’d say that the current [DLP] technologies are vastly better than nothing, and nothing is, unfortunately, the norm.”

“The current [DLP] technologies are vastly better than nothing, and nothing is, unfortunately, the norm … If you are already managing firewall logs and intrusion detection alerts, it’s just another piece of information to be tuned and managed”

He recommends a pragmatic approach to implementing DLP solutions. “Generally, data loss prevention technologies integrate into your operational workflow as another source of alarms and alerts,” he says. “If you are already managing firewall logs and intrusion detection alerts, it’s just another piece of information to be tuned and managed.”

Ranum adds: “There are similar issues to an intrusion detection system, in terms of false positives or false negatives – if you start turning the rules off, you lose information. If you don’t, you may spend a certain amount of time rat-racing around after false alarms.”

He also warns: “As a computer security purist, I’d note that the time to have taken action was when the vulnerability was identified. That’s an issue with data leakage programs – you may be doing some barn-locking after horse departure. Like most computer security technologies, you have to manage data leakage technologies. I am sometimes concerned that organisations are looking at these technologies as a way to save manpower, but the reality is that everything needs care and monitoring if you expect to get any value from it.”

An effective DLP implementation will take into account user training before the solutions go live and will act to modify user behaviour in use. Despite all the precautions, though, it will still be possible for trusted users to inadvertently expose data in their communications. And a determined maverick is likely to be able to bypass the systems.

Aftermath of data loss

Robert Brown, vice-president of Eurasia Operations at First Advantage Litigation Consulting, comments: “In the decade or so that I’ve been conducting forensic investigations, the most common form of data leakage that I come across is the disgruntled employee trying to make money out of their employer’s intellectual property. I’ve seen everything from sending customer or price lists to a competitor, to taking a whole copy of their employer’s database and file servers before they head off to set up their own competing company.”

He adds: “While prevention is always preferable to cure, companies still need to be prepared to deal with the aftermath of data leakage. It sounds obvious, but in my experience, many companies focus all their efforts on the prevention side of this equation, and fail to consider the possibility that their measures might fail. After all, there’s no such thing as 100% secure when users are also part of that equation.”

About the author Tracey Caldwell is a freelance business technology writer who writes regularly on network and security issues. She is editor of Biometric Technology Today, also published by Elsevier.

Resources

• Proctor, Paul and Ouellet, Eric. ‘Magic Quadrant for Content-Aware Data Loss Prevention’. Gartner, 2 June 2010.
• ‘Understanding Security Complexity in 21st Century IT Environments’. Check Point and the Ponemon Institute, March 2011.

Speedy recovery: retrieving lost emails as part of an investigation

John Shaw, Head of Forensics, First Advantage Litigation Consulting

With the rise of the digital age and the widespread use of computers and electronic communication by individuals and corporations, the volume of emails and business correspondence stored electronically is growing exponentially. As a result, emails are one of the most common forms of documentary evidence requested and admitted as evidence in court, in the same way as other forms of evidentiary data. This means that, with tighter procedural requirements relating to electronic evidence, the way data is stored and recovered plays an integral role in complying with a legal or regulatory investigation.

Recovering missing data

This raises the question of how to retrieve lost or deleted emails, in order to effectively meet evidentiary requests. Recovering missing data, especially in the course of discovery, can be surprisingly simple or dauntingly complex, depending on how the email was deleted or lost. In order to explore which avenues exist for recovering the emails, we need to understand the nature of how they were lost. What do we actually mean when we say emails have been ‘deleted’? Is it a case of a user in the organisation having deleted emails from their inbox? Did the Exchange Server database become corrupt? Was it an automated data retention policy, based on maximum size or age, that deleted the relevant content? Did the CEO’s laptop crash, so that now the machine won’t reboot? Answering these questions is the key to enabling the relevant team of experts to find the most appropriate avenue for recovery.

Handle with care

The methods used to retrieve lost or deleted information can be as vital as the data itself. Computer data can be very fragile and can easily be irretrievably lost or permanently corrupted if searches for potential electronic evidence are not undertaken using professional procedures by forensic experts. Even turning on a computer may cause changes to critical data. Often, in the early stages of the retrieval process, individuals or organisations may act in haste to recover the data as quickly as possible without taking adequate care, which can lead to disastrous consequences. Beware of clumsy handling of the situation in the early stages of recovery, as this may add fuel to the fire and potentially cause irrevocable loss.

It is imperative that the process be handled with due diligence by professional forensic experts from the outset, so that recovered emails and communications can survive legal or regulatory scrutiny. There is a real risk in relying on internal IT departments alone to conduct the analysis or recovery, as this may create difficulties with the admissibility of the retrieved emails in court. This is because, under certain legal and regulatory procedures, internal IT resources may not be deemed unbiased parties to provide evidence,

