Where to start with Data Loss Protection
Data Loss Protection (DLP) is a strategy for ensuring that sensitive or critical information is not sent outside the corporate network, whether accidentally by end users or deliberately by malicious actors. A DLP strategy should only be introduced within organizations that already have a mature security infrastructure.
There are several common scenarios that should be planned for, each of which requires its own defensive strategy:
- An end user accidentally sending, releasing, or transferring information that they are normally authorized to access.
- An end user purposefully sending, releasing, or transferring information that they are normally authorized to access.
- A malicious actor purposefully extracting data for harmful use.
Types of DLP
It’s essential to remember that no DLP solution is going to keep your data 100% safe. Almost everyone has a smartphone and can screenshot data with very little chance of detection, or send it via any number of chat programs (Skype, Google Hangouts, etc.). Short of operating a SCIF (Sensitive Compartmented Information Facility), most DLP programs should focus on protecting against the loss of larger amounts of data and against honest accidents.
- Endpoint protection - Some endpoint solutions already contain DLP options; other solutions come with their own endpoint clients that offer additional features.
- Network protection - Network traffic monitoring can enhance visibility into data in motion and flowing between segments, organizations, or leaving the enterprise.
- Physical - Any solution should also include proper planning for physical copies of data. Printed copies, hard drives, or other equipment that aren’t secured can also be a high risk that shouldn’t be forgotten.
There are several organizational and procedural factors that come into play when implementing a successful DLP solution.
A proper DLP solution cannot be sufficiently implemented without proper asset management and data classification. Attackers profit from confidential data by selling it on the black market, using it to expose or cripple a specific organization, to commit fraud, or to aid in identity theft. While many industry compliance standards such as HIPAA and PCI DSS attempt to dictate the type of information that should be specifically guarded and segregated, that may not be the only data classified as confidential in an organization. There may also be contracts and other legal measures that must be consulted for classification and protection of certain data. Steps for correctly classifying data can be described as follows:
1. Identify data sources to be protected.
Completion of this step should produce a high-level description of each data source: where it resides, existing protection measures, data owners and custodians, and the type of resource. Obtaining this information can be difficult, but it can be gathered as part of the documentation process as data owners and custodians are assigned and documented. There are software solutions specifically created for e-discovery of data at rest in an organization.
2. Identify information classes.
Information class labels should convey the protection goals being addressed. Classification labels like Critical and Sensitive have different meanings to different people, so it is important that high-level class descriptions and associated protection measures are meaningful and well defined both to the individuals who will be classifying the information and to those who will be protecting it.
3. Map protections to the information classification levels.
Security controls such as differing levels and methods of authentication, air-gapped networks, firewalls/ACLs, and encryption are some of the protections involved in this mapping.
4. Classify and protect information.
All information identified in step 1 should now be classified as dictated in step 2 and protected as described in step 3.
5. Repeat as a necessary part of a yearly audit.
Data footprints are ever expanding. New software is installed or upgraded with add-ons and now data has grown or changed in scope. A yearly audit of the current data footprint in the enterprise will be required to ensure data continues to be protected as documented.
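Steps 2 and 3 above (defining classes and mapping protections to them) can be sketched as a simple data structure. This is a minimal illustration only; the class names and control labels are hypothetical examples, not a recommended baseline for any particular organization.

```python
# Hypothetical sketch: map information classes to required protections,
# then check a data source's controls against its class.
from dataclasses import dataclass, field


@dataclass
class InformationClass:
    name: str
    description: str
    required_controls: list = field(default_factory=list)


# Illustrative scheme only -- real class names and controls should come
# from your own classification exercise (steps 1-3 above).
CLASSIFICATION_SCHEME = {
    "public": InformationClass(
        name="Public",
        description="Approved for external release",
        required_controls=[],
    ),
    "sensitive": InformationClass(
        name="Sensitive",
        description="Internal business data",
        required_controls=["authentication", "encryption-at-rest"],
    ),
    "classified": InformationClass(
        name="Classified",
        description="Regulated or contractual data (e.g. HIPAA, PCI DSS)",
        required_controls=["mfa", "encryption-at-rest",
                           "encryption-in-transit", "network-segmentation"],
    ),
}


def missing_controls(data_class: str, controls_in_place: set) -> set:
    """Return the required controls not yet applied to a data source."""
    required = set(CLASSIFICATION_SCHEME[data_class].required_controls)
    return required - controls_in_place


# A file server holding Sensitive data that only has authentication:
print(missing_controls("sensitive", {"authentication"}))
```

Keeping the mapping in a machine-readable form like this also makes the yearly audit in step 5 easier to automate: each documented data source can be re-checked against its class as the footprint changes.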
How many different applications have access to your most sensitive data? Over what ports, protocols, or accounts do they access it? What does that traffic normally look like during an average day? What are all of the egress points on the network? Without answers to these questions, it is difficult to know where your data is supposed to reside, how it should be transferred, and from where it should be accessed. There are three types of data that should be mapped.
- Data at Rest: Data in databases, on file servers, or in custom applications not only should be identified as previously stated, but also encrypted and in a secure physical location.
- Data in Motion: Common avenues of data access to pay attention to (and their ports) are FTP (21), SFTP (22), SMTP (25), HTTP/APIs (443/80/8080), and SMB (445). Any network monitoring solution should be able to alert on sensitive data in transit. Knowing where the sensitive data is and how it is structured will go a long way in ensuring the correctly formatted rules are in place.
- Data in Use: Data that can be exfiltrated from user endpoints via CDs, USB drives, or copy/paste to external websites. A comprehensive DLP solution should have the ability to analyze data before it is transferred to removable media, and automatically encrypt sensitive information as it is stored on the removable media. It should also prevent data from being printed, faxed, or copied into memory to paste into another document.
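The "correctly formatted rules" mentioned above usually come down to content inspection. As a minimal sketch of the idea, the example below detects candidate payment card numbers (PCI DSS scope) with a regex plus a Luhn checksum to reduce false positives; the pattern and function names are illustrative, not taken from any specific DLP product.

```python
# Minimal sketch of pattern-based content inspection, the kind of rule a
# DLP engine applies to data in motion or before a copy to removable media.
import re

# 13-16 digits, optionally separated by spaces or hyphens.
CARD_CANDIDATE = re.compile(r"\b(?:\d[ -]?){13,16}\b")


def luhn_valid(number: str) -> bool:
    """Luhn checksum: weeds out most random digit strings."""
    digits = [int(d) for d in number if d.isdigit()]
    checksum = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0


def find_card_numbers(text: str) -> list:
    """Return substrings that look like valid card numbers."""
    return [m.group() for m in CARD_CANDIDATE.finditer(text)
            if luhn_valid(m.group())]


print(find_card_numbers("order ref 1234, card 4111 1111 1111 1111"))
```

Real products layer many such techniques (exact and partial document matching, database fingerprinting, and so on), but even a simple checksum check like this illustrates why knowing how your sensitive data is structured matters for rule quality.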
Top Threats and Risks
After there is a good handle on what, where, and how data moves, you can start to paint a picture of the different levels of threats and risks the organization is most likely to need protection against. Assessing threats and risks will be incredibly different for each and every organization. Each internal and external footprint is unique when combined with the individual infrastructure involved. Assessing these includes both a high-level overview as well as in-depth knowledge of assets. Without the knowledge of the threats and risks your organization faces, it is more difficult to custom fit technologies like DLP to provide a suitable defense.
Communicate and Develop Controls
Processes and Policies
It is all well and good that you know the policy on not sharing information with others, but if the rest of the organization is unaware, it provides little benefit. Policy documents disseminate information for others to consume. They also set rules and boundaries; by having clearly defined rules it becomes equally clear when someone breaks those rules, which enables appropriate action to be taken. Departments like Human Resources find it difficult to reprimand someone because it “feels like” they may have done something wrong; a clear contravention of a rule is easier to enforce. The policy set can also establish the overall tone of a company’s security posture: even where something is not explicitly laid out, the policy set conveys an organization’s overall approach to security.
Once an organization understands the circumstances under which data is moved, user training can often mitigate the risk of accidental data loss by insiders. Employees often don’t recognize that their actions can result in data loss, and will self-correct when educated. Repetition is a proven, successful way to bridge the compliance gap, teach users real-life skills, and help secure the infrastructure that we are responsible for protecting.
Any type of DLP deployment should be thoroughly tested in a smaller, more controlled implementation before being introduced to the larger enterprise. Begin with simple rules in monitor mode that allow you to observe what will eventually be allowed, blocked, or alerted on. This period gives you the ability to modify strategy and fine-tune rules for a smoother rollout on a larger scale.
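The monitor-mode idea above can be sketched as follows: rules are evaluated against events but only logged, never enforced, so you can see what a later blocking policy would have done. The rule and event fields here are hypothetical, not drawn from any specific DLP product.

```python
# Sketch of a "monitor mode" pilot: evaluate rules, log outcomes, enforce
# nothing until the observation period ends.
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("dlp-pilot")

# Illustrative rules only -- names, actions, and channels are assumptions.
RULES = [
    {"name": "usb-copy", "action": "block", "channel": "removable-media"},
    {"name": "smtp-ssn", "action": "alert", "channel": "email"},
]

MONITOR_MODE = True  # flip to False only after the pilot period


def handle_event(event: dict) -> str:
    """Return what happened to an event: 'logged', 'allowed', or an action."""
    for rule in RULES:
        if event["channel"] == rule["channel"]:
            if MONITOR_MODE:
                # Record what *would* have happened; take no action.
                log.info("would %s: rule=%s user=%s",
                         rule["action"], rule["name"], event["user"])
                return "logged"
            return rule["action"]
    return "allowed"


handle_event({"channel": "removable-media", "user": "alice"})
```

Reviewing the "would block" log during the pilot shows which rules would disrupt legitimate workflows, so they can be tuned before enforcement is switched on enterprise-wide.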
Don’t just assume that the initial rollout of a DLP solution is working and will continue to work. Reassessments, testing of controls, and change processes should also occur on a regular basis. Like the majority of security implementations, it will not be a one-time project, but a process that continues to grow and be shaped by the type of data the organization holds. Add to regular penetration tests a section where DLP is tested from an offensive viewpoint.
DLP Questionnaire Worksheet
You will need to answer the questions listed below to start creating a solid DLP process and architecture.
- Do you support network, endpoint and storage DLP? If not, which ones do you offer?
- Do you support multiple "channels" (network, storage, endpoint) using a single management console and a single policy definition interface? If not, how do these pieces break out?
- For each "channel" (network, endpoint, storage), which content analysis techniques do you support? Please describe in detail (e.g., pattern matching, partial document matching, database fingerprinting).
- Which endpoint operating systems do you support, and what are the performance requirements (memory/processor)? Are there content-aware policy limitations based on the operating system or system specifications?
- What activities can you monitor, and what can you block on endpoints (without requiring an active connection to the server) using content-aware policies? At a minimum, please specify if you support scanning local storage, monitoring/blocking portable storage and monitoring network activity.
- How do you monitor storage (data at rest) activity? Which network file access protocols and document management systems do you support (e.g., CIFS), and do you require or offer an endpoint agent?
- Do you include an email MTA in the product for scanning, quarantining and filtering email? If not, how do you provide DLP for email?
- Describe your network monitoring deployment models (e.g., passive sniffing on SPAN port).
- Can you monitor and control SSL encrypted network traffic? If so, does this require integration with an external SSL proxy? Describe the technique used.
- Can you monitor generic ports and protocols, or are you limited to only particular port/protocol combinations (and how does this affect performance)?
- How many endpoints, storage repositories and network gateways can a single management appliance support?
- What type of data are you aiming to protect?
- What levels of protection will you split the classification into? E.g., “High, Medium, Low” or “Classified, Sensitive, Public”
- How many different applications have access to your more sensitive data?
- Over what ports or protocols or accounts do they access it?
- What does that traffic normally look like during an average day?
- What are all of the egress points on the network?
- How many people have access to the most sensitive data?
- Is any sensitive data available from public sources?
Threats & Risks
- What data would an attacker gain the most from?
- What data, if lost, would your organization lose the most money from?
- Are there specific threats that are industry specific?
- Have there been incidents before that would make for good use cases?
- Are there already policies on information sharing?
- Are new controls being implemented that require documentation and policies?
- What level of education do end users currently have?
- What new controls will end users need to be educated on?
- Should it integrate with Email?
- Is access to cloud sites already allowed?
- Does BYOD come into play?
- What subsection of data will the DLP solution focus on initially?
- Are there proper procedures in place to handle the solution finding or not finding a DLP issue?