Network Content Filtering and Leak Prevention

by George J. Jahchan

Organizations today depend heavily on the Internet, intranets, and their network infrastructures to conduct business. Ensuring the security and integrity of data shared across networks is essential, especially in light of the regulatory and legislative mandates with which organizations must comply. At the same time, the enforcement challenges facing them are on the rise, and the need for effective security controls is greater than ever. Organizations implement technical controls to help enforce their security policies; in some circumstances, however, they also need to monitor the content of packets entering and leaving their networks to detect leaks of confidential information.

Signature- and behavior-based detection and prevention technologies depend on the automated recognition of anomalous conditions: in the first case through signatures, and in the second through deviation from known normal conditions (a baseline) beyond a set threshold. Preventing the unauthorized disclosure of proprietary or confidential data (information leaks) with these conventional technologies, such as intrusion detection or prevention, is difficult. Signature-based intrusion detection and prevention relies on attack signatures (bit patterns in packet streams); extending that approach to recognize words or word patterns inside application files (databases, office productivity documents, portable document files, or any of the numerous other file formats in use today) that would indicate an information leak is far harder.
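To make the distinction concrete, the minimal sketch below contrasts the two models in Python. The signature patterns, the baseline rate, and the 3x threshold multiplier are invented for the example, not drawn from any particular product.

    # Signature-based: flag traffic containing any known bit pattern.
    SIGNATURES = [b"\x90\x90\x90\x90", b"cmd.exe /c"]  # illustrative patterns only

    def signature_match(payload: bytes) -> bool:
        return any(sig in payload for sig in SIGNATURES)

    # Behavior-based: flag traffic that deviates from a learned baseline
    # by more than a set threshold (the 3x multiplier is arbitrary here).
    def baseline_exceeded(observed_rate: float, baseline_rate: float,
                          threshold: float = 3.0) -> bool:
        return observed_rate > baseline_rate * threshold

    print(signature_match(b"GET /run?cmd.exe /c dir HTTP/1.1"))          # True
    print(baseline_exceeded(observed_rate=1200.0, baseline_rate=300.0))  # True

Neither model, as the paragraph above notes, readily extends to recognizing confidential content inside arbitrary file formats.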

Conventional technology solutions such as identity and access management, security information management, content management systems, and digital rights management, individually or in combination, help organizations control who has access to sensitive data; however, once authorized access is granted, they offer little control over how that data is used.

In this chapter from volume 2 of the Information Security Management Handbook, Sixth Edition, we look at controls that can help organizations mitigate the risk of information leaks through networks.

Information-handling security policy should have teeth: a strong policy that clearly outlines the information-handling requirements of the organization and mandates disciplinary measures for policy violations is the first step in controlling information leaks through networks. But a policy without the means to enforce it remains ineffective.

Limiting the protocols and applications that network users may employ in connections to foreign networks reduces the vectors through which sensitive information can leak. Too many restrictions will, however, impede the business, so organizations must strike a balance between security and usability.
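As a minimal illustration of this default-deny approach, the sketch below checks outbound connections against an explicit allow-list; the protocol/port entries are assumptions chosen for the example, not a recommended policy.

    # Default-deny egress policy: only explicitly allowed protocol/port
    # pairs may cross the perimeter. Entries are illustrative assumptions.
    ALLOWED_EGRESS = {
        ("tcp", 80),    # HTTP, via the filtering proxy
        ("tcp", 443),   # HTTPS
        ("tcp", 25),    # SMTP, from the mail gateway only
    }

    def egress_permitted(protocol: str, dst_port: int) -> bool:
        return (protocol.lower(), dst_port) in ALLOWED_EGRESS

    print(egress_permitted("tcp", 443))   # True
    print(egress_permitted("tcp", 6881))  # False: classic BitTorrent port

Under such a policy, each new business requirement becomes a deliberate, reviewable addition to the list rather than an open door.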

Once this exercise is complete, and a clear picture of the traffic to be allowed is established, the attention can turn to the mitigation methods for permitted traffic. This chapter covers the most common vectors through which information can be leaked and suggests mitigating controls.

HTTP/FTP. Documents of any type can be uploaded to a Web site that is designed to "accept" attachments (Web-based e-mail, bulletin boards, etc.). Uniform resource locator (URL) filtering, which is typically part of the defense arsenal of companies, can help mitigate this risk. Free Web-based e-mail services are typically classified in the "Web mail" category of URL filtering solutions; access to these services can thus be curtailed by implementing appropriate security controls over Web access (functionality available either as a stand-alone solution or as an add-on to existing Web caching servers from several vendors). The residual risk comes from uncategorized sites. Denying access to such sites further reduces that risk, but may be unacceptable to the business. Either way, insofar as leak control is concerned, URL filtering is binary and lacks granularity: a site is either reachable or it is not.
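The sketch below shows the category-lookup logic such a filter applies. The category database, the blocked-category set, and the uncategorized-site policy flag are all illustrative assumptions; commercial products ship large, vendor-maintained categorization databases.

    from urllib.parse import urlparse

    CATEGORY_DB = {                      # hypothetical categorization data
        "mail.example.com": "Web mail",
        "news.example.org": "News",
    }
    BLOCKED_CATEGORIES = {"Web mail"}
    BLOCK_UNCATEGORIZED = False          # stricter setting lowers residual risk

    def url_allowed(url: str) -> bool:
        host = urlparse(url).hostname or ""
        category = CATEGORY_DB.get(host)
        if category is None:             # the residual risk lives here
            return not BLOCK_UNCATEGORIZED
        return category not in BLOCKED_CATEGORIES

    print(url_allowed("https://mail.example.com/compose"))  # False: "Web mail" blocked
    print(url_allowed("https://unknown.example.net/"))      # True unless uncategorized denied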

HTTPS/SFTP/SSH and other encrypted traffic. The scenario is similar to the preceding one: control is binary and lacks granularity. Once access is granted, no further control over content is possible, because the payload is encrypted.
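Because the payload itself cannot be inspected, the practical control is recognizing and gating the protocol. A minimal sketch: an SSH connection opens with an "SSH-" identification string (per RFC 4253), and a TLS handshake record, as used by HTTPS, begins with content-type byte 0x16; anything else falls through as unknown.

    def classify_first_bytes(data: bytes) -> str:
        """Identify an encrypted protocol from the first bytes of a session."""
        if data.startswith(b"SSH-"):
            return "ssh"
        if len(data) >= 3 and data[0] == 0x16 and data[1] == 0x03:
            return "tls"                 # TLS handshake record, e.g., an HTTPS ClientHello
        return "unknown"

    print(classify_first_bytes(b"SSH-2.0-OpenSSH_9.6"))                 # ssh
    print(classify_first_bytes(bytes([0x16, 0x03, 0x01, 0x02, 0x00])))  # tls

A perimeter device can then allow or deny the session as a whole, which is exactly the binary control described above.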

Peer-to-peer applications. Risk is best mitigated by preventing the use of such applications. A combination of controls at different layers can be used for maximum effectiveness.

On desktops in Active Directory (AD) environments, group policies can prevent users from installing or running unauthorized applications, including peer-to-peer clients.

On desktops in all Windows environments, desktop security solutions available from several vendors help organizations control desktop usage and prevent the installation or execution of peer-to-peer applications. These can be used stand-alone or in combination with AD group policies in AD environments.

At the network layer, periphery defenses can be configured to block peer-to-peer traffic, with varying degrees of effectiveness.
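As one example of the network-layer approach, the sketch below flags BitTorrent traffic. The handshake genuinely begins with a length byte of 19 followed by the string "BitTorrent protocol"; the port range is an illustrative assumption, and the varying effectiveness noted above stems from clients that randomize ports or obfuscate the handshake.

    # Flag BitTorrent by its handshake signature or by classic port range.
    BT_HANDSHAKE = b"\x13BitTorrent protocol"     # 0x13 = 19, the length prefix
    CLASSIC_BT_PORTS = range(6881, 6890)          # illustrative; modern clients randomize

    def looks_like_bittorrent(payload: bytes, dst_port: int) -> bool:
        return payload.startswith(BT_HANDSHAKE) or dst_port in CLASSIC_BT_PORTS

    print(looks_like_bittorrent(BT_HANDSHAKE + b"\x00" * 8, 50321))  # True: signature hit
    print(looks_like_bittorrent(b"GET / HTTP/1.1\r\n", 80))          # False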

Electronic mail (corporate mail systems). Technical solutions exist to (i) inspect the content of messages and attachments (specific file formats) or (ii) archive all or selected mailboxes. Encrypted e-mails or attachments would, however, be difficult to inspect with either of these solutions. In the first case, if the business allows it, rules can specify that unrecognized or encrypted file formats be automatically blocked.
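A minimal sketch of rule (i), using only the Python standard library: identify each attachment by its leading magic bytes and block anything unrecognized or encrypted. The magic-byte table is a small illustrative subset, the block-by-default policy is the assumption stated above (the business permits it), and the encrypted-archive test uses the ZIP format's standard encryption flag bit.

    import io
    import zipfile
    from email import message_from_bytes

    KNOWN_MAGIC = {b"%PDF": "pdf", b"PK\x03\x04": "zip",
                   b"\xd0\xcf\x11\xe0": "legacy-office"}

    def attachment_verdict(data: bytes) -> str:
        fmt = next((name for magic, name in KNOWN_MAGIC.items()
                    if data.startswith(magic)), None)
        if fmt is None:
            return "block: unrecognized format"
        if fmt == "zip":
            with zipfile.ZipFile(io.BytesIO(data)) as zf:
                if any(info.flag_bits & 0x1 for info in zf.infolist()):
                    return "block: encrypted archive"   # bit 0 marks an encrypted member
        return "allow"

    def scan_message(raw: bytes) -> list:
        """Return a verdict for every named attachment in a raw RFC 822 message."""
        msg = message_from_bytes(raw)
        return [attachment_verdict(part.get_payload(decode=True) or b"")
                for part in msg.walk() if part.get_filename()]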

General controls. Network forensics solutions that capture and store all (or filtered) traffic enable the reconstruction and replay of previously "recorded" sessions, allowing organizations to spot security policy violations after the fact. The technology does have limitations, though: it is expensive and requires expertise to operate effectively. Furthermore, although encrypted traffic can be recorded "as is," its clear-text content cannot be viewed unless the organization has prior knowledge of the encryption algorithms and associated keys, which is rarely the case. HTTPS and SSH are common methods of transferring data in encrypted form.
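A minimal sketch of the capture-and-store idea, assuming the third-party Scapy library and a host with packet-capture privileges; real forensics products add the session reassembly, indexing, and replay that this does not attempt.

    from scapy.all import sniff, wrpcap, rdpcap   # third-party: pip install scapy

    # Capture a slice of traffic; the BPF filter limits what is recorded.
    packets = sniff(count=200, filter="tcp port 80")
    wrpcap("capture.pcap", packets)               # store for later reconstruction

    # Later, an analyst reloads the recording and reviews each packet.
    for pkt in rdpcap("capture.pcap"):
        print(pkt.summary())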

In addition, archive tools (such as WinZip) now offer built-in strong symmetric encryption (up to 256-bit Advanced Encryption Standard [AES]). A document encrypted with a strong key that is transferred to the addressee out-of-band cannot be viewed unless the sender discloses (or is forced to disclose) the encryption method and key used. Matters are even more difficult with symmetric keys negotiated online through an asymmetric key exchange, as during Secure Sockets Layer session establishment.
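Since such content cannot be decrypted without the key, one common fallback (a general technique, not something this chapter prescribes) is flagging payloads whose byte entropy approaches the roughly 8 bits per byte typical of ciphertext. A minimal sketch follows; note that compressed files score similarly high, so this flags candidates rather than proving encryption.

    import math
    import os
    from collections import Counter

    def shannon_entropy(data: bytes) -> float:
        """Bits per byte: plain text sits low, ciphertext approaches 8.0."""
        if not data:
            return 0.0
        n = len(data)
        return -sum(c / n * math.log2(c / n) for c in Counter(data).values())

    print(round(shannon_entropy(b"confidential quarterly report " * 100), 2))  # low: readable text
    print(round(shannon_entropy(os.urandom(2000)), 2))                         # ~8.0: looks encrypted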

Conclusion
The technology designed to protect highly sensitive data from leaks through networks is complex and expensive, both to acquire and to operate, and its effectiveness depends on what traffic an organization allows through its periphery.

Encryption is a double-edged sword: it helps ensure the confidentiality of information traveling across networks, but it also deprives organizations of visibility into what information is leaving their networks.

To combat information leaks through networks effectively, organizations must follow the continuous information security plan cycle: assess, design, implement, educate, monitor, and correct. Security personnel's awareness and understanding of the vectors that ill-intentioned persons could use to sneak sensitive or confidential information out of a network are key to mitigating the risk.

 



© Copyright 2008 Auerbach Publications