Encryption Everywhere: Ensuring Access Compliance beyond a Doubt

by KJ (Ken) Salchow, Jr.

As enterprises, particularly those in the financial and healthcare verticals, become increasingly concerned not only with the availability of their data but also with its confidentiality and integrity, they are investigating architectures that hearken back to the days of centralized computing: a time when immense security could be obtained by simply locking the door to the mainframe room.

Modern enterprises realize that the advent of distributed computing, epitomized by the Internet, has opened their data to vulnerabilities from outside the organization. They are also starting to question why the efforts to protect that data from the outside world stop at their front doors. Increasingly strict and well-defined identity-theft and privacy laws are beginning to require both that data be held secure against internal personnel who lack a specific need to access and view it, and that audit capabilities exist to prove that compliance.

While returning to a purely centralized computing model is not a viable alternative in the global economy, it is possible to inject elements that imitate the innate physical security attributes of the centralized model into the distributed model by means of technological controls. Architectural analysts are exploring the capabilities of WAN optimization to enable a re-centralization of data in the corporate data center, reversing the trend of data dispersion to branch offices that has taken place over the last decade. This allows much tighter control over physical access to data and data backups, along with a centralized approach to network access. Another trend is the increasing attention to client integrity or "health" checking by the likes of SSL VPN vendors, Microsoft (NAP), and Cisco (NAC). This is an attempt to recreate the stability and security of the dumb terminals used in centralized computing: terminals that were not capable of hosting and distributing malicious code throughout the corporate infrastructure. By creating security profiles that let enterprises verify and validate that a requesting client is "safe," they can increase the assurance level that these clients will not mishandle corporate data or inadvertently expose it to unauthorized sources.

The last major characteristic of centralized computing, which is only now beginning to be addressed in enterprise architecture, is the dedicated, proprietary cabling that connected dumb terminals to the centralized data store, providing confidentiality of data in transport. This characteristic is being simulated by redeploying "remote access" technologies on the internal network, providing "encryption everywhere" to confront the last bastion of uncertainty in distributed networking. While this idea presents a viable way to assure that authorized individuals can see only the data they need, and nothing more, it also raises a host of issues with the current security deployments of most enterprise networks. It is a new way of thinking.

Technically, the ability to encrypt all network traffic has been possible for some time, and it becomes easier as processing power continues to increase on both the client and the server and network bandwidth capacity continues to grow. Although many of these technologies were introduced to provide this solution in the local area network and were originally deployed to that end, implementation issues and compatibility problems relegated most of them to being repositioned as remote access solutions only. These same issues continued to plague IPSec even as a remote access solution, bringing about the advent of SSL/TLS-based technologies, which overcame many of the management and deployment hurdles.

Combine the increasing flexibility and ease of management of SSL/TLS-based remote access solutions with continuing advances in hardware-based encryption, and you have a viable alternative to reintroduce into the local area network. Add the re-centralization of the data center, and the fact that these solutions have been one of the driving forces behind "endpoint security" efforts, and the use of these solutions on the internal network becomes the logical conclusion and integration point of the trend toward centralized, distributed computing: the security characteristics of the centralized model combined with the need for distributed access. Additionally, because these solutions are already deployed successfully in the wide area network, and increasingly in the wireless LAN environment, they present a unified platform from which to control all network access to all network resources. They create a centralized, simplified access security model and the lock on the mainframe room door.
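As a concrete (and admittedly modern) sketch of the idea, the heavy lifting of an encrypted tunnel is just a TLS wrapper around an ordinary TCP socket: any application protocol can ride inside it. The helper below is hypothetical and uses Python's standard `ssl` module; the host name in the comment is illustrative only.

```python
import ssl

def make_tunnel_context() -> ssl.SSLContext:
    """Build a client-side TLS context for an internal encrypted tunnel."""
    # A hypothetical baseline policy: certificate validation on, legacy
    # protocol versions off. The same context can wrap any TCP socket,
    # which is what makes one encrypted transport for all traffic feasible.
    ctx = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    return ctx

# Wrapping an arbitrary socket (server_hostname drives certificate checks):
#
#   with socket.create_connection(("fileserver.corp", 443)) as sock:
#       with ctx.wrap_socket(sock, server_hostname="fileserver.corp") as tls:
#           tls.sendall(b"...any application protocol...")
```

The design point is that the encryption layer is independent of the application protocol, so the same mechanism covers internal LAN, WAN and wireless access alike.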

Driving Assurance and Compliance
Assurance and compliance are two tenuous goals; neither can easily be defined, and both need to be defended. Compliance is really the end goal of many security implementations today. Let's face it: if enterprises weren't mandated to provide security and confidentiality of their data, most of them would find reasons not to bother. Compliance is so nebulous because of the ever-growing number of regulations and the fact that most of them were not written by information security practitioners, making them difficult to interpret and hard to implement. Compliance may be the stated goal, but without assurance, you don't really have much at all. Assurance can mean many things, but in a nutshell it is the level of trust and confidence you have that your security implementation achieves its stated goals, together with the ability to prove it. That last part is always the key: the ability to prove it. You must be able to defend your implementation as complying with regulations, and you must be able to back that up by showing how you ascertain assurance.

One of the key aspects of providing assurance (and thus proving compliance) is the management and auditing of the security implementation. The ability to show both the policy and evidence of the policy in action is critical. Providing unified policy management and auditing in the current information security landscape can be a daunting task as the number of participating devices continues to increase. Many organizations are looking at "correlation" software that unifies the logs of multiple devices in an attempt to make sense of their diverse and varying security implementations. However, these systems tend to be "black boxes" that must communicate with an increasingly divergent number of vendors, applications and devices, and whose major value appears to lie in the intelligence and accuracy of their proprietary heuristic algorithms. This leads one to question how accurate any of them are, and thus the true value of the data they provide. Unified management systems suffer from many of the same drawbacks and often lack the deep integration needed to truly create and deploy enterprise-wide policies across disparate devices and technologies.

Applying the principles we've discussed, however, provides a new and unique opportunity to prove compliance by increasing assurance. By re-centralizing an organization's data and employing point-to-point encryption between the client system and that data, regardless of whether the client is internal or half a world away, we provide a single path that requires policy and audit: a single point of access. Employing endpoint security on those clients further aids assurance and compliance. This reduction in complexity can virtually eliminate the need to correlate data between multiple systems in order to provide a clear picture of what data is being accessed, by which users, and where that access is coming from. Having a unified "choke point" to mediate this single point of access makes policy creation and deployment a singularly simple process.
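To illustrate the audit benefit, a single choke point means one authoritative access log instead of correlated fragments from many devices. The sketch below is a hypothetical, minimal audit trail; the class, field and source-label names are all invented for illustration.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AccessRecord:
    user: str
    resource: str
    source: str          # e.g. "internal-lan" or "remote-vpn" (illustrative)
    allowed: bool
    when: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class ChokePointAudit:
    """Hypothetical audit trail kept at the single point of access."""

    def __init__(self) -> None:
        self.records: list[AccessRecord] = []

    def record(self, user: str, resource: str, source: str, allowed: bool) -> None:
        self.records.append(AccessRecord(user, resource, source, allowed))

    def accesses_by(self, user: str) -> list[AccessRecord]:
        # Answering "which data did this user touch?" needs no log
        # correlation: every access already passed through this point.
        return [r for r in self.records if r.user == user and r.allowed]
```

Because internal and remote access traverse the same choke point, one query answers what would otherwise require merging firewall, VPN and server logs.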

This is a perfect example of how complexity is the enemy of good security. Simplicity, achieved not through more software or systems but through the unification of these processes, is the harbinger of assurance.

Roadblocks to Success
The advantages of simplifying access, creating assurance and direct auditing capabilities, are significant. There are, however, two significant roadblocks to the adoption of a unified access methodology employing "encryption everywhere". The first relates to the patchwork of security devices already architected and deployed within the enterprise network. The second is the very nature of how networks are architected, managed and provisioned in the modern enterprise.

Traditional network firewall systems limit access to the network via combinations of IP address and port, with port-level control being one of the significant security mechanisms. Many today also perform some level of protocol analysis to ensure that the traffic traversing a port is appropriate for that port. In an "encryption everywhere" approach with SSL/TLS, all access arrives on a single port: 443. To further complicate matters, the protocol of the actual traffic traversing the tunnel is invisible to the firewall, so it cannot perform any protocol analysis of that traffic, other than verifying that the tunnel itself is truly SSL/TLS. In essence, at least up to the point of tunnel termination, traditional firewalls become nearly useless. In much the same way, IDS/IPS systems, which monitor network traffic to detect malicious activity, would also be blinded: in an "encryption everywhere" paradigm they are unable to see the actual traffic within the encrypted tunnel. These systems would have to be relegated to very specific points in the network where traffic is not encrypted (within the secured data center only), or used out-of-band where they can decrypt and analyze the data but have no control over the traffic itself. The same can be said for virtually any security device that relies on the ability to "see" the traffic to determine its nature and validity.
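The one check a firewall can still make, verifying that the tunnel itself is truly SSL/TLS, amounts to little more than inspecting the first bytes of a flow for a plausible TLS record header. A minimal sketch (the function name is illustrative):

```python
def looks_like_tls(first_bytes: bytes) -> bool:
    """Check whether the opening bytes of a flow resemble a TLS handshake record."""
    # About all a blinded firewall can still verify: record content type
    # 0x16 (handshake) followed by a TLS/SSL3 record version, major byte
    # 0x03 with a small minor byte. Everything past the record header is
    # opaque ciphertext once the handshake completes.
    return (len(first_bytes) >= 3
            and first_bytes[0] == 0x16
            and first_bytes[1] == 0x03
            and first_bytes[2] <= 0x04)
```

Note how little this tells the firewall: an HTTP request, a SIP call or an exfiltration stream inside the tunnel all look identical from outside it.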

Outside the security arena, these same issues can drastically impact the way enterprise networks are deployed, maintained and provisioned. Without the ability to "see" the traffic, technologies like rate shaping and in-line compression across wide-area links would either become useless or need to be redeployed deeper into the network than their edge deployments today. Rate-shaping systems would lose many of the protocol- or data-specific capabilities they use to prioritize some traffic over other traffic. With current deployments, it would become virtually impossible under "encryption everywhere" to tell the difference between SIP traffic traversing the encrypted tunnel and standard Web browsing activity. Packet sniffing for troubleshooting purposes would encounter similar issues, as would sophisticated traffic monitoring and modeling tools. Most of the network would carry only "one" kind of traffic: the encrypted kind.

An additional issue is that most users today take for granted the ability to work in peer-to-peer fashion, sharing files and resources directly between two endpoints. While an "encryption everywhere" or "universal access" methodology does not completely remove this capability, it does drastically change its nature. Peer-to-peer access would no longer be at the sole discretion of the users but would need to be mediated as part of the access policy. From an information security perspective, this is probably a very good thing; peer-to-peer traffic is one of the primary culprits in the spread of viruses and the leakage of sensitive data outside the enterprise. This point will most likely be lost on the millions of corporate users who might find this advantage akin to "big brother" (which, of course, it is), but it adds tremendously to the concepts of compliance, assurance and the basic tenets of information security.

There are undoubtedly many more complications to an "encryption everywhere" paradigm, but it is just as certain that, like these examples, they all boil down to one primary roadblock: abhorrence of change. Change is scary. Users are afraid that they will lose essential capabilities; IT personnel are afraid that they will lose their jobs; and enterprise managers are afraid of having to justify changes in the architecture, and the possible removal of capital assets, without admitting that all the time and money they've spent in the past is still insufficient to provide the level of assurance and security the organization truly needs.

A World of Difference
Fortunately, about the only constant in the world of technology is that it rarely remains constant; change is inevitable. Sooner or later, fear of change is overtaken by fear of not changing. It is therefore important to understand the benefits of an "encryption everywhere" approach as well as what future technology can bring to that design.

Obviously, encrypting all data in transport to keep it from unauthorized view is an easily understood benefit of "encryption everywhere". What might not be so obvious, however, is how this also fortifies general access assurance. If all data must be transmitted across an encrypted tunnel, then, in combination with traditional AAA mechanisms and newer endpoint security, access to data can also be restricted by denying the instantiation of the tunnel itself. Unlike today, where users may not be able to access the data but can still attack the resource, the inability to even communicate with the network on which the resource lives substantially increases its security. This fits nicely with recently introduced Network Access Control (generic NAC) initiatives and the use of 802.1X, which protect access to the general network.

Because all access traffic is treated uniformly from a transport standpoint, the information obtained to authorize an encrypted tunnel can also be used to modify the access. Integrating this information with traditional packet-based firewalls can allow policy-driven, dynamic firewall rules on a per-connection basis. Different rules can then be applied based not only on source and destination, but also on the state of the client machine, the trust level of the network being used to create the connection, or even the trust level of the device type itself (e.g., PDA versus laptop versus home machine). These features are already employed on remote access networks and, via a unified strategy, could easily benefit the internal network as well.
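A per-connection policy of this kind can be sketched as a function from the tunnel-authorization context to a rule set. Everything below, the device types, trust levels and rule names, is hypothetical and purely illustrative:

```python
def tunnel_rules(device: str, posture_ok: bool, network_trust: str) -> set[str]:
    """Hypothetical per-connection policy derived from tunnel authorization."""
    # A real deployment would push these rules to the packet firewall at
    # the moment the tunnel is instantiated; here they are just labels.
    if not posture_ok:
        return {"remediation-vlan"}          # failed client health check
    rules = {"intranet-web", "email"}        # baseline for any healthy client
    if device == "laptop" and network_trust == "corporate":
        rules |= {"file-shares", "erp"}      # full access for trusted context
    return rules
```

The same client thus receives different firewall rules from a PDA on a public network than from a managed laptop on the corporate LAN, with no static address-based configuration.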

Controlling access via a unified, encrypted tunnel also provides some unique opportunities to realize the marketing hype of a "self-healing" or "self-defending" network. Requiring traffic to traverse an encrypted tunnel that is authorized and substantiated on a per-client basis makes it very difficult to spoof; maybe not impossible (although you could employ client-side authentication as well), but very difficult at least. If you then integrate your IPS with whatever device is authorizing and instantiating tunnels, suspect users can be completely disconnected from the network, requiring them to re-authenticate and re-validate their system against the enterprise security policy. This can stop attacks and force the system propagating them to be completely cut off from the enterprise infrastructure, making attacks more difficult and time-consuming for an intentional offender and giving the unintentional offender the ability to remediate their machine before too much damage is done. Logging of such offenses would be straightforward and accurate.
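The IPS integration described above can be sketched as a tunnel broker that the IPS calls to quarantine a suspect user: the session is torn down, an audit record is left behind, and the client must re-authenticate before touching the network again. The class and method names are invented for illustration:

```python
class TunnelBroker:
    """Hypothetical device that authorizes tunnels and can tear them down."""

    def __init__(self) -> None:
        self.active: dict[str, str] = {}   # user -> session id
        self.audit: list[tuple] = []       # (user, session, reason) records

    def authorize(self, user: str, session_id: str) -> None:
        self.active[user] = session_id

    def quarantine(self, user: str, reason: str) -> bool:
        # Called by the IPS when it flags a user as suspect. Dropping the
        # tunnel cuts the client off entirely; it must re-authenticate and
        # re-validate against policy before regaining any access.
        session = self.active.pop(user, None)
        self.audit.append((user, session, reason))
        return session is not None
```

Because every packet from the client rides the tunnel, revoking the single session is equivalent to unplugging the machine, and the log entry for the offense is unambiguous.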

We previously mentioned the control and authorization of peer-to-peer traffic as a roadblock to success, but it is also a powerful tool in promoting security and assurance. If users are not able to communicate with peers by default, outbreaks of client-side viruses and attacks will be greatly attenuated. Users who do not meet the enterprise security policy (AV, personal firewall, etc.) won't even be able to access the production network or their authenticated peers, drastically reducing the number of outbreaks. Even users who meet the policy but lack the most recent patch, or are vulnerable to a zero-day event, can neither attack nor be attacked by other client nodes on the network. This is a tremendous advantage. Almost as advantageous is the ability to authorize and control peer-to-peer traffic, which allows the enterprise to provide additional assurance that an authorized individual or device has not intentionally or mistakenly been given access to unauthorized individuals or devices. While not necessarily bulletproof, it is certainly more than can be said of today's enterprise network.

The list of possible opportunities is virtually endless, and while not all of them are strictly related to "encryption everywhere", it is the vital stepping stone to many of them and the key to assurance for any of them.

The Future of Information Security
It is time to take a step back and evaluate our existing security architectures for what they are: an endless attempt to patch and secure each new class of vulnerability as it is discovered and understood. To use an analogy, the understood security "hole" when ACLs were first implemented was only a few feet wide. We soon discovered that the "hole" was more likely tens of feet wide, and deployed firewalls to bridge the difference between what ACLs could cover and the new understanding of the security risks. With each new security risk discovered, the "hole" continues to grow, and we continue to patch the bridge crossing this chasm with various new technologies. What is left is a discordant, mismatched patchwork of security implementations, precariously balanced across the "hole", each with its own unique management, maintenance, strengths and weaknesses. It is time to replace this organically grown security architecture with a single design created specifically to cross the entire chasm as we know it today, ideally built with the ability to extend itself as the security risks continue to grow.

Re-architecting the network from a security standpoint is long overdue. Re-architecting the network based on the requirements of an "encryption everywhere" paradigm takes us one step closer to being able to employ universal access methodologies. Universal access methodologies give us the new foundation from which to build the security architectures to handle the next decade of technological development and regulatory compliance in a more proactive, rather than reactive, manner.

The future of information security lies in the adoption of universal access technology, which replaces the complexity of network-based, patchwork security implementations with a more transactional, communications-based simplicity involving far fewer parameters: the client, the host, the communications channel, and whether the communication is authorized and valid in the context of the attempt. By promoting a single point of entry and point-to-point encryption, these solutions provide the security required, a single point of policy enforcement, and a single point of augmentation as new risks are understood, while also providing centralized, detailed auditing and a drastic reduction in complexity; the keys necessary to provide access compliance, beyond a doubt.

About the Author
Ken Salchow, MCSE, CCNP, CCE, C|EH, CISSP, has been employed by F5 Networks, Inc. for the past five years, where he has served in several capacities, currently as a Product Marketing Manager. In addition, he is the owner/operator of Binary Forensics, LLC, a boutique computer forensics lab serving the legal community in criminal and civil litigation, and Digital Interlopers, LLC, a boutique penetration-testing organization serving small/medium business entities.

Article © Copyright 2006 F5 Networks, Inc. Used by permission.