Adaptive Threats and Defenses

Sean Price

The survival of living organisms is often dependent on their ability to compensate for changes in their environment. The ability of an organism to compensate for changes encountered is referred to as adaptation. Predominantly, the methods of adaptation involve changes in the organism's behavior, physical characteristics, or both. Some creatures are able to learn new skills or tricks that allow them to cope when changes occur. In other cases an organism might undergo a genetic mutation that provides it with a slight advantage over its rivals, allowing it to survive better under the changed conditions. Adaptation can also occur through a combination of altered behaviors and new mutations. The ability to adapt is also exhibited in the cyber realm by threats and defenses. This article is primarily focused on the adaptability of attacker malware and defender security tools.

Threats and defenses have evolved over the years. The emergence of the first forms of malware and hacker tools was followed by defensive tools and techniques. As new methods of attack are pursued, defensive measures arise to counter them. This constant struggle between attackers and defenders is sometimes referred to as an ongoing arms race. The goals of attackers and defenders are directly opposed: attackers seek to exploit a system, while defenders attempt to prevent compromises. The objectives of each of these competitors can be summarized as follows:

Threat Objectives                            Defense Objectives
Discover new weaknesses                      Counteract known threats
Exploit new and old vulnerabilities          Detect deviations from normal activity
Hide presence                                Identify abuse of the system
Retain a foothold in compromised systems     Mitigate known vulnerabilities

Evolution of Threats and Defenses
Over time, the objectives of threats and defenses have not changed much. However, the methods used to achieve those objectives have substantially evolved. In the early days, threats served a single purpose and could generally be categorized according to their attack vectors. Initially, the taxonomy of malware was predominantly marked by viruses, worms, backdoors, keystroke loggers, and Trojan horses. Human threats included hackers, crackers, and social engineers. Adaptations soon appeared with the emergence of malware such as spyware and remote-access Trojans. Similarly, the human threat evolved with the new uses of spam and phishing techniques. More recently, threats and defenses began to exhibit adaptability by using techniques from different categories. The use of multiple categories is regarded as a compound threat or defense.

Attackers quickly learned that combining attack vectors enabled deeper penetration and more automation. Malware authors began to incorporate a variety of attack methods into their code. Instead of a worm simply infecting one system after another through a single exploit, it would drop packages enabling further compromise of the system. Bots, for example, are a recent and perhaps the most troubling evolutionary step in malware. They automate much of the manual activity previously accomplished with hacker tools.

To a lesser extent, compound defenses have emerged. Many security products now incorporate multiple defensive measures such as antivirus, anti-spyware, phishing filters, spam blockers, and firewalls. These efforts appear to be driven more by consolidation and rivalry among security vendors than by focused efforts to counter malicious code. The impact of compound defenses seems much less substantial than the effect of compound attacks.

Adapting Threats
The ability of a threat to retain its relevance is strongly tied to its capability to adapt. Automated and external human threats often exploit a weakness to gain further access within a system. As weaknesses are corrected or countermeasures put in place, the relevance of a threat is diminished. Threat agents must therefore modify their malware to adapt to the changes that prevent or restrict their tools from performing their devious tasks.

It is important to bear in mind that automated threats such as malware are largely dependent on the hackers who code them. Aside from polymorphism, malware adaptations are strongly linked to human intervention. Even then, the source code driving the polymorphism is a human-generated response to the need to evade detection. In the not too distant future, malware integrated with machine intelligence may be capable of generating original source code, discovering new vulnerabilities, and creating unique methods to exploit them. Although some might suggest that view is close to reality, there is little evidence that machine intelligence is close to achieving this level of abstract cognition. Nevertheless, it is likely that some malware will incorporate some or all of these attributes on a limited scale.

The evolutionary nature of threats is manifested from two points of view. First, changes in the behavior of the threat provide one method by which an adaptation can be achieved. In this regard, behavior reflects the actions malware takes to achieve its objectives. Relevant actions include file operations, registry manipulation, network activity, and process spawning. The second point of view is that of mutation. The predominant manifestations of a mutation involve changes to the code logic or structure of the binary. Changes in behavior will likely induce mutations in the underlying code. However, a mutation is also a tactical maneuver supporting adaptations that allow the threat to avoid detection.

Behavioral Changes
The actions and activity of a threat are indicators of its behavior. The longer a threat uses the same behavior, the more likely it is that defenses will detect it and deploy countermeasures against it. Threats, therefore, adapt by changing the methods and techniques used to attack and retain a foothold in a victim system. Continuous adaptations in malware, such as bots, are a common occurrence.

Attack Vectors: Attackers regularly seek new methods to accomplish their objectives. An attack vector comprises the methods and techniques used to exploit a particular vulnerability; it is essentially the cumulative steps taken to exploit the flaw. For any given vulnerability, there may exist a multitude of ways to exploit it. Threats can adapt their attack vector behavior by modifying the actions pursued to compromise the system.

Vulnerability Exploitation: A threat agent may attempt to exploit one or more vulnerabilities to achieve its objective. In time, due to awareness and flaw remediation, a targeted vulnerability might disappear, become irrelevant, or prove too difficult to effectively exploit. To remain pertinent, the threat must be capable of choosing different vulnerabilities to attack. A changing list of vulnerabilities to choose from provides the threat with a means to alter its attack behavior. This behavior enables the threat to adapt to environments where some vulnerabilities are mitigated. Bots, in particular, are often coded with the capability to exploit multiple vulnerabilities. Having the ability to select among vulnerability options can also make it more difficult for defensive mechanisms to target a particular threat.

Command and Control: Much of the malware in the wild today relies on some form of command and control. This enables the threat agent to communicate with and direct the activities of the malware. With respect to botnets, command and control is recognized as an important aspect of their value. The three main communication methods can be categorized as:

  • Independent: In this category the malware opens a communication channel and listens for commands. The listening activity could involve TCP ports, UDP ports, or both. In these cases the malware does not know where its commands will come from.
  • Centralized: Some malware knows how to contact its master. This could be a particular website or email address, but the most common is an Internet Relay Chat (IRC) channel. In these cases the malware looks to a primary address to receive commands.
  • Decentralized: One trend among attackers is to organize the malware as a collective entity. This has the advantage of improving the malware's survivability and can make it more difficult to detect the origin of command and control. This type of control is similar to peer-to-peer (P2P) networks.

A threat can adapt its behavior within each of these categories. For instance, malware using an independent method of command and control could change the port on which it is listening. It may also change the application protocol used, imitating other known services or adopting something completely novel. Centralized threats can exhibit behavior change by contacting different or new centralized command centers to obtain instructions. Lastly, decentralized malware might try to mimic legitimate P2P traffic, change its underlying application protocol, or use encryption. It is worth noting that a sufficiently "intelligent" malware agent might be capable of selecting among all three categories. This type of behavior could make it more difficult for network monitoring to detect its presence.
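From the defender's side, one simple counterpart to the independent category is a periodic sweep of listening ports compared against an approved baseline. The following is a minimal sketch of that idea, assuming the third-party psutil package is installed (elevated privileges may be needed on some platforms); the port allowlist is illustrative rather than a recommendation.

    # Minimal sketch: flag unexpected listening ports that could indicate an
    # "independent" command-and-control listener. Assumes the third-party
    # psutil package; the allowlist below is purely illustrative.
    import psutil

    EXPECTED_LISTENERS = {22, 80, 443}  # hypothetical baseline of approved ports

    def unexpected_listeners():
        findings = []
        for conn in psutil.net_connections(kind="inet"):
            if conn.status == psutil.CONN_LISTEN and conn.laddr.port not in EXPECTED_LISTENERS:
                proc = psutil.Process(conn.pid).name() if conn.pid else "unknown"
                findings.append((conn.laddr.port, proc))
        return findings

    if __name__ == "__main__":
        for port, proc in unexpected_listeners():
            print(f"Unexpected listener on port {port} (process: {proc})")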

System Interaction: In most cases, a threat exploiting a weakness results in the appearance of executable binaries on the compromised system. The binaries are usually standalone but could be attached to other objects, as in the case of a virus. These malicious software components may interact with the system in a variety of ways. Some of the most common instances follow.

  • Executables: An executable file runs as an independent process. The threat might be contained entirely in the executable, or the executable may serve only to initiate other activities such as downloading additional malware.
  • Extensions: A malicious library could be used to extend existing malware or be loaded into legitimate applications, allowing it to run more discreetly.
  • Injections: Similar to extensions, a library might be injected into the execution space of another process. Some viruses create their own thread of execution within the host process. Although it is not truly an injection, the execution of a virus has many of the same implications.
  • Rootkits and Drivers: Malware at this level has the capability to hide its activities or that of participating malicious processes.

Threats can alter their behavior by interacting with the system through any and all of the aforementioned methods. By changing the way it interacts with a system, a threat increases the likelihood that it will avoid detection or become more difficult to remove.
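As a defender-side illustration of the extension and injection categories, the sketch below lists the modules mapped into a named process so an analyst can compare them against the libraries the application is expected to load. It assumes the third-party psutil package and sufficient privileges; the target process name is purely illustrative.

    # Minimal sketch: list files mapped into running processes so that
    # unexpected libraries (possible extensions or injections) can be reviewed.
    # Assumes the third-party psutil package; elevated privileges may be needed.
    import psutil

    def mapped_modules(process_name):
        modules = set()
        for proc in psutil.process_iter(["name"]):
            if proc.info["name"] != process_name:
                continue
            try:
                for region in proc.memory_maps():
                    if region.path:  # ignore anonymous memory regions
                        modules.add(region.path)
            except (psutil.AccessDenied, psutil.NoSuchProcess):
                pass
        return sorted(modules)

    if __name__ == "__main__":
        for path in mapped_modules("explorer.exe"):  # illustrative target process
            print(path)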

Storage and Configuration: The code enabling the threat to execute on a system must exist somewhere in storage and also requires some configuration method to prompt its activation. Threats will alter their behavior by a variety of methods that obfuscate their storage locations. They may change their file names and/or extensions to hide their presence. Configuration entries are generally needed to ensure that the threat is launched regularly.

  • Obscure File and Directory Names: This can include names that are randomized. Some threats create random names while others select from a pre-populated list within the malware package.
  • Alteration of Registry Entries: Some threats create their own entries or rely on the entries of legitimate software.
  • Alternate Data Streams: Malware hiding in an alternate data stream is not easily observed with standard system tools.
  • Changes to Configurations Files: This is similar to the methods used for registry entries.
  • Masquerading as Legitimate Files: A threat might select a file name that is similar to legitimate software. In some cases the threat might actually rename or replace a legitimate binary file. The malware then is loaded whenever calls are made to the replaced file. In these cases the threat will proxy the requests to the actual library whether it is in another file or is included with the threat itself.

Malware will often change its storage locations and configuration methods to remain a step or two ahead of defensive countermeasures. This adaptive behavior enables modern threats to persist within a compromised system.
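On the defensive side, autorun configuration points are a natural place to look for this kind of persistence. The following Windows-only sketch uses the standard winreg module to enumerate the current user's Run key and flag entries that are not part of a documented baseline; the baseline names are illustrative.

    # Minimal sketch: enumerate autorun ("Run") registry entries and flag any
    # that are not in a documented baseline. Windows-only; uses the standard
    # winreg module. The baseline below is purely illustrative.
    import winreg

    RUN_KEY = r"Software\Microsoft\Windows\CurrentVersion\Run"
    BASELINE = {"OneDrive", "SecurityHealth"}  # hypothetical approved entries

    def unexpected_autoruns(hive=winreg.HKEY_CURRENT_USER):
        findings = []
        with winreg.OpenKey(hive, RUN_KEY) as key:
            value_count = winreg.QueryInfoKey(key)[1]  # number of values under the key
            for index in range(value_count):
                name, command, _type = winreg.EnumValue(key, index)
                if name not in BASELINE:
                    findings.append((name, command))
        return findings

    if __name__ == "__main__":
        for name, command in unexpected_autoruns():
            print(f"Unexpected autorun entry: {name} -> {command}")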

Recruiting: A small number of threat agents can multiply their capabilities by duplicating their efforts. The common denominator of this behavior is enticing people to execute their malicious code. Malware increasingly acts as the intermediary between the human threat agent and the human victim. A number of enticements are commonly used to recruit new victims.

  • Trojaned Freeware: An offer for a new free toy with hidden strings attaching to the victim's system.
  • Pornography: The promise of a gratuitous glimpse into an act of indiscretion through a link or attachment.
  • Financial: A lure to easy riches that turns out to be true for the attacker.
  • Spyware: The participant is promised reduced rates or access to a particular application by using a particular software product. Oftentimes, much more is disclosed than what was agreed to.
  • Scareware: The end user is informed that their system is infected with malware and encouraged to download an antivirus tool to help clean their system. They are commonly enticed to purchase the fake antivirus product. This is reportedly a big income generator for many attackers.
  • Phishing: Masquerading as a legitimate entity, such as an online bank, but really duping the email recipient into disclosing their private information.
  • Problem Solving: Unwitting participants solve reverse Turing test problems, such as CAPTCHAs, that are too difficult for malware to solve on its own. Often this is used with other enticements such as pornography.

Threat Mutations
Code updates to malware have attributes parallel to evolution in living organisms. In time, a given piece of malware must adapt or it will be more readily recognized by defensive measures such as antivirus and anti-spyware tools. Mutations in this regard are essential for the malware to maintain relevance, and mutation to avoid detection is a common malware tactic. The following summarizes some of the reasons why threat mutations are an aspect of adaptation.

Defeat Signatures: An unchanging, static attacker becomes easier to detect over time. Researchers and security product vendors constantly seek the telltale signs and behaviors of attackers. Once the attributes are learned, the data is compiled into tools and techniques that can be used to detect the presence of an attacker. Threats must, therefore, change their code and alter techniques to defeat signature analysis. New versions of the malware or polymorphic techniques are common methods used to defeat signature-based defenses.
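The fragility of exact-match signatures can be seen in a toy example: hashing a sample and a one-byte variant of it yields entirely different digests, so a hash-based signature for the first will never match the second. The sketch below uses benign stand-in byte strings rather than real malware.

    # Minimal sketch: a hash-based "signature" fails after even a one-byte
    # mutation of the sample, illustrating why static signatures must be
    # regenerated as malware changes. Benign byte strings stand in for samples.
    import hashlib

    original = b"MZ\x90\x00example-sample-bytes"
    mutated = b"MZ\x90\x00example-sample-byteZ"  # a single trailing byte changed

    sig_original = hashlib.sha256(original).hexdigest()
    sig_mutated = hashlib.sha256(mutated).hexdigest()

    print("original:", sig_original)
    print("mutated :", sig_mutated)
    print("signature still matches:", sig_original == sig_mutated)  # False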

Code Improvements: Some malware is just plain buggy. It is not uncommon for malware to cause poor performance or even disrupt applications. In recent years, the shift from hacking by rogue amateurs to nation-states and organized crime has been accompanied by improved code reliability. Today malware is less likely to affect performance. However, it is important to consider that operating system improvements may have also contributed to more stable performance even when misbehaving applications are present. In any case, attackers will regularly attempt to upgrade their malware, allowing it to better adapt to its environment.

Detection Avoidance: Over time, malware methods have become more sophisticated. Much of the improvement effort relates to techniques that hide the presence of the malicious code. Malware adaptations increasingly disguise their activities to mimic legitimate system and network activity. Several years ago much of the malware resided in a single executable or included a small number of libraries. The existence of these tools was often readily observable in the file system, and they could be seen as distinct processes when executing. The next evolutionary jump emerged as add-ons to existing products. Malware increased its ability to cloak its activities by taking advantage of legitimate software features that enable extensions. Examples include system hooks, add-ons for office productivity software, and browser helper objects. By running as a loaded module, the malware avoids detection by some process-monitoring techniques but is still observable through system tools. In the case of extensions, the malware plays by the rules of the operating system. In contrast, other techniques such as vulnerability exploits and process injection are used to force a target application to run the attacker's code of choice. These approaches to detection avoidance are stealthier and not easily identified. The latest evolution of detection avoidance involves the use of rootkits and system drivers. These methods allow the tool itself and accomplice malicious processes to operate largely undetected by most system and security tools. Malware with stealth capabilities hides its behavior by intercepting and filtering application programming interface (API) calls that could be used to reveal its presence. The trend has gone from brazen attacks defacing popular sites, or sensational attacks against a well-known Internet presence, to discreet compromises as chilling as any clandestine espionage activity could achieve.

Added Capabilities: New features incorporated into malware increase its value and potentially expand the influence of an attacker. Increased capabilities represent a maturation of the malware, which is a type of adaptation for survival. For example, an update might give the attacker the capability to scan other hosts for weaknesses or act as a relay for other malicious activity. Increased capability enables the threat to adapt to an environment and potentially sustain or propagate its existence.

New Objectives: An attacker may periodically change targets or attack vectors. This is a common occurrence in botnets where the bot-herder rents out the zombies to service their customer requests (Geer 2005). The ability to change objectives is a tactical adaptation that makes for a superior weapon. Older malware usually had limited objectives that were not altered. Nowadays malware can attack new targets using different vectors or exploits through the receipt of software modules embedded with the new objectives and commands (McLaughlin 2004).

Upgrade Survival: Over time, systems are upgraded or reinstalled. Threats must be able to adapt to new technologies for their relevance to remain. As an example, a system owner might migrate from one technology, such as an email client or a web browser, to another. An adaptable threat will be able to accommodate the change and continue unimpeded if the migration represents a vector for exploitation. Upgrades to the underlying operating system can also affect the ability of the threat to endure. Adaptable threats anticipate or respond to these changes through code changes that allow their existence to continue. Although a threat outside of a supply chain injection may not be capable of surviving a fresh installation of the OS or applications, it can attempt to persist by incorporating itself with legitimate applications and data. A threat that carefully infuses itself into data and applications captured by managed backups can survive upgrades and restorations.

Self-Preservation: It is not uncommon for malware to disrupt, disable, or destroy security controls in a system. Some aggressive threats will alter access controls to protect themselves. Others reportedly disable host-based firewalls and antivirus software (Abu Rajab, Zarfoss, Monrose, and Terzis 2006). This sort of activity ensures communications with the threat agent will remain intact. Yet other malware may be so bold as to delete programs or audit data that could be used to detect or disrupt it. The techniques and methods used for self-preservation must adapt according to changes in technology and the environment of the compromised system.

Competition: Imagine that a zombie computer is under the influence of different bot-masters. Serving multiple masters might produce erratic behavior on the machine. There appears to be an unspoken consensus in the evil realm of malware creators that a zombie should not exhibit personality disorders. This perceived consensus is most likely imaginary. In reality, some malware attacks and removes other malware (Osorio and Klopman 2006). Additionally, some malware reportedly patches existing vulnerabilities (Abu Rajab, Zarfoss, Monrose, and Terzis 2006). The reasons for this competition probably include rivalry, dominance, or economic advantage. In this regard, competition among living organisms spills over into the cyber realm and is witnessed as malware-on-malware attacks. The escalation of malware competition is yet another dimension of the adaptable nature of threats.

Adaptive Defenses
Agile defenses are necessary to counteract adaptable threats. Defensive countermeasures are continuously challenged by the rapid changes they must deal with. On one hand defenses must adapt to changes in their environment. Network expansions and new technology can easily introduce exploitable weaknesses. On the other hand, threat aggressiveness continues to escalate. The rapid evolution of malware puts continuous pressure on defenses to adapt. From the perspective of a defender, adaptation is an imperative that must meet the challenges of environmental changes while remaining competitive with adaptive threats.

Attackers continuously conduct new and inventive assaults on network defenders. New attack methods brought about by malware adaptability are met with adaptive defenses. The discoveries of new attacks are often shared in the security community. Conjectured exploits by security researchers or actual exploits discovered on the Internet are reported by numerous public and private organizations. This new information is often integrated into defensive countermeasures, resulting in an adaptation to the threat.

Defenses exhibit adaptation through behavior modification and mutations. The objectives of defenses tend to be more reactionary to threat activity. In contrast, the adaptations of threats are more exploratory and proactive. In this regard, defense adaptations tend to lag those of threats.

Behavior Modification
Defensive controls face more challenges than do their nemeses. Tools used to defend a system require behavior modifications to account for changes to the environment as well as proliferating and changing threats. Behavior modifications entail the methods used to accommodate the rapid changes in the organization, technology, and known threats.

Frequency: Adaptive defenses may alter the frequency with which they conduct their surveillance. For instance, vulnerability scanners might ordinarily be used on a monthly basis. If the organization is experiencing substantial growth or more frequent compromises, the frequency of the control is increased. The timeframe between moments of detection activity presents an opportunity for a threat to attack a system. Increasing the frequency of a control is an adaptive approach to counteracting rising malware activity.

Breadth: Security controls are not always deployed in every possible location in a network due to resource constraints. Furthermore, a control might also peer into only a narrow band of activity in its attempt to identify attacks. In some cases, the purview of a control can be expanded to cover a larger area. This could be through increased instances in a system or through expansion of the band of activity monitored. Altering the breadth of a control impacts its behavior. Essentially, increasing the horizontal reach of the control adapts the behavior of the defensive mechanism so that more malicious activity can be detected.

Depth: Viewing activity in a system from the perspective of the Open Systems Interconnection (OSI) model provides a vertical perspective regarding the behavior of a security control. A security control might ordinarily operate at only one layer of the model. An adaptive defensive tool might occasionally perform inspections at other layers of the model to detect attacks or actual compromises. This type of capability demonstrates a change in behavior that could be very useful in detecting adaptive threats.

Indicators: Many defensive mechanisms rely on precompiled indicators or signatures of known attacks. Adaptable defenses have the capability to compare system activity against new and prior indicators to detect active attacks. Adaptability through indicators is one of the most predominant behavior modifications of defensive components.

Baselines: Well-managed systems have a number of documented baselines. These include baselines for hardware, software, network connectivity, and configurations. Defensive tools with the capability to detect system components and configurations can validate baselines. Security controls with the ability to make comparisons between system changes and a documented baseline exhibit adaptable behavior.
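A minimal sketch of this kind of baseline validation is shown below, assuming the defender already has a documented software baseline and a way to collect the current inventory; the package names and versions are illustrative.

    # Minimal sketch: compare a discovered software inventory against a
    # documented baseline and report deviations in both directions. The
    # inventory source, package names, and versions are illustrative.
    documented_baseline = {"openssh-server": "9.3", "nginx": "1.24.0"}
    discovered = {"openssh-server": "9.3", "nginx": "1.22.1", "netcat": "1.10"}

    unauthorized = set(discovered) - set(documented_baseline)
    missing = set(documented_baseline) - set(discovered)
    version_drift = {
        name: (documented_baseline[name], discovered[name])
        for name in set(documented_baseline) & set(discovered)
        if documented_baseline[name] != discovered[name]
    }

    print("Unauthorized packages:", unauthorized)
    print("Missing packages:", missing)
    print("Version drift:", version_drift)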

Learning: Perhaps the most intriguing representation of behavior modification occurs when a security control actually learns something. Machine learning techniques are commonly found in security tools designed to look for anomalous activity. Two common applications of machine learning are intrusion detection and spam filtering. Intrusion detection products make use of neural networks and support vector machine algorithms. Spam filters typically use Bayesian techniques. In both cases, the tools learn what is normal versus what is not and raise alerts when anomalous features are encountered.
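A hedged sketch of the Bayesian technique follows, using scikit-learn's multinomial naive Bayes classifier; the four training messages are illustrative and vastly smaller than any real training corpus.

    # Minimal sketch of the Bayesian approach used by spam filters: the model
    # learns word statistics from labeled examples and classifies new messages.
    # Assumes scikit-learn is installed; the tiny training set is illustrative.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline

    messages = [
        "limited time offer claim your free prize now",
        "act now exclusive deal free money guaranteed",
        "meeting moved to 3pm see agenda attached",
        "please review the quarterly security report",
    ]
    labels = ["spam", "spam", "ham", "ham"]

    model = make_pipeline(CountVectorizer(), MultinomialNB())
    model.fit(messages, labels)

    print(model.predict(["claim your free prize today"]))       # likely 'spam'
    print(model.predict(["agenda for the security meeting"]))   # likely 'ham'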

Interactions: In the near future, technologically advanced security controls will receive data from multiple sources. This will provide the security control with the ability to form a more coherent picture of the cyber landscape. A security control with this advanced capability would be able to make predictions or advise human counterparts and other participating security controls of the current security state of the system. Given the situational awareness gained from this influx of data, these advanced tools will exhibit behavior unlike that of their archaic predecessors. The interaction among these tools will form a collective that shares threat information and alters its behavior accordingly. For now, we rely on the interactions and sharing among humans to influence the most robust defensive control, the information security professional.

Defense Mutations
Most of the behavior modifications are realized through mutations of the affected defensive control. Adaptation by way of mutation is, for the most part, straightforward where security controls are concerned. It is important to note that a mutation need not necessarily be compiled. The inclusion of any sort of logic enabling adaptability qualifies as a mutation. Some of the most prominent mutations employed to achieve adaptability follow.

Signatures: Features, behaviors, and characteristics of malware and indicators of threat activity comprise attack signatures. In many instances the files containing the signature information are compiled into a library or proprietary data file. From this perspective, the inclusion of the new signatures results in a mutation of the defensive tool.
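A minimal sketch of the simplest form of signature matching, a byte-pattern scan, follows; the signature entries and scan directory are illustrative stand-ins for a vendor's signature library, and a production scanner would stream file contents rather than read each file whole.

    # Minimal sketch: scan files under a directory for known byte patterns,
    # the simplest form of signature matching. Patterns and directory are
    # illustrative stand-ins for a real signature library.
    from pathlib import Path

    SIGNATURES = {
        "demo-sig-1": b"\xde\xad\xbe\xef",
        "demo-sig-2": b"EVIL_MARKER",
    }

    def scan(directory):
        hits = []
        for path in Path(directory).rglob("*"):
            if not path.is_file():
                continue
            try:
                data = path.read_bytes()
            except OSError:
                continue  # skip unreadable files
            for name, pattern in SIGNATURES.items():
                if pattern in data:
                    hits.append((str(path), name))
        return hits

    if __name__ == "__main__":
        for path, name in scan("./samples"):  # illustrative directory
            print(f"{path}: matched {name}")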

Rule Sets: This type of mutation consists of multiple "if-then" statements. Rule sets permit the logical evaluation of witnessed activity. Some rule sets are ordered to form logic trees. This allows for granular decisions based on collected information. Changes in the environment or attacker behavior may require changes in rule sets. Altering rule sets according to changes or trends enables adaptability through this type of mutation.
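The sketch below shows one way an ordered rule set might be represented and evaluated; the rules, field names, and first-match-wins policy are illustrative choices rather than any particular product's format.

    # Minimal sketch: evaluate witnessed events against an ordered rule set of
    # "if-then" conditions. Rules are checked in order and the first match wins;
    # the rules and event fields are illustrative.
    RULES = [
        {"name": "block known bad host",
         "if": lambda e: e.get("src_ip") == "203.0.113.66",
         "then": "block"},
        {"name": "alert on repeated auth failures",
         "if": lambda e: e.get("event") == "auth_failure" and e.get("count", 0) > 5,
         "then": "alert"},
        {"name": "default",
         "if": lambda e: True,
         "then": "allow"},
    ]

    def evaluate(event):
        for rule in RULES:
            if rule["if"](event):
                return rule["name"], rule["then"]

    print(evaluate({"src_ip": "203.0.113.66", "event": "connection"}))
    print(evaluate({"event": "auth_failure", "count": 9}))
    print(evaluate({"event": "login", "count": 1}))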

Thresholds: Cumulative events can be used to activate security controls. For instance, a control monitoring access control failures might not raise an alert unless the number of failures exceeds 10 events in less than one second. A threshold of this type is designed to detect malware behavior: rapid, repetitive failures occurring in a period of time too short to be driven by a human manipulating a graphical user interface. The composition and attributes of thresholds can be changed to adapt to new threats or changes in the behaviors of known malware.
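A sketch of such a threshold follows, using a sliding one-second window over failure timestamps; the simulated events are illustrative and the parameters mirror the example above.

    # Minimal sketch: raise an alert when more than 10 failures occur within a
    # one-second window, matching the threshold described above. The simulated
    # failure timestamps (seconds as floats) are illustrative.
    from collections import deque

    class ThresholdMonitor:
        def __init__(self, max_events=10, window_seconds=1.0):
            self.max_events = max_events
            self.window = window_seconds
            self.times = deque()

        def record_failure(self, timestamp):
            self.times.append(timestamp)
            # Drop events that have fallen outside the sliding window.
            while self.times and timestamp - self.times[0] > self.window:
                self.times.popleft()
            return len(self.times) > self.max_events  # True -> raise an alert

    monitor = ThresholdMonitor()
    # Simulate 12 failures spaced 50 ms apart; the final events exceed the threshold.
    alerts = [monitor.record_failure(i * 0.05) for i in range(12)]
    print(alerts[-1])  # True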

Defensive Adaptation Weaknesses
The traditional model used by defenders has been to shore up weaknesses or adapt to threats by deploying new or modified tools counteracting the particular threat. Oftentimes this occurs over an extended period after a vulnerability is disclosed and exploit code is available. Sadly, adaptive defenses are primarily reactionary. Defensive measures usually target specific types of attacks deemed imminent. Rarely will an organization incorporate a new defensive measure aimed at a threat or attack vector that has not been experienced by the organization or its industry. The main reason for this line of thinking has to do with risk. An organization may deem a particular threat, likelihood, or loss to be minimal. Due to the prevalence of risk management by way of qualitative assessments, coupled with the ever-present problem of scarce resources, it is not uncommon for managers to be optimistic about their level of risk. As such, adaptive or forward-thinking defenses are not commonly deployed. This has the unfortunate side effect of causing defensive countermeasures to play catch-up with the attackers. This is evident in the relentless cycle of patching and signature updating.

Specificity: Defensive measures often rely heavily on specific signatures. The effort required to adapt to a new threat may be greater than that needed by the threat. Signature development can also require substantial time and effort, causing signatures to lag significantly behind a rapidly propagating threat. Considering the time and resources needed to detect, develop, and deploy an adaptation for a given threat, it seems that attackers have the upper economic hand.

Timeliness: The creation of a countermeasure for a given threat may come well after significant damage occurs. In some cases this reactive adaptation is too little, too late. It is now common for exploits for previously unknown vulnerabilities to be found in the wild. In this regard, defensive adaptations are entirely reactive with respect to weaknesses and exploits.

Growth Rate: New threats are beginning to emerge at a rate faster than defensive measures can adapt. A recent estimate claims that the volume of malicious code exceeds the production of legitimate software. One implication of this growth rate is that defenses may consume substantially more resources to determine whether a threat is present. This reduction in efficiency will likely inhibit the ability of defensive measures to adequately adapt to the ever increasing number of threats. Furthermore, the sheer volume also implies that more malware is circulating in the wild that is unknown to defensive product vendors, suggesting that false negatives (malware missed by detectors) will increase. A substantial growth rate has the effect of overwhelming our defenses with a numerically superior enemy.

Environment: Changes in the system environment present unique challenges. Growing organizations regularly add new network equipment to accommodate a growing user base. New technologies enabling increased productivity may also include new vulnerabilities. Defenses must not only adapt to system growth but new technologies as well.

Search Space: Adaptations that attempt to characterize what is normal in a system often fail due to complexity. Defensive techniques such as anomaly detection are commonly designed to look at everything to identify features that are not normal. This requires the defensive tool to look at the entire universe of possibilities. Tuning is often used to increase performance by reducing the search space. However, the search space is often still too large, allowing false positives to persist. In some cases, items ignored by the rule set can be abused by attackers and thus avoid detection.

Constraints: Whereas malware authors are free to attempt anything desired to achieve their objectives, defenders are much more constrained. The adaptability of defensive measures is reduced due to factors in their environment.

  • Financial: Adaptability often requires a monetary tradeoff. Whether the cost involves time, people, or materials the lack of sufficient financial resources can constrain defense adaptations.
  • Personnel: Adequately trained people must be assigned to monitor, respond to, and manage defensive controls. Adaptations that differ from existing controls are limited by the abilities of those assigned responsibility for them. A superior adaptation is of little use if the end users are unable to implement it properly.
  • Performance: An adaptation must not severely degrade system performance. Whereas malware can be careless about performance issues, defensive measures with performance problems are often unacceptable even when they provide an important adaptation to a class of threats.
  • Usability: Defensive adaptations that are effective, but too difficult to use will not find favor with those who need them most. Complicated adaptations will be abandoned or circumvented by humans who are attempting to accomplish a particular task.
  • Management: A properly managed system implements change control. However, this can impede deployment of the adaptable defense. An adaptable control might be altogether avoided if it is perceived to be too difficult to manage.
  • Operations: Effective defenses contribute to security operations. An adaptable defense that exists in a silo inhibits the flow of security operations.
  • Design: Ideal security controls are built-in rather than bolt-on. Unfortunately, most adaptable defenses are bolt-on. Integration efforts may be hampered by complexity of the tool.
  • Perceptions: Qualitative risk assessments might lead management to conclusions that a particular adaptable defense is unnecessary. Perceptions based on insufficient or inaccurate information inhibit the acquisition and deployment of adaptable defenses.

These constraints burden defensive adaptability. Constraints impact the ability of a defensive control to compete with the unbridled capabilities of the malware encountered. The competition between adaptable threats and defenses is becoming increasingly unbalanced in favor of malware. In this regard, attackers have a distinct advantage that is evident in the continued rise in compromises and data losses.

Strengthening Defensive Adaptations
Security is first and foremost a people problem. Weaknesses in systems are going to occur. All security problems have their root in people. Some programmers will make coding mistakes that are then missed by reviews and quality assurance. System integrators will occasionally put things together incorrectly. System administrators will introduce configuration errors or fail to follow procedures. Users will also make honest mistakes and fall victim to an attacker's trickery. Let us not forget that malware is an offspring of the warped efforts of bad people. All sorts of unsavory individuals such as criminals, spies, and terrorists are ultimately directing the actions of malware. Our goal as security professionals should be not only to employ adaptive defensive technology, but also to establish adaptive operations that are proactive. In this regard, the reactive nature of our current adaptable defenses can be augmented with techniques and processes that are prepared for the worst. Consider some of the following during the design, implementation, and management of security operations for information systems.

Anticipate Compromises: Develop an attitude that the best plans will eventually be circumvented. Manage stakeholder expectations by advocating proactive measures that can be used as early warning detection of failed countermeasures. Note areas within the system that are at higher risk for compromise and conduct more frequent reviews.

Response Plans: Contingency planning and incident response are invaluable tools that can be used to prepare for the eventual compromise in a system. Having a plan is great, but it is only as good as those who are sufficiently familiar with its guidance. The plans should be regularly practiced and updated when weaknesses are discovered. Ensure the plans address the actions required to clean the system and restore normal operations.

Penetration Testing: Periodically attempt to break into your system. Hire reputable professionals to do the same. Use some of the same tools attackers use to compromise a system. Penetration testing should be used to exercise contingency and incident response plans.
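As a small illustration of the kind of tooling involved, the sketch below performs a basic TCP connect scan using the standard socket module; the target address (from a documentation range) and port list are illustrative, and any scanning should be done only against systems the tester is explicitly authorized to assess.

    # Minimal sketch: a TCP connect scan of a few common ports on a host the
    # tester is authorized to assess. The target and port list are illustrative;
    # real engagements use purpose-built tools and explicit written authorization.
    import socket

    def connect_scan(host, ports, timeout=0.5):
        open_ports = []
        for port in ports:
            with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
                sock.settimeout(timeout)
                if sock.connect_ex((host, port)) == 0:  # 0 means the connection succeeded
                    open_ports.append(port)
        return open_ports

    if __name__ == "__main__":
        print(connect_scan("192.0.2.10", [22, 80, 443, 3389]))  # documentation address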

Operational Alternatives: Few, if any, software products have proven impervious to vulnerabilities. Unfortunately, new vulnerabilities seem to be reported weekly for some products. Critical vulnerabilities with exploitable code in the wild may subject an organization to unacceptable risk. At such times it may be prudent to deploy or have ready other products that can be used instead of the one with a critical vulnerability. Require the use of alternative applications until all instances of the vulnerable product are appropriately patched. Consider altering access controls on the affected application to prevent intentional or accidental use. The downside to operational alternatives is increased management complexity and cost. The cost of a potential exposure and cleanup should be compared with the periodic licensing and management expenses.

Defense-in-Depth: Traditionally, defense in depth relies on the overlapping of policy, people, and technological countermeasures. Although this is a good idea, it is proving too shallow. For instance, systems are often protected from malware by a policy that requires antivirus tools that are regularly updated, people to configure and use the tools, and the tools themselves deployed on workstations and servers. The problem with this approach becomes apparent when all of those mechanisms fail. Rather than use another antivirus product, it would be better to implement secondary controls that could be used to detect and/or prevent virus propagation. Access controls, auditing, least privilege, network segregation, and intrusion detection are just some of the tools that can serve double duty to detect and defend against malware. But they must be properly implemented and monitored to sufficiently detect the failure of the primary defense-in-depth controls.

Monitor for Changes: Ensure system managers have complete listings of the authorized hardware devices, software components, and their configurations in the system. Frequent sweeps of these aspects of the system should be conducted. Any changes not found in the listings should be immediately investigated. Issues identified should be corrected if inappropriate or included in the system listings if authorized. Identifying unauthorized changes to hardware, software, and configuration baselines is the most effective way to determine the existence of adaptable threats.
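One minimal form of such a sweep is a file-hash comparison against a stored baseline, sketched below; the baseline file name and the directory being swept are illustrative, and a real implementation would also cover hardware and configuration inventories.

    # Minimal sketch: sweep a directory tree, hash each file, and compare the
    # results against a stored baseline to surface additions, removals, and
    # modifications. The baseline file and swept directory are illustrative.
    import hashlib
    import json
    from pathlib import Path

    def hash_tree(root):
        hashes = {}
        for path in Path(root).rglob("*"):
            if not path.is_file():
                continue
            try:
                hashes[str(path)] = hashlib.sha256(path.read_bytes()).hexdigest()
            except OSError:
                continue  # skip unreadable files
        return hashes

    def compare(baseline, current):
        added = set(current) - set(baseline)
        removed = set(baseline) - set(current)
        changed = {p for p in set(baseline) & set(current) if baseline[p] != current[p]}
        return added, removed, changed

    if __name__ == "__main__":
        baseline = json.loads(Path("baseline.json").read_text())  # illustrative baseline file
        added, removed, changed = compare(baseline, hash_tree("/etc"))
        print("Added:", added)
        print("Removed:", removed)
        print("Changed:", changed)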


From Information Security Management Handbook, Sixth Edition, Volume 4, edited by Harold F. Tipton and Micki Krause. New York: Auerbach Publications, 2010.

 