Social Science, Psychology, and Security Awareness: Why?
In any book, guide, or article on information security, it is impossible to avoid a discussion of the role of people in an information security program. Information security, like everything else, is a human enterprise and is influenced by factors that impact the individual. It is well recognized that the greatest information security danger to any organization is not a particular process, technology, or piece of equipment; rather, it is the people who work within the "system" who pose the inherent danger.
One of the technology industry's responses to this danger has been the ever-important information security awareness program. A well-designed, effective awareness program reminds everyone — IT staff, management, and end users — of the dangers that are out there and things that can be done to defend the organization against them. The intent of this chapter is not to be a "how-to" on writing a security awareness program. There are numerous authors and specialists who have offered expertise in this field, as well as a plethora of reference materials that are available to everyone on the mechanics of writing an awareness program.
Rather, the main goal of this chapter is to explore and exploit the scientific body of knowledge around the psychology of how humans behave and make decisions. Using psychological principles that social scientists and psychologists have discovered over the past 50 years, we can produce security awareness programs that are more personal, relevant, and persuasive. Ultimately, knowing, understanding, and applying what we know about the engines of personal behavior will allow us to write more effective awareness programs.
Attitudes and Social Science in Everyday Life: Love Those Commercials!
Scientists have been studying the factors that drive and influence decision making and behavior for hundreds of years. Some scientists specialize in factors such as environment (heat, cold, pain) and biology (genetics, neuroscience). Because information security practitioners cannot really manipulate these factors for benefit in awareness programs, this chapter focuses on the work of a group of scientists called social psychologists, who have collected a wonderful body of knowledge that we can directly apply.
Some individuals doubt scientific knowledge and bemoan its lack of applicability to real life. Basically, is what social psychologists know of value (especially to information security practitioners)? The good news is that their findings have been widely known, accepted, and applied for years by a variety of groups and people to great effect. Examples include political campaigns, activists, and salespeople. However, social psychologists' knowledge of human behavior has been most effectively exploited in the field of advertising to persuade people to buy goods that, in many cases, they do not need. There is no reason why these same principles cannot be used to make security awareness programs more effective. After all, if people can be persuaded to buy a plastic singing fish for $29.95, they should be even more receptive to information that can actually benefit them, such as keeping their passwords secret.
Attitudes: The Basics
Before delving into a discussion of the various techniques for influence and persuasion, readers need to understand the basics of what we are trying to change. What structure or object in our minds are we trying to change to positively or negatively impact behavior? The answer to this question is our attitudes.
Attitudes are defined as our positive or negative response to something. For example, if I have a negative attitude toward privacy, I am more willing to give out network passwords and usernames to random, unauthorized people. If I have a positive attitude toward a new corporate security awareness program, I am more likely to abide by it as well as be a proponent. As you can clearly see, attitudes not only define our "feeling" toward something, but also play a role in our behavior. We, as information security professionals, need to be aware of attitudes (their structure and function) for three reasons:
1. Predictor of behavior. Attitudes are a good predictor of behavior. That is why surveys are an invaluable tool in an overall security program. If you can determine the target population's attitudes toward information security issues such as privacy and confidentiality, you can use that information to predict how secure your environment will be. For example, if you have a large call center population with a measured negative attitude toward privacy, you can reasonably predict that the employees are not practicing good work clean-up habits, e.g., shredding trash and logging out of workstations.
2. Targets of change. Attitudes can be targeted for change. If you can subtly or directly change someone's attitude, you can consequently change behavior. It is often easier to change behavior through an attitude shift than to change behavior directly. For example, a learned, repeated behavior such as leaving a workstation logged in while away is difficult to change directly. However, a strong emotional appeal toward the individual's attitude about confidentiality might have a better effect.
3. Source of risk. Attitudes are a source of risk for an information security professional. Extreme attitudes toward someone or something can lead to irrational cognitive function and behavior. This is one of the most feared situations for an information security manager, because it cannot be rationally predicted. Although an individual might "know" and "feel" that what he is doing is wrong, he might still be blinded by rage, love, or obsession into destructive behavior such as stealing, inappropriate access, confidentiality violations, etc.
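The survey-based prediction described in the first point above can be sketched in code. This is a minimal illustration, not a real survey tool: the work groups, the 5-point scale, the responses, and the risk threshold are all assumed for the example.

```python
# Minimal sketch: scoring a 5-point Likert attitude survey to flag
# groups whose measured attitudes predict risky security behavior.
# Groups, responses, scale, and threshold are illustrative assumptions.

from statistics import mean

# Responses: 1 = strongly negative ... 5 = strongly positive attitude
# toward privacy and confidentiality, keyed by work group.
responses = {
    "call_center": [2, 1, 3, 2, 2, 1],
    "legal":       [4, 5, 4, 4, 3],
}

RISK_THRESHOLD = 3.0  # assumed cutoff: below this, predict poor habits

def attitude_report(responses, threshold=RISK_THRESHOLD):
    """Return each group's mean attitude score and a risk flag."""
    report = {}
    for group, scores in responses.items():
        avg = mean(scores)
        report[group] = {"score": round(avg, 2), "at_risk": avg < threshold}
    return report

for group, result in attitude_report(responses).items():
    print(group, result)
```

A call center scoring well below the threshold would, per the reasoning above, be predicted to have poor clean-up habits and become a candidate for targeted awareness messaging.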
Attitude Structure and Function: The ABC's of the Tripartite Model
For the past 30 to 40 years, the immense practical value of studying attitudes has driven social psychologists' research. During that time, they have learned a great deal about attitudes through experimentation, population studies, and statistical analysis. One result of their labor is a model of attitudes called the Tripartite Model. The Tripartite Model, also known as the ABC Model, presents attitude as an amalgam of three separate measurable components: affect, behavior, and cognition.
1. Affect. The affective component is the emotional aspect of our attitudes. Our feelings toward an object or subject play an important role in determining our attitudes. We are more likely to participate and do things that make us feel happy or good. Our aversion to things that elicit feelings of guilt, pain, fear, or grief can be used to change attitudes and, eventually, behavior. The affective appeal to our attitudes is common in TV commercials that make us laugh (e.g., beer commercials) or make us afraid (e.g., an alarm system), thus changing our attitudes toward a certain product.
A security awareness program can easily be written to appeal to these emotional responses. An excellent example of this phenomenon is the series of identity theft commercials that depicts the results of someone stealing someone else's credit card number.
2. Behavior. The behavioral component is derived from the fact that our behavior serves as a feedback mechanism for our attitudes. In short, "doing" leads to "liking." In an ingenious experiment, two randomly selected groups of subjects were asked to rate how much they liked a cartoon they were watching. The two groups watched the same cartoon, with only one group biting a pencil to simulate the facial muscles of a smile. It was found that the group that bit on a pencil rated the cartoon as much more amusing and likeable than the group that did not. Similar experiments with a variety of tasks suggest that forcing yourself to do something you may not like, such as regularly changing network passwords, may eventually change your attitude toward it.
3. Cognition. The cognitive component is the thoughtful, thinking aspect of our attitudes. Opinions toward an object or subject can be developed based solely on insightful, process-based thinking. It is no wonder that the nature of TV commercials during news programs is radically different from those aired on Saturday mornings. During news programs, people are more likely to be processing information and "thinking." Therefore, advertisers, with the help of social psychologists, have been attacking the cognitive component of our attitudes toward cars, cell phones, and other products, listing features and benefits (for cognitive processing) rather than using imagery.
Examples: The Tripartite Model and Customizing Security Awareness
A better understanding of the structure of attitudes allows us to more effectively customize our awareness program toward the target audience. Consider the following business environments and their security awareness requirements. Think about what component of the ABC Model of Attitudes is the most likely to result in changes in behavior through a security awareness program.
- The law firm. This law firm is based in Washington, D.C., and has 500 attorneys and more than 1000 associated staff. Each of the firm's attorneys is issued laptops and travel often to trial sites with sensitive information. The biggest concern is laptop security, with the firm having "lost" several laptops with client information.
- The call center. This call center, located in Dallas, Texas, has 400 call takers of low skill level processing credit card purchases of refurbished printers in a large, open area. The call center has recently had a series of incidents in which customers' credit numbers have been stolen by employees and used illegally.
- The hospital. This hospital, in Miami, Florida, has one of the largest and busiest emergency rooms in the country. Medical information is processed by doctors and nurses in open work areas that allow easy access to PC workstations. Due to recent HIPAA regulations, the hospital must change the behavior of its healthcare providers in better safeguarding patient information.
If you thought about cognitive (listing the consequences of a lost laptop to clients), affective (providing visual reminders of the consequences of criminal behavior), and behavioral (changing workstation locations) appeals for the environments above, you were correct. If you thought of other components for the environments above, you were also correct. It is important to note that there is no right or wrong answer, just possibilities. In each of these cases, one aspect of the Tripartite Model may produce better results than another. More importantly, these examples demonstrate that by understanding what attitudes are and how they are structured, we can glean invaluable clues into how to tailor our information security awareness programs to have more impact on specific groups of users.
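The tailoring exercise above can be summarized as a simple lookup from environment to a suggested ABC component. This is a sketch only: the mapping mirrors the chapter's three examples, and the tactic wording is an illustrative assumption, not a prescription.

```python
# Sketch: mapping the ABC (affect/behavior/cognition) components to the
# three example environments from the text. Tactic text is illustrative.

APPEALS = {
    "law_firm":    ("cognitive",  "List client consequences of a lost laptop."),
    "call_center": ("affective",  "Post visual reminders of the consequences of credit card theft."),
    "hospital":    ("behavioral", "Relocate or lock down open workstations to force new habits."),
}

def recommend(environment):
    """Return the suggested ABC component and a sample tactic."""
    component, tactic = APPEALS[environment]
    return f"{environment}: lead with a {component} appeal ({tactic})"

for env in APPEALS:
    print(recommend(env))
```

As the text notes, other component choices would be equally defensible; a table like this is a starting point for discussion, not a rulebook.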
Science of Persuasion and Influence: Now the Good Part! Time to Change Your Mind!
The previous sections of this chapter established a foundation for understanding what our attitudes are; how they are constructed; and how they can be influenced to predict, motivate, and change behavior. We have applied our understanding of attitudes into methods that can be used to create more influential security awareness programs. This section shifts the focus toward what scientists have found in the phenomenon of influence. This area of social psychology dealing specifically with the changing of attitudes and behavior is known as persuasion.
Due to the immense practical value of knowledge about the mechanisms of persuasion, over 50 years of research has been accumulated by many psychologists at numerous universities. With this vast knowledge of the science and art of influence, we as information security practitioners should incorporate it as part of our repertoire in information security programs.
The following sections describe some of the most well-known phenomena in the science of influence. Each phenomenon is described, along with some of the scientific research behind it, followed by a discussion of its application in an information security awareness context.
Reciprocity: Eliciting Uninvited and Unequal Debts
The obligation to reciprocate on debt has been observed by scientists in every culture on this planet. Sociologists, who study populations and cultures, believe that the need to reciprocate favors or debt is so pervasive that modern civilization could not have been built without it. Debt obligation allows for division of labor, exchange of goods and services, systems of gift and aid, and trade. However, social psychologists have discovered that people's innate sense of reciprocation can be manipulated. In fact, our innate sense of indebtedness can be subtly exploited so that uneasy feelings of debt can be obtained without invitation. What is worse is that a small favor can produce a sense of obligation that can be used to return a much bigger favor.
Our innate need to reciprocate (and sometimes reciprocate with more than we need to) has been demonstrated in a variety of experiments. A classic experiment involved two groups of subjects who were asked to purchase raffle tickets. The only difference between the two groups was that the first group was given a free soda before being asked to purchase raffle tickets. It was found that the group given a soda purchased, on average, more than double the number of raffle tickets of the group that was not. Considering that at the time of the study a raffle ticket cost several times the price of a soda, the return on investment (ROI) was high indeed. This unequal, reciprocating phenomenon has been demonstrated in countless experiments and can be seen in daily life in places such as airports with Hare Krishnas and their flowers (for donations) and at supermarkets with their free samples (ever buy a block of cheese after eating a sample?).
Information security professionals can use our natural need to reciprocate by offering inexpensive "favors" or "gifts" as part of the security awareness program. Trinkets such as "awareness program" pencils, magnets, and mouse pads can be cheaply procured and easily distributed to elicit indebtedness in the user population. Although there may not be conscious or direct evidence of indebtedness, it does exist and may play a role in an individual deciding to take the security awareness program seriously. The investment in these favors is generally very low and the ROI, even if it has a subtle role in preventing a security incident, is so high that it makes good sense to provide these free "samples" to your organization's "shoppers."
Cognitive Dissonance: Win Their Body, and Their Hearts and Minds Will Follow
Cognitive dissonance occurs when an individual performs an action that is contrary to his belief or attitude. It is the subconscious "tension" that is created when action is contrary to belief. An individual will alleviate this cognitive dissonance by changing his belief structure; i.e., change his attitudes. In anecdotal terms, this is an example of the heart and mind following the body when forced to perform distasteful tasks.
The best evidence for cognitive dissonance comes from psychophysiologists, who specialize in measuring physiological responses to psychological stimuli. Dissonance experimentalists have been able to measure dissonance directly through physiological tests such as heart rate, blood pressure, and galvanic skin response. When subjects were asked to perform tasks that were contrary to their attitudes, an immediate physiological response was measured. When subjects were continually pressed to repeat the contrary task, alleviation of dissonance was measured over time, along with changes in attitudes.
Security practitioners can use cognitive dissonance to their advantage when introducing new security policy procedures that are not popular with the user community. Unpopular policies such as mandatory password changes, proper disposal of sensitive material, and adherence to physical security practices may initially be met with resistance. When introduced, these policies might be perceived as nothing more than a nuisance. However, consistency is the key. By making these security requirements mandatory and consistent, the practitioner will find that over the long-term, user dissatisfaction will wane and positive attitude change toward the program may occur as a result of cognitive dissonance.
Diffusion of Responsibility: InfoSec IS NOT My Problem!
People behave differently based on the perception of being part of a group as opposed to being an individual. It has been commonly observed that people tend to work less in a group than as individuals when only group output is measured. People, in addition, tend to feel less responsibility in a group than as a single individual. The bigger the group, the lower the felt sense of personal responsibility. Social psychologists call this diffusion of responsibility and the phenomenon is commonly observed across all cultures.
An extreme example is the case of a woman who was senselessly beaten, stabbed, and murdered in an alleyway in New York while 38 neighbors watched from their windows. When interviewed, these neighbors cited the presence of others as the source of their inaction. Another extreme example of diffusion of responsibility is suicide-baiting, in which individuals in a crowd yell "jump" at a person standing on the ledge of a building. Suicide-baiting almost never occurs during the day with one or two onlookers, but is much more common at night when mobs of people have gathered.
Diffusion of responsibility has been demonstrated in numerous scientific experiments. However, the most interesting and insightful one occurred in a basement at Ohio State University where various students were brought into a room and told to scream as loud as they could into a microphone. Each student was shown other rooms and told that there were anywhere from one to ten other students screaming with them (in other rooms), and that only group output would be measured. In reality, there were no other students, only a perception of such. It was reliably found that people tended to scream incrementally less, depending on the number they thought were screaming with them. Diffusion of responsibility has been reliably found in a variety of different tasks and cultures.
Diffusion of responsibility is most likely to occur in anonymous group environments. Recall the example in the previous section of this chapter of the large call center where credit card numbers are being processed.
Although a security awareness program may exist and apply to the workers of the call center, diffusion of responsibility is likely playing a role in how seriously the workers take security precautions. Environments such as data processing centers, helpdesks, and call centers, with their generic cubicle office structures, promote de-individualization and diffusion of responsibility. Not only is productivity lessened but, more importantly, workers are less likely to take programs like information security seriously, because they may incorrectly perceive themselves as having no personal responsibility for network security.
So what can practitioners do to lessen the impact of diffusion of responsibility? What can organizations do to minimize the negative attitude of "InfoSec IS NOT my problem" in a group setting?
Individualization: InfoSec IS My Problem!
The absolute antithesis of diffusion of responsibility is the effect of individualization on behavior. When people are reminded of themselves, for example, via visual stimuli or personal introspection, they tend to behave in the opposite manner from people in an anonymous group. When individualization is perceived, people tend to be more honest, work harder, eat less, and take more responsibility. This is why mirrors are common in retail stores (preventing theft through individualization) but rarely found in restaurant dining rooms (promoting diffusion). In the case of the murder of Catherine Genovese in front of 38 neighbors in New York, individualization (pointing to a single person and screaming for help) could have resulted in action rather than the tragedy that occurred.
Much like diffusion of responsibility, there have been countless studies on the effects of de-individualization and individualization in groups. In the infamous Stanford "prison" study, students were randomly selected and separated into two groups: "prisoners" and "guards." These two student groups were introduced into a mock prison created for the experiment. Shockingly, within six days the two groups experienced so much de-individualization that the study had to be stopped. The "guards" had lost so much individual identity that they began to torment and abuse the "prisoners" beyond the requirements of the study. The "prisoners," deprived of individual identities, began to experience psychosomatic disorders such as rashes, depression, and random moaning. The scientists concluded that so much de-individualization had taken place that the students lost regard for human life and well-being.
Although the examples and studies provided in this section appear extreme, they are documented events. The effects of de-individualization and individualization are real and play a role in how users perceive their role in an information security awareness program. In the credit card processing call center example, de-individualization can encourage theft, carelessness, and loss of productivity. By making small, inexpensive investments and encouraging individuality, organizations can enhance their security program's effectiveness. Examples of such investments include mirrors, name plates, name tags, customized workspaces, and avoidance of uniforms.
Group Polarization: Group Dynamics in Security Awareness
Group interaction tends to polarize attitudes on a given subject rather than moderate it. This phenomenon of group polarization, also known as risky shift, has been a surprise finding by social psychologists in their study of group dynamics. Individuals in a group tend to shift and adopt more extreme attitudes toward a given topic over time. Scientists surmise that several factors are at work in this phenomenon, including diffusion of responsibility and a natural gravitation toward the creation of a group authority figure with the most extreme view of the group.
Group dynamics scientists have found that individuals think and behave quite differently when exposed to the attitudes of a group. Studies have found that test subjects with similar attitudes toward a subject (for example, a group of students who all feel moderately in favor of capital punishment), once introduced to group discussions and activities, almost always come out individually more polarized toward the subject. In many cases, attitude "ring leaders" with the most extreme views arise to take group authority roles.
Group polarization can be both an asset and a liability for the information security practitioner. In an organization that already has an inclination toward a safe, secure environment (military, intelligence, and government), group dynamics and polarization may serve an enhancing role in the security awareness program. Unfortunately, the opposite effect may be experienced in environments where decentralization and personal freedom have been the norm. Educational and nonprofit organizations have a difficult time implementing strong security programs due to the communal, trust-based relationships fostered in them. It is important for the security practitioner to remember that user populations predisposed to a specific opinion about information security will end up having even stronger feelings about it after group interaction.
Social Proof: We Have Found the Information Security Enemy and It Is Us!
People determine what behavior is correct in a given situation to the degree that they see others performing it. Whether it is figuring out which utensil to use at a dinner party or deciding whether to let a stranger follow you into an office building, we use the actions of others as important guidelines in our own behavior. We do this because early in life we learn that doing as "others do" is more likely than not the right behavior.
Social proof has been repeatedly demonstrated in very simple, yet classic experiments. In one study, psychologists took a group of toddlers who were extremely fearful of dogs and showed them a child playing with dogs for 20 minutes a day. The scientists found that after only four days, more than 65 percent of the toddlers were willing to step into a pen alone with a dog. Even more remarkable was that the experiment produced similar results when it was repeated with video footage rather than a live child and dog.
Social proof in an information security environment can be both a blessing and curse. When others are able to observe positive attitudes and action toward aspects of a security awareness program, social proof can serve as a multiplier in encouraging positive behavior. However, examples of negative attitude and action toward security awareness policies (disregard, indifference, or denigration) can quickly spread, especially in confined environments such as processing centers, help desks, and call centers. It is up to information security managers and senior management of an organization to swiftly deal with those who set bad examples, and to encourage, promote, and foster those who take corporate security policies seriously.
Obedience to Authority: The High-Ups Say So!
Sociologists have observed that the inherent drive to obey authority figures is omnipresent across all cultures. They surmise that a hierarchical organization of individuals offers immense advantages to a society. It allows for the ability to manage resources, create trade, organize defense, and have social control over the population. The proclivity to obey authority figures may have a biological foundation with the same behavior being observed in a variety of different animals.
Deference to authority has been a well-researched field within social psychology. After World War II, social scientists wanted to understand how ordinary people could be motivated to commit horrible atrocities. The common answer they found was that the perpetrators were "just following orders." In a well-known series of experiments at Yale University, Stanley Milgram found that randomly selected subjects were willing to deliver horrendous electrical shocks to a screaming participant on the orders of a researcher wearing a lab coat. The study found that as long as the researcher continued to prompt the test subject, the vast majority of subjects would continue to inflict pain, even after the victim had apparently lost consciousness.
Milgram performed a series of these experiments (with a variety of wrinkles thrown in) and found that individuals would almost always defer to the researcher for orders. When asked by a researcher to stop, 100 percent of the subjects stopped delivering shocks. When two lab-coated researchers giving contradictory shock orders were included in the experiment, test subjects always attempted to determine who was the higher ranking of the two. Factors such as proximity (standing next to the subject versus on a phone), sex (male versus female researchers), appearance (lab coat versus none), and size (short versus tall) were all found to play a role in people's willingness to obey authority. These studies were also performed in Europe and Asia, and no discernible differences were observed across cultures.
Management buy-in and approval of an information security program is universally considered an essential requirement for success. However, approval and sponsorship are only a small fraction of the potential role management can play in an awareness program. Because people are predisposed to obey authority, management's active participation (being the lab-coated researcher) in the awareness program can only magnify the program's impact. Information security practitioners should look to leverage authority figures and determinants such as proximity (personal announcements instead of e-mails) and rank (active participation from the highest-ranking manager possible) to maximize the power of the message.
Familiarity and Repeated Exposure: The Price of Security Is Eternal Vigilance
Does familiarity breed contempt? Or does repeated exposure lead to liking? Scientists have found overwhelming evidence that repeated exposure to stimuli almost always results in positive attitude change. Radio stations repeatedly play the same songs for good reason: we enjoy a song more when we have heard it many times.
Pioneering scientists at the University of Michigan, and subsequently at other universities, have been studying repeated exposure and liking for more than 30 years. They have found strong, consistent evidence that repeated exposure and familiarity lead to liking across a vast array of experiments. Robert Zajonc, in his classic experiment, found that students rated nonsense syllables as having positive connotations in direct proportion to the number of times they were exposed to them. This phenomenon has been replicated with a variety of stimuli, including objects, pictures, symbols, sounds, and faces.
As mentioned previously, consistency is one of the keys to a more persuasive security awareness program. Even in the face of end-user dissatisfaction, repeated exposure to the various components and policies and rationales for the program is essential for changing end-user attitudes. The most common mistake that is observed with a security awareness program is its inconsistency. Often, there is great activity and enthusiasm during the introduction of a security program; but after months have passed, there is little semblance of the initial fanfare. A trickle of e-mails and periodic postings on corporate newsgroups are all that is left to remind the users of the program. A program that is designed with consistency and longevity in mind (regular status communications, weekly workshops, daily E-reminders, and management announcements) will have a better chance of changing the attitudes of the user community to adopt the various parts of the security awareness program.
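The consistency principle above can be made concrete with a simple schedule generator that favors a steady, long-lived cadence over a one-time launch burst. This is a hedged sketch: the cadences (daily e-reminders, weekly workshops, monthly management announcements) follow the examples in the text, while the start date and activity names are assumptions.

```python
# Sketch: generating a consistent awareness calendar rather than a
# launch-only burst. Cadences follow the text's examples; the start
# date and activity names are illustrative assumptions.

from datetime import date, timedelta

def awareness_schedule(start, days):
    """Yield (date, activity) pairs for a repeated-exposure program."""
    for offset in range(days):
        day = start + timedelta(days=offset)
        if day.weekday() < 5:            # Monday-Friday
            yield day, "daily e-reminder"
        if day.weekday() == 2:           # every Wednesday
            yield day, "weekly workshop"
        if day.day == 1:                 # first of each month
            yield day, "management announcement"

# Example: a 30-day slice of the program starting Monday, 2011-01-03.
events = list(awareness_schedule(date(2011, 1, 3), 30))
for day, activity in events[:5]:
    print(day, activity)
```

The point of the sketch is the shape of the output: every week of the program's life looks roughly like every other week, which is exactly the repeated exposure the research above says changes attitudes.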
Information security awareness programs serve a critical role in keeping an organization safe by keeping the user community vigilant against the dangers of intruders. This chapter enlisted the help of social scientists (experimental psychologists, sociologists, and psychophysiologists) who have worked to further our knowledge about how we think and behave, making our security awareness programs more relevant, powerful, and effective. Through their research, we have found that at the core of our actions are our attitudes. Knowing the subtle, unconscious ways to influence and nudge these attitudes can be a useful asset in implementing a more persuasive and effective security awareness program.
Samuel Chun, CISSP, is the director of the Cyber Security Practice for HP Enterprise Services U.S. Public Sector. He is responsible for the strategy, portfolio development and industry messaging of cyber security services and solutions for U.S. Public Sector clients. He is also the lead subject matter expert for cyber security policy for HP Global Government Affairs. Chun is a regular speaker at industry conferences and cyber security policy workshops and legislative briefings in Washington, DC. He recently provided expert testimony on the "State of Federal Information Security" at a hearing before the House Subcommittee on Government Management, Organization and Procurement.
From Information Security Management Handbook, 2011 CD-ROM Edition edited by Harold F. Tipton and Micki Krause. New York: Auerbach Publications, 2011.