Introduction to Computer Ethics

Rebecca Herold

Concern about computer ethics emerged, fundamentally, with the birth of computers. There was concern right away that computers would be used inappropriately to the detriment of society, or that they would replace humans in many jobs and cause widespread job loss. To fully grasp the issues involved in computer ethics, it is important to consider their history. The following provides a brief overview of some significant events.

Consideration of computer ethics is recognized to have begun with the work of MIT professor Norbert Wiener during World War II in the early 1940s, when he helped to develop an anti-aircraft cannon capable of shooting down fast warplanes. This work resulted in Wiener and his colleagues creating a new field of research that Wiener called cybernetics, the science of information feedback systems. The concepts of cybernetics, combined with the developing computer technologies, led Wiener to make some ethical conclusions about the technology called information and communication technology (ICT), in which Wiener predicted social and ethical consequences. Wiener published The Human Use of Human Beings in 1950, which described a comprehensive foundation that is still the basis for computer ethics research and analysis.

In the mid-1960s, Donn B. Parker, at the time with SRI International in Menlo Park, CA, began examining unethical and illegal uses of computers and documenting examples of computer crime and other unethical computerized activities. He published "Rules of Ethics in Information Processing" in Communications of the ACM in 1968, and headed the development of the first Code of Professional Conduct for the Association for Computing Machinery, which was adopted by the ACM in 1973.

During the late 1960s, Joseph Weizenbaum, a computer scientist at MIT, created a computer program that he called ELIZA, which he scripted to provide a crude imitation of "a Rogerian psychotherapist engaged in an initial interview with a patient." People had strong reactions to the program; some psychiatrists feared it showed that computers would perform automated psychotherapy.

Weizenbaum wrote Computer Power and Human Reason in 1976, in which he expressed his concerns about the growing tendency to see humans as mere machines. His book, MIT courses, and many speeches inspired many computer ethics thoughts and projects.

Walter Maner is credited with coining the phrase "computer ethics" in the mid-1970s to describe the ethical problems and issues created by computer technology; he taught a course on the subject at Old Dominion University. From the late 1970s into the mid-1980s, Maner's work created much interest in university-level computer ethics courses. In 1978, Maner published the Starter Kit in Computer Ethics, which contained curriculum materials and advice for developing computer ethics courses, and many university courses were put in place because of his work.

In the 1980s, social and ethical consequences of information technology, such as computer-enabled crime, computer failure disasters, privacy invasion using computer databases, and software ownership lawsuits, were being widely discussed in America and Europe. In the mid-1980s, James Moor of Dartmouth College published "What Is Computer Ethics?" in Computers and Ethics, a special issue of the journal Metaphilosophy, and Deborah Johnson of Rensselaer Polytechnic Institute published Computer Ethics, the first textbook in the field. Other significant books about computer ethics were published within the psychology and sociology fields, such as Sherry Turkle's The Second Self, about the impact of computing on the human psyche, and Judith Perrolle's Computers and Social Change: Information, Property and Power, a sociological approach to computing and human values.

Walter Maner and Terrell Bynum held the first international multidisciplinary conference on computer ethics in 1991. For the first time, philosophers, computer professionals, sociologists, psychologists, lawyers, business leaders, news reporters, and government officials assembled to discuss computer ethics. During the 1990s, new university courses, research centers, conferences, journals, articles, and textbooks appeared, and organizations like Computer Professionals for Social Responsibility, the Electronic Frontier Foundation, and the Association for Computing Machinery's Special Interest Group on Computers and Society (ACM SIGCAS) launched projects addressing computing and professional responsibility. Developments in Europe and Australia included new computer ethics research centers in England, Poland, Holland, and Italy. In the U.K., Simon Rogerson of De Montfort University led the ETHICOMP series of conferences and established the Centre for Computing and Social Responsibility.

Regulatory Requirements for Ethics Programs

When creating an ethics strategy, it is important to look at the regulatory requirements for ethics programs. These provide the basis for a minimal ethical standard upon which an organization can expand to fit its own unique organizational environment and requirements. An increasing number of regulatory requirements related to ethics programs and training now exist.

The 1991 U.S. Federal Sentencing Guidelines for Organizations (FSGO) outline minimal ethical requirements and provide for substantially reduced penalties in criminal cases when federal laws are violated if ethics programs are in place. Reduced penalties provide strong motivation to establish an ethics program. Effective November 1, 2004, the FSGO was updated with additional requirements:

  • In general, board members and senior executives must assume more specific responsibilities for a program to be found effective. Organizational leaders must be knowledgeable about the content and operation of the compliance and ethics program, perform their assigned duties exercising due diligence, and promote an organizational culture that encourages ethical conduct and a commitment to compliance with the law.
  • The commission's definition of an effective compliance and ethics program now has three subsections:
    • Subsection (a) - the purpose of a compliance and ethics program
    • Subsection (b) - seven minimum requirements of such a program
    • Subsection (c) - the requirement to periodically assess the risk of criminal conduct and to design, implement, or modify the seven program elements, as needed, to reduce that risk

The purpose of an effective compliance and ethics program is "to exercise due diligence to prevent and detect criminal conduct and otherwise promote an organizational culture that encourages ethical conduct and a commitment to compliance with the law." The new requirement significantly expands the scope of an effective ethics program and requires the organization to report an offense to the appropriate governmental authorities without unreasonable delay.

The Sarbanes-Oxley Act of 2002 introduced accounting reform and requires attestation to the accuracy of financial reporting documents:

  • Section 103, "Auditing, Quality Control, and Independence Standards and Rules," requires the board to:
    • Register public accounting firms
    • Establish, or adopt, by rule, "auditing, quality control, ethics, independence, and other standards relating to the preparation of audit reports for issuers"
  • New Item 406(a) of Regulation S-K requires companies to disclose:
    • Whether they have a written code of ethics that applies to their senior officers
    • Any waivers of the code of ethics for these individuals
    • Any changes to the code of ethics
  • If companies do not have a code of ethics, they must explain why they have not adopted one.

The U.S. Securities and Exchange Commission approved a new governance structure for the New York Stock Exchange (NYSE) in December 2003. It includes a requirement for companies to adopt and disclose a code of business conduct and ethics for directors, officers, and employees, and to promptly disclose any waivers of the code for directors or executive officers. The NYSE regulations require all listed companies to possess and communicate, both internally and externally, a code of conduct, or face delisting.

In addition to these, organizations must monitor new and revised regulations from U.S. regulatory agencies, such as the Food and Drug Administration (FDA), Federal Trade Commission (FTC), Bureau of Alcohol, Tobacco, and Firearms (BATF), Internal Revenue Service (IRS), and Department of Labor (DoL), as well as from many other regulators throughout the world. Ethics plans and programs need to be established to ensure that the organization complies with all such regulatory requirements.

Example Topics in Computer Ethics

When establishing a computer ethics program and accompanying training and awareness programs, it is important to consider the topics that have been addressed and researched. The following topics, identified by Terrell Bynum, are good to use as a basis.

Computers in the Workplace. Computers can pose a threat to jobs, as many workers fear being replaced by them, although the computer industry has already generated a wide variety of new jobs. Even when computers do not eliminate a job, they can radically alter it. In addition to job security, another workplace concern is health and safety: it is a computer ethics issue to consider how computers impact health and job satisfaction when information technology is introduced into a workplace.

Computer Crime. With the proliferation of computer viruses, spyware, phishing and fraud schemes, and hacking activity from every location in the world, computer crime and security are certainly topics of concern when discussing computer ethics. Besides outsiders, or hackers, many computer crimes, such as embezzlement or planting of logic bombs, are committed by trusted personnel who have authorization to use company computer systems.

Privacy and Anonymity. One of the earliest computer ethics topics to arouse public interest was privacy. The ease and efficiency with which computers and networks can be used to gather, store, search, compare, retrieve, and share personal information make computer technology especially threatening to anyone who wishes to keep personal information out of the public domain or out of the hands of those who are perceived as potential threats. The variety of privacy-related issues generated by computer technology has led to reexamination of the concept of privacy itself.

Intellectual Property. One of the more controversial areas of computer ethics concerns the intellectual property rights connected with software ownership. Some people, like Richard Stallman, who started the Free Software Foundation, believe that software ownership should not be allowed at all. He claims that all information should be free, and all programs should be available for copying, studying, and modifying by anyone who wishes to do so. Others, such as Deborah Johnson, argue that software companies or programmers would not invest weeks and months of work and significant funds in the development of software if they could not get the investment back in the form of license fees or sales.

Professional Responsibility and Globalization. Global networks such as the Internet and conglomerates of business-to-business network connections are connecting people and information worldwide. Globalization issues that raise ethics considerations include:

  • Global laws
  • Global business
  • Global education
  • Global information flows
  • Information-rich and information-poor nations
  • Information interpretation

The gap between rich and poor nations, and between rich and poor citizens in industrialized countries, is very wide. As educational opportunities, business and employment opportunities, medical services, and many other necessities of life move more and more into cyberspace, gaps between the rich and the poor may become even worse, leading to new ethical considerations.

Common Computer Ethics Fallacies

Although computer education is starting to be incorporated into the lower grades of elementary schools, the lack of early computer education for most of today's adults has led to several documented, generally accepted fallacies that apply to nearly all computer users. As technology advances, these fallacies will change: new ones will arise, and some of the originals will disappear as children learn at an earlier age about computer use, risks, security, and other associated issues. There are more fallacies than can be described here, but Peter S. Tippett identified the following computer ethics fallacies, which have been widely discussed and are generally accepted as representative of the most common.

The Computer Game Fallacy. Computer users tend to think that computers will generally prevent them from cheating and doing wrong. Programmers in particular believe that an error in programming syntax will prevent a program from working, so if a program does work, it must be working correctly and preventing bad things or mistakes from happening. Computer users in general have likewise gotten the message that computers work with exacting accuracy and will not allow actions that should not occur. What computer users often do not consider is that although the computer operates under very strict rules, its software is written by humans and is just as susceptible to allowing bad things to happen as people often are in their own lives. Along with this comes the perception that if a person can do something with a computer without being stopped, it must be permissible; if it were not, the computer would somehow prevent them from doing it.

The Law-Abiding Citizen Fallacy. Laws provide guidance for many things, including computer use. Sometimes users confuse what is legal with regard to computer use with what is reasonable behavior for using computers. Laws basically define the minimum standard about which actions can be reasonably judged, but such laws also call for individual judgment. Computer users often do not realize they also have a responsibility to consider the ramifications of their actions and to behave accordingly.

The Shatterproof Fallacy. Many, if not most, computer users believe that they can do little harm accidentally with a computer beyond perhaps erasing or messing up a file. However, computers are tools that can do harm, even when users are unaware that their actions have hurt someone else in some way. For example, sending an e-mail flame to a large group of recipients is the equivalent of publicly humiliating the person targeted. Most people realize that they could be sued for libel for making such statements in a physical public forum, but may not realize they are just as responsible for what they communicate, and for their words and accusations, on the Internet. As another example, forwarding e-mail without the author's permission can lead to harm or embarrassment if the original sender was communicating privately, without any expectation that the message would be seen by others. Using e-mail to stalk someone, to send spam, or to harass or offend the recipient are likewise harmful uses of computers. Software piracy is yet another example of using computers to, in effect, hurt others.

Generally, the shatterproof fallacy is the belief that what a person does with a computer can do only minimal harm, affecting at most a few files on the computer itself; it fails to consider the impact of actions before they are taken.

The Candy-from-a-Baby Fallacy. Illegal and unethical activities, such as software piracy and plagiarism, are very easy to carry out with a computer. However, just because something is easy does not mean that it is right. Because of the ease with which computers can make copies, it is likely that almost every computer user has committed software piracy in one form or another. Studies by the Software Publishers Association (SPA) and the Business Software Alliance (BSA) reveal that software piracy costs companies billions of dollars. Copying a retail software package without paying for it is theft. Just because doing something wrong with a computer is easy does not make it ethical, legal, or acceptable.

The Hacker's Fallacy. Numerous reports and publications describe the commonly accepted hacker belief that it is acceptable to do anything with a computer as long as the motivation is to learn and not to gain or profit from the activity. This so-called hacker ethic is explored in more depth in the following section.

The Free Information Fallacy. A somewhat curious notion held by many is that information "wants to be free," as mentioned earlier. It has been suggested that this fallacy emerged from the fact that it is so easy to copy digital information and to distribute it widely. However, this line of thinking completely ignores the fact that copying and distributing data are entirely under the control, and at the whim, of the people who do it and, to a great extent, of the people who allow it to happen.

Hacking and Hacktivism

Hacking is an ambiguous term, most commonly perceived as part of criminal activity. However, hacking has also been used to describe the work of individuals associated with the open-source movement, and many developments in information technology have resulted from what would typically be considered hacking activities. Manuel Castells considers hacker culture to be the "informationalism" that incubates technological breakthrough, identifying hackers as the actors in the transition from an academically and institutionally constructed milieu of innovation to the emergence of self-organizing networks transcending organizational control.

A hacker was originally a person who sought to understand computers as thoroughly as possible. Soon hacking came to be associated with phreaking, breaking into phone networks to make free phone calls, which is clearly illegal.

The Hacker Ethic. The idea of a hacker ethic originates in the activities of the original hackers at MIT and Stanford in the 1950s and 1960s. Stephen Levy outlined the so-called hacker ethic as follows:

  1. Access to computers should be unlimited and total.
  2. All information should be free.
  3. Authority should be mistrusted and decentralization promoted.
  4. Hackers should be judged solely by their skills at hacking, rather than by race, class, age, gender, or position.
  5. Computers can be used to create art and beauty.
  6. Computers can change your life for the better.

The hacker ethic has three main functions:

  1. It promotes the belief of individual activity over any form of corporate authority or system of ideals.
  2. It supports a completely free-market approach to the exchange of and access to information.
  3. It promotes the belief that computers can have a beneficial and life-changing effect.

Such ideas are in conflict with a wide range of computer professionals' various codes of ethics.

Ethics Codes of Conduct and Resources

Several organizations and groups have defined the computer ethics their members should observe and practice. In fact, most professional organizations have adopted a code of ethics, a large percentage of which address how to handle information. To reproduce the ethics codes of every professional organization related to computer use would fill a large book. The following are provided to give you an opportunity to compare the similarities between the codes and, most interestingly, to note the differences, and sometimes contradictions, among the codes followed by these diverse groups.

The Code of Fair Information Practices. In 1973 the Secretary's Advisory Committee on Automated Personal Data Systems for the U.S. Department of Health, Education and Welfare recommended the adoption of the following Code of Fair Information Practices to secure the privacy and rights of citizens:

  1. There must be no personal data record-keeping systems whose very existence is secret;
  2. There must be a way for an individual to find out what information is in his or her file and how the information is being used;
  3. There must be a way for an individual to correct information in his records;
  4. Any organization creating, maintaining, using, or disseminating records of personally identifiable information must assure the reliability of the data for its intended use and must take precautions to prevent misuse; and
  5. There must be a way for an individual to prevent personal information obtained for one purpose from being used for another purpose without his consent.

Internet Activities Board (IAB) (now the Internet Architecture Board) and RFC 1087. RFC 1087 is a statement of policy by the Internet Activities Board (IAB) posted in 1989 concerning the ethical and proper use of the resources of the Internet. The IAB "strongly endorses the view of the Division Advisory Panel of the National Science Foundation Division of Network, Communications Research and Infrastructure," which characterized as unethical and unacceptable any activity that purposely:

  • Seeks to gain unauthorized access to the resources of the Internet,
  • Disrupts the intended use of the Internet,
  • Wastes resources (people, capacity, computer) through such actions,
  • Destroys the integrity of computer-based information, or
  • Compromises the privacy of users.

Computer Ethics Institute (CEI). In 1991 the Computer Ethics Institute held its first National Computer Ethics Conference in Washington, D.C. The Ten Commandments of Computer Ethics were first presented in Dr. Ramon C. Barquin's paper prepared for the conference, "In Pursuit of a 'Ten Commandments' for Computer Ethics." The Computer Ethics Institute published them as follows in 1992:

  1. Thou Shalt Not Use a Computer to Harm Other People.
  2. Thou Shalt Not Interfere with Other People's Computer Work.
  3. Thou Shalt Not Snoop around in Other People's Computer Files.
  4. Thou Shalt Not Use a Computer to Steal.
  5. Thou Shalt Not Use a Computer to Bear False Witness.
  6. Thou Shalt Not Copy or Use Proprietary Software for Which You Have Not Paid.
  7. Thou Shalt Not Use Other People's Computer Resources without Authorization or Proper Compensation.
  8. Thou Shalt Not Appropriate Other People's Intellectual Output.
  9. Thou Shalt Think about the Social Consequences of the Program You Are Writing or the System You Are Designing.
  10. Thou Shalt Always Use a Computer in Ways That Insure Consideration and Respect for Your Fellow Humans.

National Conference on Computing and Values. The National Conference on Computing and Values (NCCV) was held on the campus of Southern Connecticut State University in August 1991. It proposed the following four primary values for computing, originally intended to serve as the ethical foundation and guidance for computer security:

  1. Preserve the public trust and confidence in computers.
  2. Enforce fair information practices.
  3. Protect the legitimate interests of the constituents of the system.
  4. Resist fraud, waste, and abuse.

The Working Group on Computer Ethics. In 1991, the Working Group on Computer Ethics created the following End User's Basic Tenets of Responsible Computing:

  1. I understand that just because something is legal, it isn't necessarily moral or right.
  2. I understand that people are always the ones ultimately harmed when computers are used unethically. The fact that computers, software, or a communications medium exists between me and those harmed does not in any way change moral responsibility toward my fellow humans.
  3. I will respect the rights of authors, including authors and publishers of software as well as authors and owners of information. I understand that just because copying programs and data is easy, it is not necessarily right.
  4. I will not break into or use other people's computers or read or use their information without their consent.
  5. I will not write or knowingly acquire, distribute, or allow intentional distribution of harmful software like bombs, worms, and computer viruses.

National Computer Ethics and Responsibilities Campaign (NCERC). In 1994, a National Computer Ethics and Responsibilities Campaign (NCERC) was launched to create an "electronic repository of information resources, training materials and sample ethics codes" that would be available on the Internet for IS managers and educators. The National Computer Security Association (NCSA) and the Computer Ethics Institute cosponsored NCERC. The NCERC Guide to Computer Ethics was developed to support the campaign.

The goal of NCERC is to foster computer ethics awareness and education. The campaign does this by making tools and other resources available for people who want to hold events, campaigns, awareness programs, seminars, and conferences or to write or communicate about computer ethics. NCERC is a nonpartisan initiative intended to increase understanding of the ethical and moral issues unique to the use, and sometimes abuse, of information technologies.

(ISC)2 Code of Ethics. The following is an excerpt from the (ISC)2 Code of Ethics preamble and canons, by which all CISSPs and SSCPs must abide; compliance with the preamble and canons is mandatory to maintain certification. Conflicts between the canons should be resolved in the order in which the canons are listed. The canons are not of equal weight, and this ordering is intended to keep conflicts between them from creating ethical binds.

Code of Ethics Preamble.

  • Safety of the commonwealth, duty to our principals, and to each other requires that we adhere, and be seen to adhere, to the highest ethical standards of behavior.
  • Therefore, strict adherence to this Code is a condition of certification.

Code of Ethics Canons.

Protect society, the commonwealth, and the infrastructure

  • Promote and preserve public trust and confidence in information and systems.
  • Promote the understanding and acceptance of prudent information security measures.
  • Preserve and strengthen the integrity of the public infrastructure.
  • Discourage unsafe practice.

Act honorably, honestly, justly, responsibly, and legally

  • Tell the truth; make all stakeholders aware of your actions on a timely basis.
  • Observe all contracts and agreements, express or implied.
  • Treat all constituents fairly. In resolving conflicts, consider public safety and duties to principals, individuals, and the profession in that order.
  • Give prudent advice; avoid raising unnecessary alarm or giving unwarranted comfort. Take care to be truthful, objective, cautious, and within your competence.
  • When resolving differing laws in different jurisdictions, give preference to the laws of the jurisdiction in which you render your service.

Provide diligent and competent service to principals

  • Preserve the value of their systems, applications, and information.
  • Respect their trust and the privileges that they grant you.
  • Avoid conflicts of interest or the appearance thereof.
  • Render only those services for which you are fully competent and qualified.

Advance and protect the profession

  • Sponsor for professional advancement those best qualified. All other things equal, prefer those who are certified and who adhere to these canons. Avoid professional association with those whose practices or reputation might diminish the profession.
  • Take care not to injure the reputation of other professionals through malice or indifference.
  • Maintain your competence; keep your skills and knowledge current.
  • Give generously of your time and knowledge in training others.

Organizational Ethics Plan of Action

Peter S. Tippett has written extensively on computer ethics. He provided the following action plan to help corporate information security leaders to instill a culture of ethical computer use within organizations:

  1. Develop a corporate guide to computer ethics for the organization.
  2. Develop a computer ethics policy to supplement the computer security policy.
  3. Add information about computer ethics to the employee handbook.
  4. Find out whether the organization has a business ethics policy, and expand it to include computer ethics.
  5. Learn more about computer ethics and spread what is learned.
  6. Help to foster awareness of computer ethics by participating in the computer ethics campaign.
  7. Make sure the organization has an E-mail privacy policy.
  8. Make sure employees know what the E-mail policy is.

Fritz H. Grupe, Timothy Garcia-Jay, and William Kuechler identified the following selected ethical bases for IT decision making:

Golden Rule: Treat others as you wish to be treated. Do not implement systems that you would not wish to be subjected to yourself. Is your company using unlicensed software although your company itself sells software?

Kant's Categorical Imperative: If an action is not right for everyone, it is not right for anyone. Does management monitor call center employees' seat time, but not its own?

Descartes' Rule of Change (also called the slippery slope): If an action is not repeatable at all times, it is not right at any time. Should your Web site link to another site, "framing" the page, so users think it was created and belongs to you?

Utilitarian Principle (also called universalism): Take the action that achieves the most good. Put a value on outcomes and strive to achieve the best results. This principle seeks to maximize the benefit that IT delivers to the covered population within acknowledged resource constraints. Should customers using your Web site be asked to opt in or opt out of the possible sale of their personal data to other companies?

Risk Aversion Principle: Incur least harm or cost. Given alternatives that have varying degrees of harm and gain, choose the one that causes the least damage. If a manager reports that a subordinate criticized him in an e-mail to other employees, who would do the search and see the results of the search?

Avoid Harm: Avoid malfeasance or "do no harm." This basis implies a proactive obligation of companies to protect their customers and clients from systems with known harm. Does your company have a privacy policy that protects, rather than exploits customers?

No Free Lunch Rule: Assume that all property and information belong to someone. This principle is primarily applicable to intellectual property that should not be taken without just compensation. Has your company used unlicensed software? Or hired a group of IT workers from a competitor?

Legalism: Is it against the law? Moral actions may not be legal, and vice versa. Might your Web advertising exaggerate the features and benefits of your products? Are you collecting information illegally on minors?

Professionalism: Is an action contrary to codes of ethics? Do the professional codes cover a case and do they suggest the path to follow? When you present technological alternatives to managers who do not know the right questions to ask, do you tell them all they need to know to make informed choices?

Evidentiary guidance: Is there hard data to support or refute the value of taking an action? This is not a traditional "ethics" value, but it is a significant factor in IT policy decisions about the impact of systems on individuals and groups. It involves probabilistic reasoning, where outcomes can be predicted from hard evidence gathered through research. Do you assume that PC users are satisfied with IT's service, or has data been collected to determine what they really think?

Client/customer/patient choice: Let the people affected decide. In some circumstances, employees and customers have a right to self-determination through the informed consent process: a right to decide for themselves what is "harmful" or "beneficial" for their personal circumstances. Are your workers subjected to monitoring in places where they assume that they have privacy?

Equity: Will the costs and benefits be equitably distributed? Adherence to this principle obligates a company to provide similarly situated persons with the same access to data and systems. This can imply a proactive duty to inform and make services, data, and systems available to all those who share a similar circumstance. Has IT made intentionally inaccurate projections as to project costs?

Competition: This principle derives from the marketplace where consumers and institutions can select among competing companies, based on all considerations such as degree of privacy, cost, and quality. It recognizes that to be financially viable in the market, one must have data about what competitors are doing and understand and acknowledge the competitive implications of IT decisions. When you present a build or buy proposition to management, is it fully aware of the risk involved?

Compassion/last chance: Religious and philosophical traditions promote the need to find ways to assist the most vulnerable parties. Refusing to take unfair advantage of users or others who do not have technical knowledge is recognized in several professional codes of ethics. Do all workers have an equal opportunity to benefit from the organization's investment in IT?

Impartiality/objectivity: Are decisions biased in favor of one group or another? Is there an even playing field? IT personnel should avoid potential or apparent conflicts of interest. Do you or any of your IT employees have a vested interest in the companies that you deal with?

Openness/full disclosure: Are persons affected by this system aware of its existence, aware of what data are being collected, and knowledgeable about how it will be used? Do they have access to the same information? Is it possible for a Web site visitor to determine what cookies are used and what is done with any information they might collect?

Confidentiality: IT is obligated to determine whether data it collects on individuals can be adequately protected to avoid disclosure to parties whose need to know is not proven. Have you reduced security features to hold expenses to a minimum?

Trustworthiness and honesty: Does IT stand behind ethical principles to the point where it is accountable for the actions it takes? Has IT management ever posted or circulated a professional code of ethics with an expression of support for seeing that its employees act professionally?

How a Code of Ethics Applies to CISSPs

In 1998, Michael Davis described a professional ethics code as a "contract between professionals." According to this explanation, a profession is a group of persons who want to cooperate in serving the same ideal better than they could if they did not cooperate. Information security professionals, for example, are typically thought to serve the ideal of ensuring the confidentiality, integrity, and availability of information and the security of the technology that supports the information use. A code of ethics would then specify how professionals should pursue their common ideals so that each may do his or her best to reach the goals at a minimum cost while appropriately addressing the issues involved. The code helps to protect professionals from certain stresses and pressures (such as the pressure to cut corners with information security to save money) by making it reasonably likely that most other members of the profession will not take advantage of those who resist such pressures. An ethics code also protects members of a profession from certain consequences of competition, and encourages cooperation and support among the professionals.

Considering this, an occupation does not need society's recognition to be a profession; it needs only that its members cooperate in serving a shared ideal. Once an occupation becomes recognized as a profession, society historically has found reason to give the occupation special privileges (for example, the sole right to do certain kinds of work) to support serving the ideal in question (in this case, information security) in the way the profession serves society.

When a code of ethics is understood as a contract between professionals, it becomes clear why each information security professional should not depend solely upon his or her private conscience when determining how to practice the profession, and why he or she must take into account what the community of information security professionals has to say about what its members should do. What others expect of information security professionals is part of what each should take into account in choosing what to do within professional activities, especially if the expectation is reasonable.

The ethics code provides a guide to what information security professionals may reasonably expect of one another, basically setting forth the rules of the game. Just as athletes need to know the rules of football to know what to do to score, computer professionals also need to know computer ethics to know, for example, whether they should choose information security and risk reduction actions based completely and solely upon the wishes of an employer, or, instead, also consider information security leading practices and legal requirements when making recommendations and decisions.

A code of ethics should also provide a guide to what computer professionals may expect other members of the profession to help each other do. Keep in mind that people are not merely members of this or that profession. Each individual has responsibilities beyond the profession and, as such, must face his or her own conscience, along with the criticism, blame, and punishment of others, as a result of his or her actions. These consequences cannot be escaped simply by claiming that the profession required the decision.

Information security professionals must take their professional code of ethics and apply it appropriately to their own unique environments. To assist with this, Donn B. Parker describes the following five ethical principles that apply to processing information in the workplace, and also provides examples of how they would be applied.

1. Informed consent. Try to make sure that the people affected by a decision are aware of your planned actions and that they either agree with your decision, or disagree but understand your intentions. Example: An employee gives a copy of a program that she wrote for her employer to a friend, and does not tell her employer about it.

2. Higher ethic in the worst case. Think carefully about your possible alternative actions and select the beneficial necessary ones that will cause the least, or no, harm under the worst circumstances. Example: A manager secretly monitors an employee's e-mail, which may violate his privacy, but the manager has reason to believe that the employee may be involved in a serious theft of trade secrets.

3. Change of scale test. Consider that an action you may take on a small scale, or by you alone, could result in significant harm if carried out on a larger scale or by many others. Examples: A teacher lets a friend try out, just once, a database that he bought to see if the friend wants to buy a copy, too. The teacher does not let an entire classroom of his students use the database for a class assignment without first getting permission from the vendor. A computer user thinks it is okay to use a small amount of her employer's computer services for personal business, since others' use is unaffected.

4. Owners' conservation of ownership. As a person who owns or is responsible for information, always make sure that the information is reasonably protected and that ownership of it, and rights to it, are clear to users. Example: A vendor who sells a commercial electronic bulletin board service with no proprietary notice at logon, loses control of the service to a group of hackers who take it over, misuse it, and offend customers.

5. Users' conservation of ownership. As a person who uses information, always assume that others own it and that their interests must be protected unless you explicitly know that you are free to use it in any way you wish. Example: A hacker discovers a commercial electronic bulletin board with no proprietary notice at logon and informs his friends, who take control of it, misuse it, and offend other customers.


About the Author
Rebecca Herold, CISSP, CISM, CISA, FLMI, is an information privacy, security, and compliance consultant, author, and instructor with over 16 years of experience assisting organizations of all sizes in all industries throughout the world. Rebecca has written numerous books, including Managing an Information Security and Privacy Awareness and Training Program, Second Edition (Auerbach Publications), along with dozens of book chapters and hundreds of published articles. Rebecca speaks often at conferences, and develops and teaches workshops for the Computer Security Institute. Rebecca is resident editor for the IT Compliance Community and also an adjunct professor for the Norwich University Master of Science in Information Assurance (MSIA) program.
From Official (ISC)2 Guide to the CISSP CBK, Harold F. Tipton (Ed.). New York: Auerbach Publications, 2006.


 
© Copyright 2009-2013 Auerbach Publications