ACM adopted the first code of computer ethics in 1973. Since that time, the profession of computer science has matured to the extent that a well-developed set of ethical principles has evolved to guide the ethical practice of the discipline. The first of two well-known codes of ethics today is the ACM Code of Ethics, adopted in 1992, which identifies the elements of ethical professional conduct in the form of 24 imperatives of personal and professional responsibility. "It outlines fundamental considerations that contribute to society and human well-being and those that specifically relate to professional responsibilities, organizational imperatives, and compliance with the code" [1]. That code is now undergoing an update to address possible blind spots or anachronisms that may have resulted from changes in technology or the profession since 1992. The second is the 1999 Software Engineering Code of Ethics and Professional Practice [2], the product of an ACM/IEEE-CS Joint Task Force, which articulates eight principles governing the behavior of and decisions made by professional software engineers, including practitioners, educators, managers, supervisors, and policy makers. These two codes of ethics have provided the moral compass the discipline of computer science needed to evolve into a true profession. For those who do not consider themselves "professionals," the Computer Ethics Institute published the Ten Commandments of Computer Ethics [3] in 1992 to guide all computer users in ethical behavior.


However, as is always the case with computer technology, the technical capabilities of computers, and of the internet specifically, have moved rapidly beyond the human ability to guarantee a safe and ethical environment in cyberspace, no matter how ethical the developers may be. One consequence is that cyberspace is plagued with rogue individuals, groups, and even state-sponsored bad actors who aim to commit fraud, crime, espionage, damage to infrastructure, and even terrorism. To combat this threat, it has been necessary to train a cadre of computer security experts who may have to use similar tools and strategies to neutralize such threats. As a result, the cybersecurity landscape shifts every year, with organizations desperate to fill a growing number of unfilled security jobs amid a serious shortfall of skilled graduates. In this frenetic climate, the tendency is to focus on developing individuals' cybersecurity knowledge and talent and to put them on the front line as quickly as possible, without considering how new recruits could potentially abuse these abilities. Lacking guidance on cybersecurity ethics, individuals must defer to their personal moral compass, which leads to good decisions as often as it leads to mistakes. The issue has become: what is an appropriate set of ethical standards for cybersecurity experts? Do we need a Code of Ethics for Cybersecurity?

Rainbow of Hackers

In addressing this issue, Aidan Knowles, an ethical hacking engineer at IBM, describes what he calls the "rainbow of hackers." He groups hackers into three categories: black hats (the bad guys), white hats (the good guys), and grey hats (somewhere in between). As Knowles states [5], the three types of hackers carry out similar actions. They all use the same tools and resources to target various aspects of computer infrastructure: applications, networks, systems, hardware, and software, as well as people. What differentiates their activities are motivation, legality, permission, and others' pre-knowledge of their actions. Each type of hacker has a different goal in mind for their work, as Knowles explains:

"A white hat is commonly employed or contracted to carry out an attack under explicit permission and clear-cut boundaries. The goal of white hats' work is to research, find and test vulnerabilities, exploits and viruses in their defined targets. The findings of these professional engagements are reported directly to the target to enable them to fix any holes and strengthen their overall security posture. White hats are also sometimes involved in developing security products and tools."

"In contrast, black hats cause great intentional damage and profit at the expense of their targets…This darker side of the hacker spectrum can be further subcategorized into different camps: cybercriminals, cyber spies, cyber terrorists and hacktivists. Malicious actors may not always be operating externally from their victim. Research suggests that the insider threat within an organization's networks and premises, including from current or former employees and contractors, is responsible for a large portion of successful hacks. To carry out attacks, black hats may develop their own malicious tools but will frequently employ or repurpose existing white-hat software."

"Grey hats, as the name suggests, are more ambiguous in their definition. Their work may be classified as leaning toward good or bad on the spectrum depending on your perspective. The term gray hat is sometimes used to describe those who break the law but without criminal intent. This definition may include cyber vandals who deface websites and so-called rogue security researchers who publicly share discovered vulnerabilities without notifying or receiving prior permission from their targets."

The most important takeaway message from Knowles, and the motivation for this article, is that "Without clear ethical standards and rules, cybersecurity professionals are almost indistinguishable from the black-hat criminals against whom they seek to protect systems and data."

ISSA Code of Ethics

To meet the compelling need for an ethics code for cybersecurity professionals, the Information Systems Security Association (ISSA) established the following Code of Ethics for its members in 2006 [4]:

  1. Perform all professional activities and duties in accordance with all applicable laws and the highest ethical principles;
  2. Promote generally accepted information security current best practices and standards;
  3. Maintain appropriate confidentiality of proprietary or otherwise sensitive information encountered in the course of professional activities;
  4. Discharge professional responsibilities with diligence and honesty;
  5. Refrain from any activities which might constitute a conflict of interest or otherwise damage the reputation of or is detrimental to employers, the information security profession, or the Association; and
  6. Not intentionally injure or impugn the professional reputation or practice of colleagues, clients, or employers.

Although the code is quite general, it does provide a moral framework for ethical cybersecurity practice based upon the principles of integrity, respect for confidentiality and privacy, and avoidance of conflicts of interest.

Examples of Actions Requiring an Ethical Response

The following examples apply the general ethical principles delineated by ISSA to specific situations, showing appropriate actions to take while carrying out typical cybersecurity duties.

Denial of Service (DoS) attack recovery: During firewall security scans, the security team may discover activity on a port that constitutes a denial of service attack. The low road response to such an attack is to hack back and attack the host in return. However, that could result in other sites being caught in the DoS crossfire. The high road response, in keeping with standards 1 and 2 above, is to block the attack and gather forensic evidence so that it can be pursued legally and ethically.
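
The contrast between hacking back and the high road of block-and-preserve can be made concrete. The minimal Python sketch below assumes a hypothetical firewall log format ("timestamp source-IP destination-port") and hypothetical file names; it identifies the most frequent traffic source, records it in a blocklist for the firewall to ingest, and preserves a timestamped, hashed evidence record for any later legal response. Nothing is ever sent back toward the attacking host.

  # A minimal sketch, not a production tool. The log format and the file
  # names (firewall.log, blocklist.txt, evidence.log) are assumptions.
  import hashlib
  from collections import Counter
  from datetime import datetime, timezone

  def top_source(log_path):
      """Return the source IP with the most connection attempts."""
      counts = Counter()
      with open(log_path) as log:
          for line in log:
              parts = line.split()  # assumed: "<timestamp> <src_ip> <dst_port>"
              if len(parts) >= 2:
                  counts[parts[1]] += 1
      return counts.most_common(1)[0][0] if counts else None

  def block_and_preserve(log_path, blocklist_path, evidence_path):
      src = top_source(log_path)
      if src is None:
          return
      # "Block": record the offender for the firewall to ingest;
      # no traffic is ever sent back toward the attacking host.
      with open(blocklist_path, "a") as blocklist:
          blocklist.write(src + "\n")
      # Preserve forensic evidence: a timestamped record plus a hash of
      # the raw log so its integrity can be demonstrated later.
      with open(log_path, "rb") as log:
          digest = hashlib.sha256(log.read()).hexdigest()
      with open(evidence_path, "a") as evidence:
          stamp = datetime.now(timezone.utc).isoformat()
          evidence.write(f"{stamp} blocked {src} log_sha256={digest}\n")

  block_and_preserve("firewall.log", "blocklist.txt", "evidence.log")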

Penetration testing and response: Cybersecurity professionals often perform penetration testing to determine the robustness of firewalls and other security features in a system. When a weakness or vulnerability is found, it could allow a remote hacker to take control of the system and cause significant harm. There are two possible responses. The low road response is immediate full disclosure: publishing full details of the vulnerability as soon as possible and making the information available to everyone without restriction. This could enable black hat hackers to exploit the weakness before it is fixed. The high road response, "responsible disclosure," is more nuanced. Responsible disclosure requires the security expert to confidentially report the weakness to the company, work with the company to develop a fix within a given timeframe, and then publicly disclose the vulnerability and the fix at the same time [6]. This response is in keeping with standards 3, 5, and 6 above.
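
To make the difference between full and responsible disclosure concrete, the following minimal Python sketch models how a security team might track a coordinated-disclosure timeline. The 90-day remediation window, the vendor name, and the identifiers are illustrative assumptions rather than part of any formal standard; the point is simply that full details are published only once a fix has shipped or an agreed deadline has lapsed.

  # A minimal sketch of a coordinated-disclosure tracker. The 90-day
  # window is an assumed convention; the real deadline is negotiated
  # with the affected vendor.
  from dataclasses import dataclass
  from datetime import date, timedelta

  @dataclass
  class DisclosureCase:
      vulnerability: str          # internal identifier, kept confidential
      vendor: str                 # party notified privately
      reported_on: date           # date of the confidential report
      fix_window_days: int = 90   # agreed remediation window (assumption)

      def public_disclosure_date(self) -> date:
          """Earliest date full details may be published absent a fix."""
          return self.reported_on + timedelta(days=self.fix_window_days)

      def may_publish(self, today: date, vendor_has_fixed: bool) -> bool:
          """Publish only after the vendor ships a fix, or once the
          agreed window has lapsed without remediation."""
          return vendor_has_fixed or today >= self.public_disclosure_date()

  # Hypothetical usage: a flaw reported privately on 9 January 2017
  # may not yet be published in full on 1 February 2017.
  case = DisclosureCase("VULN-001", "ExampleCorp", date(2017, 1, 9))
  print(case.may_publish(date(2017, 2, 1), vendor_has_fixed=False))  # False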

Fighting malignant worms with benign worms: Should a cybersecurity expert release a benign worm "in the wild" when she believes it might be able to patch a known vulnerability, inoculate systems against a malignant worm, and keep that worm from spreading? The low road response would be to release it and hope for the best. The high road response, which follows from standards 2, 4, and 5 above, would be to make the benign worm code publicly available with sufficient caveats so that knowledgeable professionals can use it with care.

Cybersecurity experts, the white hats, work with sensitive data, have access to company and national secrets, and generally wield great power over networks, systems, and data. How individuals handle this responsibility comes down to their ethical yardstick, and reinforcing that yardstick is a fundamental responsibility of the programs that train these experts. Teaching students how to make ethical decisions using the ethics codes in place through professional associations and in exemplary corporate cultures strengthens their ethical yardsticks. It can be argued that developing ethical cybersecurity experts is even more important than developing technically competent ones.

Resources for Teaching Computer Ethics

Baase, S. A Gift of Fire: Social, Legal, and Ethical Issues for Computing Technology. 4th Edition, Prentice Hall, New Jersey, 2013.

Bowyer, K.W. Ethics and Computing. 2nd Edition, Wiley-IEEE Press, 2000.

Edgar, S. Morality and Machines: Perspectives on Computer Ethics. Jones and Bartlett, Boston, MA, 1997.

Friedman, B. and Kahn, P.H., Jr. Educating Computer Scientists: Linking the Social and the Technical. Communications of the ACM, 37,1 (1994), 65–70.

Gotterbarn, D. Informatics and Professional Responsibility. Science and Engineering Ethics, 7,2 (2001), 221–230.

Gotterbarn, D. and Riser, R. Ethics Activities in Computer Science Courses. Computers & Society Newsletter, 26,3 (1996), 13–17.

Huff, C.R., Martin, C.D. and Project ImpactCS Steering Committee. Computing Consequences: A Framework for Teaching Ethical Computing (First Report of the ImpactCS Steering Committee). Communications of the ACM, 38,12 (1995), 75–84.

Johnson, D.G. Computer Ethics. 4th Edition, Prentice Hall, Englewood Cliffs, NJ, 2009.

Kling, R. and Dunlop, C. Computerization and Controversy: Value Conflicts and Social Choices. Academic Press, New York, NY, 1991.

Liffick, B. Analyzing Ethical Scenarios. Proceedings of the ETHICOMP95 Conference on Ethical Issues of Computing, DeMontfort University, Leicester, UK, March, 1995.

Martin, C.D., Huff, C., Gotterbarn, D., and Miller, K. Implementing a Tenth Strand in the Computer Science Curriculum (Second Report of the ImpactCS Steering Committee). Communications of the ACM, 39,12 (1996), 75–84.

Martin, C.D. and Holz, H.J. Integrating Social Impact and Ethics Issues Across the Computer Science Curriculum. Information Processing 92: Proceedings of the 12th World Computer Congress, Madrid, Spain, Vol. II: Education and Society, 239–245. Elsevier Science Publishers, North Holland, September, 1992.

Miller, K. Computer ethics in the curriculum. Computer Science Education, 1 (1988), 37–52.

Quinn, M. J. Ethics for the Information Age. 7th Edition, Pearson Publishing, 2016.

Spinello, R. Cyberethics: Morality and Law in Cyberspace. 6th Edition, Jones & Bartlett Learning, 2017.

Tavani, H.T. Ethics and Technology: Controversies, Questions, and Strategies for Ethical Computing. 4th Edition, Wiley Publishing, 2012.

References

1. ACM. ACM Code of Ethics. 1992; https://www.acm.org/about-acm/code-of-ethics. Accessed 2017 January 9.

2. ACM/IEEE Joint Task Force. Software Engineering Code of Ethics and Professional Practice. 1999; http://www.acm.org/about/se-code. Accessed 2017 January 9.

3. Computer Ethics Institute. Ten Commandments of Computer Ethics. 1992; http://computerethicsinstitute.org/publications/tencommandments.html. Accessed 2017 January 9.

4. ISSA. ISSA Code of Ethics. 2006; http://www.issa.org/?page=CodeofEthics

5. Knowles, A. The Hacker Rainbow; https://securityintelligence.com/how-black-hats-and-white-hats-collaborate-to-be-successful/. Accessed 2017 January 9.

6. Tull, J. A Snapshot in Cybersecurity Ethics; http://informationassurance.regis.edu/ia-programs/resources/blog/cyber-security-ethics. Accessed 2017 January 9.

Author

C. Dianne Martin
Department of Computer Science
Science and Engineering Hall
Room 4100
George Washington University
Washington, DC 20052, USA
dmartin@gwu.edu

Copyright held by author.
