The integration of ethics into computer science instruction is not a new idea (for example, see [10]), but it is still useful to illustrate how this integration can be done in specific kinds of courses [14]. In this article, we present a hands-on ethics lesson for students involved in programming. We think this exercise can be used in any undergraduate class that engages students in software development. Although our examples are written in JavaScript, and can run in any modern browser, they could easily be adapted to other programming languages and platforms. Our goal in this exercise is to encourage students to think more carefully about the ethical implications of the kinds of work they are likely to be asked to do in their first software development jobs.

In the first part of this lesson, students are encouraged to modify an existing program that simulates a system designed to monitor hospital patients' vital signs. If desired, an instructor could give a specification instead of the initial version, and require the students to develop that initial version themselves.

These activities are designed for students working individually or in small groups. The strategy is to alternate technically oriented tasks with time for reflection. Each session is meant to include at least one software development step and one reflection. Because there are multiple stages, the entire exercise can be stretched across multiple class meetings or completed in a single long session. The repeating cycle of specification, coding, and reflection is shown in Figure 1.

Using case studies in ethics education has a long history in business, medicine, and engineering; for example, see Herkert [7], Jennings [9], and Walton [17]. A distinctive characteristic of this proposed lesson is its integration of software artifacts, which intertwines software engineering issues with societal impact issues.

Phase 1:
An Initial Version of a Patient Monitoring System

The first version of the program is available at [11]. The source code for that web page, including its Hypertext Markup Language (HTML), Cascading Style Sheets (CSS), and JavaScript, is in a single file. If the students are to develop this first version on their own, then this completed program would not be given to them; instead, they would be given a specification, such as the one in Appendix A. In either case, students should have a working web page that simulates patient monitoring, including vital signs of a hypothetical patient, before continuing with this exercise. We will assume that students are shown our Version 1. Figure 2 is a screenshot of that version.
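For instructors who want to preview the structure before showing it to students, here is a minimal sketch of how such a single-file page might be organized. The element names, ids, and comments are our illustrations, not the published Version 1 source.

    <!DOCTYPE html>
    <html>
    <head>
      <title>Patient Monitoring</title>
      <style>
        /* The CSS lives in the same file. */
        .outOfRange { color: red; }  /* assumed class for abnormal readouts */
      </style>
    </head>
    <body>
      <h1>Patient Monitoring</h1>
      <p>Pulse: <span id="pulse">60</span> beats/minute</p>
      <!-- ...similar readouts for temperature and the blood pressures... -->
      <script>
        // The JavaScript (simulation loop and alarm logic) also lives here.
      </script>
    </body>
    </html>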

This is a simplified version of actual software that could be used in a clinical setting; it does not come close to complying with typical accessibility standards. For example, the audible alarm (a central focus of the exercise) would be of no use to deaf users. Colorblind users might find it difficult to distinguish between the red out-of-range numbers and the black in-range numbers. The issue of disabled users is raised in a later reflection question. The simplicity of this version will make it easier for beginning programmers to do the tasks required; however, some faculty may want to modify this code in the initial version, or require students (especially students with more than beginner-level skills) to modify the program during the exercise, to increase its utility for a broader user base. With some changes, the ethical issues surrounding access for users of different abilities could become the focus of the exercise.

The speed of the simulated patient monitoring is governed by the constant timeUnit in the JavaScript code. The code for all three versions linked to in this paper sets that constant to 1000 milliseconds, or one second. All three scripts have the same values for the constants that control how often the vital signs are sampled, the initial values of the vital signs, the random perturbations of each sign, and the ranges of each sign. With the values specified in these example programs, the alarms are tripped in a matter of minutes. (Since random numbers are involved, the behavior varies, but it usually takes more than one minute and less than four minutes for at least one vital sign to go out of range.)
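As a minimal sketch of how such a sampling loop can work, consider the following, which tracks a single vital sign under identifiers we made up for illustration (the published scripts use their own names and handle all four signs):

    // One vital sign only, for illustration; constants follow Appendix A.
    const timeUnit = 1000;                        // ms between samples
    const pulse = { value: 60, lo: 30, hi: 90 };  // initial value and range

    setInterval(function () {
      pulse.value += Math.random() - 0.7;         // random walk; 0.7 is trendPulse
      if (pulse.value < pulse.lo || pulse.value > pulse.hi) {
        console.log("ALARM: pulse out of range: " + pulse.value.toFixed(1));
      }
    }, timeUnit);

In the published versions, the alarm is a red readout and an audible bell rather than a console message.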

The alarms are a central issue as the exercise plays out, and it is crucial that students hear the alarm often. Students and the instructor may find it tedious to wait for the alarm to sound during testing, and the instructor may suggest reducing timeUnit to speed things up. However, the timing of any Version 2 implementation (described below) should be examined carefully if timeUnit is changed. Version 2 requires a pause of 30 seconds, and the Version 2 script given in this paper assumes a one-second timeUnit when implementing that pause.
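One way to keep the pause at 30 real seconds even when timeUnit changes is to derive the tick count from wall-clock milliseconds. This is a sketch under assumed names, not the published Version 2 code, which assumes a one-second timeUnit:

    const timeUnit = 1000;                                  // ms per tick
    const MUTE_AFTER_TICKS = Math.round(30000 / timeUnit);  // 30 real seconds

    let alarmTicks = 0;   // ticks elapsed since the audible alarm began
    let audible = true;   // whether the bell should still be sounding

    // Call this once per tick while any vital sign is out of range.
    function onAlarmTick() {
      alarmTicks += 1;
      if (alarmTicks >= MUTE_AFTER_TICKS) {
        audible = false;  // silence the bell; the visual alarm persists
      }
    }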

Students should visit the web page for Version 1 (or their own version, or another student's version, if those are available), and execute the simulation multiple times. They should be instructed to record their reactions to the program. In our Version 1, the program is reset by reloading the page in the browser.

No matter how the students get Version 1, it should be mentioned (though not necessarily emphasized) that the program is being designed for use with actual patients. (The first sentence of the specification states this.) However, in its initial version (and in all the versions in this exercise), a simulated, virtual patient is being monitored. If students ask about this, the instructor should explain that after the software is developed and tested with the simulated patient, the plan is to use it with real patients with a minimum of changes.

After examining the execution of the system multiple times, students will then discuss the following questions, either in small groups or as a class:

  1. What was your overall impression of the program?
  2. Did you record any failures, or strange behaviors, of the program you executed?
  3. When you executed the program, did the visual and auditory alarms catch your attention?
  4. What did you do after the alarms were activated?
  5. Do you have any suggestions for improving a new version?

The instructor should not try to lead this discussion in any specific direction. However, the instructor should list the suggestions for improvements where all the students can see them. It is particularly helpful if at least one student suggestion includes some limits or controls that allow the user to turn off the alarms, and/or reset the vital signs readouts. Our Version 1 includes a loud alarm bell that we expect to be annoying, and we anticipate that at least one student will want to be able to turn it off somehow during testing. If no student lists such a change for the new version, the instructor should add the following change to the list: "Mute the audible alarm after a certain time."

Phase 2:
A New Version of the Patient Monitoring System

During the second phase, the instructor gives the students one or more changes to make in the program. Depending on the time available, and on the students' development skills, the instructor may decide to include several changes; the exercise is improved if multiple changes are specified. However, whether one change or several are assigned, this specific change should be included:

"Modify the program so that the alarm sound ceases after 30 seconds. The visual alarm should continue."

After students have made this (and perhaps other) changes, they should test their modified programs, and the programs of others. Students should record the results of their testing, and record any impressions they have of the revised functionality. A second version of this program (our Version 2), which is like Version 1 (above) but includes the change that turns off the alarm sound after 30 seconds, is available at [12].

Students will then discuss the following questions, in order, as a class. We do not expect the class discussion to get through the entire list; the reason is explained beneath the questions.

  1. What are the advantages of the new version of this program?
  2. Are there any disadvantages of the new version?
  3. Did you have any questions about the changes when they were specified to you?
  4. Do you foresee any problems with this new version when the program is converted for use with real patients instead of simulated patients?
  5. Do you foresee any problems that might arise for patients with this program? For nurses? For the hospital?
  6. Do you foresee any safety concerns with turning off the audible alarm after 30 seconds?
  7. Do you think that 30 seconds was sufficient time for the alarm bell to sound? Why or why not?

We expect that as the class works through these questions, eventually a student will suggest that turning off the alarm sound might cause problems when this program is converted for use with real patients. If the alarm bell is turned off, or turned off too quickly, the warning might be ignored or never noticed. The patient may suffer if health professionals do not notice that there is a warning about the patient's vital signs. If a patient suffers harm, nurses may be reprimanded for not responding in a timely fashion. The hospital might be sued for negligence. It is important that these issues be put on the table. Once at least one student raises this issue (perhaps as one concern among others), the instructor can move to a discussion of this specific aspect of the revised program.

At this point, the instructor should encourage a discussion about the advisability of turning off the audible alarm, and eventually should ask for alternatives to the "30 second rule" used in Version 2. At some point one of the students (or the instructor, if necessary) will suggest a means to turn off the audible alarm manually. This could be appropriate, it might be argued, if a nurse discovers the problem indicated by one or more vital signs going out of range, and has taken appropriate action. (That is, after the audible alarm has served its purpose, it can safely be turned off.)

The instructor now tells the students to make that change in the program. The specification is as follows:

"Add functionality to Version 2 to make a Version 3 that allows manual turning off of the audible alarm. Until the alarm is turned off manually, it should keep sounding. The visual alarm should stay on whenever a vital sign has gone out of range."

Phase 3:
A Third Version of the Patient Monitoring System

Our Version 3 is available at [13]. This program adds a button that turns off the audible alarm. After students have produced their own third versions of the program, they should look at their own and others' versions. After those examinations, students break into small groups to discuss the following questions:

  1. Does the manual system function as specified to turn off the audible alarm?
  2. Can you think of any problems with this system design?
  3. If a user were deaf or colorblind, would the current version be sufficient? Why or why not?
  4. Should anyone be able to manually turn off the audible alarm, or should only some users be authorized?
  5. In at least one version of this program, numbers start out black, and then turn red when a vital sign moves from normal to abnormal. Should the numbers turn back to black if the vital sign in question returns to the normal range? Why or why not? (One possible implementation is sketched after this list.)
  6. Can you think of any other weaknesses of this version of the program with respect to its eventual use with real patients?
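For question 5, here is a sketch of how a readout's color could track range status in both directions; the function, ids, and styling choices are our assumptions, not code from any published version:

    // Called whenever a vital sign's simulated value is updated.
    function paintReadout(spanId, value, lo, hi) {
      const span = document.getElementById(spanId);
      span.textContent = value.toFixed(1);
      // Red while out of range, black again once back in range.
      // Whether it SHOULD revert is the ethical question, not a technical one.
      span.style.color = (value < lo || value > hi) ? "red" : "black";
    }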

The instructor should list any weaknesses that come up in the discussion of these questions. At some point, the instructor should invite the students to suggest program modifications that would address one or more of those weaknesses. For example, someone may point out that only authorized medical staff should be allowed to turn off the audible alarm. This could be accomplished with password protection for the "turn off the alarm" feature. If there were password protection, then we could envision protecting other functionality as well, such as resetting the simulated vital signs to normal.
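As an illustration of the kind of change students might propose, the mute button could be gated behind a password prompt. Everything here (the element id, the window.prompt dialog, the hard-coded password) is hypothetical and for classroom discussion only; a real clinical system would require genuine authentication:

    <button id="muteButton">Silence audible alarm</button>

    <script>
      let audible = true;

      document.getElementById("muteButton").addEventListener("click", function () {
        // Hard-coded password: acceptable only as a classroom sketch.
        if (window.prompt("Staff password to silence alarm:") === "nurse42") {
          audible = false;          // bell off; visual alarm persists
        } else {
          window.alert("Not authorized to silence the alarm.");
        }
      });
    </script>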

It seems likely that the discussions will sometimes focus on the difference between a simulated patient monitoring system and a system that monitors actual human patients. We think it is important for the instructor to acknowledge the important distinction between those two kinds of systems, but then also point out that a simulated system might someday be converted to a system for real patients. The instructor should point out that the specification for the system in this exercise explicitly mentions, in its first sentence, the eventual conversion to a system monitoring human patients. This possibility means that decisions made for monitoring a simulated patient could indeed have consequences for human patients in the future.

Phase 4:
Reflections

After the discussion that ends Phase 3, the instructor may want the students to change their Version 3 implementations into a Version 4 that addresses some of the weaknesses of Version 3. Whether or not a Version 4 is produced, we think this exercise should end with a final phase in which students reflect on the earlier phases of the exercise. Individually, in small groups, or as a class, students should discuss the following questions. Depending on how much time the instructor wants the students to spend on this exercise, their responses to the questions could be oral or written.

  1. At any time during this exercise, did you have questions about the software you were asked to write that you didn't ask?
  2. Were you surprised when you realized the software specified might lack important safety features?
  3. Did you remember noticing that the initial specification mentioned human patients? In retrospect, do you think that is an important detail in the specification? Why or why not?
  4. When software is being developed for use with patients, how much responsibility do you think the software developer should have to ensure patient safety? Is there anyone else who shares that responsibility?
  5. What are some other kinds of software in which human safety is an issue? In these cases, how much of the responsibility for human safety belongs to the software developer?

To prepare for this discussion, the instructor might read some literature about the ethical responsibilities of software developers. Collins et al. [3], Stanberry [16], and Brown and Adams [1] are three examples of this literature. There is a recorded ACM webcast [4] that might be of interest to the students and the instructor; the webcast features Dr. Donald Gotterbarn, a well-known expert in professional ethics for software engineers. Another approach is to tie students' reflections to a code of professional ethics, such as the Software Engineering Code of Ethics and Professional Practice adopted by the ACM and the IEEE Computer Society [8]. Principle 3 of this code focuses on software engineering products, and may be particularly appropriate for this exercise. The instructor (and eventually the students) might also study literature on alarm fatigue in medical settings, such as Graham and Cvach [6].

Although we think the instructor should look at some of these resources before the exercise, we recommend that students study some of these resources after the exercise. We think experiencing for themselves how technical decisions interact with human values will prime them for learning about professional ethical responsibilities. The exercise is then a structure to encourage discovery, followed by reflections, followed by deeper learning. This final phase in the exercise is like the "debriefing" following experiential learning exercises described by Sims [15].

Conclusion

It is important that computer science instructors lead students to consider their ethical responsibilities as computing professionals. Some computer science faculty have expressed reservations about teaching ethics to their students [14]. We suggest that computer science faculty can be more comfortable and more effective in teaching about ethics when the ethical issues emerge naturally from teaching about computing. The exercise above is one example in which the technical details lead directly to questions about professional responsibility. Other resources that might be helpful to faculty wanting to develop their own exercises include Epstein's classic "Case of the Killer Robot" [5] and Collins and Miller's "Paramedic Ethics" [2].

The exercise was designed to encourage students to discover the importance of focusing on people affected by their decisions during software development. It is only after this discovery that a discussion about computer ethics and professional responsibilities is attempted.

Appendix A. Specification of a Patient-Monitoring Simulation

PURPOSE: Ultimately, this system will be used to monitor hospital patients. In this initial version, the system will simulate a patient's vital signs, and display the measurements on a web page. The vital signs to be monitored are temperature, diastolic blood pressure, systolic blood pressure, and pulse. The simulated values will start with initial values, and then will be periodically adjusted at random. If any of the measurements are outside the range specified as normal, an audible alarm sounds, and an alarm icon is displayed.

INPUTS:
No inputs are necessary in this version. Eventually, patient vital signs will be read in from hardware devices.

OUTPUTS:
After initializations, a web page is displayed, showing the current values of the vital signs. These values are updated periodically.

CONSTANTS:
TimeUnit: The number of milliseconds between updates to the screen. Value: 1000

InitialTemperature:
Given in degrees Fahrenheit, rounded to one decimal place. = 98.6
HiTemperature = 104.0
LoTemperature = 90.0

InitialDiastolicBloodPressure:
Given as mm of mercury. = 80
HiDiastolicBloodPressure = 120
LoDiastolicBloodPressure = 45

InitialSystolicBloodPressure:
Given as mm of mercury. = 120
HiSystolicBloodPressure = 170
LoSystolicBloodPressure = 70

InitialPulse:
Given as beats per minute. = 60
HiPulse = 90
LoPulse = 30

NOTE: At each TimeUnit, a random number from a uniform distribution between 0 and 1 is added to the current value of each vital sign. In addition, a "trend constant" for each vital sign is subtracted. By adjusting the trend constants, different scenarios can be simulated. Use the following trend constants in your initial version (a sketch of this update step appears after the list):

trendTemp = 0.475
trendDiastolicBloodPressure = 0.755
trendSystolicBloodPressure = 0.15
trendPulse = 0.7
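
A sketch of this per-tick update in JavaScript might look like the following; the function and variable names are ours, not required by the specification:

    const trendTemp = 0.475;
    const trendDiastolicBloodPressure = 0.755;
    const trendSystolicBloodPressure = 0.15;
    const trendPulse = 0.7;

    // Math.random() returns a uniform value in [0, 1), which serves as the
    // specification's random number "between 0 and 1."
    function step(value, trend) {
      return value + Math.random() - trend;
    }

    // Applied once per TimeUnit, for example:
    //   temperature = step(temperature, trendTemp);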

DETAILS: Title the web page as "Patient Monitoring." Make up a name for the patient being monitored, a name for the supervising nurse, and a name for the company producing the software. Display all the information on one web page. Feel free to format the information in a way that uses the screen geography wisely, and is easy to read.

Acknowledgements

We appreciate the students who have engaged in this experience during two semesters of software engineering classes at the University of Missouri-St. Louis. Thanks also for helpful suggestions from reviewers and the editors.

References

1. Brown, I., & Adams, A. A. The ethical challenges of ubiquitous healthcare. International Review of Information Ethics, 8,12 (2007), 53–60.

2. Collins, W. R., & Miller, K. W. Paramedic ethics for computer professionals. Journal of Systems and Software, 17,1 (1992), 23–38.

3. Collins, W. R., Miller, K. W., Spielman, B. J., & Wherry, P. How good is good enough? An ethical analysis of software construction and use. Communications of the ACM, 37,1 (1994), 81–91.

4. Gotterbarn, D. and Miller, K. (2014). Do good and avoid evil… and why that is complicated in computing. ACM webcast; https://www.youtube.com/watch?v=b8TJvEr6eSQ. Accessed 2016 October 14.

5. Epstein, R.G. Case of the killer robot; http://www.onlineethics.org/Resources/killerrobot/robot.aspx. Accessed 2017 January 5.

6. Graham, K. C., & Cvach, M. Monitor alarm fatigue: standardizing use of physiological monitors and decreasing nuisance alarms. American Journal of Critical Care, 19,1 (2010), 28–34.

7. Herkert, J. R. Engineering ethics education in the USA: Content, pedagogy and curriculum. European Journal of Engineering Education, 25,4 (2000), 303–313.

8. IEEE-CS / ACM Joint Task Force on Software Engineering Ethics and Professional Practices. Software Engineering Code of Ethics and Professional Practice (5.2); http://seeri.etsu.edu/Codes/TheSECode.htm. Accessed 2016 October 14.

9. Jennings, M. M. Business Ethics: Case Studies and Selected Readings. 8th ed. (Stamford, CT: Cengage Learning, 2014).

10. Miller, K. Integrating computer ethics into the computer science curriculum. Computer Science Education, 1,1 (1988), 37–52.

11. Miller, K. (2016). Simulated patient monitoring system, Version 1; https://edocs.uis.edu/kmill2/www/PatientMonitoringV1.html. Accessed 2016 October 15.

12. Miller, K. (2016). Simulated patient monitoring system, Version 2; https://edocs.uis.edu/kmill2/www/PatientMonitoringV2.html. Accessed 2016 October 15.

13. Miller, K. (2016). Simulated patient monitoring system, Version 3; https://edocs.uis.edu/kmill2/www/PatientMonitoringV3.html. Accessed 2016 October 15.

14. Quinn, M. On teaching computer ethics within a Computer Science department. Science and Engineering Ethics 12,2 (2006), 335–343.

15. Sims, R. R. Debriefing experiential learning exercises in ethics education. Teaching Business Ethics, 6,2 (2002), 179–197.

16. Stanberry, B. Legal, ethical and risk issues in telemedicine. Computer Methods and Programs in Biomedicine, 64,3 (2001), 225–233.

17. Walton, D. N. Ethics of Withdrawal of Life-Support Systems: Case Studies on Decision Making in Intensive Care. (Praeger, 1987).

Authors

David K. Larson
University of Illinois at Springfield
Department of Management Information Systems
One University Plaza, MS UHB 4021, Springfield, IL 62703
larson.david@uis.edu

Keith W. Miller
University of Missouri - St. Louis
Computer Science Dept. and the College of Education
1 University Blvd., MS 100 MH, St. Louis, MO 63121
millerkei@umsl.edu

Figures

Figure 1. When teaching ethics through software development, the instructor can specify a task, students can code it, and then the students and the instructor can reflect on the ethical significance of the work.

Figure 2. A screenshot of the Version 1 program after two vital signs have gone out of range.

