In the first part of this lesson, students are encouraged to make modifications to an existing program. That program simulates a system designed to monitor hospital patients' vital signs. If desired, an instructor could give a specification instead of the initial version, and require the students to develop that initial version themselves.
These activities are designed for students working individually or in small groups. The strategy is to alternate technically oriented tasks with time for reflection. The lesson is structured so that students complete at least one software development step and one reflection in a single session. Because there are multiple stages, the entire exercise can be stretched across multiple class meetings, or completed in a single long session. The repeating cycle of specification, coding, and reflection is shown in Figure 1.
Using case studies in ethics education has a long history in business, medicine, and engineering. For example, see Herkert, Jennings, and Watson. A distinctive characteristic of this proposed lesson is the integration of software artifacts, leading to an intertwining of software engineering issues and societal impact issues.
An Initial Version of a Patient Monitoring System
This is a simplified version of actual software that could be used in a clinical setting; it falls well short of typical accessibility standards for disabled users. For example, the audible alarm (a central focus of the exercise) would be of no use to deaf users, and colorblind users might find it difficult to distinguish between the red out-of-range numbers and the black in-range numbers. The issue of disabled users is raised in a later reflection question. The simplicity of this version makes it easier for beginning programmers to do the required tasks; however, some faculty may want to modify the code in the initial version, or require students (especially those with more than beginner skills) to modify the program during the exercise, to increase its utility for a broader user base. With some changes, the ethical issues surrounding access for users of different abilities could become the focus of the exercise.
The alarms are a central issue as the exercise plays out, and it is important that students hear the alarm often. Students and the instructor may find it tedious waiting for the alarm to sound during testing, and the instructor may suggest reducing timeUnit to speed things up. However, if timeUnit is changed, the timing in implementations of the second version (described below) should be examined carefully. Version 2 requires a pause of 30 seconds, and the Version 2 script given in this paper assumes a one-second timeUnit when implementing that pause.
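To illustrate the coupling between timeUnit and the pause, here is a minimal sketch (ours, not the authors' script) in which the 30-second mute is counted in update ticks:

```javascript
// A minimal sketch (not the authors' script) of the timing coupling:
// the Version 2 pause is counted in update ticks, so 30 ticks equal
// 30 real seconds only when timeUnit is 1000 milliseconds.
const timeUnit = 1000;       // milliseconds between screen updates
const muteAfterTicks = 30;   // 30 ticks * 1000 ms/tick = 30 seconds

// True while the audible alarm should still sound after being triggered.
function audibleAlarmActive(ticksSinceAlarmStarted) {
  return ticksSinceAlarmStarted < muteAfterTicks;
}
```

An instructor who reduces timeUnit for faster testing would also need to rescale muteAfterTicks to preserve a true 30-second pause.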
Students should visit the web page for Version 1 (or their own version, or another student's version, if those are available), and execute the simulation multiple times. They should be instructed to record their reactions to the program. In our Version 1, the program is reset by reloading the page in the browser.
No matter how the students obtain Version 1, it should be mentioned (though not necessarily emphasized) that the program is being designed for use with actual patients. (The first sentence of the specification states this.) However, in its initial version (and in all the versions in this exercise), a simulated, virtual patient is being monitored. If students ask about this, the instructor should suggest that after the software is developed and tested with the simulated patient, the plan is to use it with real patients with a minimum of changes.
After examining the execution of the system multiple times, students will then discuss the following questions, either in small groups or as a class:
- What was your overall impression of the program?
- Did you record any failures, or strange behaviors, of the program you executed?
- When you executed the program, did the visual and auditory alarms catch your attention?
- What did you do after the alarms were activated?
- Do you have any suggestions for improving a new version?
The instructor should not try to lead this discussion in any specific direction. However, the instructor should list the suggestions for improvements where all the students can see them. It is particularly helpful if at least one student suggestion includes some limits or controls that allow the user to turn off the alarms, and/or reset the vital signs readouts. Our Version 1 includes a loud alarm bell that we expect to be annoying, and we anticipate that at least one student will want to be able to turn it off somehow during testing. If no student lists such a change for the new version, the instructor should add the following change to the list: "Mute the audible alarm after a certain time."
A New Version of the Patient Monitoring System
During the second phase, the instructor gives the students one or more changes to make in the program. Depending on the time available and on the students' development skills, the instructor may decide to include several changes; the exercise is improved if multiple changes are specified. However, whether one change or several are specified, this particular change should be included:
"Modify the program so that the alarm sound ceases after 30 seconds. The visual alarm should continue."
After students have made this (and perhaps other) changes, they should test their modified programs, and the programs of others. Students should record the results of their testing, and record any impressions they have of the revised functionality. A second version of this program, like Version 1 above but including the change that turns off the alarm sound after 30 seconds (our Version 2), is available online; its URL is given in the references.
Students will then discuss the following questions, in order, as a class. We do not expect the class discussion to finish this list of questions; the reason is explained beneath the questions.
- What are the advantages of the new version of this program?
- Are there any disadvantages of the new version?
- Did you have any questions about the changes when they were specified to you?
- Do you foresee any problems with this new version when the program is converted for use with real patients instead of simulated patients?
- Do you foresee any problems that might arise for patients with this program? For nurses? For the hospital?
- Do you foresee any safety concerns with turning off the audible alarm after 30 seconds?
- Do you think that 30 seconds was sufficient time for the alarm bell to sound? Why or why not?
We expect that as the class works through these questions, eventually a student will suggest that turning off the alarm sound might cause problems when this program is converted for use with real patients. If the alarm bell is turned off, or turned off too quickly, the warning might be ignored or never noticed. The patient may suffer if health professionals do not notice that there is a warning about the patient's vital signs. If a patient suffers harm, nurses may be reprimanded for not responding in a timely fashion. The hospital might be sued for negligence. It is important that these issues be put on the table. Once at least one student raises this issue (perhaps as one concern among others), the instructor can move to a discussion of this specific aspect of the revised program.
At this point, the instructor should encourage a discussion about the advisability of turning off the audible alarm. Eventually, the instructor should ask for alternatives to the "30 second rule" used in Version 2. At some point, one of the students (or the instructor, if necessary) will suggest a means to turn off the audible alarm manually. This could be appropriate, it might be argued, if a nurse discovers the problem indicated by one or more vital signs going out of range and has taken appropriate action. (That is, after the audible alarm has served its purpose, it can safely be turned off.)
The instructor now tells the students to make that change in the program. The specification is as follows:
"Add functionality to Version 2 to make a Version 3 that allows manual turning off of the audible alarm. Until the alarm is turned off manually, it should keep sounding. The visual alarm should stay on whenever a vital sign has gone out of range."
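One way this Version 3 behavior could be modeled is sketched below; the function names are ours, and in a web page onMuteButton would be wired to a button's onclick handler:

```javascript
// Hypothetical sketch of the Version 3 alarm logic (names are ours):
// the audible alarm sounds until muted manually; the visual alarm
// simply tracks whether any vital sign is currently out of range.
function makeAlarm() {
  return { outOfRange: false, muted: false };
}

// Called on each update tick with the result of the range checks.
function onVitalsChecked(alarm, anyOutOfRange) {
  alarm.outOfRange = anyOutOfRange;
}

// In a web page this would be the "turn off alarm" button's handler.
function onMuteButton(alarm) {
  alarm.muted = true;
}

function audibleOn(alarm) { return alarm.outOfRange && !alarm.muted; }
function visualOn(alarm)  { return alarm.outOfRange; }
```

Note that muting silences only the sound; the visual alarm stays on for as long as a vital sign is out of range, as the specification requires.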
A Third Version of the Patient Monitoring System
Our Version 3 is available online; its URL is given in the references. This program adds a button that turns off the audible alarm. After students have produced their third version of the program, they should examine their own and others' versions. After those examinations, the students break into small groups to discuss the following questions:
- Does the manual system function as specified to turn off the audible alarm?
- Can you think of any problems with this system design?
- If a user were deaf or colorblind, would the current version be sufficient? Why or why not?
- Should anyone be able to manually turn off the audible alarm, or should only some users be authorized?
- In at least one version of this program, numbers start out black, and then turn red when a vital sign moves from normal to abnormal. Should the numbers turn back to black if the vital sign in question returns to the normal range? Why or why not?
- Can you think of any other weaknesses of this version of the program with respect to its eventual use with real patients?
The instructor should list any weaknesses that come up in the discussion of these questions. At some point, the instructor should invite the students to suggest program modifications that would address one or more of those weaknesses. For example, someone may point out that only authorized medical staff should be allowed to turn off the audible alarm. This could be accomplished with a password protection for the "turn off the alarm" feature. If there were password protection, then we could envision having other protected functionality, such as resetting the simulated vital signs to normal.
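A password-protected mute could be sketched as follows; the staff IDs and function names below are invented for illustration, and a real clinical system would need genuine authentication, access-control policy, and auditing:

```javascript
// Hypothetical sketch of an authorization check on the mute feature.
// The staff IDs below are made up; a real system would authenticate
// users properly and log every attempt for later review.
const authorizedStaff = new Set(["nurse-0421", "charge-0007"]);

// Attempt to silence the audible alarm; returns true only if honored.
function tryMute(alarm, staffId) {
  if (!authorizedStaff.has(staffId)) {
    return false;              // unauthorized: the alarm keeps sounding
  }
  alarm.muted = true;
  return true;
}
```

The same gate could protect other sensitive functionality, such as resetting the simulated vital signs to normal.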
It seems likely that the discussions will sometimes focus on the difference between a simulated patient monitoring system and a system that monitors actual human patients. We think it is important for the instructor to acknowledge the distinction between those two kinds of systems, but then also point out that a simulated system might someday be converted into a system for real patients. The instructor should note that the specification for the system in this exercise explicitly mentions, in its first sentence, the conversion to a system monitoring human patients. This possibility means that decisions made for monitoring a simulated patient could indeed have consequences for human patients in the future.
After the discussion that ends Phase 3, the instructor may want the students to change their Version 3 implementations into a Version 4 that addresses some of the weaknesses of Version 3. Whether or not a Version 4 is produced, we think this exercise should end with a final phase in which students reflect on the exercise as a whole. Individually, in small groups, or as a class, students should discuss the following questions. Depending on how much time the instructor wants the students to spend on this exercise, their responses could be verbal or written.
- At any time during this exercise, did you have questions about the software you were asked to write that you didn't ask?
- Were you surprised when you realized the software specified might lack important safety features?
- Did you remember noticing that the initial specification mentioned human patients? In retrospect, do you think that is an important detail in the specification? Why or why not?
- When software is being developed for use with patients, how much responsibility do you think the software developer should have to ensure patient safety? Is there anyone else who shares that responsibility?
- What are some other kinds of software in which human safety is an issue? In these cases, how much of the responsibility for human safety belongs to the software developer?
To prepare for this discussion, the instructor might read some literature about the ethical responsibilities of software developers. Collins et al., Stanberry, and Brown are three examples of this literature. There is a recorded ACM webcast that might be of interest to the students and the instructor; the webcast features Dr. Donald Gotterbarn, a well-known expert in professional ethics for software engineers. Another approach is to tie students' reflections to a code of professional ethics such as the Software Engineering Code of Ethics and Professional Practice adopted by the ACM and the IEEE Computer Society. Principle 3 of this code focuses on software engineering products, and may be particularly appropriate for this exercise. The instructor (and eventually the students) might also study literature on alarm fatigue in medical settings, such as Graham and Cvach.
Although we think the instructor should look at some of these resources before the exercise, we recommend that students study some of these resources after the exercise. We think experiencing for themselves how technical decisions interact with human values will prime them for learning about professional ethical responsibilities. The exercise is then a structure to encourage discovery, followed by reflections, followed by deeper learning. This final phase in the exercise is like the "debriefing" following experiential learning exercises described by Sims.
It is important that computer science instructors lead students to consider their ethical responsibilities as computing professionals. Some computer science faculty have expressed reservations about teaching ethics to their students. We suggest that computer science faculty can be more comfortable and more effective in teaching about ethics when the ethical issues emerge naturally from teaching about computing. The exercise above is one example in which the technical details lead directly to questions about professional responsibility. Other resources that might be helpful to faculty wanting to develop their own exercises include Epstein's classic "Case of the Killer Robot" and Collins and Miller's "Paramedic Ethics."
The exercise was designed to encourage students to discover the importance of focusing on people affected by their decisions during software development. It is only after this discovery that a discussion about computer ethics and professional responsibilities is attempted.
Appendix A. Specification of a Patient-Monitoring Simulation
PURPOSE: Ultimately, this system will be used to monitor hospital patients. In this initial version, the system will simulate a patient's vital signs, and display the measurements on a web page. The vital signs to be monitored are temperature, diastolic blood pressure, systolic blood pressure, and pulse. The simulated values will start with initial values, and then will be periodically adjusted randomly. If any of the measurements are out of the range specified as normal, an audio alarm sounds, and an alarm icon is displayed.
INPUTS: None are necessary in this version. Eventually, patient vital signs will be read in from hardware devices.
OUTPUTS: After initializations, a web page is displayed, showing the current values of the vital signs. These values are updated periodically.
TimeUnit: The number of milliseconds between updates to the screen. Value: 1000
Temperature: Given as Fahrenheit degrees, rounded to one decimal place. Value: 98.6
HiTemperature = 104.0
LowTemperature = 90.0
DiastolicBloodPressure: Given as mm of mercury. Value: 80
HiDiastolicBloodPressure = 120
LoDiastolicBloodPressure = 45
SystolicBloodPressure: Given as mm of mercury. Value: 120
HiSystolicBloodPressure = 170
LoSystolicBloodPressure = 70
Pulse: Given as beats / minute. Value: 60
HiPulse = 90
LoPulse = 30
NOTE: At each TimeUnit, a random number from a uniform distribution between 0 and 1 is added to the current value of each vital sign. In addition, a "trend constant" for each vital sign is subtracted. By adjusting the trend constants, different scenarios can be simulated. Use the following trend constants in your initial version:
trendTemp = 0.475
trendDiastolicBloodPressure = 0.755
trendSystolicBloodPressure = 0.15
trendPulse = 0.7
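The update rule in the NOTE above can be sketched in a few lines; the helper names are ours, and in a web implementation the update would run from a timer firing every TimeUnit milliseconds:

```javascript
// Sketch of the update rule: each TimeUnit, add a uniform random draw
// from [0, 1) to each vital sign and subtract its trend constant.
// A trend constant above 0.5 (the mean of the draw) drifts the sign
// downward on average; a constant below 0.5 drifts it upward.
function updateVital(current, trend, rand = Math.random) {
  return current + rand() - trend;
}

// True when a reading should trigger the visual and audible alarms.
function outOfRange(value, lo, hi) {
  return value < lo || value > hi;
}
```

Passing a deterministic function for rand makes the rule easy to unit test, while the default Math.random produces the specified simulation behavior.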
DETAILS: Title the web page as "Patient Monitoring." Make up a name for the patient being monitored, a name for the supervising nurse, and a name for the company producing the software. Display all the information on one web page. Feel free to format the information in a way that uses the screen geography wisely, and is easy to read.
We appreciate the students who have engaged in this experience during two semesters of software engineering classes at the University of Missouri—St. Louis. Thanks also for helpful suggestions from reviewers and the editors.
4. Gotterbarn, D. and Miller, K. (2014). Do good and avoid evil… and why that is complicated in computing. ACM webcast; https://www.youtube.com/watch?v=b8TJvEr6eSQ. Accessed 2016 October 14.
5. Epstein, R.G. Case of the killer robot; http://www.onlineethics.org/Resources/killerrobot/robot.aspx. Accessed 2017 January 5.
8. IEEE-CS / ACM Joint Task Force on Software Engineering Ethics and Professional Practices. Software Engineering Code of Ethics and Professional Practice (5.2); http://seeri.etsu.edu/Codes/TheSECode.htm. Accessed 2016 October 14.
11. Miller, K. (2016). Simulated patient monitoring system, Version 1; https://edocs.uis.edu/kmill2/www/PatientMonitoringV1.html. Accessed 2016 October 15.
12. Miller, K. (2016). Simulated patient monitoring system, Version 2; https://edocs.uis.edu/kmill2/www/PatientMonitoringV2.html. Accessed 2016 October 15.
13. Miller, K. (2016). Simulated patient monitoring system, Version 3; https://edocs.uis.edu/kmill2/www/PatientMonitoringV3.html. Accessed 2016 October 15.
David K. Larson
University of Illinois at Springfield
Department of Management Information Systems
One University Plaza, MS UHB 4021, Springfield, IL 62703
Keith W. Miller
University of Missouri - St. Louis
Computer Science Dept. and the College of Education
1 University Blvd., MS 100 MH, St. Louis, MO 63121
Figure 1. When teaching ethics through software development, the instructor can specify a task, students can code it, and then the students and the instructor can reflect on the ethical significance of the work.
©2017 ACM 2153-2184/17/03 $15.00
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and full citation on the first page. Copyright for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers, or to redistribute to lists, requires prior specific permission and/or fee.