About 20 years ago, I discovered that I had difficulty hearing high pitches in my left ear, and I needed a hearing aid. Over the years, I have used several devices—the most recent in the past few months. This column describes my experiences, largely with the recent hearing aids. Perhaps surprisingly, this recent history yields several implications and lessons for software development, ethics, product documentation, and teaching computing. In what follows, specific observations and lessons may or may not be new to ACM Inroads readers, but this case study may provide a helpful framework for examination and discussion.
My "Hearing Aid" Experience
Background: About 20 years ago, over several months, I realized I had trouble hearing questions from female students during class, but not from men. I frequently asked women to repeat their questions, but rarely asked men—likely giving the unintended impression of gender bias. After having my hearing tested, I discovered my left ear had considerable trouble hearing high pitches. (My right ear had normal hearing.)
In talking to an audiologist and hearing aid center, I learned that digital hearing aids at that time could separate hearing frequencies into roughly 13 different groups and boost the sound independently within those ranges.1 Effectively, a hearing aid in my left ear could provide the needed adjustments for female voices, while making little change to the lower frequencies.
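The multiband idea described above can be sketched in a few lines of Python. This is a minimal illustration of per-band gain, not any manufacturer's algorithm; the band edges, gains, and the `profile` values are invented for the example.

```python
# Minimal sketch of multiband amplification, the core idea behind a digital
# hearing aid: split sound into frequency bands and boost each independently.
# All numbers below are illustrative, not any device's actual settings.

def band_gain_db(freq_hz, bands):
    """Return the gain (in dB) for a frequency, given (low, high, gain_db) bands."""
    for low, high, gain_db in bands:
        if low <= freq_hz < high:
            return gain_db
    return 0.0  # frequencies outside all bands pass through unchanged

def amplify(components, bands):
    """Apply per-band gain to a list of (freq_hz, amplitude) spectral components."""
    return [(f, a * 10 ** (band_gain_db(f, bands) / 20)) for f, a in components]

# A profile like the one described above: boost high pitches, leave lows alone.
profile = [
    (0, 1000, 0.0),      # low frequencies: no change
    (1000, 2000, 6.0),   # mild boost
    (2000, 8000, 18.0),  # strong boost where female voices and consonants sit
]

voice = [(250, 1.0), (1500, 0.5), (3000, 0.25)]
print(amplify(voice, profile))
```

With a profile like this, a low component (250 Hz) passes through unchanged while higher components are boosted, which is roughly the adjustment my left ear needed.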
Early Resolution: Immediately after testing, I got a hearing aid, with settings adjusted to my hearing. (In subsequent years, I had new tests, and new frequency settings by the audiologist were easy and quick.) With this device, my apparent pattern of asking women to repeat was gone!
As an aside, the impact of impaired hearing of high frequencies is widely reported (e.g., see [3]). However, I have mentioned this experience to numerous other faculty, and most seemed unaware that such troubles might arise and be hearing related. (Only one CS faculty member over the 20 or so years indicated such matters were "well known" and largely impacted female students.)
Recent Experience: A few months ago, my hearing aid was clearly aging, so I bought a replacement. As one might expect, over the past 1–2 decades, technology has evolved substantially. In many cases, the hearing aid connects to one's mobile device, and an app provides numerous options (e.g., set volume, select filters for background noise, etc.). Within this context, I purchased a "mid-level" hearing aid—one that allowed adjustments for multiple frequencies, various filters, etc. I certainly did not need (or want) lots of bells and whistles (play on words not necessarily intended—but read on to learn what actually happened).
After working with my audiologist to configure the hearing aid and corresponding app on my mobile device, I returned home, and all went fine for the rest of the day.
The next day turned out to be an adventure.2
- In the morning, I put on the hearing aid, turned it on, and after half an hour walked outside for my morning commute.
- During my walk to the car, when I shifted my computer bag on my shoulder,3 I heard a cluster or barrage of diverse, brief notes that continued for several seconds.
- After parking at a university parking lot, and later when walking alone around campus, my hearing aid stated "search failed" several times—no one was near me.
- Later during an early afternoon class, I heard a racket from my hearing aid to the extent that I paused the class to consult my mobile device. Apparently, my cell phone had called a contact with whom I collaborated about five years earlier. Upon checking after class, my cell phone had called this person three times during class—although I had neither called nor spoken to that person for at least seven years.4
As you might guess, I was unhappy about the unprompted web searches from my mobile device, and I was truly upset that it made at least three unprompted phone calls to people on my contacts list.
Upon following up, my audiologist indicated that the hearing aid typically establishes two Bluetooth connections to a mobile device: one for the app and another to handle phone calls and other audio. However, my hearing aid was not supposed to have the capability for voice commands (e.g., Siri, Alexa, or similar). Further, my audiologist contacted the company, which confirmed my hearing aid was not equipped for voice commands. The company had no explanation regarding how either the web searches or the phone calls could have been initiated.
Overall, no explanation is known for the apparently autonomous web searches and phone calls, and I was told to disconnect the Bluetooth connection for the phone (audio), but not the one for the app. In this degraded mode, I experienced few troubles, although I used the device with much anxiety—until I returned it for another brand.
Implications for Product Documentation
After this experience with unintended web searches and phone calls, I did a careful search of the manufacturer's pamphlets and web pages. As far as I can tell, no documentation mentions voice commands, autonomous searches, or other similar "features." Further, the audiologist had no knowledge of such capabilities, and the company indicated my hearing aid did not have any such functions.
On the other hand, upon subsequent exploration of the app, I found a menu item for "voice assist" several menus down. Further, that option was identified as being "on," so upon this discovery I immediately turned it "off"—whatever the risk, I wanted no part of any actions this feature might perform!
Implications for Software Development
Since I now live in a community for seniors, numerous friends and neighbors have limited vision, and these people often welcome voice-activated commands—they often have trouble interacting with screens. From this perspective, including voice-assisted commands may be quite worthwhile.
However, developers clearly must consider user interfaces and default settings, so that interested users can take advantage of these features, while others are not caught unaware. Further, in some cases interests and needs may conflict. For example, under what circumstances is it appropriate for voice commands to handle phone calls? Should procedures/features be different for receiving calls and making them?
Overall, I can think of numerous use cases for both audiences—but beyond considerations of the two populations separately, design must find ways to address circumstances when advantages/capabilities for one group might be considered disastrous for another group.
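One concrete design response to these concerns is to ship sensitive capabilities disabled by default and require an explicit, documented opt-in. The sketch below assumes a hypothetical companion-app settings model; `HearingAidSettings` and its fields are invented for illustration, not any vendor's actual API.

```python
# Sketch of opt-in defaults for surprising capabilities in a hypothetical
# hearing-aid companion app. The class and field names are illustrative only.

from dataclasses import dataclass

@dataclass
class HearingAidSettings:
    volume: int = 5
    noise_filter: bool = True
    # Sensitive, potentially surprising capabilities ship disabled and
    # require an explicit opt-in from the user before they can act.
    voice_assist: bool = False
    allow_outgoing_calls: bool = False

    def enable_voice_assist(self, user_confirmed: bool) -> None:
        """Turn on voice assist only after the user explicitly confirms."""
        if not user_confirmed:
            raise PermissionError("voice assist requires explicit user opt-in")
        self.voice_assist = True

settings = HearingAidSettings()
print(settings.voice_assist)  # disabled by default; users are not caught unaware
```

Had the "voice assist" option in my app followed this opt-in pattern, it could not have been "on" without my knowledge.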
Implications for Ethical and Social Issues for Computing
Substantial evidence indicates that my hearing aid initiated semi-autonomous web searches (on unknown search terms/phrases) and phone calls (based on my phone contacts list)—apparently using hearing-aid "voice assist" and a mobile device on which voice commands (e.g., Alexa, Siri, or similar) were turned off. Naturally this experience raises numerous ethical and social issues. Some questions follow:
- What policies, procedures, and/or practices should be established to prevent unwanted, semi-autonomous actions by devices?
- How should designers proceed in identifying possible unwanted actions? (In this case, apparently neither the audiologist nor the company knew that the hearing aid (working in some way with a mobile device) was technically capable of initiating the web searches or phone calls.)
- If the result of a semi-autonomous action causes harm, who/what should be held accountable? What process should be followed?
- If a general disclaimer were to appear within an application's documentation, would this resolve any legal, social, or ethical concerns? Would a specific disclaimer (e.g., regarding web searches or phone calls) have a different impact concerning these concerns?
- To what extent would a culture following the ACM Code of Ethics and Professional Conduct [1] mitigate some or all of such legal, social, and/or ethical matters?
Implications for Teaching Computing
At a reasonably obvious level, this experience report may provide a novel example for class discussion of the numerous computing-related topics discussed here (e.g., documentation, software development, and social/ethical issues).
At a deeper level, a class might work with an agency for the hearing or sight impaired to harness connections between hearing aids and mobile devices to address local needs of persons with disabilities. For example, such work might make worthwhile semester or team projects or drive some student-faculty collaborations.
Turning to matters of trust and risk within the classroom, some students can be intimidated if their comments might become known outside the classroom. Mobile devices and computers already have the capability of recording sound during discussions, but perhaps interactions with hearing aids could enable discussions to be streamed widely—in real time.
As yet another dimension, hearing-impaired students may require hearing aids to facilitate full engagement in classes, and modern hearing aids typically require control through apps on mobile devices. In such an environment, use of a mobile device may be essential to adjust hearing-aid settings. However, semi-autonomous commands also might have the potential to connect students with external materials (e.g., during a test). Since streaming material to a hearing aid is already well developed, enabling web searches or phone calls through hearing aids during a closed-book assessment seems only a modest step. The author is unaware of such applications at present, but recent technological advances suggest such capabilities may not be far off.
Conclusion
The National Council on Aging observes that "Presbycusis generally begins in your 50s or 60s. It might be difficult to notice the extent of hearing loss with presbycusis because it's often so slow and gradual." [3] Further, the agency identifies several "Key Takeaways" regarding hearing loss:
- "Presbycusis is a type of hearing loss that gradually occurs as we age.
- "One in three people ages 65–74 have hearing loss, and almost half aged 75 and older have hearing loss." (The agency cites Age-Related Hearing Loss (Presbycusis), NIDCD, NIH, updated March 17, 2023, https://www.nidcd.nih.gov/health/age-related-hearing-loss.)
- "The most common symptoms of age-related hearing loss are perceiving sound as muffled, a persistent ringing sound, and difficulty hearing high-pitched sounds." [3]
Within a classroom environment, gradual hearing loss can make student questions hard to understand, particularly impacting questions and discussion by women. Also, a web search suggests such deterioration of hearing is widely known. However, when I mentioned my difficulty in having to ask many women to repeat their comments/questions (but few men), only one of the dozens of faculty contacted seemed to recognize this widespread problem; all of the others indicated they were unaware of the issue. My own experience suggests that, once one is aware of trouble hearing high pitches, talking with an audiologist and obtaining a digital hearing aid may make a substantial difference! But a first step is recognizing the problem may exist!
Beyond recognition of hearing loss and its consequences, this case study also highlights several important issues for the development, documentation, and use of digital hearing aids—particularly as "high-tech" capabilities and options may be available.
Within the realm of computing and computing education, perhaps this "hearing aid" experience report will provide an engaging and worthwhile framework for the discussion of numerous opportunities, limitations, and consequences related to technology!
References
1. ACM, ACM Code of Ethics and Professional Conduct, 2018, URL: https://www.acm.org/code-of-ethics
2. Levitt, H., "A Historical Perspective on Digital Hearing Aids: How Digital Technology Has Changed Modern Hearing Aids," Trends in Amplification, 11, 1, 2007: 7–24.
3. National Council on Aging, "Presbycusis (Age-Related Hearing Loss)," URL: https://www.ncoa.org
Author
Henry M. Walker
Department of Computer Science
Noyce Science Center
Grinnell College
1116 Eighth Avenue
Grinnell, Iowa 50112 USA
[email protected]
Footnotes
1. At the time, the audiologist reported that 10 or more years earlier, only analog hearing aids were available. These were simple amplifiers and would boost all frequencies. Thus, to get proper volume for higher pitches, low-pitch volumes likely would have been overwhelming. Levitt [2] provides an especially interesting history of the development of digital hearing aids.
2. In partial retirement, I teach a couple computing courses at Sonoma State University—about 50 minutes from home, if the commute goes smoothly.
3. My mobile device is always on my belt, not near my computer case.
4. I routinely turn off all voice commands (e.g., Alexa, Siri, etc.) whenever I purchase a new computer or mobile device, so it seems very unlikely that my mobile device or computer initiated these searches or calls.
In class, pay attention to whether you ask one group of students to repeat their question or comments, but not others. If so, consider taking a hearing test!
When technological devices have capabilities for autonomous/semi-autonomous actions, brochures or user manuals likely have an obligation to publicize their existence and use—and how to turn them off. Users need to know!
When a computing application is intended to serve two or more different populations, it is natural to identify use cases for each group. However, if interests and/or needs differ, designers also may need to consider how those in one group might interact with use cases focused on other people or groups. High-level integration may require rethinking how detailed use cases might be handled.
Technological applications can have both intended and unintended consequences and thus raise numerous social and ethical issues. How can such matters be addressed—particularly when a developer may not be aware those consequences are possible?
Beyond the use of this experience report as a possible launching point for the discussion of numerous topics regarding software and its potential impact/consequences, capabilities for semi-autonomous search and phone calls involving hearing aids also may impact a classroom environment for private class discussion, test security (e.g., for closed-book, in-class exercises), and the nature of some assignments and team projects.
Copyright held by author.
The Digital Library is published by the Association for Computing Machinery. Copyright © 2024 ACM, Inc.