In computing, it is easy to emphasize the promise and importance of the future and focus on the challenges of the present, but to forget what has happened in the past because, we tend to believe, everything changes so fast. It is sometimes useful, however, to stop and consider what we can learn from history in our own fields. This does not apply only to those who study history professionally, with the methods and persistence of historians. All of us can benefit from visits to the past. One of the most significant benefits of such a pause from the frenetic pursuit of the next new thing is an improved understanding of our tendency to reinvent the wheel.

In information systems (IS), systems development and systems analysis and design (SA&D) form one of the core focus areas in education and organizational practice. In IS, our particular focus is on the organization and management of the SA&D processes and the mechanisms required to ensure that the outcomes of the analysis, design, and development processes truly serve the needs of the organization that engages in the development effort.

SA&D methodologies have gone through a dramatic set of changes over the past several decades. If asked about this, most of us would identify a progression from a lack of any structured methodology, to the waterfall model, to more advanced plan-driven methods (such as the spiral model), to various types of agile, with prototyping, Rapid Application Development, Joint Application Development, and others treated as "alternative development methodologies" somewhere along the standard path. In IS education, we are not yet quite sure how to deal with agile development and DevOps (a compound of "development" and "operations"), given the way they require a strong integration of planning, analysis, design, coding, deployment, and operations—coding and operations have never been among our strongest focus areas.

In discussions of the development of SA&D, we have a tendency to trivialize early plan-driven forms of SA&D in particular and to present them as a pure waterfall model, consisting of pre-defined stages that had to follow each other in a strictly specified sequence. We also tend to think that we have only recently learned to address the complexities of human behavior, organizational needs, technical requirements, and implementation together.

Given all this, it is healthy and interesting to revisit documents that played a seminal role in the early development of our thinking regarding SA&D and to notice that many core issues, and the solutions proposed for them, have stayed the same since the early years of organizational computing. A paper by Winston Royce entitled "Managing the Development of Large Software Systems," published in 1970 [2], is at times credited as one of the first documented efforts to codify the waterfall model. It is fascinating to go back to the original document and discover that the picture painted by Royce is much more complicated and multifaceted than the "waterfall" label might suggest.

It is, indeed, true that Royce presents in the paper a waterfall-like model that he labels "Implementation steps to develop a large computer program for delivery to a customer" (although he never uses the term "waterfall"). This is, however, only the starting point of the paper. He states, for example: "I believe in this concept [a waterfall model with iteration between successive stages], but the implementation described above is risky and invites failure." [2, p. 329] Royce continues by introducing five mechanisms whose sole purpose, he states, is to "eliminate most of the development risk."

There is not enough space to discuss these five mechanisms in detail here, but their intent can be summarized relatively easily. First, they introduce forms of iteration that are not confined to successive steps in the model. Second, they essentially recommend the use of prototyping and/or the development of a minimum viable product/solution. Third, they emphasize the importance of testing—not quite to the extent of proposing test-driven development, but still in ways that are more intensive than in a typical waterfall model. Fourth, these mechanisms focus on involving the customer throughout the development process. Pedagogically, I believe it would be highly useful for our students to see—based on a seminal document—that some of the key challenges and proposed solutions have stayed the same since the early days of the development of complex computing solutions.


Another interesting and influential document from the same era as Royce's paper is ACM's first graduate-level curriculum recommendation [1], entitled "Curriculum Recommendations for Graduate Professional Programs in Information Systems," by Ashenhurst et al. This model, published in 1972 and developed during the preceding years, features many characteristics that are still highly relevant. For example, it specifies graduate outcome expectations ("needed knowledge and abilities") categorized into six groups: a) people, b) models, c) systems, d) computers, e) organizations, and f) society. A closer analysis reveals that many of these expectations are still fully valid and that the characterization of information systems as a discipline that integrates individual, organizational, and technical perspectives in both education and research has existed from the early days of organizational computing. Computing technology development during the 45-year period since 1972 has exceeded many expectations, but in matters related to how to effectively structure the development of complex IT solutions and how to ensure that these solutions are fully aligned with individual and organizational needs, many of the fundamental questions and some of the solutions have stayed the same.

This is important from the pedagogical and program design perspective because we often feel a strong and urgent need to emphasize the latest technologies at the cost of many other aspects of course and program design. Technological currency is, of course, important, but even a quick trip through our discipline's history suggests that the field of information systems and its approaches to addressing computing challenges have more permanence than we would expect. At times, it makes sense to carefully consider what we can learn from the past, instead of focusing all of our energy on keeping up with the unknown future.

References

1. Ashenhurst, R. L. Curriculum Recommendations for Graduate Professional Programs in Information Systems. Communications of the ACM, 15, 5 (1972), 364–398.

2. Royce, W. W. Managing the Development of Large Software Systems. In Technical Papers of Western Electronic Show and Convention (1970), 1–9.

Author

Heikki Topi
Bentley University
Smith 408
175 Forest Street
Waltham, MA 02452
htopi@bentley.edu

Copyright held by author.

The Digital Library is published by the Association for Computing Machinery. Copyright © 2017 ACM, Inc.
