Software development, for all the contributions it has made to society in terms of information availability and improved efficiency, is a high-risk venture. Reportedly, 70% of software projects either fail to achieve their full purpose or fail entirely. The reasons for this high failure rate are varied and numerous; however, they are rarely associated with the technical challenges of development, but rather with failures of the process by which the software was created (The Standish Group, 1994).
For those beginning new software development projects, mitigating this risk involves knowing how to appropriately select the software development methodology that will be used on the project. The purpose of this case study is to compare and contrast the Waterfall and Agile software development methodologies, two of the most commonly used methodologies to date (Laplante & Neill, 2004). By the end of this case study, it will be demonstrated that the anticipated amount of rework during the course of a project is the deciding factor in which methodology to consider; before delving into the comparison, however, the next section provides some background on why software projects fail.
In the beginning, around the time of the Second World War, computers were simple. In June of 1944, a computer called the Electronic Numerical Integrator and Computer (ENIAC) was first put into operation (Goldstine, 1972). Widely considered the world’s first general-purpose computer and credited with starting the modern computer age, it consisted of about 17,500 vacuum tubes, 70,000 resistors, 10,000 capacitors, 1,500 relays, and 6,000 manual switches; it weighed 30 tons, spanned 1,800 square feet, and required dozens of technicians and engineers to maintain and operate. However, despite its impressive footprint, it could perform only a limited number of very simple mathematical calculations: roughly 5,000 additions, 357 multiplications, or 38 divisions per second (Williams, Christianson, & Beth, 1998). In these early days, because the capabilities of computers such as the ENIAC were so limited, very little consideration was given to how to approach the development of programs, even though programming them could take weeks. The concept of software engineering as its own discipline and field of study did not exist.
In the two decades that followed, computers vastly improved. New innovations and improvements led to increased speeds and capabilities, which led to an increased desire to leverage these machines to tackle more complex problems; however, as the field of computer science focused on improving the capabilities of computers, still very little effort was invested in how to approach the development of programs that could fully leverage these new capabilities. As a result, a subtle but disturbing trend was beginning to form: increasingly, software projects were running over their schedules and their budgets, and were resulting in programs of decreasing quality that were at the same time becoming increasingly difficult to maintain (Naur & Randell, 1969). Edsger W. Dijkstra, in his famous lecture entitled The Humble Programmer, reflected on the state of software development in those days:
“[The major problem is] that the machines have become several orders of magnitude more powerful! To put it quite bluntly: as long as there were no machines, programming was no problem at all; when we had a few weak computers, programming became a mild problem, and now we have gigantic computers, programming has become an equally gigantic problem” (Dijkstra, The Humble Programmer, 1972).
In 1967, tasked with assessing the entire field of computer science, the NATO Science Committee established a study group led by Friedrich Ludwig Bauer (F. L. Bauer), a German computer scientist and professor emeritus at the University of Technology in Munich. The group decided to focus its attention on the problems of software development, and in late 1967 it recommended the holding of a conference on the subject (Naur & Randell, 1969).
In 1968, the first NATO Software Engineering Conference was held in Garmisch, Germany, where two terms were coined: software engineering and the software crisis (Dijkstra, There is still a war going on, 1993; Naur & Randell, 1969). The term software engineering was chosen deliberately; it was considered provocative and reinforced the group’s conclusion that software development needs to be based on theoretical foundations and practical disciplines, as development is in all traditional branches of engineering. The second term, the software crisis, was used to describe the then-current state of applied computer science, in which writing complex programs was proving more difficult than building the machines that could run them. The group attributed the observed trend of increasing project failures to the lack of a field of study dedicated to developing principles and methodologies by which to manage the development of software.
Engineering, regardless of the discipline, is about providing solutions to problems. If it is cold, call a mechanical engineer to build a heater; if a river needs to be crossed, call a structural engineer to build a bridge; and if there is an opportunity to increase business process efficiencies through automation, call a software engineer. However, software engineering differs from traditional engineering disciplines in that the difficult part is not the implementation of the solution but rather understanding the problem the solution is intended to solve.
In their book, Wicked Problems, Righteous Solutions: A Catalog of Modern Software Engineering Paradigms, Peter DeGrace and Leslie Stahl describe the task of gathering system requirements and developing designs as a wicked problem, a term that refers to any problem whose requirements are incomplete, contradictory, or change in such a manner that solutions are difficult to determine because of complex interdependencies (DeGrace, 1990). Because the collection of system requirements is fundamental to any software development project, mitigating the risk associated with its being a wicked problem is of the utmost importance.
Since the first NATO Software Engineering Conference in 1968, where the first software development methodology was introduced (Naur & Randell, 1969), there have been numerous methodologies proposed, but all serve the purpose of imposing a structured process on the development of software products. Regardless of the methodology, this process, known as the software development process (SDP) (White, 2003, p. 2), generally consists of four basic steps: understanding the problem (i.e., requirements collection), devising a solution (i.e., design), implementing it (i.e., development), and verifying its results (i.e., testing).
There are two schools of thought on how to approach the SDP: incremental methodologies, such as Waterfall, which advocate the use of a single, but intensive, SDP iteration; and iterative methodologies, such as Agile, which advocate the use of numerous iterations in order to build their products modularly.
In 1970, Winston W. Royce, in his paper, Managing the Development of Large Software Systems, proposed one of the first and most widely used incremental methodologies. Although he never used the term in his paper (Royce, 1970), the incremental methodology he proposed would later become known as Waterfall and would become one of the most widely used software development models in history. This being the case, the Waterfall method is sometimes referred to as the Traditional Software Development Process (Gibbs, 2006).
Waterfall is a linear process in which the development of software is depicted as cascading through the following four SDP phases sequentially: requirements analysis, design, implementation, and testing/validation. Occasionally, a fifth phase is appended, called maintenance, which serves to acknowledge that the ongoing maintenance of a product, once it is in production, is also a component of its lifecycle.
According to the book, Project Management with the IBM Rational Unified Process, the basic tenets of the Waterfall method are (Gibbs, 2006):
- One cannot build anything before knowing its requirements.
- Rework should be minimized by not beginning development until all the requirements and specifications are agreed upon.
- One may not begin work on a subsequent phase until all the work of the current phase is complete.
This model advocates that great pains should be taken to ensure that each phase of the lifecycle is thoroughly completed before proceeding to the next one. The goal of this approach is to provide confidence that one can proceed to the next phase knowing that all of its prerequisites have been completed. For example, if one can be assured that all the specifications were painstakingly gathered in the requirements phase, then one can construct a design with great confidence that no detail was overlooked; likewise, with a comprehensive design, development of the actual code in the implementation phase is as simple as following the blueprint outlined in the preceding phase; and so on.
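This phase-gating rule can be sketched as a small predicate over the four SDP phases. The function name and the completion check below are illustrative assumptions for this case study, not part of any cited methodology:

```python
# A minimal sketch of Waterfall's phase-gating rule: work on a phase may
# begin only once every preceding phase is fully complete. The phase names
# follow the four SDP phases; `may_begin` is a hypothetical illustration.

PHASES = ["requirements", "design", "implementation", "testing"]

def may_begin(phase: str, completed: set) -> bool:
    """Return True only if every phase preceding `phase` is complete."""
    index = PHASES.index(phase)
    return all(p in completed for p in PHASES[:index])

# Design may start once requirements are done...
assert may_begin("design", {"requirements"})
# ...but implementation may not, since design is still open.
assert not may_begin("implementation", {"requirements"})
```

The gate is deliberately all-or-nothing: there is no notion of a phase being "mostly" done, which mirrors the model's insistence on exhaustive completion before moving on.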
Contending with just one lifecycle iteration does have its benefits. Below is a list of its observed advantages (Gibbs, 2006):
- With just one lifecycle iteration, it becomes intuitively obvious which tasks should be performed in which phase.
- Its linear approach is simple for the project team to learn and understand.
- Relative to planning for multiple, or potentially an unknown, number of lifecycle iterations, the waterfall method is vastly easier to schedule.
- Because it consists of long-duration phases and is easy to schedule, it is easier to determine staffing and resource allocations: the specific times when particular skill sets and/or equipment will be needed are easier to identify.
- Also due to its long phase durations, time-intensive tasks, such as creating thorough system documentation, can be allocated in the schedule.
- By exhaustively completing one aspect of the project before beginning another, it is conducive to identifying project milestones, which can be used to mark the apparent progress of the project.
- The amount of rework is theoretically reduced, as problems are identified earlier in the process, before their impact on the schedule is compounded by discovery in subsequent phases.
It is arguably easier to identify problems earlier in the process than later. If realized, this ability can greatly improve the chances of success, for as Steve McConnell points out in his book entitled Rapid Development: Taming Wild Software Schedules: “a requirements defect that is left undetected until construction or maintenance will cost 50 to 200 times as much to fix as it would have cost to fix at requirements time” (McConnell, 1996). Recall that project overruns, in time and budget, are two of the three measures by which a project’s failure is defined (The Standish Group, 1994).
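McConnell's multiplier can be made concrete with a little arithmetic. The 50x-200x range below comes from the quote above; the dollar baseline is a made-up figure purely for illustration:

```python
# Illustrating McConnell's observation: a requirements defect caught late
# (in construction or maintenance) costs 50-200x what it would have cost
# to fix at requirements time. The $100 baseline is hypothetical.

baseline_fix_cost = 100                 # cost to fix during requirements
late_multiplier_low, late_multiplier_high = 50, 200

late_cost_low = baseline_fix_cost * late_multiplier_low     # 5,000
late_cost_high = baseline_fix_cost * late_multiplier_high   # 20,000

print(f"Fixed early: ${baseline_fix_cost:,}")
print(f"Fixed late:  ${late_cost_low:,} to ${late_cost_high:,}")
```

Even at the low end of the range, a single overlooked requirement dwarfs the cost of the careful up-front analysis Waterfall prescribes, which is the economic argument behind the model.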
The managerial and administrative advantages of Waterfall are largely products of its single-lifecycle-iteration approach. The disadvantages of the Waterfall model are not a product of the model per se, but rather of the fact that applying it requires the following two assumptions to be true (Gibbs, 2006):
- The requirements are known completely at the beginning of the project and are not likely to change significantly.
- The requirements are understood completely before the beginning of the project.
Recall that in Wicked Problems, Righteous Solutions: A Catalog of Modern Software Engineering Paradigms, DeGrace and Stahl characterize the task of gathering requirements and developing designs as a “wicked problem”: one whose requirements are incomplete, contradictory, or change in such a manner that solutions are difficult to determine because of complex interdependencies (DeGrace, 1990).
A facet of the wicked problem of collecting requirements is that clients often do not fully understand what they need, let alone are they able to convey those requirements in their entirety, without seeing examples of what might work, or more often, examples of what does not work. David Parnas, in his paper A Rational Design Process: How and Why to Fake It, referred to this need for iterative feedback when he wrote: “Many of the details only become known to us as we progress in the implementation. Some of the things that we learn invalidate our design and we must backtrack” (Parnas, n.d.).
This explains one of the greatest disadvantages of Waterfall. For all the benefits its rigid structure affords, that rigidity does not accommodate project environments where the status of the requirements could be considered a wicked problem, or where sponsors require a degree of iterative feedback to understand them. By insisting that all requirements be collected before any development begins, Waterfall discourages iterative feedback, which serves only to encourage necessary specification changes being discovered later in the development process.
There is no single Agile methodology, but rather a family of them. In 2001, at a conference held at the Snowbird ski resort in Utah, prominent figures in the software industry who advocated iterative development practices created the Agile Manifesto (Fowler, 2006): a collection of principles and practices written in the spirit of its preamble creed:
“Individuals and interactions over processes and tools; Working software over comprehensive documentation; Customer collaboration over contract negotiation; Responding to change over following a plan; That is, while there is value in the items on the right, we value the items on the left more” (Beedle, et al., Manifesto for Agile Software Development, 2001).
Whereas Waterfall and other incremental models strive to minimize requirements change, Agile thrives on it. The Agile Manifesto comprises twelve main principles, which serve to promote iterative feedback and continuous user engagement. Its first principle, which it admits to be its highest, states: “Our highest priority is to satisfy the customer through early and continuous delivery of valuable software. Welcome changing requirements, even late in development. Agile processes harness change for the customer’s competitive advantage” (Beedle, et al., Manifesto for Agile Software Development, 2001).
Agile methodologies subscribe to the same basic process of collecting requirements, devising a solution, implementing it, and so on, as Waterfall; but instead of utilizing a single lifecycle iteration to accomplish all aspects of the project in highly scheduled, intensive phases, they advocate that numerous, and sometimes an unknown number of, lifecycle iterations are required to develop software.
One thing that distinguishes Agile from Waterfall is that most Agile methodologies advocate a period of time to reassess the execution of the project. One principle in the Agile Manifesto states: “At regular intervals the team reflects on how to become more effective, then tunes and adjusts its behavior accordingly” (Beedle, et al., Principles behind the Agile Manifesto, 2001). This period of time allows the project to be more nimble in addressing the often fluid priorities of a business, to remove observed inefficiencies, and to implement means of improving collaboration between the client and the development team.
Because each iteration includes a requirements phase, the costly impact observed when necessary changes are identified late in the development process is minimized. This constant revisiting of requirements ensures that sponsors get the iterative feedback they need.
In practice, this involves numerous iterations to ensure sustainable development. “The sponsors, developers, and users should be able to maintain a constant pace indefinitely. […] Business people and developers must work together daily throughout the project” (Beedle, et al., Principles behind the Agile Manifesto, 2001). Continuous customer engagement is one way in which Agile serves to mitigate the risk associated with a lack of user involvement. Much like how only a witness can say when a sketch artist’s drawing resembles the suspect, this iterative approach also implies that only the clients themselves can determine when a software product meets their specifications.
One principle of the Agile Manifesto is: “working software is the primary measure of progress” (Beedle, et al., Principles behind the Agile Manifesto, 2001). The goal of each cycle is to build upon the product of the previous one. Providing the client a stream of increasing functionality facilitates a high degree of iterative feedback, which can be used to determine when the project meets their specifications.
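The cycle described above can be sketched as a simple loop in which each iteration re-prioritizes requirements and grows the working product until the client accepts it. All names and the one-feature-per-cycle simplification below are hypothetical illustrations, not taken from any cited methodology:

```python
# A minimal sketch of Agile's iterative build-up: each cycle revisits the
# backlog of requirements, delivers a slice of functionality, and the
# growing product itself is the measure of progress.

def run_iterations(backlog, max_iterations=10):
    """Grow the product one feature per cycle until the backlog is empty."""
    product = []
    for _ in range(max_iterations):
        if not backlog:              # client accepts: all known needs are met
            break
        feature = backlog.pop(0)     # backlog is re-prioritized each cycle
        product.append(feature)      # working software grows every iteration
    return product

# Example: three requested features delivered over three cycles.
print(run_iterations(["login", "reports", "export"]))
```

The contrast with the Waterfall sketch is that nothing here is gated on completeness: the backlog may be amended between cycles, and the loop's exit condition is client acceptance rather than the exhaustion of an up-front plan.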
As can be gleaned from the tenets listed above, the spirit of Agile is to undo the sins of the incremental methodologies by focusing, if not obsessing, on fostering iterative feedback: keeping the client engaged throughout the entire duration of the development process and delivering a constant flow of functionality for their evaluation and re-evaluation. This is supported by research from The Standish Group, which has observed that:
“smaller time frames, with delivery of software components early and often will increase the success rate. […] Growing software engages the user earlier, each component has an owner or a small set of owners, and expectations are realistically set. In addition, each software component has a clear and precise statement and set of objectives. Software components and small projects tend to be less complex. Making the projects simpler is a worthwhile endeavor because complexity causes only confusion and increased cost.” (The Standish Group, 1994)
For all its observed benefits over Waterfall in reducing project risk through iterative feedback, Agile methodologies are not without criticisms. Because Agile is predicated on small, face-to-face teams, it has been observed that it may not be suitable for developing distributed systems built by different teams. This both hampers companies from utilizing subcontractors for the development of specialized subsystems and impedes their ability to handle the development of large-scale complex systems, which simply cannot be constructed by a small group of people (Turk, France, & Rumpe, n.d.).
Another major criticism of Agile methodologies is that because neither the development team nor the client understands the entire scope of the development effort until it is undertaken, it is extremely difficult to develop project schedules and estimate budgets. Additionally, because the number of iterations required to complete the project is unknown, there are no clear milestones by which to judge the progress of the project. Time costs money in business, and with an inability to estimate the amount of time a project will take, it becomes correspondingly more difficult for organizations to calculate budgets and to determine their eventual return on investment (Turk, France, & Rumpe, n.d.). To those who write the checks, this approach may be very disconcerting and can create difficulties in establishing the kinds of agreements contractually necessary for commissioning work.
Before comparing Agile and Waterfall directly, it should be noted, to their credit, that both have been used successfully to deliver products on time, within budget, and meeting their specifications (The Standish Group, 2004). Although both methodologies have empirical evidence that they work, it is the circumstances of the project that determine when one is better to use than the other.
The essential advantage of Waterfall is that it is simple and has a rigid structure. This rigid structure makes it conducive to establishing schedules, identifying obvious project milestones, and allocating resources efficiently, and it is overall a simpler process to understand and manage. However, because its rigid structure calls for all of the requirements to be collected before any design work is performed, it can minimize the amount of time the project sponsors are engaged in the process of developing the product, and it discourages the iterative feedback sometimes required for sponsors to completely understand their specifications. The Standish Group has observed that a lack of user engagement and changing specifications are two of the primary reasons that software projects fail (The Standish Group, 1994).
Agile, on the other hand, with its approach of building up a product over numerous short iterations, is very adept at responding to changing specifications and thrives when user involvement is at its maximum. However, because its structure is so fluid and seemingly open-ended, it can be difficult to manage and to plan for. Additionally, because Agile is so dependent on user involvement, in circumstances where sponsors’ time is limited or they otherwise cannot fully commit to becoming part of the process, the process itself may fail to thrive.
Those who are starting new projects, or are otherwise in a position to select a methodology, would do well to first evaluate the environment in which the product will be developed. If the environment has the following properties, which indicate that the requirements are not likely to change, then a Waterfall model ought to be considered:
- The requirements are known completely at the beginning of the project and are not likely to change significantly
- The requirements are understood completely before the beginning of the project.
- It is known that the requirements can be implemented using the existing technologies at hand.
- The technologies used on the project will not change over its duration.
- The team assembled for the project is experienced and familiar with the subject matter of the project and the problem it is attempting to resolve.
These properties describe an environment where the requirements are not likely to change, and therefore where the anticipated amount of rework and iterative feedback required to complete the project is low; it is in such environments that Waterfall works well.
DeGrace, P. (1990). Wicked Problems, Righteous Solutions: A Catalog of Modern Software Engineering Paradigms. Englewood Cliffs, New Jersey: Prentice Hall PTR.
Dijkstra, E. W. (1972). The Humble Programmer. Retrieved March 19, 2011, from http://www.cs.utexas.edu/~EWD/transcriptions/EWD03xx/EWD340.html
Dijkstra, E. W. (1993, December 3). There is still a war going on. Retrieved March 1, 2011, from Department of Computer Sciences: The University of Texas at Austin: http://www.cs.utexas.edu/users/EWD/transcriptions/EWD11xx/EWD1165.html
Gibbs, R. D. (2006). Project Management with the IBM Rational Unified Process: Lessons from the Trenches. Prentice Hall.
Goldstine, H. (1972). The Computer: from Pascal to von Neumann. New Jersey: Princeton Press.
McConnell, S. (1996). Rapid Development: Taming Wild Software Schedules. Redmond, WA: Microsoft Press.
Naur, P., & Randell, B. (1969, Oct 7-11). Software Engineering. Retrieved March 16, 2011, from http://homepages.cs.ncl.ac.uk/brian.randell/NATO/nato1968.PDF
Parnas, D. (n.d.). A Rational Design Process: How and why to fake it. Retrieved March 1, 2011, from http://web.cs.wpi.edu/~gpollice/cs3733-b05/Readings/FAKE-IT.pdf
Royce, W. W. (1970). Managing the Development of Large Software Systems. Retrieved March 22, 2011, from http://www.cs.umd.edu/class/spring2003/cmsc838p/Process/waterfall.pdf
The Standish Group. (1994). Chaos: The Standish Group Report. Retrieved March 20, 2011, from Educause: http://www.educause.edu/ir/library/pdf/NCP08083B.pdf
The Standish Group. (2004). Standish: Project Success Rates Improved Over 10 Years. Retrieved April 4, 2011, from Software Mag: Application Development: http://www.softwaremag.com/L.cfm?doc=newsletter/2004-01-15/Standish
Turk, D., France, R., & Rumpe, B. (n.d.). Limitations of Agile Software Processes. Retrieved April 1, 2011, from Agile Alliance: http://www.agilealliance.com/system/article/file/1096/file.pdf
Williams, R. S., Christianson, B., & Beth, T. (1998). Computing in the 21st Century: Nanocircuitry, Defect Tolerance and Quantum Logic [and Discussion]. Philosophical Transactions: Mathematical, Physical and Engineering Sciences , 356 (1743), 1783-1791.