Nothing more needs to be said to get burned at the stake!
We all know the disadvantages of Waterfall Software Development, A.K.A. Big Design Up Front, A.K.A. pure evil. But before we condemn Waterfall and every company that applied it, we should understand the context in which it was created.
“Standing on the shoulders of giants” is something very overlooked in Software Development. The current status quo was achieved not only through past successes, but also through past failures. Today, we know Waterfall was wrong about many things. But was it wrong about everything?
A Little Bit of History
I like history. History explains why things are the way they are. It tells us how we got here. If we forget history, history will repeat itself, errors included.
To understand why Waterfall was invented, we need to go back in history. There is an excellent talk by Robert C. Martin (Uncle Bob) where he explains it. In a nutshell: Software Development, once populated by older and more experienced people (mainly women), was invaded by a bunch of 20-year-old “yahoos” in the ’70s. There had to be a way of keeping them under control. So Waterfall was invented.
Waterfall was invented to have some predictability over the work done by very inexperienced people. We have to be honest: it was not such a bad idea, at the time. Many years have passed and things have changed a bit. Computer Science became a field of its own. We have many great books from experienced people who came up with much better ways of doing Software Development. But most of all, we have many years’ worth of mistakes to learn from.
Software Development Lifecycle
From the humble beginnings of mathematicians writing on a whiteboard, to computers affordable enough for each developer to have several at their disposal, software projects increased hugely in complexity. More than one person could handle. More than one team could handle. More than any single person could oversee in every detail.
To handle such complexity, people decomposed development into distinct phases. Different people with different skills would come into play at different stages. Tasks could be spread across multiple teams. The breakdown of the software development process into different phases was very well made. So well, in fact, that it is still used today:
Looking at the image, all Agile advocates get chills down their spines! But they shouldn’t. What Agile methodologies advise is not to drop these phases. Sprint 0 in Scrum is requirements gathering. User stories? Requirements analysis under another name. User acceptance testing is nothing but testing. And a project always has an end; it is a really bad sign if it doesn’t have to be maintained. All the same phases are there! The separation is blurred, but it still exists, in much shorter cycles.
More recently, developers have stopped thinking that way. There is no design, or design is part of the implementation. Even worse, with TDD, testing, analysis, design, and implementation are all the same. And yet, Agile teams don’t succeed much more often.
We cannot eliminate the different phases from development, or just sit at the keyboard and furiously write code, clumping everything into a single “Implementation” phase. We need to be humble enough, and have the courage, to admit we are not able to tackle problems without thinking, asking questions until we understand the problem well enough to try to solve it.
“Analysts” or “software designers” did not write the code themselves. They had to think in an abstract way. There was no thinking through the keyboard. So they used techniques such as pseudocode, diagrams, or models to help them deal with complex problems and to transmit their thoughts to the people implementing them.
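As a hypothetical illustration of that workflow (the problem and all names below are invented for the example), a designer might sketch the algorithm in pseudocode first, and only then would someone turn that sketch into running code:

```python
# The design sketch, written as pseudocode before any code exists:
#
#   for each order of the day:
#       group it under its customer
#   for each customer:
#       sum the amounts of their orders
#   return the per-customer totals
#
# The implementation, written afterwards from the sketch:
from collections import defaultdict

def totals_per_customer(orders):
    """Sum order amounts per customer; orders are (customer, amount) pairs."""
    totals = defaultdict(float)
    for customer, amount in orders:
        totals[customer] += amount
    return dict(totals)

print(totals_per_customer([("ana", 10.0), ("bob", 5.0), ("ana", 2.5)]))
```

The point is not the code itself, but that the sketch communicates the intent at a higher abstraction level than any implementation detail does.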
The good thing about abstraction is seeing the whole picture without being obscured by the little details (care for some Domain-Driven Design?). Otherwise, you can’t see the forest for the trees. The “code as documentation” argument is used to skip the whiteboard and jump directly into coding.
The minute you start writing code, you lose sense of the whole picture. People can only reason at one abstraction level at a time. One of the developer’s myths is “it doesn’t apply to me”, so it is hard to make this point in those cases.
Modeling & Documentation
“Code as documentation” is the wooden stake put through the heart of the documentation talk. “Go read the code, it is self-explanatory.” Self-biased view aside, I cannot understand how you can explain a system to a non-technical person with code, or how an outsider or newcomer could gain knowledge about the overall system in a timely or feasible way.
Another developer’s myth used here is “it’s in my head”. Staff turnover is a reality: the head might leave in the future. If a developer is on vacation, sick, or in worse scenarios, the head won’t be there when needed. More commonly: people forget. The “it doesn’t apply to me” myth makes this a hard sell.
The core point here is different levels of abstraction. Code explains a system at a very small scope and a low level of abstraction. The minute you think about two systems, you cannot get by without drawing a diagram. Two simple squares joined by a line is a model. Not code, a model.
What Waterfall did wrong was to try to document and model everything up front, down to the smallest detail, so that no matter how dumb or inexperienced someone was, there was no way they could go wrong.
You read a book and you realize half the time it is repeating what others have said. That is not a bad thing! Using others’ knowledge enables you to go further and make something new. It is what we call progress. You keep the good things from the past. You learn from the bad things, avoiding the same mistakes. That is how Agile came about.
It is with great sadness that I see good things being forgotten.