Steve McConnell is the author of Code Complete, one of the classic software engineering titles that should be on the bookshelf of every professional developer. I first read this book when I started out in software development, and rediscovered it a couple of years ago. Amazingly enough, the first (and still current) edition dates back as far as 1993, yet much of the material is still as valid today. Nevertheless, the last decade has of course seen many changes in the industry, from the advance of object-oriented programming to the recent fashion for agile development. Steve delivered a “state of the nation” speech today at the Whidbey technical reviewers’ workshop, reviewing the last decade of software development and setting the stage for the release of the second, heavily revised, edition of his book next week (UK, US).
His remarks on the advancement of software construction over the last decade seemed to me to fall into two broad themes:
Adoption of a higher level of design and construction methodology
Design has been raised a level. Over the last decade, programmers have grown used to design abstraction from the subroutine to the class. As we think about OO programming, we often get hung up on issues such as polymorphism and single / multiple inheritance. Moving forward, the greatest legacy of OO might actually be that we have learnt to abstract.
One of the most common causes of technical failure in the 1990s was integration. The “big bang” approach of waiting until the end of a project to perform major integration has been shown to be a failure. Over the last decade, we’ve seen fairly widespread adoption of the idea of integrating code in small chunks on a daily basis, giving us the opportunity to see the quality of the code early and supporting a more incremental approach. This unit-based approach to integration focuses software quality discussions at the component level rather than the individual line of code.
The availability of standard libraries has also been a great boon. In the past, people tended to build their own personal toolbox of libraries; nowadays the standardisation of these libraries as part of a language or framework has improved software quality. Once again, this raises the level of abstraction: rather than concentrating on writing new code, much of the .NET code we write today is wiring components together.
Steve suggested (perhaps a little tongue-in-cheek) that Visual Basic “was the language that Smalltalk was trying to be”! It was the first development environment to make widespread use of off-the-shelf components (via VBX). It even learnt some of the lessons from languages such as Ada. For example, the C++/Java design decision that switch statements fall through case blocks by default is fundamentally dangerous: Steve described how a large electrical outage on the west coast of the US was once traced to just a misplaced break statement.
Technology changes impacting software development techniques
Faster computers have implications for optimisation, programming languages and development. The speed of hard disk access today is comparable to the speed of memory access a decade ago. In building the benchmarks for Code Complete, it took as many as a million iterations to measure a difference between two choices. The implication is that we should worry less about pure performance (at the language level, at least) and concentrate more on safety and other issues.
The web has also had an underestimated effect on development. The growth of FAQs, discussion groups and search capabilities has significantly changed the way that people look for answers. You no longer need to buy huge quantities of reference documentation; you can simply search a wealth of information online, quickly and easily.